Consumers are increasingly turning to health apps for a variety of medical and wellness-related purposes. This has in turn caused greater amounts of data—including highly sensitive information—to flow through these apps. These data troves can trigger significant compliance responsibilities for the app developer, along with significant legal and contractual risk. This latest installment in our health app series will introduce some of these considerations, including approaches that developers can take to minimize their risk.
HIPAA
The primary federal law of concern to health app developers is the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”). To determine whether HIPAA applies, developers first need to understand the nature and the source of the personal data associated with their app. HIPAA applies only to protected health information (“PHI”), which is broadly defined as information that:
- is created or received by a covered entity (i.e., a health care provider, health plan, or health clearinghouse);
- relates to the past, present, or future mental or physical health of an individual; and
- identifies the individual.
Some apps avoid HIPAA because they never interact with data created or received by a provider, plan, or clearinghouse. For example, apps that require end users to input their own health information may not have to comply with HIPAA. However, other developers providing apps on behalf of covered entities are not so lucky. In these cases, the developer is likely to be considered a “business associate” of the covered entity. A business associate is broadly defined as anyone who creates, receives, maintains, or transmits PHI on behalf of a covered entity. Context is crucial when determining whether a developer is a business associate. A developer offering a diabetes app on behalf of an insurer, and using PHI of the insurer, would be considered a business associate, but a developer independently offering the exact same app to the general public, and using health information volunteered by the public, would not. These distinctions can be subtle, and developers should take time to determine their position within this framework before marketing—and ideally before developing—their app. Fortunately, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) has released guidance that analyzes various scenarios involving health apps. It is a must-read for any developer that comes anywhere near health information.
HIPAA Security
Sometimes HIPAA is unavoidable given the functionality of the app and the context in which it is being provided. In these cases, the developer should not take its HIPAA obligations lightly. Developers will face a number of obligations, many of which emanate from HIPAA’s Security Rule. The Security Rule, which applies to electronic PHI (“ePHI”), requires both covered entities and business associates to ensure the confidentiality, integrity, and availability of all ePHI that the entity creates, receives, maintains, or transmits. To do this, the developer must implement administrative, physical, and technical safeguards for the ePHI, and must conduct a thorough, documented risk analysis of its systems to determine how it will implement those safeguards. Policies and procedures must also be created and enforced, and the developer’s workforce must be trained. A developer that fails to comply with HIPAA can be subject to steep fines and penalties.
HIPAA and Cloud Services Providers
Many developers contract with cloud services providers (“CSPs”) for various services, including storage of user data. These “downstream” relationships may also be subject to HIPAA. Last year, OCR published guidance to assist developers and their CSPs with HIPAA compliance. The guidance articulates a principle that is useful for developers and CSPs alike:
When a covered entity engages the services of a CSP to create, receive, maintain, or transmit ePHI (such as to process and/or store ePHI), on its behalf, the CSP is a business associate under HIPAA. Further, when a business associate subcontracts with a CSP to create, receive, maintain, or transmit ePHI on its behalf, the CSP subcontractor itself is a business associate.
The guidance delves deeper into the application of HIPAA in the cloud and should be reviewed by developers that use CSPs. If the parties determine that the CSP is a business associate, they must enter into their own business associate agreement (often referred to as a “sub-business associate agreement” or a “subcontractor agreement”) that is no less strict than the upstream agreement between the developer and the covered entity. The guidance also addresses the effect of encryption on the application of HIPAA to CSPs. In the past, some CSPs questioned whether they are business associates if they handle only encrypted ePHI for which they do not have the decryption key. The guidance clarifies that encrypting a client’s data does not, by itself, allow the CSP to avoid being a business associate.
HIPAA and Encryption
Contrary to popular belief, HIPAA does not mandate encryption. Under HIPAA, encryption is an “addressable” security standard. This means that it is only required if a covered entity or business associate determines that it is reasonable and appropriate to implement. It will almost certainly be the case that implementing encryption is reasonable and appropriate for health app developers. Developers who determine that encryption is not reasonable or appropriate should identify and document compensating security controls to support their contention that encryption is not required for their app. When implementing encryption, developers should focus on encrypting data that is on the device, as well as data that is in transit, at rest in the cloud, and at rest on the developer’s own systems. Properly encrypting PHI has the added benefit of providing a safe harbor from breach notification obligations. Under HIPAA, the developer could leave a laptop full of PHI on the subway, and, as long as the PHI is encrypted in a manner that satisfies certain federal standards, the loss will not be considered a breach. Since the covered entity is ultimately responsible for reporting breaches to individuals (and potentially the media), avoiding such breaches can help protect the developer’s reputation.
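To make the at-rest piece concrete, here is a minimal sketch of encrypting a PHI record before it is written to storage, using Python’s third-party cryptography package. The library choice, the sample record, and the key-handling approach are illustrative assumptions, not a statement of what HIPAA requires; whether a particular scheme satisfies the federal standards behind the breach notification safe harbor has to be evaluated separately.

```python
# Illustrative sketch only: encrypt a PHI record before persisting it, so that
# only ciphertext ever reaches disk or cloud storage. The "cryptography"
# package and the record fields below are assumptions for demonstration.
from cryptography.fernet import Fernet

# In a real deployment the key would live in a managed key store (e.g., a KMS
# or HSM), never hard-coded or stored alongside the ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical PHI record, shown as JSON bytes for illustration.
phi_record = b'{"patient_id": "12345", "glucose_mg_dl": 110}'

# Encrypt before writing.
with open("record.enc", "wb") as f:
    f.write(cipher.encrypt(phi_record))

# Decrypt only when the data is actually needed, and only in memory.
with open("record.enc", "rb") as f:
    plaintext = cipher.decrypt(f.read())
assert plaintext == phi_record
```

The same principle applies to the other categories mentioned above: traffic between the app and any servers should travel only over TLS, and data cached on the device should be protected with the platform’s built-in encryption features.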
Other Laws of Concern
This post has focused primarily on HIPAA, but that’s not the only law that can be implicated by health apps. For example, our previous post discussed some of the federal consumer protection laws that are enforced by the Federal Trade Commission (“FTC”) and that apply to developers. Developers should also consider whether the app is directed to children under the age of 13 or knowingly collects personal information from them; if so, the app must also comply with the Children’s Online Privacy Protection Act (“COPPA”).
A separate set of laws protecting health information and personal information exists at the state level. These laws generally provide additional protection for sensitive categories of data, such as behavioral health information, HIV test results, genetic testing and counseling information, and drug and alcohol treatment information. State laws impose additional compliance requirements, and they may overlap or conflict with HIPAA. Where a state law is more stringent than HIPAA, it is not preempted, and the developer must meet the stricter standard in addition to its HIPAA obligations. Some states, including California, impose requirements that must be specifically addressed in the app’s privacy policy, discussed below. The variety of state laws and the complexity of determining which ones apply make careful planning critical for app developers seeking to ensure compliance.
Privacy Policies
Aside from HIPAA, developers should make sure that their app is accompanied by a clear and easily readable privacy policy describing the developer’s uses and disclosures of information, including PHI. Keep in mind that a privacy policy is a binding document, and the FTC has brought enforcement actions against many companies for failing to live up to the promises made in these policies. Developers should also revisit the privacy policy whenever the app’s functionality is expanded or changed, as this can require revisions to the policy. If revisions are made, the developer should retain prior versions of the policy in case there is a dispute about which policy was in effect at a given time.
Final Thoughts
Addressing privacy and data security issues during development will pay dividends down the road and help the developer reduce financial and reputational risk. This requires a thorough examination of the source and type of data involved in the app, as well as the developer’s relationships with parties upstream and downstream. If HIPAA applies, developers should conduct the required Security Rule analyses and carefully negotiate business associate agreements with their covered entities and subcontractors. Developers should also take advantage of the increasing amount of guidance and tools provided by the federal government. In 2013, the FTC published its Mobile App Marketing Guidance, which is a good resource for developers as they move forward. Last May, the FTC also published “App Developers: Start with Security,” which contains important considerations regarding app security. And last but not least, OCR has released a platform that gives developers (and any other interested stakeholders) a sounding board to ask questions and voice concerns about how HIPAA applies to app developers.
Part 1 - What You Need to Know
Part 2 - Protecting Your Intellectual Property
Part 3 - What You Need to Know About FDA’s Regulation of Mobile Apps
Part 4 - Avoiding an FTC Enforcement Action
Part 5 - Contracting for Health App Construction
Part 6 - HIPAA and Other Privacy and Security Considerations