Last month, the Federal Trade Commission (“FTC”) hosted its annual PrivacyCon event, featuring an array of experts discussing the latest in privacy and data security research. This post, covering healthcare privacy issues, is the first in a two-part series on PrivacyCon’s key takeaways for healthcare organizations. The second post will cover topics on artificial intelligence in healthcare.
In the healthcare privacy segment of the event, the FTC shined a spotlight on three privacy research projects that focused on: (1) tracking technology use by healthcare providers;[1] (2) women’s privacy concerns in the post-Roe era;[2] and (3) the bias that can be propagated through large language models (“LLMs”).[3] Here are the key takeaways.
In light of newly published guidance from the Office for Civil Rights (“OCR”)[4] and increased FTC enforcement,[5] healthcare stakeholders are more aware than ever that certain tracking technologies can monitor a user’s activity and collect user data from apps, websites, and related platforms, and that these technologies can reveal insights about an individual’s personal health status. According to the panel, roughly 90-99% of hospital websites have some form of tracking, which can include monitoring how far down the page someone scrolled, what links they clicked on, and even what forms they filled out.[6] This can reveal very personal information, such as treatment sought or health concerns, and it may be exploited in harmful ways. The presenters highlighted some examples:
- A person viewing information on dementia treatment may be flagged as potentially vulnerable to scams or phishing schemes.[7]
- Period tracking data can reveal if and when a user becomes pregnant and any early termination of such pregnancy. This data could potentially be used in investigations and related prosecutions where abortion treatment is criminalized.[8]
Additionally, despite the high personal stakes involved, healthcare data privacy concerns are simply not on people’s radars. In fact, even after the overturn of Roe v. Wade, users of period tracking apps remained largely unaware of these concerns despite the increased risks related to storing period and intimacy-related information.[9]
Lastly, the panel highlighted the ways in which bias in LLM training can lead to biased healthcare. To train an LLM, the model is fed large data sets, primarily extensive batches of textual information, and then rewarded for generating correct predictions based on that information. This means that the LLM may propagate the biases of its source material. In the case of healthcare, models are primarily trained with internet and textbook sources, some of which contain racial bias and debunked race-based medicine.[10] As a result, LLMs have been found to perpetuate racist tropes, including false assertions of biological differences between races, such as in lung capacity or pain threshold. This means medical centers and clinicians must exercise extreme caution in the use of LLMs for medical decision making and should not rely on LLMs for researching patient treatment.
Ultimately, these presentations highlight a common theme for all platforms interacting with healthcare data – transparency is key. Use of LLMs should be accompanied by transparent disclosures of the potential biases and related risks. Websites and apps need to have clear and transparent policies around how user data is being collected and used. As seen with its latest guidance released in March, the OCR is prioritizing compliance with the HIPAA Security Rule as it relates to tracking technologies.[11] Regulated entities should only use protected health information (“PHI”) collected by tracking technologies in accordance with the Health Insurance Portability and Accountability Act (“HIPAA”),[12] which involves, in part, first identifying whether any PHI is involved and then ensuring that any uses and disclosures are permitted by the Privacy Rule (neither of which is often straightforward). For transparency purposes, regulated entities should identify tracking technology use in their privacy policies and notices. Any entity interacting with healthcare data in the digital space should ensure that its data protection policies comply with applicable state and federal law, including HIPAA and FTC rules,[13] and develop transparent and accurate privacy notices for users.
FOOTNOTES
[1] Ari B. Friedman, Hospital Website Privacy Policies, March 6, 2024 (hereinafter Hospital Website Privacy Policies).
[2] Hiba Laabadli, Women’s Privacy Concerns Towards Period-Tracking Apps in Post-Roe v. Wade Era, March 6, 2024, “I Deleted It After the Overturn of Roe v. Wade’’: Understanding Women’s Privacy Concerns Toward Period-Tracking Apps in the Post Roe v. Wade Era (ftc.gov) (hereinafter Period-Tracking Apps in Post-Roe).
[3] Jesutofunmi Omiye, MD, MS, How LLMs Can Propagate Race-Based Medicine, March 6, 2024, Beyond the hype: large language models propagate race-based medicine (ftc.gov) (hereinafter How LLMs Can Propagate Race-Based Medicine).
[4] See OCR Guidance, March 18, 2024, Use of Online Tracking Technologies by HIPAA Covered Entities and Business Associates | HHS.gov (hereinafter March OCR Guidance).
[5] See Web Tracking Creates a Web of Data Privacy Risks | Healthcare Law Blog.
[6] Hospital Website Privacy Policies.
[7] Id.
[8] Period-Tracking Apps in Post-Roe.
[9] Id.
[10] See How LLMs Can Propagate Race-Based Medicine.
[11] March OCR Guidance. For additional information on past OCR guidance, see OCR Releases Guidance on Use of Tracking Technologies | Healthcare Law Blog. See also Caught in the Web: Hospital Associations Sue OCR on Third-Party Web Tracking Guidance | Healthcare Law Blog.
[12] Id.
[13] See also FTC Proposes Changes to Health Breach Notification Rule Clarifying Application to Health and Wellness Apps | Healthcare Law Blog.