STATE & LOCAL LAWS & REGULATION
CPPA Issues Data Minimization Enforcement Advisory
The California Privacy Protection Agency (“CPPA”), the agency charged with enforcing the California Consumer Privacy Act as amended by the California Privacy Rights Act (“CPRA”), has released its first-ever enforcement advisory focused on how data minimization principles apply to the processing of consumers’ CPRA requests. The advisory explains that the CPRA requires businesses to collect consumers’ personal information only to the extent that it is relevant and limited to what is necessary in relation to the purposes for which it is being collected, used, and shared. The advisory highlights that data minimization is required when responding to consumer requests under the CPRA’s implementing regulations. For requests to opt out of the sale/sharing of personal information or to limit the use/disclosure of sensitive personal information, businesses are prohibited from requiring consumers to create an account or provide additional information beyond what is necessary. Data minimization also applies when verifying a consumer’s identity before responding to requests to know, delete, and correct.
Maryland Passes Comprehensive Data Privacy Legislation
Maryland joined the list of states that have adopted comprehensive privacy laws with the passage of the Maryland Online Data Privacy Act of 2024 (“MODPA”). MODPA applies to persons that conduct business in Maryland or provide products or services that are targeted to Maryland residents and, during the preceding calendar year, either controlled or processed the personal data of at least: (1) 35,000 consumers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction; or (2) 10,000 consumers and derived more than 20 percent of its gross revenue from the sale of personal data. Financial institutions and data subject to the Gramm-Leach-Bliley Act are exempted, and MODPA does not apply to individuals acting in a commercial or employment context. MODPA does not contain exemptions for nonprofits or institutions of higher education. Protected Health Information (“PHI”) under the Health Insurance Portability and Accountability Act (“HIPAA”) is exempted, but there is no exemption for entities subject to HIPAA. MODPA provides for consumer data rights and requires controllers to provide an appropriate notice describing data processing activities. MODPA also requires data protection assessments in certain circumstances, restricts processing and sale of the personal data of minors, prohibits the sale of sensitive data, and imposes heightened data minimization requirements for sensitive personal data. MODPA will be enforced by the Maryland Attorney General. MODPA takes effect on October 1, 2025, but requirements relating to personal data processing will apply starting on April 1, 2026.
Kentucky Passes Comprehensive Privacy Law
The Kentucky Legislature has passed the Kentucky Consumer Data Protection Act (“KCDPA”), which will take effect on January 1, 2026. The KCDPA largely mirrors its predecessors in its applicability thresholds, which are based on the volume of Kentucky consumers’ personal information controlled or processed, and in the information and entity exemptions it provides. The KCDPA affords Kentucky consumers (i.e., those not acting in a commercial or employment context) rights to access, delete, correct, and obtain a copy of the consumer’s personal information, and to opt out of the processing of personal information for purposes of targeted advertising, sale, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer. Notably, the definition of “sale” is limited to the exchange of personal information for monetary consideration. Sensitive data may also not be processed without the consumer’s opt-in consent. The KCDPA is solely enforceable by Kentucky’s Attorney General and provides for a 30-day cure period.
Nebraska Enacts Comprehensive Privacy Law
Nebraska has enacted the Nebraska Data Privacy Act (“NDPA”), which will take effect on January 1, 2025. The NDPA most closely follows the Texas Data Privacy and Security Act and applies to entities that conduct business in Nebraska or produce a product or service consumed by Nebraska residents and that: (1) process or sell personal data; and (2) are not a small business under the federal Small Business Act, except if such entity engages in the sale of sensitive data without receiving prior consent from the Nebraska consumer. However, like its predecessors, the NDPA provides for various information and entity exemptions and provides Nebraska residents with the rights to access, delete, correct, and obtain a copy of the consumer’s personal information, and to opt out of the processing of personal information for purposes of targeted advertising, sale, or profiling. The NDPA is solely enforceable by Nebraska’s Attorney General and provides for a 30-day cure period.
Florida Legislature Passes Cybersecurity Incident Liability Act
The Florida legislature passed the Cybersecurity Incident Liability Act, HB 473, designed to protect companies that suffer data breaches from lawsuits where they comply with Florida’s breach notification statute and maintain a cybersecurity program that meets relevant industry standards. The bill currently awaits signature from Florida governor Ron DeSantis. Florida’s law follows in the wake of similar bills in Ohio, Utah, and Connecticut, which provide certain liability protections for companies that suffer a breach despite implementing appropriate security controls. However, Florida’s HB 473 is arguably more protective of businesses than its predecessors, providing broader immunities and benchmarking against a broad array of industry standards, including those published by the National Institute of Standards and Technology or those designed to comply with U.S. federal laws, such as HIPAA or the Gramm-Leach-Bliley Act (“GLBA”).
Florida Releases Proposed Rules for Digital Bill of Rights
Florida has released proposed rules for its Digital Bill of Rights, which was passed into law in June 2023. The Digital Bill of Rights, which will become effective on July 1, 2024, is a comprehensive data privacy law regulating certain businesses that conduct operations within the State of Florida. The proposed rules include definitions of “Authorized Persons” and “Authorized Users,” and outline the requirements for a controller covered by the Digital Bill of Rights. In addition, the proposed rules include definitions and regulations related to enforcement, including how consumers can file a complaint with the Florida Department of Legal Affairs. Finally, the proposed rules include standards for authenticated consumer requests. Under these standards, a controller must use a commercially reasonable method to authenticate a consumer requesting to exercise their rights under the Digital Bill of Rights.
Maryland Legislature Passes Maryland Kids Code
The Maryland General Assembly passed the Maryland Kids Code (“MKC”), which provides certain protections regarding the use of children’s personal data. Under the MKC, for-profit companies doing business in Maryland are subject to the law if they have annual gross revenues of at least $25 million and process the personal data of 50,000 or more consumers or derive at least 50 percent of their revenue from the sale of consumers’ personal data. The MKC requires covered companies to prepare a Data Protection Impact Assessment (“DPIA”) if they provide an online product that is reasonably likely to be accessed by children. The DPIA contains numerous requirements, including identifying the purpose of the online product, how it uses personal data, and determining whether the product is designed in a manner consistent with the best interests of children. The MKC will take effect on October 1, 2024, and will be enforced by the Maryland Division of Consumer Protection of the Office of the Attorney General.
Colorado Regulates Privacy of Neural Data
Colorado has passed an amendment to the Colorado Privacy Act (“CPA”) to regulate the privacy of neural data. HB 24 amends the CPA to add “biological data” to the definition of sensitive data. “Biological data” is defined as data generated by the technological processing, measurement, or analysis of an individual’s biological, genetic, biochemical, physiological, or neural properties, compositions, or activities, or of an individual’s body or bodily functions, which data is used or intended to be used, singly or in combination with other personal data, for identification purposes. Biological data includes “neural data,” which is defined as “information that is generated by the measurement of the activity of an individual’s central or peripheral nervous systems and that can be processed by or with the assistance of a device.” For an in-depth look at Colorado’s new law, please see our article in Blank Rome's Biometric Privacy Insider blog.
FEDERAL LAWS & REGULATION
New Bipartisan Federal Privacy Bill Unveiled
Two members of Congress released a draft bipartisan federal privacy bill named the American Privacy Rights Act (“APRA”). The proposed Act includes various requirements, including data minimization obligations and consumers’ rights to opt out of targeted advertising and to view, correct, export, or delete their data. The APRA also contains a section focused on civil rights, barring companies from using individuals’ personal information to discriminate against them. While the APRA would preempt state privacy laws, it would not preempt state laws that regulate civil rights, consumer protection, contracting, and other categories. The Act would be enforced by the Federal Trade Commission and state attorneys general, and also includes a private right of action allowing individual citizens to sue. The APRA is expected to be introduced and marked up in committee before going to the House and Senate for a vote.
OCR Issues New Rule on Reproductive Health Care Privacy
The U.S. Department of Health and Human Services Office for Civil Rights (“OCR”) announced a Final Rule entitled HIPAA Privacy Rule to Support Reproductive Health Care Privacy (“Final Rule”). The Final Rule prohibits the disclosure of PHI related to lawful reproductive health care in certain circumstances. Specifically, the Final Rule (1) prohibits the use or disclosure of PHI when it is sought to investigate or impose liability on individuals, health care providers, or others who seek, obtain, provide, or facilitate reproductive health care that is lawful under the circumstances in which such health care is provided, or to identify persons for such activities; (2) requires a regulated health care provider, health plan, clearinghouse, or their business associates to obtain a signed attestation that certain requests for PHI potentially related to reproductive health care are not for these prohibited purposes; and (3) requires regulated health care providers, health plans, and clearinghouses to modify their Notice of Privacy Practices to support reproductive health care privacy.
President Biden Signs Omnibus Foreign Aid Package with Laws Prohibiting Transfer of Data to Foreign Adversaries and TikTok Ban
President Joe Biden signed an omnibus foreign aid package that includes the Protecting Americans’ Data from Foreign Adversaries Act as well as legislation that could ban TikTok if the social media platform’s owner, ByteDance, does not sell it within nine months. The Protecting Americans’ Data from Foreign Adversaries Act makes it unlawful for a data broker to provide access to the “personally identifiable sensitive data” of any U.S. person to an entity controlled by a foreign adversary, including any company with more than 20 percent ownership by an entity established in a covered country. Categories of sensitive data include government identification numbers, health data, financial data, biometric information, genetic information, precise geolocation information, and private communications, among other information. Under the law, data brokers are any entity that, for valuable consideration, sells, licenses, rents, trades, transfers, releases, discloses, provides access to, or otherwise makes available data of U.S. individuals that the entity did not collect directly from such individuals to another entity that is not acting as a service provider.
FTC Updates Health Breach Notification Rule
The Federal Trade Commission (“FTC”) finalized changes to the Health Breach Notification Rule (“HBNR”). The HBNR requires vendors of personal health records (“PHR”) and related entities that are not covered by HIPAA to notify individuals, the FTC, and, in some cases, the media of a breach of unsecured personally identifiable health data. The changes revise definitions to underscore the HBNR’s applicability to health apps and similar technologies not covered by HIPAA, including modifying the definition of “PHR identifiable health information” and adding two new definitions for “covered health care provider” and “health care services or supplies.” The changes also attempt to clarify the definition of “breach of security,” revise the definition of “PHR related entity,” expand the content that must be provided in notice to consumers, and modify when the FTC must be notified of a breach, among other changes. The final rule will go into effect 60 days after its publication in the Federal Register.
FCC Announces Intent to Issue Broadband Privacy Rules
The Federal Communications Commission (“FCC”) issued a Declaratory Ruling, Report and Order, and Order on Reconsideration regarding Safeguarding and Securing the Open Internet. The Declaratory Ruling would classify broadband internet access as a telecommunications service and mobile broadband as a commercial mobile service, while forbearing from, among other practices, rate regulation and tariffing. Additionally, the Declaratory Ruling would include enhanced cybersecurity, national security, and network resilience and monitoring obligations for service providers. Similarly, the FCC’s Report and Order would reinstate clear rules prohibiting blocking, throttling, or paid prioritization by broadband providers and would impose enhanced transparency requirements on broadband providers. These efforts are designed to “reestablish the FCC’s authority to protect consumers and safeguard the fair and open Internet.” These new and reinstated net-neutrality rules may have a significant impact on a broad range of broadband internet services and internet service providers.
FTC Technology Blog Emphasizes Importance of Systematic Approach to Cyber Risk Management
The FTC Office of Technology published a blog emphasizing the importance of a systemic approach to cybersecurity risk management. In recent years, the FTC has brought a steady stream of enforcement actions against companies that have failed to implement appropriate security measures, such as its action against not only Drizly, but also Drizly’s CEO. The FTC’s blog highlights industry guidance on building and implementing a cybersecurity program as well as key vulnerabilities that companies must take into consideration. While this blog may not have much impact on entities that already have robust cybersecurity frameworks in place, it provides another layer of express support for stakeholders seeking resources to build and implement these necessary programs.
U.S. LITIGATION
Nationwide Optometry Settles for $3.4 Million Over 2021 Data Breach
According to a preliminary approval motion filed in New Jersey federal court, Nationwide Optometry (“Nationwide”) has entered into a settlement to pay $3.4 million to a proposed class of approximately 714,000 patients impacted by a 2021 ransomware attack on network servers belonging to a co-defendant named U.S. Vision. The complaint alleged that U.S. Vision, USV Optical, and Nationwide failed to secure and safeguard their customers’ personal information, including names, birth dates, Social Security numbers, driver’s license numbers, financial account information, health insurance information, and usernames and passwords. The complaint also alleged that U.S. Vision learned about the breach 17 months before alerting consumers, causing customers to spend time, effort, and resources to resolve issues related to the breach.
U.S. ENFORCEMENT
Proposed FTC Order Prohibits Telehealth Firm from Sensitive Data Disclosure and Use
The FTC filed a proposed order against Cerebral, Inc. (“Cerebral”), which provides online mental health and related services. The FTC complaint alleged that Cerebral and its former CEO repeatedly broke their privacy promises to consumers and misled them about the company’s cancellation policies. Specifically, the complaint alleges that Cerebral provided the sensitive information of nearly 3.2 million consumers to various third parties, including LinkedIn, Snapchat, and TikTok. The complaint also alleges that Cerebral violated the Opioid Addiction Recovery Fraud Prevention Act of 2018 (“OARFPA”) by engaging in unfair and deceptive practices regarding substance use disorder treatment services. Under the proposed order, Cerebral would be restricted from using or disclosing sensitive consumer data. Cerebral would also be required to provide consumers a simple mechanism to cancel services and pay over $7 million.
FTC Takes Action Against Online Addiction Treatment Service for Disclosing Data to Third Party Advertising Platforms
The FTC filed a complaint for permanent injunction against Monument, Inc., an online alcohol addiction treatment service. At the root of this complaint, the FTC alleges that Monument disclosed user “personal health data to third-party advertising platforms, including Meta and Google, for advertising without consumer consent, after promising to keep such information confidential.” This action highlights a key risk for businesses with exposure to sensitive health information, even when such information may not constitute Protected Health Information under federal law. Basic identifying information, such as name or contact information, may constitute sensitive health data when disclosed in such a manner that may reveal or imply health information about a consumer. Such entities must be extremely careful about the cookies and related online trackers implemented on their websites, as well as their disclosures surrounding the use and disclosure of information collected through these trackers. This action mirrors the FTC’s 2023 actions against BetterHelp and GoodRx, demonstrating the FTC’s scrutiny of sharing health and other sensitive data with third-party advertising platforms.
OCR Settles with Phoenix Healthcare
OCR announced a settlement with Phoenix Healthcare, an Oklahoma multi-facility nursing care organization, for a potential violation of the HIPAA right of access provision. HIPAA requires covered entities to provide individuals or their personal representatives access to the individual’s protected health information (“PHI”) within 30 days of receiving a request. OCR’s investigation of Phoenix Healthcare involved a daughter, serving as a personal representative for her mother, who was not able to obtain access to her mother’s PHI for nearly one year, despite multiple requests. As part of its settlement with OCR, Phoenix Healthcare has agreed to pay OCR $35,000 in civil monetary penalties and to take certain corrective actions, including revising its HIPAA policies and procedures, providing HIPAA training to its workforce, and entering into business associate agreements with its business associates.
FDIC Enters into Consent Orders with Two Banks for Service Provider Risk Management Failures
The Federal Deposit Insurance Corporation (“FDIC”) entered into consent orders with two separate banks, Piermont Bank and Sutton Bank, over alleged banking violations and failure to manage risks associated with third-party service providers. Neither bank admitted nor denied the factual allegations contained in the relevant consent orders. Looking first to the Sutton Bank consent order, the FDIC focused heavily on alleged anti-money laundering and counter-terrorism financing (“AML”/“CFT”) deficiencies. The consent order mandates enhanced internal governance structures, including third-party risk management and board oversight. Moving to the Piermont Bank consent order, the FDIC alleged that the bank engaged in “unsafe and unsound banking practices relating to, among other things, the failure to have internal controls and information systems appropriate for the size of the Bank and the nature, scope, complexity, and risk of its Third Party Relationships[.]” It also highlighted the board’s alleged failure to oversee and monitor asset growth, management performance, and business arrangements with third parties. These consent orders show the FDIC’s continued focus on third-party risk management and board oversight of financial institutions’ risk assessment and management.
INTERNATIONAL LAWS & REGULATION
U.S. and UK Sign AI Safety Memorandum of Understanding
The United States and UK signed a Memorandum of Understanding (“MOU”) to support the countries working together to develop tests and evaluations for artificial intelligence (“AI”) systems, models, and agents. Under this MOU, the United States and UK AI Safety Institutes have laid out plans to build a common approach to risk management as well as AI testing and safety. These institutes agreed to collaborate on at least one joint testing exercise on a publicly accessible model as an example of this combined approach. Safe and efficient development of AI tools is a key priority of both the United States and the UK, and this MOU is designed to support both information sharing and the establishment of consistent international standards across the Atlantic.
EDPB Issues Opinion on Pay or Consent Model
The European Data Protection Board (“EDPB”) published anticipated guidance regarding the “Pay or Consent” models implemented by very large online platforms, such as Facebook, for processing personal information of EU citizens. Companies have implemented these pay or consent models to establish a valid basis for processing under the EU’s relatively stringent General Data Protection Regulation (“GDPR”) and ePrivacy Directive. However, for consent to form a valid basis for processing under the GDPR, it must be freely given. In its 42-page opinion, the EDPB clarified that it is “unlikely” that consent captured through pay or consent models would be freely given. While this opinion is limited in scope to “very large online platforms” and so-called “gatekeepers”—platforms with more than 45 million EU-based users per month or with an outsized impact on the EU market—it likely sheds light on the EDPB’s forthcoming guidance for smaller platforms, anticipated before the end of the year.
CNIL Publishes Recommendations on the Development of Artificial Intelligence Systems
The French data protection authority, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), published guidance on AI system development. The recommendations apply to the development of AI systems involving the processing of personal data, which will need to comply with the EU GDPR and the EU AI Act in Europe. The guidance recommends several steps in the development and deployment of AI systems, specifically: (1) defining the purpose of the AI system; (2) determining responsibilities under the GDPR with respect to the system (i.e., controller or processor); (3) defining the legal basis that allows the processing of personal data; (4) if the AI system requires re-use of previously collected personal data, reviewing whether re-use of that personal data is lawful; (5) minimizing personal data; (6) determining a retention period for personal data that is processed; and (7) conducting a data protection impact assessment. The CNIL provides recommendations for putting each step into practice.
Daniel R. Saeedi, Rachel L. Schaller, Ana Tagvoryan, Timothy W. Dickens, Gabrielle N. Ganze, Jason C. Hirsch, Amanda M. Noonan, and Karen H. Shin contributed to this report.