We have previously reported on the requirements of the California Age-Appropriate Design Code Act (CAADCA or the Act), including its mandatory risk assessments, and on the federal District Court's injunction of the Act as a likely violation of publishers' free speech rights under the First Amendment of the U.S. Constitution. The 9th Circuit has upheld that decision, but only as to Data Protection Impact Assessments (DPIAs), and has gone further, finding that such assessments are subject to strict scrutiny and are facially unconstitutional. See NetChoice, LLC v. Bonta, Attorney General of the State of California (9th Cir., August 16, 2024); a copy of the opinion is here. The Court, however, overruled the District Court as to the injunction of other provisions of the CAADCA, such as restrictions on the collection, use, and sale of minors' personal data and on how data practices are communicated. Today, we focus on what the decision means for DPIA requirements under consumer protection laws, including the 18 (out of 20) state consumer privacy laws that mandate DPIAs for certain "high-risk" processing activities.
The 9th Circuit reached its conclusions by finding first that a "DPIA report requirement clearly compels speech by requiring covered businesses to opine on potential harm to children…. [and] it is well established that the forced disclosure of information, even purely commercial information, triggers First Amendment scrutiny." It then found that "in every circumstance in which a covered business creates a DPIA report for a particular service, the business must ask whether the new service may lead to children viewing or receiving harmful or potentially harmful materials[,]" justifying a facial rather than an as-applied challenge. The court then applied strict scrutiny (the standard for restrictions on non-commercial/editorial/expressive speech), rather than intermediate scrutiny (the standard generally applied to commercial speech), because "[t]he DPIA report requirement – in requiring covered businesses to opine on and mitigate the risk that children are exposed to harmful content online – regulates more than mere commercial speech." Applying the strict scrutiny requirement that the regulation of protected non-commercial speech be the least restrictive means of achieving an assumed compelling interest in protecting children from harmful material, the Court held that "a disclosure regime that requires the forced creation and disclosure of highly subjective opinions about content-related harms to children is unnecessary for fostering a proactive environment in which companies, the State, and the general public work to protect children's safety online. For instance, the State could have developed a disclosure regime that defined data management practices and product design without reference to whether children would be exposed to harmful or potentially harmful content or proxies for content. Instead, the State attempts to indirectly censor the material available to children online…."
So, what then about the DPIA requirement as applied to data processing activities that do not affect what types of content are restricted or available for viewing via the business? While not directly at issue in NetChoice, the Court addressed an argument by an amicus party "that the district court's striking down of the DPIA report requirements in the CAADCA necessarily threatens the same requirement in the CCPA (California's consumer privacy law) [and other US privacy laws]." After first finding in dicta that the mandatory consumer rights statistics reporting requirement imposed on large-volume personal information processing businesses under CCPA regulations is a mere "obligation to collect, retain and disclose purely factual information" and is a "far cry" from the CAADCA's "particular focus on whether [online services] may result in children witnessing harmful or potentially harmful content online," the court referenced CA Civ. Code Section 1798.185(a)(15)(B), which provides:
(15) … requiring businesses whose processing of consumers' personal information presents significant risk to consumers' privacy or security, to: … (B) submit to the California Privacy Protection Agency on a regular basis a risk assessment with respect to their processing of personal information, including whether the processing involves sensitive personal information, and identifying and weighing the benefits resulting from the processing to the business, the consumer, other stakeholders, and the public, against the potential risks to the rights of the consumer associated with such processing, with the goal of restricting or prohibiting such processing if the risks to privacy of the consumer outweigh the benefits resulting from processing to the consumer, the business, other stakeholders, and the public….
Again in dicta, but without directly discussing the specific requirements of Section 1798.185(a)(15)(B) or the very complex DPIA requirements in the draft regulations implementing that section, the Court wrote:
A DPIA report requirement that compels businesses to measure and disclose to the government certain types of risks created by their services might not be a problem. The problem here is that the risk that businesses must measure and disclose to the government [is] that children will be exposed to disfavored speech online. Accordingly, [Amicus'] concern that the district court's ruling necessarily threatens other DPIA schemes throughout the country, is misguided.
The Court also looked at the CAADCA's prohibition on dark patterns, which are practices deemed to frustrate consumer notice and choice, and rejected a facial challenge, finding that "it is far from certain that such a ban should be scrutinized as a content-based restriction, as opposed to a content-neutral regulation of expression." It reached a similar conclusion as to the CAADCA's mandate for clear, conspicuous, and easily understandable privacy notices and service terms and policies, finding those requirements to be of a factual nature, which triggers deferential review.
The 9th Circuit was careful to restrict its holding to DPIAs that require content evaluation and judgment and censorship of governmentally disfavored content. However, the door is left open for challenges to more traditional DPIAs that require the evaluation of high-risk data processing activities, even if that involves restrictions on the creation and dissemination of information based on risk and data subject impact, and documentation thereof. Such a requirement would seem to go beyond the purely factual and uncontroversial deferential review standard that the Court suggests should apply to the CCPA's statistical reporting and privacy policy requirements. Assuming that intermediate rather than strict scrutiny would apply to DPIAs not affecting content publication choices, a state defending its law would need to establish that general data practice risk assessment requirements (1) directly advance a substantial government interest (e.g., protecting the privacy of consumers), and (2) are not more extensive than necessary to serve that interest. While the Court suggests that such assessments "might not be a problem" and that its, and the underlying District Court's, decisions in NetChoice "do not necessarily threaten[] other DPIA schemes throughout the country," that issue remains to be resolved, almost certainly on an as-applied basis.
In the meantime, 18 of the 20 state consumer privacy laws (all but Iowa and Utah) require that completed DPIAs of “high-risk” data practices be made available for inspection, with some sort of yet-to-be-determined filing system to be required in California. Minnesota goes even further and requires a written privacy program designed to comply with all aspects of its privacy law and documentation of data inventories.