The California Age-Appropriate Design Code Act Enjoined
Saturday, March 9, 2024

On September 18, 2023, NetChoice, LLC, a national trade association whose members come from the tech and social media industries, obtained a preliminary injunction from the U.S. District Court for the Northern District of California preventing the State of California from enforcing the California Age-Appropriate Design Code Act (“AADC” or the “Act”). The reason? The court found that the Act likely violates the First Amendment.

The AADC targets all online products and services that children under the age of 18 are likely to access. Among other things, it requires covered providers to submit Data Protection Impact Assessment (“DPIA”) reports to the California Attorney General identifying (1) each offered service, product, or feature likely to be accessed by children, (2) any risk of “material detriment to children” arising from the provider’s data management practices, and (3) a timed plan to mitigate those risks.

How the AADC Came to Be

With a growing number of children using the Internet, and a rising number of hours spent online, federal and state governments have tried to protect children from the harms of Internet use by limiting the collection and use of their data and by limiting the content to which minors are exposed. While the intentions may be pure, the AADC raises serious constitutional issues related to the law’s compelled speech aspects.

The California legislature attempted to take these protections a few steps further for children under the age of 18 with the AADC. The AADC includes a long list of requirements that online providers must abide by or risk hefty fines. The most important and controversial requirements are discussed in more detail in the following section.

Why the Court Found the AADC Is Likely Unconstitutional

As a threshold matter, the court agreed with NetChoice’s argument that the AADC regulates protected expression. It found that the Act, through prohibitions such as those on collecting, sharing, or selling personal information for age-profiling or other purposes, “limits the ‘availability and use’ of information by certain speakers and for certain purposes and thus regulates protected speech.”[SPB1] Likewise, the court held that the Act’s mandates, such as creating and providing the DPIA report, regulate the distribution of speech because they compel speech. The Act therefore triggers First Amendment protection because it both regulates and compels speech.

The First Amendment allows some regulation of speech depending on the type of speech, with commercial speech generally being subject to greater regulation. Here, NetChoice argued that the Act regulates non-commercial speech, which is subject to strict scrutiny, while the State countered that the speech regulated by the Act “is at most commercial speech”[SPB2] and subject to lower-level scrutiny. The court chose the middle ground and applied intermediate scrutiny, requiring the State to show (1) that the Act directly advances a substantial governmental interest and (2) that the Act is narrowly drawn to achieve that interest. The government’s interest does not have to be advanced by the least restrictive means, but the means must not be “substantially excessive.”[SPB3]

As to the first prong of the intermediate scrutiny test, the court found that the State satisfied its burden: the State has a substantial interest in “protecting the physical, mental, and emotional health and well-being of minors.”[SPB4] The court then examined the means by which California chose to accomplish that goal, analyzing each of the challenged provisions of the Act in turn.

  1. DPIA Report Requirement: The State argued that these extensive and time-consuming reports will require businesses to assess “how their products use children’s data and whether their data management practices or product designs pose risks to children.”[SPB5] But the court did not agree that this requirement adequately addresses the harms identified by the State, largely because the requirement focuses on businesses’ data management practices rather than on product designs. The court explained that the State’s own expert, Dr. Radesky, testified that product designs are what present digital risks to children. The court offered the example of the mobile application Snapchat discontinuing a filter that captured a user’s current speed after the filter was linked to reckless driving accidents involving adolescents. The DPIA report is not designed to capture this type of product design, which can have dangerous, albeit unintentional, effects. Thus, the court held that the State’s goal is not accomplished by this requirement, and it would likely fail intermediate scrutiny.
  2. Age Estimation: The Act requires online providers to estimate the age of child users so that the highest privacy settings are applied when a child visits the site. The court found that this requirement would actually require users to divulge more personal information so that the site could confirm each user’s age. In other words, the requirement might increase the harm (or at least the potential for harm) to minors, and it therefore does not advance California’s legitimate interest and would likely fail intermediate scrutiny.
  3. High Default Privacy Settings: The Act requires a “high” level of privacy settings as the “default” for children visiting a site. High privacy settings can often cause the regulated services to function poorly, so this requirement could lead businesses to prohibit children from using their services and products altogether rather than offer high-level privacy settings. In other words, the danger is that a company would choose to simply ban children rather than risk being forced to offer a poor-quality service. Accordingly, the court found that this restriction would likely fail intermediate scrutiny.
  4. Age-Appropriate Policy Language: The Act requires privacy information, terms of service, policies, and community standards to be written in clear language “suited to the age of children likely to access that online service, product, or feature.”[SPB6] The court was not convinced that this requirement would alleviate any harm caused by the current language of privacy policies or terms of service. That is, the State did not carry its burden of showing that this requirement would further the statute’s goals, and it thus likely fails scrutiny.
  5. Internal Policy Enforcement: The policies required by the Act must, of course, be enforced by the companies themselves. The court was not, however, persuaded of the relationship between a business enforcing its policies and alleviating harm to children’s well-being. Because the State failed to carry its burden of showing a causal relationship between enforcement and furtherance of the statute’s goal, the court found that this requirement likely fails intermediate scrutiny.
  6. Knowingly Harmful Use of Children’s Data: This provision is largely self-explanatory: the Act prohibits a business from using a child’s personal information in any way that is detrimental to the child’s well-being. Because the Act does not define which uses of information may be considered harmful, the court found the provision so vague that it would burden more speech than is necessary to advance the State’s interest in protecting children’s information online, and it therefore likely fails intermediate scrutiny.
  7. Profiling Children by Default: The court accepted the State’s argument that default profiling of a child puts the child in harm’s way by placing them in “target audience categories for products related to harmful content such as smoking, gambling, alcohol, or extreme weight loss,”[SPB7] but it did not agree that this provision is narrowly tailored to address those harms. To the contrary, NetChoice presented evidence of the benefits of profiling and offering targeted content to minors, “particularly those in vulnerable populations”[SSD8] such as LGBTQ+ youth and pregnant teens. The State responded that children can still access, and businesses can still recommend, such content through means other than default profiling. From this concession, the court concluded that the Act does indeed prohibit beneficial profiling and is thus not narrowly tailored enough to be likely to pass scrutiny.
  8. Restriction on Collecting, Selling, Sharing, and Retaining Children’s Data: The court found this prohibition excessive because it would restrict not only harmful content but also beneficial or neutral content. The court cited the same reasons explained directly above in concluding that this provision likely fails intermediate scrutiny: the State’s “laudable goal of protecting children does not permit the government to shield children from harmful content by enacting greatly overinclusive … legislation.”
  9. Unauthorized Use of Children’s Personal Information: Covered businesses cannot use a child’s personal information for any reason other than the one for which it was collected, unless a compelling reason exists. The State did not persuade the court that using a child’s data for multiple purposes is inherently harmful. Accordingly, the court found that this provision likely fails intermediate scrutiny.
  10. Use of Dark Patterns: This provision addresses website and app designs that “‘nudge’ individuals into making certain decisions, such as spending more time on an application.”[SPB9] The State did not meet its burden of proving that this prohibition would benefit all of the children the Act aims to protect, so the court found that NetChoice would likely succeed in showing that the requirement fails commercial speech scrutiny.

Related Cases

Further regulation of online providers is a trend across the country among states and the federal government. The Supreme Court is also set to decide the constitutionality of a Texas social media law that bans social media companies from restricting posts based on political viewpoint and a Florida social media law that prevents platforms from banning political candidates or censoring content from journalistic enterprises.

Implications for the Future

The NetChoice case makes a few things clear. First, state legislatures will need to walk an extremely fine line as they attempt to regulate social media and other companies that collect individuals’ (including minors’) data. They will need to draft legislation that can withstand constitutional scrutiny and that narrowly addresses the stated goals of the law.

Second, online service providers should prepare for the eventual regulation of data management practices for users under 18 years old. While various states have proposed different cutoff ages for “children” (13, 16, 18, etc.), the safest and most conservative approach is to plan on needing special procedures for any user under the age of 18. Proactive steps, such as finding “low friction” ways to identify users’ ages and putting safeguards in place to protect those users, can potentially stave off further regulation and, if regulation comes nonetheless, make compliance that much easier. A rough sketch of this “default to protected” approach appears below.
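The following is a minimal, hypothetical TypeScript sketch of that conservative posture. It is not drawn from the AADC, the opinion, or any real compliance library; every name in it (AgeSignal, PrivacySettings, resolveDefaults) is invented for illustration. The idea is simply to treat any user whose age is unknown or unverified as a potential minor and apply the most protective defaults.

```typescript
// Hypothetical compliance sketch. None of these names come from the AADC
// or any real library; they only illustrate a "default to protected" design.

type AgeSignal =
  | { kind: "verified"; age: number } // age confirmed via some low-friction check
  | { kind: "unknown" };              // user skipped or failed the age prompt

interface PrivacySettings {
  personalizedAds: boolean;
  profileDiscoverable: boolean;
  locationSharing: boolean;
  dataRetentionDays: number;
}

// Most protective configuration, used whenever we cannot rule out
// that the user is a minor (the "plan for under-18" approach above).
const HIGH_PRIVACY: PrivacySettings = {
  personalizedAds: false,
  profileDiscoverable: false,
  locationSharing: false,
  dataRetentionDays: 30,
};

const STANDARD_PRIVACY: PrivacySettings = {
  personalizedAds: true,
  profileDiscoverable: true,
  locationSharing: true,
  dataRetentionDays: 365,
};

function resolveDefaults(signal: AgeSignal): PrivacySettings {
  // Fail closed: only a verified adult reaches the standard configuration;
  // unknown or under-18 users keep the high-privacy defaults.
  if (signal.kind === "verified" && signal.age >= 18) {
    return STANDARD_PRIVACY;
  }
  return HIGH_PRIVACY;
}

// Example: a user who skips the age prompt stays on protective defaults.
console.log(resolveDefaults({ kind: "unknown" }));           // HIGH_PRIVACY
console.log(resolveDefaults({ kind: "verified", age: 34 })); // STANDARD_PRIVACY
```

The design choice worth noting is the fail-closed default: data-heavier settings are reached only by an affirmative, verified adult signal, so a user who declines or skips the age prompt is never silently opted in.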

Third, this case makes clear that what is “safe,” “harmful,” or a “threat” to the well-being of children is not clear to courts, legislatures, or businesses attempting to comply with these laws. While the states and online service providers continue battling it out in court, parents, teachers, and guardians are left to take self-help steps: educating children on safe online practices and responsible data management with free resources and educational materials from government agencies such as the Cybersecurity & Infrastructure Security Agency (CISA), the FBI, and the FTC.

Finally, this case also makes clear that this is far from the last attempt by states (or Congress) to regulate “Big Tech’s” operations. We will continue to cover updates to both this law and other laws regulating online speech.

The case is NetChoice, LLC v. Bonta, Case No. 5:2022cv08861 (N.D. Cal. 2023). On October 18, 2023, the State filed a notice of appeal to the Ninth Circuit Court of Appeals (Case No. 23-2969).


[SPB1] Page 12 of the opinion.

[SPB2] Page 17 of the opinion.

[SPB3] Page 18 of the opinion.

[SPB4] Page 19 of the opinion.

[SPB5] Page 20 of the opinion.

[SPB6] Page 25 of the opinion.

[SPB7] Page 29 of the opinion.

[SSD8] Page 29 of the opinion.

[SPB9] Page 32 of the opinion.
