2023 U.S. Advertising and Privacy Trends and 2024 Forecast: Focus on Kids and Teens
Thursday, February 1, 2024

During 2023, legislative, congressional, and executive actions aimed at protecting children and teens online took center stage. Such actions included: legislative attempts to raise the age of a “child” at both the federal and state levels for advertising and privacy purposes; bans on behavioral advertising targeting minors; efforts to restrict access to social media by minors; First Amendment legal challenges; and the Federal Trade Commission’s (FTC or the Commission) long-awaited proposed changes to the Children’s Online Privacy Protection Act (COPPA) Rule.

Over the last few years, we have reported on regulatory, legal, and voluntary initiatives aimed at expanding protections for minors online, from calls by advocates for the FTC to up-age COPPA, federal and state restrictions on targeted advertising directed at children and teens, and debates about the scope of COPPA preemption, to friction between First Amendment rights and the public policy objectives around protecting minors from certain types of content online. In 2024, we expect that policy and legal actions related to advertising and privacy will continue to focus on protecting children and teens. 

I. Biden Administration’s Focus on Kids and Teens

Protecting children and teens is clearly a priority for the Biden Administration. In his March 1, 2022 State of the Union Address, President Biden declared, “It’s time to strengthen privacy protections, ban targeted advertising to children, [and] demand tech companies stop collecting personal data on our children.” In his February 7, 2023 State of the Union, President Biden again called on legislators to ban behavioral advertising directed towards children and, for the first time, also referenced teens. These calls reignited a debate that most thought had been settled when COPPA was enacted in 1998.

A History Lesson

When it was ultimately adopted, COPPA defined a “child” as an individual under 13, but not without some debate about the appropriate age of a “child” for privacy purposes. That 25-year-old debate is worth reviewing as we consider today’s state of play.

As originally drafted, COPPA defined “children” to include those under 16[1]. Advocacy groups opposed this age cut-off, fearing that requiring parental permission would hinder teens from accessing online information important to their safety, health, and well-being and threaten their privacy rights. The FTC proposed a graded approach, with parental consent requirements for children under 12 and a notice and opt-out mechanism for those ages 12 to 17[2]. Defining all minors under age 16 as children was also inconsistent with longstanding research on children’s cognitive development and their ability to perceive and understand advertising. Age 12 was customarily viewed as the age of a child for purposes of advertising to children, including under the Children’s Television Act of 1990 and the earlier Children’s Advertising Review Unit (CARU) self-regulatory guidelines, which were adopted in 1975[3]. Research published around the time COPPA was drafted reinforced existing child development findings that children develop a more nuanced understanding of persuasive intent around age 12, and the final enacted version of COPPA settled on “under 13” as the age of a “child” for privacy purposes.

Ultimately, as enacted, COPPA applies to operators of websites or online services that (1) direct their services to children under 13 and collect, use, or disclose personal information from them, or (2) have actual knowledge that they are collecting, using, or disclosing personal information from children under 13. Now, it appears we are back to a debate about how to define a “child” for privacy purposes, with concerns about the prevalence of social media in the lives of children and teens creating new questions about inappropriate content and advertising and raising important legal issues.

Social Media Impacts

On May 23, 2023, the U.S. Surgeon General released an Advisory on Social Media and Youth Mental Health that analyzes current evidence on the positive and negative impacts of social media on children and adolescents and calls for more research on the topic. The same day, the Biden administration announced several executive actions intended to increase protections for minors – including teens – online. A new task force was established to produce voluntary guidance, policy recommendations, and a toolkit on safety, health, and privacy-by-design for digital products and services by Spring 2024. As part of this initiative, the U.S. Department of Commerce National Telecommunications and Information Administration (NTIA) published a Request for Comments (RFC) in the Federal Register on October 10, 2023, seeking public feedback on the best ways to protect the mental health, safety, and privacy of minors online, echoing the concerns voiced in the Surgeon General’s advisory.

These ongoing concerns about the well-being of children and teens online have translated into a variety of legislative responses at the federal and state levels, most recently reflected in a combative Senate hearing on January 31, 2024, featuring CEOs from major social media platforms. Whether algorithms are products, and what theories of liability platforms should face for harms to consumers, are among the many thorny issues that will continue to be debated in policy arenas and the courts in the coming months.

II. Federal Legislation

Over the last two years, Congress has proposed a number of bills focused on “children’s” privacy that have garnered bipartisan support. Senator Markey (D-MA), one of the original co-sponsors of COPPA, proposed a new version of COPPA, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), which would raise COPPA’s age threshold from 13 to 16. In line with President Biden’s call for a ban on behavioral advertising aimed at minors, COPPA 2.0 prohibits targeted advertising directed to children under 16. Another bill, the Protecting the Information of our Vulnerable Children and Youth Act (Kids PRIVCY Act), goes a step further than COPPA 2.0 by raising the age threshold to 18 and altering COPPA’s “actual knowledge” standard to cover online services “targeted to or attractive to children.” The Kids Online Safety Act (KOSA) also raises the age threshold to 16. KOSA, which is aimed at large tech companies, requires social media platforms to give “minors” under 16 tools for protecting their personal information and to make proprietary algorithms available to researchers studying harms to the safety and well-being of minors. Other children’s privacy bills, such as the Protecting Kids on Social Media Act, would require social media platforms to verify the age of users, prohibit the use of algorithmic recommendation systems on individuals under age 18, require parental or guardian consent for social media users under age 18, and prohibit users under age 13 from accessing social media platforms.

COPPA 2.0 and KOSA both passed out of the U.S. Senate Committee on Commerce, Science, and Transportation on July 27, 2022, but have stalled since. Senator Maria Cantwell, Chair of the Committee, released a statement on November 15, 2022, saying she would push to get children’s privacy legislation passed before the end of the year, but the session ended with no action.

The much-discussed American Data Privacy and Protection Act (ADPPA), a broad general privacy bill, includes provisions banning covered businesses from directing targeted advertising to minors known to be under age 17 (though certain social media companies or large data holders would be deemed to “know” an individual’s age in more circumstances[4]). The ADPPA would also establish a Youth Privacy and Marketing Division at the FTC. The modified knowledge standard clearly raises a variety of both legal and operational questions.

As achieving consensus on any type of federal legislation has proven challenging, states have proposed or adopted privacy bills to protect children, some setting higher age limits. Key state developments are described below.

III. State Legislation

State privacy laws that address children and teens have generally adopted the COPPA age threshold, requiring parental consent before collecting personal information from children under 13, and many impose a new opt-in requirement before collecting information from teens (usually those under 16), including, specifically, for targeted advertising purposes. The trend started with California’s adoption of a comprehensive privacy law, the California Consumer Privacy Act (CCPA), in 2018. Since then, Colorado, Connecticut, Utah, and Virginia passed laws that took effect in 2023, and California adopted an amendment to the CCPA – the California Privacy Rights Act (CPRA) – which also became effective in 2023. Last year, Delaware, Florida, Indiana, Iowa, Montana, Oregon, Tennessee, and Texas passed comprehensive privacy laws, and New Jersey became the first state to adopt a privacy law this year. Most of these states define a “child” as under 13, and some states consider children’s personal information to be “sensitive information” subject to special processing rules.

Among the most controversial of recent laws governing children’s privacy online is the California Age-Appropriate Design Code Act (CAADCA), which applies to any business that provides an online service, product, or feature “likely to be accessed by children” (defined as anyone under age 18) and directs businesses to (1) consider the best interests of children when designing, developing, and providing such online service, product, or feature and (2) prioritize the privacy, safety, and well-being of children over commercial interests in the event of a conflict between the two. While modeled primarily on the UK Age-Appropriate Design Code, the CAADCA also references the UN Convention on the Rights of the Child, which defines a “child” as under 18. (The age 18 threshold is higher than in other U.S. data privacy laws, but new bills often use it.)

The Florida Digital Bill of Rights, which goes into effect later this year, defines “child” as anyone under 18. It prohibits online platforms that are “predominantly accessed by children” from processing children’s personal information if the platform has actual knowledge of or willfully disregards that the processing may result in substantial harm or privacy risk to children.

As we discuss in the Courts section below, a federal district court issued a preliminary injunction against enforcement of the CAADCA on First Amendment grounds, and First Amendment challenges have also been brought successfully against other state privacy laws.

IV. Federal Regulatory

In the absence of a comprehensive federal privacy law, the FTC is considering expanding regulation of data collection through a rulemaking proceeding, and it has also initiated a variety of important enforcement actions with a focus on children and teens.

A. Rulemakings and Workshops

Advance Notice of Proposed Rulemaking on Commercial Surveillance

On August 11, 2022, the FTC announced a wide-ranging and complex Advance Notice of Proposed Rulemaking (ANPR) to explore potential new rules governing what the FTC characterizes as prevalent “commercial surveillance” and “lax data security practices,” including behavioral advertising to children. The FTC issued the ANPR pursuant to its Section 18 authority under the Magnuson-Moss Act, which authorizes the Commission to promulgate, modify, and repeal rules that define with specificity unfair or deceptive acts or practices within the meaning of Section 5(a)(1) of the FTC Act. (In the 1970s, the FTC proposed to ban advertising to young children under 6 in the infamous “kid-vid” proceeding, which earned the FTC the moniker of “national nanny.” In response, Congress added Section 18(h) to the FTC Act in 1980, which restricted the Commission’s ability to act in that case and also limits its ability to issue rules in “any substantially similar proceeding on the basis of a determination by the Commission that such advertising constitutes an unfair act or practice in or affecting commerce.”)

The numerous questions posed by the ANPR touch on both advertising and privacy issues and include several pertaining to children and teens, such as “to what extent should new trade regulation rules provide teenagers with an erasure mechanism in a similar way that COPPA provides for children under 13?” and “which measures beyond those required under COPPA would best protect children, including teenagers, from harmful commercial surveillance practices?” The vast, unfocused scope of the ANPR implicates virtually all data collection activities, and there are serious questions about whether the ANPR meets the minimum standards for this type of rulemaking.

Stealth Advertising

Following a 2022 workshop on “Stealth Advertising,” in which we took part, the FTC staff published a paper on September 14, 2023, styled as a “staff perspective,” on Protecting Kids from Stealth Advertising in Digital Media. The staff reiterated longstanding recommendations on avoiding the “blurring” of advertising and content, suggested that platforms, content creators, and advertisers consider developing an icon that is easy to see and understand to identify advertising, and identified a role for education (all points discussed in the 2022 workshop).

COPPA Notice of Proposed Rulemaking

Four years after issuing a request for comments on possible updates to COPPA, the FTC released a Notice of Proposed Rulemaking (NPRM) in December 2023 outlining several changes to the COPPA Rule. The FTC declined to adopt a number of amendments that advocates had called for, including changes to the Rule’s age threshold and “actual knowledge” standard, concluding that it lacked the authority to do so. The FTC did, however, propose several changes, including requiring a separate opt-in for targeted advertising.

Key proposals include the following:

  • While COPPA prohibits conditioning a child’s participation on the collection of more personal information than necessary to participate in an online activity, the FTC is considering adding new language to clarify the meaning of “activity.”
  • The FTC elected to retain the important “support for the internal operations” exception to the COPPA notice and consent requirement, but it is proposing to require that parental notices describe the specific internal operations for which the operator has collected a persistent identifier and how operators will ensure that such identifier is not used or disclosed to contact a specific individual, including through targeted advertising.
  • The FTC proposes a new prohibition on using children’s online contact information and persistent identifiers to “nudge” kids to stay online, along with related disclosure obligations.
  • Education Technology (ed tech) was a topic of considerable interest in response to the 2019 request for comments, and the FTC has proposed codifying its current ed tech guidance to prohibit commercial use of children’s information and implement additional safeguards. It would, however, allow schools and school districts to authorize ed tech providers to collect, use, and disclose students’ personal information for a school-authorized educational purpose only, and not for any commercial purpose.
  • Safe harbor programs have garnered criticism, but they are authorized by statute, and the FTC did not propose eliminating them. Instead, the COPPA NPRM imposes new requirements for COPPA Safe Harbor programs, including requiring each program to publicly disclose its membership list and report additional information to the Commission.
  • The FTC proposes to expand information security obligations by requiring operators to establish, implement, and maintain a written children’s personal information security program that contains safeguards appropriate to the sensitivity of the personal information collected from children. Operators would have to designate one or more employees to coordinate the program, perform additional assessments at least annually, and update the program as needed.
  • The COPPA Rule requires that operators retain personal information only for as long as necessary to fulfill the specific purpose for which it was collected, but the FTC proposes to also prohibit operators from using retained information for any secondary purpose. The proposed Rule would also require operators to establish and make public a written data retention policy, and it explicitly states that operators cannot retain the information indefinitely.

Some of these proposals may not be consistent with the FTC’s statutory authority, while others may have significant operational impact on businesses. Comments are due March 11, 2024.

B. Enforcement Actions

The FTC initiated a number of important enforcement actions in 2023 related to minors, including its record-setting settlement with Epic Games. The FTC also brought complaints against Microsoft, Amazon, and ed tech provider Edmodo for violations of COPPA. In total, the Commission imposed $326 million in fines for COPPA violations and, in the case of Epic Games, obtained another $245 million in consumer relief for violations of Section 5 of the FTC Act through deceptive practices.

On December 19, 2022, the FTC brought a complaint in federal court against Epic Games, creator of the popular online video game Fortnite, for multiple violations of COPPA. The FTC alleged that the company failed to protect its underage users’ privacy and engaged in “dark patterns” that duped users of all ages into making unintended purchases. The Commission also claimed that “Fortnite’s unfair default settings have harmed children and teens” by putting minors in direct contact with strangers during gameplay, and that not offering minors a way to opt out before this feature was enabled was an unfair and deceptive trade practice that violated Section 5 of the FTC Act. The Epic settlements are noteworthy not just for the size of the fines and payments obtained by the FTC, but also for the mandatory measures to safeguard the privacy of children and teens, including a requirement for default privacy settings.

The FTC, like lawmakers at the state and federal levels, also turned its attention to social media. On May 3, 2023, the FTC unilaterally announced that it was proposing to modify its privacy order with Meta Platforms Inc. (Meta) (formerly Facebook), seeking to impose a “blanket prohibition” barring Meta and all its related entities from using data collected from children and teens except to provide services or for security purposes. Under the proposed order, Meta “would be prohibited from monetizing children’s data or otherwise using it for commercial gain even after those users turn 18.” Meta continues to oppose the action, seeking intervention by the D.C. Circuit, and has filed a separate action challenging the constitutionality of the agency itself.

V. The Courts

Two lines of litigation are most relevant for those interested in children’s and teens’ privacy and advertising: the Court of Appeals for the Ninth Circuit’s 2023 ruling that COPPA did not preempt a state cause of action, and First Amendment challenges targeting privacy and social media laws. We address both below.

On July 13, 2023, the Ninth Circuit reaffirmed its ruling that COPPA did not preempt state privacy claims. The three-judge panel denied Google’s challenge to the court’s December 28, 2022 decision in Jones v. Google[5], which held that state privacy law claims in a putative class action are not preempted by COPPA. The December decision reversed a lower court’s dismissal of the action on the grounds that COPPA preempted identical state law claims. Google petitioned the Ninth Circuit to have the case reheard by the full court, and the panel asked the FTC to weigh in on the preemption question. In May 2023, the FTC submitted an amicus brief in support of the Ninth Circuit’s finding that COPPA does not preclude identical state law claims. In July 2023, the panel affirmed its December 2022 opinion and amended it to note the FTC’s support.

Regarding First Amendment challenges, the tech group NetChoice filed a lawsuit on December 14, 2022, challenging the CAADCA on multiple constitutional grounds, including violation of the First Amendment. NetChoice argued that the CAADCA was a prior restraint on speech and unconstitutionally overbroad, and that the Act impermissibly regulated protected expression and failed even the Central Hudson legal test for assessing the constitutionality of restrictions on commercial speech. NetChoice also claimed that the CAADCA violated the California constitution and was preempted by COPPA and Section 230 of the Communications Decency Act. On February 17, 2023, NetChoice sought a preliminary injunction to enjoin the law. The U.S. District Court for the Northern District of California agreed with NetChoice and, on September 8, 2023, granted a preliminary injunction, finding that the CAADCA, which was slated to take effect on July 1, 2024, likely violated the First Amendment.

While this decision is a victory for online businesses, the reach of the CAADCA does not stop with California; already, several states – Connecticut, Maryland, Minnesota, Nevada, New Jersey, New Mexico, and Oregon – have introduced legislation modeled on the CAADCA. The CAADCA’s broad, vague language is also manifest in some of the federal privacy bills that were recently introduced. KOSA includes similarly vague language of the sort challenged by NetChoice: it applies to “covered platforms” “likely to be accessed” by users 16 or younger and requires them to “act in the best interest” of children using their site and “protect minors from online harms.” This obligation includes a duty to prevent and mitigate heightened risks of harms that may arise from using the platform, in language similar to the CAADCA’s. On November 28, 2022, 90 non-profit organizations, including the ACLU, GLAAD, and a number of internet advocacy groups, wrote to Congress expressing concern that KOSA was not only unconstitutional in its breadth but would also harm young people by restricting their access to certain online content.

Social media platforms have found themselves in the crosshairs of state enforcement and legislative actions. Montana became the first state to ban TikTok outright (Utah banned TikTok on state-owned devices in 2022)[6]. Then, on October 24, 2023, a bipartisan group of state attorneys general (AGs) filed a federal lawsuit against Meta and related Meta entities, and nine AGs filed complaints in their own states. The complaints allege violations of COPPA and other laws stemming from allegedly harmful design features and practices that, the complaints claim, contribute to body dysmorphia, sadness, suicidal thoughts, and other mental health harms. The suit charges Meta with routinely harvesting personal information of children without verifiable parental consent as required by COPPA and deliberately misleading the public about the harms to minors caused by the company’s business practices. The complaint alleges that those business practices target minors and encourage harmful and addictive behavior, drawing from testimony offered last year by a Meta whistleblower who identified infinite scrolling, auto-play, likes, and algorithmic design features, among others, as features intended to keep children and teens engaged and on the platform.

An earlier class action suit filed in the Northern District of California took aim more broadly at Meta, Snapchat, TikTok, and YouTube, alleging that the platforms endanger the mental health of minors through “addictive and dangerous social media products” and arguing that the platforms should be held liable on strict liability and negligence grounds for physical and mental harms to minors. These alleged harms are central to ongoing policy debates on the regulation of social media and the protection of minors.

VI. Conclusion: Balancing Public Interest in Protecting Minors with Constitutional Rights

Enhancing protections for children and teens online has been a focus of legislative, congressional, and executive actions, as well as enforcement actions and lawsuits, in recent years, and we expect the trend to continue during 2024. States are likely to continue to put forward new children’s privacy legislation that expands the definition of “children” to all minors[7]. California legislators have already introduced two such bills: (1) the Social Media Addiction Law (SB 976), which is aimed at curbing social media addiction among children and would prohibit companies from showing minors “addictive feeds” unless they first obtain parental consent or have reasonably determined the user is not under 18, and (2) the California Children’s Data Privacy Act (AB 1949), which would prohibit businesses from collecting, using, sharing, or selling personal data of anyone under the age of 18 absent informed consent (and, in the case of a user under 13, consent from a parent) unless such uses are strictly necessary for business purposes.

Recent legal challenges, debates about the scope of COPPA preemption, and the friction between First Amendment rights and the public policy objective of protecting minors from certain content online have influenced the legislative landscape, and that continues to be a trend to watch as more legislation is introduced at the federal and state levels.

It is clear that children's advertising and privacy policy developments and litigation are having an impact well beyond the online services that are considered “directed to children.” It will be important for all businesses to consider weighing in with comments to the FTC on the proposed COPPA changes, and to monitor and evaluate how the various federal and state policy developments discussed above may affect their businesses and operations.


[1] 144 Cong. Rec. S8482 (daily ed. July 17, 1998).

[2] Prepared Statement of the Federal Trade Commission On Protection of Children's Privacy on the World Wide Web, September 22, 1998.

[3] CARU, “Self-Regulatory Guidelines for Children’s Advertising,” Council of Better Business Bureaus, 1975.

[4] Congressional Research Service, “Overview of the American Data Privacy and Protection Act, H.R. 8152,” August 31, 2022.

[5] Jones v. Google, No. 21-16281, 56 F.4th 735 (9th Cir. Dec. 28, 2022). 

[6] On November 30, 2023, a federal judge issued a preliminary injunction against Montana’s ban on constitutional grounds.

[7] In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, MDL No. 3047, Case No. 4:22-md-03047-YGR (N.D. Cal. Oct. 6, 2022).
