Should kids be on social media? At what age? Should parents monitor their conversations on those platforms? Do parental controls work? These are questions facing many parents and guardians, especially with the increasing use of social media platforms by kids and teens. The Pew Research Center reported that 58% of teens are daily users of TikTok, and 50% of teens use Snapchat and Instagram daily.
With kids using social media so frequently, concern is growing about the platforms’ effects on adolescents. That concern comes not only from parents but also from lawmakers: Congress has held multiple hearings on children’s online safety, but even with bipartisan agreement, drafting and implementing legislation takes time.
There is already a federal law on the books: the Children’s Online Privacy Protection Act (COPPA), which took effect almost 25 years ago. COPPA was designed to protect children’s online privacy by requiring transparency about data collection through privacy-policy disclosures and by requiring verifiable parental consent before an online service collects personal information from children under the age of 13. In practice, most social media companies comply with COPPA by simply banning children under 13 from using their services at all.
However, times have changed, and online privacy is no longer the only concern surrounding children’s use of social media. Today’s worries include cyberbullying, harassment, eating disorders, and suicidal ideation. New York Attorney General Letitia James said, “Young people across our country are struggling, and …addictive social media algorithms are only making th[e] mental health crisis worse.” Attorney General James’s statement came as 42 state Attorneys General sent a letter to Congress this week urging it to require labels on social media platforms warning of the potential risks to children.
This push follows a June New York Times op-ed by U.S. Surgeon General Vivek Murthy, in which he urged lawmakers to require tobacco-style warning labels on social media platforms alerting users that the platforms can harm children’s mental health. The coalition of 42 Attorneys General endorsed Murthy’s plan in its letter to Congress, stating that such a requirement would be only “one consequential step toward mitigating the risk of harm to youth.” The letter further states, “By mandating a surgeon general’s warning on algorithm-driven social media platforms, Congress can help abate this growing crisis and protect future generations of Americans.”
In his op-ed, Murthy cited evidence that adolescents who spend substantial time on social media face a greater risk of anxiety and depression, and that many teens say the sites have worsened their body image. While some have argued that more research is needed before taking action, state and federal officials alike have voiced concern about the dangers that platforms such as Instagram and TikTok can pose to children’s mental health, including exposure to bullying, harassment, illicit drugs, and sexually abusive material. In July, for example, the Senate passed legislation that would require tech companies to take “reasonable” steps to prevent harm to children who use their platforms and would expand existing protections for children’s online data. At the state level, several Attorneys General have sued Meta over its use of addictive design features on Instagram and Facebook.
The push is not without opposition, however: tech industry groups and free-speech advocates have challenged such measures, arguing that government-mandated warnings raise First Amendment concerns.
In their letter to Congress, the state Attorneys General wrote that a social media warning label “would not only highlight the inherent risks that social media platforms presently pose for young people, but also complement other efforts to spur attention, research, and investment into the oversight of social media platforms.” We will surely see more efforts to better protect children online and on social media platforms; as use among kids and teens keeps increasing, legislative efforts are likely to increase, too.