On July 14, 2025, the European Commission published guidelines on the protection of minors under the Digital Services Act (the “DSA”) (the “Guidelines”). The European Commission considers the Guidelines a significant milestone in bolstering online safety for children and young people in the EU. They include a non-exhaustive list of measures aimed at addressing risks such as grooming, harmful content, cyberbullying, and exploitative commercial practices. Compliance with the Guidelines is voluntary, but the European Commission will use them as a benchmark when assessing compliance with Article 28(1) of the DSA.
Key recommendations in the Guidelines include:
- Age assurance: Implementing robust, reliable, accurate, non-discriminatory, and non-intrusive age verification methods to restrict access to age-inappropriate content and/or to comply with national restrictions that set a minimum age for accessing certain services. Once EU Digital Identity Wallets become available, they will provide a reference standard for device-based age verification.
- Privacy settings: Setting minors’ accounts to private by default to mitigate risks to their privacy, safety, and security, such as the risk of unsolicited contact.
- Content protection: Restricting users’ ability to download and screenshot minors’ posts to prevent sexual extortion and the distribution of intimate content.
- Feature restrictions: Disabling features that encourage excessive use, such as communication “streaks,” “read receipts,” autoplay, and push notifications, and safeguarding against manipulative design elements.
- Safeguarding AI features: Using child-friendly language and mechanisms to warn minors that interactions with an AI feature differ from human interactions, ensuring AI chatbots are not displayed prominently, and allowing minors or their guardians to opt out of their use.
- Content recommender systems: Adjusting algorithms to reduce exposure to harmful content, and enhancing minors’ control over their feeds.
- User interaction controls: Enabling minors to block and mute users easily and requiring consent before adding minors to groups to help prevent cyberbullying and unwanted contact.
- Commercial practices: Protecting minors from exploitative commercial practices, including virtual currencies and loot boxes, by ensuring they are not exposed to harmful, unethical, or unlawful advertising, or to practices that take advantage of their lack of commercial literacy.
- Moderation tools: Enhancing moderation and reporting tools, providing prompt feedback on user reports, and strengthening parental control tools.
The European Commission stressed that the Guidelines take a risk-based approach, recognizing that platforms pose different risks depending on their characteristics. The Guidelines emphasize a safety- and privacy-by-design ethos rooted in children’s rights, ensuring that measures do not disproportionately limit minors’ rights and freedoms.
Read the Guidelines.