
The adoption of the guidelines marks a milestone in the Commission’s efforts to boost online safety for children and young people under the DSA. The guidelines set out a non-exhaustive list of proportionate and appropriate measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, cyberbullying, and harmful commercial practices.
The guidelines will apply to all online platforms accessible to minors, with the exception of micro and small enterprises. Key recommendations include the following:
- Setting minors’ accounts to private by default, so that their personal information, data, and social media content are hidden from users they are not connected with, reducing the risk of unsolicited contact by strangers.
- Adjusting platforms’ recommender systems to lower the risk of children encountering harmful content or being drawn into rabbit holes of specific content, including by prioritising explicit signals from children over behavioural signals and by giving children more control over their feeds.
- Empowering children to block and mute any user and ensuring they cannot be added to groups without their explicit consent, which could help prevent cyberbullying.
- Prohibiting accounts from downloading or taking screenshots of content posted by minors to prevent the unwanted distribution of sexualised or intimate content and sexual extortion.
- Disabling by default features that contribute to excessive use, such as communication "streaks", ephemeral content, "read receipts", autoplay, and push notifications; removing persuasive design features aimed predominantly at engagement; and putting safeguards around AI chatbots integrated into online platforms (see the configuration sketch after this list).
- Ensuring that children’s lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative or lead to unwanted spending or addictive behaviours, such as certain virtual currencies or loot boxes.
- Improving content moderation and reporting tools, requiring prompt feedback on reports, and setting minimum requirements for parental control tools.
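Taken together, many of these recommendations amount to choosing protective defaults for a minor’s account. The sketch below shows one way such defaults could be expressed in code; it is illustrative only, and every name in it (MinorAccountDefaults, apply_minor_defaults, the individual settings) is a hypothetical stand-in, not a schema the guidelines prescribe.

```python
from dataclasses import dataclass

@dataclass
class MinorAccountDefaults:
    """Hypothetical safe-by-default settings echoing the guidelines'
    recommendations; field names are illustrative, not prescribed."""
    private_profile: bool = True                 # hidden from unconnected users
    group_add_requires_consent: bool = True      # no silent addition to groups
    block_and_mute_available: bool = True        # children can block/mute anyone
    allow_content_download: bool = False         # no downloads/screenshots of posts
    streaks_enabled: bool = False                # engagement-driving features off
    ephemeral_content_enabled: bool = False
    read_receipts_enabled: bool = False
    autoplay_enabled: bool = False
    push_notifications_enabled: bool = False
    recommender_prefers_explicit_signals: bool = True  # over behavioural signals

def apply_minor_defaults(account: object) -> None:
    """Copy the safe defaults onto a newly registered minor's account
    (``account`` is a stand-in for a platform's own user model)."""
    for name, value in vars(MinorAccountDefaults()).items():
        setattr(account, name, value)
```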
The guidelines also recommend the use of effective age assurance methods, provided that they are accurate, reliable, robust, non-intrusive, and non-discriminatory. In particular, the guidelines recommend age verification methods to restrict access to adult content such as pornography and gambling, or where national rules set a minimum age for access to certain services, such as defined categories of online social media services. The EU Digital Identity Wallets and, until they become available, the age verification blueprint on which applications can be built will provide a compliance example and a reference standard for a device-based method of age verification. The guidelines recommend age estimation in other cases, such as where terms and conditions prescribe a minimum age lower than 18 because of identified risks to minors.
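A device-based method of this kind can be pictured as a data-minimising attestation check: the platform learns only that the user meets the age threshold, not who the user is. The following sketch assumes hypothetical names throughout (AgeAttestation, TRUSTED_ISSUERS, the verification stub) and is not the blueprint’s actual protocol or API.

```python
from dataclasses import dataclass

@dataclass
class AgeAttestation:
    """A signed claim from the user's device or wallet app. It carries only
    the age claim itself, never the user's identity or date of birth."""
    claim: str        # e.g. "age_over_18"
    issuer: str       # identifier of the attestation issuer
    signature: bytes  # issuer's signature over the claim

# Placeholder registry of issuers the platform trusts (hypothetical).
TRUSTED_ISSUERS = {"eu-age-verification-blueprint"}

def signature_is_valid(attestation: AgeAttestation) -> bool:
    """Stand-in for real cryptographic verification against issuer keys."""
    return len(attestation.signature) > 0  # illustrative only

def may_access_adult_content(attestation: AgeAttestation) -> bool:
    """Grant access only on a valid "over 18" proof from a trusted issuer."""
    return (
        attestation.claim == "age_over_18"
        and attestation.issuer in TRUSTED_ISSUERS
        and signature_is_valid(attestation)
    )
```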
Like the DSA, the guidelines adopt a risk-based approach, recognising that online platforms may pose different types of risks to minors depending on a platform’s nature, size, purpose, and user base. The guidelines enshrine a safety- and privacy-by-design approach and are grounded in children’s rights. Platforms should ensure that the measures they take do not disproportionately or unduly restrict children’s rights.
The Commission will use these guidelines to assess compliance with Article 28(1) of the DSA. They will serve as a reference point for assessing whether online platforms accessible to minors meet the required standard of protection and may inform national regulators in their enforcement actions. However, following the guidelines is voluntary and does not automatically guarantee compliance. The guidelines were developed through a comprehensive process including:
- Feedback gathered through a call for evidence
- Stakeholder workshops held in October 2024 and June 2025
- Engagement with experts
- A targeted public consultation
- Engagement with children and young people
- Meetings of the European Board for Digital Services within its working group on the protection of minors