European MEPs Push for Minimum Age 16+ for Digital Access

  • Dec 16, 2025
  • 4 min read
MEPs call for stricter online rules, suggesting social media and AI companions be off-limits to under-16s without parental consent. PHOTO: Max Fischer

A group of Members of the European Parliament (MEPs) is pushing for stricter online protections for minors, calling for social media, video-sharing platforms and AI companions to be off-limits to anyone under 16 unless parents give explicit consent. A non-legislative report on the protection of minors online, adopted on 26 November 2025 by 483 votes in favour, 92 against, and 86 abstentions, highlights concerns over addictive features, manipulative algorithms and exposure to age-inappropriate content for children.

According to the report, 97% of young people go online daily, and 78% of 13- to 17-year-olds check their devices at least hourly. One in four minors displays ‘problematic’ smartphone use, mirroring patterns of behavioural addiction, while over 90% of Europeans view stronger child protection online as a pressing issue.

Rapporteur Christel Schaldemose (S&D, Denmark) said during the debate: “I am proud of this parliament, that we can stand together in protecting minors online. Together with strong, consistent enforcement of the Digital Services Act, these measures will dramatically raise the level of protection for children. We are finally drawing a line. We are saying clearly to platforms: your services are not designed for children. And the experiment ends here.”


Key Measures Proposed


The report outlines several measures aimed at improving online safety for minors:

  • Minimum age requirements: A default minimum age of 16, with parental consent exceptions for 13- to 16-year-olds.

  • Age verification and parental controls: Platforms must implement accurate, privacy-preserving systems to prevent exposure to harmful content. These measures complement ongoing EU initiatives, such as the European digital identity (eID) wallet.

  • Bans on addictive features: Harmful features such as infinite scrolling, autoplay, pull-to-refresh, reward loops, and gamified engagement mechanics would be restricted for minors.

  • Algorithmic transparency: Engagement-based recommendation systems targeting minors would be prohibited.

  • Gaming safeguards: Loot boxes, in-app currencies, fortune wheels, and other gambling-like features in digital games would be banned for underage users.

  • Protection from commercial exploitation: Platforms would be prevented from offering financial incentives to minors acting as influencers, known as “kidfluencing.”

  • AI oversight: Generative AI tools, including deepfake generators, nudity apps, companionship chatbots and AI agents, would face scrutiny to prevent non-consensual image creation, manipulation and other harms.

Industry Responses


Tech companies have shown cautious support for a consistent age-based framework. Meta, in a statement from 3 July 2025, emphasized that “ensuring the safety of young people is a top priority,” supporting an EU-wide Digital Majority Age with parental approval for younger teens. The company highlighted the importance of consistency across platforms, robust age verification, and privacy-preserving measures.


Similarly, Mexedia S.p.A., a tech company listed on Euronext Growth Paris, welcomed the European Parliament’s resolution in a 4 December 2025 press release. Anna Lisa Trulli, Head of Mexedia’s Benefit Unit, said: “The European Parliament’s proposal aims at harmonization, but to become mandatory it will require a binding European law. Following these recommendations is not only about protecting young users but also about building a safer and more inclusive digital environment, where innovation coexists with the protection of rights and citizens’ well-being.”


Vincenzo La Barbera, Communication Officer at Mexedia, added: “European regulatory developments are increasingly impacting the daily lives of users and the relationship between young people and technology. The debate opened by the European Parliament’s resolution will certainly help guide further reflections in the coming months.”



Background and Context


A new European Parliament report highlights risks in online spaces for minors including addictive features, manipulative algorithms and exposure to age-inappropriate content. PHOTO: Karola G

The 26 November 2025 report builds on earlier research including a 4 November 2025 internal committee report and a 2019 study on harmful internet use. It highlights the dual nature of digital engagement: while online technologies provide learning, creativity, self-expression and civic participation, they also carry risks. Addictive designs, exposure to violent or pornographic content, manipulative commercial strategies and AI-driven services can harm minors’ mental and physical health.

The report calls for stronger enforcement of existing EU legislation including the Digital Services Act (DSA), Audiovisual Media Services Directive (AVMSD) and AI Act alongside coordinated EU-wide efforts to ensure long-term protection of minors online. Proposed safeguards include age-appropriate design, parental control tools, and bans on dark patterns and harmful gamified mechanics. It also emphasizes media and digital literacy, urging national curricula to integrate guidance for teachers and students to navigate online risks responsibly.


Next Steps


While the report is non-legislative, it sends a strong political signal to both platforms and regulators. Platforms are encouraged to act proactively, while member states may consider complementary measures. Coordinated action with national authorities, Safer Internet Centres, and consumer protection networks will be crucial to translating the report’s recommendations into meaningful protections for minors.
By highlighting the need for minimum age protections, robust parental controls and regulation of addictive or manipulative features, MEPs are signaling a significant step toward healthier, safer and more accountable digital environments for young Europeans.


Your Thoughts

What do you think? Should social media and digital platforms set stricter minimum ages for teens or are parental controls enough? Share your thoughts with us.
