EU Parliament Says Children Should Be At Least 16 to Access Social Media


Representatives in the European Parliament are calling for ambitious EU action to protect minors online. This includes an EU-wide minimum age of 16 and bans on the most harmful addictive practices.

MEPs adopted a non-legislative report expressing deep concern over the physical and mental health risks minors face online, and calling for stronger protection against manipulative strategies that fuel addiction and undermine children’s ability to concentrate and engage healthily with online content. To help parents manage their children’s digital presence and ensure age-appropriate online engagement, Parliament proposes a harmonised EU digital minimum age of 16 for access to social media, video-sharing platforms, and AI companions, while allowing 13- to 16-year-olds access with parental consent.

Expressing support for the European Commission’s work to develop an EU age-verification app and the European digital identity (eID) wallet, MEPs insist that age-assurance systems must be accurate and preserve minors’ privacy. Such systems, they add, do not relieve platforms of their responsibility to ensure their products are safe and age-appropriate by design. To incentivise compliance with the EU’s Digital Services Act (DSA) and other relevant laws, MEPs suggest that senior managers could be made personally liable in cases of serious and persistent non-compliance, particularly regarding the protection of minors and age verification.

Parliament is also calling for a ban on the most harmful addictive practices and the disabling by default of other addictive features for minors (including infinite scrolling, autoplay, pull-to-refresh, reward loops, and harmful gamification); a ban on sites that do not comply with EU rules; and action under the forthcoming Digital Fairness Act to tackle persuasive technologies such as targeted ads, influencer marketing, addictive design, and dark patterns.

MEPs also call for a ban on engagement-based recommendation systems for minors; application of DSA rules to online video platforms and the outlawing of loot boxes and other randomised gaming features (in-app currencies, fortune wheels, pay-to-progress mechanics); protection of minors from commercial exploitation, including by prohibiting platforms from offering financial incentives for kidfluencing; and urgent action to address the ethical and legal challenges posed by generative AI tools, including deepfakes, companionship chatbots, AI agents, and AI-powered nudity apps.