A new proposal to restrict children's access to social media is gaining support from key European Union members. If it becomes law, it would put the EU at the forefront of online child safety and establish an unprecedented regulatory framework for digital platforms.
Push for a Digital Age Limit
The "digital adulthood" initiative, led by Greece with backing from France and Spain, would set an EU-wide standard. Under the plan, children below a threshold age, expected to fall in the 15-16 range, would need parental authorization before opening accounts on platforms such as TikTok, Instagram, and Snapchat.
The move echoes measures taken elsewhere, most notably Australia, which passed legislation at the end of 2024 setting a minimum age of 16 for social media accounts.
According to a leaked document published by POLITICO, the proposal will form the basis for discussion among EU digital ministers at their Council meeting in June. It seeks to create common rules across the European Union to protect minors from addictive algorithms and other online dangers.
Growing Concern Over Online Risks
Supporters of the proposal say children across Europe are spending increasing amounts of time online, often exposed to risks ranging from screen addiction to harmful content. Advocates argue that tech companies have not done enough to shield younger users and that action at the EU level is now necessary.
“Protecting our children online will be a key priority for the upcoming Danish EU presidency,” said Caroline Stage Olsen, Denmark’s Minister of Digital Affairs, in a statement. Denmark takes over the rotating EU Council presidency in July and plans to lead efforts in support of the proposal.
Danish Prime Minister Mette Frederiksen has likewise said that access to social media platforms should be restricted for children under 15.
France, Greece, and Spain Lead the Charge
France has been a major driving force behind EU action since passing a national law in 2023 that restricts social media access for children under 15. Though the law has yet to be fully enforced, President Emmanuel Macron has been outspoken in calling for a digital age of majority.
“We must regain control of the lives of our children and teenagers in Europe,” Macron said in April 2024. “Digital majority at age 15, not before.”
His junior digital minister, Clara Chappaz, has been actively rallying support among other EU nations. Along with Greece’s digital minister, Dimitris Papastergiou, and Spain’s Oscar López Agueda, Chappaz co-signed the new EU proposal.
Greece has taken a slightly different approach. While Prime Minister Kyriakos Mitsotakis has said a total ban on children using social media is impractical, his government is pushing for smarter regulation. This includes age verification systems and design changes to reduce the addictive nature of apps.
What the Proposal Includes
The document outlines several broad ideas, such as:
- Creating mandatory age verification systems that work at the device level, not just within apps.
- Requiring parental controls to be built into smartphones, tablets, and computers.
- Establishing “European norms” to curb design features that keep users hooked, such as autoplay, personalization, and pop-up suggestions.
This focus on persuasive architecture, the underlying design elements that encourage excessive use, puts pressure on big tech companies to rethink how their platforms interact with younger users.
The proposal calls for device manufacturers like Apple and Google to take more responsibility, especially in enforcing age checks. However, there is already pushback. Social media companies such as Meta (Instagram and Facebook’s parent), TikTok, and X (formerly Twitter) have advocated for solutions to be implemented either within apps or app stores, not at the device level.
As of now, none of these companies has publicly commented on the new EU proposal.
Trial App and Next Steps
France, Greece, and Spain have also been selected by the European Commission to pilot a new age verification app. The tool is still in development, and full rollout could take time.
EU regulators are already using existing laws like the Digital Services Act to police content and platform behavior. However, this new proposal would significantly expand the scope of regulation, requiring changes not just to how content is moderated, but to how devices and apps are designed for young users.
A Global Debate
The EU's push is part of a broader global trend. Governments around the world are grappling with social media's effects on children, as research links excessive screen time to mental health problems, poorer sleep, and developmental setbacks.
Several US states have proposed or enacted laws requiring parental permission for minors to use certain apps. The UK enforces the Children’s Code, which imposes specific requirements on online platforms likely to be accessed by users under 18.
Now, Europe could go a step further, creating a bloc-wide definition of when a child becomes a “digital adult,” setting clear boundaries for platforms and devices alike.
With the EU presidency changing hands in July, the discussions set for June will be pivotal. The current momentum suggests that Brussels sees online safety for children as a rising political priority, though it remains to be seen whether all member states will back the proposal.