93 per cent of EU citizens are concerned about children’s mental health, and 92 per cent identify cyberbullying as the primary online threat, according to the State of the Digital Decade Eurobarometer 2025.

Brussels has adopted a tougher stance on online child safety.

Last week, European Commission President Ursula von der Leyen announced that a new “age verification app is technically ready and soon available for citizens to use”.

The system lets citizens prove their age to access online platforms without sharing other personal data.

EU member states are already taking decisive action. France has enacted a ban targeting users under 15, and Spain, Austria, Greece, Ireland, Denmark, and the Netherlands are gearing up to introduce similar rules.

Christel Schaldemose, Member of the Group of the Progressive Alliance of Socialists and Democrats in the European Parliament and rapporteur of the non-legislative report on an EU-wide minimum age for social media, senses hesitation in the Commission’s actions.

“I don’t know if they’re delaying [actions] on purpose, but I think that they are too slow. Like this we end up with a fragmented internal market because so many countries have already suggested an age limit”.

Children and online platforms

Social media has become a pervasive and risky environment for children, mostly due to addictive designs, constant connectivity, heavy personalisation, and AI tools.

In 2022, 96 per cent of 15-year-olds were active on social media, with 37 per cent spending more than three hours a day on these platforms. Female teens were more likely to be heavy users, at 42 per cent compared with 32 per cent of their male counterparts, a 2025 Joint Research Centre (JRC) study found.

Among 9 to 15-year-olds, daily usage frequently hits the three-hour mark; 78 per cent of teens aged 13 to 17 check their devices at least once an hour; a quarter admit to struggling with dysfunctional internet habits, as stated in the European Parliament’s non-legislative report of November 2025.

Almost 99 per cent of teens aged 16-17 participated actively (creating user profiles, posting messages, using Facebook, X, etc) on social media in 2025, according to the Eurobarometer.

For Schaldemose, the Commission’s expert panel on child safety online is a positive first step: the experts’ deep knowledge, she said, will help guide the Commission’s actions.

The JRC warns that uncontrolled social media usage harms children’s mental health, raising depression and anxiety levels. Harmful content, such as violent, sexualised, and pro‑eating‑disorder material, can affect children’s brain development and social behaviours.

60 per cent of young females show depression symptoms compared to 35 per cent of males, and 65 per cent experience anxiety as opposed to 41 per cent of males, the JRC study reveals.

As many internet platforms are primarily targeted at adults, their advertising-driven business models have serious repercussions for younger users, fostering dependence.

36 per cent of adolescents in Europe, Central Asia, and Canada keep constant contact via social media. 11 per cent show problematic social media use, with girls (13 per cent) reporting higher rates than boys (9 per cent), according to a 2024 World Health Organisation (WHO) report.

Bans, a national competence

On April 8, 2026, Greece’s prime minister Kyriakos Mitsotakis announced a social media ban for children under 15. The law, which takes effect in January 2027 and is still pending parliamentary approval, blocks minors from social media accounts, requiring platforms to enforce strict age verification or face financial penalties.

The move was triggered by data showing that 75 per cent of Greek primary school children were active on social media, while about 48 per cent of teenagers reported negative mental health effects.

Public support also surged, with 80 per cent backing a ban after the March 2026 US verdict holding major tech platforms liable for addictive app design. Building on the success of Greece’s 2024 school smartphone ban, the government cited “lifeless” students and sleep deprivation as the restriction’s catalyst.

Greece joins other EU countries: France approved a bill in January 2026 to ban social media for under-15s, citing a “health emergency” and the need to protect minors from cyberbullying and psychological harm. February 2026 saw Spain announce plans for an under-16s ban to “tame the digital Wild West,” while Austria, Denmark, and Slovenia are drafting bans for under-14s, under-15s, and under-15s respectively.

Italy and Ireland are exploring bans for under-15s and under-16s respectively, while Germany and other states debate age limits or “youth versions” of platforms. They are motivated by a surge in mental health issues and a desire to hold tech giants accountable for addictive platform designs, following the precedent set by Australia’s 2025 world-first under-16 ban.

Self-reported birthdates are easy to falsify. Some implementations rely on systems like digital wallets or identity tokens, but the European Commission’s brand-new age verification app “will allow users to prove their age when accessing online platforms, just like shops ask for proof of age for people buying [alcohol]”, according to von der Leyen.

Platforms have the main responsibility, and national regulators enforce compliance through oversight and fines. While EU-wide rules like the Digital Services Act (DSA) and General Data Protection Regulation (GDPR) establish baseline protections for minors, national bans go further by setting strict age limits and increasing accountability for tech companies.

However, opposing political figures, like members of Spain’s Vox party and Italian lawmakers, consider these bans excessive government intervention, arguing that education, parental control, and digital literacy would be more effective than outright restrictions.

This perspective is also shared by consumer rights advocates; Olivia Brown, Policy Officer at the global consumer group Euroconsumers, considers blanket bans a political shortcut that lets platforms off the hook.

“Banning social media doesn’t make the internet safer. It just moves the problem out of sight. What minors need is safety built into platforms by design, real user controls, and algorithms they can shape themselves, not doors that are simply slammed shut, only to swing wide open the moment they turn 18.”

Moves towards EU-wide regulation

The issue is politically sensitive, and an EU-wide ban risks aggravating polarisation. Instead, the Commission is first releasing the age-verification app as a tool for member states to implement their own national bans.

First conceived in 2025, the app is designed as a technical framework that can be integrated into national digital wallets or separate applications to verify users’ ages. To confirm their age, users must download an app, provide consent for data use, scan an identity document (including its chip), and complete facial recognition. This process may need to be repeated regularly, and platforms could require verification each time users access age-restricted services.
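The privacy-preserving idea behind such a system can be illustrated with a minimal sketch: a trusted issuer (simulated here) checks the user’s identity document once and signs a bare “over-18” attestation, which a platform can then verify without ever seeing the birthdate. The function names and the HMAC-based signing below are illustrative assumptions, not the Commission’s actual protocol; a real deployment would use asymmetric keys so platforms cannot forge attestations.

```python
import hmac, hashlib, json
from datetime import date

ISSUER_KEY = b"demo-issuer-secret"  # hypothetical shared secret; real systems would use public-key signatures

def issue_age_attestation(birthdate: date, threshold: int = 18):
    """Simulated wallet/issuer step: check the document once, emit only a yes/no claim."""
    age = (date.today() - birthdate).days // 365
    if age < threshold:
        return None  # no attestation issued for under-age users
    claim = {"over": threshold}  # no birthdate, no name -- just the minimal claim
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verify(attestation: dict) -> bool:
    """Platform step: check the issuer's signature; the platform never learns personal data."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["sig"])

token = issue_age_attestation(date(1990, 5, 1))
print(platform_verify(token))  # prints True: access granted without sharing the birthdate
```

The key design point, matching von der Leyen’s shop analogy, is data minimisation: the platform only ever receives a signed “over the threshold” claim, never the underlying identity document.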

Concerns arise from its complexity, privacy implications, ease of circumvention (such as via VPNs), and fears that it could shift responsibility away from platforms, unlike other EU-wide regulatory tools.

One is the GDPR, adopted in 2016 and applicable from 2018, which set strict rules on children’s data: it fixes the default age of digital consent at 16 (with flexibility for member states to lower it to 13) and requires parental approval for younger users.

The revised Audiovisual Media Services Directive, effective from 2020, introduced age-rating systems and parental controls on streaming platforms, alongside strict bans on harmful content such as child exploitation material. Then, in 2021, the EU launched its broader child online safety strategy, combining funding, research, and voluntary codes of conduct to address risks like grooming and disinformation.

More recently, the Commission proposed practical measures such as private-by-default accounts for minors, and limits on addictive features like autoplay and infinite-scrolling. Parts of the AI Act, with bans active as of February 2025, specifically prohibit systems that use subliminal techniques or exploit children’s vulnerabilities to distort their behaviour. The Digital Fairness Act, expected for a formal proposal in late 2026, will tighten platform design rules by banning “dark patterns” and addictive features like infinite scrolling.

At the centre of this framework is the DSA, a landmark regulation to overhaul online platforms. Proposed by the Commission in 2020, it was agreed by the European Parliament and Council in 2022 and became fully applicable in February 2024 after a phased rollout.

The DSA requires platforms to protect users, with minors as a priority. Its obligations include safer default settings, content moderation, and restrictions on targeted advertising. It also establishes a new enforcement system involving national Digital Services Coordinators and EU-level oversight.

Post-DSA, EU citizens saw more transparency, stronger user rights, and limits on harmful or exploitative practices. Users now have clearer ways to report and appeal content decisions, while minors benefit from stricter privacy protections and reduced exposure to targeted ads.

The impact on digital platforms

Social media age restrictions significantly hit teen reach: businesses lose a major social group that drives online activity, leading to reduced ad impressions and traffic revenue.

Online platforms rely heavily on teens and children for ad revenue, and age limits reduce it by shrinking the pool of young users and making them harder to target with ads.

Compliance costs may rise as companies must enhance age-assurance systems and parental-consent processes. These are costly and complex due to the need for advanced identity verification and data protection technologies.

A ban on addictive design features and engagement algorithms in social media necessitates product redesign, increasing engineering costs and delaying EU market launches. A shift towards safer content may strain budgets.

According to Schaldemose, big companies must develop “new platforms with a completely different business model that protects children”.

Companies can also face stricter legal risks in the event of a rule violation. The Parliament proposed holding platform owners personally accountable for serious and repeated breaches of minor protection provisions.

“They’re the one who make the platforms available. If we agree on an age limit, the responsibility is definitely on the companies in case of violations”, Schaldemose told Euronews.

Europe needs to speed up

For Schaldemose, the Commission is acting too slowly. It announced the panel in September, but it only started working in March.

Some member states are already reportedly opposing the app. “The longer it takes for the Commission to come up with a proposal, the more likely it is we have a fragmented market and loopholes”, she claimed.

“I have become impatient with the Commission. It looks like member states are also a bit impatient because they are also pushing”, Schaldemose said.

Privacy and data sharing issues cannot be an excuse anymore. “In the last two years, we have developed tools that do not compromise personal data and safety”, she added.

The Parliament will continue to push until the Commission finds a reasonable solution. “We need to act at European level, and the Parliament is clear on this”, Schaldemose concluded.
