New push in Europe to curb children’s social media use


A rising body of evidence has shown the negative effects of social media on children’s mental and physical health.

PHOTO: AFP


From dangerous diet tips and disinformation to cyberbullying and hate speech, the glut of online content harmful to children grows every day. Several European countries have had enough and agree that the European Union should do more to prevent minors from accessing social media.

The EU already has some of the world’s most stringent digital rules to rein in Big Tech, with multiple probes ongoing into how platforms protect children – or fail to do so.

Backed by France and Spain, Greece spearheaded a proposal for how the EU should limit children’s use of online platforms, as a rising body of evidence shows the negative effects of social media on children’s mental and physical health.

They discussed the plan on June 6 with their EU counterparts in Luxembourg to push the idea of setting an age of digital adulthood across the 27-nation bloc, meaning children would not be able to access social media without parental consent.

France, Greece and Denmark believe there should be a ban on social media for under-15s, while Spain has suggested a ban for under-16s.

Australia has passed a ban on social media for under-16s, taking effect later in 2025, while New Zealand and Norway are considering a similar prohibition.

After the day’s talks in Luxembourg, it appeared there was no real appetite at this stage for an EU-wide ban on children under a specific age.

But Danish Digital Minister Caroline Stage Olsen indicated there would be no let-up. “It’s going to be something we’re pushing for,” she said.

Top EU digital official Henna Virkkunen admitted specific age limits would be “challenging” for multiple reasons, including cultural differences in member states and how it would work in practice.

But the European Commission, the EU’s digital watchdog, still intends to launch an age-verification app in July, insisting it can be done without disclosing personal details.

‘Very big step’

The EU in May published non-binding draft guidelines for platforms to protect minors – including setting children’s accounts to private by default and making it easier to block and mute users – to be finalised once a public consultation ends in June.

French Digital Minister Clara Chappaz said it would be “a very big step” if the EU made platforms check the real age of their users, as theoretically required under current regulation.

The worry is that children as young as seven or eight can easily create an account on social media platforms despite a minimum age of 13, by giving a false date of birth.

“If we all agree as Europeans to say this needs to stop, there needs to be a proper age verification scheme, then it means that children below 13 won’t be able to access the platform,” Ms Chappaz said.

France has led the way in cracking down on platforms, passing a 2023 law requiring them to obtain parental consent for users under the age of 15.

But the measure has not received the EU green light it needs to come into force.

France also gradually introduced requirements in 2025 for all adult websites to have users confirm their age to prevent children accessing porn – with three major platforms going dark this week in anger over the move.

TikTok, also under pressure from the French government, on June 1 banned the “#SkinnyTok” hashtag, part of a trend promoting extreme thinness on the platform.

In-built age verification

France, Greece and Spain expressed concern about the algorithmic design of digital platforms increasing children’s exposure to addictive and harmful content – with the risk of worsening anxiety, depression and self-esteem issues.

Their proposal – also supported by Cyprus and Slovenia – blames excessive screen time at a young age for hindering the development of minors’ critical-thinking and relationship skills.

They demand “an EU-wide application that supports parental control mechanisms, allows for proper age verification and limits the use of certain applications by minors”.

The goal would be for devices such as smartphones to have in-built age verification.

The EU is clamping down in other ways as well.

It is currently investigating Meta’s Facebook and Instagram as well as TikTok under its mammoth content moderation law, the Digital Services Act, fearing the platforms are failing to do enough to prevent children from accessing harmful content.

And last week, it launched an investigation into four pornographic platforms over suspicions they are failing to stop children from accessing adult content. AFP
