Media and privacy regulators in the United Kingdom have urged major social media companies to take stronger action to prevent children from accessing their platforms, warning that many firms are not properly enforcing their own minimum age requirements.
The government is considering stricter limits on children’s use of social media, including a potential ban on users under 16—similar to measures adopted in Australia. Regulators said they are increasingly concerned about algorithm-driven feeds that may expose young users to harmful or addictive material.
Ofcom and the Information Commissioner’s Office said platforms must prioritise children’s safety in their design and operations. Ofcom chief executive Melanie Dawes warned that companies must act quickly or face regulatory enforcement.
As part of the latest phase of the Online Safety Act, Ofcom has asked major platforms—including Meta Platforms’ Facebook and Instagram, Roblox Corporation’s Roblox, Snap Inc.’s Snapchat, ByteDance’s TikTok, and Alphabet’s YouTube—to explain by April 30 how they will strengthen age verification, limit contact between children and strangers, make algorithmic feeds safer, and stop testing new products on minors.
The Information Commissioner’s Office also called on the same companies to adopt modern age-assurance technologies to prevent users under 13 from accessing services not designed for them. ICO chief executive Paul Arnold said the availability of advanced technology means companies have no excuse for failing to implement stronger safeguards.
Meta said it already uses artificial intelligence to detect and estimate users’ ages and automatically places teenagers in accounts with built-in protections. The company also argued that age verification should be handled centrally through app stores to avoid families repeatedly sharing personal information.
YouTube said it already provides age-appropriate experiences and expressed surprise at Ofcom’s shift away from a risk-based regulatory approach, suggesting the regulator should focus on platforms that pose the highest risks.
Roblox said it had introduced more than 140 new safety features in the past year, including mandatory age verification for chat functions aimed at preventing adults from communicating with children.
Regulators have the power to impose significant penalties. Ofcom can fine companies up to 10% of their qualifying global revenue, while the Information Commissioner’s Office can issue fines of up to 4% of worldwide annual turnover. The privacy regulator recently fined Reddit £14.5 million for failing to introduce effective age checks and for unlawfully processing children’s data.