Britain's media and privacy regulators on Thursday demanded that major social media platforms do more to keep children off their services, warning that companies were failing to enforce their own minimum age rules.
Britain has been weighing tougher curbs on children's access to social media, with the government considering barring under-16s from such platforms, mirroring a move by Australia.
Ofcom and the Information Commissioner's Office said they had grown increasingly concerned about algorithmic feeds that expose children to harmful or addictive content.
"These online services are household names, but they're failing to put children's safety at the heart of their products," Melanie Dawes, Ofcom's chief executive, said.
"That must now change quickly, or Ofcom will act."
USE 'MODERN' TECH, COMPANIES TOLD
In the latest implementation phase of Britain's Online Safety Act, Ofcom told Facebook and Instagram - both owned by Meta - as well as Roblox, Snapchat, ByteDance's TikTok and Alphabet's YouTube to show by April 30 how they would tighten age checks, restrict strangers from contacting children, make feeds safer and stop testing new products on minors.
The ICO separately issued an open letter to the same platforms, calling on them to adopt "modern, viable" age-assurance tools to stop those under 13 accessing services not designed for them.
"There's now modern technology at your fingertips, so there is no excuse," Paul Arnold, the ICO's chief executive, said.
A Meta spokesperson said the company already uses AI-based age detection and age-estimation tools and places teens in accounts with built-in protections. The spokesperson added that age should be verified "centrally at the app store level" so families do not have to provide personal information multiple times.
A YouTube spokesperson said the platform also offered age-appropriate experiences and was "surprised to see Ofcom move away from a risk-based approach", urging the regulator to focus on "high-risk services" that were failing to comply with the law.
A Roblox spokesperson said the company had launched more than 140 new safety features in the past year, including mandatory age checks for chat, designed to prevent adults from communicating with children.
"While no system is ever perfect, we continue to strengthen protections designed to keep players safe," the spokesperson said.
Snapchat did not respond to a request for comment. TikTok declined to comment.
Ofcom can fine companies up to 10% of their qualifying global revenue, while the ICO can issue fines of up to 4% of a company's global annual turnover.
The privacy watchdog last month fined Reddit nearly 14.5 million pounds for failing to introduce meaningful age checks and for processing children's data unlawfully. ($1 = 0.7439 pounds)