Ofcom Warns Social Media Platforms to Improve Child Safety or Face Hefty Fines

Credit: BBC

Social media giants such as Facebook, Instagram, and WhatsApp are under scrutiny as UK communications regulator Ofcom prepares to enforce the Online Safety Act, set to take effect next year. Ofcom’s chief executive, Dame Melanie Dawes, has emphasized that it is the responsibility of tech companies—not parents or children—to ensure online safety for young users. Firms failing to comply with the new legislation could face fines of up to 10% of their global revenue and even risk losing access to the UK market.

The Online Safety Act requires social media platforms to protect children from exposure to harmful content, including self-harm material, pornography, and violent content. Companies will have three months after the finalization of Ofcom’s guidance to complete risk assessments and implement necessary safety measures.

Dame Melanie underscored the need for transparency, urging social media firms to be clear about the content they expose users to. “If we don’t think they’ve done that job well enough, we can take enforcement action against that failure,” she stated.

Bereaved Parents Demand Faster Action

While Ofcom prepares for enforcement, some parents remain frustrated at the pace of change. Ellen Roome, whose son Jools Sweeney died in 2022, believes he may have lost his life after participating in an online challenge. Now a member of Bereaved Parents for Online Safety, Roome criticized the lack of urgency from both social media platforms and Ofcom. “They don’t seem to be doing enough to protect children from harmful content,” she said.

In response to increasing concerns, platforms like Instagram are beginning to roll out new safety features, such as controls to prevent “sextortion.” However, activists and parents argue that stricter age verification and quicker enforcement are still needed.

A Landmark Step Toward Online Accountability

With the Online Safety Act, Ofcom is aiming to hold social media companies accountable for the content shared on their platforms. This includes potential changes to allow users to leave group chats without notification, providing greater control over their online interactions.

Dr. Lucie Moore, chief executive of the Centre to End All Sexual Exploitation (CEASE), welcomed Dame Melanie’s commitment but voiced disappointment at the absence of clear age-verification guidelines in Ofcom’s plans. As the law comes into force, Ofcom has indicated it is “ready to go,” with Dame Melanie stating that the regulator expects “very significant changes” from tech companies to create a safer online environment for all, particularly vulnerable young users.

Alistair Thompson

Alistair Thompson is the Director of Team Britannia PR and a journalist.