The Online Safety Bill

MPs are coming together to shape the Online Safety Bill. The bill will examine the often harmful content found within online journalism and what constitutes online journalistic content. It will also impose a duty of care on large tech companies and consider the inevitable consequences for the public’s freedom of speech.


The Online Safety Bill will undoubtedly set the tone for global social media regulation. The biggest companies in the tech industry, most notably Facebook and Google, will be watching closely to see how this landmark legislation could impact their industry.


The bill predominantly covers tech firms that allow their users to post their own, unregulated content online. This includes platforms such as Facebook, Instagram, Twitter, Snapchat, YouTube and TikTok, many of which are used daily by billions of active users. The changes the bill will bring to these social media platforms aren’t set in stone, but they will undeniably be revolutionary.


Ultimately, the bill solidifies companies’ duty of care to their users. This covers multiple areas: preventing illegal content, content related to terrorism and content encouraging hate crimes. Unfortunately, our social media channels are filled with misinformation, which can expose both adults and children to harmful propaganda, illegal content and racial abuse. Tech companies will now be responsible for protecting their users from exposure to this content.


However, this creates complications with our human right to freedom of speech. The bill takes this into consideration: companies will be required to put ‘safeguards for freedom of expression’ into place. As a result, the moderators of our content will face difficult decisions on what to allow. As users, we will be able to appeal the removal of content if we believe it infringes our right to free expression. The issue of bias is also prevalent: moderators will not be permitted to favour particular content based on their political views. This has raised numerous concerns over the clarity of the bill and how it can be implemented effectively. The Online Safety Bill will inevitably encounter challenges. Companies will be required to report on how they are maintaining their users’ freedom of expression, as the government is insisting content must not be ‘over-removed’.


To enforce the bill, a system of fines will be introduced. Companies that fail to maintain their duty of care will face fines of up to £18 million or 10% of their annual turnover. This formidable approach comes after recent whistleblower revelations that Facebook chose to prioritise profit over user safety, with accusations that its platforms have led teenagers to eating disorder content and fuelled violence within Ethiopia.


A revision of the Online Safety Bill has been requested to more fully encompass crimes against women and girls, an area previously felt to be lacking. The original draft of the bill failed to mention violence against women and girls explicitly, a misstep that charities including Refuge have insisted be corrected.

Eleanor Wadley

Eleanor Wadley has extensive experience writing in a wide range of fields, including travel and lifestyle, tourism, and educational content for charities. She kicked off her career with a first-class degree in Illustration and Creative and Professional Writing from the University of Worcester.