Online Safety Bill must safeguard journalistic content – UK press body

Additional safeguards to protect journalists and freedom of expression are needed in the Government’s proposals for new internet safety laws, an industry body for the UK press has said.

The Online Safety Bill, which has been in progress for around five years, will establish Ofcom as the regulator for the sector, with the power to fine companies or block access to sites that fail to comply with the new rules.

The Society of Editors has welcomed assurances by the Government that it intends to strengthen protections for journalists and freedom of expression.

It comes after Culture Secretary Nadine Dorries said journalistic content would be protected “providing it’s legal”.

Executive director Dawn Alford said: “Since the publication of the Online Harms White Paper in 2019, the Society has campaigned for a broad and workable exemption for journalistic content to be included on the face of the Bill.

“The Society thanks the Government for listening to the concerns of the industry as well as the recommendations of two committees that expertly scrutinised the Bill.

“As recognised by the Culture Secretary, the Bill does not, in its present form, do enough to protect legitimate journalistic content and further amendments must be added as a matter of priority if the government is to fulfil its manifesto pledge of defending freedom of expression.

“What we now need to see is additional safeguards to protect journalistic content from take-down by broad-brush algorithms and it is essential that any appeals process reflects the fast-paced nature of news.

“While we welcome the Government’s ‘every intention’ of further improving the requirements to safeguard journalistic content, what this must translate into is robust and workable protections that work for the industry as a whole and that are published and open to scrutiny at the earliest opportunity.”

The revised Bill has changed its approach to “legal but harmful” content – material which is not itself illegal but could cause harm to users who encounter it.

It means the biggest social media platforms must address this content and carry out risk assessments on the types of harm that could appear on their services and how they plan to address them.

Asked about concerns over free speech under the new rules, Ms Dorries said: “If those platforms remove something – they have to notify that journalist that they are about to remove that content, they have to say why, and they give the journalist the right to appeal – and the content remains online while that happens, so they have a process now.”