Plan to make big tech remove harmful content axed

Controversial measures which would have forced big technology platforms to take down legal but harmful material have been axed from the Online Safety Bill.

Critics of the section in the bill claimed it posed a risk to free speech.

Culture Secretary Michelle Donelan denied weakening laws protecting social media users and said adults would have more control over what they saw online.

The bill – which aims to police the internet – is intended to become law in the UK before MPs break next summer.

It previously included a section which required “the largest, highest-risk platforms” to tackle some legal but harmful material accessed by adults.

It meant that the likes of Facebook, Instagram and YouTube would have been tasked with preventing people from being exposed to content such as self-harm, eating disorder and misogynistic posts.

Instead, tech giants will be told to introduce a system allowing users more control to filter out harmful content they do not want to see.

Ms Donelan told the BBC the bill was not being watered down – and that tech companies had the expertise to protect people online.

“These are massive, massive corporations that have the money, the knowhow and the tech to be able to adhere to this,” she said.

She warned that those who did not comply would face significant fines and “huge reputational damage”.

Some critics of the provision in the bill have argued it opened the door for technology companies to censor legal speech.

It was “legislating for hurt feelings”, former Conservative leadership candidate Kemi Badenoch said.

And in July, nine senior Conservatives, including former ministers Lord Frost, David Davis and Steve Baker, who has since returned to the government, wrote a letter to then Culture Secretary Nadine Dorries, saying the provision could be used to clamp down on free speech by a future Labour government.

Adults will be able to access and post anything legal, provided a platform’s terms of service allow it – although children must still be protected from viewing harmful material.

Mr Davis told the BBC he was glad the legal but harmful duties had been taken out of the bill, but said he still had other “serious worries” about threats to privacy and freedom of expression which could “undermine end-to-end encryption”.

In some scenarios the bill permits the government to direct companies to use technology to examine private messages.

“I urge the government to accept the amendments in my name to fix these technology notices so that they no longer pose a threat to encryption, which we all rely on to keep safe online,” he said.

Lucy Powell MP, Labour’s Shadow Culture Secretary, criticised the decision to remove obligations over “legal but harmful” material.

She said it gave a “free pass to abusers and takes the public for a ride”, and that it was “a major weakening, not strengthening, of the Bill”.

But Ms Donelan told BBC News the revised bill offered “a triple shield of protection – so it’s certainly not weaker in any sense”.

This requires platforms to:

  • remove illegal content
  • remove material that violates their terms and conditions
  • give users controls to help them avoid seeing certain types of content, to be specified by the bill

This could include content promoting eating disorders or inciting hate on the basis of race, ethnicity, sexual orientation or gender reassignment – although there will be exemptions to allow legitimate debate.

But the first two parts of the triple shield were already included in the draft bill.

NEWS ANALYSIS

Analysis by Angus Crawford, news correspondent

At its heart this complicated bill has a simple aim: those things that are criminal or unacceptable in real life should be treated the same online.

But that means reining in the power of the big tech companies and bringing an end to the era of self-regulation.

Getting the bill this far has been a complex balancing act. Dropping the need to define what counts as “legal but harmful” content may have satisfied free speech advocates.

Including new criminal offences around encouraging self-harm or sharing deepfake porn could feel like a win for campaigners.

But it won’t satisfy everyone – the Samaritans, for example, don’t feel it adequately protects adults from harmful material.

The Molly Rose Foundation, set up by Molly Russell’s family, believes the bill has been watered down. It’s not about freedom of speech, it said in a statement, but about the freedom to live.

And there’s much about the bill that is still unclear.


Campaign group the Centre for Countering Digital Hate (CCDH) said platforms might feel “off the hook” because of the new focus on user controls “in place of active duties to deal with bad actors and dangerous content”.

Elon Musk’s takeover of Twitter indicated tough rules were needed, it said. Twitter recently reinstated a number of banned accounts, including that of Ye, formerly known as Kanye West, which had been suspended over antisemitic posts.

But CCDH chief executive Imran Ahmed added it was welcome the government “had strengthened the law against encouragement of self-harm and distribution of intimate images without consent”.

It was recently announced that the encouragement of self-harm would be prohibited in the update to the Online Safety Bill.


Other changes will require technology companies to assess and publish the risk of potential harm to children on their sites.

Companies must also explain how they will enforce age limits – knowing users’ age will be a key part in preventing children seeing certain types of content.

And users’ accounts must not be removed unless they have broken the law or the site’s rules.

Dr Monica Horten, a tech policy expert at the Open Rights Group, said the bill lacked detail on how companies would know the age of their users.

“Companies are likely to use AI systems analysing biometric data including head and hand measurements, and voices,” she said.

“This is a recipe for a gated internet, currently subject to minimal regulation and run by third-party private operators.”

Much of the enforcement of the new law will be by communications and media regulator Ofcom, which will be able to fine companies up to 10% of their worldwide revenue.

It must now consult the victims’ commissioner, the domestic-abuse commissioner and the children’s commissioner when drawing up the codes technology companies must follow.



Author: Ayuure Atafori
