1 December 2022

Tweaking the Online Safety Bill isn’t enough – it needs to be broken up


At first blush, news that Culture Secretary Michelle Donelan is making a series of changes to the Online Safety Bill sounds extremely welcome. But despite some genuine improvements, be in no doubt that the legislation still poses a significant threat to free speech, privacy, and competition.

Perhaps the most notable of the changes to the Bill is the removal of the so-called ‘legal but harmful’ content clause, which has provoked a great deal of opposition among free speech groups and others. Under the previous version of the bill, large online firms would have had to disclose how they planned to treat ‘legal but harmful’ content.

Now, the bill did not actually require that firms remove such content. Rather, it demanded that firms commit to one or more of a slate of other measures, including restricting access to content, limiting its recommendation or promotion, or, conversely, recommending or promoting it. As I noted in the Centre for Policy Studies (CPS) pointmaker ‘A Censor’s Charter? The case against the Online Safety Bill’:

‘…given that the largest social media firms already take steps to moderate speech that is likely to be included in the ‘legal but harmful’ category (such as misogynistic abuse or content associated with eating disorders) it is unlikely that they will choose to recommend or promote such content. They will therefore be put in a position of pledging to moderate such content and face significant fines if they fail to act on it swiftly.’

To be clear, removing the ‘legal but harmful’ clause is certainly a welcome change. However, the changes announced in the DCMS press release do not remove the obligation on large firms to prevent children from accessing ‘content that is harmful to children’. And because no one needs to prove their age to surf the web, it is impossible for online firms to know which of their users are children and which are adults. Firms will therefore probably continue to restrict user access to ‘legal but harmful’ content in order to comply with the Bill’s child protection provisions.

Donelan also announced that the new version of the legislation would not include the ‘Harmful Communications’ offence, which would have criminalised sending a message that posed ‘a real and substantial risk that it would cause harm to a likely audience’ where the sender of the message ‘intended to cause harm to a likely audience’. The bill defined ‘harm’, in impossibly broad terms, as ‘psychological harm amounting to at least serious distress’. To make things even more confusing, if the ‘likely audience’ was made up of ‘several’ or ‘many’ people, then it would not have mattered whether the original composer of the message intended harm. This would have allowed for the prosecution of the message’s composer in cases where a social media post ‘goes viral’, even if they never intended to cause distress.

The removal of this section is undoubtedly a good first step. However, the Secretary of State also rowed back on plans to scrap Section 127 of the Communications Act, under which those who upload social media posts deemed ‘grossly offensive or of an indecent, obscene or menacing character’ can face jail time. Those convicted under Section 127 include a man who sent an offensive tweet about NHS fundraiser Captain Sir Tom Moore and a YouTuber who filmed himself urging his girlfriend’s dog to perform Nazi salutes in response to commands such as ‘Sieg Heil!’ and ‘Gas the Jews!’.

Grim though both these examples were, a commitment to free speech means tolerating objectionable, offensive, and upsetting speech. The Government has sought time and again to insist that the bill is not a threat to free speech. Yet by abandoning its plans to repeal Section 127 the Government weakens these claims.

The problems with the OSB aren’t just about free speech though. There are also huge concerns about privacy and competition.

For a start, the DCMS’ press release does not even mention encryption. The CPS and many other groups have noted that the OSB poses a significant threat to encrypted messaging services such as WhatsApp, iMessage, Signal, and others. At a minimum, the Government should amend the bill to make clear that it does not mandate the weakening or the banning of end-to-end encryption. If the bill results in online services scrapping encrypted messaging services, the privacy and security of millions of law-abiding British citizens will be put at risk, to say nothing of the serious economic costs of such a measure.

Nor have the changes proposed so far done much to reduce the huge regulatory burden online firms could face. Keeping up with the OSB will mean dedicating significant time and money to compliance. Big Tech giants such as Meta and Alphabet have the resources to deal with a flurry of new rules and regulations, but their smaller competitors and nascent start-ups do not. Indeed, large companies generally favour extra regulation precisely because it creates a ‘moat’ between them and their competitors.

And, as the Institute of Economic Affairs briefing paper on the Online Safety Bill has noted, the Government’s estimated £2.5bn cost of implementation over ten years is likely a significant underestimate. Nothing announced this week suggests that the anti-competitive effect of the OSB is going to be reduced.

Michelle Donelan’s recent announcement does show that ministers are open to changing the bill. However, as it stands this legislation will continue to pose a threat to free speech, privacy, and competition. Rather than trying to deal with every perceived harm in one all-encompassing piece of law, the best approach now would be to tackle the issue of child safety in a separate, specific piece of legislation.



Matthew Feeney is Head of Tech and Innovation at the Centre for Policy Studies.