This week, it was reported that at a meeting with Microsoft, Culture Secretary Nadine Dorries asked: ‘When are you getting rid of algorithms?’ Some questions are better left unanswered.
She has now unveiled the Online Safety Bill, a mammoth piece of legislation. To be honest, I didn’t hold out much hope that this would be anything but a mess, even before that algorithm gaffe. Every activist’s gripe about Facebook, Google, and let’s face it, the internet itself seems to have been tacked on to the bill – and it shows. The bill clocks in at a hefty 255 pages.
The sheer length of the legislation means we won’t fully understand all of its implications for months. And even then, the bill gives Ofcom so much power and discretion that we won’t really know how it will be applied until it becomes law. (By the way, did you know that Denmark abolished its internet regulator in 2011? Just an idea….) And the legislation covers so many bases that it is difficult to know where to start. But let’s try anyway and look at what the bill means for free speech.
In fairness to the Government, some steps have been taken to address criticisms that the bill is a ‘censor’s charter’.
Although the law will still require platforms to address ‘legal, but harmful’ content, the agreed categories will now be determined by Parliament in secondary legislation. This is designed to stop risk-averse platforms from taking down all content out of fear it will be deemed harmful.
There are also new protections for journalism and ‘democratic public debate’. In fact, not only will journalism be exempted from the new regulations, but as the Telegraph reports ‘any tech firms that remove journalists’ articles will have to notify the writer in advance, say why they are doing it, give them a right of appeal and leave it online until the dispute is resolved’. This creates a strange situation where journalists have more rights than the rest of us.
The reality is that all of these protections for free speech are necessary because the bill is fundamentally anti-free speech. But the protections will be insufficient for three reasons.
First, it is all well and good saying that Parliament will constrain what is classified as ‘legal but harmful’ content, but the reality is that as governments change there will be continual feature creep. Labour has already criticised the bill because it doesn’t require platforms to take down misinformation. Let’s face it, the list will keep growing as moral panics continue.
Second, platforms are fallible. The Government has deluded itself into believing that the law will be implemented perfectly. The reality is that platforms will be forced to rely on a mix of technology and human moderation, and that sometimes they will take down legitimate content. I know this because Twitter once froze my account for posting a joke mocking anti-vax misinformation. Algorithms can do amazing things, but detecting sarcasm on the internet isn’t one of them.
Third, as the IEA’s Matthew Lesh points out: ‘The duty to protect free speech and privacy is limited. There is only a requirement to ‘have regard’. That’s very weak compared to the substantive safety duties.’
It gets worse too.
Take the requirement that all in-scope services must produce annual transparency reports and risk assessments (on child protection, illegal content and free expression). It will essentially drop a mountain of red tape on the UK’s tech startups.
Then there’s the new duty to protect children, which risks requiring every website that hosts adult content – that’s Twitter and Reddit, not just PornHub – to require ID verification. That will not only raise major privacy and cyber-security issues, but also inconvenience millions of law-abiding adults.
Unelected Ofcom officials will also gain powers to require social media platforms to use proactive technology. At the moment, platforms only have an obligation to take down illegal content if it’s reported to them. Now they can be required to actively scan everyone’s posts, private or otherwise, to hunt down offending content. Privacy be damned.
And this is just scratching the surface of what’s wrong with the bill. I suspect we’ll still be discovering problems for weeks. The Online Safety Bill is, in short, a disaster. If the Government wants Britain to be a place to build a new tech business, then it needs to kill this bill.
In theory, leaving the European Union was meant to free us to diverge from some of the bloc’s anti-tech instincts. Yet the most significant divergence to date will be replacing the relatively pro-innovation e-Commerce Directive and its sensible liability protections with a bill that’s as badly drafted as the Cookies Directive, if not worse. At least with the Cookies Directive you can avoid the annoying mandated pop-ups with a nifty browser plugin.
It was only a few months ago that the Government was announcing plans to slash red tape and was begging business for ideas. Now, it seems the Government’s confused its ambition of being the best regulated economy in the world with being the most over-regulated economy in the world.