Tragedies make for bad policy. The recent killing of three girls in Southport and the subsequent riots are not exceptions.
Since the attack, riots and violent protests have taken place across England, with many participants chanting anti-Muslim and anti-immigrant slogans. Some rioters attacked hotels housing asylum seekers, street clashes broke out between far-right protesters, British Asian communities, and counter-protesters, and there have been multiple incidents of arson and looting. So far, hundreds of rioters have been arrested and court proceedings have begun.
The turmoil has prompted predictable and unwise calls for more regulation and restriction of online speech. Such calls centre on two ‘justifications’: first, that misinformation about the attacker, including his incorrect naming, spread rapidly online; and second, that social media was used to organise rallies and protests that turned violent on the streets of several towns and cities.
Calls for more regulation focus on the Online Safety Act (OSA), which Parliament passed last year. The law, one of the most ambitious, lengthy, and complex pieces of internet regulation in the world, was heralded by the last government as a means to make the UK “the safest place in the world to be online”. Yet for some, that doesn’t seem to be enough. As awful as events of the last week have been, MPs should be wary of rushing to legislation in response. They risk only making a deeply flawed and anti-liberal approach to the internet much worse.
To be fair, the OSA has yet to be fully implemented, such is its size and complexity. Unsurprisingly, when the government mandates that a regulator draw up a code of practice for a global communications infrastructure used by billions of people and thousands of businesses and other institutions, it takes a while for that regulator to produce guidance that makes sense.
But many commentators do not want to wait. They see speech they do not like and they want it offline yesterday. Environmental activist and journalist George Monbiot took to X (formerly Twitter) to argue for OSA amendments “to tighten the law on incitement to racial hatred and racial violence”. Journalist Paul Mason wants the government to “enact the full Online Safety Act now, no more consultations”, an odd request given that the law requires the consultations Mason would like to steamroll. Nevertheless, Mason’s comments, and his follow-up claim that he’d be happy to see X and Telegram taken down to avoid riots, reveal an eagerness for the government to censor millions of people in the name of safety. He’s hardly alone. Earlier this week the lawyer Jessica Simor took to X and suggested that the prime minister introduce legislation to ban the platform.
Elon Musk didn’t help matters when he posted claims that the UK is on the verge of a civil war and condemned “Two Tier Policing”, as well as the UK’s laws that allow people to be arrested for comments made on social media. But most of the concern about online safety amid the riots centred on the spread of hateful content, misinformation, and disinformation.
Home Secretary Yvette Cooper said earlier this week that a “longer-term debate about the wider legal framework” concerning such content is required. This will come as something of a shock to those who have paid attention to policy debates on online speech over the past few years.
The OSA was the product of years of hearings, debates, whitepapers, consultations, etc. The result of all of this work was a piece of legislation that, when fully in force, will represent the most significant and burdensome piece of internet legislation in the English-speaking world. One wonders what a “wider legal framework” would look like.
One possibility is that the government will pursue legislation that amends the OSA’s “false communications” offence. The offence currently criminalises knowingly sending a false message intended “to cause non-trivial psychological or physical harm to a likely audience”. Much of the content associated with the riots of the past week was no doubt intended to cause harm, but it is less clear that many of the senders knew that the information was false. MPs might also want to look at the OSA’s definition of “Priority content that is harmful to children”, which includes “content which is abusive and which targets” race, religion, sex, sexual orientation, disability, or gender reassignment. Following the events of last week, some MPs might be tempted to add migration or citizenship status to the list.
Such MPs should reconsider.
Content moderation is complex, and very little harmful content is harmful by definition. Footage of rioting, racist comments, and the spread of misinformation and disinformation can be valuable in specific contexts. Police, charities, researchers, and others might wish to share such content in order to educate or warn the public. But most of the popular social media sites host so much content that they rely on AI-based content moderation tools to take down content that violates their rules. At scale, this inevitably leads to many false positives. The longer the list of content social media sites are expected to tackle, the longer the list of false positives will be.
That would be an unwelcome outcome. First, such a move would remove valuable content from the eyes of law-abiding citizens and residents and stifle important debates about ongoing social issues. Second, it would be unlikely to work, instead pushing those most intent on violence onto platforms that police struggle to surveil and infiltrate. As news from India has shown, it is possible to stir up deadly mobs on encrypted platforms such as WhatsApp.
Fortunately for the police, many of the rioters seemed keen to film themselves committing their crimes and such footage inevitably finds its way online. Such rioters should not be surprised if they hear unexpected knocks on their doors in the coming days.
Unfortunately for the rest of us, we are left to look at the events following the Southport stabbing and wonder how a tragedy prompted so many people to commit violence and public disorder. No doubt such reflection will require us to examine a range of issues, many of them uncomfortable to discuss. But those discussions need to occur in the freest environment possible if we are to have a realistic chance of tackling complex social issues in a robust and honest way. A crackdown on social media would only hinder that goal.