19 February 2019

Fixing Facebook is harder than it looks

By Alex Krasodomski-Jones

The DCMS committee’s report on disinformation, published yesterday, is frequently headline-worthy. The group of MPs describe Facebook as “digital gangsters”. Mark Zuckerberg comes in for particular scorn for his “contempt” of the investigation. The shadowy cabal of data brokers tied up in the Cambridge Analytica scandal get such a roasting that the report is making news around the world.

Although the report is progress, we shouldn’t delude ourselves. It isn’t case closed. The issue of when governments and private corporations should step in to protect us from harm is no closer to being answered.

The recommendations set out in the report are largely sensible. Urgent changes to electoral law and the rules around targeted political advertising are desperately needed. Current electoral law, the report concludes, “is not fit for purpose”. Calls for greater transparency are welcome. Most helpful of all is the proposed answer to the question that has dogged regulation discussions for donkeys’ years: are these things platforms, or are they publishers? The answer: neither.

This is a real step forward. Treating Facebook like a TV channel or a newspaper is patently absurd. Holding Facebook legally accountable for every last thing that flows through their networks — the hundred million photos uploaded to Instagram, the 65 billion messages sent on WhatsApp every day — is unenforceable, technologically improbable, and unlike any of the challenges facing traditional media regulators.

That said, the expectation among platforms that they can provide safe harbour, coordination and markets to extremists, illegal content and militarised campaigns by hostile powers without consequence is equally absurd.

The report calls for a new category, and in doing so hopefully puts this old argument to bed.

The question, then, is what government will demand of this new category, and here is where things get a little murky.

The report calls for a compulsory code of ethics overseen by an independent regulator. This code of ethics, and the team of experts tasked with drawing it up, will define what is and isn’t allowed, namely content that is illegal or harmful.

Illegal content I understand, though somewhat confusingly the report warmly notes examples of legislative change in France and Germany while also stating that in the UK the “legislative tools already exist”. Terrorist propaganda, child abuse imagery – this is material we have legislated against, and it is worth noting that the social media giants have made significant progress in removing this content from their platforms and forcing it into the less visible recesses of the web.

But what is harmful content? Time and again, the report calls for an enforced crackdown not just on illegal content, but on harmful content. Tech companies would assume legal liability for content “identified as harmful” after it has been posted by their users.

The report cites the tragic example of Molly Russell, who took her own life in 2017 after engaging with material connected to depression, self-harm and suicide. That a child should be able to freely engage with this kind of material is a failure in platforms’ duty of care to their underage users. A number of organisations, such as DotEveryone and Carnegie, are vocal in their calls for better protection for young people online. But discussions of depression, self-harm and suicide are not illegal – in fact, we are in the midst of a groundswell of people opening up about their mental health and discussing these very topics.

The riposte might be that these topics can be discussed in ways that are harmful and in ways that are responsible, which is true. But enforcing “responsible” discussion across the conversations of over a billion people requires technology that simply does not exist. And that is assuming we believe such a move to be a legitimate centralisation of power in the hands of the already powerful platforms.

Regulation – of one kind or another – is coming. Giving some teeth to a regulator able to ensure greater transparency in light of updated electoral law should be welcomed. So should holding companies to account for their use and abuse of citizens’ data, and improving our own ability to do the same, especially when those citizens are children.

But the question of what is and isn’t permitted online remains as grey an area as ever: as much as we need to enforce some standards, we must be cautious about how they are set.


Alex Krasodomski-Jones is Director of the Centre for the Analysis of Social Media at Demos.