4 December 2019

Peter Pomerantsev on politics, propaganda and how to fight back

By Peter Pomerantsev

With the support of the Atlas Network, CapX is publishing a new series of essays and podcasts on the theme of Illiberalism in Europe, looking at the different threats to liberal economies and societies across the continent, from populism to protectionism, fake news and corruption.

In October, CapX editor John Ashmore spoke to one of the world’s leading experts on propaganda and fake news, Peter Pomerantsev, about the challenges facing the West from hostile troll armies, Kremlin bots and an authoritarian China. Peter boasts a fascinating and varied CV, including many years working as a TV producer in Putin’s Russia, an experience he documented in gripping style in his first book, Nothing is True and Everything is Possible. For his latest book, This is Not Propaganda: Adventures in the War Against Reality, Peter travelled the world, finding out how despots and demagogues are twisting the truth – and how to fight back.

JA: How does your new book take up the themes of your first adventure in the world of Russian TV?

PP: Yeah, the first one is a memoir of my time in Russia, and at the end of that book I come back to the West and say, ‘Well, Russia is messed up, but still the West is different’. In Russia, I tried to describe a propaganda system. It was very different to the old Soviet one. It wasn’t one that tried to affirm a higher truth, quite the opposite: it worked by spreading doubt, arguing in an almost postmodern way that the truth is unknowable and therefore all truth is subjective, and that in this dark, messy world where there’s nothing to fight for, no values, no ideals, and everything’s a dark conspiracy, you need a strong hand to guide you.

And I came back to the West saying, okay, the West is messed up in its own ways, but at least we have a rational politics, we can hold politicians accountable with the truth, and we have clear ideas of left and right and rational debate and all these things. And then lo and behold, I see quite a lot of the rhetorical and propaganda manifestations that I saw in Russia appear here. The second book was an attempt to understand whether that’s systemic and, if it is, why. So I go around the world – I get to South Asia, China, Latin America – I blew my advance. My wife’s quite annoyed.

How do you reflect on your visit to China, and the rather scary system they are building there?

China is now seen as being at the forefront of maximising the authoritarian power of the internet, not purely through top-down censorship but through the power of AI: analysing everything that you do, from how much wine you buy in the evenings, through whether you visit your parents, through to what you say politically, and combining that into a sort of citizenship score. If you’re a good boy, you’ll progress and if you’re a bad girl, you’ll go down. And, of course, it’s not so far away from the Silicon Valley dream of a purely technocratic society where democracy is useless and slow and flawed and inefficient. It’s almost as if they’ve latched on to the latent promise of the internet, when we’d always thought it was going to be a liberating, emancipatory force, so that’s a big deal.

The scary thing about that, apart from the way it destroys rights, is that it may deliver pleasure and efficiency. Back in the old days, in the Cold War, it was clear that the free model was also more efficient, more fun, more pleasurable and more attractive. Whereas the argument the Chinese are making is ‘yes, it will be authoritarian, but my god it will be so much better, you’ll be able to shop so much better. You’ll find online university courses so much better, because we know everything about you, we will give you the best form of education possible, the best medical services’. What they’re promising is efficiency and comfort and pleasure along with authoritarian control, and something I worry about a lot nowadays is whether we can still express the pleasure of democracy. We can explain abstractly why it’s important in terms of rights. But what if their system is more pleasurable?

We all have so many different ways to absorb information. Is there a particular format you think is especially insidious, be it TV, Facebook or traditional newspapers?

I actually think the TV-to-internet dynamic in many ways hasn’t been contradictory, quite the opposite. If you look at the question of polarisation, and what are known as ‘echo chambers’, what we mean is the fracturing of a common public space and of a common perception of reality. There are great studies now by Yochai Benkler, one of the premier researchers of the information space in America, showing that this fragmentation in America, where it’s much more extreme than here, really starts with cable news and talk radio.

But then what social media does is push it even further. So where your old Fox News watcher would still sometimes go back to CBS or the New York Times, social media just pushes them further and further. And it’s made Fox News itself even more extreme.

There is something about the way that social media has been designed which furthers taking extreme positions. It certainly doesn’t favour taking mild positions, or the truth, or being reasonable and discussing things. It really favours politicians who play into that scandalous and extremist rhetoric. There’s a reason why Salvini and Trump and all these guys do so well online. So it’s really, I think, a case of social media taking the worst of TV and putting it on steroids.

I wonder whether, when Mark Zuckerberg and Jack Dorsey and all these weird, amoral geniuses were designing social media, they were so steeped in that behaviour seeming normal that they ended up designing social media as a giant reality show. Social media rewards being nasty, being as extreme as possible, hate-filled rhetoric and a sort of self-scandalisation. You get attention through doing ridiculous things which, in another media context, would just have made you look stupid, but now you want to do that: you want to do crazy stuff that makes you look ridiculous because you’ll keep attention. And once you’ve got that, then everything else falls into place. So it’s almost like Scooby Doo: when you take off the mask of Mark Zuckerberg, you finally see Simon Cowell. We live in Simon Cowell’s world, completely surrounded by it.

What are some concrete measures we could put in place to try to tackle the rise of disinformation?

It’s very interesting looking at the White Paper that the British government put out, which was a mad collage of many things. It established a category of ‘legal but harmful’ speech: stuff which is legal, which is not illegal speech like threats of violence or defamation, which already exist as legal categories. Under ‘legal but harmful’ they included things like disinformation, fake news, bots, trolls and foreign influence campaigns.

The people who were up in arms about it were freedom of expression organisations like Article 19 and Index on Censorship. They were saying ‘there is no such thing as legal but harmful; there is nothing in Article 19 of the Universal Declaration of Human Rights, which is the one about freedom of expression, to say that disinformation is illegal’.

However, I do actually think we face a different form of censorship today, which is about our lack of understanding of, and our lack of control over, how our information space is designed. We don’t know if something online is a real person or a Kremlin bot army or a Dominic Cummings troll army. We just don’t know. There is no transparency on the internet. We don’t know why an algorithm shows us one piece of content and not another. We don’t know which of our own data is used to target us, and why. When we see an ad from an official or unofficial campaign, we don’t actually know who’s behind it. We don’t know what ads they’re showing our neighbour up the road, so we have no way to even criticise or engage with an election campaign.

We’re in the dark; we’re like Caliban on Prospero’s island, surrounded by the ghouls and spirits of information that we don’t understand. I think we do need more freedom of expression, and freedom of expression is not just the freedom to say things, it’s the freedom to receive information. So I think we do need radically more aggressive regulation to create a transparent internet – and [regulation] with teeth. So if you keep on doing deceptive campaigns online, that’s illegal, they get taken down. That’s not the same thing as anonymity. Anonymity is fine, a person can be anonymous for whatever reason; we’re not talking about individuals here. We’re talking about co-ordinated mass campaigns. That’s the problem.

Do you think some of this stuff is overdone, though – where quite mundane political tactics are dressed up as political alchemy?

Without a doubt – the book is stuffed full of interviews with different types of people and propagandists, and throughout I’m going ‘can I really believe these guys?’.

I actually interviewed the guy who created the company above Cambridge Analytica, SCL [Nigel Oakes]. And he was brutally honest. He said, look, the idea that you could do accurate behavioural-change psychological profiling online, at scale, in quick time, by analysing people’s likes and shares was – I think the word he used was ‘bollocks’.

He does have a methodology, but it’s the opposite of alchemy. It’s as close to scientific as he thinks he can get. He spent years developing it, and he focuses very much on behavioural change as opposed to attitudinal change, which is one of the big debates in propaganda – do you try to change people’s minds or their behaviour? He’s really concerned with the latter. And he’s saying you can do it. He’s spent years, decades, working it out, and it involves very, very deep anthropological research: sending people into a community to do social survey work which doesn’t even feel like social survey work, it has to be very natural and very organic, and you’re trying to work out what drives behaviour. And there are many, many factors to it.

So it’s the opposite of alchemy. His argument is: no, I’m trying to make this as much of a science as possible. But again, with the actions of somebody like Cambridge Analytica, the reason they can provide rhetoric that has a hocus-pocus element to it is the lack of transparency, because the big thing they were caught doing was using a huge amount of personal Facebook data. And the fact that we didn’t know about that, the fact that Damian Collins had to call in Facebook and then Cambridge Analytica until, after a long period of denial, it finally came out – that’s the problem.

Having a system in the dark even allows these fears of the Kremlin’s all-pervasive influence to be augmented. So rather than a really evidence-based discussion about what Putin is up to, or what any campaigns are up to, we end up living in semi-darkness, and that’s no good for anyone. It’s not that hard to change. I just think there’s a little bit of reluctance from all political sides to do it, because they all know it’s wrong, but they have all kind of planned their next set of campaigns around this.

There’s been a bit of self-regulation from the tech platforms. Obviously Twitter have just said they’re not going to allow paid political ads on their platform. But a lot of this stuff isn’t paid; a lot of this stuff is covert. I think it’s good they’ve done that, but that’s just one tiny slice of the problem.

So what’s the real problem with all of this? When you have these campaigns in the dark, when we don’t understand who’s really behind them, when we don’t actually see all the campaigns because they’ve been targeting different groups, then we end up with a result that we struggle to analyse. We’re still trying to understand why people voted for Brexit. Was it animal rights? Was it sovereignty? Was it immigration? There’s no agreement, because we didn’t actually see the campaign; it was in completely dark corners of the internet. So we struggle to really understand why a society has made its decisions, which means we can’t have that minimal level of trust that you need for democracy to survive. And I’m just really worried that we’re going to enter another election now, which is meant to clear things up, and we’ll end up even more befuddled than before. And we won’t have the sense of closure which we probably need.

It’s not all gloom, of course. Tell us about the people you met who are doing more positive things with data.

I meet a lot of people who are still very excited by the possibilities of data and the possibilities of the internet. Nasty internet campaigns do the following: they analyse groups online, they do it in a sneaky, non-transparent way, and then they feed them bits of content that will heighten polarisation, usually by heightening hatred, because that’s a pretty effective way to get people to do stuff.

Now, we could actually be analysing the internet and doing the absolute opposite. We could be analysing different social groups and thinking: OK, what are the underlying values they have in common? How do we create content whose success we measure not through likes and shares, which usually equal hate and enmity, but through how much constructive discourse it manages to foster or how much trust it inspires? So there are other ways of measuring what we do. The problem is that it wouldn’t immediately be financially profitable. So we probably need some sort of public service remit for this kind of content, moving towards a kind of BBC-like thinking for the digital age, which is very different to the old broadcast model.

And I meet all sorts of people in the book who are trying very interesting things. I meet a guy who organises protest movements, and he thinks that you can tell where a society’s desires for social change and progress are heading by analysing Google searches and a few other data points. In the same way that Google could tell there was a flu epidemic building somewhere through the searches people were using on its platform, he says you can analyse desires for social change, and really the internet could provide us with a connection to what, as a society, we really want and are interested in – and that has fascinating consequences.

That could mean that newspapers or media don’t necessarily have to follow politicians, but could try to understand what people really care about and are concerned with. It means people could connect with each other and form new alliances for change that they weren’t even aware of, because social groups that we didn’t even think had anything in common might have very similar concerns. So there’s a big possibility, as long as it’s done in a transparent and ethical way.

Peter Pomerantsev is an author and Visiting Senior Fellow at the Institute of Global Affairs at the London School of Economics, specialising in propaganda.