This Is Not Propaganda: The technology
Previous part: Part IV — The supposed death of objectivity.
It would be wrong (and too easy) to blame Russia for what is happening. Yes, Nigel Oakes, one of the founders of the SCL Group, a precursor to Cambridge Analytica, visited the KGB to talk about ‘communication and persuasion’ in 1990; yes, Dominic Cummings spent time in the country a few years later; yes, the Tories are still sitting on the Russian interference report, and the admiration of key Leavers for the Putin regime is plain. Even so, Russia is merely an early adopter rather than the evil mastermind. This is not a ‘Russia problem’. This is a ‘democracy in danger’ problem.
None of the technology can be un-invented. Just as we were celebrating the role of social media in the Arab Spring, complicated structures were being built within it to brainwash people on a mass scale. Most regimes could only produce crude propaganda and smear opponents; a few, however — Pomerantsev singles out the Philippines and Russia, whose Internet Research Agency has already been mentioned — built highly complex operations.
Astroturfing — in various forms — is the key, and functions as a sort of con trick. You gain the trust of the reader (or listener), usually by pretending to be apolitical. Having gained it, you start drip-feeding them your narratives. You could start local interest groups — which Pomerantsev’s anonymous Philippine interviewee at the very start of the book claims to have used to great effect in helping elect Duterte. You could set up a page for a fake mystic healer, as Lyudmila Savchuk had to do at the Internet Research Agency (which she later exposed). You could hire an expert (like my mother’s guru Andrey Illarionov) to do podcasts on unrelated political topics, or perhaps some entertainers and influencers. Anything will do, as long as you hide your true intentions. Now sit and wait for a few months, maybe longer. Once enough people are listening, you and your friends in the comments can start brainwashing them. In Duterte’s case, his people linked every crime to drugs. The fake mystic healer told tall tales about life in the West. The political expert, having sounded legitimate and sane, went on about how Bill Gates created COVID-19. The entertainers and influencers (in another of Pomerantsev’s examples from the Philippines) would start cracking jokes about Duterte’s political enemies.
The more conspicuous part of the operation is the ‘assault troops’ — accounts posting visible, explicitly political content. You have the garden-variety paid commenters, a speciality of the Internet Research Agency. You have angry trolls who intimidate opponents, descending in whole swarms to harass them and contacting their employers to demand they be sacked. You have other trolls who try to derail discussion and break consensus — in my experience they use highly charged, emotive language, constantly switch the subject, attack people personally, accuse them of arguing in bad faith, and try to reduce any discussion to a shouting match. You have sockpuppets giving out misleading information. You have bots and half-bots, automated and semi-automated accounts that either disrupt conversations or disseminate misinformation, sometimes shared between different regimes (Pomerantsev describes a Russian bot that suddenly started shilling for Duterte). As mentioned before, the aim is to give the impression that their ideas dominate online and are the ‘natural’ feelings of ‘normal people’.
A third type are the infiltrators and the amplifiers. An infiltrator gains entry to a protest movement, typically in the West, and attempts to subvert it. This can be done offline as well as online; the Russian infiltration of Occupy and of Black Lives Matter are probably the best examples. The aim isn’t necessarily to take the movement over: it is rather to spread chaos and kill trust, to make people feel there is no such thing as an honest movement or pressure group. Amplifiers, as mentioned before, amplify social media content of the ‘right’ kind, but this needn’t be just your own propaganda talking points. Russia is well known for amplifying cranks and conspiracy theorists of all stripes, such as the anti-vaxxers or, of late, the 5G–coronavirus conspiracists. Again, the aim is to destabilise, to sow discord, to make people believe ‘everything is possible’, to undermine trust. ‘It is when the Kremlin’s efforts are unveiled that they have their most significant effect,’ Pomerantsev writes. ‘When one hears so many stories of fake accounts… one starts doing a double take at everything one encounters online.’ This makes it ‘easier for the Kremlin to argue that all protests everywhere are just covert foreign influence operations’.
The beauty of the system is that once you have enough converts, they start acting of their own accord. Once there is a critical mass, a chain reaction starts by itself. Whole organisations spring up to help out — Martin Sellner of Generation Identitaire (controversially given an unchallenged platform by the BBC after the Christchurch shootings, illustrating the complicity of the mainstream media in the normalisation of extremism) runs Discord channels such as Infokrieg, which according to Pomerantsev is an education centre for the far right, teaching the rank and file to hide their Nazi beliefs behind euphemisms, troll mainstream journalists and churn out memes. It is not just Discord: “on reddit, on 4chan sites anonymous administrators provide an online ‘crash course’ in mass persuasion”, turning out new volunteer online operatives to go and spread the word. Of course, nothing stops others from doing the same and propagating leftist or centrist memes and messages; but this, like everything else mentioned previously, is a game where money talks and the side with more resources will prevail.
If there is a future, it lies with the social media companies fundamentally changing their ways. Pomerantsev makes no explicit recommendations, but he points to multiple ways in which the technology facilitates what is happening. The abuse and intimidation of dissidents, such as the Rappler journalist Maria Ressa, through online trolling is a technological problem: they need social media for work, and yet the social media companies are unable to stop them being harassed. The ‘echo chamber’ nature of social media is also a technological problem — the content that social media companies choose to show users reinforces their existing world view. We already know, thanks to Facebook’s infamous experiment, that this tangibly changes people’s attitudes. Moreover, these algorithms can easily be gamed by organisations with a lot of resources, such as those who abuse YouTube’s algorithms to gain prominence — Russia Today being the most notorious example. With some more effort, one could even take action against online astroturfers: the fake Facebook groups, typically local-interest ones, that gain their users’ trust and then start feeding them political content.
Indeed, social media companies have made a start by banning reported bots and sock-puppets faster. They are also making efforts to improve moderation — albeit with the use of cheap labour. Some problems, however, go to the essence of the format: Pomerantsev mentions the groupthink, consensus-building and polarisation of opinion that social media encourages — the tendency to form your views from a perceived consensus, to pick out and pass on material that fits those views, and to adopt ever more extreme positions in online arguments. This can only be overcome through education, possibly even in schools, equipping people with the right tools to cope in the information age.
Still, volunteers toil away online in an endless struggle against the forces of propaganda. On Twitter, among the accounts I follow, conspirator0, ZellaQuixote and r0zetta (Andy Patel) locate bot networks, study their extent and report them in the hope that they will be shut down. Alongside them is Pomerantsev’s friend Alberto Escorcia, a Mexican ‘human rights defender and internet activist’ and an online protest organiser. Escorcia once had a great idea: sick of bots and trolls attacking the protesters online and stopping them communicating with each other, he created a simple YouTube video, just a girl talking to the camera. Escorcia had spent a long time researching words that were in some way common to the protesters’ online output, guessing that they must have triggered people into some sort of cooperative action. The girl in the video said many of these words, and the protesters started to ignore the trolls, as if by magic. Clearly, there is still a lot about the relationship between the psychology of social media crowds and language that we do not understand.
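Pomerantsev does not describe Escorcia’s method in any detail, but the core idea, finding words that recur across many different protesters’ messages rather than words that are merely frequent in one, can be sketched in a few lines of Python. Everything here (the function name, the toy corpus, the stopword list) is invented for illustration, not a reconstruction of what Escorcia actually did:

```python
from collections import Counter
import re

def common_action_words(messages, stopwords, top_n=10):
    """Rank words by document frequency: the number of distinct
    messages each word appears in, ignoring stopwords."""
    doc_freq = Counter()
    for msg in messages:
        # a set, so a word repeated within one message counts once
        words = set(re.findall(r"[a-z']+", msg.lower()))
        doc_freq.update(w for w in words if w not in stopwords)
    return [w for w, _ in doc_freq.most_common(top_n)]

# toy corpus standing in for protesters' posts
posts = [
    "march together tonight, stay together",
    "we stay strong and march at dawn",
    "together we are strong",
]
print(common_action_words(posts, stopwords={"we", "and", "at", "are"}))
```

Words like ‘march’, ‘together’, ‘stay’ and ‘strong’ surface at the top because they cut across messages; the serious versions of this analysis (the kind conspirator0 and r0zetta publish) work on network structure and posting behaviour as well as vocabulary.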
A 2013 Pentagon paper, ‘China and the three warfares’ (economic, media and legal), describes ‘twenty-first-century warfare guided by a new and vital dimension: namely the belief that whose story wins may be more important than whose army wins’. Based on the above, it clearly goes further: it is not enough to have winning stories, winning narratives — you need winning languages. It is not only the content that matters; every word in which it is expressed makes a difference.