Essay

Hashtag war

Russian trolls and the project to undermine Australian democracy

THE CAR JOURNEY from Canberra airport to the parliamentary triangle is a necessary trip for Federal politicians. One route heads south along the Monaro Highway, passing Canturf, suppliers of the lawn that carpets the Parliament House roof. Canturf’s rotating roadside promotional signage features cringe-worthy puns. In April 2017, Canturf erected a sign suggested by a Gundaroo farmer – ‘Putin the seed and it comes up Trumps’ – in reference to Russian digital manipulation of the 2016 US election.

Driving south a few more clicks, you can find one of the long-forgotten sites of Russian espionage that affected an Australian election sixty-five years ago.

Living in the bush capital, you spend a lot of time passing weirdly melancholic satellite industrial estates. On the edges of one of these estates, right on the border with NSW, there is a gap in the adjacent embankment created when Transport for NSW demolished a wooden bridge from the disused Canberra–Bombala rail line. Locals knew this structure as ‘The Petrov Bridge’. It’s believed to be the bridge that Soviet diplomat and double agent Vladimir Petrov ‘had proposed as a “Bank” or safe hiding place for secret material deposited by agents’, as he put it in Empire of Fear, the 1956 book he wrote with his wife Evdokia. In 1952, as the newly appointed MVD (Ministry of Internal Affairs) Chief in the Soviet Embassy in Canberra, Petrov had received instructions from Moscow ‘about organising “illegal work” in Australia’. Moscow sought ‘a fundamental reorganisation of the whole of our intelligence work and the creation of an illegal apparatus which could uninterruptedly and effectively operate under any conditions’, meaning it could ‘continue to operate in time of war, after diplomatic representatives have been withdrawn’. This was to be done in a way that created ‘no panic’ among agents in Australia, ‘so that they would not interpret our preparations as a sign of inevitable war’.

Keep heading south down the Monaro and you’ll find the site of another Petrov event that took place in what the same book described as the ‘peaceful countryside near Canberra’ – one which had an unexpected influence on the outcome of the 1954 federal election. In the early hours of 24 December 1953, Petrov was involved in a car accident near Royalla. He insisted that a large vehicle had forced him off the road, claiming his Soviet colleagues had attempted to kill him and make it look like an accident. The meeting Petrov was going to at the time – with French-Soviet double agent Madame Rose-Marie Ollier – was integral to Labor Opposition leader Doc Evatt’s assertions that the Petrov defection was a conspiracy between ASIO and Prime Minister Menzies, timed to coincide with the May 1954 election.

Whether the Petrov defection and fear of Soviet influence in Australia turned a Labor lead into a narrow defeat in the 1954 election has been the subject of much debate. But more recent events suggest Petrov’s accusation was not all paranoia. Before coming to Australia on their diplomatic mission, the Petrovs had worked for Soviet intelligence in the Ministry of State Security (MGB), which, after the war, had included military intelligence (GRU).

In early 2018, Sergei Skripal, a Russian military double agent who worked for UK intelligence, was poisoned in England; it was later revealed the poison used was a nerve agent dispensed by Petrov’s former employer, the GRU. Perhaps Petrov was wise to be concerned. The poison used in Salisbury – Novichok A-234 – was a nerve agent developed during the Cold War along with nuclear deterrence, the internet and the information-war techniques we now see playing out on contemporary social media. Petrov’s world – a world of ‘inevitable war’, hidden information and ‘apparatus that could uninterruptedly and effectively operate under any conditions’ – is not so far removed from our own.

 

IN 2017, THOMAS Rid, Professor of Strategic Studies at Johns Hopkins University, addressed the US Senate Select Committee on Intelligence regarding Russian use of social media to generate conflict and disrupt politics and democracy in foreign nations – tactics known as active measures and influence campaigns. In his view, understanding cyber operations in the twenty-first century is impossible without first understanding intelligence operations in the twentieth. ‘This is a field that’s not understanding its own history,’ Rid said. ‘It goes without saying that if you want to understand the present or the future, you have to understand the past.’ The methods being used may seem utterly new, but they are strongly influenced by techniques that state-based security adversaries honed in a previous era of conflict.

Empire of Fear describes the ‘careful planning and elaborate machinery’ employed by the Soviets ‘for the purpose of espionage’:

Espionage is a distinct and principal Soviet industry. This must be so: because the Soviet Union, alone of all the great powers, regards itself as being in a continuous and chronic state of covert warfare with the whole world outside the borders of the Communist empire… Soviet espionage has reaped a rich harvest by such methods, especially against friendly and unsuspecting countries.

From the beginning, this ‘industry’ was as much about distributing information as gathering it.

The Cold War also generated two other by-products that would grow in prominence in the decades that followed. The first, devised to avoid the risk of an actual nuclear exchange between the two superpowers and their allies, was the proxy war, in which American- and Soviet-backed forces clashed in other countries, as in Vietnam. The second was designed to cope with the possibility of a pre-emptive nuclear strike on the US: a secure communication system that ultimately became the internet.

In the late 1960s, the first nodes of the Advanced Research Projects Agency Network (ARPANET) were connected on the US West Coast; the network was designed to ensure the survival of secure command communications in the event of disruption. ARPANET organised communications through distributed networks, with multiple lines going into and out of each node, and broke messages into small units – packets – that could each find their own route to their destination. If any particular node was blocked, this ‘packet-switching’ allowed communications to be rerouted through other parts of the network.
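
The principle can be illustrated with a toy sketch. The Python below is not a reconstruction of ARPANET – the network and routing function are invented purely for illustration – but it shows how a message can still find a way through when a node is knocked out:

from collections import deque

# A toy network: each node lists the nodes it is directly linked to.
network = {
    'A': ['B', 'C'],
    'B': ['A', 'D'],
    'C': ['A', 'D'],
    'D': ['B', 'C', 'E'],
    'E': ['D'],
}

def route(net, start, end, blocked=frozenset()):
    # Breadth-first search for a path from start to end, avoiding blocked nodes.
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == end:
            return path
        for neighbour in net.get(node, []):
            if neighbour not in seen and neighbour not in blocked:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None

print(route(network, 'A', 'E'))                   # one path: ['A', 'B', 'D', 'E']
print(route(network, 'A', 'E', blocked={'B'}))    # rerouted:  ['A', 'C', 'D', 'E']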

This initial network, ARPANET, proliferated to become the global information environment we know as cyberspace. This space, especially social media, has opened a new battlefield for the strategic use of information to pursue competitive advantage: to amplify, to influence and to degrade an adversary’s capacity to make decisions, perhaps even at a societal level. Human consumption of communication in cyberspace – with all our cognitive biases – combined with a tribal, adversarial democratic system, has enabled new relations between politics and war: a new type of information war.

Whereas in previous eras information war was slow, resource intensive and carried the political risks of producing and distributing propaganda in a foreign territory, organised foreign-influence operations can now be executed from the security of countries without extradition treaties. A team led by Philip N Howard at the Oxford Internet Institute recently completed a second global inventory of influence operations and of the capacity of governments to ‘manipulate public opinion over social media’. These activities included the use of bots, disinformation campaigns and the distribution of so-called ‘fake news’. An assessment based on this global activity would suggest that Australian voters and political discourse are likely to have already been targeted, possibly by organised ‘troops’ from foreign government-sponsored outlets. Our research has confirmed that Australians have been targeted online by these outlets in an organised fashion. The Russian state-supported Internet Research Agency (IRA), based in St Petersburg, targeted Australian citizens possibly as early as 2014.

In its effort to aid US lawmakers investigating Russian influence on US politics, Twitter identified 3,841 accounts suspected of being run by the IRA. In 2018, researchers from Clemson University in the US released three million tweets made by those accounts. Our analysis of this data showed how these accounts targeted Australia, particularly in reaction to the Australian response to the downing of commercial flight MH17 in July 2014. Some 5,000 of the tweets mention the terms ‘#auspol’, ‘Australia’ or ‘MH17’, with ‘Australia’ the most common of the three.

These are not retweets but original tweets from IRA troll accounts, with the objective of undermining support for the government and its diplomatic efforts. The activity correlates with the Australian government’s deployment of military planes, in the aftermath of the downing of MH17, to operate in Syrian airspace where Russian aircraft were also operational. During this period, the Australian Defence Force was also confronted by Russian military cyber operations.
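
For readers who want to retrace counts like the 5,000-tweet figure above, a minimal sketch follows. It assumes the released IRA tweets have been downloaded as CSV files into a local folder and that the tweet text sits in a column named 'content' – the folder name and column names are illustrative assumptions rather than a description of the actual release:

import glob
import pandas as pd

# Load the released IRA tweet files (assumed to be CSVs in a local 'ira_tweets' folder).
files = glob.glob('ira_tweets/*.csv')
tweets = pd.concat((pd.read_csv(f, low_memory=False) for f in files), ignore_index=True)

# Flag tweets mentioning the Australia-related terms ('content' is an assumed column name).
text = tweets['content'].fillna('').str.lower()
terms = ['#auspol', 'australia', 'mh17']
mentions = {term: text.str.contains(term, regex=False) for term in terms}

for term, flag in mentions.items():
    print(term, int(flag.sum()))
print('any of the three:', int(pd.DataFrame(mentions).any(axis=1).sum()))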

Interference in Australian cyberspace was part of a much wider IRA effort, focused in particular upon the US. The US Office of the Director of National Intelligence found with a high degree of confidence that Russian military intelligence was part of an influence campaign that:

…followed a Russian messaging strategy that blends covert intelligence operations such as cyber activity with overt efforts by Russian Government agencies, state funded media, third party intermediaries, and paid social media users or ‘trolls’.

This strategy is consistent with a wider Russian Federation doctrine, known as New Generation Warfare or the ‘Gerasimov Doctrine’, described by Dmitry (Dima) Adamsky in ‘Cross-Domain Coercion: The Current Russian Art of Strategy’, in which the ‘main battlefield is consciousness’. Because it works by undermining democratic norms and institutions, as Mark Galeotti describes in ‘I’m Sorry for Creating the “Gerasimov Doctrine”’, ‘subversion is not the prelude to war, but the war itself’.

In February 2018, a US federal grand jury indicted the Russian IRA for ‘operations to interfere with elections and political processes’. The indictment notes the IRA ‘sought, in part, to conduct what it called “information warfare against the United States of America” through fictitious US personas on social media platforms and other internet-based media’. Twitter described the actions of the IRA as ‘information operations and co-ordinated inauthentic behaviour’ by ‘bad-faith actors’ who made ‘nefarious attempts’ to ‘interfere in the public conversation’. Close to euphemism, such a description nonetheless alludes to how ‘influence’ functions. Information warfare is designed to undermine the foundations and workings of liberal democratic politics.

 

PREVAILING ACCOUNTS OF politics within liberal democracies stress that politics is primarily a contest between competing sets of policies, and that voters make their decisions within an imperfect informational economy. The task for voters is to choose between competing alternatives based on the alignment of party and candidate policy positions with their own preferences. Poor choices result from a lack of information or from systematic distortions within the information available. Within mainstream political science literature, it is recognised that choices can be manipulated depending on the arrangement of alternatives – a consequence of manipulating the environment in which voters make their choices. However, information warfare strategies and tactics seek to shape both the environment of a decision and the parameters of the decision itself.

As our subsequent analysis has shown, Russia’s influence was not only focused on military or diplomatic incidents like MH17 and Syria. The most commonly repeated English-language tweets produced by the IRA have nothing to do with politics. They’re about popular culture instead. Often before turning to explicitly political themes, operatives tweeted aphorisms, talked about sport, and discussed the end of the Mad Men series. This represents the migration into a social media domain of traditional spy-craft techniques with which Petrov would have been familiar: targets are approached on terms the operative believes they share, before any attempt is made to shift their views.

In general terms, this strategy is not the stuff of psyops; it can be found in the study of rhetoric. In Kenneth Burke’s influential A Rhetoric of Motives, communication is about the co-ordination of activity between irreducibly distinct people. Hence the process of communication involves the creation of identifications between speakers and other actors and situations. This can take the form of identification with common principles, common spirit, or common sets of activities. Change happens not by telling a target audience that they are wrong but by using their existing beliefs as a ‘fulcrum’ to move their other beliefs and attitudes, and to ultimately induce them to take particular actions.

An example of this strategy can be found in the Facebook ads purchased by the IRA. Although the press has paid much attention to the IRA’s direct promotion of Trump, more of its Facebook ads targeted African-Americans than white Christian conservatives, and, particularly early on, these focused on developing pan-African identities rather than specifically political themes. (The latter came in the later stages of the campaign.) Having built common identifications with African-American audiences over issues such as police violence and societal racism, the IRA then ran ads the day before the election claiming such things as: ‘We do not have any choice but to boycott the election. This time we choose between two racists. No one represents Black people. Don’t go to vote.’ Although there were many factors at play, the drop in African-American turnout from 2012 to 2016 was the largest decline in African-American participation from one presidential election to the next.

The IRA has been busy in an Australian-identity context as well. Its most active day came on 8 February 2017, when 499 of its 527 tweets were efforts to engage with #MakeTVShowsAustralian, a hashtag promoted by the program @midnight, which aired on the US channel Comedy Central. This activity was concentrated in a three-hour window between 8 pm and 11 pm Australian Eastern Daylight Time. Although most of its activity consisted of retweets, some original tweets by the IRA demonstrated knowledge of Australian slang. By the third hour, Russian operatives began injecting original tweets with divisive content, labelling Muslims as terrorists.
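
The busiest-day figure can be reproduced by counting tweets against the clock, as in the sketch below. It continues with the dataframe built earlier and assumes a timestamp column named 'publish_date' whose values can sensibly be treated as UTC before converting to Sydney time – all assumptions worth checking against the released files:

import pandas as pd

# 'tweets' is the dataframe of IRA tweets loaded earlier; 'publish_date' is an assumed column.
when = pd.to_datetime(tweets['publish_date'], errors='coerce', utc=True)
local = when.dt.tz_convert('Australia/Sydney')

# Tweets per calendar day, busiest first.
per_day = local.dt.date.value_counts()
print(per_day.head())

# For the busiest day, how does activity spread across the evening hours?
busiest = per_day.index[0]
print(local[local.dt.date == busiest].dt.hour.value_counts().sort_index())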

@midnight ran from October 2013 to August 2017 and was syndicated in Australia on Foxtel’s The Comedy Channel. In the US, during its initial test run, @midnight attracted the youngest median viewer age in the late-night slot. The show was internet-themed and interactive, with most activity focused on trending memes or online news. One daily game was ‘Hashtag Wars’, in which contestants buzzed in on a specific theme and fans could submit tweets that would appear live on the show. The video tweet that announced the #MakeTVShowsAustralian hashtag game on 8 February 2017 featured a banner-image mash-up of Australian clichés: a kangaroo on a couch watching a picture of Uluru on a big-screen TV, with fauna road-sign ornaments on a coffee table.

Russian troll accounts seized the opportunity to tweet or retweet responses to @midnight’s challenge, showing a good, albeit clichéd, knowledge of Australian culture and humour. ‘Game of Thongs’, ‘Strinefeld’ and ‘Sydney Opera House of Cards’ wove Australianisms into international hit show titles. Some, like ‘The Hemsworth Brothers Karamazov’, suggested that the trolls were not just selecting or copying tweets they thought would amuse Australians, but composing their own.

Several of the tweets touched on politics, like ‘Downtony Abbott’, ‘Malcolm Turnbull in the Middle’ and ‘American Drongo: The Story of Trump’. The last came with an additional comment – ‘Sorry he’s such a dick to you Aussies!’ – confirming these tweets were aimed at Australian audiences, who were receiving them in the prime-time evening period and who might subsequently be encouraged to like or follow the Russian accounts as a flow-on effect of this cultural online play.

As the night wore on, the trolls began to interlace their jokes about ‘Agents of Sheila’ and ‘Beer Eye for the Straight Guy’ with more sinister messages. ‘Lets dump all our Muslims on Australia, like their PM is doing to US. Fuckem! Australia go to Hell!’ read one. ‘#MakeTVShowsAustralian and put another country on the barby…#Iran. But murder as a business model is purely American. #MuslimHolocaust’ read another. The transnational manipulation in these tweets is complex. On the surface, both seem targeted at an American audience: the first aimed at inflaming US–Aussie tensions, the second at mobilising a Muslim population in the US.

The wider suite of tweets sent out by the IRA also suggests a hostile targeting of Muslims in Australia intended to stoke division and conflict within Australia itself. These include: ‘RT @Socalsteve661: AUSTRALIA: “Non-Muslims need not apply”’; ‘Australia says NO to Muslims? Muslims say FUCK YOU to Australia! Whatever, we don’t want to walk upside-down!’; ‘Muslim imam says Australia belongs to Muslims because “Islam was here before the English fleet”’. Other tweets reinforce racist stereotypes: ‘Muslim women in Australia explain how men should beat women’; ‘Christian man attacked by muslims b/c wearing crucifix’; ‘Muhammed’s Harem in Australia: Muslim man with 4 kids, 3 homes and multiple wives. Welfare Galore! Haha!’; ‘#top Australia’s Halal Chief Says White Women Need To Be Fertilized By Muslim Men via dailycaller #love’. A few go so far as to call on Australians to ask questions or take action: ‘I wonder why #ReclaimAustralia is racist and bigoted and Muslims calling for beheading are just offended protestors?’; ‘#top RT ConstanceQueen8: Australia Muslim calls for Sharia Opponents of Islamic Law are Bigots Don’t succumb Aust…’. @ScreamyMonkey, who had previously tweeted about MH17 and Australia, was retweeted by other IRA ‘sock-puppet’ accounts; for instance, ‘RT @ScreamyMonkey: Australian Muslim leader blasts government’s deradicalization drive #world #news’.

One of the most repeated tweet texts on this topic was: ‘My 3 point plan to rid Islam from the West... 1) don’t employ Muslims 2) don’t rent/sell property to Muslims 3) ban halal, its animal abuse #banislam Follow me’. The ‘follow me’ was a clear attempt to capture the new audiences initially built up by tweets about more benign subjects.

IRA hashtags in the Australian-related material, both during the #HashtagWar of 8 February 2017 and in general, include themes close to the hearts of many Australians – #Sports, #AustralianOpen, #auspol, #choppergate, #SignsYouAreAnAmerican – and dangerous Australian fauna, such as #IfASnakeBitesYou. Others touch on politics and international affairs, such as #assange, #barnabyjoyce and #AbbottCurse. Of course, culture is in the details, and these hashtags reveal where even the most sophisticated cultural warfare can get it wrong. Australians refer to ‘sport’, not ‘sports’, and we are interested in more than just tennis. And our fauna can be complicated too: an IRA account might tweet ‘#thingsmoretrustedthanhillary australian fauna’ but then also tweet ‘#dumbgeniewishes move to #australia and subsequently die of some deadly tarantula bite’. Australian tarantulas might give you a nasty bite, but they won’t kill you. Yet the scary image of Australian animals – of which Australian culture is perversely proud – is one the IRA trolls certainly weaponised.

 

THE IRA TROLL accounts often tweeted the same text over and over without explicitly retweeting the original, and the most often repeated tweets carried no direct political claim or invective about political figures or objects. They were Radiohead lyrics. As music critic Alex Ross commented, reviewing the band’s 1997 album OK Computer for The New Yorker, their lyrics ‘seemed a mixture of overheard conversations, techno-speak, and fragments of a harsh diary’. He might easily have been describing Twitter itself.

Fragments from three songs in particular – ‘Creep’, ‘House of Cards’ and ‘No Surprises’ – feature heavily in the Russian tweets (some phrases are used more than 300 times) and sit in the top ten of all repeated Russian troll tweets in English. Radiohead lyrics appear repeatedly throughout the ‘Hottest 100’ of the IRA’s most frequently tweeted English expressions.
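
Finding these fragments takes nothing more sophisticated than counting identical tweet texts. The sketch below continues with the same dataframe and assumed column names ('language' and 'content'), and uses an illustrative lyric fragment rather than the exact phrases in the data:

# Count how often identical tweet texts recur among English-language tweets.
english = tweets[tweets['language'] == 'English']           # 'language' is an assumed column
top_texts = english['content'].value_counts().head(100)     # a 'Hottest 100' of repeated texts
print(top_texts.head(10))

# Which of the most repeated texts contain a given Radiohead fragment?
fragment = "i don't belong here"                             # illustrative line from 'Creep'
mask = top_texts.index.str.lower().str.contains(fragment, regex=False)
print(top_texts[mask])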

Maybe IRA operatives simply tweeted what they were listening to, like any other office worker. But other high-volume English-language tweet quotes seem very specific to the target populations. Key quotes include motivational maxims from African-American athlete Jackie Joyner-Kersee and musician Usher, and quotes from British historian GM Trevelyan (‘education has produced a vast population able to read but unable to distinguish what is worth reading’) and French author Colette (‘music is love in search of a word’). There are also statements designed to attract attention: ‘I don’t wanna be your friend’ or ‘it’s absolutely unbelievable that I want to try it again’. These are deliberately decontextualised claims that invite readers to engage and to puzzle out their context – and that might move at least some of them to look up the page and follow the account to understand the narrative.

Troll accounts seek to influence by amplifying divisions, building a following from cultural cues and then demoralising people and degrading institutions. One key way to do this is to draw on polarising, emotive issues that swirl around celebrities or other influencers. Music, one of the most powerfully emotional cultural forms, seems to have been deployed to do this. In terms of African-American culture, troll tweets referred to specific culture wars within hip-hop itself: the mid-1990s East Coast–West Coast rivalry that generated a divisive and violent split between hip-hop styles that persists today.

 

FAKE NEWS IS fake news. Literally. Influence operations are not only – or even primarily – about fake news. A recent study of Russian information operations by Bret Schafer entitled ‘A View from the Digital Trenches’ found that ‘the vast majority of the content promoted by Russian-linked networks is not, strictly speaking, “fake news”’. Schafer, the social media analyst for the Alliance for Securing Democracy, an advocacy group formed in 2017 in response to Russian interference in the 2016 US election, noted that these communications often use partial truths, normative claims and statements for which the factual predicates cannot be easily identified. Such tactics extend beyond the binaries of fact and fiction associated with a mainstream understanding of ‘fake news’.

Whether something is ‘true’ is incidental to the utility of a statement in an influence operation. Manipulation involves more than the use of falsehoods. What the IRA did was produce tweets that led to the amplification of certain aspects of political life, deflecting attention from others. The repetition of statements (even Radiohead lyrics) can normalise otherwise extreme positions and become a key part of the radicalisation process, especially when nestled among many other messages that are culturally accepted.

Although media reporting on Russian influence operations treats the sowing of conflict as an end in itself, the goals are far more sinister. First, engagement with multiple sides in social and political conflicts gives Russian agents a degree of steering capacity – the ability to strategically activate and direct those conflicts when needed. Additionally, the amplification of divisions undermines the capacity of societies to unite, address common problems and pursue common ends. Finally, playing all sides – sometimes against each other – undermines the capacity to organise a coherent response to a foreign actor.

As Whitney Phillips puts it in This is Why We Can’t Have Nice Things (MIT Press, 2015), ‘the troll problem is actually a culture problem’. What is important about trolls is not their role but the culture in which they thrive. Phillips traces the history of the troll from an online subculture (2003–07) to part of general mainstream culture (2010 onwards). Typically, we want to demonise these figures, see them as deviant and abnormal. In reality, they are quite the opposite. In many ways, trolls are adversarial and polarising in a way familiar within Western culture, from the entitled opinions of the shock jocks of contemporary media to the oppositional, individualistic dialogues of the ancient past.

 

STEAM, EVEN RAILWAYS, can seem anachronistic, if not quaint, today. That old railway underbridge on the Canberra–Cooma road was barely wide enough for Petrov to drive through in his dark green Skoda. When you go to the site of the old bridge now, there is a gap big enough for a semi-trailer. This is an apt metaphor for how technological changes can have unexpected outcomes. One largely unremarked death in 2018 was that of Paul Virilio. The cultural theorist of speed and war also developed the idea of the ‘integral accident’: that the invention of any new technology is also the invention of its accident. The railway was a useful invention that changed the world forever, but it also invented a new accident: the derailment. The internet is a technological change that has produced its own original accidents, and one of these is cyber manipulation.

As media theorist McKenzie Wark reflected in an obituary for Virilio in Frieze in September 2018:

Modernity is also war on more and more kinds of terrain. Warfare not only took to the air but to the airwaves. The modern world is a condition of generalised information warfare. Not only is architecture vulnerable to bombs, it proves defenceless against information, passing through the doors and walls of our homes, rearranging the space and time we imagine we live within. The information war reversed the power of architecture and communication. The home or the city is now exposed to its flows. The consequences may be even more far-reaching. The vectors of communication call into being a whole new geopolitics – not of territories and borders but of communication and computational infrastructure.

It’s likely that internet politics will have unintended consequences, just as the revolution of steam did, and that its impact will extend well beyond elections. During the first Cold War, Petrov sought to wedge documents into – and retrieve them from – the physical architecture of a bridge. Petrov’s actions unexpectedly influenced Australian Federal politics of the time. We live in an era – as Wark suggests – in which both sides of politics fear the porosity of digital and physical borders. We are living through a socio-technological rupture in which the internet that was created to protect Western nations from nuclear attack has instead opened a door to the manipulation of a nation’s people. The internet is a vast geopolitical bridge out to other cultures, societies and worlds, just as it forms a bridge into Australia. And a new kind of troll is hiding under these bridges, wedging them with culture and identity to influence and persuade.
