
How Russia’s Internet Research Agency Built its Disinformation Campaign

ANDREW DAWSON AND MARTIN INNES

Abstract

In this article we analyse features of the information influence operations run by the St. Petersburg based Internet Research Agency, targeted at Europe. Informed by publicly available ‘open source’ data, the analysis delineates three key tactics that underpinned their disinformation campaign: account buying; ‘follower fishing’; and narrative switching. Both individually and collectively these were designed to build the reach, impact and influence of the ideologically loaded messages that social media account operators authored and amplified. The particular value of the analysis is that whilst a lot of recent public and political attention has focussed upon Kremlin backed disinformation in respect of the 2016 United States presidential election, far less work has addressed their European activities.

Keywords: disinformation, Russia, Kremlin influence, Internet Research Agency

WITH HINDSIGHT, the sheer scale and scope of activities performed by the St. Petersburg based Internet Research Agency (IRA) and allied Kremlin units attempting to influence and interfere with the 2016 US presidential election process suggests much more could have been done to interdict their efforts. For as described by two recent studies commissioned by the US Senate Select Committee on Intelligence—from Oxford University and New Knowledge—and the indictments filed by Special Counsel Mueller, the Russian backed activities were extensive, diverse and pervaded all major social media platforms. That said, doubts remain about precisely what impact they accomplished. In their recent book Network Propaganda, the Harvard academics Yochai Benkler, Robert Faris and Hal Roberts have cautioned that digital influencing activity does not easily translate into measurable behaviour change.1 Thus we need to be wary of over-attributing any causal effects to even the most sophisticated disinformation campaign.

Set against this backdrop, given that digital influence engineering is now being undertaken regularly and routinely, this article highlights three issues. First, the importance of developing a better understanding of how Russian state assets have sought to use disinformation and misinformation to influence European politics. For understandable reasons, much of the public conversation to date has focussed upon the American situation. But there is significant evidence of similar influence and interference strategies being operationalised in Europe. Understanding this arena is especially timely, given that the European parliamentary elections are scheduled for May 2019.

Second, although the Senate commissioned studies did a commendable job in documenting and describing the volume and variety of Kremlin backed influence campaigning, the amount of material involved means there is more to do in terms of distilling the IRA’s disinformation playbook. Detailed ‘digital forensic’ investigative methods are needed to craft an evidence-based understanding of how IRA operators built their audience and influence. This is on the grounds that identifying their key tactics and techniques may enable similar disinformation campaigns to be detected in other contexts.

Finally, we discuss some of the challenges with attributing authorship and impact to disinformation communications. This reflects how, not only have the US and its allies learned about some of the methods used to seed and amplify false and misleading information online, so too have those authoring such messages. Contemporary efforts at communicating disinforming narratives are increasingly sophisticated, as those involved are learning ‘what works’ in making messages more persuasive and in masking their origins.

The Political Quarterly © 2019 The Authors. The Political Quarterly published by John Wiley & Sons Ltd on behalf of Political Quarterly Publishing Co (PQPC). This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

Two main data sources underpin the evidence and insights reported. First, there are a small number of published accounts and stories from former workers at the Internet Research Agency describing its organisation and routines. This is supplemented by analysis of the ‘FiveThirtyEight Internet Research Agency Twitter dataset’—an extensive non-anonymised corpus of tweets posted by IRA accounts. As detailed below, this includes a large number of Russian language and American facing accounts. But there were also accounts messaging in a number of other languages. Herein we focus in particular on accounts that were oriented towards Germany, as much less attention has focussed upon what these were doing.
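Since the FiveThirtyEight corpus is distributed as plain CSV files, the filtering step used here can be sketched in a few lines of Python. The column names below follow our reading of the published files; the rows are invented stand-ins for illustration, not real records.

```python
import pandas as pd

# Illustrative rows mimicking the schema of the FiveThirtyEight IRA
# tweet files (column names are those of the published CSVs; the
# values here are invented for demonstration).
tweets = pd.DataFrame({
    "author": ["THOMAS_GERSTER", "TodayInSyria", "Erdollum"],
    "language": ["German", "English", "German"],
    "publish_date": ["6/14/2017 9:31", "3/2/2015 11:04", "6/4/2016 8:12"],
    "followers": [75, 10234, 5120],
    "following": [118, 940, 610],
})

# Restrict the corpus to German-language messaging, as in the article.
german = tweets[tweets["language"] == "German"]
print(sorted(german["author"]))
```

In practice one would `pd.read_csv` each released file and concatenate them before filtering.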

Work at the Internet Research Agency

There are at least seven published journalistic accounts, based upon interviews with former employees at the Internet Research Agency. Collectively, these provide insights into the nature of the work it performed, including the organisation of decision making and delivery, main roles and responsibilities, and the performance indicators workers were subject to. By collating and analysing these materials, it is possible to construct an outline picture of the organisational rhythms and routines that shaped the kinds of digital behaviours presenting in the social media accounts they were operating.

The picture painted is of an organisation based around an orthodox division of labour with different departments focussing upon specific geographic regions/countries, accompanied by some platform specialisation.2 For example, one unit focussed upon producing memes, whilst another was tasked with commenting on posts by other users. Individual operators ran multiple fake accounts: trolls were expected to make around fifty comments on news articles every day. Or, they were tasked with maintaining six Facebook pages, posting three times daily about the news, and discussing new developments in groups on Facebook twice daily, with a target of at least 500 subscribers by the end of the first month. On Twitter, operators were generally responsible for around ten accounts with up to 2,000 followers each, tweeting at least fifty times daily.3 One source suggests they were required to make 135 comments per twelve-hour shift, working in internet forums, and that they would be provided with five keywords to feature in all posts to encourage search engine pickup.4 The latter is consistent with workers describing receiving regular ‘taskings’ from their managers in terms of a list of subjects/topics to focus upon.5 Similar tasks were defined in relation to targeting comments towards outlets such as CNN, BBC and the New York Times.6

One interviewee described working in teams of three: operator one would function as ‘the villain’, criticising the authorities; then the others would enter a debate with him/her. One would post an image/meme in support of their argument, the other posting a link to a supportive source.7 Other intriguing comments include a suggested pattern for developing accounts, in that they are started with a politically neutral stance, which is amended later.8 Operators wrote Twitter bots to amplify visibility and because the costs of doing so were low.9 A few operators were ideologically committed to their work, but most were not. Workforce turnover was high and featured a lot of young people and students.10

These organisational arrangements certainly help make sense of some of the patterns that can be observed in the messaging data. For example, sixteen confirmed Internet Research Agency accounts all used quotations from Orwell, Shakespeare and other famous literary figures as part of the account biography. Similarly, a second group of accounts all shared fragments of the same base image as the account profile picture. There is a strong probability that these were individuals engaging cognitive shortcuts to get the job done quickly. Given that the staff were under considerable pressure to meet their performance metrics and were not necessarily deeply invested in their work, these are precisely the kinds of ‘easing behaviours’ found in many organisational settings. Analogous to what happens in police detective work and psychological profiling, these little ‘tells’ and behavioural signatures can be used as clues about where suspicions should be directed.
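One of these ‘tells’, the reuse of identical biography quotations across accounts, is straightforward to operationalise. The sketch below is our illustration rather than any tooling from the analysis, and the account names and bios are invented.

```python
from collections import defaultdict

# Invented account bios standing in for real profile data; the point
# is the grouping logic, not the values.
bios = {
    "acct_1": "War is peace. Freedom is slavery. Ignorance is strength.",
    "acct_2": "All the world's a stage.",
    "acct_3": "War is peace. Freedom is slavery. Ignorance is strength.",
    "acct_4": "Brevity is the soul of wit.",
}

# Group accounts by normalised biography text.
by_bio = defaultdict(list)
for account, bio in bios.items():
    by_bio[bio.strip().lower()].append(account)

# Any bio shared by two or more accounts is a lead worth examining.
suspicious = [accounts for accounts in by_bio.values() if len(accounts) > 1]
print(suspicious)
```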

One especially salient point though is that these social media accounts that were used for spreading disinformation were not transmitting such materials all the time. Although there was quite considerable variation, broadly speaking they seemed to operate around an 80:20 ratio. That is, most of the time these accounts were engaged in mimicking the kinds of interests and values coherent with the social identities that they were ‘spoofing’, and then occasionally they would message avowedly political content. This pattern of behaviour was aided by the fact that many accounts would expend a lot of time ‘amplifying’ messages from other users, and then only rarely ‘authoring’ new material themselves. Such patterns are important as they render the task of definitively attributing accounts to Kremlin direction and control challenging.
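The amplify-versus-author balance is easy to quantify when each tweet record carries a retweet flag, as the FiveThirtyEight data does. A minimal sketch, with invented records and an assumed field layout:

```python
from collections import defaultdict

# Invented records: four retweets and one original post for one account,
# mirroring the rough 80:20 split described in the text.
tweets = [
    {"author": "acct_a", "retweet": 1},
    {"author": "acct_a", "retweet": 1},
    {"author": "acct_a", "retweet": 1},
    {"author": "acct_a", "retweet": 1},
    {"author": "acct_a", "retweet": 0},
]

counts = defaultdict(lambda: [0, 0])          # author -> [retweets, total]
for t in tweets:
    counts[t["author"]][0] += t["retweet"]
    counts[t["author"]][1] += 1

# Share of each account's output that amplifies others' messages.
amplify_share = {a: rt / total for a, (rt, total) in counts.items()}
print(amplify_share)
```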

How IRA accounts built audience and influence

Spoofing personas in terms of constructing a false account profile, and then messaging around issues of interest to the thought communities associated with such digital social identities, was critical to how IRA accounts built a following to enhance their persuasive capacity and capability. Many did so over several years, but not all did. Some tried to shortcut the process of building audience and influence.

Buying followers

An alternative to the normal organic strategy and long-term investment required to grow follower numbers is to ‘buy’ a following. Websites such as ‘buycheapfollowerslikes.org’ offer to increase a client’s Twitter following by 1,000 accounts for less than $20. According to reviews, the followers have profile pictures, unique bios and are active tweeters; however, they will not interact with posts, as they are bots. This is what is depicted in Figure 1, which is an account run by the IRA that simply purchased follower accounts when set up. It is not clear whether such an action was performed as part of the overarching organisational strategy, or because a worker was ‘gaming’ the performance measures they were subject to. In the graph, a running total of tweets is plotted on the X axis and follower count on the Y axis. The dotted line represents its followers and the solid line those accounts it is following.

[Chart: follower and following counts for the account @TodayInSyria plotted against its running total of tweets]

Figure 1: Example of an account purchasing followers


In this case, the English language account was initiated in 2015, acquiring 10,000 followers in a very short period of time. We have identified similar patterns of behaviour in several of the German language IRA accounts, as exemplified in Figure 2.

On 4 June 2016, this account very rapidly received 5,000 new followers. As can be seen at two later points on the graph, the follower count drops by around 1,000 accounts; this is likely due to the fake accounts being discovered and banned by Twitter. The number of followers is also unusually high: research suggests the average Twitter account has 453 followers.11
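A crude detector for this bulk-purchase signature simply looks for follower counts that move by a thousand or more between consecutive observations, capturing both the sudden purchase and Twitter’s later purges. The threshold and the series below are illustrative assumptions, not values taken from the dataset.

```python
def follower_jumps(followers, threshold=1000):
    """Return (index, delta) pairs where the count moves by >= threshold."""
    return [(i, b - a)
            for i, (a, b) in enumerate(zip(followers, followers[1:]))
            if abs(b - a) >= threshold]

# Invented series with the shape described in the text: thousands of
# followers arriving at once, then two later drops of around 1,000 as
# the bought accounts are banned.
series = [420, 450, 5450, 5500, 4480, 4470, 3450]
print(follower_jumps(series))   # the bulk purchase and two later purges
```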

Follower fishing

A second pattern of behaviour designed to build audience and influence is ‘follower fishing’. Its logic is outlined by the metrics in Figure 3 below, based upon an English language account.

Around half of the most active IRA ‘German’ Twitter accounts engaged in prolific ‘follower fishing’, with thousands of accounts being followed and unfollowed regularly. The tactic works by the IRA accounts following hundreds or thousands of new accounts in a very short time frame. The aim is to get the newly added accounts to reciprocate by following the IRA account (follow-back). After a few days, the IRA operator then unfollows the accounts, increasing their ‘followers per followed’ ratio and with it boosting the account’s implied ‘authority’, at least in terms of how this is assessed by platform algorithms. This tactic is commonly used by ‘social media influencers’, such as celebrities or marketers. Figure 4 tracks this pattern in relation to a German language account.

The solid line documents how this account followed other accounts over time. They added thousands of accounts in a very compressed time frame hoping to get ‘follow-backs’. However, a high following to follower ratio indicates a low Twitter authority score, so they need frequently to stop following thousands of accounts as well. This creates a step like pattern of steep rises and sharp drops.
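The step-like pattern lends itself to a simple heuristic: count steep rises and sharp drops in an account’s following series. The threshold and both series below are illustrative rather than drawn from the dataset itself.

```python
def sawtooth_score(following, step=500):
    """Count steep rises and sharp drops in a following-count series."""
    rises = drops = 0
    for prev, cur in zip(following, following[1:]):
        if cur - prev >= step:
            rises += 1
        elif prev - cur >= step:
            drops += 1
    return rises, drops

# A fishing account follows in bursts, then mass-unfollows...
fishing = [0, 2000, 2100, 600, 2600, 2700, 1200, 3200]
# ...while an organically grown account drifts upward slowly.
organic = [0, 120, 260, 400, 510, 640, 700, 810]

print(sawtooth_score(fishing), sawtooth_score(organic))
```

Accounts scoring several rises and drops would be flagged for the kind of manual review described above.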

The longitudinal analysis to detect these kinds of behavioural patterns is time-consuming. But in an attempt to test how widespread it was, we analysed sixty-nine randomly selected IRA Twitter accounts, with different country focusses. Table 1 shows dramatic differences between the behaviour of accounts ‘representing’ different countries. For example, no Italian, but 86 per cent of the US accounts reviewed employed follower fishing to increase their followers, and 21 per cent purchased followers.
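The percentages in Table 1 are simple shares of the reviewed accounts per region, rounded to the nearest whole number; for transparency, they can be reproduced as:

```python
def pct(part, whole):
    """Share of accounts showing a behaviour, as a rounded percentage."""
    return round(100 * part / whole)

# United States row of Table 1: 14 accounts reviewed, 12 fishing, 3 purchasing.
print(pct(12, 14), pct(3, 14))
# Totals row: 69 accounts reviewed, 41 fishing, 13 purchasing.
print(pct(41, 69), pct(13, 69))
```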

[Chart: follower and following counts for the account @Erdollum plotted against its running total of tweets]

Figure 2: Example of a German account purchasing followers


Within the large amount of mainstream media commentary that has attended to the activities of the IRA over the past couple of years, seventeen spoofed accounts have acquired a certain public infamy,12 either owing to the contents of their messaging or the relatively sizeable numbers of followers they had. All of them engaged in follower fishing to build audience and influence.

[Chart: follower and following counts for the account @casCaseyP plotted against its running total of tweets]

Figure 3: Basic pattern of fishing for followers

[Chart: follower and following counts for the account @Berkhoff85 plotted against its running total of tweets]

Figure 4: German example of ‘follower fishing’

Narrative switching

A third pattern of IRA account behaviour was ‘narrative switching’. The operators would start by talking about fairly mundane issues, consistent with the spoofed owner’s persona. However, at some point, often after an extended time period, messages became overtly political and frequently aligned with established pro-Russian interest narratives. The IRA not only switched from banal to pro-Russian views, but also switched abruptly between different political positions according to current Russian operational priorities, or even just to create confusion. Narrative switching can also be used to collect certain followers, for example people interested in French yellow jacket protests, so that they can be targeted with messages at a later time. In some instances, these switches happened after the account was dormant for a time. However, this was not always the case.

A collection of IRA accounts were identified where there was a significant ‘pause’ in their messaging, lasting several months. Potentially, this might mean that they were dormant accounts that had been purchased by IRA operators. Alternatively, this break could have been used to suppress past political affiliations so the account could be ‘re-purposed’ by the operator. Another possibility is that the accounts lay dormant because IRA operational priorities had changed. For example, German accounts can only really be used to influence German opinion. As such, if IRA management or their political overseers had determined an alternative priority for their staff, then these accounts may have been less relevant, and the operators’ attention directed elsewhere.
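Dormant ‘pauses’ of this kind can be surfaced mechanically by scanning for long gaps between consecutive posting dates. The cut-off and the dates below are illustrative assumptions, not values from the dataset.

```python
from datetime import date

def dormant_gaps(dates, min_days=90):
    """Return (start, end, days) for every gap of at least min_days."""
    dates = sorted(dates)
    return [(a, b, (b - a).days)
            for a, b in zip(dates, dates[1:])
            if (b - a).days >= min_days]

# Invented posting dates showing two long silences.
posts = [date(2016, 1, 3), date(2016, 1, 9), date(2016, 6, 14),
         date(2016, 6, 20), date(2017, 9, 24)]

for start, end, days in dormant_gaps(posts):
    print(f"silent {days} days: {start} -> {end}")
```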

Figure 5 shows the graph for one German account displaying this pattern of behaviour. At some point during the break, 80 per cent of this account’s tweets were deleted by its operator. On 14 June 2016 it began tweeting again, but with seventy-five followers instead of the eight it had previously. The tweets appear to be just news headlines from Germany and around the world, most of which are retweets.

Previously, this account had posted anti-Alternative für Deutschland (AfD) statements, such as: ‘at least she is not populist stupid as the afd in #merkelmuststay’; ‘the people who choose the #afd are just sick #MerkelMustStay’; and, ‘#AfD is shit, AFD is shit #MerkelMustStay’. On 24 September, however, it suddenly started posting original pro-AfD tweets including: ‘I #chooseAfD, because I remember #asylum crisis and no longer trust #Merkel!’; ‘I #chooseAfD, because I want to live in the Federal Republic instead of Caliphate #Germany!’; and retweeting ‘Every German #Patriot will vote #AfD tomorrow! We need a political earthquake to save #Germany! #Btw17 #NoAntifa #NoIslam’.

One possible (unsubstantiated) explanation for this switch in position is that it was a direct response to Chancellor Merkel’s public statement on 14 September that the EU would not consider lifting sanctions on Russia.13 In the September elections AfD achieved what The Guardian called a ‘stunning success’; meanwhile, Merkel’s party had their worst result since World War II.14 It should be noted that although each of the eleven accounts posted pro-AfD tweets, their numbers were small compared to the pro-Merkel tweets they had previously shared. One reason for this could be that the German authorities were said to be on the lookout for Russian interference and operators were directed by their managers to be subtler than in the past.

Table 1: Regional summary of IRA behaviour

Region                        Total Accounts  Follower Fishing  FF %  Purchased Followers  Purchased %
Ukraine                       3               1                 33%   2                    67%
Iraq                          7               1                 14%   1                    14%
Italy                         8               0                 0%    0                    0%
United Kingdom                9               6                 67%   2                    22%
Germany                       11              4                 36%   1                    9%
United States                 14              12                86%   3                    21%
Famous Accounts (Unknown/US)  17              17                100%  4                    24%
Totals                        69              41                59%   13                   19%


Synchronicity analysis

As noted previously, IRA operators were required to run multiple accounts and to generate certain volumes of activity. This provides an opportunity for attributing accounts to Kremlin direction and control by analysing temporal sequences of messaging behaviour. The method for doing this involves ‘synchronicity analysis’, as it looks for simultaneous patterns of messaging as a way of identifying where multiple accounts are being controlled by the same author.
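A minimal rendering of this idea, under our own simplifying assumptions, is to bin each account’s posting times into hourly buckets and compare the resulting activity vectors; near-identical ‘pulses’ then show up as high cosine similarity. The posting hours below are invented for illustration.

```python
import math

def activity_vector(hours, n_bins=24):
    """Bin posting hours (0-23) into a fixed-length activity vector."""
    vec = [0] * n_bins
    for h in hours:
        vec[h % n_bins] += 1
    return vec

def cosine(u, v):
    """Cosine similarity between two activity vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Invented posting hours for three accounts: two pulsing together,
# one on a different schedule.
a = activity_vector([9, 9, 10, 14, 14, 15])
b = activity_vector([9, 10, 10, 14, 15, 15])
c = activity_vector([1, 2, 3, 20, 21, 22])

print(round(cosine(a, b), 2), round(cosine(a, c), 2))
```

Real analyses would bin at finer resolution and across days, but the principle is the same: accounts driven by one operator pulse together.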

The potential power and value of synchronicity analysis is demonstrated by examining the posting activities of eleven prolific IRA accounts adopting a pro-Merkel stance. On 21 July 2016, @Martin_S32 posted its first tweet of the day: ‘Mommy is the best #MerkelMustStay’. In total, it posted 265 tweets that day, most displaying very strong support for Angela Merkel. Initially it was assumed that this might be one of a small number of known Russian accounts adopting this stance, with others communicating more extreme right-wing messages. Further investigation revealed, however, that a series of IRA accounts were communicating messages supportive of the German Chancellor. Even more significantly, when the timings of these messages were plotted, they displayed a very similar pattern. Figure 6 plots these activity patterns on a time-line. The larger the bubble, the greater the volume of messages sent at that point in time by that account.

It can be observed that the key ‘pulses’ of messaging activity are very similar, but not identical. All of these accounts stopped posting tweets on 12 August and then were inactive for a minimum of six months. The inference drawn is that these accounts were probably being controlled by one author, possibly using a system such as TweetDeck. This would be consistent with the working arrangements at the Internet Research Agency described above.

In July and August 2016, these accounts sent hundreds of pro-Merkel tweets, often using the hashtag #MerkelMustStay. As with other IRA tweets, they frequently integrated an unrelated trending hashtag in order to push their own hashtag, for example #WorldElephantDay. These posts came during a period when Angela Merkel was under intense political pressure to step down as Chancellor, with Reuters noting that in January, 40 per cent of Germans thought she should resign over her refugee policy.15 This would be consistent with the Russian state’s known modus operandi for seeking to leverage political weakness to amplify social and political tensions.

[Chart: following, follower and tweet counts for the account @THOMAS_GERSTER plotted over time, July 2016 to September 2017]

Figure 5: Example of ‘narrative switching’

Germany was subject to a series of terrorist attacks between 18 July and 26 July 2016, in which fifteen people died. The so-called Islamic State claimed responsibility for two of the attacks and three of the attackers were asylum seekers.16 The Nice terror attack also occurred in July, sending shockwaves around Europe. Despite this sequence of events, these accounts were very supportive of the Merkel government’s refugee policy: ‘Lady #Merkel, Stick to your line! “The #refugees” are not dangerous mass but people! #MerkelMustStay’; and ‘The refugee policy of Mrs. Merkel shows that she is compassionate! #MerkelMustStay’.

Consistent with this line, all of the accounts disparaged the anti-immigration AfD, with multiple variations of ‘#AfD is shit, AfD is shit #MerkelMustStay’ being posted during a time when polls were suggesting AfD was enjoying some mainstream support (35 per cent CDU to 12 per cent AfD).17 The media has typically blamed Russia for supporting populist/anti-immigrant parties, but at a point where Merkel (and EU unity) was politically weakened, Russian controlled accounts were messaging support for her domestically. There were significant similarities between the content of tweets being posted by these accounts. Some were copied verbatim, and others had single word differences, usually a hashtag. Combined with the timings of the tweets, this points to the likelihood of a single person or team operating all of these accounts in a co-ordinated strategic fashion.

Synchronicity analysis revealed this group comprised forty-five accounts, thirty-four more than we had previously discovered. It also revealed another German team of twenty-five accounts operating in a very different manner, obsessed with Brexit, with three of their top five hashtags directly related to the Brexit vote. These posted a variety of messages that promoted the idea of leaving the EU, such as: ‘#Brexit will only have small consequences! You cannot act in the EU #BritainInOut #GoodbyeUK #RemainINEU #EUref’ and ‘Do not let yourself be manipulated with fear. #Brexit #BrexitOrNot #GoodbyeUK #BritainInOut’. Confusingly, these same accounts also posted anti-UK messages in apparent offence that the British public had chosen to leave, tweeting ‘#britaininout #goodbyeUK Let them drown on their island now’ and ‘#britaininout #goodbyeUK We do not need snobs’.

[Chart: tweet-volume bubbles for the eleven pro-Merkel accounts plotted on a shared time-line, by account ID]

Figure 6: Pro-Merkel accounts schedule

In direct opposition to the pro-Merkel accounts, the latter group invoked anti-refugee sentiments: ‘#britaininout #goodbyeUK You flee like #Refugees’ and ‘#britaininout #goodbyeUK Take refugees with you!’. They also blamed Merkel’s migration policy for increasing terror in Europe: ‘Thank you, Mrs Merkel, that citizens now have to be afraid of terror in Europe. #stopptTerror’, and spouted populist rhetoric: ‘Why can not our warships send these #migrants back immediately? #stopptTerror’. The Russians were amplifying both sides of the political argument simultaneously, trying to increase the social fissures associated with them.

When the high profile (infamous) Internet Research Agency accounts were tested in this way, only two (@southlonestar and @southlonestar2) could be linked through synchronicity analysis (the latter appears to have been set up as a 'backup' in case Twitter suspended @southlonestar). One possible implication of this is that if an account achieved a certain scale of influence, then operators focussed on running that one persona. Alternatively, it may simply be an artefact of higher skill operators being in charge of these accounts.

Rather than just retrospective linkage, synchronicity analysis has two potential uses going forward. First, if there is an identified Kremlin controlled account, then treating temporal pulses of messaging activity as a behavioural signature might enable identification of other 'accounts of interest'. Equally possible is that, with a number of accounts, the presence of similar pulsing sequences can be interpreted as a potential indicator of a common controller.
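The mechanics of treating temporal pulses as a behavioural signature can be illustrated with a short sketch. This is not the authors' implementation; it simply assumes each account's tweet timestamps are bucketed into fixed windows, with pairs scored by the Jaccard overlap of their active windows:

```python
from datetime import datetime

def activity_signature(timestamps, bucket_seconds=60):
    """Reduce an account's tweet timestamps to a set of active time buckets."""
    return {int(ts.timestamp()) // bucket_seconds for ts in timestamps}

def synchronicity(sig_a, sig_b):
    """Jaccard overlap of two accounts' active buckets (0.0 to 1.0)."""
    if not sig_a or not sig_b:
        return 0.0
    return len(sig_a & sig_b) / len(sig_a | sig_b)

def linked_pairs(accounts, threshold=0.5):
    """Return account pairs whose posting pulses overlap at or above threshold."""
    sigs = {name: activity_signature(ts) for name, ts in accounts.items()}
    names = sorted(sigs)
    return [
        (a, b, synchronicity(sigs[a], sigs[b]))
        for i, a in enumerate(names)
        for b in names[i + 1:]
        if synchronicity(sigs[a], sigs[b]) >= threshold
    ]
```

The bucket width and threshold here are arbitrary placeholders; in practice both would need tuning against known co-ordinated accounts.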

To test the potential of these methods we applied them to all 2,848 Twitter accounts featuring in the full Twitter dataset. The results are represented graphically in Figure 7, where the nodes represent IRA Twitter accounts, shaded by clusters. Some clusters have had their shapes changed to make them easier to differentiate.

Representing the data in this way, the closeness in proximity between two connected accounts indicates how similar their tweeting activity is. Thus, the large cluster in the centre of the graphic (box A) represents 449 accounts that have their region predominantly set to Russia and tweet primarily in Russian. This is significant in that it reflects the pre-eminent strategic objective of the Russian state, in terms of the perceived importance of managing domestic public opinion and in the 'near abroad'. This provides an important corrective to much of the public debate that has taken place in the West about Kremlin disinformation campaigns, which has worried predominantly about impacts in these contexts. But this focus neglects just how much of the IRA's effort was directed towards influencing the views and perceptions of Russian speakers.

In addition to the main cluster, the linked accounts in the top right-hand corner, constituting the second largest group (box B), look to be those associated with the IRA's American department. Intriguingly, in terms of their temporal activity profiles and message contents, these could be loosely connected to a much smaller group of Russian facing accounts. The latter were probably engaged in covering similar topics, but for a different audience.

Around the edges of the diagram are a large number of smaller 'satellite' clusters. The accounts here can often be distinguished on the basis of them using different languages, or focussing upon particular social identity politics (such as Black Lives Matter). Box C shows a cluster of forty-five German accounts, which include the pro-Merkel accounts we talked about earlier. In total, 2,031 or 71 per cent of the IRA Twitter accounts in the dataset could be linked to at least one other, comprising 119 separate clusters. This affords a clear sense of how multiple accounts were being employed by IRA operators to push key messages in a co-ordinated fashion.
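Grouping 2,031 linked accounts into 119 separate clusters implies extracting connected components from the pairwise link graph. A hypothetical sketch of that step (the function names and union-find structure are our own, not the authors'):

```python
from collections import defaultdict

def cluster_accounts(links, all_accounts):
    """Group accounts into clusters: connected components of the
    pairwise link graph. Accounts linked to no other account
    (singletons) are excluded from the returned clusters."""
    parent = {a: a for a in all_accounts}

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in links:
        parent[find(a)] = find(b)

    components = defaultdict(set)
    for a in all_accounts:
        components[find(a)].add(a)
    return [c for c in components.values() if len(c) > 1]
```

Fed the output of a pairwise synchronicity scan, this yields cluster counts and sizes directly; the layout and shading in a figure such as Figure 7 would then be a separate visualisation step.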

The attribution challenge

Detailed forensic analysis methodologies of the kind outlined above are necessary for establishing an evidence base for detecting current and future activities of a similar nature. This is important given the increasing difficulties associated with confidently attributing accounts involved in communicating disinformation to Kremlin direction and control, as they have become increasingly sophisticated. Not all disinformation comes from overseas; much of it is authored by citizens resident in Western countries. When such sources are mis-attributed, it is especially problematic, as it negatively impacts public confidence in the authorities involved and can be used as propaganda against them.


Such challenges have been exacerbated by a general recalcitrance on the part of the principal social media platforms to admit proactively that malign actors are operating on their systems. Whilst they have responded to these issues when pushed by governments, their responses in terms of increasing restrictions to application programming interface (API) access and related interventions have, somewhat ironically, also made it harder for independent researchers to assist in detecting and attributing disinforming communications to Kremlin backed accounts. There is a danger that the platform providers are manoeuvring into an arrangement where only they are positioned routinely to detect bad actors.

Further challenges to the work of attributing disinformation have resulted from the strategy pursued when a suspected Kremlin backed account (or accounts) is detected. Typically, steps are taken either to 'take down' the account from the platform, or to 'expose' its presence through publicising its suspicious activities. The issue is that such approaches are of limited effectiveness in altering the dynamics of the disinformation ecosystem. Moreover, this has enabled the IRA and other Kremlin backed units to learn how they are being detected and to adapt their methods accordingly, the consequence being that some of their approaches have become more sophisticated and harder to detect. Far more impactive, therefore, in terms of leveraging disruption on the capacity and capability to spread disinformation, is a strategy that seeks to build an understanding of a disinformation network over time, and implements interventions against multiple nodes simultaneously. This is how police investigators have learned to disrupt criminal networks in offline spaces to maximise impact.

Conclusion

In his coruscating account of life in Russia and the state's normalised use of 'soft facts' to convey multiple and shifting 'truths' to its citizens, Peter Pomerantsev articulates how the aggregating effect is a profound suspension of belief.18 Unlike classic propaganda, the design is not intended to seduce people to invest in a particular 'truth', but rather to render them in a state of profound and radical doubt about what to believe, a state of epistemic anarchy.

Figure 7: Network graph of synchronicity analysis on all IRA accounts

One of the most striking aspects of the growing number of analyses of the IRA's activity, especially around the 2016 US presidential election, is just how extensive and varied it was. This notwithstanding, it is equally vital to recognise that similar efforts have been deployed across a number of other situations and settings. With this in mind, what is required now is a series of detailed and forensic analyses, of the kind outlined herein, to distil the key operational tactics and techniques that were being used. If this can be accomplished, then it enhances capacity and capability to detect similar attempts going forward, and to constrain their efficacy.

The particular value of the behavioural models discussed is that they document fairly unusual patterns of activity that function as 'signatures' or 'tells' that an account is possibly being run by an operator with specific intents. The potential is similar to how police detectives use behavioural signatures to profile repeat offenders: their digital equivalents can be used to detect malign influencers online. A benefit of the kinds of diagnostic data used to ascertain these behaviours is that they are largely language agnostic and do not rely upon an ability to read and interpret the contents of what is being communicated.

At this precise moment, it is difficult to know how worried we should collectively be about disinformation communication. There are indications that, despite the efforts of governments, intelligence agencies and platform providers, misinformation and disinformation are becoming an endemic feature of the modern media ecosystem. And whilst there is certainly something objectionable about attempted behaviour modification in respect of democratic processes and outcomes, there is actually remarkably little robust evidence that such disinforming communications have a discernible, measurable impact upon how the majority of people think, feel or act. Messaging of this kind may be better at 'channelling' people's pre-existing values and opinions than at changing them. Perhaps then it is more appropriate to argue that disinformation has more impact in shaping the issues we collectively think about than what we individually think. That is, its pernicious influence resides in framing and agenda setting what troubles come to be defined as key public policy problems.

Notes

1 Y. Benkler, R. Faris and H. Roberts, Network Propaganda: Manipulation, Disinformation and Radicalization in American Politics, New York, Oxford University Press, 2018.

2 RBC Magazine, 'Investigation of RBC: how the "factory of trolls" worked in the elections in the United States', 17 October 2017; https://www.rbc.ru/magazine/2017/11/59e0c17d9a79470e05a9e6c1; Radio Free Europe, 'One professional Russian troll tells all', 25 March 2015; https://www.rferl.org/a/how-to-guide-russian-trolling-trolls/26919999.html (both accessed 9 April 2019).

3 The Higher Learning, 'Russia has a troll army that is trying to mold public opinion on internet news sites', 4 June 2014; http://thehigherlearning.com/2014/06/04/russia-has-a-troll-army-that-is-trying-to-mold-public-opinion-on-internet-news-sites/ (accessed 9 April 2019).

4 Radio Free Europe, 'One professional Russian'.

5 Yahoo!, 'Trolling for Putin: Russia's information war explained', 5 April 2015; https://www.yahoo.com/news/trolling-putin-russias-information-war-explained-063716887.html (accessed 9 April 2019).

6 Radio Free Europe, 'One professional Russian'.

7 Ibid.

8 BuzzFeed, 'Documents show how Russia's troll army hit America', 2 June 2014; https://www.buzzfeednews.com/article/maxseddon/documents-show-how-russias-troll-army-hit-america (accessed 9 April 2019); Radio Free Europe, 'One professional Russian'.

9 RBC Magazine, 'Investigation of RBC'.

10 Yahoo!, 'Trolling for Putin'.

11 KickFactory, 'The average Twitter user now has 707 followers', 23 June 2016; https://kickfactory.com/blog/average-twitter-followers-updated-2016/ (accessed 9 April 2019).

12 IRA accounts tested: TEN_GOP, Blackmatterus, Blacknewsoutlet, Blacktolive, Bleepthepolice, Crystal1johnson, Jeblary2016, Jenn_abrams, Lgbtunitedcom, Muslims_in_usa, Pamela_moore13, Southlonestar, Thefoundingson, Tpartynews, Trayneshacole, Usa_gunslinger, Wokeluisa.

13 Yahoo!, 'EU firm on Russia sanctions over Ukraine: Merkel', 14 September 2018; https://sg.news.yahoo.com/eu-firm-russia-sanctions-over-ukraine-merkel-153248229.html (accessed 9 April 2019).

14 C. Mudde, 'What the stunning success of AfD means for Germany and Europe', The Guardian, 24 September 2017; https://www.theguardian.com/commentisfree/2017/sep/24/germany-elections-afd-europe-immigration-merkel-radical-right (accessed 9 April 2019).

15 Reuters, 'Forty percent of Germans say Merkel should resign over refugee policy: poll', 29 January 2016; https://uk.reuters.com/article/us-europe-migrants-germany-merkel-idUKKCN0V70KM (accessed 9 April 2019).

16 Reuters, 'Germany's far-right AfD claws back some support after attacks', 31 July 2016; https://www.reuters.com/article/us-germany-afd-idUSKCN10B0FR?feedType=RSS&feedName=worldNews (accessed 9 April 2019).

17 Ibid.

18 P. Pomerantsev, Nothing is True and Everything is Possible: The Surreal Heart of the New Russia, London, Faber & Faber, 2014.
