Did Russia hack the referendum?

By Ben Pearson.

On January 22nd, WebRoots Democracy hosted a public seminar to explore claims about Russian interference in the EU referendum. To assess the evidence and shed some light on how interference could have taken place, we were joined by Sky’s technology correspondent Alexander J Martin and UCL’s Dr Gianluca Stringhini. Areeq Chowdhury opened the event, giving some background to the issue and reading out the Russian Embassy’s rebuttal (read in full).

Alexander J Martin

Alex began with a recap of the events leading up to the 2016 US presidential election. He pointed to official statements by US intelligence agencies alleging that a multifaceted interference project took place to undermine Hillary Clinton, advance the Trump campaign and sow discord and polarisation in the political discourse.


This took the form of:

  • Hacking into the email accounts of the Democratic National Committee and releasing potentially compromising emails through WikiLeaks
  • Astroturfing: creating and funding fake grassroots movements through social media
  • Using social media bots to spread and amplify fake, highly partisan, defamatory or confrontational content
  • Hacking into electronic voting systems, though seemingly without altering any of the information they held
  • Creating and spreading heavily partisan or fabricated news content through state-backed outlets like RT and Sputnik

Whilst plenty of accusations exist, Alex pointed out that interference in the US was clearly on a vastly different scale to what is alleged to have happened here in the UK.

Intelligence and its limits

Protecting sources and guarding national security information is a necessary and understandable part of intelligence work. As a result, the public are not given a chance to assess the evidence; all they can do is decide whether to trust or distrust the intelligence agencies. The shadow of the infamous “dodgy dossier” that preceded the Iraq War looms large over the conversation in this context.

The allegations

The first allegation relates to election financing. The Electoral Commission are currently investigating the financing behind both the official Vote Leave and the unofficial Leave.EU campaigns. OpenDemocracy have done extensive investigations into the dark money behind the DUP’s leave campaign, and Carole Cadwalladr won a British Journalism Award for her work in this area.

The second, and perhaps more pertinent, allegation relates to information warfare. The Internet Research Agency, many of whose employees were recently indicted in Robert Mueller’s Special Counsel investigation, is alleged to have mobilised its considerable resources behind a social media campaign of false information and trolling. Alex described how Twitter suspended a number of accounts it said were operatives in the campaign, but would not disclose information about the accounts, the tweets in question or its methodology for identifying them.

A fake “Tennessee GOP” Twitter account was later found to be a Russian creation. It had 136k followers and drew thousands of retweets, some from top Trump campaign staffers.

The evidence base

Referencing research by City University and a joint study by Swansea and Berkeley universities, Alex showed that a large-scale influencing campaign did take place, with many thousands of accounts amplifying hyperpartisan content. However, much of the content in question was deleted shortly afterwards, whether by Twitter or by the account-holders themselves. Given that disinformation campaigns can originate from a number of other online communities, such as 4chan and the subreddit r/The_Donald, the evidence linking the operation directly to the Kremlin remains unclear.

Further questions remain about the extent to which it could really have influenced the final outcome of the referendum.

Fake news?

Blaming unpalatable events on “Russian interference” or dismissing criticism as “fake news” is highly problematic, and allows powerful actors to act with impunity.

Alex’s overall assessment was that, without a wide-ranging public investigation like that seen in the US and absolute transparency from social media companies, the evidence linking the Kremlin to this social media activity remains a little too murky.

Dr Gianluca Stringhini

Dr Stringhini approaches the issue with a strong background in measuring and analysing malicious activity online. His talk focused on how content spreads from fringe online communities to more mainstream social media platforms like Twitter and Facebook.

Online activity and real-world impact

He pointed to two recent stories which neatly demonstrate how the orchestrated spread of disinformation from these fringe communities can have tangible consequences in the real world. In the infamous #pizzagate incident, a gunman opened fire in a pizza restaurant in Washington, DC after reading a story claiming it was serving as the premises for a Democratic Party child abuse ring. The story originated on 4chan, was spread onto social media, and was retweeted by notable figures on the right.

Michael Flynn pizzagate tweet

In another example, the aftermath of a mass shooting in Texas saw congressman Vicente Gonzalez name the shooter as Sam Hyde live on CNN. The false name had been deliberately seeded across the web by 4chan users.

Modelling the spread of disinformation

Dr Stringhini argues that studies focused solely on Twitter or Facebook are very limited and don’t capture the full picture. He has modelled the interconnectivity between online services and platforms, and how those connections are exploited by malicious actors to spread disinformation and hyper-partisan content.

These actors have been using bots, sock puppets and trolls to orchestrate attacks on those with opposing politics. Whereas this activity used to be low-key, in the current political environment the consequences can be severe.
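To give a flavour of what this kind of cross-platform modelling involves, here is a minimal, illustrative Python sketch. Everything in it is made up for illustration (the platforms, URLs and first-appearance counting heuristic are assumptions, not Dr Stringhini’s actual method, which relies on more sophisticated statistical models). It traces when the same link surfaces in different communities and counts how often content appears on one platform before another.

```python
from collections import defaultdict

# Toy records of (platform, timestamp, url) sightings. In a real study these
# would come from large crawls of Twitter, Reddit, 4chan, etc.; everything
# here is invented purely for illustration.
sightings = [
    ("4chan",   1, "http://example.com/story-a"),
    ("reddit",  3, "http://example.com/story-a"),
    ("twitter", 5, "http://example.com/story-a"),
    ("reddit",  2, "http://example.com/story-b"),
    ("twitter", 4, "http://example.com/story-b"),
]

# Group sightings of each URL and order them by time of appearance.
by_url = defaultdict(list)
for platform, ts, url in sightings:
    by_url[url].append((ts, platform))

# Count how often content seen on platform A later turns up on platform B:
# a crude proxy for cross-platform influence (published work in this area
# uses statistical models rather than raw counts like this).
influence = defaultdict(int)
for events in by_url.values():
    events.sort()
    for i, (_, src) in enumerate(events):
        for _, dst in events[i + 1:]:
            if src != dst:
                influence[(src, dst)] += 1

for (src, dst), count in sorted(influence.items()):
    print(f"{src} -> {dst}: {count} shared URL(s) appeared on {src} first")
```

On this toy data the output shows, for example, that links tended to appear on 4chan before reaching Twitter, which is the kind of directional relationship the real research quantifies at scale.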

His research also shows that:

  • Alternative news spreads more quickly through Twitter than mainstream news
  • Fringe news sites like Infowars and Breitbart are gaining ascendancy  
  • 30% of all content on the mainstream news subreddit (r/news) comes from low-membership fringe communities
  • Articles are frequently decontextualised, repurposed and spread through these fringe communities, before moving onto social media and into the mainstream

The Russian connection?

The Russian efforts at spreading disinformation and polarising content are an important part of this information ecosystem. They work alongside Western actors and trolls in these fringe communities, and push for the same agenda. However, more research and data would be needed to model this relationship effectively.

Further discussion

Tolerating fake news?

Alex put forward the idea that perhaps we should just try to tolerate a certain amount of fake news as an inevitable result of the system we currently have. He questioned how much real impact – pizzagate and Thomas Mair aside – the abundance of polarising and hyperpartisan coverage has on the real world, given how wacky some of the claims are. Should we perhaps entrust the voting public with a little more rationality?

This was met with some resistance from the audience who pointed out that high-profile incidents distract us from more insidious processes which polarise society and poison the nature of political discourse.

Fine margins

Alex argued that the infamous £350m figure wouldn’t have had an impact, but again this was contentious. The margins of victory in both the US presidential election and the EU referendum were incredibly fine, and it therefore wouldn’t have taken much to tip the balance.

Nothing new?

It was put forward that interference in elections has been happening all over the world for a long time, and that the US had a long history of election meddling.

What next?

According to Dr Stringhini, what is needed are content-checking tools which a) can show the reputation and provenance of content, and b) are appealing, user-friendly and effective. Building these will take a lot of resources and engineers, but social media companies have both in abundance. It does, however, carry the danger that independent but legitimate blogs are downgraded or pushed down the rankings.
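As a very rough illustration of the idea, the hypothetical sketch below combines a source-reputation score with provenance information (where a link was first seen) into a single trust score. Every domain, score and weighting in it is an invented assumption, not a description of any existing tool or of Dr Stringhini’s proposal.

```python
from urllib.parse import urlparse

# Hypothetical reputation scores for known domains (0.0 = untrusted, 1.0 = trusted).
SOURCE_REPUTATION = {
    "bbc.co.uk": 0.9,
    "infowars.com": 0.2,
}

# Communities treated as low-provenance origins in this toy model.
FRINGE_ORIGINS = {"4chan", "r/The_Donald"}

def content_score(url: str, first_seen_on: str) -> float:
    """Return a rough 0-1 trust score for a shared link."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    reputation = SOURCE_REPUTATION.get(domain, 0.5)   # unknown domains default to neutral
    provenance = 0.2 if first_seen_on in FRINGE_ORIGINS else 0.8
    return 0.6 * reputation + 0.4 * provenance        # arbitrary weighting

print(content_score("https://www.bbc.co.uk/news/some-story", "twitter"))  # ~0.86
print(content_score("https://infowars.com/some-story", "4chan"))          # ~0.20
```

Even in this toy version the danger noted above is visible: a legitimate independent blog, absent from the reputation list, would default to a middling score and sit below established outlets in any ranking built on it.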

Alex argued that regulation is a problematic issue, as it involves assuming absolute ownership of the truth. The measures taken to regulate content could themselves entail an abuse of power, and rob the public of its ability to discern between falsehood and reality.

We’d like to thank Dr Stringhini and Alex Martin for their valuable input, and the audience for attending. Keep an eye on our social media channels for information about future events.

Ben Pearson is a volunteer at WebRoots Democracy.
