Guernsey Press

Twitter accounts linked to Russian ‘troll factory’ active in Brexit referendum

The social media site said that the 49 accounts received ‘very low levels of engagement’ from users.


Twitter has found 49 accounts linked to a notorious Russian “troll factory” which were sending out messages about the EU referendum during the 2016 campaign.

The company’s UK head of public policy Nick Pickles told a House of Commons committee that the accounts linked to the St Petersburg-based Internet Research Agency amounted to less than 0.005% of those tweeting about the referendum, and received “very low levels of engagement” from other users.

The announcement came during a hearing of the Commons Digital, Culture, Media and Sport Committee’s inquiry into fake news, which took evidence from the internet companies YouTube, Facebook, Google and Twitter in Washington DC.

YouTube told the cross-party committee it had found no evidence of Russian sources using ads on its video-sharing service to attempt to interfere in the 2016 referendum.

Meanwhile, Facebook said it had taken down “thousands” of fake accounts in the run-up to 2017 elections in the UK, France and Germany – although they were not necessarily aimed at spreading false information.

Mr Pickles told the committee that Twitter had identified “a very small number of suspected Internet Research Agency-linked accounts”.

“Forty-nine such accounts were active during the referendum campaign, which represents less than 0.005% of the total number of accounts that tweeted about the referendum,” he said.

“Those accounts collectively posted 942 tweets, representing less than 0.02% of the total tweets posted about the referendum during the campaign. Those tweets collectively were retweeted 461 times and liked 637 times.”

This amounted to fewer than 10 retweets and 13 likes per account, which was “a very low level of engagement”, he said.

Committee chairman Damian Collins has criticised Facebook and Twitter’s previous responses to its inquiry into fake news (Chris McAndrew/UK Parliament/PA)

YouTube had previously informed a US Senate committee of 18 channels it had discovered which were linked to the Internet Research Agency “content farm”.

In September, Facebook bowed to pressure and provided the US committee with the contents of 3,000 ads bought by a Russian agency.

YouTube’s Juniper Downs gives evidence to the Commons Digital, Culture, Media and Sport Committee in Washington DC (www.parliamentlive.tv/PA)

Ms Downs said: “We have conducted a thorough investigation around the Brexit referendum and found no evidence of interference.

“We looked at all advertisements with any connection to Russia and we found no evidence of our services being used to interfere in the Brexit referendum and we are happy to co-operate with any further efforts.”

Facebook’s head of global policy management, Monika Bickert, told the MPs that it had a strict policy of people signing up using their real names and took action to tackle fake profiles.

“In the run-up to the French election, the German election, the UK election we were using our technical tools to remove thousands of fake accounts,” she said.

“Not that those were necessarily related to spreading disinformation or to spreading information about the election, but they were fake accounts and we are using those technical tools to reduce the chance that they might be used to spread disinformation.”

Committee chairman Damian Collins last year criticised Facebook and Twitter over their replies to the committee’s investigation.

Facebook’s Monika Bickert giving evidence to MPs (www.parliamentlive.tv/PA)

He gave the example of bogus cancer cures found by patients searching the internet for information about their conditions.

Richard Gingras, Google’s vice president of news, said the company was “in the trust business” and felt “an extraordinary sense of responsibility” about the reliability of information highlighted by its search engine and news app.

“The loyalty of our users is based on their continued trust in us,” he said. “To the extent they don’t trust us, they will stop using our products and our business will collapse.

“We believe strongly in having an effective democracy, we believe strongly in supporting free expression and supporting a sustainable high-quality journalism eco-system to make sure that quality information is out there.”

Mr Gingras acknowledged that Google’s autocomplete function, which suggests possible search phrases as users type, sometimes produces offensive suggestions in response to phrases like “Jews are…” – in part due to “malicious actors” seeking to game the system.

But he said Google was constantly working to correct them.

Ms Downs recognised concerns over YouTube’s “up next” feature, which has come under attack for suggesting inappropriate videos to users.

“We recognise that there is work to do on our recommendation engine in terms of making sure we are surfacing the right news content to our users,” she said.
