
MPs accuse tech giants of failing to answer ‘basic questions’ on disinformation

Representatives from Facebook, Google and Twitter were appearing before a Digital, Culture, Media and Sport sub-committee.


MPs have accused tech giants Facebook, Google and Twitter of being “unable to answer basic questions” about disinformation during appearances before a select committee.

Representatives of the firms were appearing before the Digital, Culture, Media and Sport (DCMS) sub-committee on online harms and disinformation, with particular focus on the spread of false narratives around the coronavirus outbreak.

Committee chairman Julian Knight said MPs would be writing to each of the tech giants to voice their “displeasure” at a “lack of answers” given on content moderation on their platforms.

The internet giants’ appearance comes as they, governments and other organisations continue trying to stop the spread of disinformation linked to Covid-19, which has included fake cures touted online and attacks on phone masts after a debunked conspiracy theory claiming 5G technology was linked to the outbreak.

Facebook’s UK public policy manager Richard Earley was criticised for not being able to confirm the number of content moderators the firm currently had reviewing explicit material flagged to the platform.

Conservative MP Damian Hinds raised the issue following a report from the NSPCC last week which highlighted concerns about children’s safety online, suggesting the Covid-19 lockdown had led to staff cuts at social media platforms and therefore fewer reviewers able to find and remove child exploitation material.

Mr Earley said Facebook had taken steps to “minimise any negative impact on our ability to review content”, including moving responsibility for the most serious content review subjects – such as child abuse material and self-harm content – to its available full-time employees and putting in place systems that allowed other contracted moderators to work from home.

He admitted, however, that some volunteers were also being used for moderation.

But when pressed by the committee on whether Facebook had the same number of moderators working as before the pandemic, or fewer, Mr Earley said he was unable to answer because the situation was changing each day.

Mr Hinds urged Facebook to respond in writing on the issue, while Mr Knight said it appeared that none of the witnesses had been supplied with “genuine, hard information on how you are specifically going about tackling Covid disinformation”.

Andy Burrows, NSPCC associate head of child safety online policy, said Facebook’s admission was “deeply troubling”.

“Although no-one could have foreseen these circumstances, the reality is platforms for years have failed to protect children from abuse and harmful content,” he said.

“This goes to show the urgent need for the duty of care legislation which would force tech firms to protect children on their sites and tackle years of industry inaction head-on.”

Following a number of fractious exchanges between MPs on the committee and the tech firms’ representatives, Mr Knight said he had not heard “any facts” from Facebook, while Twitter’s Katy Minshall was accused of using “pre-prepared” remarks rather than attempting to answer questions.

Following an exchange between Google’s public policy and government relations manager, Alina Dimofte, and SNP MP John Nicolson on online gambling, Mr Knight expressed his frustration at the number of follow-up letters the committee would need to write because the three witnesses had been “seemingly unable to answer quite basic questions”.

“We will be writing to all the organisations and frankly we will be expressing our displeasure at the quality of the answers – well a lack of answers – that we’ve received today and will be seeking further clarity,” the committee chairman said.
