‘No Facebook users reported Christchurch massacre video during live stream’
The social media company said the video was viewed fewer than 200 times during its live broadcast.
No-one reported the video of the Christchurch terror attack while it was being streamed live, Facebook has said.
It was 29 minutes after the video had started – and 12 minutes after it had ended – before the first user flagged up the footage, the social media giant said.
The company earlier revealed that it had removed 1.5 million videos of the attack worldwide in the 24 hours after the shootings, 1.2 million of which were blocked at upload.
In a blog post on Tuesday, Chris Sonderby, vice president and deputy general counsel at Facebook, said the video was viewed fewer than 200 times during its live broadcast.
“No users reported the video during the live broadcast,” he added.
“Including the views during the live broadcast, the video was viewed about 4,000 times in total before being removed from Facebook.
“The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended.”
Mr Sonderby said Facebook was “working around the clock” to prevent the video from appearing on its site.
Meanwhile, the Global Internet Forum to Counter Terrorism, formed by tech giants Facebook, Microsoft, Twitter and YouTube in 2017 to tackle the spread of terrorism online, said more than 800 different versions of the video had been added to a shared database.
The group said the “digital fingerprints” of visually distinct versions were included in a bid to uncover and remove edited copies that aim to get around existing detection technology.
“This incident highlights the importance of industry co-operation regarding the range of terrorists and violent extremists operating online,” it said.
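The “digital fingerprints” described here are generally understood to be perceptual hashes: compact signatures that stay nearly identical when a video is re-encoded, cropped or brightness-shifted, so edited copies can be matched without byte-for-byte comparison. The forum has not published its actual scheme, so the Python sketch below is only a toy illustration: the “average hash”, the synthetic 8x8 frame and the brightness-shift edit are all assumptions chosen to show why near-duplicate copies can still be matched.

```python
def average_hash(frame):
    """Hash an 8x8 greyscale frame (8 rows of 8 ints, 0-255) into 64 bits."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: 1 if the pixel is brighter than the frame's mean.
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Count the bits on which two fingerprints differ."""
    return bin(h1 ^ h2).count("1")

# A synthetic "frame" standing in for a real video still.
original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
# A lightly edited copy, e.g. a brightness shift from re-encoding.
edited = [[min(p + 10, 255) for p in row] for row in original]

distance = hamming_distance(average_hash(original), average_hash(edited))
print(distance)  # a small distance suggests the same underlying content
```

In a real pipeline, many frames per video would be hashed and an upload flagged when its fingerprints fall within a small Hamming distance of a database entry; the principle is this tolerance to small visual edits, which exact byte matching lacks.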
New Zealand Prime Minister Jacinda Ardern told the country’s parliament: “There is no question that ideas and language of division and hate have existed for decades, but their form of distribution, the tools of organisation, they are new.
“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published.
“They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”
Reacting to a tweet from YouTube saying the video-sharing service was working to remove the footage, UK Home Secretary Sajid Javid said: “You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms. Take some ownership. Enough is enough.”
Damian Collins, Tory chairman of the Digital, Culture, Media and Sport Select Committee, called for a review into how copies of the footage were shared and “why more effective action wasn’t taken to remove them”.
And Downing Street said social media companies needed to act “more quickly” to remove terrorist content.