Google action on abuse ‘not motivated by ad revenue loss after bad news stories’
Kristie Canegallo, vice president and global lead for the trust and safety organisation, said ensuring a safe space for children online is a priority.
A Google executive has said that a loss of ad revenue following a series of negative news stories was not what motivated the internet giant to take action against child sexual abuse on its platforms.
Kristie Canegallo, vice president and global lead for the trust and safety organisation at the tech firm, said ensuring a safe space for children online is a “corporate priority”.
Giving evidence to the Independent Inquiry into Child Sexual Abuse (IICSA), Ms Canegallo said Google spends hundreds of millions of dollars a year fighting abuse, including child sexual exploitation.
The inquiry, sitting in London, heard reference to a number of recent news stories reporting instances of abuse on the Google-owned YouTube platform.
Following one such story last year, 37 videos that had originally been livestreams were reported to Google and, of those, 22 were removed for violating Google’s child safety policies, the inquiry was told.
Ms Canegallo said since then there had been improvements in its classifier technology – used to capture and report inappropriate comments.
Jacqueline Carey, counsel to the inquiry, asked: “Why does it take an exposé like this to bring about the improvements that you’ve just told us about?”
Ms Canegallo said work was already ongoing to make the technology better.
She told the inquiry: “The improvements in our comment classifier was not in response to this article. As I mentioned, throughout 2018 our YouTube team has been investing in how to improve the technology in this space and so it was the result of that ongoing effort.”
More recently, Google terminated 360 YouTube accounts after paedophiles were reported to have left comments on videos of young people, prompting big companies to pull their ads from the platform.
Ms Carey asked: “Did the fact that companies such as Fortnite, Disney, Nestlé withdrew their ads lead to a loss of revenue to Google?”
Ms Canegallo said: “I would imagine that it did, yes.”
Ms Carey went on: “Is that in any way a motivator, that loss of revenue, a motivator for being seen now to be taking this more seriously than perhaps you should before?”
But Ms Canegallo reiterated her belief that work had already been ongoing to tackle such issues before the news story broke.
She told the inquiry: “Again, the work that the YouTube team has been doing throughout 2018, some of which has come to fruition recently, is the result of continued effort on the part of our team that was not prompted by any one article or news enquiry.”
She said resources are “not an issue” when it comes to the “child safety space”, telling the inquiry Google has around 10,000 people working on tackling “unacceptable content” and more than 400 employees focused specifically on child safety issues.
Ms Canegallo also told the inquiry Google takes the user “at trust”, requiring no identification, when they sign up to YouTube and give a date of birth that makes them at least 13 years old.
But she said Google does use technology to look for signals of underage users, and told the hearing that thousands of accounts are terminated each week for failing its age verification process.
The IICSA is conducting the second phase of its investigation into how the internet is used to facilitate child sexual abuse in England and Wales, through acts such as grooming, sharing indecent images and livestreaming abuse.
Earlier, the inquiry heard from Microsoft UK’s director of corporate, external and legal affairs Hugh Milward, who said the firm had closed between 100 and 400 UK accounts a year in connection with suspected child sexual abuse.