Search engines can be ‘one-click gateways’ to harmful content, Ofcom warns

The regulator analysed thousands of search results based on terms linked to self-harm and suicide as part of a new study of harmful online content.


Content that glorifies or celebrates self-harm and suicide is widely available via internet search engines, Ofcom has warned.

The regulator said research carried out on its behalf by the Network Contagion Research Institute found that one in every five links (22%) in the search results it analysed led to content which glorified, or offered instruction about, self-harm, suicide or eating disorders.

To generate their results, the researchers entered common search terms linked to self-injury, as well as more cryptic phrases used by online communities to conceal their true meaning, and analysed more than 37,000 result links across five major search engines – Google, Microsoft Bing, DuckDuckGo, Yahoo and AOL.

According to the research, image searches returned the highest proportion of harmful results, with half (50%) considered extreme.

Ofcom noted that previous research has shown images are harder for detection algorithms to filter out, as it can be difficult to distinguish between visuals glorifying self-harm and those shared in a medical or recovery context.

However, the research did note that help, support and educational content was available and signposted – with one in five search results linking to content focused on getting people help.

Ofcom also acknowledged that some search engines offer safety measures, such as a safe search mode which restricts inappropriate content, but these were not used by the researchers in the study.

The regulator warned that search engines must act to ensure they are ready to fulfil their requirements under the Online Safety Act, which legally requires internet companies to protect children from harmful content.

Ofcom’s online safety policy development director, Almudena Lara, said: “Search engines are often the starting point for people’s online experience, and we’re concerned they can act as one-click gateways to seriously harmful self-injury content.

“Search services need to understand their potential risks and the effectiveness of their protection measures – particularly for keeping children safe online – ahead of our wide-ranging consultation due in spring.”

In response, a Google spokesperson said: “We are fully committed to keeping people safe online. Ofcom’s study does not reflect the safeguards that we have in place on Google Search and references terms that are rarely used on Search.

“Our SafeSearch feature, which filters harmful and shocking search results, is on by default for users under 18, whilst the SafeSearch blur setting – a feature which blurs explicit imagery, such as self-harm content – is on by default for all accounts.

“We also work closely with expert organisations and charities to ensure that when people come to Google Search for information about suicide, self-harm or eating disorders, crisis support resource panels appear at the top of the page.”

A spokesperson for Microsoft said: “Microsoft is deeply committed to creating safe experiences online, and we take seriously the responsibility to protect our users, particularly children, from harmful content and conduct online.

“We are mindful of our heightened responsibilities as a major technology company and will continue to work with Ofcom to take action against harmful content in search results.”

A DuckDuckGo spokesperson said: “While DuckDuckGo gets its results from many sources, our primary source for traditional web links and image results is Bing.

“For issues in search results or problematic content, we encourage people to submit feedback directly on our search engine results page – by clicking on ‘Share Feedback’, which can be found at the bottom right corner of the page.”

Yahoo and AOL have been approached for comment.
