
Instagram announces ban on graphic self-harm images

Bosses at the social network have met with the Health Secretary, Matt Hancock.


Instagram is banning graphic images of self-harm after Health Secretary Matt Hancock said social media companies “need to do more” to curb their impact on teenagers’ mental health.

The photo-sharing platform announced a series of changes to its content rules, including a ban on graphic images of self-harm and the removal of non-graphic images of self-harm from searches, hashtags, and the Explore tab.

Instagram said it will not be entirely removing non-graphic self-harm content, as it does not “want to stigmatise or isolate people who may be in distress”.

Adam Mosseri, head of the social network, said: “Nothing is more important to me than the safety of the people who use Instagram. We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community.

“I have a responsibility to get this right. We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

On Thursday afternoon Mr Mosseri met the Health Secretary, along with representatives from Facebook, Snapchat, Twitter, Pinterest, TechUK, Samaritans, Internet Association UK and Google, to discuss suicide and self-harm content.

Health and Social Care Secretary Matt Hancock (Stefan Rousseau/PA)

Mr Hancock said: “We’ve seen progress today; the discussions were productive, and there was a willingness to try to solve this problem.

“We’ve got to be led by what the clinicians and the experts say needs to be taken down, and what’s the appropriate way to do that.

“What all the companies that I met today committed to was that they want to solve this problem, and they want to work with us on it.

“We’re pushing for a duty of care to the users of social media, particularly to children, and that duty is something we are looking at in a white paper that will be published by the government.

“What really matters is that when children are on these sites they are safe. The progress we made today is good, but there’s a lot more work to do.

“I care deeply about getting this right, and I feel the concern that any parent feels in this day and age, with children using social media, about whether they are safe.”

Molly Russell died in 2017 aged 14. Her family found material relating to depression and suicide when they looked at her Instagram account after her death.

Molly Russell, 14, who took her own life in November 2017 (Family handout/PA)

Mr Hancock added: “I am glad Instagram have committed to me that they will now take down graphic self-harm and suicide content. I’ll keep working to make the internet safe for all.

“This is best delivered in partnership, because it’s the social media companies who understand their own platforms. We made some progress today in terms of taking down some of the most egregious, harmful material that promotes suicide and self-harm.

“But there’s more to do in terms of being clear what material is up there, and making sure that the behaviour of the sites follows the best medical evidence.”

The NSPCC said the rule changes marked “an important step”, but that social networks were still not doing enough to tackle self-harm.

Charity chief executive Peter Wanless said: “It should never have taken the death of Molly Russell for Instagram to act.

“Over the last decade social networks have proven over and over that they won’t do enough to design essential protections into their services against online harms, including grooming and abuse.

“We cannot wait until the next tragedy strikes.”

Ian Russell, Molly’s father, said: “I welcome the commitment made today by Adam Mosseri to ban all graphic self-harm content from Instagram.

“I also welcome its plans to change its search mechanisms in relation to self-harm and suicide-related content, and to increase the help and support it provides to its users.

“It is encouraging to see that decisive steps are now being taken to try to protect children from disturbing content on Instagram, and I hope that the company acts swiftly to implement these plans and make good on its commitments.

“It is now time for other social media platforms to take action to recognise the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people.”

– The Samaritans operate a round-the-clock freephone service 365 days a year for people who want to talk in confidence. They can be contacted by phone on 116 123 or by visiting samaritans.org.
