Molly Russell death: Coroner suggests separate platforms for adults and children
The 14-year-old died in November 2017 after viewing suicide and self-harm content online.
The father of schoolgirl Molly Russell has urged social media companies not to “drag their feet waiting for legislation”, as a coroner issued recommendations including separate platforms for adults and children.
Coroner Andrew Walker sent a Prevention of Future Deaths report (PFD) to businesses such as Meta, Pinterest, Twitter and Snapchat as well as the UK Government on Thursday, in which he urged a review of the algorithms used by the sites to provide content.
The 14-year-old, from Harrow in north-west London, ended her life in November 2017 after viewing suicide and self-harm content online, prompting her family to campaign for better internet safety.
The coroner also voiced concerns over age verification when signing up to the platforms, content not being controlled so as to be age-specific, and algorithms being used to provide content together with adverts.
Other issues included in the report were the lack of access or control for parents and guardians and the absence of capability to link a child’s account to a parent or guardian’s account.
At the inquest held at North London Coroner’s Court last month, the coroner concluded Molly died while suffering from the “negative effects of online content”.
The inquest was told the teenager accessed material from the “ghetto of the online world” before her death, with her family arguing sites such as Pinterest and Instagram recommended accounts or posts that “promoted” suicide and self-harm.
In her evidence, Meta executive Elizabeth Lagone said she believed posts seen by Molly, which her family say “encouraged” suicide, were safe.
In light of the concerns raised, Mr Walker recommended in the PFD that the Government consider reviewing the provision of internet platforms to children.
Other areas highlighted for review included separate platforms for adults and children, age verification before joining a platform, provision of age-specific content, and the use of algorithms to provide content.
The coroner also recommended the Government review the use of advertising and parental, guardian or carer control including access to material viewed by a child, and retention of material viewed by a child.
Mr Walker’s report said: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content with particular regard to the above.
“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.
“Although regulation would be a matter for Government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above.”
Mr Walker said he believed action should be taken in order to prevent future deaths, adding: “I believe you and/or your organisation have the power to take such action.”
Responding to the report, Molly’s father said: “We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users.
“They should think long and hard about whether their platforms are suitable for young people at all.
“The Government must also act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content, and that platforms and their senior managers face strong sanctions if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly.
“I hope this will be implemented swiftly through the Online Safety Bill which must be passed as soon as possible.”
In its response to the PFD report, Instagram’s parent company Meta said it agreed “regulation is needed”.
The social media giant said it was “reviewing” the coroner’s report, adding: “We don’t allow content that promotes suicide or self-harm, and we find 98% of the content we take action on before it’s reported to us.
“We’ll continue working hard, in collaboration with experts, teens and parents, so we can keep improving.”
Pinterest also issued a statement in reaction to the report, which said: “Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care.”
Meta, Pinterest, Twitter and Snapchat all have 56 days to respond with a timetable of action they propose to take or explain why no action is proposed.