Social media firms must be regulated to protect young users, say MPs
A report from the Commons Science and Technology Committee calls on the Government to introduce comprehensive new regulation.
Social media companies must be subject to a “legal duty of care” to protect the health and well-being of younger users of their sites, a report by MPs has concluded.
The House of Commons Science and Technology Committee said the Government must consider legislation to ensure social media firms share data which can help identify and protect those at risk from the negative impact of such sites.
The report, entitled Impact Of Social Media And Screen-Use On Young People’s Health, said the current loose patchwork of regulation had resulted in a “standards lottery” that could not ensure the safety of young internet users.
It warned that young people were suffering damage to their sleep patterns and body image, as well as being exposed to bullying, grooming and sexting, facilitated by social media.
It called for comprehensive new regulation to be introduced, focused on platforms such as Facebook, Twitter, YouTube and Google.
The committee also recommended that the Government set itself the “ambitious” target of halving online reports of child sexual exploitation and abuse within two years and eliminating them within four.
Norman Lamb MP, chair of the Science and Technology Committee, said: “Throughout our inquiry we have heard from a range of experts, including young people, about both the benefits of social media, as well as deep concerns about its potential risks to the health, safety and well-being of young people.
“It is frustrating that there is not yet a well-established body of research examining the effects of social media on younger users.
“More worryingly, social media companies — who have a clear responsibility towards particularly young users — seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world.
“We understand their eagerness to protect the privacy of users but sharing data with bona fide researchers is the only way society can truly start to understand the impact, both positive and negative, that social media is having on the modern world.
“During our inquiry, we heard that social media companies had openly refused to share data with researchers who are keen to examine patterns of use and their effects. This is not good enough.”
The committee’s report called on the Government to use its upcoming Online Harms White Paper to put legislation and regulation in place.
“The Government also has a vital part to play and must act to put an end to the current ‘standards lottery’ approach to regulation,” Mr Lamb said.
“We concluded that self-regulation will no longer suffice. We must see an independent, statutory regulator established as soon as possible, one which has the full support of the Government to take strong and effective actions against companies who do not comply.”
A spokesman for the Department for Digital, Culture, Media and Sport, which has worked on the white paper with the Home Office, said: “We have heard calls for an Internet Regulator and to place a statutory ‘duty of care’ on platforms, and are seriously considering all options.
“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people.
“Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”
Earlier this week, Facebook’s new head of global affairs, Sir Nick Clegg, acknowledged that government had a place in regulating social networks.
Andy Burrows, associate head of child safety online at the NSPCC, said social media sites had been allowed to operate in a “Wild West” environment for too long.
“It’s hugely significant that the committee is endorsing the NSPCC’s proposal for a legal duty of care to be imposed on these tech companies,” he said.
“As the committee’s report states, the Government now has a crucial opportunity to set out a comprehensive plan to protect children online.
“This must include an independent statutory regulator with enforcement powers, that can impose strong sanctions on platforms that fail to keep children safe.”
In response, a Twitter spokesman said: “Improving the health of the conversation online remains our number one priority.
“In 2018 alone, we introduced more than 70 changes to product, policy and processes to achieve a healthier, safer Twitter. We are committed to building on this progress.”