Children's commissioner: Children are 'afterthought' for social media giants

By Gabriella Jozwiak

30 January 2019

The children's commissioner for England has called on leading social media companies to commit to tackling the problem of children viewing disturbing content on their platforms, or admit the situation is out of their control.

Children's commissioner for England Anne Longfield has questioned whether social media companies are in control of the content published on their platforms. Picture: Alex Deverill

In an open letter to Facebook, which also owns WhatsApp and Instagram, as well as to YouTube, Snapchat and Pinterest, Anne Longfield said social media companies had expanded rapidly in recent years, and she questioned whether the owners of such platforms still had "any control" over their content.

"If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage," said Longfield.

Her comments follow an appeal made by the father of 14-year-old Molly Russell, who took her own life in 2017 after viewing disturbing content about suicide on social media. 

Longfield said the case highlighted "the horrific amount of disturbing content that children are accessing online", and pointed out that none of the platforms regularly used by children were developed with children in mind.

She suggested the social media platforms were failing to take the issue seriously. "Children remain an afterthought," she said.

As recommended in Growing Up Digital, a 2017 report produced by her office, Longfield called again for internet companies to agree to finance a digital ombudsman, who would be able to respond independently to concerns from children and families about online content.

She also reiterated an earlier call that internet companies should be bound by a statutory duty of care to prioritise the safety and wellbeing of child-aged users.

In the letter, she further asked the recipients to provide data on the number of self-harm sites or postings hosted on their platforms, how many of these were accessed by under-18s and under-13s, and to explain what support was offered to those seeking images of self-harm.

She also demanded to know what criteria were used for removing content or people from platforms, and what impact, according to the companies' own research, disturbing content has on children's mental health.

"I would ask you to answer the following questions, or to explain to your users why you will not," said Longfield.

"I would appeal to you to accept there are problems and to commit to tackling them - or admit publicly that you are unable to."


A spokesman for Instagram and Facebook said the company had "a huge responsibility to make sure young people are safe on our platforms". 

"Working together with the government, the children's commissioner and other companies is the only way to make sure we get this right," he said. 

"Our thoughts are with Molly's family and with the other families who have been affected by suicide or self-harm.

"We are undertaking a full review of our policies, enforcement and technologies and are consulting further with mental health experts to understand what more we can do. 

"In the meantime, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags." 

Andy Burrows, NSPCC associate head of child safety online, said the charity backed the proposal for a statutory duty of care.

"Social networks have repeatedly shown they are incapable of regulating themselves and that they need to be forced to protect children," he said.

"But it is absolutely imperative we get this right, if children are going to be truly protected." 

CYP Now also contacted YouTube for a response.
