Children's charities welcome plans to appoint social media regulator


Children’s charities have welcomed government plans for a national regulator to protect vulnerable young people online.

A child is subject to online abuse every 16 minutes in England and Wales, the NSPCC says. Picture: Adobe Stock/posed by model

Media watchdog Ofcom is set to be appointed to enforce rules making internet companies, including social media channels, responsible for their users’ safety.

The first response to the government’s Online Harms consultation, carried out in 2019, reveals plans to crack down on providers who do not protect children from threats including online sexual abuse, cyberbullying and access to self-harm and suicide forums.

The new rules will apply to firms hosting user-generated content, including comments, forums and video-sharing. This could include sites popular with young people, such as Facebook, Instagram, Twitter, Snapchat and TikTok, which are currently self-regulated.

It is not currently clear what penalties companies will face but a government statement says: “The regulator will hold companies to account if they do not tackle internet harms such as child sexual exploitation and abuse and terrorism.”

Barnardo’s chief executive Javed Khan said two thirds of children supported by the charity’s sexual exploitation services were groomed online before meeting their abuser in person.

“The backbone of an internet that is safe for children is regulation, which is why this announcement is so important.

“Children face growing risks online, including cyber-bullying, sexual grooming, and exposure to self-harm forums.

“We cannot expect children to protect themselves. Instead we need a regulator to act without delay. To do so, it will need the necessary powers to carry out work effectively and to hold tech companies to account,” he said.

A recent report by the NSPCC estimates that a child is subject to online abuse every 16 minutes in England and Wales.

It adds that around 25,300 child abuse image and sexual grooming offences were recorded by police in the nine months to January 2020.

Chief executive Peter Wanless said statutory regulation of social media sites is "essential".

"Too many times social media companies have said: 'We don't like the idea of children being abused on our sites, we'll do something, leave it to us,'" he said.

"Thirteen self-regulatory attempts to keep children safe online have failed."

Home Secretary Priti Patel said the internet is used as a “hiding place for criminals, including paedophiles, to cause immense harm”.

“It is incumbent on tech firms to balance issues of privacy and technological advances with child protection.

“That’s why it is right that we have a strong regulator to ensure social media firms fulfil their vital responsibility to vulnerable users,” she said.

A full response to the consultation will be published in the spring.
