Online Safety Bill: MPs call for age-verification on porn sites
Fiona Simpson
Tuesday, February 8, 2022
The Online Safety Bill, which seeks to regulate social media and tech companies in a bid to offer greater protection to children, will be changed to include compulsory age-verification for pornography sites, MPs have said.
To mark Safer Internet Day (8 February), digital minister Chris Philp announced that a legal duty requiring all sites that publish pornography to put robust checks in place to ensure their users are 18 years old or over will be added to the draft bill.
Measures could include adults using secure age verification technology to verify that they possess a credit card and are over 18 or having a third-party service confirm their age against government data, he said.
Companies which fail to act could be fined as much as 10 per cent of their annual turnover by Ofcom.
The draft bill previously stopped short of regulating all sites which publish pornography, focusing on sites which host self-generated images rather than commercial sites.
Philp said: "It is too easy for children to access pornography online. Parents deserve peace of mind that their children are protected online from seeing things no child should see.
"We are now strengthening the Online Safety Bill so it applies to all porn sites to ensure we achieve our aim of making the internet a safer place for children."
The bill, the first draft of which was published in May last year, put a "duty of care" on large social websites to remove harmful or illegal content and protect children but it was predominantly left up to companies to regulate themselves, with oversight from media regulator Ofcom.
A report published by the government’s Joint Committee on the draft Online Safety Bill in December recommended that stricter age-verification rules for all porn sites should be added to the bill.
The 191-page report also called for the bill to provide a tighter definition of “content that is harmful to children”.
Platforms should be included under this section of the bill if “it is specified on the face of the bill, in regulations or there is a reasonably foreseeable risk that it would be likely to cause significant physical or psychological distress to children who are likely to encounter it on the platform,” it stated.
The committee also called for content including pornography, gambling and violent material that promotes self-harm, eating disorders or suicide to be more clearly set out as posing a high risk to children.
Content that allows adults to make unsupervised contact with children who do not know them, or material that allows children to scroll “endlessly”, access visible popularity metrics and live location, or be added to groups without user permission, should also be more clearly defined as harmful to children, it added.
The report also recommended that individual users should be able to make complaints to an ombudsman when platforms fail to comply with the new law, and proposed that a senior manager at board level, or reporting to the board, should be designated the "Safety Controller".
The person in the role could face a criminal conviction over repeated failures to comply with their obligations, it stated.
Damian Collins, chair of the Joint Committee on the draft Online Safety Bill, said: “We need to call time on the Wild West online.
What’s illegal offline should be regulated online. For too long, big tech has gotten away with being the land of the lawless. A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life.
“The era of self-regulation for big tech has come to an end. The companies are clearly responsible for services they have designed and profit from, and need to be held to account for the decisions they make.”
- This story was first published on 14 December 2021. It was updated on 8 February 2022.