Under the proposed laws, any company that allows users to share or discover user-generated content or interact with each other online will face fines if they fail to adhere to a mandatory duty of care to protect users.
Other measures in the Online Harms White Paper include forcing online companies to end the sharing of child abuse and terrorist content, and requiring them to publish annual transparency reports setting out how much harmful content appears on their platforms and how they have responded to it.
The government is also set to appoint a regulator, initially funded by the industry, to enforce the tougher measures, with powers including issuing fines, blocking access to sites and potentially imposing liability on individual company employees.
The children's commissioner for England, Anne Longfield, has campaigned for internet companies to be bound by a statutory duty of care to prioritise the safety and wellbeing of children using their services, and for them to be more transparent about harmful content.
Longfield called for the government to introduce the new measures as quickly as possible.
"The social media companies have spent too long ducking responsibility for the content they host online and ensuring those using their apps are the appropriate age," she said.
"The introduction of a statutory duty of care is very welcome and something I have long been calling for.
"Any new regulator must have bite. Companies who fail in their responsibilities must face both significant financial penalties and a duty to publicly apologise for their actions and set out how they will prevent mistakes happening in the future."
"Today's #OnLineSafety #WhitePaper marks an important first step to further consultation. There is without a doubt much more that tech companies can do to innovate safeguarding. We look forward to continuing to work with government on this important issue in coming weeks & months." — Alex Holmes (@abcholmes) April 8, 2019
In March, the NSPCC released figures, obtained through Freedom of Information requests to police forces in England and Wales, showing that forces had recorded more than 5,000 online grooming offences in 18 months.
The charity's chief executive Peter Wanless said the government's proposed strategy would be "hugely significant" and could "make the UK a world pioneer in protecting children online".
"For too long social networks have failed to prioritise children's safety and left them exposed to grooming, abuse, and harmful content," he said.
"So it's high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so."
"So the #WhitePaper on #OnlineHarms will be published today, and I want to write a wee thread about the dangers of rushing into kneejerk legislation regarding social media. First of all - I hope that we can finally begin recognising that online life is actually real life too." — Fiona Robertson (@FionaSnp) April 8, 2019
The Department for Digital, Culture, Media and Sport and the Home Office will consult on the joint proposals over the next 12 weeks.
The plans also include the creation of a media literacy strategy to give people the skills to recognise and deal with deceptive and malicious behaviours online, including grooming.
"Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.— UK Prime Minister (@10DowningStreet) April 8, 2019
"We are putting a legal duty of care on internet companies to keep people safe."
- PM @Theresa_May #OnlineSafety pic.twitter.com/DXzE2EIkz7
Home Secretary Sajid Javid said tech giants and social media companies had a moral duty to protect young people.
"Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online," he said.
"That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people - and we are now delivering on that promise."