Use of algorithms in children's social care 'risks families being missed'


Use of data algorithms by children’s services to identify “at risk” children could lead to some being missed and other families being forced to undergo unnecessary scrutiny, a report has warned.

Machine learning “without proper ethical oversight” poses serious risks of reinforcing biases, researchers said. Picture: Adobe Stock

A review of the use of “machine learning” in children’s social care, carried out by the University of Oxford’s Rees Centre and The Alan Turing Institute for What Works for Children’s Social Care, called for the approach to be used “cautiously”.

Machine learning is a general approach in computer science that allows algorithms to carry out tasks on the basis of data, without being explicitly and completely pre-programmed by designers, researchers said.

This means systems may use data mining to flag up children deemed to be “at risk”.

However, the report warns that inaccuracies in machine learning systems could lead to “‘false negatives’ that miss children in need of protection”.

The systems could also “interfere” with privacy, researchers warn, and, if used “without proper ethical oversight”, pose serious risks of reinforcing biases, particularly concerning families recorded as experiencing poverty or living in deprived areas.

"The correlation of child neglect and maltreatment with historical patterns of poverty and deprivation make the feedforward of these patterns in effective data mining all but inescapable," the report warns.

“Low data quality may mean either that risks are missed, or that families are subjected to assessment or interventions that they don’t need,” it states.

Researchers expressed concerns over a growing number of local authority children’s services departments using the systems with “varying levels of deployment and transparency”.

They have called for a national standard for the design and implementation of machine learning systems to be rolled out across the UK.

Local authorities should also work to improve data quality and staff understanding of the systems through professional development and training, they added.

Michael Sanders, executive director of What Works for Children’s Social Care, said: “We believe that we need to have an open and transparent debate about the use of predictive analytics and data science in children’s social care, and one that draws in the widest possible number of voices, armed with the best possible academic research. This report is an important part of that debate and will help both the public, and local and national governments, to consider whether a particular course of action is the right one.”