DfE turns to 'big data' to warn of Ofsted 'inadequate' rating risk

Neil Puffett
Tuesday, November 22, 2016

Department for Education permanent secretary says new datasets could help children's services departments identify problems ahead of Ofsted inspections and enable them to turn these around to avoid poor ratings.

Preparing paperwork for Ofsted inspections was a key source of stress cited by the early years workforce. Picture: Avava/Adobe Stock

Central government could soon be in a position to spot struggling children's services departments ahead of scheduled Ofsted inspections, with the aim of turning them around before they become "inadequate", according to the Department for Education's most senior civil servant.

Speaking last month before members of the public accounts committee, Jonathan Slater - who became permanent secretary at the DfE in May - said data collected by the DfE is beginning to show a link between child protection caseloads, staff turnover rates and agency staffing levels in children's services departments and Ofsted child protection judgments.

He added that there are "promising signs" that the DfE will be able to spot issues as they emerge thanks to new datasets, which it began collecting in 2014.

Slater says a trial of how best to measure caseloads took place last February. From February 2017, it will be tested "comprehensively across the country" prior to being used by the DfE.

The potential usefulness of collecting and analysing raw data appears to be backed by Ofsted's findings.

In 2014/15 across all children's services departments, 16 per cent of children's social workers were agency staff. In authorities judged "good", the average was seven per cent and in authorities judged "inadequate", it was 22 per cent.

Meanwhile, across all councils, 17 per cent of children's social worker posts were vacant. In authorities judged good, the average was 11 per cent and in authorities judged inadequate, it was 22 per cent.

However, there are concerns about relying too much on the development of an "early warning system" (see expert view).

Earlier this year, a study by the Association of Directors of Children's Services (ADCS) attempted to benchmark caseloads for senior practitioners, social workers and newly qualified social workers in four distinct service areas - early help; children in need; child protection; and children in care.

It found that what can be considered "manageable" depends on the complexity of individual cases, the mix of cases in a caseload and the availability of appropriate support.

The report, which found that child protection caseloads alone can vary between two and 27, concluded that developing any benchmark would be complex due to the "huge variability" across local authorities in the roles and remits of those responsible for children's social work, and in the way in which services are organised and structured.

Dave Hill, president of the ADCS, says the challenge for the DfE will be correctly predicting which local authorities rated as "requires improvement" - currently 54 out of 110 inspected - are in decline.

"The key group from where I sit and from where my members sit is that group in the middle ground," Hill says. "If they are going the right way, they will come through in 2017, 2018 and beyond and become 'good' authorities.

"The ones that are difficult for the department to spot are the ones that are in requires improvement, but are going in the wrong direction.

"I'm worrying a little, to be frank, about the idea that there is some very clever set of data that will shine a light on it beautifully and we will all know and Ofsted will rush in. I think that is unlikely."

Hill believes local authorities are in a good position to keep an eye on the performance of other councils and pass on information to the DfE, but do not necessarily have sufficient resources to do so.

"The difficulty is that we don't have enough good people to go faster," Hill says.

"I don't think it's a money issue. The 'good' authorities - and there are now two 'outstanding' authorities - are simply not able to do the day job and also help lots of other authorities get better.

"We simply don't have enough practice leaders who know what good looks like at the moment."

Paul Kissack, director general of children's services and departmental strategy at the DfE, says other methods of tracking performance will be used in addition to raw data.

"While of course we will continue to look at data and look for correlations that we can then use as lead indicators, I don't think it would be right for us to sit back and wait for the data to tell us all the answers," he says.

"Ofsted have developed a local intelligence system which they use in planning their inspections.

"They [Ofsted] are of course on the ground in local areas on a regular basis inspecting early years facilities and residential care homes, so it would be wrong to suggest Ofsted don't have local intelligence between Ofsted inspections.

"They also have published data on serious incident notifications and receive complaints and whistleblowing information."

EARLY WARNING DATA

  • Caseload levels - high caseloads can hinder good social work
  • Staff turnover - high levels of workforce churn can lead to inconsistent practice
  • Agency workers - services with high agency staff use struggle to keep costs down

Expert view: 'Data has potential to warn of poor Ofsted rating, but it will be more limited than proponents think'

By Robert Grant, senior lecturer in health and social care statistics, Kingston University and St George's, University of London

Recent research I was involved in looked at available sets of data on child protection services, what these might tell us about how services respond to pressures and whether the data could be used as quality indicators to identify struggling social care services. We concluded that data paints an imperfect picture and careful profession-led interpretation is key.

The group involved in our Exploring Demand and Provision in English Child Protection Services report included a former director of social services, an Ofsted inspector, a social worker and a researcher into complex systems and public service management. I am a statistician specialising in quality indicators for health and social care.

The quality of health services has been estimated using data for decades, and there is great potential for social services to learn from this, from both the opportunities and the difficulties.

One tension is between making good use of existing data rather than collecting afresh, and relying on limited data collected for a different purpose that may not answer new questions.

In austere times, it is tempting for managers to rely on existing data sources to reduce costs. Bold promises are often made about "big data" by consultants and software vendors: by running the right analysis, new insights will spring up to make the organisation more efficient, safer and better.

This sales pitch can prove irresistible, but if existing data does not measure things of fundamental interest, insights will be limited at best and misleading at worst.

We looked at data from a range of sources, including:

  • Children in Need census and its predecessor the Child Protection and Referrals returns
  • Children and Family Court Advisory and Support Service records of care proceedings
  • Department for Education's Children's Social Work Workforce statistics
  • SSDA903 records of looked-after children
  • Spending statements
  • Local authority statistics on child population, deprivation and urban/rural locations.

Some of these provide data stretching back to 2001, but others start more recently. The quality of data was sometimes patchy, and only after weeks of painstaking work were they combined in a useful form for analysis.

For the most part, the available data describes total activity - for example, the number of referrals and care orders - rather than a subtle measure of the quality of care, reflecting the fact that it was collected for administrative and not quality improvement purposes.

A quality indicator can only indicate a potential problem, which needs to be considered carefully in context, in much the same way that a surgeon with the highest mortality rate may turn out to be specialising in the most complex cases, rather than simply be dangerous.

Punitive interpretations of performance data can contribute to blame culture and defensive organisational approaches to risk. Our study found signs that local authorities with "inadequate" Ofsted ratings had increased their child protection activity one year on compared with similar authorities.

Rather than focusing resources on struggling services, using outcome measures to build some form of "early warning system" may actually have contributed to the acceleration in the use of child protection interventions seen since 2008.

Predicting Ofsted ratings is potentially interesting as an early warning system, but there are two major concerns: the data captures only a narrow aspect of a complex system; and the need for inspectorates to save money could lead to inspections being scaled back to respond only to these imperfect early warnings.

We found only two indicators of outcomes (the quality of the service from the child or young person's perspective): rates of re-referral and of second child protection plans. While these are undeniably important, they identify poor outcomes in cases that have been closed prematurely, and tell us nothing about poor outcomes while a child is under the care of the service, or poor outcomes that do not result in a new referral or child protection plan.

In our analysis, only three indicators were needed to predict an imminent inadequate rating:

  • Initial assessments within the target of 10 days
  • Re-referrals to the service
  • The use of agency workers

These three variables correctly predicted 68 per cent of inadequate ratings, but missed the rest. Adding more indicators did not improve the prediction rate.
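To make concrete what a three-indicator "early warning" rule might look like, here is a minimal sketch in Python. Everything in it is invented for illustration - the report does not publish its model, cut-off values or data, so the thresholds, synthetic figures and the two-of-three scoring rule below are assumptions, not the study's method.

```python
import random

# Illustrative only: synthetic indicator values loosely shaped by the
# patterns described in the article (slower assessments, more re-referrals
# and heavier agency use in struggling authorities).
random.seed(1)

def make_authority(inadequate):
    """Generate hypothetical indicator values for one local authority."""
    if inadequate:
        return {"within_10_days": random.gauss(0.60, 0.10),   # share of initial assessments on time
                "re_referral_rate": random.gauss(0.30, 0.05),
                "agency_rate": random.gauss(0.22, 0.05)}
    return {"within_10_days": random.gauss(0.85, 0.10),
            "re_referral_rate": random.gauss(0.18, 0.05),
            "agency_rate": random.gauss(0.10, 0.05)}

def flag(a):
    """Warn when at least two of three indicators cross hypothetical cut-offs."""
    signals = [a["within_10_days"] < 0.72,    # slow initial assessments
               a["re_referral_rate"] > 0.24,  # high re-referrals
               a["agency_rate"] > 0.16]       # heavy agency use
    return sum(signals) >= 2

# 60 "inadequate" and 140 other authorities, labelled with the true outcome.
authorities = [(make_authority(True), True) for _ in range(60)] + \
              [(make_authority(False), False) for _ in range(140)]

flagged = sum(1 for a, bad in authorities if bad and flag(a))
sensitivity = flagged / 60  # share of "inadequate" authorities caught
```

Even on clean synthetic data a rule like this misses some cases and flags some sound services, which is the expert's point: such indicators can nominate authorities for closer scrutiny, but cannot substitute for interpretation in context.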

Is this an early warning system or does it just reflect the Ofsted inspectors' response to the same data? It is impossible to say without more research into the inspection process.

Does it mean that a computer in Whitehall could replace Ofsted inspections? No. The inspections and the indicators are linked together as parts of the same process.

Early warning could be useful, but it will surely turn out to be more limited than its proponents think. Above all, data alone should not determine judgments or actions, but should contribute to insights and understanding.

This comes from exploring and explaining what is driving the data and its patterns.
