Headline data often hides the truth about services

John Freeman
Monday, April 29, 2013

Recent events surrounding the Leeds General Infirmary paediatric heart unit conjure up feelings bordering on disbelief.

NHS chief David Nicholson remained silent about whether he would send a child to Leeds Infirmary for heart surgery

I am not just talking about the heart unit itself, but the NHS at its most senior levels – the “Sirs”: NHS chief executive David Nicholson, NHS medical director Bruce Keogh and National Institute for Cardiovascular Outcomes Research director Roger Boyle. Two details were particularly shocking as the story unfolded.

First came a public row, played out in the media, about the death rate data and what it showed. Eventually, Keogh said the Leeds data had been incorrect and the “real” data was better. On that basis, he reopened the unit almost straight away. It seems the weights of some babies had not been included in the national data returns. If so, there were issues both with Leeds failing to submit complete data and with the receipt and checking of that data. Boyle’s National Institute should not accept incomplete data.

These are simple issues that should have been sorted out months ago, and without public furore. The second deplorable fact was that at the end of the period of suspended surgery, Keogh said he would send his daughter to Leeds for heart surgery, while Boyle (who has now resigned) said he would not. Nicholson was silent. If they cannot get the story straight at this most senior level, what hope is there for public confidence in the NHS top management?

While all this was playing out, I attended Andrew Webb’s inauguration as president of the Association of Directors of Children’s Services. He reminded us that simplifying data for public use often makes it so simplistic that it loses meaning and confuses rather than informs. He is right – no headline indicator can give more than a partial perspective on complex organisations.

Just think of the five A*-C GCSE measure, and the crude floor targets that ignore prior attainment, social context and funding levels. Worse, no headline indicator can ever tell you what the outcome will be for any individual child – whether in heart surgery, or in school, or even whether your child will be safe on a trip to the BBC. Worst of all, the use of headline indicators leads to institutions working hard to improve the headline and sometimes forgetting their underlying purpose. Hospitals can cut waiting times by ensuring that people are not put on a waiting list until the last possible moment.

Where the headline is still not good, it is often blandly explained away. Recently, a hospital trust under review for high mortality rates said in its newsletter: “It is important to understand that a high mortality indicator is not the same as a high death rate, despite the headlines you may have seen in the press.” There was no further discussion of what these terms actually mean.

In education, schools can improve GCSE headlines by finding ways of not admitting, or excluding, pupils who are unlikely to achieve, or by focusing on the C/D borderline at the expense of the most and least able. I worked with a corporate colleague whose job included analysing the Comprehensive Performance Assessment methodology to maximise the council’s CPA score – not by improving the services we provided, but by gaming the system.

Before the Leeds story broke, Ofsted published “data dashboards” for schools, which aim to provide a snapshot of a school’s performance. Ofsted uses nine measures to show how schools have performed over several years against the national mean. For most of these measures, the dashboards also show schools’ quintile position against all schools and against schools with similar prior attainment.

I printed the dashboards for the schools I was concerned with. It was embarrassing for that other “Sir” – Ofsted chief inspector Michael Wilshaw – that the dashboards had to be reissued because his statisticians had made mistakes with the quintile calculations. Nevertheless, the dashboards do provide a much fuller picture than simple headlines.

Professionals and the public need data to help build understanding. But raw data must be published in full, allowing independent checking alongside the “official” analysis. Even then, there will always be much data that is simply not collected, so the statistical picture will inevitably be incomplete. That is what Mark Twain meant when he said that there are “lies, damned lies and statistics”. We always have to think critically and carefully when we are told “the facts”, and look behind the headlines for hidden assumptions.

John Freeman CBE is a former director of children’s services and is now a freelance consultant  

