Measuring our success

By Steve Crocker

| 02 August 2019

As I write this, the political merry-go-round is in full swing and the former children's minister, Nadhim Zahawi, has taken up a new role as parliamentary under-secretary of state at the Department for Business, Energy and Industrial Strategy, to be replaced by Kemi Badenoch. On behalf of ADCS I would like to wish them both good luck in their respective new roles. When he was children's minister, Zahawi was particularly interested in performance and quality measures - as befits the co-founder of the research firm YouGov. However, and as I'm sure the minister found, issues of performance and quality management in children's services are notoriously slippery and difficult to define.

I was reminded of this the other week when I was asked, on behalf of ADCS, to write a foreword and to say a few words at the launch of the Rees Centre's research into the development of a children's social care outcomes framework.

This research, funded by the Nuffield Foundation, explores the various components that might constitute a more rounded outcomes framework for children's social care. It's certainly worth a read even if, like me, you don't necessarily agree with all of the conclusions. What's more important though, is that the report stimulates the debate and makes us reflect on how we measure the success of what we do - and I believe that yes, we are successful much of the time and yes, we do need to show this to justify the levels of expenditure that we need to protect vulnerable children and families, especially in these financially constrained times.

Many of you will remember the different attempts to performance manage the safeguarding system over the years. The Quality Protects programme measured local authorities through 'blobs' as I recall. We then had a Department for Education led performance framework that included 'stretch targets' via Local Area Agreements. This was a good example of how performance indicators can distort behaviour - for example, a number of local authorities sought a stretch target on the timeliness of initial assessments. All very well, but what about the quality of those assessments?

Whilst much of the paraphernalia of performance management was swept away at the beginning of the decade, our new sector-led Regional Improvement and Innovation Alliances have been working on developing an agreed set of indicators. Meanwhile, Ofsted now asks us just three questions for our annual self-assessments: what is the quality of social work practice? How do you know this? And, what are you going to do about it? None of those questions can be answered by performance data alone, emphasising the importance of qualitative evidence as well as what Professor Eileen Munro described as "the intelligent use and analysis of data".

The Rees Centre report goes even further and suggests the use of direct feedback from staff and service users as part of the framework - is that right or would it give us a distorted view?

Personally, I am cautious about over-reliance on experiential information, but my views are a bit irrelevant. What is important is that we bring this debate to the fore, and the work of the Rees Centre makes a valuable contribution to this.

The cliché of 'what's measured is important' may ring true for some, but if we can't measure our success in the sector then how important will we be to a government with so many other priorities?

Steve Crocker is director of children's services at Hampshire and the Isle of Wight councils. This blog first appeared on the ADCS website.
