
For many years, local authorities in England have been sending aggregate data on their cohorts of children and young people with special educational needs and disabilities (SEND) to the Department for Education (DfE).
The return is hugely important for the DfE, given the high-profile nature of SEND and the large sums of money involved in the current financial climate.
Of course, any improvements and more modern ways of monitoring are very welcome, but are we simply counting the wrong things in a better way?
A step in the right direction
The SEN survey – more commonly known as the SEN2 survey – is an annual statutory data collection that takes place early in the year.
Local authorities complete the return with information on children with education, health and care plans (EHCPs) spanning the previous 12 calendar months, and in the past they submitted aggregated reports to the DfE.
In 2023, however, local authorities were required to submit the data per child rather than in aggregate, covering the previous 12 months for assessments and going back even further for ongoing EHCPs.
This was undertaken with varying degrees of success, prompting some data quality warnings on certain elements of the statistics produced.
Information such as the name, date of birth, gender and ethnicity of the child or young person was included this year for the first time. Also for the first time, the return includes details of a child's EHCP, such as the date it started and the establishment named in the plan, usually the school or setting the child should attend.
It is a welcome change to see the return updated to collect a much more holistic picture of what is happening with regard to assessments, reviews and provision, alongside that demographic information.
However, the data released following that collection is still highly aggregated and has drawn the same narrow conclusions from many in the sector, which may be a missed opportunity.
Not counting what counts
Statistics are of course swiftly analysed across the sector, closely followed by published lists of local authorities labelled as not having met targets or as having performed worse than in previous years. Yet let's remember that the data collected by SEN2 counts processes, not outcomes, experiences, progress or quality.
Ultimately, as some colleagues pointed out to me recently, regardless of how narrowly or holistically SEN2 counts those processes, to many in the sector it doesn't feel like anything more than a box-ticking exercise.
What's more, does the pressure local authorities feel not to be thrown into the "bad" category in the public domain push them into making decisions that are more about meeting timescales and satisfying bureaucracy than about the needs of the children, young people and families they support?
A recent inspection report of one local authority captures exactly this when it states: “Leaders have worked to ensure that EHC plans are produced within the nationally set timescale. However, they are often overly long and unnecessarily complicated. This, combined with the lack of meaningful contributions from health and social care, makes the plans difficult for practitioners to deliver and, therefore, less likely to have a positive impact on children and young people.”
This raises the question of whether SEN2 creates a greater problem than it seeks to solve.
Meeting the 20-week deadline for issuing a plan has become the overriding priority for local authority SEND teams, predominantly because it is the key figure reported through SEN2. Often, however, this is to the detriment of annual reviews, which in many authorities have become significantly overdue for thousands of children and young people.
Years of cuts
Following years of cuts from central government, local authority teams don't have the resources to do everything, all of the time, so they naturally gravitate towards the functions that are measured, scrutinised and reported.
But is that to the detriment of children and young people? Are we encouraging councils to do what is counted, not what counts?
What counts, ultimately, depends on the role you play in the current system. However, almost everybody – parents, professionals and civil servants – would agree that putting the right support in place to promote positive progress and outcomes is why we are in this business.
The recently published SEND and Alternative Provision Improvement Plan opens with the following statement: “Children only get one childhood. They deserve to get the support they need to thrive and prepare for happy, healthy and productive adulthoods.
“For children and young people with special educational needs and disabilities or in alternative provision, this is especially vital.”
Measuring lived experience
How can we measure whether the system is delivering this? Parents of children with SEND have first-hand knowledge of their children's day-to-day experiences; they know what works and what doesn't. The DfE already knows this and, as part of its inspection processes, surveys parents to get their input.
In a recent Ofsted SEND area inspection of arrangements in Oxfordshire, around 2,000 parents and carers from the local area shared their views with inspectors.
The report, published in September, recounts: “A tangible sense of helplessness runs through their descriptions of their lived experiences. These were typically about the years spent waiting or struggling to be heard to get support in education, health and care.”
Oxfordshire was found to have “widespread and/or systemic failings leading to significant concerns about the experiences and outcomes of children and young people with special educational needs and/or disabilities”.
It is important to recognise that the "experiences and outcomes" of children are baked into Ofsted's new inspection framework, with all three possible judgements of a report referring to them specifically. So why are we not measuring these rather than counting processes?
If we truly want to take the temperature of the system, why can't we divert the significant resources spent on number crunching to focus instead on the experiences and outcomes of the children themselves?