How can you prove that your services for young people actually work? Funders and commissioners increasingly demand evidence that youth projects can achieve “hard” outcomes, such as reducing crime or boosting school attendance.
Meanwhile, youth workers have often found it difficult to quantify the results of their practice, which tends to focus on hard-to-measure personal development skills.
But many in the sector are realising that opting out of the measurement game is no longer a viable option. The reality is that by doing so, youth work, with its principles of voluntary engagement and self-development, could lose out on funding, despite the fact it may have a transformative effect on young lives.
Enter the Young Foundation think-tank. As part of its work in the Catalyst consortium – which has a contract as the government’s “strategic partner” for young people – it has developed a “youth outcomes framework”, released in the summer, to be used by practitioners when planning, developing and evaluating projects. It aims to solve the eternal dilemma of youth workers by allowing them to speak the language of outcomes, without compromising their practice.
The framework cites research showing that the development of “softer” skills, such as resilience or creativity, can lead to “harder”, more measurable outcomes. It notes that services have sought to make their mark through impact on “positional” change for young people: jobs, economic productivity, stable housing and so on.
But it holds that providers should feel confident about articulating their value through their impact on “personal” change – through building social and emotional capabilities. It boils these down into seven “core capabilities”: communication; confidence and agency; planning and problem solving; relationships and leadership; creativity; resilience and determination; and managing feelings.
Bethia McNeil, programme leader at the Young Foundation, says: “Evidence shows that approaches that focus on building these capabilities can have greater long-term impact than ones that focus on directly seeking to reduce the symptoms of poor outcomes for young people. This is where the true value of services for young people lies.”
The framework aims to guide organisations on how to identify the changes they want to make in young people’s lives, put together a programme that will help make those changes, and measure the effectiveness of that programme through a list of 26 recommended “tools”.
Three organisations have been piloting the framework since the summer: London Youth, British Red Cross and Brathay. The Young Foundation has provided support through workshops to enable the organisations to work out what it is they are trying to achieve, and communicate that to staff and stakeholders. Early next year, it will publish a more practical document, designed to help everyone from large agencies to small charities adapt learning from each of the pilots to their own practice.
The pilots have found the framework helpful in planning future work, and the foundation says that aspect will be beefed up in the next publication. But all three admit that the measurement side of the work has been challenging, saying the tools listed in the framework are not always relevant to what they are trying to achieve. Brathay, for example, says: “There is a strong quantitative or statistical bias to this toolkit, which on its own is not congruent with youth work.”
Brathay has developed its own qualitative tools to measure the value of its work. But some professionals argue that youth workers should be challenging the whole idea of outcomes-based measurement.
Tony Taylor has put together a discussion paper for the In Defence of Youth Work campaign. He says the Catalyst framework is “partial and flawed”. “It claims that by dipping into an eclectic mix of allegedly scientifically proven psychological tests we can provide robust, objective evidence of our significance,” says Taylor.
“This is illusion dressed up in pseudo-scientific garb. The danger is that this illusion gains the appearance of reality if all those involved, particularly managers, workers and employers, buy into the fantasy that categorising young people via the crude completion of inappropriate tests is a step forward, somehow useful and meaningful.”
Jon Ord, reader in youth work at University College Plymouth, is also sceptical that the “capabilities” the framework identifies can be measured scientifically. “The way the outcomes are completely abstracted seems strange; it talks about confidence as if it was an entity we can measure in a lab. But context is important – people in gangs may have very high self-esteem in that context, for example.”
Kaz Stuart, who is piloting the framework for Brathay, disagrees. “Measuring young people is not at odds with good youth work,” she says. “You can use it so flexibly, as a guide to planning and ensuring practice is of a high quality. There is a pragmatic need to engage with these kinds of models, otherwise smaller charities may no longer exist.”
The Young Foundation’s McNeil warns against the drive to articulate outcomes being perceived as an “add-on” and an agenda that must be “complied with”.
“All of this reinforces the perception of ‘outcomes’ as a management tool, which is meaningless at best and damaging to young people’s development at worst,” she says. “The questions we ask ourselves about the difference we make will determine what we find – we need to ensure that we are asking the right questions.”
Here, the three pilot organisations share what they have learned from using the outcomes framework over the past few months.
PILOT 1: LONDON YOUTH
London Youth is a network of 400 youth organisations in the capital. Its activities range from archery and mountain biking at its outdoor education centre to a quality assurance scheme for London youth clubs.
In the past, it found that the tools it chose to measure success ended up dictating which outcomes were measured, rather than the desired outcomes being chosen first and a tool then found to measure them. “We had a couple of false starts on this in the past,” says Sam Grimstone, director for organisational development.
Five programmes are taking part in London Youth’s pilot, including its biggest money-spinner, outdoor education centre Hindleap Warren. In each case, the project team is going back to basics, working out what changes they are looking to make in the lives of the young people they work with, what “capabilities” will best help the young people achieve those changes and which activities will best help the young people develop those capabilities.
Hindleap Warren, for example, wants young people to develop an improved relationship with their peers and leaders; greater self-confidence; to become more likely to participate in new experiences; to develop a better awareness of the natural environment; to become more engaged with learning; and to become better able to play an effective role in a team. The team is working out which skills it needs to focus on to ensure these outcomes are reached.
The next step will be to identify the best evaluation tool to measure the achievement of the identified skills. One of the pilot projects, a sports development programme, aims to get inactive young people into sport and will concentrate on numbers, tracking how many young people are still involved in sport after a year. Some programmes will develop their own questionnaires, as the tools listed in the framework are seen as too invasive and personal.
“A lot of them have deep psychological questions,” says Grimstone. “The sense was we are not close enough to those young people to ask those questions.” But Hindleap Warren hopes to find an existing evaluation tool, so it can compare its work to external programmes using the same tool.
London Youth is recruiting a head of learning, funded by a grant from the Esmée Fairbairn Foundation. Part of this role will be to look at how other organisations use the framework, as well as communicating London Youth’s learning to the sector. “We will use the framework to talk to funders, but they do seem to appreciate we are embedding it in development and improvement rather than making it a surface-level counting activity,” says Grimstone.
London Youth’s financial year starts in September, and it will use the framework process to plan all of its activities for the next financial year, after taking account of the achievements and learning of the five pilot programmes. “We can then shift outcomes if necessary, and start evidencing against all of our programmes. This is our suck-it-and-see year,” adds Grimstone.
PILOT 2: BRITISH RED CROSS
The British Red Cross is not a youth work organisation. However, its education programme reaches 210,000 children and young people in the UK every year, while 6,500 young people volunteer for the organisation.
As a pilot site, its challenge was to make sure the youth outcomes framework sat well with the charity’s newly developed organisation-wide evaluation framework. It also needed to find a way to use the framework that would cover work with five- to 25-year-olds, including anything from a school assembly to a six-week programme.
The organisation first started designing a toolkit to measure the outcomes of its work with young people two years ago, but soon realised it did not have a strong enough idea of what those outcomes should be.
Gill Allbutt, humanitarian education development officer at the British Red Cross, says: “The way we described our outcomes was a bit woolly to say the least.”
So the organisation involved around 60 of its educators in coming up with four desired outcomes – that children and young people are more likely to cope in a crisis; that they are more likely to have positive interactions with others; that they have a greater understanding of how people are affected in a crisis; and that they are more likely to respond in a crisis.
The charity then came up with indicators for each outcome – for example, under having positive interactions, that children are able to listen to each other, are flexible, and can give and take constructive criticism.
The next step is to involve children, young people and adult educators in designing and piloting appropriate evaluation tools. As with London Youth, the Red Cross found that many of the tools listed in the framework assumed a more in-depth contact with young people than might be the case – for example, questioning a young person’s self-esteem.
“It would be quite unethical for us to ask some of the questions in some of these tools,” says Allbutt.
PILOT 3: BRATHAY TRUST
The Brathay Trust, which delivers a range of community-based and residential youth work, has created its own “meta model” of youth development, which will be used to develop and evaluate all of the organisation’s activities, from icebreaker and trust exercises upwards.
When agreeing the aims of a programme with funders and commissioners, Brathay is urging its practitioners to resist the pressure to sell “hard” outcomes such as improved attendance at school.
Instead, the organisation is drawing on Catalyst’s evidence base to show that the development of capabilities such as creativity and problem solving will lead to the harder outcomes requested by funders. Under its new model, Brathay staff will “sell” programmes around these short-term “softer” outcomes, which it may amend once the programme starts, taking into account the needs of the young people on the programme.
“In the last half-decade, I have noticed people are likely to be commissioning for programmes about contributions to society – reducing gang membership for example,” says Kaz Stuart, head of Brathay’s research and evaluation hub.
“It is difficult for youth work programmes that have a relatively short relationship with young people to show evidence of these things. Now we can show evidence of personal development and where we need to demonstrate long-term gains, we can work with partners.”
Brathay has also come up with a list of indicators for Catalyst’s capabilities. For example, the development of communication skills – listening, self-expression and presentation skills – could be shown by the fact that a young person makes more eye contact, will speak in company, asks open and closed questions, and can present a line of argument, among other things.
Once a programme has decided which capabilities it wishes to develop, it can draw on a grid of relevant activities Brathay has developed. For example, under “support identity formation”, there are activities such as mask-making and analysis of media messages.
Finally, Brathay has developed a range of qualitative evaluation tools to complement what it felt to be the more quantitative tools collected in the Young Foundation framework. Before a programme starts, it aims to establish a “baseline” of where the young people are in relation to the desired outcomes, and uses the evaluation tools among other things – such as data from partner organisations – to measure progress.
Brathay used a grant from the Transition Fund for voluntary sector organisations to develop a number of pilot programmes, such as the Preston Pre-Apprenticeship Scheme, which worked with young people at risk of becoming Neet (not in education, employment or training) to prepare them to take on apprenticeships.
“These programmes were all evaluated using our ‘meta model’ and this made it really clear what processes were going on and the extent to which we achieved our aims,” says Stuart.
Having developed and piloted its outcomes model, the next step for Brathay is to render it more youth-friendly. Earlier this month, young people from the South Lakes Youth Council gathered with some local art students to “turn the language of the model into youth speak”, along with a film crew to document the exercise.