
When I tell people about what we do at The Centre for Education and Youth, I often say that evaluation is our ‘bread and butter’. While we also love working on big research pieces, policy events and resources for practitioners, we really value our work supporting organisations on the ground to learn and grow. We avoid simple ‘Did it work?’ evaluations because, for us, the key question is not just ‘Did this have an impact?’ but also:

  • How did it make an impact? What are the mechanisms behind that impact?
  • On which outcomes was the impact stronger or weaker?
  • How did different activities affect different groups? How does impact vary across regions, age groups or other demographics?
  • What factors facilitated impact and what inhibited it?
  • How likely is it that impact on short-term outcomes will lead to achieving the long-term goal?

The aim of evaluation is improvement, rather than a pat on the back for a job well done. So, the main question we ask is: how can a programme change to improve its impact on children and young people?

[Image: the evaluation cycle]

This is why we are always especially excited to work with organisations on multi-year evaluations, as this is the only way to scrutinise the impact of implementing recommendations year on year and to see the evaluation cycle all the way through. We have seen the benefit of this with a range of partners and projects, such as our three-year evaluation with Arts Council England.

We have been working with Aspire to HE, the Uni Connect programme in the West Midlands, for two years as their evaluation and research partner. Aspire to HE works with local school and college partners to run a Widening Participation (WP) programme for pupils in Years 9 to 13. Their programme supports in-school outreach ambassadors who provide Information, Advice and Guidance (IAG) and mentoring, as well as campus visits and other WP events and activities. Their core aim is to increase the number of young people from disadvantaged backgrounds progressing to higher education (HE).

Aspire to HE share our philosophy of learning from evaluation and seeking to improve impact based on findings. Our long-term partnership with Aspire to HE has meant we can carry out a range of research and evaluation projects that demonstrate some key tenets of good evaluation practice.

1. Collaboration: At each stage of the project we have drawn on the knowledge of Aspire to HE’s stakeholders. This was especially important when we initially designed the evaluation framework and methodology. We conducted research into the local context and the barriers to HE progression experienced by young people in the Black Country. This allowed us to identify the place-related barriers to progression and to pin down the intended outcomes of the programme. We also conducted in-depth interviews with stakeholders such as teachers, college leads and the Aspire to HE core team. This is what allowed us to develop an evaluation framework tailored to the programme and to stakeholders’ needs.

2. An active role in evaluation: We don’t think of ourselves as ‘evaluation consultants’. As a social enterprise we want to see sustained change in the teams we work with as well as improvements to programmes. Therefore, it’s important for us to upskill our partner organisations so they can take an increasingly active role in their evaluation, rather than building up a dependency. One way to do this, as we have with other evaluation partners such as the national gambling outreach charity GamCare, is to run evaluation training workshops for staff. Another is to involve staff in the evaluation on an ongoing basis. This is what we have done with Aspire to HE, whose team have been highly involved in the data collection and analysis. This year we have used methods such as Most Significant Change, in which delivery staff and managers sort and assess qualitative responses from pupils who have taken part in campus visits in order to identify the events’ main impacts. This upskills staff in evaluation methodology, helping them to understand the relevance of evaluation to their work and to build the skills to carry out evaluation more objectively and effectively. Staff can also make changes to their practice as findings emerge throughout the year, rather than waiting for the final reporting stage.

3. Research-informed activities: Not all of our research activity with Aspire to HE has been evaluation focused. We have also investigated the widening participation literature and conducted primary research to inform Aspire to HE’s assumptions and practice. This research complements and strengthens the evaluation, for example by interrogating the assumption that the short-term outcomes we measure are likely to lead to the long-term goal, in this case progression to HE.

Other research projects have shaped the programme. We developed a knowledge-based curriculum through a series of consultation focus groups with WP practitioners, teachers, school pupils, undergraduate students and other experts, examining the knowledge young people need to support their progression to HE. This curriculum now underpins the programme’s IAG activities and allowed us to develop an evaluation tool to test participants’ knowledge. This alignment between research findings, programme activities and evaluation strengthens both the programme itself and the validity of the evaluation. It ensures that we are looking at the right outcomes and using the right tools to test whether there has been impact on those outcomes.

Effective evaluation practice is not always easy: it takes investment from an organisation and a commitment to making sure that learning is taken forward into programme changes. Over the years, we have found that incorporating the principles outlined above (drawing on stakeholders’ expertise, upskilling staff and supporting evaluation with other types of research) means we can answer a range of detailed questions about impact and improve programmes, whilst also leaving teams stronger and more able to embed evaluation into their everyday practice.

Ellie Mulcahy, Head of Research, The Centre for Education and Youth
