Why Evaluate Blog - Neil Harrison

Neil Harrison

Associate Professor of Education Policy, University of the West of England

I have been involved in outreach and access work since the early 1990s, first as a practitioner (e.g. running a student tutoring scheme), then as a manager (e.g. overseeing student bursaries) and for the last ten years as an academic. I’ve looked into a wide range of social justice issues in higher education, including student finance, the experiences of care leavers, factors in retention and success, and the targeting of outreach activity. I am currently leading OFFA’s project exploring the evaluation of pre-16 outreach work.

Email: neil.harrison@uwe.ac.uk
Twitter: @DrNeilHarrison
Web: https://westengland.academia.edu/NeilHarrison

Why Evaluate?

One of the issues that I have been concerned about throughout the time that I’ve been involved with widening participation work is that of ‘deadweight’. It’s a rather harsh-sounding term derived from the world of social policy analysis, and it is used in assessing the effectiveness of an initiative in encouraging people to adopt a particular behaviour – e.g. applying to higher education.

Specifically, the deadweight comprises those people who would have behaved in the desired way (i.e. made an application) even without their involvement in the initiative. This is a key element in assessing whether an outreach activity ‘worked’ – not whether the participants ultimately apply, but whether they now apply because of their involvement when they would not have done otherwise.
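To make the arithmetic concrete, here is a minimal sketch with invented figures – these numbers are purely illustrative and are not drawn from any real programme or dataset:

```python
# Purely illustrative figures - not from any real outreach programme.
participants = 100        # young people taking part in the activity
applied = 80              # participants who went on to apply to HE
would_have_applied = 60   # counterfactual: would have applied anyway

deadweight_fraction = would_have_applied / applied   # 0.75
true_effect = applied - would_have_applied           # 20 changed behaviours
changed_rate = true_effect / participants            # 0.20 of all participants

print(f"Deadweight: {deadweight_fraction:.0%}; "
      f"true effect: {true_effect} applications "
      f"({changed_rate:.0%} of participants)")
```

On these invented numbers, a tracking study would report an impressive 80% application rate, while the activity actually changed the behaviour of only 20 of the 100 participants – which is exactly why the counterfactual matters.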

From my own experience of outreach activities and my reading of research studies, I am convinced that there is a very large amount of deadweight in the system. The activity involved is well-intentioned and may well have ancillary positive effects on the young people concerned, but it fundamentally represents a failure to target limited resources.

So, this is one reason why we need epistemologically sound evaluation within outreach. Eliminating deadweight should be a key concern for OFFA and for individual universities. The challenge is finding it, as (young) people are poor at second-guessing what they would have done in a different world. There are a number of threads to take forward with this in mind:

1. We need evaluations that focus on changes, not outcomes. This is difficult, but realist approaches based around ‘theories of change’ are likely to be most effective.
2. Tracking studies are particularly problematic, as they will tend to generate and endorse self-fulfilling prophecies. If an activity has a large deadweight, then it will ostensibly appear very successful in promoting higher education, and will therefore be more likely to have resources invested in it.
3. More broadly, activities focused on the post-16 age group are likely to be a key source of deadweight (see below).
4. Universities need to invest time in evaluating their targeting as well as their activities. In particular, schools’ approaches to identifying young people can be highly variable and potentially problematic.
5. More attention needs to be given to people who were once on the pathway to higher education, but who have fallen away – e.g. those with above-average attainment at KS2 but below-average attainment at KS3 (a minimal sketch of how such pupils might be identified follows this list).
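As a sketch of what the fifth point might look like in practice, the snippet below flags pupils with above-average KS2 but below-average KS3 attainment. The dataset, column names and scores are all hypothetical assumptions for illustration – in a real setting the data would come from school records or National Pupil Database extracts, and the thresholds would need careful thought:

```python
import pandas as pd

# Hypothetical pupil-level data - invented for illustration only.
pupils = pd.DataFrame({
    "pupil_id": [1, 2, 3, 4, 5],
    "ks2_score": [105, 98, 110, 101, 96],   # Key Stage 2 attainment
    "ks3_score": [95, 102, 93, 104, 99],    # Key Stage 3 attainment
})

ks2_mean = pupils["ks2_score"].mean()
ks3_mean = pupils["ks3_score"].mean()

# Pupils once 'on the pathway' (above-average KS2) who have since
# fallen away (below-average KS3) - candidates for targeted outreach.
fallen_away = pupils[
    (pupils["ks2_score"] > ks2_mean) & (pupils["ks3_score"] < ks3_mean)
]
print(fallen_away)
```

A simple mean-based cut like this is only a starting point; any real targeting exercise would want value-added measures and contextual data alongside raw attainment.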

Can you recommend any resources, readings, sources or examples of evaluation you have found particularly influential or useful?

I have found Claire Crawford’s reports massively helpful for demonstrating how the disadvantage we see in HE admissions is multi-dimensional and how it accumulates over a long period of time – something I instinctively knew to be true, but it’s really useful to have the hard numbers to quantify it. Most importantly, they demonstrate the importance of pre-16 work, as participation rates for different social groups are very similar once KS4 qualifications are taken into account.

Crawford, C. 2014. The Link between Secondary School Characteristics and University Participation and Outcomes. London: Department for Education.
Crawford, C., and E. Greaves. 2015. Socio-Economic, Ethnic and Gender Differences in HE Participation. London: Institute for Fiscal Studies.