staff and participant costs, as well as ‘other costs’, such as ‘publicity, printing [and] stationery’
(HEFCE, 2003: 5). HEPs also had to adopt the ESF’s methodology for calculating expenditure.
According to practitioners involved in the scheme, this included keeping staff timesheets and
adopting a standard formula for calculating accommodation and classroom costs.
Although a ‘challenging’ departure from previous funding requirements (HEFCE, 2009: 2), the
practitioners recalled that, once systems had been established and new procedures mastered,
calculating and recording actual costs proved manageable. Indeed, it was noted that the process
raised awareness of the scale and range of inputs associated with delivering summer schools.
Similarly, the wealth of information generated enabled HEFCE to conduct a cost-based analysis
of the scheme, including a consideration of the ‘average total cost’ of each summer school, along
with an analysis of the costs per participant and per participant day (HEFCE, 2009: 7).
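By way of illustration, the unit-cost measures described above rest on straightforward arithmetic. The short sketch below computes an average total cost, a cost per participant and a cost per participant day for a set of summer schools; all of the figures are invented for demonstration and are not drawn from the HEFCE analysis.

```python
# Hypothetical illustration of the unit-cost measures discussed above:
# average total cost, cost per participant, and cost per participant day.
# All figures are invented; none come from the HEFCE (2009) analysis.

summer_schools = [
    # (total_cost_gbp, participants, duration_days)
    (42_000, 60, 5),
    (35_500, 48, 4),
    (51_200, 70, 5),
]

total_cost = sum(cost for cost, _, _ in summer_schools)
total_participants = sum(p for _, p, _ in summer_schools)
total_participant_days = sum(p * d for _, p, d in summer_schools)

average_total_cost = total_cost / len(summer_schools)
cost_per_participant = total_cost / total_participants
cost_per_participant_day = total_cost / total_participant_days

print(f"Average total cost:       £{average_total_cost:,.2f}")
print(f"Cost per participant:     £{cost_per_participant:,.2f}")
print(f"Cost per participant day: £{cost_per_participant_day:,.2f}")
```

In practice, of course, the harder task is assembling the cost figures themselves rather than performing the division.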
However, ESF summer schools came to an end some seven years ago. Although the ‘rigorous’
standards of record keeping associated with this scheme (HEFCE, 2003: 5) have not been
replicated in the field of widening access since that time, other sectors have moved towards
evaluating provision using CEA, including health and international development. Their experience
may inform any plans to apply CEA to the assessment of today’s widening access work.
Whilst ESF guidance placed a premium on capturing ‘direct, actual costs’ (HEFCE, 2003: 5), it is
argued that CEA should be based on the concept of opportunity cost, addressing the question of
what is being sacrificed by directing resources to the selected initiative (Phillips, 2009).
Consequently, indirect as well as intangible costs need to be considered (Kaplan, 2014). This, for
instance, would include costs incurred by school and college staff in supporting outreach schemes.
It might also include the costs sustained by participants, in terms of their time away from the
classroom, as well as those incurred by their parents (Phillips, 2009). In addition, experience of
conducting CEAs in the field of international development draws attention to the need to
acknowledge that a greater outlay may occur in engaging those from ‘harder-to-reach’ groups
(Jackson, 2012: 3). This said, such studies advocate adopting a pragmatic approach and basing
analysis on those costs deemed reasonable to capture (Raftery, 1999; Jackson, 2012).
Whilst there may be a clear rationale for adopting some degree of CEA, and previous summer
school experience indicates a capacity for widening access practitioners to do so, there is also a
need for reflection: in weighing up the ‘costs’ and ‘benefits’ of this method of evaluation, in
exploring how to counter the likely challenges, and in envisaging what a consistent approach
might look like. With this in mind, I would welcome the insights of colleagues who have experience
of deploying CEA, or who are interested in further exploring both the case for and against its
application.
Neil Raven
Educational Consultant, Loughborough University
[email protected]
References
THE CENTRE FOR RECORDING ACHIEVEMENT 104-108 WALLGATE, WIGAN, WN3 4AB