Conducting evaluation during times of change: Lessons from policy and community responses to the pandemic in Scotland
This report reviews existing evidence from evaluations of Scottish Government COVID-19 measures, in order to develop key principles for embedding evaluation more systematically into policymaking during times of rapid change or disruption.
6. Theme 4 - Evaluation design
The 27 evaluations included in this review vary in nature but are generally lacking in economic evaluation. Process evaluation has been more common than impact evaluation, with 26 evaluations addressing the former and 18 the latter, although evidence around impact in many of those studies has been limited. Mixed-method approaches were common (adopted in 21 of the evaluations), though qualitative components drew on a limited range of approaches. This theme considers the reasons for these design choices and how they have shaped the learning gained.
6.1 Key findings
Some choices in evaluation design were driven by practical constraints. The evaluation report for business support measures,[13] for instance, notes that international comparisons were not undertaken because too few comparator studies were available for these to be meaningful. Qualitative work often engaged those delivering rather than those receiving interventions, who may have been easier to access. As noted in the theme on Emergency Support Measures, evaluations of those schemes focused solely on process because of the minimal monitoring data requirements placed on recipient organisations. Reliance on available data not designed for evaluation may also have limited the approaches that could be taken; this constrained, for example, the questions that could be addressed in the evaluations of Asymptomatic[29] and Targeted[8] Community Testing.
Some studies were designed to meet specific information needs arising from other evaluations. Qualitative research on barriers to adherence to social distancing,[12] for instance, was undertaken to help explain behavioural patterns already observed in response to restriction measures. Because it aimed to inform the ongoing development of the Shielding Programme,[3] the rapid evaluation took a flexible approach focused on process and delivery, deferring consideration of outcomes to a later follow-up survey that explored impacts over time for those shielding.
Only one of the evaluations examined set out to assess value for money. The cost-effectiveness pilot in Tayside for the Vaccination Programme delivered important learning:[11] for instance, that rural delivery models, despite being more isolated and requiring higher-band staff for safety reasons, were still more cost-effective than the mass vaccination delivery model. The report nevertheless found that difficulties in accessing data prevented moving beyond a pilot to a Scotland-wide evaluation.
Economic evaluation represents a significant gap in the studies reviewed, and there is little indication in the other reports of why it was not within scope. Evaluations of both the Connecting Scotland Phase 1 and Connecting Residents in Care Homes programmes[10] acknowledge that using the 'Social Return on Investment' model could have demonstrated the broader financial benefits of the programmes. However, the latter evaluation report notes that economic evaluation was not in scope, preventing detailed recommendations on the degree of investment in the programme.
Resources may also have limited the scope of evaluations, though resourcing is not generally addressed in the reports. Evaluations commissioned by the Scottish Government were most commonly undertaken in-house, and only two of the externally commissioned evaluations (UHVP[4] and the Civil Justice System's Pandemic Response[17]) had budgets over £100k (with expenditure on the former covering considerably more than the COVID-related component).
6.2 Implications for evaluating in times of rapid change
- Periods of rapid change or disruption represent unique learning experiences. Fuller resourcing of a wider range of evaluation activity will maximise learning from these periods of change and experimentation in how we design and deliver services. In particular, there should be a greater expectation across government that evaluation of any intervention will deliver learning around value for money.
- Better evaluation in times of rapid change could be supported by setting guidelines in advance on what proportion of emergency spend it is appropriate to allocate to evaluation.
- While there will be urgency to deliver any evaluation activity in response to an emergent disruption, detailed prior consideration of what precise knowledge is most important will help to drive a wider range of evaluation approaches and answer a wider range of questions.
- It is likely that decisions around evaluation design during the pandemic were driven by which data were perceived as accessible and which approaches as achievable. Although studies made creative use of available datasets, this placed limitations on what evaluation activity was seen to be in scope. Where several measures are deployed in one area of response to disruption, a coordinated approach to evaluating them together may help to address gaps. For instance, a unified process evaluation of the different emergency funding measures could have optimised resource use, freeing up capacity to explore other questions such as impact or value for money.
- Consideration and clear definition of outcomes in any responsive policy design is vital to capturing and understanding impact subsequently. The stronger tendency towards process rather than impact evaluation may reflect information needs as programmes were developing, or the need for learning before many outcomes had been realised. However, impact evaluation represents a key gap in learning, and its scarcity may also reflect a lack of clarity on expected outcomes for measures introduced at speed. The need to follow up early process evaluation with subsequent impact evaluation should be recognised in designing a rapid response.
- Work could be undertaken now to consider how best to design flexible evaluation strategies, capable of evolving in response to the changing priorities and information needs of future disruptions. This might involve, for instance, modelling different scenarios in which rapid change unfolds, and setting principles for how aims, methods and procedures should be reviewed in each eventuality.
Figure 1: Timeline of evaluations of COVID-19 measures (timelines for third sector reviews shown in lighter shade)
Contact
Email: OCSPA@gov.scot