Conducting evaluation during times of change: Lessons from policy and community responses to the pandemic in Scotland

This report reviews existing evidence from evaluations of Scottish Government COVID-19 measures, in order to develop key principles for embedding evaluation more systematically into policymaking during times of rapid change or disruption.


5. Theme 3 - Evaluating emergency support measures

A number of measures made funds or other resources available to individuals or organisations to meet urgent needs in the face of the pandemic. In some cases, application and reporting requirements that might normally have been a condition of funding were relaxed or suspended to facilitate a rapid response and partnership working, given the unique challenges facing organisations (e.g. third sector reports of overload, resourcing issues and fatigue[12]). This theme considers what these cases can tell us about how to evaluate emergency resource provision.

5.1 Key findings

Measures under the March 2020 £350m emergency fund to support community welfare and wellbeing[22] have been evaluated in terms of process, showing, for instance, the efficient distribution of funds, but impact evaluation was beyond the scope of the 'light touch' administrative and organisational feedback data[23]. 'Light touch' monitoring measures included, for instance, having Scottish Government staff pre-populate electronic survey forms, and surveying only for information not already available (e.g. thoughts on partnership working, key learning)[24].

Learning around successes and challenges could be compromised by data quality. Contributing factors included: the variable quality of returns; organisations operating across multiple areas or being able to make multiple applications; and, in the case of the Small Grants Fund[25], organisations developing their own monitoring forms.

The evaluation of the Wellbeing Fund[26] found that a lack of quantitative data to identify needs effectively may have led to duplication and under-provision in different geographic areas; the lack of data also meant it was not possible to comment on the extent to which this had occurred. The Scottish Welfare Fund review[14] calls for improved routine data gathering to facilitate future audit, in particular for better equalities data.

Qualitative research exploring low-income households' experiences of emergency funding measures was able to capture the impacts of those measures for those in need[12]. Advantages included: perspectives gathered directly from individuals rather than via the organisations distributing funds; sampling to compare outcomes for priority household types; and the opportunity to consider the impacts of multiple support measures. Disadvantages included the delay between accessing the funds and views being gathered, and not being able to verify how much support was received via which measures.

Some evaluations optimised outcomes data by linking existing datasets or by employing multiple strategies to contact potential participants. The evaluation of Connecting Scotland at Phase 1 and Phase 2 used this approach to gather data on outcomes for those receiving resources (although it took place later in the pandemic)[18]. An impact report on the work of Locality Operational Groups reported that effective sharing of monitoring data between services enabled breakdowns of consultation figures[27].

The evaluation of business support measures, conducted early in the pandemic, was able to assess impacts through survey data from businesses receiving support, and provides some breakdowns of funding recipients by sex or ethnicity[13].

Case studies of pandemic responses by Third Sector Interfaces identified a strong support role for Third Sector organisations and scope to improve their capacity to demonstrate outcomes[28].

5.2 Implications for evaluating in times of rapid change

  • While there is a necessary trade-off between speed and comprehensiveness when evaluating emergency support measures, building in a simple approach focused on impact from the start could result in better learning, including around value for money, without undermining the important principle of 'light touch'. For instance, a very short list of questions about the difference made could be introduced to 'light touch' forms for funding measures.
  • Having some systems already in place or accessible to monitor impact could be a criterion for organisations to receive emergency funding, if the impact of urgent support measures is to be evaluated. However, this would need to be weighed against the risk of excluding smaller organisations that are less resourced for monitoring but may be well placed to deliver in valuable or innovative ways.
  • Partner organisations' capacity to gather outcomes data during rapid change could be boosted through planning and resourcing strategies such as data sharing/linkage, learning or partnership working. For instance, if school engagement was an anticipated outcome of third sector support, public sector education data might usefully be shared or linked to recipient data. Third Sector Interfaces[28] and the Scottish Third Sector Tracker[7] have been seen to support learning around outcomes from third sector delivery, and such initiatives could be strategically resourced during times of disruption to enhance their support for monitoring. Collaboration between businesses and third sector organisations might also enhance the resources available to capture impacts.
  • Multiple ways of reaching participants who may not engage should be considered and planned for when evaluating in times of rapid change. The success of Connecting Scotland[19] in recruiting at least some end users to the research suggests a need to challenge or explore assumptions that individuals will not be able, available or willing to provide their views where this has not been directly ascertained.
  • Coherent ways of measuring the impacts of measures, beyond surveys and interviews, should be considered; for example, the Connecting Scotland Phase 1 evaluation suggests the 'Social Return on Investment model' could be used to demonstrate broader financial benefits[18].
  • Post-hoc qualitative research is a key way of independently assessing impact where outcomes data are not available, or where returns are low in number or quality. If organisations can be supported to record, at a minimum, who they have distributed funds to, then post-hoc qualitative evidence could be linked to those data to illuminate how different measures contributed to outcomes.
  • Engagement with marginalised groups should be a priority consideration from the outset of developing an emergency resource measure. Emergency support is likely to be disproportionately needed by such groups and monitoring how it impacts on them is key to understanding effectiveness. A partner organisation engaged with marginalised individuals could be funded specifically to gather or help gather outcomes information from that group; or peer research approaches could be considered.

Contact

Email: OCSPA@gov.scot
