Tackling child poverty - place-based, system change initiatives: learnings
This report provides early evidence and learning from a range of initiatives that aim to tackle child poverty through working in partnership to provide holistic, person-centred support for parents and families.
5. Assessing effective approaches to system change
Key messages
As most initiatives are in the early stages of design and implementation, it is too early to say whether particular approaches are more or less effective or have resulted in positive outcomes. However, early implementation findings highlight the importance of ensuring that monitoring and evaluation are in place from the outset.
There were two key challenges experienced by initiatives in setting up monitoring and evaluation processes for system change initiatives:
- the capacity of local partners to implement monitoring requirements (such as measurement frameworks), including having the staff capacity to collect and analyse data
- knowing how to effectively measure long-term system change: in particular, how to define and collect outcomes that indicate system change, as well as the practicalities of collecting data that cuts across a complex system, and often across multiple services and organisations.
In terms of scaling up system change initiatives, most are in the early stages and have not yet reported on scalability and replicability. Where this has been considered, stakeholder views suggest that scaling up might be better based on a set of principles and values, rather than aiming for the exact replication of an approach.
In order to assess the effectiveness of place-based, system change initiatives – and the extent to which approaches can, and do, contribute to reductions in child poverty – we need to understand the mechanisms by which these changes are expected to occur, by establishing logic models and evaluation frameworks. While it is too early to report on effective approaches to system change, this chapter focuses on the experiences of different initiatives in setting up and implementing monitoring and evaluation processes. Future reports will consider the question of effectiveness in more detail.
Monitoring and evaluating system change initiatives
Across the selected initiatives, stakeholders responded in mixed ways to the establishment of monitoring and evaluation frameworks, and as yet there is no consensus on best practice approaches to assessing the effectiveness of system change initiatives. Developing appropriate, valuable and robust monitoring and evaluation frameworks appeared to be a consistent challenge across initiatives. However, elements of good practice are also emerging, with stakeholders considering, and engaging with, partners to develop frameworks which are appropriate for all. This section outlines the challenges of establishing monitoring and evaluation processes for place-based, system change initiatives.
Two key challenges were reported across the initiatives. These are:
- Capacity of local stakeholders to implement monitoring and evaluation frameworks
- Effective measurement of long-term system change
Capacity of local stakeholders to implement monitoring and evaluation
For NOLB, a shared measurement framework was developed with partners from across the public, third and private sectors, in order to provide evidence on the outcomes of the initiative. However, levels of awareness of, and engagement with, the measurement framework were mixed. Some local authorities, often those that were larger and had more resources and capacity, embedded it into their work, while others, often in smaller areas with less capacity, found it more challenging to engage with and embed the framework.
Some local authorities reported valuing and benefitting from the framework.
“One local authority described how they are now tracking things they had not done before, such as the number of service users who have a cognitive impairment. The insights from this are being used to inform and support continuous improvement in service delivery.” No One Left Behind implementation report
However, the general consensus reported in the implementation report was that the monitoring and reporting requirements for NOLB were time consuming. In particular, the standardisation of information set out in the framework was deemed ‘frustrating’, as partners and local authorities have different existing processes and systems for collecting and recording data. Providing the information in a standardised form therefore required considerable engagement, communication and support between local authorities and delivery partners.
Further, feedback from NOLB stakeholders and employability staff highlighted the administrative burden of monitoring and reporting requirements for the Scottish Government. This suggests that more time needs to be given to engaging stakeholders when designing and implementing measurement frameworks.
“The consensus was that the administration [of monitoring and reporting requirements] associated with service delivery was time consuming. One local authority stakeholder described how the time they spent collating and organising data could justify recruitment of a full-time staff member. Another noted that whilst the Scottish Government had been receptive to feedback on streamlining requirements, there were frustrations with how often changes were made to monitoring requirements and the time it takes to implement these.” No One Left Behind implementation report
As part of Phase 1 of the Pathfinders evaluation, a draft monitoring framework was also developed. However, partners and stakeholders felt they would have benefitted from having a monitoring framework in place earlier, for use in the design and early implementation of the Pathfinders. In particular, some stakeholders felt it was difficult to know whether the Pathfinder was on track due to the lack of a monitoring framework. Indeed, a key recommendation of the Pathfinder early implementation report was to:
“…[have] a locally agreed monitoring and evaluation framework in place at the beginning will give better transparency on what is being achieved, help Pathfinders to stay on track and perform well, enable Pathfinders to identify what has worked less well and to learn from that, [and] enable data and evidence led decision making.” Pathfinders early implementation report
This finding echoes earlier points about the importance of taking time in the early stages of designing place-based initiatives to develop relationships between partners and establish common aims and ways of working – as part of this, it is important to allow time to develop and properly embed monitoring frameworks.
The ACF evaluation report also provides key learning and good practice for future monitoring and evaluation. This includes developing monitoring and evaluation frameworks that enable projects to assess how well their processes work and to adapt to better meet the needs of families over time. Further, there is a need to support partners to collect the required information, and to clearly communicate the Scottish Government's requirements and expectations around the use of these frameworks.
Effective measurement of long-term system change
Throughout this report, it has been noted repeatedly that system change can take a long time to implement and embed. This is an important element, and a key challenge, to consider in the design of measurement and monitoring frameworks too.
“Key barriers to sustainability were felt to be around continued funding challenges and having robust monitoring and evaluation in place to be able to evidence impact. It was felt that decisions on continued funding needed to be based around an understanding that systems change is a long term process and it will take a while for any outcomes to be fully realised and evidenced.” Pathfinders early implementation report
There is also the challenge of being able to robustly show attribution[12] and providing clear signs of impact across a complex interlinked system of policies and services.
“Another risk relates to the nature of WFWF as a systems-wide change initiative; it requires CSPPs to monitor many different services and support and to be able to link them together to assess their combined impact.” Whole Family Wellbeing Funding interim report
It is worth noting that attribution is always a challenge in evaluation, but particularly so for system change initiatives, which are often complex, evolving and lack clear parameters. The range of interactions, factors and individuals in a complex system can make it difficult to establish clear and stable causation and attribution between the inputs of system change initiatives and the outcomes they seek to achieve. Therefore, in evaluating system change initiatives it may be more appropriate to consider approaches that focus on contribution, rather than attribution.[13] A further issue for system change initiatives is the difficulty of defining the components of system change and of measuring them effectively (i.e. what can be measured and how).
“…data currently collected on partnerships and on activities undertaken to influence systems change is minimal and not systematically collected. It is currently insufficient to assess change against the monitoring framework. This is partly due to the lack of clarity on what ‘systems change’ is and how the Pathfinder should go about achieving it as described.” Pathfinders early implementation report
CSPPs involved in WFWF faced similar challenges, particularly in ensuring the effective measurement of softer outcomes. For example, the WFWF interim report highlighted how some CSPPs had limited staff capacity to ensure the collection and analysis of high-quality data that best captured the experiences of children, young people and families. Further, CSPPs faced a challenge in working out how best to evidence the combined impact of a range of services and initiatives across different policy areas. Indeed, a recommendation of the WFWF interim report was for CSPPs and Scottish Government to:
“…continue to work together to support CSPPs to articulate their intended outcomes of the WFWF. This would be a useful step before then considering how best to measure these with either existing evidence or through new evidence collection.” Whole Family Wellbeing Funding interim report
Scaling up and replicating system change initiatives
The Pathfinders early implementation report considered how system change initiatives might be scaled up. It suggested that scaling up should be based on the principles and values behind initiatives, and that the approach did not lend itself to exact replication. Indeed, exact replication was felt to work against place-based and person-centred values.
“Several partners and stakeholders mentioned that it was the concept of working together across organisations to deliver person-centred support that would be replicable, where the model itself would need to be adapted to suit the locality...As such, the actual execution would depend on local needs and resources…” Pathfinders early implementation report
However, there were also some partners across both Pathfinders who felt it was too early in Pathfinder delivery to assess whether such activity could be replicated or scaled up in other areas. There were also several partners who felt that cost-effectiveness had to be assessed before a case could be made to sustain Pathfinder activity in the longer term.
“Most [participants from both Pathfinders] expressed that there was a need to ensure that monitoring data was being routinely collected to enable evaluations and to support cost-benefit analysis. Many felt that it would be necessary to examine long-term outcomes before it was possible to say whether the Pathfinders were delivering sufficient results to support a case for them to be sustained in the long-term.” Pathfinders early implementation report
It is worth noting that conducting Value for Money analysis of complex interventions (such as through cost-benefit analysis) also poses challenges, for reasons similar to the challenges of determining robust attribution set out above. These include: the iterative design of initiatives; complex interventions operating across multiple levels of the system; and evolving interventions without clear parameters or fixed processes.
Discussions of scalability and replicability are currently absent from the outputs of other system change initiatives, likely because the majority are still in the early stages of design and implementation.