The 5 Step Approach to Evaluation: Designing and Evaluating Interventions to Reduce Reoffending
Updated guidance on how to use the 5 Step approach to design and evaluate criminal justice interventions.
Step 5: Evaluate the logic model
Analysing your data to evaluate the project
Once you've collected some or all of your data, you can use it to analyse whether or not your model is working as predicted. Analysis is not just a case of describing your data; you need to address the following questions:
- What does the data tell you?
- Why are you seeing these results (it could be because of your activities or external factors)?
- What are you going to do about this? How can you improve the outcomes?
NB: Although you should definitely carry out this process at the end of your project, earlier interim analysis and evaluation is also highly valuable in order to identify problems and improve your service on an ongoing basis.
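If your monitoring data is held in a spreadsheet or CSV file, even a short script can move you from describing the data to interrogating it. Below is a minimal sketch in Python; the file name and column names (score_at_entry, score_at_exit) are hypothetical assumptions, so adapt them to your own data collection framework.

```python
# Minimal sketch: summarising before/after scores from monitoring data.
# The file name and column names (score_at_entry, score_at_exit) are
# hypothetical; substitute the fields from your own data collection.
import csv
from statistics import mean

with open("monitoring_data.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Change in each user's score between entering and leaving the project
changes = [float(r["score_at_exit"]) - float(r["score_at_entry"]) for r in rows]

improved = sum(1 for c in changes if c > 0)
declined = sum(1 for c in changes if c < 0)

print(f"Users: {len(changes)}")
print(f"Average change in score: {mean(changes):+.2f}")
print(f"Improved: {improved}, no change: {len(changes) - improved - declined}, declined: {declined}")
```

A summary like this tells you *what* the data shows; the rest of this step is about explaining *why* you are seeing these results.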
Who should carry out an evaluation?
Don't automatically assume that outside evaluations will be more helpful or reliable, nor that funders will necessarily view them this way.
As the table below shows, there are advantages and disadvantages to both outside and internal evaluations. You should consider these carefully before deciding which approach is right for your organisation.
You may also want to consider commissioning outside expertise to support with particular stages of the evaluation (e.g. designing a data collection framework or reviewing existing evidence).
Whatever your decision, remember to budget for either internal evaluation or external expertise in your funding proposals. ESS provide further guidance on budgeting for self-evaluation: http://www.evaluationsupportscotland.org.uk/resources/237/
Outside vs. internal evaluation
| | Self-evaluation by staff member(s) | Commissioning outside evaluation |
| --- | --- | --- |
| Advantages | | |
| Disadvantages | | |
Testing the logic model
Did the intervention work as it should? Look back at the research questions and see what the data tells you about each question. The data (quantitative and qualitative) will tell you whether the service worked as the model predicted. The following are example questions you could answer using the basic monitoring data you collected; a sketch of checking actual figures against your model's predictions follows the table.
| Stage | Example questions |
| --- | --- |
| Inputs | |
| Outputs | |
| Outcomes | |
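To make this concrete, here is a minimal sketch in Python of comparing actual monitoring figures against the predictions in a logic model. All measure names and targets are hypothetical examples; substitute the inputs, outputs and outcomes from your own model.

```python
# Minimal sketch: comparing actual monitoring figures against logic model
# predictions. All measure names and targets below are hypothetical.
predicted = {
    "staff_hours_delivered": 500,           # input
    "sessions_attended": 300,               # output
    "users_reporting_improved_skills": 40,  # outcome
}
actual = {
    "staff_hours_delivered": 430,
    "sessions_attended": 310,
    "users_reporting_improved_skills": 28,
}

for measure, target in predicted.items():
    achieved = actual[measure]
    status = "met" if achieved >= target else "below prediction"
    print(f"{measure}: predicted {target}, actual {achieved} ({status})")
```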
Explaining outcomes: Assessing contribution
Given the complexity of the social world, it is very unlikely that any single project can make a difference to people's behaviour on its own. Where change is evidenced in users (both positive and negative), it is likely that there are multiple causes, and your project will only be one part of the picture.
Without using a randomised control trial (which, as we have said, is often impractical), it is very difficult to truly measure the impact of a single project on outcomes, especially long-term outcomes such as reoffending. However, we can get a broad sense of the relative importance of the project and how it might have contributed to change, in conjunction with other influences.
There are two key ways of doing this:
- Subjective views on contribution
- Identifying potential outside influences
Subjective views on contribution
Users, staff and other stakeholders are valuable sources of evidence for assessing the relative contribution of your project to observed changes in users, in relation to other influences. You can:
1) Ask users whether they received other forms of support, or experienced other influences on their behaviour.
2) Ask users to rate the extent to which each form of help contributed to their success: for example, was it the project, their family, friends, another intervention or their own desire to succeed? (A sketch of tallying such ratings follows this list.)
3) Ask others who know the users (e.g. family, teachers, social workers) to rate the relative influence of the project on observed changes.
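A minimal sketch in Python of tallying such ratings, assuming each user has scored every source of help from 1 (no influence) to 5 (major influence). The sources and scores shown are hypothetical examples, not real data.

```python
# Minimal sketch: averaging users' ratings (1-5) of how much each source
# of help contributed to change. All data below is hypothetical.
from statistics import mean

ratings = [
    {"project": 4, "family": 5, "friends": 2, "other_intervention": 3, "own_motivation": 5},
    {"project": 3, "family": 2, "friends": 3, "other_intervention": 1, "own_motivation": 4},
    {"project": 5, "family": 3, "friends": 1, "other_intervention": 2, "own_motivation": 5},
]

# Average rating per source gives a rough sense of the project's
# contribution relative to other influences.
for source in ratings[0]:
    avg = mean(r[source] for r in ratings)
    print(f"{source}: average rating {avg:.1f}")
```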
Limitation!
Asking users and staff to judge the influence of a project runs the risk of 'self-serving bias'. This is the well-established tendency for people to take the credit for success and underplay external factors. One way to limit this tendency is to tell staff, users and other participants that you will be asking others to also assess the contribution of the project. Be honest about this limitation in your evaluation reports.
Identifying potential outside influences
By thinking about other potential influences outside your project which might also have affected behaviour change, you can put your own evidence into context.
Having identified these potential influences, you may then be able to exclude them, or acknowledge that they did in fact influence your own users.
For example, in relation to a project to improve the family relationships of female ex-prisoners in the community, potential influences you might consider are:
- Outstanding warrants - If some of the women were re-arrested on outstanding charges, this will have hindered participation.
- Child protection issues - Concerns around the safety and well-being of children may have prevented practitioners from working with some families.
- Economic conditions - Changes in income levels for the women could affect their participation in the project, for example by making travel costs harder to meet.
Explaining negative or mixed outcomes
It is extremely unlikely that your data will show that your model worked as predicted for all users. Be honest about this. It is helpful to analyse users with poor outcomes (no change or negative change), as well as those showing positive outcomes. Use the data (and any other relevant information) to consider:
- Are there any patterns in terms of who shows positive or poor outcomes? e.g. are there better outcomes according to gender, age, socio-economic group or offence type? (A simple cross-tabulation, sketched after this list, can reveal such patterns.)
- Can you explain these patterns through reference to the way the project was carried out? e.g. were activities better targeted at particular groups, or likely to exclude others?
- Are there any external factors which explain these patterns? e.g. do cultural norms or practical factors mean particular groups were always less likely to engage, such as women not engaging with drug services for fear of losing their children?
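A minimal sketch of such a cross-tabulation in Python, assuming hypothetical fields (gender, outcome) in your monitoring data; substitute whichever user characteristics and outcome categories you actually record.

```python
# Minimal sketch: cross-tabulating outcomes by subgroup to spot patterns.
# Field names (gender, outcome) are hypothetical; use your own categories.
import csv
from collections import Counter

with open("monitoring_data.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Count how often each (group, outcome) combination occurs
counts = Counter((r["gender"], r["outcome"]) for r in rows)

for (group, outcome), n in sorted(counts.items()):
    print(f"{group}: {outcome} = {n}")
```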
Remember! Your project cannot explain everything. You are only ever contributing to change. This is true of both positive and negative outcomes. If your project demonstrates poor outcomes, you should analyse external factors as well as internal processes in order to explain them.
What can you do to improve?
The crucial next step in the evaluation process is to use your explanations of outcomes in order to improve your model.
- Can you address any issues at the input stage (e.g. issues with staff training or resources)?
- Should you extend activities which appear to have been successful?
- Is it best to stop or redesign activities which the data suggests are ineffective?
- Can you improve the model to better target groups with negative outcomes?
- Can you do anything to address external factors which have had a negative impact, e.g. by providing transport?
Who needs to know about this?
Don't keep your evaluations to yourself! They are important sources of evidence to various groups.
- Funders will usually require an evaluation report in order to assess the contribution of a particular project (and their funding of it) to positive change. Remember, funders will also want to see evidence of a commitment to continual improvement. So be honest about difficulties and clear about future plans. Advice on producing evaluation reports can be found in appendix 2.
- Staff should ideally be involved in the production of evaluations (particularly at the stage of explaining outcomes and planning for improvement) and should certainly be informed of their findings. This will ensure everyone has a shared vision of how the project is working and how to improve their practice.
- Other organisations, particularly those with similar aims, may be able to benefit from your evaluation findings when planning their own projects. Your evaluation contributes to the evidence base which others should review.