Inclusion health action in general practice: early evaluation report
An early stage evaluation of the inclusion health action in general practice programme.
Appendix Three: Technical Methods
This technical annex to the final evaluation report provides further details, for a technical audience, of the methodology used to deliver the evaluation. The evaluation tools are also included as an appendix.
IHAGP evaluation methodology
A description of the methodological approach used to deliver the IHAGP evaluation is detailed in the following sections and covers:
- The tools developed to undertake data collection activity.
- The approach to participant sampling.
- Recruitment of participants and fieldwork undertaken.
- The approach to data analysis.
- Details of the reporting outputs produced.
Development of evaluation tools
Following an inception meeting with the client team and a review of background documentation, the following tools were developed to support evaluation fieldwork and data collection:
- Participant information form, which provided evaluation participants with the information required to make an informed choice about their participation. The form covered the following:
- The purpose and aims of the evaluation.
- Why participants were being invited to take part in the evaluation.
- What participation involved.
- GDPR, data protection and data storage.
- Confidentiality and how findings would be used.
- The voluntary nature of participation.
- Processes for questions and complaints.
- Privacy notice.
- Discussion guide for semi-structured interviews.
Evaluation tools were reviewed by the IHAGP Research Advisory Group and approved prior to any fieldwork and data collection activity commencing.
Copies of the evaluation tools that were developed are included as an appendix.
Sampling approach
A purposive sample of 30 General Practices participating in the IHAGP programme was selected and invited to take part in the evaluation fieldwork. The sample provided proportionate representation across the following criteria:
- Practice list size – small (<3,000 patients), medium (3,001-6,000), large (6,001+). The categories for patient list size were developed by identifying the smallest practice list size among participating practices (1,362) and the largest (10,407), and setting categories to reflect roughly a third of the minimum-to-maximum range (see the illustrative sketch after this list).
- % of patients living in the 15% most deprived areas, as classified by the Scottish Index of Multiple Deprivation (SIMD 15) – low (<50% of patients), medium (50-70%), high (70%+).
- Number of IHAGP themes practices were delivering activity under.
- The IHAGP theme chosen by practices to deliver activity under.
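To make the banding concrete, the sketch below shows how a practice could be assigned to the list size and deprivation strata described above. This is an illustration only, not the evaluation team's actual tooling: the thresholds are taken from the criteria above, while the record fields and the handling of boundary values are assumptions.

```python
# Illustrative sketch only - not the evaluation team's actual tooling.
# Thresholds come from the sampling criteria above; field names and the
# handling of boundary values are assumptions.

def list_size_band(patients: int) -> str:
    """Band a practice by list size: small (<3,000), medium (3,001-6,000), large (6,001+)."""
    if patients <= 3000:  # 3,000 itself sits between the published bands; treated as small here
        return "small"
    if patients <= 6000:
        return "medium"
    return "large"

def simd15_band(pct: float) -> str:
    """Band a practice by % of patients in SIMD 15 areas: low (<50%), medium (50-70%), high (70%+)."""
    if pct < 50:
        return "low"
    if pct < 70:  # the published bands overlap at exactly 70%; treated as high here
        return "medium"
    return "high"

# Example with a hypothetical practice record:
practice = {"list_size": 1362, "pct_simd15": 72.5, "themes_chosen": [3]}
stratum = (
    list_size_band(practice["list_size"]),
    simd15_band(practice["pct_simd15"]),
    len(practice["themes_chosen"]),  # number of IHAGP themes selected
)
print(stratum)  # -> ('small', 'high', 1)
```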
Furthermore, 9 practices that did not take up voluntary participation in the programme were also invited to take part in the evaluation, with the aim of exploring barriers to engagement and whether any changes to the programme could enable or encourage participation. However, no practices took up the offer, and we are therefore unable to provide any findings in relation to this.
Due to a slow response from the initial sample of 30 practices, the decision was taken, in conjunction with the client team, to open the opportunity to all 66 practices participating in IHAGP. The tables below provide a breakdown of the practices that participated in evaluation fieldwork against the sampling criteria.
Sample breakdown by practice list size
Practice list size / Number of practices
- Large (6,001+): 4
- Medium (3,001-6,000): 9
- Small (<3,000): 2
Sample breakdown by percentage of patients in SIMD 15
% of patients in SIMD 15 / Number of practices
- High (70%+): 4
- Medium (50-70%): 10
- Low (<50%): 1
Sample breakdown by IHAGP theme chosen to deliver activity
Theme chosen / Number of practices
- 1 - Patient Engagement: 3
- 2 - Staff Training: 8
- 3 - Extended Consultation and Outreach: 9
Sample breakdown by number of IHAGP themes selected
Number of themes chosen / Number of practices
- 1: 11
- 2: 3
- 3: 1
Monitoring data
All practices participating in the IHAGP programme were required to submit two monitoring forms (template in Appendix One) over the duration of the programme. The first monitoring forms were to be submitted in August 2023 and the second in December 2024, though many submissions were delayed.
The Scottish Government analysed the individual monitoring returns and produced an anonymised summary of the data they contained. The summary report was shared with the evaluation team so that its findings could be drawn into the overall analysis.
When inviting practices to participate in the evaluation fieldwork, consent was sought to share the monitoring returns with the evaluation team. Where consent was given, the forms were accessed by the member of the evaluation team conducting fieldwork with the practice to gain an overview of progress and activity being delivered and any learning shared in the return.
Fieldwork
Recruitment of evaluation participants
The following steps were taken to recruit practices for the evaluation:
1. Scottish Government made initial contact with practices via email, providing the information and consent form, and asking practices to provide initial consent:
- a. To participate in the evaluation.
- b. For their contact details to be shared with the evaluation team.
- c. For monitoring data relating to their IHAGP activity to be shared with the evaluation team.
2. Contact details and monitoring returns of consenting practices were shared with the evaluation team.
3. The evaluation team contacted each practice to plan for semi-structured interviews to take place.
4. Semi-structured interviews were then carried out as per the agreed arrangements.
5. Additional reminders and follow-up emails were sent throughout recruitment to improve the sample. This included targeted follow-up with practices less well represented in the sample (i.e. those with a list size in the ‘small’ category, and those with a ‘low’ percentage of patients in SIMD 15). Targeted follow-up was also undertaken with practices that, based on a subjective appraisal of their monitoring forms, had well-developed projects and had potentially generated learning that could inform the evaluation.
Over the duration of the evaluation, 19 practices provided consent to participate. All 19 were contacted by the evaluation team to plan for the fieldwork to take place, though 3 did not respond to contact attempts and 1 had to withdraw due to capacity challenges.
Fieldwork and data collection with IHAGP practices
To minimise barriers to participation, we explored participant preferences and accommodated their availability. This included:
- Giving participants the option to contribute to the evaluation through one-to-one or group discussion.
- Offering face-to-face, telephone and video call as options for participation.
- Being responsive to participant availability, including offering and accommodating times outside of ‘normal’ working hours including evenings and weekends.
Over the duration of the fieldwork, 23 staff from across 15 practices participated in an interview, with an average interview length of 40 minutes. This comprised 19 one-to-one interviews and 2 paired discussions, delivered through the following methods of engagement:
- Face-to-face at the participating practice (4)
- Video call (15)
- Phone call (2)
The 23 staff were made up of:
- 16 GPs
- 5 practice managers
- 1 community link worker
- 1 pharmacist
The client team at the Scottish Government also identified 4 strategic stakeholders to be interviewed as part of the evaluation to explore their perspectives on the programme. All 4 stakeholders consented to participate, but only 3 responded to contact from the evaluation team to schedule an interview. The 3 stakeholder interviews that went ahead were all carried out by video call: one with NHS Greater Glasgow and Clyde Health Board, one with Glasgow City Health and Social Care Partnership, and one with the Scottish Government.
Participation in the evaluation among General Practices was lower than hoped due to a lack of capacity within practices and the short timescale for the evaluation. We highlight that this was a self-selecting sample, and the findings may not represent all practices’ experiences. However, evidence from monitoring data suggests that similar types of work are under way across the programme, though at different stages. Furthermore, the evaluation was not able to engage directly with patients due to the time constraints for delivery of the evaluation and the timeline involved in gaining the required ethical approval. Therefore, findings presented in the main report relating to patient experience and patient outcomes are based on reports from practice staff.
All interviews were recorded with the participant’s consent. Recorded interviews were then transcribed to inform the data analysis phase. Transcribing recorded interviews involves a two-step process (a minimal sketch follows the steps below):
1. The recording is run through automated transcription software, which produces a typed version of the recording.
2. A member of staff then listens to the recording while reviewing the typed version generated by the transcription software, making any required changes or additions to ensure complete accuracy.
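As an illustration of this workflow, the sketch below uses the open-source Whisper library purely as a stand-in for the automated transcription software; the report does not name the tool actually used, and the file names are hypothetical.

```python
# Step 1 of the two-step process: automated transcription produces a
# first-pass typed version of the recording. Whisper is used here purely
# as a stand-in; the actual software used is not named in the report.
import whisper

model = whisper.load_model("base")
result = model.transcribe("interview_recording.mp3")  # hypothetical file name

with open("interview_draft.txt", "w", encoding="utf-8") as f:
    f.write(result["text"])

# Step 2 is deliberately human, not automated: a member of staff listens
# to the recording while reviewing interview_draft.txt and corrects it,
# since automated output alone cannot be relied on for accuracy.
```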
Approach to data analysis
The following sets out the key steps in the analysis process:
- Each team member reviewed a sample of colleagues’ transcribed interviews, identifying themes and evidence aligned to the evaluation objectives, key questions, and theory of change/logic model.
- Each team member prepared an individual codification framework to support thematic analysis of the qualitative data (an illustrative sketch of such a framework follows this list).
- All team members participated in a facilitated team workshop, where each team member presented their initial analysis, which was then discussed and challenged until a consensus was reached.
- Once this was completed for each team member’s initial analysis, the team members worked collaboratively to develop a single, consistent codification framework.
- Thereafter, each team member was allocated interview transcriptions to analyse and code the qualitative data.
- During the analysis process, quotes from the coded responses that illustrate and support key findings were identified and highlighted for inclusion in the report writing process.
- The anonymised summary of monitoring returns produced by the Scottish Government was reviewed during the analysis process and its findings drawn on to:
- Provide an overview of delivery and activity across the programme.
- Provide quantitative data relating to the activity delivered.
- Support and supplement findings from interviews with practices.
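As a purely hypothetical illustration of what a shared codification framework and coded excerpts might look like in data-structure terms (the actual code labels, definitions and identifiers used by the team are not reproduced here):

```python
# Hypothetical illustration of a shared codification framework and coded
# excerpts; the actual labels and definitions are not reproduced here.
from dataclasses import dataclass, field

@dataclass
class Code:
    label: str                                      # e.g. "barriers/capacity"
    definition: str                                 # wording agreed at the team workshop
    sub_codes: list = field(default_factory=list)   # added as sub-themes emerge

@dataclass
class CodedExcerpt:
    transcript_id: str      # which interview the excerpt came from
    code: str               # label from the shared framework
    text: str               # the excerpt itself
    quotable: bool = False  # flagged for possible inclusion in the report

framework = {
    "barriers/capacity": Code(
        label="barriers/capacity",
        definition="Staffing or time constraints affecting delivery",
    ),
}

excerpt = CodedExcerpt(
    transcript_id="practice_07",  # hypothetical identifier
    code="barriers/capacity",
    text="We simply didn't have the staff time to take this further.",
    quotable=True,
)
```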
Peer review of colleagues’ work was ongoing throughout the process to ensure consistency and accuracy. At regular intervals the codification framework was reviewed to identify any areas where sub-themes were emerging, and additional codes were added to the framework to bring further detail and depth to the analysis.
The data captured through stakeholder interviews was analysed in line with the process set out above. The analysis of this data was then cross-checked against the analysed data gathered from engagement with participating practices to explore alignment between stakeholder perceptions and the experiences of practices (e.g. types of activity being delivered, progress, learning, sustainability). This process found that stakeholder perceptions broadly aligned with the experiences of practices, though feedback from practices was more in-depth and detailed due to their direct involvement in delivery, whereas stakeholder feedback was more generalised or based on perceptions.
One stakeholder was also a GP at a practice participating in the IHAGP programme. They shared their experiences of delivering activity through IHAGP, and these were drawn into the analysis of responses from the other participating practices that had engaged in the evaluation.
Reflections on the heterogeneity of data collection methods
The following sets out the variation in data collection methods that were applied during engagement with participating practices, and discusses any implications or limitations this presented in the analysis and reporting process:
- Interview length – While the average interview lasted 40 minutes, this varied from 30 to 60 minutes. However, interview length did not reflect any variation in the depth or quality of data; rather, it reflected the type and number of activities being planned or delivered by a practice and the number of themes they were delivering activity under.
- One-to-one or paired interviews – Two interviews were paired interviews (all others were one-to-one) which involved a member of the research team and two members of staff from the participating practice. Prior to the interview the researcher checked that each participant was comfortable speaking openly and honestly with their colleague present, and confirmed this was their preferred method of participation. During paired interviews the researcher ensured both participants were given the opportunity to share their experiences and perspectives for each of the questions asked.
- Mode of interview – Interviews were conducted using a combination of face-to-face, telephone and video call. The mode of interview was selected by each participant based on their own preferences, which contributed to effective engagement with the evaluation. Again, no differences in depth or quality of data were observed across the different modes of interview.
- Professional role of participant – While interviews were conducted with a mix of different roles across different practices, the participants were those who had taken a lead role in the implementation or delivery of the programme, or of a particular activity that had been developed and delivered as part of the programme. Therefore, each participant was able to provide an informed view of their own experiences, learning and perceived achievements based on their own involvement.
It is also worth noting that a single discussion guide was used during all interviews, which ensured consistency in the lines of enquiry that were explored with every participant. This enabled the development of a single, consistent thematic coding framework to support the analysis of each interview.
Furthermore, to assist the reader in interpreting the findings, the following terms were used throughout the report to provide an indication of prevalence (a sketch expressing these thresholds as a function follows the list):
- Most practices/staff: over half of participants provided feedback relevant to the theme presented.
- Many practices/staff: a third or more, but less than half.
- Several/some respondents: more than a few, but less than a third.
- A few: a view expressed by roughly three individuals.
- One/two: a singular view, or a view identified from two participants.
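To make the thresholds explicit, the sketch below expresses them as a function of the number of participants expressing a view. The report's definitions are approximate ("roughly three", "over half"), so the exact boundary handling here is an assumption.

```python
# Sketch of the prevalence terms as a function. Boundary handling is an
# assumption, as the report gives approximate definitions.

def prevalence_label(n: int, total: int) -> str:
    """Map the number of participants expressing a view to a prevalence term."""
    if 2 * n > total:      # over half
        return "most"
    if 3 * n >= total:     # a third or more, but not over half
        return "many"
    if n > 3:              # more than "a few", but less than a third
        return "several/some"
    if n == 3:             # "roughly three"
        return "a few"
    return "one/two"

# Examples with the 23 practice staff interviewed:
print(prevalence_label(12, 23))  # most (over half)
print(prevalence_label(8, 23))   # many (a third or more, less than half)
print(prevalence_label(5, 23))   # several/some
```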
Overall, the heterogeneity of data collection approaches did not create any limitations or affect the analysis or reporting process.
Reporting outputs
The following reporting outputs were produced:
- A final evaluation report which detailed the learning generated, and evidence of emerging outcomes.
- A suite of eight thematic case studies.
- A technical annex to the full report.
- A two-page infographic highlighting the key findings.
- A short animation covering key findings.
- Three workshops with stakeholders to reflect on the findings and integrate them into the programme theory and design.
- Two workshops with practice staff to share learning and integrate it into programme implementation.
Contact
Email: socialresearch@gov.scot