Scottish Crime and Justice Survey: user workshops - summary
Summary of the feedback gathered during three user workshops on the Scottish Crime and Justice Survey held during January 2022.
9. Annex A: Paper for Workshop 1 on Survey Design Options
1. Introduction
The following paper outlines a number of options for how the Scottish Crime and Justice Survey (SCJS) could be carried out in its new iteration, under the new contract. This Options Paper has been appraised internally and we are now sharing it with you, our external users, stakeholders and potential suppliers. Your input and feedback will help us to make an informed choice when selecting a Preferred Option; as such, any and all comments on the options presented are very welcome.
2. Context
The Scottish Crime and Justice Survey (SCJS) in its current form was established in 2008/09, although a crime survey has run in Scotland since 1982. From October 2015, the SCJS has been delivered by Ipsos & ScotCen and this contract is coming to an end[3]. Therefore, to ensure the continuation of the survey and the continued provision of evidence on crime victimisation in Scotland, a re-procurement process is needed. Our proposed procurement timeline would ensure that a Supplier would be in place in October 2022 and able to begin fieldwork in Spring of 2023.
The COVID-19 pandemic has had a significant impact on the SCJS. In 2020 the SCJS was suspended and the Scottish Victimisation Telephone Survey (SVTS) was developed. The Scottish Government introduced the SVTS as a discrete collection, separate from the SCJS, and the results are based on a sample of around 2,700 telephone interviews conducted in September and October 2020. The current survey year (2021/22) has also been affected by research restrictions as a result of COVID-19. This year, the SCJS is being carried out using a mixed-mode approach. 'Knock to Nudge'[4] is being used to offer respondents an interview by telephone or via video call, until such time as it is deemed appropriate to include an in-home face-to-face option as well. The self-completion section of the survey is currently completed by the respondent online or on paper.
These recent and significant changes to the survey, in addition to the opportunity offered by the re-procurement, mean now is a fitting time to be re-visiting the fundamentals of the SCJS survey aims and design to ensure that it is fit for purpose and the future.
3. Research Assumptions
The following is a list of assumptions we have made while drafting the survey design options. We are sharing these in the spirit of transparency and we welcome all comments on them:
- We want to retain each of the four current SCJS aims:
- enable people in Scotland to tell us about their experiences of, and attitudes to, a range of issues related to crime, policing and the justice system, including crime not reported to the police
- provide a valid and reliable measure of adults' experience of crime, including services provided to victims of crime
- examine trends over time in the number and nature of crimes in Scotland, providing a complementary measure of crime to police recorded crime statistics
- examine the varying risks and characteristics of crime for different groups of adults in the population
- In addition to the above key aims, a number of further, secondary aims have emerged while considering survey design options:
- to improve data collection on Violence Against Women and Girls (VAWG).
- to keep pace with changing definitions of, and trends in, crime - thinking specifically about cyber-crime.
- to establish a survey design that will increase the resilience of the survey's data collection in the face of changing circumstances, such as further lockdowns.
- That the annual SCJS budget will remain the same in real terms.
- That there is a strong case for continuing to achieve a nationally representative sample, and that this would enable the SCJS to continue to produce data that provides insights into the population's experience of crime and perception of the criminal justice system in Scotland.
- That face-to-face research is no longer the default and that we will most likely be commissioning a mixed-mode survey.
- That the SCJS will continue to contribute to Scottish Surveys Core Questions (SSCQ) and therefore that the mode adopted by the SCJS is consistent/comparable with the Scottish Household Survey (SHS) and Scottish Health Survey (SHeS).
- That we would continue to be able to make comparisons with Crime Survey for England and Wales (CSEW).
- That regardless of which Survey Option is selected, the SCJS Team will undertake questionnaire development work. This work will include a period of stakeholder engagement, as well as a period of work undertaken in conjunction with the contractor/s. This questionnaire development work will ensure that the questionnaire modules align with the needs of users and stakeholders, as well as with wider SG strategies.
4. Option Summary
We are proposing the following three options:
1. Continuation of 21/22
2. Online First
3. American Model
A solely face-to-face SCJS is not being presented as an option. As a result of COVID-19 and the significant disruption caused to the SCJS, and other SG surveys, the SCJS team want to select a survey design that is more resilient to future change and, as such, we are placing preference on a mixed mode survey. In addition, adoption of a mixed mode approach means the survey is better able to align with the environmental aspect of the Scottish Government's Sustainable Procurement Duty.
5. Option Details
1 - Continuation of 21/22 approach
Survey Design:
Knock to Nudge[5] followed by the option of either:
a) face to face
b) telephone
c) video call
for the main survey modules.
Self-Completion to be completed either:
a) in person
b) online
c) on paper.
Sample / Frequency:
6,000 Adults / Annual
Opportunities:
We will have some intel from 21/22 fieldwork.
Response rates may be higher than for a face-to-face-only design (provided face to face is available as an option throughout the fieldwork period), as respondents are given remote options.
Even if the results are not comparable with the existing time series, we will have an extra year of data, because the new collection will be comparable with 21/22.
Risks:
Intel on this approach will be limited at the time we draft the project specification (to be finalised in April 2022).
Need to consider if there are mode effects and, if so, how these might be mitigated.
Unsure at this stage whether data collected will be comparable with existing time series.
Questions to answer:
Are the results from 21/22 likely to be comparable with the existing time series?
What are the response rates for 21/22? Would continuing this approach be feasible in terms of achieved interviews/data quality?
Do potential suppliers have adequate workforce to carry out knock-to-nudge?
Does knock to nudge offer Value for Money?
International Comparison:
Combination approaches are used in several countries. For example, both Canada and the USA use face-to-face in the first instance, followed by telephone interviews. However, neither Canada nor the USA has a self-completion component for sensitive topics.
Interviewer-present self-completion is used in some nations (e.g. the CSEW and the SCJS pre-COVID). However, other countries administer self-completion for sensitive topics online and by paper/post (e.g. Sweden and The Netherlands).
2 – Online First
Survey Design:
Fieldwork would commence by contacting households via letter, sending them log in details to select a random adult and asking the selected adult to complete the survey (including self-completion section) online.
Based on the response to the first (online) round, we would send a second letter to households, offering them the opportunity to take part in the survey over the phone.
Based on the response to this second round, we would then move to knock to nudge (K2N), nudging people on the doorstep to take part either online or on the telephone (or perhaps even face to face).
Sample / Frequency:
Possibly more than 6,000 Adults / Annual
Opportunities:
Some respondents will take part in the survey regardless of mode. Therefore, beginning with the lowest-cost mode, online (as opposed to face to face or K2N), means we will reach these people in the most cost-effective way possible.
As this survey design may make the survey cheaper, there would be an option to increase the sample size. A larger sample size could also increase capacity to report on the experiences of certain groups of individuals/victims (for which sample sizes have been too small in the past).
This survey design might be achievable for a larger number of suppliers which would open up the pool of potential bidders, including to organisations based outside of Scotland.
Could improve response rates and help balance the age profile of respondents to include more young adults.
Risks:
Unsure whether data collected will be comparable with existing time series.
Problems may arise from using potentially three different modes (online/telephone/face to face). Would results from each part of the sample/mode be comparable?
Questions to answer:
How would we select a random adult to take part in the survey?
[For info: Sweden uses a population register for this task. A sample of 200,000 is selected from the 'Statistics Sweden's population register' to be a 'nationally representative stratified simple random sample'. Letters are then sent out to these 200,000 inviting them to take part.]
Would adopting an online/telephone design limit the sensitive questions we are able to ask?
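On the question of how a random adult would be selected, one common approach is to make the draw reproducible per household, so that the same adult is selected if the household is re-contacted in a later mode (phone or K2N). The sketch below is purely illustrative: the function name and seeding scheme are our own assumptions, not part of any SCJS specification.

```python
import random

def select_random_adult(household_id: str, adults: list[str]) -> str:
    """Select one adult per household with a reproducible per-household draw.

    Seeding random.Random with a string is deterministic across runs
    (string seeds are hashed with SHA-512), so a re-contacted household
    always yields the same selected adult.
    """
    if not adults:
        raise ValueError("household has no eligible adults")
    rng = random.Random(f"survey-wave:{household_id}")  # hypothetical seed scheme
    return rng.choice(adults)
```

The design choice here is that repeat contact attempts never switch the selected respondent, which matters when a household is approached first by letter and later on the doorstep.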
International Comparison:
In some European nations, the survey is conducted entirely online or by post.
For example, both Sweden (since 2017/18) and The Netherlands (since 2012) first invite participants to take part via the internet by providing login details for an online version of the survey. Non-respondents are re-approached by mail; each mailing includes a paper questionnaire which can be returned by post.
Those who still do not respond to this are reminded by telephone, if their telephone number is known.
Additionally, Denmark offers online and telephone response; its online method has been available since 2010.
3 - 'American Model'
Survey Design:
Panel Design.
A pool of households deemed representative of the population is entered into a sample. These households are contacted for interview at regular intervals over the year. The topics for each interview could vary, for example:
Interview 1: Experiences of Crime (in last 12 months) & Perceptions/Attitudes
Interview 2: Self-Completion Topics
Interview 3: Experiences of Crime (in last 12 months) & Perceptions/ Attitudes
Options for conducting interviews include face to face, telephone, video call, or online.
Sample / Frequency:
Multi-wave data collection. Rethinking data collection as the number of interviews conducted, rather than number of participants recruited.
Options include:
1) Data collection every 4 months for one year (3 interviews in total).
2) Data collection every 6 months. This approach is used by the USA in the National Crime Victimization Survey, where six-monthly interviews are carried out over 3.5 years (7 interviews in total).
Or any collection frequency in between. The balance between the number of participants and the number of interviews would need to be assessed and costed.
Opportunities:
Longitudinal approach may provide insight into repeat victimisation.
Conducting follow up interviews over telephone (or online) may reduce costs.
Can ask more questions to the same people without needing to extend interview length.
Attrition between interviews 1 and 2 and between interviews 2 and 3 would be less analytically damaging because we already know something about the respondent from interview 1 and can adjust the weighting accordingly.
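The attrition adjustment described above can be sketched in outline. The following is a minimal, hypothetical illustration (the function name, grouping variable and data shapes are our own, not an SCJS specification): wave-2 respondents' base weights are inflated by the inverse of the retention rate observed within their wave-1 group, so the reweighted wave-2 sample still represents the wave-1 totals.

```python
from collections import defaultdict

def attrition_adjusted_weights(wave1, wave2_ids):
    """Inverse-probability adjustment for panel attrition.

    wave1: list of (respondent_id, group, base_weight) from interview 1,
           where `group` is some characteristic known at wave 1 (e.g. age band).
    wave2_ids: set of respondent_ids who also completed interview 2.
    Returns {respondent_id: adjusted_weight} for wave-2 respondents.
    """
    totals = defaultdict(float)    # wave-1 weighted total per group
    retained = defaultdict(float)  # weighted total retained at wave 2
    for rid, group, w in wave1:
        totals[group] += w
        if rid in wave2_ids:
            retained[group] += w
    # Inflate each retained respondent's weight by 1 / (group retention rate)
    return {
        rid: w * totals[group] / retained[group]
        for rid, group, w in wave1
        if rid in wave2_ids
    }
```

Under this sketch, the adjusted weights within each group sum back to that group's wave-1 total, which is the sense in which wave-1 knowledge about the respondent mitigates attrition.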
Risks:
Costing would need to be investigated: more interviews will require a reduction in the total number of study participants to remain in budget. A balance between the number of survey participants and the number of interviews conducted would need to be assessed.
Sample Size:
The sample size must be large and balanced enough to be nationally representative and provide sufficient base sizes for analysis.
Interview Regularity:
Follow-up interviews must be regular enough to offer the benefits of the panel design, but this must be balanced against sample size: the greater the sample size and the number of follow-up interviews, the greater the survey costs.
Possible impacts on time series comparability.
Questions to answer:
Is it possible to perform a longitudinal panel design within the current budget?
If so, what is the minimum number of participants we need to ensure the sample is representative and we capture sufficient numbers to perform analysis?
Considering the need to balance reducing the overall sample size with regularity of longitudinal interviewing - are the potential benefits of the panel design 'worth it'?
International Comparison:
USA: The USA pools households into a sample. Once a household is selected to be in the sample, they remain so for 3.5 years. Over the course of 3.5 years, eligible persons are interviewed every six months for a total of seven interviews. New households rotate into the sample to replace any households that have been in the sample for the full 3.5 years.
Possible comparability with CSEW as they are actively exploring this survey design option as part of their redevelopment work.
6. Second Tier Variations
The following are a number of 'second tier variations'. These are more minor variations to the survey design that could be adopted under any of the three options.
- Reporting Variations: We could vary the information included in the SCJS 'main findings report' each year. For example, in year 1, we could produce a shorter, more 'high-level' report that includes key information on crime rates and trends. In year 2, we could produce a longer and more detailed report that, in addition to key information on rates & trends, includes more detailed analysis on the victim experience (based on 2 years' worth of data).
- Rotational Topics: Having a short fixed or 'central' survey and placing more emphasis on change and rotation of questionnaire modules over the course of the contract. This would require identifying certain topics that could be collected once every two or three years instead of every year: perhaps measures that do not change significantly over time, and questions that can be reported on using a single year of data. We would continue to provide a 'snapshot' of these topics, but not every year. This would enable us to introduce more modules to be asked on a rotational basis. In short, we would increase the flexibility of the survey, enabling it to be more reactive to changing priorities and trends. This approach is similar to the SCJS quarter-sample modules used in the current contract; however, we would expect a higher degree of rotation and flexibility than has previously been the case.
- Bolt-On Elements: Getting suppliers to cost optional 'add-ons' that they could deliver themselves, sub-contract, or carry out in collaboration with other partners, e.g. academics. For example, an add-on could be a survey or qualitative project that explores the victimisation rate amongst the homeless population or amongst those who have experienced a particular type of crime. Such add-ons could improve the survey's capacity for understanding the victim experience, something that is increasingly hard in a nationally representative survey due to decreasing victimisation rates. An alternative add-on could be research and development work on the survey's mode and design. Because of the relative flexibility of add-ons, these could be implemented over the course of the contract to ensure the SCJS keeps pace with current crime trends, SG strategies/values and methodological developments.
- Re-Contact Sample: We could increase the value of the survey by utilising the re-contact sample. This could be done in one of two ways. Firstly, allowing SG/non-SG researchers to bid for re-contact sample use. The SCJS team could set the parameters for this re-contact research by, for example, setting a theme, e.g. inviting bids on the theme of Violence Against Women and Girls. Secondly, asking the contractors to carry out further research with the re-contact sample. This research could be longitudinal, returning to respondents to ask the same questions; a boost, returning to certain groups of respondents to find out more about their specific experiences; or an expansion, returning to respondents to ask a different set of questions. As above, these approaches could improve the survey's capacity for understanding experiences of victimisation during a period of decreasing victimisation rates.
- Sample Size Variations: We are currently re-visiting the reasons for the SCJS achieved sample size of 6,000. Under the new contract we could decrease the size of the sample which would likely decrease costs and potentially enable us to, for example, fund a bolt-on module or utilise the re-contact sample. On the other hand, a move to increase the sample size could allow us to increase our analytical capacity, by reducing our confidence intervals on all estimates and by increasing likelihood of capturing the experience of people in smaller demographic groups or who are victims of low prevalence crimes. When it comes to questions around sample size, we must consider the need to work within our assigned budget and whether increases in size represent value for money.
- Incentives: The SCJS started to issue a £10 conditional incentive for the first time in the 2021/22 survey year. In order to boost the response rate under each option, we could continue to offer an incentive. We could vary the incentive by type and by amount, however, any increase in incentive would have to be considered carefully and with regard to the budget.
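As a rough guide to the sample size trade-off raised under Sample Size Variations, the margin of error for an estimated proportion under simple random sampling scales with 1/sqrt(n). The sketch below is illustrative only: it assumes simple random sampling at the worst-case proportion (50%) and ignores the design effects from clustering and weighting that would apply to real SCJS estimates.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p from a simple
    random sample of size n. Ignores design effects and finite
    population correction, so this is only a rough comparative guide."""
    return z * math.sqrt(p * (1 - p) / n)

# Compare candidate achieved sample sizes around the current 6,000
for n in (4000, 6000, 8000):
    print(f"n={n}: +/- {margin_of_error(n):.2%}")
```

The point of the sketch is that precision gains are sub-linear: going from 6,000 to 8,000 interviews narrows the worst-case margin of error by only about 0.2 percentage points, which is the kind of trade-off the value-for-money question would weigh.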
7. Timeline for 23/24 Survey Year
| Year | Month | Activity |
|---|---|---|
| Year 1 (2022) | October | Contract Let |
| | November | Lead-In Time |
| | December | Lead-In Time |
| | January | Lead-In Time |
| | February | Lead-In Time |
| | March | Lead-In Time |
| | April | Fieldwork |
| | May | Fieldwork |
| Year 2 (2023) | June | Fieldwork |
| | July | Fieldwork |
| | August | Fieldwork |
| | September | Fieldwork |
| | October | Fieldwork |
| | November | Fieldwork |
| | December | Fieldwork |
| | January | Fieldwork |
| | February | Fieldwork |
| | March | Fieldwork |
| | April | Fieldwork |
| | May | Fieldwork |
| Year 3 (2024) | June | Data Processing |
| | July | Data Processing |
| | August | Data Processing |
| | September | Data Processing |
| | October | Data Processing |
| | November | Data Processing |
| | December | Data Processing |
| Year 4 (2025) | January | Data Processing |
| | February | Data Processing |
| | | Publication |
Contact
Email: SCJS@gov.scot