Long term survey strategy: mixed mode research report
Findings from research exploring mixed mode survey designs in the context of the Scottish Government’s general population surveys. The report details information on key issues, potential mitigations and remaining trade-offs, and includes 21 case studies on relevant surveys.
1. Background and introduction
This report presents the findings from research conducted by Ipsos with Professor Peter Lynn on behalf of the Scottish Government exploring mixed mode survey designs in the context of the Scottish Government’s large-scale general population surveys. In this introductory chapter, we set out the context for the research, describing: the three Scottish Government general population surveys that were a primary focus for this study; the changing external context for social surveys in Scotland and further afield; and the Scottish Government’s overall approach to its Long Term Survey Strategy. We then set out the aims and scope of the study and summarise the methods used, before outlining the structure of the remainder of the report.
Background and context
The Scottish Government general population surveys
At the heart of the Scottish Government’s approach to evidence-based policy making[1] are its flagship surveys (all of which have National Statistics, or Accredited Official Statistics[2], status). These include three cross-sectional general population surveys and the Growing Up in Scotland (GUS) survey, which is a longitudinal survey of families with children. The focus of this research is on the three cross-sectional general population surveys, all of which are currently conducted primarily face-to-face.
- The Scottish Household Survey (SHS) was established in 1998. The survey covers a wide range of topics (including transport, household income, participation in culture and sport, volunteering, and many others). Its central purpose is to provide robust evidence on the composition, characteristics, attitudes and behaviours of households and individuals.
- The SHS incorporates both a household interview (with a ‘household reference person’ who can provide answers for the household as a whole – for example, around tenure, who lives in the household, total household income, etc.) and an individual interview with a randomly selected adult, who answers questions about their individual attitudes and behaviours (including a retrospective travel diary, covering their travel patterns on the previous day).
- The SHS interviews around 10,500 households each year. More information can be found on the Scottish Government SHS webpages.
- Since 2012, the SHS has incorporated the Scottish House Condition Survey (SHCS). This involves physical inspections of a sub-sample of around 3,000 dwellings by a team of qualified surveyors to measure the condition and energy efficiency of the housing stock in Scotland, providing the data that underpins Scotland’s national fuel poverty estimates among other things. The first SHCS was undertaken in 1991.
- The Scottish Crime and Justice Survey (SCJS) can trace its origins as far back as 1982 to the first sweep of the British Crime Survey. The first Scottish Crime Survey was undertaken in 1994 and since then, the survey has undergone a number of design and name changes.
- The SCJS provides key information on victimisation rates, the impact of victimisation and fear of crime. Crucially, it provides an independent measure of crimes that may not be reported to the police, providing important data to be read alongside police recorded crime statistics. A self-completion section (collected during the interview, with interviewers passing their device to the respondent to complete particular questions) collects information on particularly sensitive crimes (such as partner abuse or sexual crimes).
- Data from the SCJS is used to evaluate measures in place to reduce crime, assess the performance of policing and criminal justice organisations, and to provide evidence for use in targeting resources.
- Around 5,500 adults in Scotland take part in the SCJS each year. More information can be found on the Scottish Government SCJS webpages.
- The Scottish Health Survey (SHeS) was first run in 1995, and since 2008 has been conducted annually. It provides reliable information on health and factors related to health and how these change over time.
- The survey consists of a number of main questions and measurements (such as height and weight), plus questions on selected topics such as general health, mental health, dental health, smoking, drinking, diet, physical activity and cardiovascular disease. A sub-sample of respondents are asked to take part in a biological measures (biomeasures) module, where additional data (e.g. blood pressure and cotinine levels[3] in saliva) is also collected by specially trained interviewers. The study also includes self-completion elements (for adults and children), currently collected using paper questionnaires, and an online dietary diary (every three years).
- Around 5,000 adults and 2,000 children are targeted to take part in SHeS each year. More information can be found on the Scottish Government SHeS webpages.
The data from these three surveys underpin numerous outcome and performance monitoring frameworks such as the Scottish Government National Performance Framework and Single Outcome Agreements between the Scottish Government and local authorities. The Scottish Government relies on data from these surveys for major regular analytical outputs, as do other public sector bodies such as National Records of Scotland, NHS Health Scotland, Local Authorities and a range of other stakeholders.
The three surveys are Accredited Official Statistics products, produced to the standards set out in the Code of Practice for Official Statistics with compliance monitored by the UK Statistics Authority. To maintain Accredited Official Statistics status, the surveys must be robust, transparent, and deliver information that is relevant to policy makers, policy stakeholders, and the public.
The Scottish Government Long Term Survey Strategy (LTSS)
The LTSS sets out the Scottish Government’s vision and plans for its population surveys, with a specific focus on the three surveys described above.[4] There have been four published Long Term Survey Strategies to date. The first covered the period 2005 to 2008 and explored the potential for developing a single combined survey for Scotland. However, by 2009 it was decided that Scotland’s data needs would be better met not through integration of the major surveys, but through greater harmonisation of sampling and questions between them.
A subsequent review of the major population surveys led to the introduction in 2012 of the Scottish Surveys Core Questions – a set of 20 questions included in each of the three surveys so that data can be combined to produce reliable and detailed population information on a number of key topics, including equality characteristics, housing, employment and perceptions of health and crime. The three surveys were also sampled in co-ordination with each other for the first time (to facilitate production of the core questions dataset) and the Scottish House Condition Survey became a component of the Scottish Household Survey.
The stated aims of the 2009-2013 and 2014-2017 Long Term Survey Strategies were:
- “To ensure that the Scottish Government’s population surveys meet key information needs while maximising the analytical potential of the data they generate, the precision of estimates and value for money”; and
- “To give full consideration to issues of survey participation, respondent burden, data quality and data security and to make recommendations that align survey practice across Government and promote good practice to other public bodies.”
The vision shared in the most recent LTSS, covering 2018-2022[5], reiterated the importance of analytical potential, survey quality and value for money. It stated that face-to-face remained the mode that delivered the highest quality data, but noted the risk that falling response rates posed to survey quality and costs, and highlighted a number of areas to explore where developments might be made. These included:
- Metrics for survey quality and how they might be deployed
- Alternative sample frames to allow more targeted sampling and potentially to help assess non-response bias
- Collection of paradata to test the option of using reissues to minimise bias
- Greater use of the pool of survey respondents who are willing to be re-contacted
- Data linkage opportunities
- Various measures related to data protection and better communication with users, and
- Alignment with ONS’ digital transformation project and lessons on online delivery of survey estimates.
The Covid-19 pandemic, and the disruption this created for survey data collection, meant that the next iteration of the LTSS was delayed. However, the Scottish Government is now developing this, and the study that is the focus of this report is intended to inform that process.
The changing context for survey research
Prior to 2020, all three major Scottish population surveys had (almost) always been undertaken face-to-face.[6] The sample for each is selected using random probability methods from the small user Postcode Address File (PAF), a list of all residential addresses in Scotland. The selected sample is issued to a fieldforce of interviewers, who call in person (often multiple times) at addresses to secure interviews. A response rate can be calculated for each survey, based on the proportion of eligible issued addresses (i.e. excluding addresses that turn out to be vacant or commercial properties) where an interview is obtained.
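As a simplified illustration (the surveys’ technical reports set out the precise eligibility and outcome definitions used), the response rate is essentially:

Response rate = achieved interviews ÷ eligible issued addresses

For example, 5,500 achieved interviews from 10,000 eligible issued addresses would equate to a response rate of 55%; these figures are purely illustrative.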
Historically, there has been a broad consensus across those engaged in survey research that face-to-face approaches, using skilled interviewers to contact respondents and collect data, are the best quality option. Generally, surveys undertaken face-to-face have had the best sample coverage properties and achieved the highest response rates (and therefore have been viewed as being at lowest risk of non-response bias).
However, the Covid-19 pandemic in 2020 meant that in-home face-to-face interviewing became impossible. In common with most large social surveys in the UK and elsewhere, the three major Scottish Government general population surveys had to pivot quickly in response: surveys that had evolved their approach gradually over years suddenly had to find new ways to collect data to avoid a major gap in evidence on key areas of government policy:
- The SCJS undertook the Scottish Victimisation Telephone Survey[7], a telephone survey of participants who had previously taken part in the SCJS. When restrictions began to be lifted, the SCJS initially restarted in late 2021 with a knock-to-nudge approach and interviews conducted by telephone or video interview, before returning to face-to-face fieldwork from April 2022.[8]
- The SHS moved to a push-to-telephone/video interview approach that relied on respondents opting in, in response to initial postal mail-outs, with those who did so interviewed remotely by telephone or video[9]. The physical survey element of the SHCS was suspended in 2020, and in 2021 surveys were restricted to the exterior of the property only. In 2022, the survey returned to pre-pandemic methodology, with fieldwork primarily conducted via face-to-face interviews and physical surveys based on both internal and external inspection.
- The SHeS initially moved to a push-to-telephone approach, similar to that adopted for the SHS, with potential participants invited by letter to opt in to an interview undertaken remotely by telephone. Later, as Covid-19-related restrictions were relaxed, SHeS moved to a ‘knock-to-nudge’ approach, whereby interviewers called at addresses to encourage people on the doorstep (with social distancing in place) to take part in a telephone interview[10]. Respondents’ self-reported height and weight was collected in place of objective measurements by interviewers and no biological measurements were undertaken.
Following the pandemic, the three major Scottish Government surveys have largely returned to their previous, face-to-face approach, but with some allowance for interviews to be conducted remotely where respondents are reluctant to take part face-to-face. However, the wider survey context, in combination with the experience of the impact of the pandemic on survey research, has increased the impetus to explore alternative designs for the future.
Pre-pandemic, greater consideration was already being given to alternative designs for social surveys, with a number of large-scale government surveys conducted in England, Wales and elsewhere already having moved away from purely face-to-face designs over the last decade (see chapter 2 for a more detailed discussion of this). The Office for National Statistics’ (ONS) Census and Data Collection Transformation Programme, which began in 2015 and aligns with the UK Government’s Digital Strategy, recommended making surveys web-first wherever possible, and has arguably been a significant driver of change in approach across UK Government surveys in particular.[11] Pressure on public sector budgets has been an important contributing factor here – conducting interviews face-to-face is, all other things being equal, more expensive than conducting them by telephone or online. Other key considerations have included:
- concern about declining response rates across face-to-face surveys (see Figure 5.1), and
- increasing internet access across the population making online approaches potentially more feasible than they were previously – the 2022 SHS estimated that 91% of households in Scotland had internet access, although there remain significant variations between households, with those on low incomes and in deprived areas less likely to have access.[12]
However, while there was already clear evolution in thinking about surveys pre-pandemic, a number of pre-existing trends have arguably accelerated since: declining response rates and pressure on public budgets, for example. Challenges maintaining face-to-face interviewer panels were also greatly exacerbated by the pandemic. At the same time, the pandemic forced surveys to adopt new modes in the short-term, prompting further reconsideration of optimal approaches to survey data collection in the medium to long-term.
Aims and scope of this research
The main aim of this research was to explore the potential benefits and risks of embedding mixed mode approaches (i.e. collecting data via another mode, such as online or telephone, or by a combination of modes) within the Scottish Government survey landscape. It examines evidence of best practice, common challenges, and key considerations that have informed (or could inform) decisions about transitioning face-to-face surveys to different mode designs, and considers how these might apply to the three flagship Scottish Government general population surveys (the SCJS, SHS and SHeS). All three are cross-sectional surveys. Although this report includes some relevant evidence drawn from longitudinal surveys and may, itself, include evidence relevant to longitudinal studies (such as GUS), its main focus is on evidence on the use and impact of mixed mode approaches for cross-sectional survey research.
The research is intended to inform the Scottish Government’s thinking ahead of its next LTSS and to support its consideration of future options. As such, as well as synthesising evidence on the risks and benefits of moving from a face-to-face to a mixed mode design, this report provides a suggested framework to support consideration of the implications of changing its three flagship general population surveys to different mixed mode designs. However, it does not make any recommendations about what specific mode designs the Scottish Government should or should not consider for its surveys; decisions on this are for the Scottish Government itself.
It is also worth noting up front that the range of issues relevant to decisions about survey mode is very wide – each of the chapters in this report could be (and in some cases has been) the subject of several substantial academic books in itself. It is not appropriate or feasible to try to do full justice to every nuance of the theory or practice of mixed mode survey research in a single report. Rather, this report attempts to identify and highlight the key issues and questions and to provide a framework for applying these specifically to the Scottish Government’s surveys. There will likely be some topics readers want more detail on, and references are included in footnotes and at the end of this report for those who wish to follow these up.
Research methods
The research that underpins this report combined qualitative interviews and workshops with survey stakeholders and experts with a desk-based review of evidence on mixed mode surveys.
- Ipsos interviewed 25 stakeholders from both within and outwith the Scottish Government, including those involved in running the Scottish Government’s three flagship surveys and internal and external data users of each survey. Interviews explored stakeholder priorities for the surveys, their views on the surveys’ strengths and areas for improvement, and initial thoughts on the potential implications of moving the surveys to mixed mode designs. The findings from these interviews helped shape the topics explored in more detail in the desk-based review and expert interviews.
- Researchers in Ipsos’ Research Methods Centre conducted a desk-based review of key literature on mixed mode surveys. The review was guided by themes emerging from the scoping phase and structured around elements of the Total Survey Error Framework to ensure it covered all aspects of data quality. A list of key literature sources was identified, including journal articles, books, webinars, survey technical documentation and unpublished literature. Sources highlighted within the expert interviews were also included. The literature was reviewed and evaluated in terms of credibility, reliability, and relevance to the Scottish Government surveys.
- The desk-based review was supplemented by a series of interviews with 23 experts from outside Scotland (survey methodologists, commissioners, and those with a role in assessing survey quality). These expert interviews focused on participants’ reflections on key considerations when thinking about transitioning surveys from one mode to another, drawing either on their practical experience of doing so or on relevant expert knowledge. Interviewers sought to identify any practical issues that might not have been included in technical reports, as well as any advice they might give to someone considering survey mode options for the future. Topic guides for interviews with stakeholders and experts are included in Appendices B and C. A full list of organisations consulted as part of the stakeholder and expert interviews is included in Appendix D.
- The researchers also conducted a review of other key surveys from the UK[13] and further afield. Drawing on a combination of information from technical reports and, in some cases, expert interviews, these survey reviews summarise how other surveys have approached decisions about mode change to date and consider what can be learned from them for the Scottish Government’s flagship surveys. A shortlist of surveys for detailed review was agreed with the Scottish Government, based on proposals from the research team to identify those most likely to be relevant based on topic or experience of mode transitions. Summaries of the key points from review of each of the 21 surveys reviewed in detail are included in Appendix A to this report.
- Finally, the researchers held a workshop with key stakeholders in August 2024. This workshop shared findings from the research and presented the draft framework set out in this report to support the Scottish Government in considering future survey options. Stakeholder discussion at this workshop fed into subsequent revisions to this report.
The stakeholder and expert interviews and workshops were conducted by members of the Ipsos research team, using topic guides developed by the research team and agreed with the Scottish Government. Interviews were audio recorded and detailed notes were taken for subsequent thematic analysis. Copies of notes or transcripts were shared with participants where requested.
Report structure
The remainder of this report is structured as follows:
- Chapter 2 explains in more detail what is meant by ‘mixed mode’ survey research. It introduces key features of different modes of data collection and of different mixed mode designs and discusses how other large-scale probability surveys have applied mixed mode approaches.
- Chapter 3 introduces the key issues this research suggests the Scottish Government consider when assessing options for changing or mixing modes. It draws on both recognised frameworks for assessing survey quality and on stakeholder interviews conducted for this study to develop the list of themes to review.
- Chapters 4 to 11 discuss these key themes in turn, assessing the evidence on the implications of changing or mixing modes for each, with a particular focus on implications relevant to the three flagship Scottish Government general population surveys. Potential mitigations (that might reduce any potential negative impacts from changing or mixing modes) and trade-offs are also discussed. More specifically, these chapters cover:
- Coverage and sampling (chapter 4)
- Nonresponse (chapter 5)
- Measurement error and mode effects (chapter 6)
- Implications of mode for data collection options (chapter 7)
- Impacts of changing or mixing modes on trends (chapter 8)
- Survey quality metrics (chapter 9)
- Financial and resource implications (chapter 10), and
- Administrative data (chapter 11)
- Finally, chapter 12 discusses general good practice when considering or planning a change in mode or a move to a mixed mode design.
Findings from the desk-based review and qualitative interviews with experts and stakeholders are interwoven within the report rather than being presented in separate sections.
Each of chapters 4 to 9 concludes with a table summarising:
- the priority issues to consider with reference to the overall theme of the chapter (e.g. Representativeness), and evidence on the potential implications of mode choice for these issues
- potential mitigations that may help address these issues, and
- remaining issues and trade-offs the Scottish Government is likely to have to consider when thinking about future survey mode(s).
The tables also highlight any specific key issues that apply to each of the three Scottish Government surveys, or to particular elements of those surveys. Taken together, these tables constitute a suggested ‘framework’ of issues and questions to help guide future discussions about the implications of different mode options for survey quality. The summaries of chapters 10 to 12 provide the second part of this ‘framework’, focusing on key practical issues and challenges relating to resources and the scope to use administrative data, as well as suggestions around good practice in arriving at robust decisions on future survey mode(s).
Contact
Email: sscq@gov.scot