Long term survey strategy: mixed mode research report
Findings from research exploring mixed mode survey designs in the context of the Scottish Government’s general population surveys. The report details key issues, potential mitigations and remaining trade-offs, and includes 21 case studies of relevant surveys.
2. What does mixed mode mean?
This chapter discusses in more detail what is meant by ‘mixed mode’ survey design. First, it explains the different modes by which surveys can be conducted. Next, it outlines the different options for mixing modes and how these options might interact with other design decisions in relation to the three main Scottish Government surveys. Finally, it briefly summarises the approaches to mixing modes that have been adopted to date across the 20 non-Scottish surveys that were reviewed in detail as part of this research.
Modes of data collection
The most common modes of data collection in large-scale social surveys are: web, paper, face-to-face, and telephone. While other modes exist, such as video interviewing via platforms like Microsoft Teams or Zoom, or Interactive Voice Response/Recognition (IVR) systems, these are used relatively rarely and, where they are, they tend not to be the primary mode of data collection.[14]
Data collection modes are not always defined consistently in the methodological literature, so in the interests of clarity the definitions used in this report are explained below. A key distinction is between modes where the interview is self-administered (web and paper) and those where it is administered by an interviewer (face-to-face and telephone).
Web
Web interviewing refers to self-completion interviewing using an internet-connected device such as a desktop or laptop computer, a smartphone, or a tablet. This mode is sometimes referred to as Computer-Assisted Web Interviewing (CAWI), or as online interviewing.
Web interviews were traditionally completed on personal computers (Schlosser & Mays 2018; Couper 2000) but are now increasingly completed on smartphones[15]. Consequently, although web interviewing comprises a single data collection mode, it is a ‘mixed-device’ mode, with implications for measurement. One widely accepted implication is that web surveys should be developed following “Mobile First” principles (Antoun et al. 2018; Couper 2017). This involves designing questionnaires with the smaller screen sizes and limitations of smartphones foremost in mind.[16]
Paper
Paper interviewing refers to self-completion interviewing using pen and paper. This mode is sometimes referred to as ‘postal’ or ‘mail’ interviewing. However, this terminology obscures the fact that not all paper questionnaires are sent by post, as in the case of questionnaire booklets handed out by interviewers during or at the end of a face-to-face interview. For example, in Scotland, the Scottish Health Survey (SHeS) includes paper self-completion booklets for participants aged 13 and over (and a version for 4-12 year-olds completed by a parent or carer)[17], while in England, the National Travel Survey has to date used a paper travel diary, placed by the interviewer at the end of the face-to-face interview.[18]
Face-to-face
Face-to-face interviewing refers to an in-person interview with an interviewer, most often carried out in the respondent’s home (as is the case with all three of the Scottish Government’s flagship general population surveys), but sometimes conducted elsewhere, such as on the doorstep, in the street, or at a place of work. While one can argue that video interviewing is a form of face-to-face interviewing, face-to-face interviewing has become synonymous with in-person interviews in the methodological literature, a convention followed in this report.
Prior to the advent of computerisation in survey research it was common for face-to-face interviewers to collect data using paper forms. For instance, the Crime Survey for England and Wales (then called “The British Crime Survey”) used this mode from its inaugural wave in 1982 until 1994. Interviewer-administered collection of data using paper forms is sometimes referred to as PAPI (Pencil And Paper Interviewing), but we consider it as belonging to the face-to-face interviewing mode. Care should be taken not to confuse PAPI with paper interviewing, which as specified above, refers exclusively to self-completion interviewing. Since the 1990s, most face-to-face interviewing has been conducted via Computer-Assisted Personal Interviewing (CAPI), and the two terms (‘face-to-face’ and ‘CAPI’) are often used interchangeably.
It is common for face-to-face interviews to include a self-completion element for more sensitive content. This usually involves the interviewer passing the respondent their interviewing tablet or laptop, and asking the respondent to enter their responses independently, out of sight of the interviewer. Once complete, answers are locked and hidden, and the device is returned to the interviewer. This approach is often referred to as CASI (Computer-Assisted Self Interviewing), and we consider it to be a self-completion element within a face-to-face interview.
Sometimes this self-completion element is administered by asking the respondent to listen to pre-recorded questions through headphones, an approach known as Audio Computer-Assisted Self-Interviewing (A-CASI). It has been used on surveys in the USA including the National Survey of Family Growth, the National Survey of Drug Use and Health, and the Longitudinal Study of Adolescent Health. Questions are usually pre-recorded but are sometimes delivered using text-to-speech (TTS) technology. While similar in nature to CASI, this approach entails greater logistical complexity, but has certain advantages: it can support multiple languages and is more accessible for respondents with low levels of literacy.
Telephone
Telephone interviewing refers to an interview administered by an interviewer over the telephone. Almost all large-scale general population telephone surveys are conducted using Computer-Assisted Telephone Interviewing (CATI) technology and again the two terms (‘telephone’ and ‘CATI’) are often used interchangeably.
Traditionally telephone interviews have been conducted by calling land-line telephone numbers, where each telephone number is linked to a single address, and the interview must take place at home. However, telephone interviews are increasingly being conducted by calling mobile telephone numbers, where each number is associated with a single individual, and the interview does not need to take place at home.
Telephone interviews can include a self-completion element for more sensitive content, often referred to as telephone audio computer-assisted self-interviewing (T-ACASI). T-ACASI is used far less frequently than A-CASI, its face-to-face counterpart.[19]
Mode of contact and mode of interview
It is important to note that the mode(s) of data collection for a given survey may or may not match the mode(s) by which sampled households or individuals are contacted. For instance, as there is no comprehensive sample of email addresses for households or individuals in the UK, for general population push-to-web surveys, contact is generally made by mail (with samples of addresses drawn from the Postcode Address File), but data are collected by web.
A survey of the general public that includes telephone interviews might be based on a sample of telephone numbers generated by ‘Random Digit Dialing’ (RDD)[20] or drawn from lists of telephone numbers held by an organisation (in the UK, these are typically purchased from commercial organisations). However, as neither of these sources can generate a comprehensive sample frame of telephone numbers in the UK, most large-scale cross-sectional government general population telephone surveys in the UK again involve contact being made by mail (a ‘push-to-telephone’ approach equivalent to the ‘push-to-web’ approach above).
For face-to-face surveys it is also common for letters to be sent to sampled units prior to an interviewer visit (this is current practice on all three of the main Scottish Government general population surveys). The Covid-19 pandemic also saw more widespread use of ‘knock-to-nudge’ approaches, whereby interviewers make contact at addresses in person (maintaining social distance as necessary) to request that sample members complete the interview by another mode, such as web or telephone.
Features of mixed mode survey designs
Typically, when people discuss ‘mixed mode’ surveys, they are talking about surveys that combine two or more ways of collecting data from respondents – that is, some combination of web, paper, face-to-face or telephone data collection. However, in practice the options for how different modes may be combined are complex and can vary across a number of dimensions:
- Mode of invitation – as discussed above, how people are invited to take part in the survey may differ from how they complete it – for example, people may be invited by letter but participate online or face-to-face. This reflects options for, and decisions about, sample frames.
- Mode of participation – how people complete the survey itself. As discussed above, surveys described as ‘mixed mode’ include at least two completion modes. If face-to-face, telephone, web, and paper are taken as the four main current modes, then there are, in theory, 11 possible combinations of these (see the illustrative enumeration after this list), leaving aside the order in which modes are administered or whether they are administered concurrently or sequentially:
- 1. face-to-face + telephone
- 2. face-to-face + web
- 3. face-to-face + paper
- 4. face-to-face + telephone + web
- 5. face-to-face + telephone + paper
- 6. face-to-face + web + paper
- 7. face-to-face + telephone + web + paper
- 8. telephone + web
- 9. telephone + paper
- 10. telephone + web + paper
- 11. web + paper.
- Sequencing of modes – in addition to the choice of which modes to combine, there are different ways of combining them. In particular, mixed mode designs may offer different modes at the same time (concurrent designs) or in a particular order, so that a second mode is only offered after a participant has not responded by the first mode (sequential designs), or a combination of these (e.g. a choice of two initial modes followed by a third mode for non-responders).
- The review of surveys conducted for this research also highlighted examples of surveys where, although different modes are technically offered ‘concurrently’, cheaper modes are prioritised through the incentive, invitation or reminder strategies used. For example, it may be technically possible to opt to take part by telephone or face-to-face from the start, but these options are not emphasised in the initial advance letter, in order to encourage as many people as possible to take part online (as this is a cheaper mode). Respondents may also be offered higher incentives to complete surveys online.
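As a quick check on the count of 11, the combinations listed above are simply every possible selection of two or more modes from the four main modes: six pairs, four triples, and one combination of all four. The short Python sketch below is purely illustrative and enumerates those selections using the standard library.

```python
from itertools import combinations

# The four main data collection modes discussed in this chapter.
MODES = ["face-to-face", "telephone", "web", "paper"]

# A mixed mode design (in the 'mode of participation' sense above)
# combines two or more of these modes, ignoring sequencing.
mixed_mode_designs = [
    combo
    for size in range(2, len(MODES) + 1)  # pairs, triples, all four
    for combo in combinations(MODES, size)
]

print(len(mixed_mode_designs))  # 11 (6 pairs + 4 triples + 1 four-mode design)
for design in mixed_mode_designs:
    print(" + ".join(design))
```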
It is also worth noting that different modes may be offered for the survey as a whole – so that, for example, some respondents complete the whole survey online and some face-to-face – or for different sections or elements of the survey. It is arguably more typical to use ‘mixed mode’ to refer to surveys where different respondents complete the survey as a whole (or the same section of a survey) by two or more different modes. A survey that combines a primarily interviewer-administered survey (CAPI) with a web-based or paper self-completion section would generally be referred to by methodologists as ‘multi-mode’ rather than ‘mixed mode’. The SCJS and SHeS could currently be described as ‘multi-mode’ (rather than ‘mixed mode’) on this basis: the SCJS currently includes a web-based self-completion section that can be completed after the main face-to-face survey, while, as noted above, SHeS includes paper self-completion booklets, and the dietary intake tool included in some years of SHeS involves online data collection after the main interview. SHeS also offers respondents the option to take part by telephone, but only if they have refused to participate face-to-face. Take-up of telephone interviews is very low, so while technically mixed mode, in practice the survey is largely unimode.
As will be apparent, once all of these variations are taken into account, the number of possible mixed mode designs is very large.
In the context of considering how mixed mode designs might apply to the three main Scottish Government general population surveys, it is also worth noting that the choice of modes is only one element of survey design. If other changes in design are being considered at the same time, these will interact with the choice of survey mode in ways that may have implications for the types and quality of data that can be collected. Such changes might include, for example: reducing (or increasing) the length of the questionnaires; removing (or adding) specific elements; changing incentive or reissuing strategies; combining elements of the three surveys into a new survey (or surveys); making greater use of administrative data to supplement, enhance or replace elements of the surveys; or changing from a cross-sectional to a longitudinal design. Although these other design elements were not the focus of this research, we mention them here as a reminder that decisions about mixing or changing modes are not taken in a vacuum and need to be considered within the wider context of any other design changes under consideration at the time.
How have other surveys applied mixed mode designs?
As discussed above, there are many different theoretical options for mixing modes on surveys, depending on the decisions taken in relation to mode of invitation, mode of participation, and sequencing. The review of surveys conducted for this research also highlighted the range of mixed mode designs being implemented in practice, as well as the ways in which these interact with other elements of survey design. In addition to the different modes being used, singly or in combination, on different surveys, there was also variation in, for example: the type and form of data being collected; reminder strategies; incentive strategies; the stage and manner in which alternative modes are offered; and the use of modelling to ‘target’ specific modes at particular respondents.
In summary, across the 21 surveys reviewed in detail (included in Appendix A to this report), mode designs for their current or most recent wave included:
- Face-to-face (e.g. Crime Survey for England and Wales (CSEW), English Housing Survey (EHS), Health Survey for England (HSE)) – As with the three main Scottish Government general population surveys, a number of UK Government surveys had reverted to predominantly face-to-face data collection post-pandemic. Similarly to the Scottish Government surveys, some have retained alternative options for those reluctant to have face-to-face contact – for example, EHS allows telephone interviews, but these are not commonly requested. However, some of those that had reverted to face-to-face were testing alternative mode designs for the future (e.g. CSEW, EHS and HSE).
- Face-to-face, with telephone (and video) option for refusals (e.g. Childcare and Early Years survey)
- Face-to-face with telephone or face-to-face follow-up (e.g. Current Population Survey, USA) – The US-based Current Population Survey uses a mixed mode and longitudinal design, with personal visits in the first month and households encouraged to participate by telephone over the subsequent five months of fieldwork (though with the option of face-to-face remaining for later waves).
- Push-to-web with paper option (e.g. Active Lives, British Election Survey 2019 Covid period, Food and You 2, Participation survey, GP Patient Survey) – Surveys that adopt this design tend to prioritise web completion (in that paper copies are not included in the initial invitation letter and are only available on request at this stage). However, they vary in their subsequent approach to distributing paper versions of the questionnaire to non-responders. For example, Active Lives sends a paper copy to all non-responders with the second reminder letter. The Participation survey also sends paper questionnaires with the second reminder letter, but only to a sub-set of non-responding households, targeted on the basis that their address is either in the most deprived quintile or is expected (based on CACI[21] data) to contain only adults aged 65 or older.
- Push-to-web with telephone option (e.g. British Social Attitudes survey, Transformed Labour Force Survey (TLFS)) – again, these generally prioritise online completion, so the option to take part by telephone is only actively highlighted in later reminder mailings. The TLFS also includes an adaptive ‘knock-to-nudge’ reminder strategy, where interviewers call on non-responders to encourage them to complete the survey online or by telephone.
- Push-to-web with paper and telephone follow-up of non-responders (e.g. Dutch Crime Victimization Survey, and a number of other Statistics Netherlands (CBS) surveys).
- Push-to-web with face-to-face or video call for non-responders (e.g. Next Steps Sweep 9 – if respondents to this longitudinal survey did not respond online, an interviewer called to either interview them face-to-face or arrange a video interview).
- Push-to-web, with face-to-face and telephone follow-up (e.g. Understanding Society) – Understanding Society uses a sequential mixed mode design: respondents to this longitudinal survey are invited to take part online first, then face-to-face if they do not complete the survey online, with a final ‘mop-up’ of non-responders conducted by telephone. Understanding Society also targets certain respondents for face-to-face interviews at ‘first issue’ based on modelling of their likelihood to respond online (i.e. those not deemed likely to respond via web are issued face-to-face from the outset).
- Telephone and web (e.g. National Survey for Wales, which invites everyone to take part by telephone, and those who complete the telephone survey are subsequently invited to complete additional modules online).
Appendix A includes further detail on each of the 21 surveys reviewed, which further emphasises that there are many different ways of combining modes in practice.
Contact
Email: sscq@gov.scot