Scottish Social Attitudes Survey 2023 - Technical Report

Technical report supporting the Scottish Social Attitudes Survey core module 2023.

In 2023, SSAS was run as a push-to-web survey for the first time in its history. This report presents detailed analysis of this change in methodology from face-to-face to push-to-web.


Methodology

This chapter outlines the key methodological differences between the face-to-face and web surveys. Subsequent chapters compare the quality of the samples and examine the possibility of mode effects on how respondents answered the questions.

Survey approach

The SSAS face-to-face surveys aimed to achieve between 1,200 and 1,500 productive interviews (where the respondent has answered all, or almost all, of the survey questions) each year. The target population was adults living in private households[ii] in Scotland. The lower age limit for participation was 18 between 1999 and 2015, but from 2016 onwards it was lowered to 16 to reflect the lowering of the voting age for Scottish Parliament and local government elections. As with many face-to-face surveys, SSAS has experienced declining response rates. Table 1 illustrates this trend: whereas in 2011 the response rate was 55%, by 2019, when the survey was last conducted face-to-face, it had declined to 41%.

Table 1: Achieved Response Rate – Scottish Social Attitudes Survey (2011-2019)[iii]

  • 2011 - 55%
  • 2013 - 55%
  • 2015 - 46%
  • 2016 - 49%
  • 2017 - 50%
  • 2019 - 41%

SSAS 2023 was conducted using a push-to-web methodology. It was fielded alongside the larger BSA 2023 survey – which is also conducted via push-to-web. The target population remained the same: adults aged 16 and over living in private households in Scotland. Response rates for online surveys are typically lower than those conducted face-to-face. To reflect this, at the beginning of fieldwork a minimum target of 1,000 productive interviews and a 15% response rate was set. These targets were met and SSAS 2023 achieved 1,574 fully productive interviews and a 15.1% household response rate.

Sample Design and Fieldwork Approach

The SSAS face-to-face surveys all used the Postcode Address File (PAF) as the sample frame. This is a list of postal delivery points compiled by the Post Office. Postcode sectors were selected from a list of all sectors, with the probability of selection proportional to the number of addresses in each sector. Prior to selection, the sectors were stratified by Scottish Government urban-rural classification, region, and the percentage of heads of household recorded as being in non-manual occupations. Addresses were then selected at random within each selected sector and assigned to interviewers. Before being called on by an interviewer, all addresses in the sample were sent a letter providing information about the survey. In the last three years SSAS was run face-to-face (2019, 2017 and 2016), potential respondents received a £10 Post Office incentive irrespective of whether or not they participated in the survey. If an address comprised more than one dwelling unit, all dwelling units were listed systematically and one selected at random. If the selected dwelling unit had more than one adult living in the household, interviewers carried out a random selection to determine which adult to interview.

For the online survey, the PAF was also used as the sample frame. The sample drawn from the PAF for the online BSA survey typically yields around 500 completed interviews from Scottish respondents, which would not be enough to reach the required minimum target of 1,000. As a result, a separate ‘Scottish boost’ sample was drawn for SSAS. It was drawn in a similar way to that for the previous face-to-face surveys – PAF addresses within Scotland were drawn and stratified by region, population density (measured at local authority level) and tenure profile (% owner occupier, measured at output area level). However, one advantage of a push-to-web design over face-to-face is that the sample does not need to be geographically clustered to reduce the burden of interviewer fieldwork, and an unclustered sample typically results in a lower design effect[iv] (and thus, other things being equal, smaller confidence intervals). The sample was therefore not clustered. However, because of the risk of lower response rates in more deprived areas, those living in the two most deprived SIMD quintiles were oversampled.

A letter was sent to the selected addresses inviting up to two adults aged 16 and over to take part. Unlike in a face-to-face survey, with a web approach it is not possible to select one adult at random at each address to be invited to participate, or at least not without creating an unacceptable respondent burden. This creates the risk that those within a household who complete the survey are systematically distinctive in their demography and/or attitudes - some groups (e.g. women) are more likely than others to engage with social surveys. To try to reduce that risk, up to two adults were invited at each address to complete the survey. This number reflects the fact that two is the average household size[v] - allowing more than two household members to take part would therefore bring diminishing returns in terms of the level of response. Each invitation letter included a link to the survey and two access codes for the household. The access codes were individually unique and needed to be entered before the survey could be completed - thus ensuring that only those sampled for the survey completed it. After the initial invitation letter was sent out, up to three reminder letters were sent to each address that had not yet taken part in the survey. Respondents received a £10 Love2Shop voucher incentive if they completed the web survey.

A web survey has certain limitations in contrast to face-to-face interviewing. Without interviewer engagement, it can be more difficult to persuade participants to take part in the survey, and it is difficult or impossible to carry out a within-household selection of a potential respondent. Although all surveys find it challenging to secure the participation of younger respondents, there may be a concern with a web survey that older people would be less likely to take part. Internet access is increasing in Scotland – it reached 91% in 2022[vi] – but access is still not equally spread across society. Those in higher income households and those living in areas of low deprivation are more likely to have internet access, while older people and those with disabilities are less likely to do so.[vii] These digital barriers may have an impact on the representativeness of a survey conducted primarily online. SSAS tried to mitigate this risk by offering all respondents a telephone survey as an alternative. However, in practice only 14 respondents took up this offer.

Weighting

Irrespective of mode, it is known that certain subgroups in the population are less likely than others to respond to surveys. This is referred to as differential non-response. These subgroups can end up being under-represented in the sample, which can bias the survey estimates. Weights are applied to SSAS to correct for these biases as much as possible.

First, however, because those living in the most deprived SIMD quintiles were oversampled, selection probabilities were not equal across strata. This oversampling meant that a corrective selection weight was required. A selection weight accounts for the fact that some respondents were more likely than others to be selected for the survey as a result of the oversampling: each respondent is weighted by the inverse of their probability of selection.
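As a minimal sketch of this step (the sampling fractions below are invented for illustration, not the actual SSAS strata probabilities), the selection weight is the inverse of each address's probability of selection, rescaled so the weights average 1:

```python
# Illustrative sampling fractions by stratum (NOT the real SSAS values):
# addresses in the two most deprived SIMD quintiles are drawn at a
# higher rate than the rest.
selection_prob = {
    "most_deprived_quintiles": 0.004,
    "other_quintiles": 0.002,
}

def selection_weight(stratum: str) -> float:
    """Inverse-probability weight for an address in the given stratum."""
    return 1.0 / selection_prob[stratum]

# A toy sample of 100 addresses, 40 of them from the oversampled strata.
sample = ["most_deprived_quintiles"] * 40 + ["other_quintiles"] * 60
raw = [selection_weight(s) for s in sample]

# Rescale so the weights have mean 1 across the whole sample; the
# oversampled addresses end up with smaller relative weights.
mean_w = sum(raw) / len(raw)
scaled = [w / mean_w for w in raw]
```

Once rescaled, the oversampled (more deprived) addresses carry less weight each, compensating for their over-representation in the drawn sample.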

Meanwhile, to adjust for differential non-response, two sets of weights were produced: one at the level of the selected postal address, to address differential non-response between households, and one for individuals within households. A separate non-response model was constructed for each of these facets of non-response.

Between-household non-response was modelled using logistic regression, with responding addresses coded 1 and non-responding addresses coded 0. A number of geographical variables describing the area in which a selected address was located were considered for possible inclusion in the response model.

These variables were:

  • population density at Output Area level (quintiles);
  • area deprivation quintiles (Scottish Index of Multiple Deprivation (SIMD));
  • socio-economic classification (NS-SEC quintiles);
  • percentage of residents with a degree in the postcode sector (quintiles);
  • percentage of owner-occupied properties in the Census Output Area (quintiles);
  • percentage of residents in employment in the postcode sector (quintiles);
  • the percentage of ethnic minority residents in the postcode sector (quintiles);
  • the percentage of residents aged 65+ in the postcode sector (quintiles);
  • the percentage of residents aged 55+ in the postcode sector (quintiles);
  • the percentage of households with cars in postcode sectors (quintiles);
  • urban-rural classification and output area classification.

The variables found to be related to household response were:

  • Percentage of residents with a degree in the postcode sector (quintiles)
  • Percentage of owner-occupied properties in the Output Area (quintiles)
  • Output area classification.

The between-household non-response weight was calculated based on this model.
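A standard way to turn such a response model into a weight is to take the inverse of each address's predicted response probability. The sketch below assumes that approach; the coefficient values are invented for illustration and are not the fitted SSAS model.

```python
import math

# Invented coefficients for a logistic response-propensity model using
# the two quintile predictors retained in the model (1 = lowest
# quintile, 5 = highest). Illustrative only.
coef = {
    "intercept": -1.9,
    "degree_quintile": 0.15,     # % of residents with a degree (quintiles)
    "owner_occ_quintile": 0.10,  # % owner-occupied in the Output Area (quintiles)
}

def response_prob(degree_q: int, owner_q: int) -> float:
    """Predicted probability that a sampled address responds."""
    z = (coef["intercept"]
         + coef["degree_quintile"] * degree_q
         + coef["owner_occ_quintile"] * owner_q)
    return 1.0 / (1.0 + math.exp(-z))

def nonresponse_weight(degree_q: int, owner_q: int) -> float:
    """Inverse-probability weight: addresses in harder-to-reach areas
    (lower predicted response probability) receive larger weights."""
    return 1.0 / response_prob(degree_q, owner_q)
```

Under this scheme, addresses in areas where response was least likely are weighted up the most, offsetting their under-representation among responding households.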

Within-household non-response was also modelled using logistic regression, with the dependent variable indicating whether each responding address had one response or two. Only addresses which contained two or more adults and had at least one response were included in the model. In addition to the area-level variables specified above, variables of interest that could be measured at the household level were also tested for significance, including household size, tenure, number of adults in the household, number of children in the household, whether anyone in the household had a degree, and income.

The variables found to be related to within household response were:

  • Income
  • Whether anyone in the household had a degree
  • Deprivation quintiles (SIMD)
  • The percentage of households with cars in postcode sectors (quintiles)
  • Urban-rural classification
  • Output area classification.

Based on this model, the within-household non-response weight was calculated as the number of adults in the household (capped at 4) divided by the expected number of responses for each household (as predicted by the regression model). The number of adults was capped at 4 because very few respondents reported living in households with more than four adults. If the weight were not capped in this way, the weights for these outlier cases would be extreme and, as a result, those cases would have an outsized impact on any attitudinal results once the weights were applied. Similar capping was applied when SSAS was conducted face-to-face.
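The calculation above can be sketched directly; in practice the expected number of responses would come from the fitted logistic model, so the value used below is illustrative:

```python
def within_household_weight(n_adults: int, expected_responses: float) -> float:
    """Within-household non-response weight: the number of adults in
    the household, capped at 4, divided by the expected number of
    responses (between 1 and 2) predicted for that household."""
    return min(n_adults, 4) / expected_responses

# A three-adult household expected to yield 1.5 responses:
print(within_household_weight(3, 1.5))  # -> 2.0

# The cap keeps weights for rare, very large households from becoming
# extreme: a nine-adult household is treated the same as a four-adult one.
print(within_household_weight(9, 1.5))
```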

The final step in the process was calibration weighting. This adjusts the composite non-response weight - the product of the three weights from the previous stages - so that the weighted composition of the responding sample matches the best available population estimates in terms of age, sex, education, tenure, ethnicity, and deprivation (SIMD).
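Calibration of this kind is commonly implemented by raking (iterative proportional fitting): the weights are repeatedly rescaled so that each margin in turn matches its population target. A minimal sketch, with invented categories and targets rather than the actual SSAS calibration margins:

```python
# Five toy respondents, each starting from a composite weight of 1.0.
respondents = [
    {"age": "16-34", "sex": "F", "w": 1.0},
    {"age": "16-34", "sex": "M", "w": 1.0},
    {"age": "35+",  "sex": "F", "w": 1.0},
    {"age": "35+",  "sex": "F", "w": 1.0},
    {"age": "35+",  "sex": "M", "w": 1.0},
]

# Invented population targets for each margin (weighted totals).
targets = {
    "age": {"16-34": 2.5, "35+": 2.5},
    "sex": {"F": 2.5, "M": 2.5},
}

def rake(rows, targets, n_iter=50):
    """Iterative proportional fitting: scale weights to match each
    margin in turn, repeating until the margins jointly converge."""
    for _ in range(n_iter):
        for var, margin in targets.items():
            totals = {}
            for r in rows:
                totals[r[var]] = totals.get(r[var], 0.0) + r["w"]
            for r in rows:
                r["w"] *= margin[r[var]] / totals[r[var]]
    return rows

rake(respondents, targets)
```

After raking, the weighted totals for every age and sex category match the targets, while each respondent's final weight remains a scaled version of their composite non-response weight.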

Questionnaire and Mode of Interview

SSAS has always had a modular structure. Between 1999 and 2019, modules on different topics would be commissioned by a range of funders, and SSAS respondents would be asked the questions in each module as part of the survey. A ‘Core Module’ of questions on attitudes to government and public services has consistently been commissioned by the Scottish Government since 2004, though several questions within it were designed at the outset of the survey and thus have run since the establishment of the devolved Scottish Parliament in 1999. In a typical face-to-face year, the thirty questions that comprised the Core Module would be asked alongside others commissioned as part of the survey. Given that these questions have been asked consistently as part of SSAS, producing a long time series of face-to-face data that there is a wish to extend, it was decided that, as a trial, a version of this module would be run as part of the online survey in 2023. This module was asked of all respondents to the online survey, alongside a standard set of background variables and an International Social Survey Programme (ISSP) module on national identity and citizenship funded by the ESRC.

When conducted face-to-face by interviewers in respondents’ homes, interviews were carried out using computer-assisted interviewing. In this approach a laptop computer is used by the interviewer, with questions appearing on screen and interviewers directly entering respondents’ answers into the computer. From 2011, more sensitive questions were asked via computer-assisted self-interviewing (CASI), where the interviewer would hand over their laptop to the respondent to key in responses themselves. CASI interviewing replaced a previous practice of asking more sensitive questions via a paper-and-pencil self-completion supplement. On the main part of the survey, showcards were used to ease interviewee burden for questions with a large number of response options, or where the same set of response options applied to multiple questions.

The potential for respondent fatigue is greater for online surveys, where there is no face-to-face engagement with an interviewer, so the length of the survey was kept to a minimum. The median interview length was 33 minutes in total.

Note on Tables

Weighted figures are presented in tables without brackets, while unweighted figures are shown in brackets. An asterisk (*) within a table represents a figure greater than 0 but lower than 0.5, whereas a dash (-) represents zero.

Contact

Email: CIMA@gov.scot
