Long term survey strategy: mixed mode research report
Findings from research exploring mixed mode survey designs in the context of the Scottish Government’s general population surveys. The report sets out key issues, potential mitigations and remaining trade-offs, and includes 21 case studies of relevant surveys.
Bibliography and references
Al Baghal, T., & Lynn, P. (2015). Using motivational statements in web-instrument design to reduce item-missing rates in a mixed-mode context. Public Opinion Quarterly, 79(2), 568–579. https://doi.org/10.1093/poq/nfv023
Andreadis, I., & Kartsounidou, E. (2020). The impact of splitting a long online questionnaire on data quality. Survey Research Methods, 14(1), 31–42. https://doi.org/10.18148/srm/2020.v14i1.7294
Antoun, C., Katz, J., Argueta, J., & Wang, L. (2018). Design heuristics for effective smartphone questionnaires. Social Science Computer Review, 36(5), 557–574. https://doi.org/10.1177/0894439317727072
Ariel, A., Giesen, D., Kerssemakers, F., & Vis-Visschers, R. (2008). Literature review on mixed mode studies. Statistics Netherlands internal paper.
Bailey, J., Breeden, J., Jessop, C., & Wood, M. (2017). Next Steps Age 25 Survey technical report. NatCen.
Bais, F., Schouten, B., Lugtig, P., Toepoel, V., Arends-Tòth, J., Douhou, S., Kieruj, N., Morren, M., & Vis, C. (2019). Can survey item characteristics relevant to measurement error be coded reliably? A case study on 11 Dutch general population surveys. Sociological Methods & Research, 48(2), 263–295. https://doi.org/10.1177/0049124117729692
Bethlehem, J., Cobben, F., & Schouten, B. (2011). Handbook of nonresponse in household surveys. John Wiley & Sons.
Beukenhorst, D., & Wetzels, W. (2009). A comparison of two mixed mode designs of the Dutch Safety Monitor: mode effects, costs, logistics. Paper presented at the European Survey Research Association Conference, Warsaw.
Beuthner, C., Daikeler, J., & Silber, H. (2019). Mixed-device and mobile web surveys. Mannheim, GESIS Leibniz-Institute for the Social Sciences (GESIS - Survey Guidelines).
Biemer, P. P., & Lyberg, L. E. (2003). Introduction to survey quality. John Wiley & Sons.
Biemer, P. P. (2010). Total survey error: design, implementation, and evaluation. Public Opinion Quarterly, 74(5), 817–848. https://doi.org/10.1093/poq/nfq058
Börkan, B. (2010). The mode effect in mixed-mode surveys: mail and web surveys. Social Science Computer Review, 28(3), 371-380. https://doi.org/10.1177/0894439309350698
Bosnjak, M. (2017). Mixed-mode surveys and data quality. In S. Eifler & F. Faulbaum (Eds.), Methodische probleme von mixed-mode-ansätzen in der umfrageforschung (pp. 11–25). Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-15834-7_1
Bowling, A. (2005). Mode of questionnaire administration can have serious effects on data quality. Journal of Public Health, 27(3), 281–291. https://doi.org/10.1093/pubmed/fdi031
Braunsberger, K., Wybenga, H., & Gates, R. (2007). A comparison of reliability between telephone and web-based surveys. Journal of Business Research, 60(7), 758–764. https://doi.org/10.1016/j.jbusres.2007.02.015
Buelens, B., & van den Brakel, J. A. (2015). Measurement error calibration in mixed-mode sample surveys. Sociological Methods & Research, 44(3), 391-426. https://doi.org/10.1177/0049124114532444
Buelens, B., van der Laan, J., Schouten, B., van den Brakel, J., Burger, J., & Klausch, T. (2012). Disentangling mode-specific selection and measurement bias in social surveys. Statistics Netherlands.
Buskirk, T. D., & Andrus, C. (2012). Smart surveys for smart phones: exploring various approaches for conducting online mobile surveys via smartphones. Survey Practice, 5(1), 1–11. https://doi.org/10.29115/SP-2012-0001
Campanelli, P., Nicolaas, G., Jäckle, A., Lynn, P., Hope, S., Blake, M., & Gray, M. (2013). A classification of question characteristics relevant to measurement (error) and consequently important for mixed mode questionnaire design. [First presented at the Royal Statistical Society, London, 11 October 2011.]
Candy, D., Smith, P., Gallop, K., & Peto, C. (2020). Food and You 2: Wave 1 technical report. Ipsos MORI. https://www.food.gov.uk/sites/default/files/media/document/food-and-you-2-wave-1-technical-report_1.pdf.
Čehovin, G., Bosnjak, M., & Lozar Manfreda, K. (2022). Item nonresponse in web versus other survey modes: a systematic review and meta-analysis. Social Science Computer Review, 41(3), 926-945. https://doi.org/10.1177/08944393211056229
Centrih, V., Viršček, A., Smukavec, A., Bučar, N., & Arnež, M. (2020). Mode effect analysis in the case of daily passenger mobility survey. Croatian Review of Economic, Business and Social Statistics, 6(2), 43–57. https://doi.org/10.2478/crebss-2020-0010
Cernat, A., & Revilla, M. (2021). Moving from face-to-face to a web panel: impacts on measurement quality. Journal of Survey Statistics and Methodology, 9(4), 745–763. https://doi.org/10.1093/jssam/smaa007
Chang, L., & Krosnick, J. A. (2009). National surveys via rdd telephone interviewing versus the internet. Public Opinion Quarterly, 73(4), 641–678. https://doi.org/10.1093/poq/nfp075
Chang, L., & Krosnick, J. A. (2010). Comparing oral interviewing with self-administered computerized questionnaires an experiment. Public Opinion Quarterly, 74(1), 154–167. https://doi.org/10.1093/poq/nfp090
Charman, C., Mesplie-Cowan, S., & Collins, D. (2024). The post-pandemic role of face-to-face fieldworkers. NatCen Social Research. Available at natcen.ac.uk
Christian, L. M., Dillman, D. A., & Smyth, J. D. (2007). The effects of mode and format on answers to scalar questions in telephone and web surveys. In J. M. Lepkowski, C. Tucker, M. Bruck, E. D. de Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, & R. L. Sangster (Eds). Advances in Telephone Survey Methodology, 250–275. https://doi.org/10.1002/9780470173404.ch12
Christie, S., & Cornick, P. (2022, January 27). Remodelling social surveys – trade-offs and opportunities [Webinar]. https://www.youtube.com/watch?v=pHr257x0vGA&t=820s
Church, A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62–79.
Cleary, A., Lynn, P., Cernat, A. & Nicolaas, G. (2018, August 30 – September 1). The viability of a push-to-web survey design in 28 EU member states: the new Fundamental Rights Survey [Presentation]. International Workshop on Household Survey Nonresponse, Utrecht, Netherlands.
Conrad, F. G., Couper, M. P., Tourangeau, R., & Zhang, C. (2017). Reducing speeding in web surveys by providing immediate feedback. Survey Research Methods, 11(1), 45–61. https://doi.org/10.18148/srm/2017.v11i1.6304
Cook, B., Gounari, X., Hinchliffe, S., & Wilson, V. (2021). The Scottish Health Survey 2020 edition telephone survey – volume 2 – technical report. Scottish Government. (Available on SHeS webpages of gov.scot).
Cornesse, C., & Bosnjak, M. (2018). Is there an association between survey characteristics and representativeness? A meta-analysis. Survey Research Methods, 12(1), 1-13. http://dx.doi.org/10.18148/srm/2018.v12i1.7205
Cornick, P., D’Ardenne, J., Maslovskaya, O., Mesplie-Cowan, S., Nicolaas, G., & Smith, P. (2022). Review of Options for the National Survey for Wales. Welsh Government. https://gov.wales/national-survey-wales-development-work#section-18671
Cornick, P. (2023). The NatCen remodel approach. NatCen Social Research. https://natcen.ac.uk/sites/default/files/2023-01/REMoDEL-Approach.pdf
Couper, M. (2000). Web surveys: a review of issues and approaches. Public Opinion Quarterly, 64(4), 464-494. https://doi.org/10.1086/318641
Couper, M. (2017). Mobile web surveys: a total survey error perspective. In P. P. Biemer, E de Leeuw, S. Eckman, B. Edwards, F. Kreuter, L. E. Lyberg, N. C. Tucker, & B. T. West (Eds.), Total Survey Error in Practice (pp. 133-154). Wiley. https://doi.org/10.1002/9781119041702.ch7
Couper, M. P., & Peterson, G. J. (2017). Why do web surveys take longer on smartphones? Social Science Computer Review, 35(3), 357–377. https://doi.org/10.1177/0894439316629932
de Leeuw, E. D., Hox, J. J., & Boevé, A. (2016). Handling do-not-know answers: exploring new approaches in online and mixed-mode surveys. Social Science Computer Review, 34(1), 116–132. https://doi.org/10.1177/0894439315573744
d’Ardenne, J. (2023). Understanding and reducing the survey industry’s carbon emissions. Research Matters, p.5. https://the-sra.org.uk/common/Uploaded%20files/Research%20Matters%20Magazine/sra-research-matters-december-2023-edition.pdf
d’Ardenne, J., Collins, D., Gray, M., Jessop, C. & Pilley, S. (2017). Assessing the risk of mode effects: Review of proposed survey questions for waves 7-10 of Understanding Society. Understanding Society Working Paper Series 2017-04. https://www.understandingsociety.ac.uk/research/publications/working-paper/understanding-society/2017-04
Daikeler, J., Bošnjak M., & Manfreda K. L. (2020). Web versus other survey modes: an updated and extended meta-analysis comparing response rates. Journal of Survey Statistics and Methodology, 8(3), 513-539. https://doi.org/10.1093/jssam/smz008
Dillman, D. A., Sangster, R. L., Tarnai, J. & Rockwood, T. (1996). Understanding differences in people’s answers to telephone and mail surveys. New Directions for Evaluation, 1996(70), 45-61. https://doi.org/10.1002/ev.1034
Dillman, D., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail and Mixed-Mode Surveys: the Tailored Design Method (4th ed.). Wiley.
Durrant, G., Kocar, S., Brown, M., Hanson, T., Sanchez, C., Wood, M., Taylor, K., Tsantani, M., & Huskinson, T. (2024). Live video interviewing: evidence of opportunities and challenges across seven major UK social surveys. Survey Futures Working Paper No. 1. https://surveyfutures.net/working-paper-01-live-video-interviewing/
Eckman, S., & Koch, A. (2019). Interviewer involvement in sample selection shapes the relationship between response rates and data quality. Public Opinion Quarterly, 83(2), 313–337. https://doi.org/10.1093/poq/nfz012
Ernst Stähli, M., & Joye, D. (2016). Incentives as a possible measure to increase response rates. In The SAGE Handbook of Survey Methodology. Sage.
European Statistical System. (n.d.). Quality Assurance Framework of the European Statistical System (Version 2.0). https://ec.europa.eu/eurostat/documents/64157/4392716/ESS-QAF-V2.0-final.pdf
Freeth, S., & Sparks, J. (2004). Scottish Household Survey – report of the 2001 Census-linked study of survey non-response. Office for National Statistics.
Fricker, S. (2005). An experimental comparison of web and telephone surveys. Public Opinion Quarterly, 69(3), 370–392. https://doi.org/10.1093/poq/nfi027
Fuller, E., Mandalia, D., Bankiewicz, U., & Cabaret, A. (2019). The Food and You survey, wave 5 technical report. Food Standards Agency. https://www.food.gov.uk/sites/default/files/media/document/food-and-you-wave5-technical-report-web.pdf
Galesic, M., & Bošnjak, M. (2009). Effects of Questionnaire Length on Participation and Indicators of Response Quality in a Web Survey. Public Opinion Quarterly, 73, 349-360. https://doi.org/10.1093/poq/nfp031
Graesser, A. C., Cai, Z., Louwerse, M. M., & Frances, D. (2006). Question Understanding Aid (QUAID): A web facility that tests question comprehensibility. Public Opinion Quarterly 70(1), 3–22. https://psycnet.apa.org/doi/10.1093/poq/nfj012
Groves, R. M., & Heeringa, S. G. (2006). Responsive design for household surveys: tools for actively controlling survey errors and costs. Journal of the Royal Statistical Society: Series A (Statistics in Society), 169(3), 439-457. https://doi.org/10.1111/j.1467-985X.2006.00423.x
Groves, R. M., & Peytcheva, E. (2008). The impact of nonresponse rates on nonresponse bias: a meta-analysis. Public Opinion Quarterly, 72(2), 167–189.
Haan, M., Ongena, Y. P., & Aarts, K. (2014). Reaching hard-to-survey populations: Mode choice and mode preference. Journal of Official Statistics, 30(2), 355-379.
Hanson, T. (n.d.). A self-completion study based on the 2021 European Social Survey in Great Britain. ESRC summary methodological report
Hanson, T., & Fitzgerald, R. (2022, May 11). Developing a self-completion version of the European Social Survey: results from an experiment in Austria [Conference Presentation]. AAPOR Conference, Chicago.
Harari, G. M., Lane, N. D., Wang, R., Crosier, B. S., Campbell, A. T., & Gosling, S. D. (2016). Using Smartphones to Collect Behavioral Data in Psychological Science: Opportunities, Practical Considerations, and Challenges. Perspectives on Psychological Science, 11(6), 838–854. https://doi.org/10.1177/1745691616650285
Heerwegh, D., & Loosveldt, G. (2008). Face-to-face versus web surveying in a high-internet-coverage population: differences in response quality. Public Opinion Quarterly, 72(5), 836–846. https://doi.org/10.1093/poq/nfn045
Heerwegh, D. (2009). Mode differences between face-to-face and web surveys: an experimental investigation of data quality and social desirability effects. International Journal of Public Opinion Research, 21(1), 111–121. https://doi.org/10.1093/ijpor/edn054
Holbrook, A. L., Green, M. C., & Krosnick, J. A. (2003). Telephone versus face-to-face interviewing of national probability samples with long questionnaires: comparisons of respondent satisficing and social desirability response bias. The Public Opinion Quarterly, 67(1), 79–125. https://doi.org/10.1086/346010
Hope, S. (2005). Scottish Crime and Victimisation Survey: calibration exercise – a comparison of survey methodologies.
Hsu, J. W., & McFall, B. H. (2015). Mode effects in mixed-mode economic surveys: insights from a randomized experiment. Finance and Economics Discussion Series (008), 1-35. http://dx.doi.org/10.17016/FEDS.2015.008.
HM Revenue & Customs. (2023). Measuring tax gaps 2023 edition: tax gap estimates for 2021 to 2022, methodological annex. https://www.gov.uk/government/statistics/measuring-tax-gaps/methodological-annex
Huskinson, T., Pantelidou, P., & Pickering, K. (2019). Childcare and early years survey of parents 2019: push-to-web mode trial, methodological report. Department for Education. https://assets.publishing.service.gov.uk/media/5dfa3fa740f0b62185147817/CEYSP_Mode_Trial_Report.pdf
Huskinson, T., Rimmington, E., & Pickering, K. (in press). Childcare and early years survey of parents 2023: push-to-web mode trial. Department for Education.
Hutcheson, L., Martin, C., & Millar, C. (2020). Scottish Household Survey: Response rates, Reissuing and Survey Quality 2018. https://www.gov.scot/publications/scottish-household-survey-response-rates-reissuing-survey-quality/documents/
Ioannidis, E., Merkouris, T., Zhang, L.C., Karlberg, M., Petrakos, M., Reis, F., & Stavropoulos, P. (2016). On a modular approach to the design of integrated social surveys. Journal of Official Statistics, 32(2), 259–286. https://doi.org/10.1515/jos-2016-0013
Ipsos. (2023). GP Patient Survey 2023: technical annex. Available at gp-patient.co.uk
Ipsos MORI. (2020). NHS maternity survey: findings from the mixed-mode methodology pilot. https://nhssurveys.org/wp-content/surveys/06-development-work/06-engagement-work/2020/Maternity%202019%20Mixed-mode%20pilot%20results.pdf
Ipsos MORI. (2020). NHS Adult Inpatient Survey: Findings from the mixed-mode methodology pilot. https://nhssurveys.org/wp-content/surveys/06-development-work/06-engagement-work/2020/Adult%20Inpatient%202019%20Mixed-mode%20pilot%20results.pdf
Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105-117.
Jäckle, A., Lynn, P., & Burton, J. (2015). Going online with a face-to-face household panel: effects of a mixed mode design on item and unit non-response. Survey Research Methods, 9(1), 57–70. https://doi.org/10.18148/srm/2015.v9i1.5475
Jäckle, A., Roberts, C., & Lynn, P. (2006). Telephone versus face-to face interviewing: mode effects on data quality and likely causes. Report on phase II of the ESS-Gallup mixed mode methodology project, ISER Working Paper 2006-41. University of Essex. https://www.econstor.eu/bitstream/10419/92126/1/2006-41.pdf
Jäckle, A., Beninger, K., Burton, J., & Couper, M. P. (2021). Understanding data linkage consent in longitudinal surveys. In P. Lynn (Ed.), Advances in Longitudinal Survey Methodology (pp. 122-150). Wiley.
Jäckle, A., Burton, J., Couper, M., Crossley, T., & Walzenbach, S. (2022). How and why does the mode of data collection affect consent to data linkage? Survey Research Methods, 16(3).
JAMA. (2024). Instructions for Authors. Retrieved June 20, 2024, from https://jamanetwork.com/journals/jama/pages/instructions-for-authors
Jessop, C. (2019, June 20-21). Innovating with online data collection: Measurement, sampling and new forms of data [Conference presentation]. The future of online data collection in social surveys: shared learning on the challenges, opportunities and best practice, University of Southampton, England.
Jiang, W., Ha, L., Abuljadail, M., & Alsulaiman, S. A. (2017). Item non-response of different question types and formats in mixed-mode surveys: a case study of a public broadcasting tv station’s members. Journal of Communication and Media Research, 9(1), 173-184.
Kaplowitz, M.D., Lupi, F., Couper, M.P., & Thorp, L. (2012). The effect of invitation design on web survey response rates. Social Science Computer Review, 30(3) 339 - 349. https://doi.org/10.1177/0894439311419084
Kantar Public. (2021). Community Life Survey Technical Report 2020/21. Department for Digital, Culture, Media & Sport. https://assets.publishing.service.gov.uk/media/62bef163d3bf7f5a793ec4d2/Community_Life_Online_and_Paper_Survey_Technical_Report_-_2020-21_v4_WA.pdf
Kantar Public. (n.d.). Research on transforming the crime survey for England and Wales, work package a: developing and testing an online version of the CSEW questionnaire. Office for National Statistics, UK. https://27192314.fs1.hubspotusercontent-eu1.net/hubfs/27192314/CSEW%20files/CSEW%20Transformation%20Research%20Work%20Package%20A%20Main%20Report.pdf
Kreuter, F., Presser, S., & Tourangeau, R. (2008). Social desirability bias in CATI, IVR, and web surveys: the effects of mode and question sensitivity. Public Opinion Quarterly, 72(5), 847–865. https://doi.org/10.1093/poq/nfn063
Krosnick, J. (2005, September 15). Effects of survey data collection mode on response quality: implications for mixing modes in cross-national studies. [Conference paper presentation] Centre for Comparative Social Surveys Conference ‘Mixed Mode Methods in Comparative Social Surveys’, London.
Kwak, N., & Radler, B.A. (2002). A comparison between mail and web surveys: response pattern, respondent profile, and data quality. Journal of Official Statistics, 18(2), 257-273.
Laaksonen, S., & Heiskanen. (2013). Comparison of three survey modes. University of Helsinki Department of Social Research, Working Paper No. 2:2013. https://core.ac.uk/download/pdf/18617094.pdf
Laurie, H. (2007). The effect of increasing financial incentives in a panel survey: An experiment on the British Household Panel Survey, Wave 14. ISER Working Paper, 2007-5.
Leech, G. N. (1983). Principles of Pragmatics. Longman.
Lenski, G.E., & Leggett, J.C. (1960). Caste, Class, and Deference in the Research Interview. American Journal of Sociology, 65(5), 463 - 467. http://www.jstor.org/stable/2774074
Liu M., Conrad, F. G., & Lee, S. (2017). Comparing acquiescent and extreme response styles in face-to-face and web surveys, Quality and Quantity 51, 941-958. https://doi.org/10.1007/s11135-016-0320-7
Lugtig, P., & Luiten, A. (n.d.). Do shorter stated survey length and inclusion of a QR code in an invitation letter lead to better response rates? Survey Methods: Insights from the Field. https://doi.org/10.13094/SMIF-2021-00001
MacInnis, B., Krosnick, J. A., Ho, A. S., & Cho, M. J. (2018). The accuracy of measurements with probability and nonprobability survey samples: replication and extension. Public Opinion Quarterly, 82(4), 707–744. https://doi.org/10.1093/poq/nfy038
Mack, S., Huggins, V., Keathley, D., Sundukchi, M., & Mack, S. (1998). Do monetary incentives improve response rates in the Survey of Income and Program Participation? Proceedings of the American Statistical Association, Survey Research Methods Section, 529–534.
Manfreda, K. L., Bosnjak, M., Berzelak, J., & Haas, I. (2008). Web Surveys versus other survey modes: a meta-analysis comparing response rates. International Journal of Market Research, 50(1), 79-104. http://dx.doi.org/10.1177/147078530805000107
Martin, C. (2009). Income imputation in the SHS and SHCS: examining the feasibility of use the FRS to broaden the measure of household income. Scottish Government research report.
Martin, C., Bell, M., & Napier, S. (2023). Scottish Crime and Justice Survey: analysing the effects of using a mixed-mode approach to adapt to COVID-19 challenges. Scottish Government. Available on SCJS webpages at gov.scot.
Martin, C. (2020). Response rates, reissuing and survey quality: does reissuing reduce non-response bias in the Scottish Crime and Justice Survey. Scottish Government SCJS Methodological Paper.
Martin, C., Cook, B., Knox, D., & Stout, A. (2022). Scottish Household Survey 2020: Methodology and Impact of Change in Mode. Scottish Government. Available on SHS webpages at gov.scot.
Martin, N., Bodgan, A., Sobolewska, M., Fieldhouse, E., Mellon, J., & Fisher, S. (2024, January 23). Incentives in the Ethnic Minority British Election Study and National Travel Attitudes Survey [Webinar]. https://www.youtube.com/watch?v=Vdt3cKNGOEo
Maslovskaya, O., Calderwood, L., Ploubidis, G., & Nicolaas, G. (2023). Report: GenPopWeb2 meeting of experts (23rd September 2020). https://www.ncrm.ac.uk/documents/GenPopWeb2_Adjustments%20for%20Mode%20Effects.pdf
Massey, D. S., & Tourangeau, R. (2013). Where do we go from here? nonresponse and social measurement. The ANNALS of the American Academy of Political and Social Science, 645(1), 222-236. https://doi.org/10.1177/0002716212464191
Mavletova, A. M. (2013). Data quality in PC and mobile web surveys. Social Science Computer Review, 31(6), 725 - 743. https://doi.org/10.1177/0894439313485201
Mavletova, A., & Couper, M. P. (2015). A meta-analysis of breakoff rates in mobile web surveys. In D. Toninelli, R. Pinter, & P. de Pedraza (Eds.), Mobile research methods: Opportunities and challenges of mobile research methodologies (pp.81-98). Ubiquity. http://dx.doi.org/10.5334/bar.f
McGonagle, K. A., & Freedman, V. A. (2017). The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in a Nationally Representative Mixed Mode Study. Field Methods, 29(3), 221–237. https://doi.org/10.1177/1525822x16671701
Miech, R. A., Couper, M. P., Heeringa, S. G., & Patrick, M. E. (2021). The impact of survey mode on US national estimates of adolescent drug prevalence: Results from a randomized controlled study. Addiction, 116(5), 1144–1151. https://doi.org/10.1111/add.15249
NHS England. (2022). Health Survey for England predicting height, weight and body mass index from self-reported data. https://digital.nhs.uk/data-and-information/areas-of-interest/public-health/health-survey-for-england-predicting-height-weight-and-body-mass-index-from-self-reported-data
NHS England. (2022). Differences between self-reported and interviewer-measured height, weight and BMI using HSE 2011-2016 data. NHS England Digital.
Nigg, C. R., Motl, R. W., Wong, K. T., Yoda, L. U., McCurdy, D. K., Paxton, R., & Horwath, C. C. (2009). Impact of mixed survey modes on physical activity and fruit/vegetable consumption: A longitudinal study. Survey Research Methods, 3(2), 81–90 https://doi.org/10.18148/srm/2009.v3i2.1092
Office for National Statistics, UK. (2019). Labour Force Survey performance and quality monitoring report: January to March 2019. https://www.ons.gov.uk/employmentandlabourmarket/peopleinwork/employmentandemployeetypes/methodologies/labourforcesurveyperformanceandqualitymonitoringreports/labourforcesurveyperformanceandqualitymonitoringreportjanuarytomarch2019
Office for National Statistics, UK. (2023). Transformation of the Crime Survey for England and Wales – discovery research on the redesign of multi-mode questions.
Office for Statistics Regulation, UK. (2023). State of the Statistical System 2022/23. https://osr.statisticsauthority.gov.uk/wp-content/uploads/2023/06/State_of_the_Statistical_System_2022_23.pdf
Olson, K., Stange, M., & Smyth J. (2014). Assessing within-household selection methods in household mail surveys. Public Opinion Quarterly, 78(3), 656–678. https://doi.org/10.1093/poq/nfu022
Peycheva, D., Ploubidis, G., & Calderwood. L. (2021). Determinants of consent to administrative records linkage in longitudinal surveys: evidence from Next Steps. In P. Lynn (ed.) Advances in Longitudinal Survey Methodology (pp. 151-180). Wiley.
Peytchev, A., & Peytcheva, E. (2017). Reduction of measurement error due to survey length: evaluation of the split questionnaire design approach. Survey Research Methods, 11(4), 361–368. https://doi.org/10.18148/srm/2017.v11i4.7145
Pfeffermann, D., & Preminger, A. (2021). Estimation under mode effects and proxy surveys, accounting for non-ignorable nonresponse. Sankhya A, 83, 997-813.
Pickett, J., Cullen, F., Bushway, S. D., Chiricos, T., & Alpert, G. (2018). The response rate test: Nonresponse bias and the future of survey research in criminology and criminal justice. Available at SSRN 3103018.
Revilla, M., Couper, M.P., Paura, E. & Ochoa, C. (2021). Willingness to participate in a metered online panel. Field Methods, 33(2), 202–216. https://doi.org/10.1177/1525822X20983986
Revilla, M. (2022). How to enhance web survey data using metered, geolocation, visual and voice data? Survey Research Methods, 16(1),1-12. https://doi.org/10.18148/SRM/2022.V16I1.8013
Roberts, C. (2007). Mixing modes of data collection in surveys: A methodological review. ESRC National Centre for Research Methods NCRM Methods Review Papers NCRM/008. http://eprints.ncrm.ac.uk/418/1/MethodsReviewPaperNCRM-008.pdf
Roberts, C., Gilbert, E., Allum, N., & Eisner, L. (2019). Research Synthesis. Public Opinion Quarterly, 83(3), 598–626. https://doi.org/10.1093/poq/nfz035
Ryu, E., Couper, M. P., & Marans, R. W. (2006). Survey incentives: cash vs. in-kind; face-to-face vs. mail; response rate vs. nonresponse error. International Journal of Public Opinion Research, 18(1), 89-106. https://doi.org/10.1093/ijpor/edh089
Saris, W. E., & Gallhofer, I. (2007). Estimation of the effects of measurement characteristics on the quality of survey questions. Survey Research Methods, 1(1), 29–43. https://doi.org/10.18148/srm/2007.v1i1.49
Saris, W. E. (2013). The prediction of question quality: the SQP 2.0 software. In B. Kleiner, I. Renschler, B. Wernli, P. Farago, & D. Joye (Eds.), Understanding research infrastructures in the social sciences (pp. 135–144). Seismo Press
Sarracino, F., Riillo, C. F. A., & Mikucka, M. (2017). Comparability of web and telephone survey modes for the measurement of subjective well-being. Survey Research Methods, 11(2), 141–169. https://doi.org/10.18148/srm/2017.v11i2.6740
Schlosser, S., & Mays. (2018). Mobile and dirty: does using mobile devices affect the data quality and the response process of online surveys? Social Science Computer Review, 36(2), 212-230. https://doi.org/10.1177/0894439317698437
Scholes, A., Curtice, J., Bradshaw, P., Birtwistle, S., & Martini, O. (2024). Scottish Social Attitudes Survey 2023: technical report. Scottish Government. Available on gov.scot.
Schouten, B., Bethlehem, J., Beullens K., Kleven, Ø., Loosveldt, G., Shlomo, N., & Skinner, C. (2012). Evaluating, comparing, monitoring, and improving representativeness of survey response through r-indicators and partial r-indicators. International Statistical Review, 80(3), 382-399. https://doi.org/10.1111/j.1751-5823.2012.00189.x
Schouten, B., van den Brakel, J., Buelens, B., Giesen, D., Luiten, A., & Meertens, V. (2021). Mixed-mode official surveys: design and analysis (1st ed.). Chapman and Hall/CRC.
Scottish Government. (2023). Scottish House Condition Survey 2021: key findings. Available on gov.scot.
Shih, T.H., & Fan, X. (2008). Comparing response rates from web and mail surveys: A meta-analysis. Field Methods, 20(3), 249–271. https://doi.org/10.1177/1525822X08317085
Simmons, E., & Wilmot, A. (2004). Incentive payments on social surveys: a literature review.
Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In Survey Nonresponse (pp. 163–177). Wiley.
Singer, E., & Kulka, R. A. (2002). Paying respondents for survey participation. Studies of welfare populations: Data collection and research issues, 4, 105-128.
Singer, E., Van Hoewyk, J., Gebler, N., & McGonagle, K. (1999). The effect of incentives on response rates in interviewer-mediated surveys. Journal of Official Statistics, 15(2), 217.
Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112-141.
Smit, P. R., & van Dijk, J. (2014). History of the Dutch Crime Victimization Survey(s). In G. Bruinsma, & D. Weisburd (Eds.), Encyclopaedia of Criminology and Criminal Justice. Springer.
Smyth, J. D., Olson, K., & Kasabian, A. (2014). The effect of answering in a preferred versus a non-preferred survey mode on measurement. Survey Research Methods 8(3), 137-152.
Stoop, I. A. (2005). The hunt for the last respondent: Nonresponse in sample surveys (Vol. 200508). Sociaal en Cultureel Planbureau.
Sturgis, P., Williams, J., Brunton-Smith, I., & Moore, J. (2017). Fieldwork effort, response rate, and the distribution of survey outcomes. Public Opinion Quarterly, 81(2), 523–542. https://doi.org/10.1093/poq/nfw055
Sturgis, P. (2024). Assessment of the Gambling Survey for Great Britain. https://eprints.lse.ac.uk/121981/1/Sturgis_Assessment_of_the_gambling_survey_for_great_britain_published.pdf
Toepoel, V., & Lugtig, P. (2014). What happens if you offer a mobile option to your web panel? evidence from a probability-based panel of internet users. Social Science Computer Review, 32(4), 544 - 560. https://doi.org/10.1177/0894439313510482
Tourangeau, R., & Yan, T. (2007). Sensitive questions in surveys. Psychological Bulletin, 133(5), 859–883. https://doi.org/10.1037/0033-2909.133.5.859
Tourangeau, R. (2017). Mixing modes. trade-offs among coverage, nonresponse and measurement error. In P.P. Biemer, E.D. De Leeuw, S. Eckman, B. Edwards, F. Kreuter, L. Lyberg, N.C. Tucker, & B.T. West (Eds.), Total survey error in practice (pp. 115-132). Wiley.
van den Brakel, J. A., & Boonstra, H. J. (2021). Estimation of domain discontinuities using Hierarchical Bayesian Fay-Herriot models. Survey Methodology, 47 (1), 151–189
Villarroel, M. A., Turner, C. F., Rogers, S. M., Roman, A. M., Cooley, P. C., Steinberg, A. B., Eggleston, E., & Chromy, J. R. (2008). T-ACASI reduces bias in STD measurements: the National STD and Behavior Measurement Experiment. Sexually transmitted diseases, 35(5), 499–506. https://doi.org/10.1097/OLQ.0b013e318165925a
Vis-Visschers, R. (2009, May 18-20). Presenting ‘don’t know’ in web surveys [Workshop presentation]. 7th QUEST Workshop, Bergen, Norway.
Wagner, J. (2010). The Fraction of Missing Information as a Tool for Monitoring the Quality of Survey Data. Public Opinion Quarterly, 74(2), 223–243. https://doi.org/10.1093/poq/nfq007
Webborn, A., McKenna, E., Elam, S., Anderson, B., Cooper, A., & Oreszczyn, T. (2022). Increasing response rates and improving research design: learnings from the smart energy research lab in the United Kingdom. Energy Research and Social Science, 83. https://doi.org/10.1016/j.erss.2021.102312
Williams, J. (n.d.) Investigating the feasibility of sampling all adults in the household. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/466925/The_Community_Life_Survey_Investigating_the_Feasibility_of_Sampling_All_Adults_in_the_Household_FINAL.pdf
Williams, J., & Holcekova. (2015). Assessment of the impact of a lower CSEW response rate. Office for National Statistics, UK.
Williams, J. (2019). Five methods of within-household sampling: does it matter which one we use? An empirical study using Community Life Survey data. Kantar Public. https://assets.publishing.service.gov.uk/media/5ff2f881e90e0776aa14130d/ABOS_Within_household_sampling_comparative_study_-_Jan_2019_V2.pdf
Willis, G., & Lessler, J. (1999). Questionnaire Appraisal System QAS-99. Research Triangle Institute, US.
Wilson, L., & Dickinson, E. (2021). Respondent centred surveys: stop, listen, and then design, Office for National Statistics, UK.
Wolfe, E.W., Converse, P.D. & Oswald, F.L. (2008). Item-level nonresponse rates in an attitudinal survey of teachers delivered via mail and web. Journal of Computer-Mediated Communication, 14(1), 35-66. https://doi.org/10.1111/j.1083-6101.2008.01430.x
Yan, T., Conrad, F. G., Tourangeau, R., & Couper, M. P. (2011). Should i stay or should i go: the effects of progress feedback, promised task duration, and length of questionnaire on completing web surveys. International Journal of Public Opinion Research, 23(2), 131–147. https://doi.org/10.1093/ijpor/edq046
Ye, C., Fulton, J., & Tourangeau, R. (2011). More positive or more extreme? A Meta-analysis of mode differences in response choice. Public Opinion Quarterly, 75(2), 349–365. https://doi.org/10.1093/poq/nfr009
Yeager, D. S., Krosnick, J. A., Chang, L., Javitz, H. S., Levendusky, M. S., Simpser, A., & Wang, R. (2011). Comparing the accuracy of RDD telephone surveys and internet surveys conducted with probability and non-probability samples. Public Opinion Quarterly, 75(4), 709–747. https://doi.org/10.1093/poq/nfr020
Zhang, C., & Conrad, F. G. (2018). Intervening to reduce satisficing behaviors in web surveys: evidence from two experiments on how it works. Social Science Computer Review, 36(1), 57–81. https://doi.org/10.1177/0894439316683923
Contact
Email: sscq@gov.scot