Building standards - verification service: customer experience evaluation - future model
This research identified and proposed a preferred model which the Scottish Government (Building Standards Division) could use to deliver the national customer survey for building standards.
4. Implementation plan
Considerations and decisions required around option 3
4.1 In supporting option 3 as the preferred model, the Review Group put forward several caveats and considerations on which key decisions will need to be made.
4.2 Given the potential overlap and duplication between this option and existing local approaches (noted in paragraph 3.13), local authority verifiers suggested that local-level approaches could be discontinued if the national-level question set used in option 3 were specifically designed to meet the requirements of the Customer Service Excellence (CSE) elements.
4.3 There were broad discussions around the response rate that option 3 might generate. While the Review Group noted that similar approaches for their local feedback surveys typically produced a low response rate (under 10%), participants agreed it would be difficult to estimate the response rate for this option until it is fully implemented nationwide. Participants noted that personal approaches to individuals, such as prompting customers while on site or by telephone, could be one way to raise awareness and response levels. In addition, the Group noted that, even if the number of responses were to fall, the quality and usefulness of responses may be higher given that customers would be providing feedback directly after their experience. Finally on response rates, verifiers raised concerns that their performance rating for a given quarter could be based on a very small number of responses per verifier, a trade-off that would need to be accepted.
4.4 The Review Group raised concerns that the survey link, if included in an automated email with (say) Building Warrant approval, may be easily overlooked. To increase the visibility of the customer survey link, it was suggested that, rather than being included and potentially lost within the text of the email notifying building warrant approval (or acceptance of a completion certificate), the invitation to provide feedback could instead be circulated shortly afterwards in a separate, dedicated, auto-generated message (which would require initial set-up).
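As a purely illustrative sketch of that set-up, the dedicated follow-up message could be queued automatically once the approval or acceptance notification is issued. The function name, scheduler object and survey URL below are hypothetical placeholders, not features of any existing building standards system.

```python
from datetime import timedelta

SURVEY_URL = "https://example.gov.scot/building-standards-feedback"  # placeholder address

def queue_feedback_invitation(scheduler, applicant_email, case_reference):
    """Queue a separate, dedicated feedback email shortly after the approval or
    acceptance notification, so the survey link is not buried in the main email."""
    body = (
        f"Your building warrant case {case_reference} has now been decided.\n"
        f"We would welcome your feedback (2-3 minutes): {SURVEY_URL}?case={case_reference}"
    )
    # 'scheduler' stands in for whatever email/job scheduling facility the
    # verifier's case management system already provides.
    scheduler.send_later(
        to=applicant_email,
        subject="How was your building standards experience?",
        body=body,
        delay=timedelta(hours=1),
    )
```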
Potential mitigations for option 3 considerations – views from stakeholders
4.5 Asked how the response rate might be uplifted and maintained, stakeholders noted that email was the most appropriate means of gathering feedback and that SMS would have little positive impact. Three stakeholders independently suggested that the request for feedback could be implemented as part of the application process: for instance, applicants/agents could be informed that their warrant/certificate was ready and could be accessed upon completing a short feedback form. These stakeholders suggested that such an embedded survey should be simple and quick, and that the benefits of completing the feedback should be clearly stated for respondents, for example by noting that it will take only 2-3 minutes to complete, that feedback will feed into verifiers’ continuous improvement, and that results can be viewed on a dashboard webpage. However, two stakeholders believed that making the feedback request mandatory would be unacceptable. Separately, another stakeholder suggested that potential respondents could be incentivised to provide feedback via entry into a free prize draw.
4.6 A second consideration raised by the BSD Review Group was that moving from an annual exercise to an ongoing one would reduce the potential for focused promotional activities. Stakeholders commented that, were the survey to be embedded as part of the process (as outlined in the previous paragraph), this would be a moot point. One stakeholder noted that, since the Covid-19 pandemic, they have significantly boosted their social media activity and use two-minute animations to get key messages across to followers on an ongoing basis, and that such a model might be applied in this instance. Another stakeholder commented that any promotion would be best placed coming either from verifiers themselves or from LABSS, and could link to the latest results dashboard to drive interest and uptake in participating.
4.7 The third main consideration raised by the Review Group centred on whether differing approaches would be required for agents and applicants. Broadly, stakeholders believed that a similar approach should be used to obtain feedback from both cohorts; otherwise there is a danger of a two-tier system, with data that may not enable consistency or comparability between the two groups. While agents, under the approach suggested in option 3, will be asked for feedback at multiple points each year, stakeholders believed that agents will not become fatigued provided the survey is quick and easy to complete.
4.8 As outlined in paragraph 3.13, Review Group verifiers believe option 3 might duplicate existing local approaches used to gather feedback. One stakeholder suggested that, to avoid such duplication, to simplify processes, and to minimise any potential customer confusion, local approaches should be discontinued and the national survey deployed, as outlined, to enable consistent comparison and benchmarking.
Next steps and considerations
4.9 A new question set will need to be agreed between local authority verifiers and the BSD, potentially with input from a specialist research company. This question set should be quick and easy for respondents to understand and complete, cover the key question topics, and retain the core questions needed for benchmarking purposes, i.e. year-on-year comparison and measurement against performance framework criteria. There is broad agreement between the BSD Review Group and stakeholders that questions should be closed (rating or Likert-type), with one open-ended question for additional comment; an illustrative structure is sketched below.
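By way of illustration only, a question set of this shape could be held as a simple structured list: closed rating/Likert items retained year on year for benchmarking, plus a single open-ended comment field. The wording, topics and identifiers below are placeholders, not the agreed question set.

```python
# Placeholder question-set structure: closed Likert items plus one open question.
LIKERT_SCALE = ["Very dissatisfied", "Dissatisfied", "Neutral", "Satisfied", "Very satisfied"]

QUESTION_SET = [
    {"id": "overall",       "type": "likert", "scale": LIKERT_SCALE,
     "text": "Overall, how satisfied were you with the service provided?"},
    {"id": "timeliness",    "type": "likert", "scale": LIKERT_SCALE,
     "text": "How satisfied were you with the time taken to deal with your application?"},
    {"id": "communication", "type": "likert", "scale": LIKERT_SCALE,
     "text": "How satisfied were you with the standard of communication?"},
    {"id": "comments",      "type": "open",
     "text": "Please add any further comments about your experience."},
]
```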
4.10 A decision is needed on whether the existing local-level approaches used to gather customer feedback and satisfaction should be retained or discontinued. If they are discontinued, any potential customer confusion will be avoided, as will any duplication of effort. However, as some of these local approaches feed into CSE, any new question set would need to cover the key areas and question topics that will still enable these local authorities to meet their CSE requirements. Discontinuing local-level approaches would require unanimous, collective buy-in from all verifiers.
4.11 A decision is also needed on the approach to distributing the survey itself, i.e. whether it should be integrated as a mandatory step. As noted above, local-level approaches which include a link in their BW approval/CC acceptance email typically have low response rates. Therefore, it is recommended that the process is changed to integrate the short feedback questionnaire as an additional step (although completing the questions may not necessarily have to be made mandatory) and to embed this as part of the BW/CC process. In practice, customers would be informed that their BW/CC is ready and that, to obtain and download a copy, they will first need to navigate through a short feedback form to reach it. To implement this procedure, the associated technical systems would first need to be revised accordingly.
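A minimal sketch of how this revised step might work is given below, assuming a simple web flow in which the feedback form sits in front of the document download but can be skipped, keeping completion optional. The routes, form wording and helper functions are hypothetical and are not drawn from any existing verifier portal.

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

FORM = """
<form method="post" action="/documents/{{ case_id }}/download">
  <p>Before downloading, please tell us about your experience (2-3 minutes).</p>
  <label>Overall satisfaction (1-5): <input name="overall"></label>
  <label>Further comments: <textarea name="comments"></textarea></label>
  <button type="submit">Submit feedback and download</button>
  <a href="/documents/{{ case_id }}/download">Skip and download</a>
</form>
"""

def store_feedback(case_id, form):
    """Placeholder for writing responses into the national survey dataset."""
    print(f"feedback for {case_id}: {dict(form)}")

def send_certificate(case_id):
    """Placeholder for returning the approved BW/CC document."""
    return f"Certificate for case {case_id}"

@app.route("/documents/<case_id>")
def feedback_step(case_id):
    # The customer is told their BW/CC is ready; the short form sits in front
    # of the download, with a skip link so completion is not mandatory.
    return render_template_string(FORM, case_id=case_id)

@app.route("/documents/<case_id>/download", methods=["GET", "POST"])
def download(case_id):
    if request.method == "POST":
        store_feedback(case_id, request.form)
    return send_certificate(case_id)
```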
4.12 Alternatively, if the BSD does not wish to implement this particular recommendation of asking for feedback as an integrated, additional step, we recommend that the survey link is sent in a separate email from the BW/CC email, so that it is not lost among the other text and information. Furthermore, it is recommended that ongoing promotional activity and awareness raising is undertaken via social media, through regular posts and short video messages.
4.13 A decision on the software to be used needs to be agreed to meet all parties’ needs. Various potential reporting dashboards have been outlined in Chapter 2, each with relative strengths, limitations and cost considerations. A careful review will be required to understand precisely what additional technical expertise and capacity is needed to turn the quarterly analysis output into a dataset that can be easily imported into the chosen data presentation software and updated with a quarter-on-quarter comparison, both at a national and a local level. Associated with this, once the question set is decided upon, the analysis template needs to be set up so that quarterly datasets can be easily imported and analyses run.
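As an illustration of the kind of processing involved, the sketch below assumes quarterly responses are exported to a CSV containing 'quarter', 'local_authority' and a numeric 'overall' satisfaction rating, and produces a local-level summary plus a national quarter-on-quarter comparison ready for import into the chosen dashboard tool. The column names and file path are assumptions, not the agreed data specification.

```python
import pandas as pd

def quarterly_summary(csv_path="responses_quarterly.csv"):
    df = pd.read_csv(csv_path)

    # Local-level view: mean satisfaction and response count per verifier per quarter.
    local = (df.groupby(["quarter", "local_authority"])["overall"]
               .agg(mean_score="mean", responses="count")
               .reset_index())

    # National-level view with quarter-on-quarter change for the dashboard.
    national = (df.groupby("quarter")["overall"].mean()
                  .rename("mean_score").reset_index()
                  .sort_values("quarter"))
    national["change_on_previous_quarter"] = national["mean_score"].diff()

    return local, national
```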
4.14 Decisions need to be taken on how the various key stages of the process (question design, survey hosting, analysis, automated reporting) are set up, managed and audited on an ongoing basis. There was broad support among the BSD Review Group and stakeholders for an independent third party/specialist research agency to run these aspects, working closely with the BSD, to ensure impartiality and underline the independence of the overall process. Thought will also need to be given to how the cost of implementing this will be met in future years.
4.15 Timelines: all the above steps may take some time to implement, and it is uncertain at this stage precisely how long it will take to move this option from this point into day-to-day use. Therefore, depending on such decisions, it may be that the current evaluation model (i.e. that used in 2020 and previous years) is implemented for a final time in 2021, to ensure that customer satisfaction and associated performance framework measures can be assessed and reported on before this new model is rolled out.
Contact
Email: simon.moore@gov.scot