Building standards - verification service: customer experience evaluation - future model

This research identified and proposed a preferred model which the Scottish Government (Building Standards Division) could use to deliver the national customer survey for building standards.


2. Considerations for a future model

2.1 To inform potential options to evaluate the customer experience, there are various factors that need to be taken into account:

1. the issues which prompted this review,

2. the importance of the customer evaluation exercise remaining relevant to the needs of the BSD and local authority verifiers, and

3. ensuring that the customer evaluation exercise continues to be fit for purpose.

Issues and considerations which prompted this review, and research findings addressing these

Frequency, point of survey, and time-lag

The Issue

Historically, the national customer satisfaction survey has been issued annually in the autumn, and the respondents in scope are those customers who had contact with a local authority building standards service between 1st April and 31st March of the previous financial year. There has therefore always been a time lag between customers’ interaction with building standards and the point at which they are asked to provide feedback via the national survey, ranging from around eighteen months at one extreme (an April interaction surveyed in October of the following year) to around six months at best.

In addition, the overall satisfaction rating emerging from each annual report feeds into quarterly performance returns from January to December of the following year. This results in data collection and reporting by local authorities which does not allow the impact or effectiveness of improvements to customer engagement strategies to be assessed timeously. A worst-case scenario is that customer interaction could take place in April one year and be fed into performance returns almost two years later.

The future model should aim to minimise this time lag so that results can be presented as close to “real time” as possible; stakeholders and the BSD Review Group agree that reducing the lag as far as possible is important.

Customer feedback is currently gathered annually, so customers who have had dealings with a local authority at multiple points over 12 months can provide feedback only once per local authority in each year.

2.2 In contrast to the annual national survey, at a local level customers of most local authority verifiers are able to provide feedback at multiple points throughout the year, at any stage in their customer journey.

2.3 A benefit of these ongoing local approaches is that they provide the opportunity to capture views immediately after an interaction, or at different points in the process. Drawbacks, however, are that results could be skewed to an extent by one individual responding multiple times, or by individuals not in scope responding.

2.4 A ‘live’ survey (i.e. one running continuously) would provide the ability to monitor results as close to “real time” as possible.

2.5 Questioned on the frequency of the survey, stakeholders are split on whether customer satisfaction data should be gathered annually or continuously. Those supporting an annual approach state that individuals can feel over-burdened, and that increasing the frequency with which feedback is requested may lead to survey fatigue. Some also commented that the quality of responses may be diminished, with some respondents providing comments of less substance. Stakeholders further note that an annual exercise may give individuals more interactions on which to reflect and base an overall judgement, rather than responding in what might be a knee-jerk reaction to a one-off experience.

2.6 Stakeholders supportive of a continuous/real-time approach commonly note that existing technology should be able to support faster feedback, and that the feedback provided would be more targeted and focused on the specifics of an interaction, covering both positive and negative aspects.

2.7 With regard to the point of survey, stakeholders agree that customers should be asked for feedback only after the process is complete, noting that individuals may otherwise feel that providing comment before then could jeopardise their application in some way. In general, stakeholders agree that there are two potential points at which to ask for feedback – on approval of the building warrant, and at completion certificate stage – which can be far apart in time and involve liaising with different building standards staff. With agents often handling multiple applications simultaneously, there is a danger that offering a survey more frequently than this would be less well-received.

Data quality and robustness

Consideration

There is a need to control the quality of information being gathered from customers to ensure that the data provide useful, reliable outputs. For example, the current national survey uses a number of controls to ensure that respondents answer questions only about local authority verifiers they have interacted with (which, in the case of agents acting on behalf of an applicant, could be several).

2.8 The BSD Review Group, and Acorn Learning in their report, have both highlighted the benefits and drawbacks that increasing the frequency of the current annual exercise might have in relation to data quality. On the one hand, agents and applicants would be able to provide feedback about every process (e.g. each warrant application) with which they have had interaction; on the other, agents handling multiple applications per local authority verifier per year would have the chance to provide feedback on multiple occasions, thus increasing the size of their “voice” compared to a customer with a single interaction in a year.

2.9 In addition, it is noted in Acorn Learning’s report, and again by local authority verifiers for this research, that national level data obtained by the current method are very robust and reliable. However, concerns were raised – and this links directly to a slight fall in response rates (see below) – that data at a local authority verifier level can sometimes be insufficiently robust due to the small number of responses obtained, making it difficult to draw firm conclusions from them.

2.10 Research findings from Acorn Learning’s report broadly indicate that a national survey run by an independent agency is perceived to bring impartiality to data gathering and reporting, and to add weight to the findings. This viewpoint is further supported by feedback from local authority verifiers.

2.11 Wider stakeholders also broadly support the view that an independent third party should administer and report on the survey in any future model, on the condition that this body works very closely with the BSD and LABSS to ensure that the aims and objectives are met and understood. Stakeholders disagree that local authority verifiers themselves should administer the survey, raising concerns about the perceived validity of self-reporting. Stakeholders are ambivalent regarding LABSS overseeing the exercise, noting that an overarching body is required to ensure consistency; however, several note that a third party would ensure there are no vested interests in the process.

Consistency and comparability

Consideration

The existing customer satisfaction model and core question set have remained largely unchanged over the past seven years. This has provided a high degree of consistency in the way results can be interpreted and compared, both across local authority verifiers and year-on-year.

2.12 A clear finding from Acorn Learning’s report is that stakeholders highly value the current national customer satisfaction survey because it provides consistency in measuring service quality at national and local levels. Furthermore, retention of the core question set provides the ability to compare performance all the way back to 2013, as well as enabling benchmarking against key indicators within the national performance framework. Stakeholders agree that retaining core questions that enable year-on-year benchmarking and comparison across local authority verifiers is critical in any future question set.

2.13 The BSD Review Group workshop indicated a reasonably high degree of satisfaction with the current approach and question set to evaluate the customer experience at a national level. That said, the Review Group’s discussion predominantly focused on the current question set, and most questions currently in use are viewed as important to keep either to provide year-on-year comparisons, or to enable local authority verifiers to improve continuously. The Review Group’s views on possible alternative models and associated question sets were discussed in the workshop in the second phase of this research.

Response rate

The Issue

The response rate to the national customer satisfaction survey has sat at around 15% in recent years, and has decreased slightly over that period. This may reflect a lack of engagement by customers, the burden placed on respondents to participate, or general survey fatigue among a population asked to complete surveys on a regular basis.

2.14 Falling response rates to surveys and research are a trend being seen across all UK industries, reflecting a society over-burdened by research; it is therefore imperative that organisations are proactive and agile, and seek innovative ways to continue to engage with their customers.

2.15 Response rates can be linked to survey fatigue, but also to factors such as the length of a survey (Acorn Learning’s report suggests a shorter question set in this regard), the readability and intuitiveness of questions and answer options, the “look and feel” of a survey for participants, whether a respondent “trusts” a survey link, and how their data will be used.

2.16 Social media has been used in recent years to promote the survey to customers and, although its impact is difficult to quantify, it is unlikely to have had any adverse effect. Stakeholders interviewed for this research suggested they could assist in wider promotion of the research to their members if required, and Acorn Learning note that any and all promotion would help to boost responses.

Local approaches to seeking feedback

The Issue

In addition to the national survey, most local authority verifiers also gather and monitor customer satisfaction data at a local level. Local authority verifiers have ownership of this approach, and set up and monitor responses in-house on an ongoing basis. The extent to which this is undertaken varies by verifier, depending upon resource constraints and competing priorities.

2.17 Local authority verifiers’ surveys can typically be accessed via a link in building standards staff email signatures, or through the verifier’s website. Feedback can be provided by anyone at any time, meaning that there is less control over responses compared with the current national approach, but that feedback can be provided more timeously.

2.18 These local approaches are typically short and sharp, comprising between six and 15 “Likert”-style questions with a final open-ended text box to capture wider comments and feedback. There is no “joined up” approach between local authority verifiers, and the questions asked differ between verifiers, in some cases duplicating, and in some cases adding to, the national survey questions, but broadly covering similar themes such as timeliness of response, ease of understanding of the process, and overall satisfaction with customer service.

2.19 Reporting and response rates are both variable among local authority verifiers. Some publish overall feedback on a regular (quarterly) basis, while others do not make findings publicly available. Some local authority verifiers’ published results are based on very small sample sizes, as low as a dozen, meaning results need to be interpreted with caution and changes over time should be monitored with great care.
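To illustrate why such small samples warrant caution, the approximate margin of error on a reported percentage can be calculated as below. This is a purely illustrative sketch using the standard formula for a proportion at the 95% confidence level; the sample sizes used are hypothetical.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p based on n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sample sizes: a dozen local responses versus a larger response base.
for n in (12, 400):
    print(f"n = {n}: +/- {margin_of_error(n) * 100:.0f} percentage points")

# Output:
# n = 12: +/- 28 percentage points
# n = 400: +/- 5 percentage points
```

In other words, with only a dozen responses a reported satisfaction percentage could plausibly sit almost 30 percentage points either side of the true figure, whereas a few hundred responses narrows this to around five points.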

2.20 Asked for their thoughts on these local approaches, wider stakeholders could see the benefits they might add, including the ability to gather richer, more focused data on a quicker timescale. However, most stakeholders raise concerns that such approaches might not be complementary, either to each other or to the national survey, and that they may in fact detract from the importance of, and confuse customers about, the national survey. There is also concern about respondent fatigue if individuals are asked to complete multiple surveys containing similar questions.

Reporting outputs

The Issue

The current national customer satisfaction survey provides a total of 40 reports: one main national report, 32 local authority verifier reports, and seven consortium reports. These are all supplied in Word format and contain charts, but it is worthwhile reviewing whether this format and extent of reporting continues to meet the needs of readers. In addition, full data tables are supplied to the BSD to accompany the main report, and spreadsheets of customer responses to open-ended questions are provided at a local level.

2.21 Participants at the BSD Review Group workshop suggested that consortium level reports are seldom used or referred to, and that there may therefore be little value in continuing to produce them in future. Wider stakeholders – while broadly familiar with the national report output – were unable to comment on the usefulness of local level outputs and noted that this would be best judged by local authority verifiers and consortia themselves.

2.22 Both the Review Group and stakeholders highlight two key items of information as being of most value when reporting at a local level. Firstly, the core quantitative metrics which enable benchmarking against the performance framework metrics and through which year-on-year comparisons are made in the reporting outputs. Secondly, great value is placed on the qualitative data provided as an additional, separate output from the national customer satisfaction survey, as this provides a more detailed understanding of customers’ behaviours and attitudes.

2.23 Wider stakeholders support these points, noting that it is important to retain key questions for benchmarking purposes, and that free text comment boxes are often the most interesting and useful components of surveys, providing context and enabling improvement. Stakeholders further comment that any future reporting outputs should be clear, easy to read, and accessible, and suggest a mix of (short) explanatory text and graphics to engage the reader.

Data sharing

Consideration

In the current model, local authority verifiers supply customers’ names and email addresses to an independent agency (for all annual surveys to date this has been Pye Tait Consulting), which then contacts customers to gather feedback on the service they received. Data are supplied in such a way that all parties comply with the requirements of the General Data Protection Regulation (GDPR). Any future model would have to ensure this process remains compliant with the GDPR, the Data Protection Act 2018, and other relevant legislation.

Scope of survey respondents

Consideration

The scope of the existing model is to gather and report on feedback from agents, direct applicants, and ‘others’ who have been customers of building standards services. BSD Review Group workshop participants indicated that local authority verifiers struggle to obtain feedback from the wider range of individuals who interact with building standards, predominantly contractors and tradespeople, who have a critical role in ensuring high quality construction and safe buildings. Verifiers noted that they do not collect (or cannot easily collect) the contact details of such individuals and organisations.

Wider building standards landscape

Consideration

This research is taking place against a backdrop of considerable change in the building standards landscape in Scotland, with the wider work of the Futures Board ongoing. Such change – although yet to be fully determined – could have wide-reaching implications for the way in which local authority verifiers operate, in terms of their priorities and goals. That work takes a considerably longer-term view (three years hence, and beyond), while the scope of this research and options review is more focused on the short to medium term (one to three years).

Technology

The Issue

Email has been a reliable way in which to contact customers in recent years; however, the increased shift to online has resulted in more emails being ignored or filtered (deliberately or not), particularly as individuals seek to cut down on the volume of communication they receive, or have an increased awareness of potential ‘phishing’ messages. There is a need to explore whether evaluation of the customer experience can be administered in a smarter or more automated fashion to increase efficiencies and meet customer expectations.

2.24 The advent of smarter technology – for instance, social media, smartphones, or the eBuildingStandards portal (eBS for short) – represents an opportunity for a national survey to be promoted and/or deployed by additional or alternative means. Smarter technologies and smart devices could potentially offer the opportunity to optimise and automate processes, as suggested in Acorn Learning’s report, but would involve a large amount of initial set-up.

2.25 While the eBS might in theory offer an obvious “quick win” in this regard, it is understood that future digital solutions are in development by the Scottish Government. In addition, there is concern that customers using the portal might provide feedback which is, perhaps unintentionally, focused on their portal experience, rather than on their interaction with building standards services.

2.26 Stakeholders broadly agree that an online approach is the most suitable means of reaching customers to obtain feedback, stating that it is a quick and efficient method of engagement. One stakeholder notes that alternative methods such as paper-based surveys or telephone calls may broaden the reach of the survey; however, another notes that any survey methodology should be kept as simple as possible.

Data presentation software

2.27 Beyond the technology currently used to evaluate customer satisfaction at a national level, there is a wide range of survey software and data presentation options on the market. It is therefore important to consider which available options might offer a “smarter” survey with increased levels of automation. Many of these software options can appear very similar at first glance, while many are more focused on marketing and on increasing customer engagement and customer numbers.

2.28 Presented below (in no particular order) is an overview of a selection of software options that could potentially be used when gathering and reporting on customer satisfaction with building standards services in Scotland. Please note that the views and assessment presented within the table are those of the author. An illustrative sketch of the kind of automation these tools could support follows the table.

Data presentation software options

Displayr
Overview: A data analysis and presentation tool. Data are collected using other survey software and are then uploaded into Displayr, which can be used to create cross-tabulation analysis, text analysis, professional reports, interactive dashboards, and visually appealing infographics. Multiple data sources can be blended and analysed, and outputs can be published as a webpage or to Office products.
Cost: The Professional package is £1,899 per year for one professional user.

SurveyMonkey
Overview: A widely known and used survey platform, SurveyMonkey is currently used by several local authority verifiers to gather local-level feedback. It can host personalised surveys on different devices, and can run analysis and create reports at regular intervals, although its analytical capabilities are slightly less broad than may be required for rigorous research analysis. Feedback can be integrated into other webpages, and customised, visually appealing dashboards can be created.
Cost: Three pricing plans, ranging from £25 per month (for three users) to an “Enterprise” package whose cost is calculated based on the needs stipulated by the organisation.

HappyOrNot
Overview: A “customer experience improvement solution”, most commonly recognised as a series of faces ranging from sad to happy and often seen in airport terminals, shopping malls, and supermarkets. This is a very light-touch method of obtaining quick, high-level feedback from time-poor respondents, and could be employed at various stages in a customer journey to understand levels of satisfaction with different aspects of the overall experience. Feedback is captured in the moment, email reports can be generated regularly, and results can be presented online. In terms of analysis, smiley faces do not provide the depth of detail obtainable by other methods.
Cost: Prices are not freely available, but anecdotal evidence suggests one “terminal” can cost up to $100 a month in the USA.

Qualtrics
Overview: Qualtrics enables organisations to survey customers across different channels and provides the ability to view data in real time on a single platform. Surveys can be created and hosted, and feedback can be collected online, via SMS, through offline apps, or via chatbots. Alternatively, data can be imported from other sources for analysis. Customised reporting dashboards are available to present findings.
Cost: The basic Qualtrics plan starts at around $1,500 per year in the USA.

SurveySparrow
Overview: SurveySparrow offers alternatives to a classic or traditional survey, in the form of chat surveys or pop-up surveys asking for feedback. A survey can also be distributed by email, and data can be collected offline using the app if respondents lack internet access. While data can be presented similarly to other software, via infographics and the like, it lacks the visual appeal offered elsewhere.
Cost: A variety of packages are available, ranging from a basic free version up to a business package costing $149 per month in the USA.

Infogram
Overview: A content creation tool used by government departments and public organisations; for instance, Angus Council uses it to gather and present building standards customer feedback. It is a presentation tool only: data collected by other means are imported, from which reports, dashboards, and other reporting outputs can be created. Content is customisable and visually appealing, and viewing is optimised for different devices. Each time data are uploaded, specific processing would be required, meaning processes could not necessarily be automated at each reporting milestone in the context of evaluating building standards customers’ experience.
Cost: The Pro package costs $19 per month in the USA.

Microsoft (inc. Forms and Power BI)
Overview: Forms is a survey tool included as part of the Office package, featuring a range of survey questions and designs for different devices, and is currently used by some local authority verifiers. It integrates with Excel for data outputs, but its analytical capabilities are slightly less broad than may be required for rigorous research analysis. Power BI allows easy visualisation of data in dashboard form, and integrates with Excel and other data sources.
Cost: Forms is included free as part of the Office Business package; the Power BI Pro package costs $10 per month.

Snap Surveys
Overview: The survey software package used for previous editions of the customer satisfaction survey. Surveys can be set up and run for multiple devices, and disseminated via a smart emailing capability. It has adjustable design and comprehensive analytical capabilities. Directly exported reporting outputs are possible but are less visually appealing than competitors’; data can be exported to other software easily.
Cost: The basic package for one user licence and 1,000 responses is £835 per year, with add-ons/extensions available.
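As a purely illustrative sketch of the kind of automation referred to in paragraphs 2.27 and 2.28, the example below shows how core metrics could be aggregated per local authority verifier from a survey platform export on a schedule, rather than manually at each reporting milestone. The file name and column names (“verifier” and “overall_satisfaction”) are assumptions for illustration only, not features of any specific product listed above.

```python
import csv
from collections import defaultdict

# Hypothetical export from the chosen survey platform; file and column names are assumptions.
RESPONSES_CSV = "survey_responses.csv"  # columns: verifier, overall_satisfaction (rated 1-5)

def aggregate_satisfaction(path):
    """Return the number of responses and mean overall satisfaction per verifier."""
    totals = defaultdict(lambda: {"count": 0, "sum": 0.0})
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            verifier = row["verifier"]
            totals[verifier]["count"] += 1
            totals[verifier]["sum"] += float(row["overall_satisfaction"])
    return {
        v: {"responses": t["count"], "mean_satisfaction": t["sum"] / t["count"]}
        for v, t in totals.items()
    }

if __name__ == "__main__":
    for verifier, stats in sorted(aggregate_satisfaction(RESPONSES_CSV).items()):
        print(f"{verifier}: {stats['responses']} responses, "
              f"mean satisfaction {stats['mean_satisfaction']:.2f} out of 5")
```

A routine of this kind could be scheduled to run quarterly, feeding benchmarking metrics and dashboards with little manual processing, subject to the chosen software supporting automated exports.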

Approaches used elsewhere to gather customer satisfaction data

2.29 When considering which potential models could be used to evaluate the customer experience in the future, it is useful to look to other sectors to understand their approaches and to explore whether any lessons can be learned and applied to building standards.

2.30 The supermarket sector is notoriously competitive, and customer satisfaction surveys, hosted predominantly online, are key to helping organisations understand the customer experience and make improvements. While a small handful are free for anyone to access (whether they have been a customer or not), most of these feedback portals have an entry control, whereby customers enter digits from their receipt to gain access to the survey itself. This enables supermarkets to link feedback to specific customers. Findings are (understandably) not made publicly available, so it is difficult to assess data presentation options.

2.31 Such a concept could potentially be extended to the building standards system to control entry to the feedback portal, for instance by requiring a building warrant or completion certificate number to be entered; however, a major drawback is that individuals may feel their anonymity would be compromised if responses could be linked back to applications.
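A minimal sketch of how such an entry control might work is shown below. This is illustrative only: the reference format and the in-scope list are hypothetical, and any real implementation would need to address the anonymity concern noted above.

```python
# Hypothetical gate for a feedback portal: only references issued during the survey
# period may start the survey, and each reference may respond only once.
IN_SCOPE_REFERENCES = {"BW-2023-0142", "CC-2023-0587"}  # would be loaded from verifier records
already_responded = set()

def may_start_survey(reference: str) -> bool:
    """Allow entry only for in-scope references that have not yet responded."""
    ref = reference.strip().upper()
    if ref not in IN_SCOPE_REFERENCES or ref in already_responded:
        return False
    already_responded.add(ref)
    return True

print(may_start_survey("bw-2023-0142"))  # True: in scope, first response
print(may_start_survey("BW-2023-0142"))  # False: duplicate response blocked
print(may_start_survey("BW-2023-9999"))  # False: not an in-scope reference
```

Such a gate would also address the data quality concerns noted in paragraph 2.3 about out-of-scope or repeat responses, at the cost of the perceived loss of anonymity described above.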

2.32 Relatively new companies such as Uber and Airbnb have taken a mutual feedback approach, whereby both customers and “hosts” provide feedback on each other. Respective reviews are hidden and cannot be viewed by the other party until both have submitted their feedback. This ensures that feedback is honest and not compromised by the other party’s views. Feedback can be provided about specific aspects (e.g. tidiness, cleanliness, communications, meeting expectations etc.) and about specific people.

2.33 While it is not expected that the building standards system would provide feedback to customers about the experience, the approach of measuring satisfaction using short, sharp surveys is a learning point from these organisations. A handful of rating-style questions are asked, along with an open-text box for freeform comment, and both the ratings and the comments are available for any prospective customer to read. Customers are prompted to provide feedback very shortly after their experience, so that it is fresh in their minds.

2.34 Which? is viewed as a reliable, trusted, independent third-party organisation through which the public can gauge levels of satisfaction across different products. Every year, Which? conducts its own independent online survey asking customers about their satisfaction with their energy suppliers. Questions are again short and sharp, asking individuals to rate, out of five stars, their overall satisfaction with their provider as well as aspects such as customer service, value for money, complaints handling, and bill clarity. Results are published online for all to view and compare providers’ service (a minimum of 40 responses per energy provider must be obtained); a star rating against each aspect is visible, and Which? also calculates an overall score. Survey participants must already be registered with Which? to provide feedback.

Contact

Email: simon.moore@gov.scot
