Near time data service: dashboard user engagement – research findings

This summary report presents analysis from a user engagement research programme that gathered users’ and developers’ experiences of two dashboards providing a strategic overview of key indicators from across the Health and Social Care system.



Background

This report sets out findings from a user engagement programme undertaken with users and developers of two new Health and Social Care data dashboards. Researchers engaged with both groups to understand their experiences and gather suggestions for improvement. The findings have been generalised so as to be useful to those developing similar data products across the public sector.

The dashboards were developed in response to the Health and Social Care Data Strategy, which sets out a vision and ambitions for improving the use of health and social care data to deliver the best care possible for the people of Scotland.

As part of this approach, specifically Creating Insights from Data, the Near Time Data (NTD) Service was created as a collaborative project involving the Scottish Government (SG), Public Health Scotland (PHS) and National Services Scotland (NSS) to scope out technical opportunities for making use of near time management information data.

The NTD service has developed two new dashboards: one focussed on whole system data and winter planning, and another focussed on social care data. Please note that these dashboards use management information and are therefore not in the public domain. The dashboards are intended to provide staff with a strategic overview of key indicators from across the whole Health and Social Care system.

About this research

The dashboards were launched in November 2023 on a Minimum Viable Product basis in time to support reporting on winter pressures. A user engagement (UE) research programme was established in parallel with the aims of:

  • gathering users’ and developers’ experiences and views on the dashboards to identify what works well and what needs improvement,
  • analysing usage metric data, and
  • collating the information and making recommendations on changes required to the dashboards.

This summary report presents the key findings of the UE research programme. Findings are presented in a combined way across both dashboards and have been generalised where possible so that learning may be made applicable across the public sector.

Methodology

The UE research programme included the development of user persona profiles, a survey, individual interviews, focus groups and a review of usage metrics for both dashboards. Throughout the report, those who took part in the survey are referred to as survey respondents, and dashboard users who took part in an interview or focus group are referred to as research participants.

User persona profiles

User persona groups were developed through an iterative process to identify and understand the needs of the dashboards’ target audience. Four user persona groups were established: Senior Leaders, Operational Delivery Leads, Analytical Users and Strategic Advisors.

Survey

The survey ran from January to February 2024 in Microsoft Forms. Of the 809 individuals invited, 41 completed the survey, a response rate of 5%. The survey included questions covering dashboard access, usability, indicators, interpretation, presentation and overall experience.

Interviews and focus groups

Interviews and focus groups were conducted between January and February 2024. 24 participants took part: 14 dashboard users and 10 developers. Interview topics included access and overall use, usability, usefulness of the indicators, interpretation, data presentation and impact. Developers’ topics also covered the earlier stages of the project and maintenance of the dashboards.

Dashboard usage metrics

The metrics analysed were the total number of times each user opened a dashboard and the number of times each user opened specific pages. Usage metrics were analysed for the period from January to March 2024, by user persona and work area.

Findings

Accessing the dashboards

  • Most respondents did not mention any ongoing issues accessing the dashboards, with 88% of survey respondents agreeing that access was easy or very easy.
  • However, some issues were identified which highlight the importance of developing an easy-to-use authentication process for users with clear instructions and troubleshooting in place.
  • Developers also noted some issues, which highlighted the need for sufficient resource and planning to be in place at the launch stages of dashboarding products to support easy access for users.

"[Access is] incredibly easy, it's just exactly how it should be." (Senior leader)

Usability and features of the dashboards

  • A high proportion of survey respondents selected “the display of the indicators” and “the data visualisation” as features they liked most about the dashboards.
  • Several mentioned features of the dashboards related to the data visualisation and interactivity as engaging. The capacity to ‘export’ charts to PowerPoint was also noted as an engaging feature.

"I like the share function. I can pull out export to Powerpoint and it's really good [...] I like the get insights function. A lot of people seem to like that in terms of embedded AI." (Senior leader)

 "If you can book a train ticket online you could probably use the dashboard." (Senior leader)

Layout and format of the data

  • The majority of participants felt that the layout and format of the data was very clear, accessible, easy to understand and user-friendly.
  • However, a small number suggested that the layout and presentation could be improved by reducing the amount of data presented on some pages. This highlights the importance of not displaying excessive data on a single page, as overcrowding detracts from clear visualisation.

"Some of the data could be more succinct - perhaps smaller charts, tables and more drop downs." (Strategic advisor)

Interpreting the data in the dashboards

  • A key finding on users’ capacity to interpret the data relates to the presentation of indicators collated from multiple data sources. In these cases, consideration must be given to variation in definitions and data collection methods across the constituent sources, and this should be made transparent to users.
  • A few research participants were unclear about the intended target audience for the dashboards. They highlighted the importance of considering who the intended end user is, so that data can be presented in a way that maximises intelligibility. Developers echoed these views.

“That's really essential for us, to see or hear some of the direct requirements and asks from the end users.” (Developer)

Dashboard indicators

  • Survey respondents were most likely to say that they used overview and summary pages within the dashboards, rather than pages containing more specific or granular data. This was corroborated by analysis of usage metrics data.
  • Research participants also suggested additional indicators for inclusion in the dashboards. This evidences the value of continued user engagement.

"We didn't have this information before so it is great from our perspective as well to see what is going on in the wider system" (Senior leader)

Use and impact of the dashboards

  • Research participants and survey respondents said that the data from the dashboards was informing their work, and that they also communicated the data to colleagues in various organisations.
  • Specifically, users told us that the dashboards are being used to inform discussions with senior leaders, help to prepare briefings, inform situational awareness, provide advice, assess performance and draw conclusions in various meetings.

“I probably spend a good couple of hours once a week looking at all the trends and the various indicators." (Senior leader)

“We have board calls with each of the 14 boards every month. So, three meetings a week. Before I go to these calls, I use the dashboard to inform the discussion.” (Senior leader)

Overall experience

  • Overall, users were positive about their experiences with the dashboards.
  • Users liked the user-friendliness of the dashboard tools and functionalities to export data to PowerPoint or PDF. They also noted aspects of the layout and format of the dashboard as appealing and engaging.
  • Users also valued the ‘one-stop shop’ nature of the dashboards offering a range of data in one place.

“I think it is dead easy to use, being able to hover over the charts and get the specifics.” (Senior leader)

“Very positive. Love having it all in one place.” (Senior leader)

A majority of survey respondents told us that:

  • It was easy or very easy to find the data they were looking for (83%).
  • The usability of the dashboard was very good (96%).
  • The layout and format of the data was very clear or clear (96%).
  • The visualisation of the data was very clear or clear (94%).
  • They are extremely or very confident in interpreting the data (71%).
  • They were satisfied or very satisfied with their overall experience (91%).
  • It was easy or very easy to access the dashboards (88%).
  • They used the dashboard at least weekly (79%).
  • The data was informing their decisions or conclusions as part of their role (49%).
  • They had shared or communicated the data with colleagues (71%).
  • They were satisfied or very satisfied with how up to date the data is (81%).
  • They had not found issues when using the dashboard (76%).

Developing and maintaining the dashboards

  • Some developers felt that the initial stage of the development process could have benefited from clearer direction and a more defined scope.
  • Developers thought the collaborative working across the organisations had been a success.

Usage metrics

Dashboard usage metrics were analysed for the period from January to March 2024. They showed that during this period:

  • Around 400 individual users had accessed a dashboard.
  • The dashboards were opened around 6,500 times.
  • Users were based in Health Boards, HSCPs, Local Authorities and Scottish Government.
  • Introduction and summary pages were consistently the most commonly viewed pages.

Lessons learnt

  • Clearly define parameters and target audience of the dashboards at the outset of the development process.
  • Variation can exist in local areas’ and organisations’ IT licensing and access policies. Early engagement with local digital leads is key to effective delivery and to ensuring easy access procedures for users.
  • Users value access to data covering the whole system in one place.
  • The ability to view national or system-wide data and make local-level comparisons makes the dashboards more usable, and more impactful, than localised data alone.
  • Visualisation and usability are key factors for users.
  • Users value the opportunity to provide feedback for continuous improvement.

General recommendations for continuous improvement

  • Ensure any dashboard has a clear set of parameters which define scope and intended use.
  • Minimise the number of separate products spread across different platforms.
  • Maximise user awareness of indicators available.
  • Maximise signposting to other available data sources.
  • Ensure preliminary and ongoing user engagement is in place.

Contact

Email: socialresearch@gov.scot
