Public dialogue on the use of data by the public sector in Scotland

This report presents the findings from a public dialogue on the use of data in Scotland commissioned by the Scottish Government to explore the ethics of data-led projects. The purpose of the panel was to inform approaches to data use by the Scottish Government and public sector agencies in Scotland.


Future engagement

One of the aims of this public dialogue was “to create a blueprint for a long-term, sustainable form for engaging and involving the public in data policy, scrutiny and decisions”. To help realise that aim, participants were asked what role the public should play in decisions about public sector use of data, and specifically how the public should be engaged on this topic in future. This section outlines views on these topics, as well as participants’ reflections on their own experiences as part of the panel.

Role of the public in decisions about public sector data use

Participants felt that the public had an important role to play in helping shape how data were used by the public sector. They suggested that members of the public could provide a balance to the views of experts and data specialists, potentially offering new ideas or alternative issues to consider.

“Experts are, for me, looking at one thing and one thing only. The public have got their own perceptions and can input more feeling and reality into this.” (Session six)

If the data in question had originated from members of the public, it was seen as only fair for the public to have a say in how the data would ultimately be used.

“It’s always beneficial and interesting to have an outsider’s perspective…it’s our data, and I think we should have a say on how it’s dealt with.” (Session five)

It was also suggested that the public could help to ensure use of data is explained in “lay person” terms, by asking questions and encouraging data specialists to clarify what might otherwise be very technical information. Participants considered themselves to have played this role in the panel.

Involvement of the public was seen as a marker of transparency, one of the key factors that participants felt was important for building trust in public sector use of data. As previously noted, there was a generally high level of trust in the Scottish Government and the public sector to begin with. Over the course of deliberation, this trust had either remained high or increased. Participants attributed this to their own process of learning as a panel, specifically in relation to the current data protection landscape and the steps the public sector had to go through before using data.

“Starting out, I had a very low [awareness] of how data was being used. Over the course of this panel, it became a lot less nebulous…it has increased my confidence, as it has showed how much vigour is involved in getting ethics through… I do trust data usage more.” (Session five)

In contrast, when decisions were made by the public sector without any involvement of the public, participants felt this created a sense of distrust in those decisions. One participant used the example of changes to the organ donation system in Scotland, which they thought had been transparent because of the level of public involvement and communication around it.

There was an acceptance that it may not be possible to seek the views of the public on data-led projects in an emergency situation. However, when this was the case, it was felt that the public should at least be informed about how data were being used.

How the public should be engaged in future

There was overwhelming support for future public engagement on the use of data, which reflected participants’ positive feelings about their own experiences as members of the panel.

Echoing earlier views about the importance of transparency, participants felt that data-led projects should be widely publicised and promoted to help raise public awareness and interest.

“More publication and promotion of [data projects] would be exciting for the public…and that knowledge could add to public scrutiny.” (Session five)

Participants viewed a public panel, designed and structured in a similar way to the one they were part of, as a good way of engaging the public. They felt they had benefitted from having the opportunity to learn about the topic, hear from and speak with the experts, and then reach informed conclusions. This learning process was seen as particularly important when asking the public for views on technical, complex topics such as the use of data.

It was suggested that an ongoing panel could be used to help make decisions about future use of data by the public sector. Potential uses for a panel could be to review and provide feedback on potential data-led projects, or to revisit ethical guidelines developed by this panel to test whether they were still appropriate.

“Have a pool of people that they could draw on, who are interested…[to provide] those checks and balances… to challenge some things. I'm not entirely sure how it would look, but I think it's a good idea having lay people on these [panels] in some form.” (Session six)

Other suggested forms of engagement included teaching children about data at school, and using websites (such as a public sector website) to invite feedback from the public about potential data-led projects. However, it was felt that online consultations might only appeal to a certain type of person, and that a randomly chosen sample of the public, like the approach used for this panel, would help to ensure involvement from more diverse groups in society.

Reflections on participants’ involvement in this panel

This public dialogue supported participants in expressing a range of views on different types of data projects, and explored their expectations and understanding of the ethical considerations for future use of data about citizens. The panel’s reflections in the final session highlighted that learning journey, with participants describing how they went from feeling “overwhelmed” in the first session to feeling “informed” and “empowered” by the end. The image below shows participants’ experience of the deliberative journey from the first to the last session, expressed in their own words:

Figure 1.3: 3 words to describe the session (online community feedback)
Image showing words participants used to describe sessions 1-6. Common words from session 1: 'Interesting', 'Informative'; session 2: 'Interesting', 'Informative'; session 3: 'Informative', 'Interesting'; session 4: 'Interesting', 'Engaging'; session 5: 'Interesting', 'Engaging'; session 6: 'Satisfying', 'Exciting', 'Productive'.

One positive aspect of the process was the opportunity for participants to meet (virtually) and engage with each other, particularly through the smaller group discussions. They felt that the process had helped them to realise their own biases, and to listen to and be shaped by each other’s views.

It was clear that the deliberative nature of the public dialogue had been beneficial for participants. They welcomed the opportunity to learn about the topic in depth, hear from and speak with a range of experts, and then reach informed conclusions. They also appreciated the direct engagement with expert specialists, made possible by bringing the expert speakers into the smaller group discussions. It was common for participants to reflect on their overall learning journey and the growth in their own understanding of the use of data.

“In the first week, I think it felt very overwhelming…. it did become clearer as we went through the weeks…it's nice to be a part of something that you think you might have a slight impact on something moving forward.” (Session six)

The main drawback they highlighted about the process was that the information in the early stages was overwhelming, and this made some feel confused and “out of their depth”. For future public dialogues or similar forms of engagement, participants suggested that information provided in the early stages of the process should be as simple as possible, and that dense presentations of technical information should be avoided.

Overall, there was a sense that involvement in the panel was worthwhile and that participants were genuinely having an impact on future policy.

"It really feels like you're actually connected with a process that will change not just your life, but the lives of other people." (Session six)

There are aspects of the process which may, in and of themselves, have impacted on how participants engaged with the topics, such as:

  • The delivery of presentations: while a template for presentations was provided to ensure key details were covered, specialists interpreted these details in different ways in relation to their projects, and the variations in delivery style may have influenced how participants responded to, and engaged with, the projects. Where participants felt presentations used too many ‘academic’ terms or were too ‘jargon’-heavy, they found the discussion afterwards more challenging. Where presentations were felt to be clearer and more succinct, it was easier to get straight into the discussion afterwards. Ongoing opportunities for Q&A with specialists helped participants clarify their understanding. Having specialists available to join breakout discussions was also beneficial in addressing any issues or misunderstandings, allowing participants to progress their discussions.
  • Engagement approaches: some presentations were delivered live (or pre-recorded and played back) during plenary sessions, while others were delivered directly to participants in small breakout groups. Feedback suggested that delivery in breakout groups worked well in terms of bringing the specialists closer to participants and enabling direct feedback and Q&A, but it also left less time for participants to reflect on and discuss the project. The presence of specialists during discussions may also have made participants less willing to share their views.
  • Perceived complexity of the project: some data projects, such as those related to the pandemic, resonated more with participants’ own lived experiences while others, such as the data linkage projects, felt more abstract and this may have impacted on how participants responded to them.
  • Variance in online community engagement: the online community was primarily a vehicle for maintaining engagement with the panel in between sessions. However, any data collected from it (such as survey tracking) has been treated with caution, as not all participants chose to join and, of those who did, not all completed every activity. While most tasks on the online community were discrete and did not inform the main panel process, participants who did not register were offered opportunities to take part in certain tasks (such as voting on the projects they wanted to hear about) via email instead.

Contact

Email: michaela.omelkova@gov.scot
