Public dialogue on the use of data by the public sector in Scotland

This report presents the findings from a public dialogue on the use of data in Scotland commissioned by the Scottish Government to explore the ethics of data-led projects. The purpose of the panel was to inform approaches to data use by the Scottish Government and public sector agencies in Scotland.


Looking forward: possible future projects

This chapter summarises the panel's reflections on three projects that were in the early stages of development, or that had not yet started, and on which the DIN team had been consulted. These projects gave participants an opportunity to explore aspects of data use that they had not encountered before (such as the use of emerging technologies, different data sources, and private or third sector involvement).

By exploring further reactions to and perceptions of emerging data projects with these new elements, this chapter highlights how the participants tested and consolidated their thinking around the ethical principles formed in the previous sessions, which were later developed into the final guidelines.

The data projects reviewed in session five included:

  • Little Knight – exploring how Artificial Intelligence (AI) could be used to identify patterns and correlations in anonymised school attendance records to safeguard children from abuse.
  • Policing the pandemic – linking police and health data to understand the usefulness and fairness of the Coronavirus Regulations that were introduced during the pandemic.
  • Mobility – using mobile phone app data collections to investigate how the flow of people in city centre workplaces or green spaces changed during the pandemic.

An alternative perspective was provided in session five by Laura Carter (from the Ada Lovelace Institute), who offered insights on some key ethical considerations for these emerging projects.

Each project is presented separately, summarising the key ethical considerations raised by the panel in their initial review. These reflections informed the ethical guidelines that the panel developed in the final session.

Despite the introduction of new approaches to using data (such as AI), different types of data (such as police data and mobile phone app data), and the involvement of organisations outside the public sector, the ethical considerations raised by the panel were similar to those raised in the past project reviews.

Key findings:

  • In session five, participants reviewed new and emerging projects that the DIN team was considering supporting and discussed the key ethical considerations.
  • On the Little Knight project, the panel felt there was a clear public benefit in terms of supporting social services and safeguarding children from abuse. There was some wariness around the use (and possible misuse) of AI.
  • On the Policing the Pandemic project, the panel felt this was worthwhile and would help Police Scotland gather important insights into how the pandemic affected people. Concerns were raised over the possible stigmatisation and increased surveillance of vulnerable groups.
  • On the Mobility project, the panel considered the use of mobile data to be innovative and useful, and recognised that it could support better management of green spaces. However, there was a perceived risk that the data could be used to justify reductions in green space, and issues around consent were raised.

The key ethical considerations raised in relation to these new and emerging projects were similar to those highlighted in the past project reviews, and included:

  • Weighing up the relative benefits and harms to groups in society.
  • Proportionate use and not going beyond the original scope (including care over who data are shared with).
  • Ensuring transparency and accountability in decisions about what data are used, by whom, and for how long they are held.
  • Ensuring the principles of consent are adhered to.

Little Knight

Little Knight is a not-for-profit initiative run through the Scottish Tech Army, a separate organisation that brings together specialist technical skills for charity projects. The Little Knight team are a group of volunteers from the private sector (with expertise in healthcare, education and social care) who came together to discuss how their technical skills could be used to help safeguard children from abuse. Little Knight was exploring how AI could be used to identify patterns and correlations in anonymised school attendance records.

Clear public benefit, but questions around AI

The potential public benefit of this research was clear to participants, as they felt it would help support social services and safeguard children from abuse.

There was some initial wariness around the use of AI in this project, with questions around how it would work in practice, what role it would play and what legal measures would be in place to regulate the use of such emerging technologies.

Questions raised by the panel in relation to the Little Knight project:

  • “What are the legal implications of using AI, where does the law fit in with AI?”
  • “Would we be picking up the right information, the nuances implied by things that are written down, would that carry over from AI?”
  • “If the AI’s learning from reports and flagging them up, and then a human would be involved to validate it, are there some [details] slipping through that should be flagged, and how would you know they’ve slipped through?”

After the specialist clarified how the AI element would work for this project, participants felt broadly assured that AI could be developed as a useful tool to support and speed up, but not replace, human decision-making and interventions.

"If we can rule out human error in some sort, that's something we shouldn't be afraid of." (Session five)

The panel saw the value in using AI to support social workers in this way and felt that it could help reduce the risk of child abuse. However, it was also deemed important to consider who it would benefit and how.

Unintended consequences and misuse

While potential benefits were recognised, participants also felt that the use of AI posed some risks. It was felt that an overreliance on AI could lead to unintended consequences, such as social workers feeling judged on their performance and leaving the profession, or those abusing children seeking to avoid detection by maintaining normal school attendance. Some participants described the prospect of trusting AI to do what a human does as “scary”, and concerns were raised over the quality and robustness of the results it would produce. One particular concern was that it might unfairly target particular groups or miss some children at risk altogether.

“Because it's artificial, it's taking parts of the data and working with the most noticeable ones but it could also skip over data that seems normal, that seems fine. You're getting children that attend school and from the outside everything looks perfect, but at the end of day you still need to make the calls, visit houses, do all the groundwork.” (Session five)

Aside from the use of AI, the scope of the project was another consideration raised in the discussions, and participants felt that care would need to be taken over who data were shared with and for what purpose.

Policing the pandemic

During the pandemic, the Coronavirus Regulations introduced unprecedented powers for UK police forces to ensure compliance in preventing the spread of COVID-19.[14] An Independent Advisory Group was set up during the pandemic to advise Police Scotland on the new powers and to ensure they were compliant with human rights. Police data – including records of encounters with the public during the pandemic and the database of Fixed Penalty Notices issued under the Coronavirus Regulations – were used to inform recommendations on policing during the pandemic. The possibility of taking this research further, to understand the usefulness and fairness of the Coronavirus Regulations, was being considered; this would involve linking police data with health data in Scotland’s National Safe Haven.

Weighing up benefit and risk

The use of police data to understand the impact of the Coronavirus Regulations on people was considered worthwhile, with one participant describing it as “necessary”. There was some reassurance in knowing that these impacts were being explored and that Police Scotland was taking stock and reflecting on the groups that may have been adversely affected by the regulations.

“It’s a worthwhile cause to see how it did affect people in different areas.” (Session five)

However, the necessity of this project was not clear to all. Clarity was also sought over what exact data would be used, what it would be used for and whether individuals would be identifiable (which the specialist confirmed they would not be).

Questions raised by the panel in relation to the Policing the Pandemic project:

  • “Is it going to be patient or people-identifiable data you’ll be using?”
  • “Would names be in the data, and if the names were in the data would they be there all the time?”
  • “Does it still fulfil the original reason that it was put in place? It was acceptable at the time, the way we used the data. Is it the same thing we’re using it for, or different? Is there a time limit?”

Concern about the risk of stigmatisation

Participants were concerned by the possible future uses of the data outlined in the presentation, such as exploring whether underlying health-related vulnerabilities increased the likelihood of some individuals being subject to police enforcement for non-compliance. One breakout group questioned the assumption that health issues relate to non-compliance with the law. This prompted the specialist to explain that most incidents that come to the attention of the police are related to an underlying vulnerability such as mental health or addiction. Nevertheless, participants raised the possibility of stigmatisation of people with health issues and the potential for increased surveillance of vulnerable groups. More general concerns were also raised, particularly among those with more sceptical views about the use of big data by governments, over the invasiveness of linking policing data with other datasets.

“I worry about the stigmatisation of people with health issues, and obviously as the professor said, it is a major issue, in policing, but whether or not it is the right approach, I do think there are lots of other reasons why lots of people in this situation would've broken the rules, that weren't health related. So it worries me there could be some form of stigmatisation of people with mental health or other health issues in this project.” (Session five)

While some were reassured by the specialist’s clarifications about the purpose of the project (i.e. that the research is intended to understand the impact of enforcement on different groups in society), the panel emphasised the need for the public benefit to be defined and justified, with steps taken to minimise the potential harms to groups in society.

Questions over scope and accountability

Participants were initially unclear on the relevance of combining health and police data for research and so felt that the purpose of a data project – especially when using such sensitive data – must be clear. They questioned whether the possible future uses of these data were within the original scope.

“If the data's being used beyond the agreed purpose, I'm not entirely sure about that one. It's gathered in a specific circumstance, and now is being moved to a different circumstance. I'm not sure that's right.” (Session five)

Accountability was also a key consideration, with participants highlighting the role of a bespoke public panel in the early stages of the Policing the Pandemic project as positive.

Mobility

The Urban Big Data Centre (UBDC) is a research centre and national data service based at the University of Glasgow. UBDC promotes the use of big data and innovative research methods to improve social, economic and environmental well-being in cities. Their project would use mobile phone app data collections (anonymised and non-identifiable data) covering Glasgow City and neighbouring Council areas to investigate how the flow of people in places like city centre workplaces or green spaces changed during the pandemic, both under lockdown and after the lifting of restrictions.

Public benefit of using mobile data

Although there was some confusion over how mobile data were used, the project was thought to be innovative and potential benefits were identified. It was recognised that green space was beneficial to society and that the use of mobile data to understand how spaces are used could lead to better management of those spaces.

“It's a great project, getting health improvements from green space is a really important issue, seeing how people utilise those spaces is also really, really good.” (Session five)

However, the public benefit was not clear to all participants, and there was a perceived risk that the data could be used to justify the closure of parks or reductions in green space. Ensuring clarity of purpose and a justification for using the data to benefit society was therefore considered important.

Consent for using mobile data

Mobile data were thought of as a “by-product” of smartphone usage, and so their use was considered an effective way of gathering granular and accurate information for the public good.

However, the issue of consent was raised in relation to this, with several breakout groups asking how the data were collected, and how consent was obtained by the private companies collecting mobile data.

Questions raised by the panel in relation to the Mobility project:

  • “Do you have to download this app, or is it data stored in the same place as where your health and public sector data is stored, or is it a different place than Glasgow?”
  • “Is there a way to give consent or opt out?”
  • “The companies using the apps, what sort of consent guidelines are in place with them, before the data’s even picked up?”
  • “If you’re getting data from commercial bodies, do they have access to your outcomes?”

The specialist explained that mobile data were only available from those who had consented to their data being shared. However, the panel questioned the extent to which people would really know what they were consenting to. Although the panel were reassured by the processes outlined in the presentation to ensure the privacy, security and de-identification of mobile data, there remained some discomfort around the prospect of private sector organisations holding and selling mobile data without people’s knowledge. The panel felt it was important that people were given clearer guidance on how their mobile data are collected and used.

“It’s interesting using mobile data and saying the legal basis is consent. A lot of people do turn on location data without thinking how that data is being used. I didn't know there were companies out there that had the data.” (Session five)

Contact

Email: michaela.omelkova@gov.scot