Building trust in the digital era: achieving Scotland's aspirations as an ethical digital nation
An expert group, supported by public and stakeholder insights, has reviewed evidence and provided recommendations to support and inform future policy. The report focuses on building trust with the people of Scotland by engaging them in digital decisions that affect their lives.
Public Awareness of Data Use and Sharing
Objects of Trust:
- Users: Could it be misused to hurt others?
- Privacy: Is my information confidential?
- Usefulness: Is it necessary? Will it help? Is it worth it?
What is Public Awareness of Data Use and Sharing?
Public awareness of data use and sharing means ensuring that any process in which data about individuals is collected, used and stored is transparent and explainable to all. Rules about the preservation of individuals' privacy, and about the ways in which organisations may use personal information, are laid out in UK data protection laws. However, many people do not understand or trust these protections. Public trust should be one of the most important elements of an Ethical Digital Nation, and this can only be achieved if the public have a good awareness and understanding of how their personal data is used and shared, and of their rights relating to data.
Personal data should be shared in a trustworthy way, aligned with broader societal values and expectations. This is not just about communicating how data is being used, but about going one step further: considering public acceptability and giving people the autonomy to challenge and question how their data is being used and shared. In practice this means enabling a level of personal control over personal data, a right already enshrined in law.
To promote public awareness about data use and sharing, organisations need to make sure that people understand how this data is collected and used, and what steps they can take to control it and protect themselves. Better understanding and awareness will help to foster trust in businesses and governments when they use personal data. Additionally, supporting the growth of trusted data intermediaries, whose role is to manage digital data as a resource for public benefit, can facilitate trustworthy data sharing.
Why is Public Awareness of Data Use and Sharing Important?
A digital footprint is all of the information captured and collected about an individual that exists as a result of digital activity. This means that citizens are leaving behind a ‘digital trail’ of data every time they log on, use an app, swipe a card or click on a link.
There is a growing awareness that data shared online can be used to profile and target users, but a limited understanding of how this happens. Data can be shared both knowingly, for example when setting up a social media account, and unknowingly, via cookies, trackers or even store loyalty cards. The sheer quantity of data points being gathered is an increasing concern.
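As a purely illustrative sketch (in Python), the example below shows how a handful of individually minor data points might be combined into a crude interest profile. The event sources, categories and matching rules are invented for illustration only and do not reflect any particular platform's methods.

# Hypothetical illustration: combining individually minor data points
# (page visits, a loyalty-card purchase) into an inferred interest profile.
# Categories and rules are invented for illustration purposes.
from collections import Counter

events = [
    {"source": "cookie", "site": "running-shoes-shop.example", "time": "07:10"},
    {"source": "tracker", "site": "marathon-training-blog.example", "time": "07:25"},
    {"source": "loyalty_card", "item": "energy gel", "store": "supermarket"},
    {"source": "cookie", "site": "flight-comparison.example", "time": "22:40"},
]

def infer_interests(events):
    """Map raw data points to crude interest categories."""
    interests = Counter()
    for e in events:
        text = " ".join(str(v) for v in e.values()).lower()
        if any(word in text for word in ("running", "marathon", "energy gel")):
            interests["fitness"] += 1
        if "flight" in text:
            interests["travel"] += 1
    return interests

print(infer_interests(events))  # Counter({'fitness': 3, 'travel': 1})

Each event on its own reveals little, but together they suggest a plausible (and possibly wrong) picture of the person behind them. The inference only sharpens as more events are added, which is why it is the volume of data points, rather than any single one, that drives the concern described above.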
Improving public awareness of data use and sharing will strengthen public trust in the institutions and services that seek to use data in an ethical way. Citizens are concerned that sharing their data could lead to it being misused. It is important that there is widespread understanding of how online platforms can be used to collect personal data, and how that data can be used to deliver services. This can help empower individuals and communities to scrutinise technologies where they are being used unethically. Responsibility does not rest purely with the citizen to understand more about data use and sharing; there also needs to be a step change in how transparent and explainable the organisations using and sharing data make their processes.
Some key concerns of the Public Panel are:
- Data being sold or shared with other organisations
- Data being used to target advertising
- Data being used to profile individuals or groups.
(National Digital Ethics Public Panel Insight Report, 2021)
“I know that practically any website I go to will have access to things they shouldn’t, which they can then sell, or will simply be taken from them in turn. I’m resigned to my data being harvested to an extent.”
National Digital Ethics Public Panel Insight Report, 2021, p. 42
Case Study:
Targeted advertising, advanced marketing and behaviour change
Dr. Ben Collier, Dr. James Stewart
Contemporary forms of digital marketing are the financial lifeblood of the Internet. Most of the online platforms, search engines and social media sites we use are provided free to the end user, generating revenue through the collection of intimate behavioural data, which are used to build advertising profiles. These profiles allow adverts to be targeted and personalised not only on the basis of demographic characteristics and traditional segmentation, but on previous and current behaviour, surfaced by applying algorithmic technologies to extremely intimate, fine-grained records of online browsing, communication and activity. The targeted digital advertising industry has been the subject of a series of scandals and critical debates in recent years, not only because of concerns about intrusive corporate surveillance, but also because of the use of this surveillance and influence infrastructure for both legitimate and subversive political communication. We have recently identified a new area of potential concern: the increasing use of these infrastructures by government to shape the behaviour of the public.
Government communication practices are not static; they change and adapt in line with the cutting edge of industry practice. These practices involve not only classic forms of awareness-raising – public health and safety, regulatory changes, democratic participation and so on – but also attempts to directly shape the behaviour of the public, often through 'nudge' and other approaches incorporating insights from behavioural science. As digital marketing tools have evolved, government departments and law enforcement are increasingly using them in behaviour change campaigns as part of a shift towards prevention. In theory this allows government to shape behaviour in the moment in novel, intimate and deeply targeted ways, bringing together administrative data, marketing data and platform targeting data to target, deliver and evaluate complex campaigns.
Several concerns arise here. First, the use of government administrative or survey data to develop targeting profiles may be contested where those data are explicitly not to be used for marketing purposes; this blurs the line between marketing and service delivery. Second, the algorithmic targeting of adverts leaves room for legal challenge if it can be shown to have the potential to harm or disadvantage. A further concern is the unintended consequences of these campaigns, which the Scottish Government is actively tackling.
The public are largely aware of the existence of digital targeting and, as a result, may feel anxious if they receive government adverts which they assume are a result of their online behaviour. This presents a real capacity for unintended harm, particularly for more vulnerable groups. It is linked to the wider issue of communications 'blowback', in which unintended consequences (such as an advert producing the opposite of the intended outcome for a small percentage of viewers, or accidentally spreading rather than countering false information) can result from the complex social environment in which these communications are consumed. Additionally, there is a set of issues around privacy and intrusiveness. These practices open up to government a new generation of detailed data sources that can be used to target communications by interposing a private entity (the platform). This allows the use, at arm's length, of very intimate targeting and delivery approaches in ways not historically available to government.
Scotland and the Scottish Government are in many ways leading the development of ethical, accountable use of digital targeting approaches, with behavioural ad campaigns handled through a single centralised team and subject to ethical scrutiny and oversight. There is an opportunity to develop a positive and distinctly Scottish approach to strategic communications, particularly one that foregrounds the values of co-production and 'bottom-up' policymaking rather than the 'top-down' campaigns, run with little public consultation, oversight or transparency, which characterise practice in many other jurisdictions (particularly in national security contexts). A significant challenge remains in terms of systemic change: current structures and practices are still fairly informal, and more concrete structures for accountability and governance over institutional practices are needed.
Potential policy proposals here could include a public register of current and previous 'behaviour change' campaigns conducted by the public sector, with details of targeting and procurement, and the further formalisation of expert review (possibly in the style of the data sharing scrutiny boards used by the statistical profession within the Scottish Government). As these practices become more widespread, there are also political and democratic questions to answer, such as whether micro-targeting is appropriate for use by the public sector at all, which require further scrutiny by politicians, civil society and the public.
How can we strengthen and assure public trust in the use of data by public and private organisations?
Data can be an extremely useful tool for decision-making and service optimisation. It can help to personalise online services and advertising in line with past behaviour, or to predict future demand for services. Data is critically useful as we look to improve the quality of public services by sharing information for initiatives such as 'Smart Cities'.
Data sharing is the ability to distribute sets of public or private sector data to multiple users or applications for the benefit of citizens, whilst maintaining data privacy and security. To do this sustainably, however, stronger safeguards around data collection and use need to be developed. Innovative ways to use computers and data will often challenge the existing balance of interests and rights in society, politics and the economy, so mechanisms of debate and governance are required that enable these questions to be decided democratically.
There is a need for the public to be able to hold organisations and institutions, such as the NHS, to account over how their data is used, as well as having more insight and control over their data. By promoting more transparent operational mechanisms, the public may feel more confident in challenging and questioning:
- Who can be trusted with their data
- Whether individuals are aware or have given consent for their data to be used
- Whether groups could be unfairly profiled
- Whether individuals could be re-identified
- The reliability of the data collected
- The comprehensiveness of the data.
(Extracted from National Digital Ethics Public Panel Insight Report, 2021)
In order to help raise public awareness of data use and sharing, all citizens need to be provided with the knowledge, skills and tools to safely navigate the online space and use digital technologies. This will allow them to make more informed decisions about sharing their data. In addition, citizens should be able to easily identify the ownership of, and any links between, social media and other web-based platforms, giving them a better understanding of how their digital footprint is being used and shared.
“Acceptability depends on 3 factors:
1. Anonymous is OK if use for social benefit & agreement sought
2. Not to benefit corporations for additional profit
3. Trust of companies holding data can be enforced if trust to hold safe is breached”
National Digital Ethics Public Panel Insight Report, 2021, p. 44
Contact
Email: digitalethics@gov.scot