Building trust in the digital era: achieving Scotland's aspirations as an ethical digital nation: case study supplement
This paper is a supplement to the ‘Building Trust in the Digital Era: Achieving Scotland’s Aspirations as an Ethical Digital Nation’ Report. The case studies have fed into the core report content, helping to position the ethical challenges relating to digital innovation across a range of sectors.
Ethical Limits to Monitoring and Surveillance
Case Study: Cybersecurity – Dr. Markus Christen
Cybersecurity is a major area of growth and investment in Scotland, and the Scottish Government has been proudly promoting this sector as an enabler of prosperity and jobs.
The development of tools for strengthening personal and corporate privacy protection, building cyber-resilience against threats from criminal or state actors, supporting financial or supply chain accountability, and helping to tackle serious crime may, on the one hand, be regarded as an ethical duty.
At the same time, the cybersecurity sector is also heavily invested in the development of surveillance and forensic tools for purposes such as law enforcement, border control, national security and behavioural monitoring, which can challenge public expectations for ethical, proportionate, transparent, fair, inclusive and accountable digital practice.
Investments in Scottish cybersecurity/forensics companies are also partly based on the prospect of selling such technologies/services abroad. Some of these may be regarded as ethical exports, since they may help to guard vital public services or secure the assets and private information of citizens globally. Yet even the most well-intentioned technologies may be misused in the wrong hands; for example, there has been much coverage of Israel’s success in cybersecurity innovation, yet there is evidence of these innovations being used for domestic, corporate and governmental spyware, including by authoritarian governments or geopolitical adversaries of the UK.
Recommendations
Scotland can make the most of a cyber-Scotland and avoid the potentially harmful effects of misuse and misappropriation by following three layers of action:
Government and legal: obtain an overview of the often-fragmented legal landscape, including gaps and conflicts, across legislation on network and information security measures, electronic communications (including privacy and data protection), and cybercrime.
Guidelines and soft law: Legislation will not be able to cover all cases and issues that emerge in real life. Companies therefore need to create a culture of awareness of such ethical and legal issues, including procedures for how to operate (and deliberate) where legal guidance is unclear. The process of generating guidelines within a company could be an instrument for achieving such cultural change.
Training of professionals at all levels: It is well known that cybersecurity is a “wicked problem” that cannot be solved, only managed. Knowledge of cybersecurity should therefore span a broad spectrum of competences (with a specific focus depending on the profession). What we consider essential is that the ethical, legal and social aspects of cybersecurity be part of the training of professionals.
Case Study: Domestic Abuse and Data and Digital Technologies - Dr. Katherine O’Keefe
The use of digital technologies to facilitate domestic abuse mirrors many of the concerns raised in mini publics about surveillance and technology. The increasing integration of connected digital devices into home life affects privacy generally, but is of particular concern in the context of domestic abuse or intimate partner violence. The legal and ethical frameworks commonly used to assess the impacts of digital technologies and surveillance on our rights to privacy and autonomy tend to model threats and harms as external to the home, seeking to protect the household from government, industry, or external criminal threats. Yet the same threats to privacy, dignity, and autonomy can arise within the domestic space, in the context of intimate partner violence. This is reflected in the focus of legal protections: the UK Data Protection Acts and GDPR limit the scope of protection by exempting “domestic” or “household” use of personal data from compliance requirements.
The impact of domestic abuse in Scottish life is wide-ranging and significant. According to research by the Scottish Government, 62,907 incidents of domestic abuse were recorded by the police in 2019/20, and the Coronavirus crisis saw a “shadow pandemic”, with an increase in reported domestic violence as well as increased threats and pandemic-specific tactics of abuse during lockdowns. “Some services observed increases in online stalking and harassment behaviours.” According to Scottish Women’s Aid, “For women not living with their abuser, lockdown meant that their abuser knew they would be at home, increasing the abuser’s opportunities for stalking and continued harassment. The reliance on technology during lockdown to maintain social contact and for work also provided opportunities for abusers to misuse that technology to continue the abuse.”
Many emerging digital devices and connected services have been weaponised by abusers as tools for surveillance or stalking (facilitated by GPS, webcams, spyware, or abusive uses of apps and phone functions), as well as for control of “smart” home IoT technologies such as smart meters, voice assistants, and locks. These can impact victims’ autonomy and be used as methods of coercive control and psychological abuse, to establish power over victims and harass them, as well as for surveillance.
Technology-facilitated abuse in the context of domestic abuse or gender-based violence is not necessarily fully recognised in the way domestic violence is recorded and countered in the justice system, though it is likely to fit into the categories of “threatening or abusive behaviour or stalking” offences that constitute 88% of breach of the peace-type convictions recorded against abusers in police statistics for Scotland in 2018/19 (5). Additionally, the harassing and coercive behaviour involved in such digital abuse is intended to “cause the partner or ex-partner to suffer physical or psychological harm” such as fear, alarm, or distress. This is recognised in the Abusive Behaviour and Sexual Harm (Scotland) Act 2016 as an aggravation of an offence (Abusive Behaviour and Sexual Harm (Scotland) Act 2016, 1(2)).
The harms of technology-facilitated abuse are significant, and part of a range of tactics used by perpetrators.
Restriction of access to and monitoring of mobile phones has become a significant element of coercive control, as well as of stalking behaviour. Abusers may misuse general-purpose software or operating system features, or install more purpose-specific spyware on phones. This can include changing passwords to block or control access to communications and bank accounts, monitoring finances, using location tracking to surveil or stalk victim-survivors, and enabling spyware on phones. One example of psychological abuse often employed against survivors is harassment using payment apps: repeatedly sending small payments to constantly remind victims and survivors that they are within the abuser’s reach.
Technology-facilitated abuse, particularly involving smartphones and “smart home” connected devices and systems integrated into the functioning of a home, raises specific privacy and security concerns in such sensitive situations and introduces new threats and harms. A number of digital technologies may be used by abusers as surveillance mechanisms to stalk victims and monitor their activity throughout the day as a tool of coercive control. This surveillance affects victims/survivors psychologically, impacting their dignity, privacy, and autonomy. The Scottish Government reported that victims commonly said they felt like “sitting ducks”, as their abusers knew where they were at all times.
This can include many “internet of things” (IoT) devices as well as mobile phones. Webcams and home assistants such as Alexa or Google Home devices may be used for surveillance, or to control connected thermostats, lights, locks, and other elements of the home, connected devices, or wearables. The effects of this weaponised use are not limited to the possible physical effects of a literal, updated form of “gaslighting”: they extend to the psychological effects of the threat, whether or not the threatened control is possible or realised.
There has been increasing recognition of the harm caused by non-consensual publishing of intimate images or “revenge porn” as abuse and harassment. It is one of a number of threatening and abusive uses of social media. The design of social media networks makes it difficult for abuse survivors to control their privacy and cut their abusers off from information about them, as their privacy is affected by the profile privacy settings of everyone they know. Even if they block an abuser from all of their social media, they cannot ensure that everyone in their network also blocks information about them. Technologies such as facial recognition and automated tagging aggravate this risk.
Digitalisation and the introduction of new connectivity into conventional technologies introduce new threats and ethical dilemmas in design. As Jane Bailey observes, technology-facilitated abuse “is perpetuated not just by “bad individuals,” but also by the systems and practices of the technology companies that structure and facilitate online interactions” (Bailey, et al, introduction). A lack of prioritisation of privacy and security in design can be directly related to the harms that result from the misuse of digital technologies. Potential impacts in the context of domestic abuse or coercive control are often considered “edge cases” in the design of connected technologies, or the potential misuse of technologies is simply not acknowledged as something a responsible designer accounts and designs for. For ethical digital design, designers, developers, and policy makers should take the experiences of victims/survivors into account in system design, development, and threat modelling.
The concerns of differently impacted demographics and marginalised groups are vital for ethical design, and designers must understand and account for potential misuse rather than designing with an assumption of best-case scenarios.
Dr. Leonie Tanczer, principal researcher in Gender and IoT at University College London, notes that the traditional model of thinking of computing and device design as “access to a personal computer” or a single account holder is no longer fit for purpose, and that for many digital technologies, and particularly for IoT devices, the model that needs to be considered for access, control, and privacy is that of shared access to a resource. The effects on privacy and control in a shared service space, with multiple people potentially affected by a device, are not appropriately designed for with single-account-holder access and controls, which are often set up and controlled solely by the abuser.
While digital technologies have been misused and have the potential to be weaponised to abuse people in the context of domestic violence, they can also be key supports for victims and survivors, enabling them to leave an abusive situation and reassert control over their lives. The ability to disconnect, privacy and security in communications may be a deciding factor in victims’ ability to leave an abusive situation.
The same digital technologies that abusers may have weaponized against abuse victims can, once they have regained control over their data and have autonomous control over the technology themselves, be liberating. Cameras the victim/survivor has control over can be used to ensure security. Yee Man Louie observes that smart phones can be used to document evidence, and online forums can provide social connections and support.
Ethical and responsible design of digital technologies that appropriately takes into account and mitigates risks of abuse can redress the balance of potential harms and benefits resulting from these technologies.
At a government and policy level, support for the programmes and organisations working with victims and survivors of domestic abuse and gender-based violence should treat digital and physical abuse holistically. Similarly, the framing of legal protections in relation to data could take into account the gaps in protection resulting from “domestic use” exemptions to data protection legislation. Support offering specialised expertise and cyber security help for survivors is likely to be increasingly needed. A centralised government cybersecurity resource devoted to this, perhaps as an aspect of the Scottish cyber strategy, would also provide insight and statistics on the prevalence and trends of technology-facilitated abuse. Additionally, policies supporting better understanding of these threats through Higher Education could help emerging developers understand the social context in which their products will affect people, individually and socially.
Contact
Email: digitalethics@gov.scot