Abstract
Given the unprecedented scale of digital surveillance in the COVID-19 pandemic, designing and implementing digital technologies in ways that are equitable is critical now and in future epidemics and pandemics. Yet to date there has been very limited consideration of what is necessary to promote their equitable design and implementation. In this study, literature relating to the use of digital surveillance technologies during epidemics and pandemics was collected and thematically analyzed for ethical norms and concerns related to equity and social justice. Eleven norms are reported, including procedural fairness and inclusive approaches to design and implementation, designing to rectify or avoid exacerbating inequities, and fair access. Identified concerns relate to digital divides, stigma and discrimination, disparate risk of harm, and unfair design processes. We conclude by considering which dimensions of social justice the norms promote and whether the identified concerns can be addressed by building the norms into technology design and implementation practice.
Keywords: digital technologies, contact tracing, data sharing, COVID-19, fairness, justice, equity
Introduction
During the COVID-19 pandemic, the scale of digital surveillance has been unprecedented. Questions about the ethics of emerging digital surveillance and contact tracing platforms, which draw on phone location, mobility, and facial recognition data, have thus risen to prominence (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Parker, Fraser, Abeler-Dörner, & Bonsall, 2020). Such technologies have the potential to serve several public health purposes during epidemics and pandemics, including rapid contact tracing, enforcing quarantine and self-isolation orders, and monitoring and reporting use of public services in real time so users can avoid overcrowding the system, and have been employed across many countries worldwide (Berman, Carter, Garcia-Herranz, & Sekara, 2020). They also have the potential to widen existing disparities, for example via digital divides or by generating stigma and discrimination. The equitable design and use of digital surveillance technologies is thus essential during public health emergencies. In this paper, we identify what is thought to comprise equity and social justice in their design and implementation during infectious disease epidemics and pandemics.
Digital Surveillance and Contact Tracing
Digital contact tracing employs mobile phone or bluetooth signals for disease surveillance and can reduce contact transmission rates, as primary and secondary contacts of infected persons can be identified and requested to self-isolate/quarantine much faster (Wymant et al., 2021). Three broad approaches to digital contact tracing have been adopted globally. They fall along a spectrum of potential methods of digital contact tracing and reflect different approaches to preserving privacy rights versus promoting public health (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020).
During the first three months of the pandemic, some countries, including Israel, South Korea and China, adopted a “maximal” approach, incorporating mandatory, centralized data collection systems. Israel's government traced all citizens’ phone location data and used that data to identify with whom COVID-19 infected persons had been in contact. Close contacts of patients were placed in mandatory quarantine to stop further contagion (Amit et al., 2020). Similarly, South Korea opted to use geolocation data without seeking consent. It additionally publicly posted information on potential hotspots for transmission, informed by cellphone location data, credit card records, and video surveillance records of COVID-19 positive patients (Mello & Wang, 2020; Robinson-Greene, 2020). In China, people were required to download an app that used geolocation to track the whereabouts of everyone who had the app, including how close they had come to others who were confirmed COVID-19 positive. Each individual was assigned a QR code, depending on their risk level, which controlled whether they could travel with no restrictions, were required to self-quarantine, or were required to enter mandatory government quarantine (Robinson-Greene, 2020).
Australia and Singapore are examples of countries that adopted a "middle ground" approach, allowing contact tracing app users to turn over both proximity data and GPS location data (i.e., mobile phone location data) to public health authorities on a largely voluntary basis (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020). In Australia, individuals could download the COVIDSafe app and would be notified if they came into contact with a person infected with COVID-19 and advised to self-isolate and get tested. Health officials could only access app data if someone tested positive and agreed to the information in their phone being uploaded, and app data could only be used to help alert those who needed to quarantine or get tested (Australian Government Department of Health, 2021). In Singapore, the TraceTogether app is also voluntary to download but, unlike COVIDSafe, shares data with the government if the user becomes subject to contact tracing because of a positive COVID-19 diagnosis (Illmer, 2021; Johnson, 2020a; Mello & Wang, 2020). Despite assurances that the data collected would only be used for contact tracing during the pandemic, TraceTogether data have also been used for law enforcement, with police accessing them during criminal investigations. By April 2020, similar apps had been rolled out in nearly 30 countries and have continued to be developed and refined during the pandemic.
In contrast to "maximal" and "middle ground" approaches, the contact tracing framework developed jointly by Google and Apple comprised a "minimal" approach, with decentralized, privacy-preserving proximity tracking and contact notification. Individuals remained anonymous throughout the process: no location data were gathered, and no identifiable data were shared with anyone. Apps built on the framework were approved by a public health agency, and users were notified if they came into contact with a (not identified) person infected with COVID-19 (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020). The implications of being notified (or "pinged") varied from country to country and depended on the rules at the time.
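To make this decentralized model concrete, the sketch below illustrates, in simplified form, how privacy-preserving proximity notification of this kind can work. It is an illustration of the general technique only: the key sizes, rotation intervals, and function names are illustrative assumptions, not the Google-Apple specification.

```python
import hashlib
import hmac
import os

# Each phone keeps a random daily key that never leaves the device
# unless its owner tests positive and consents to publication.
def new_daily_key() -> bytes:
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth for one
    time interval. Observers cannot recover the daily key from it or link
    identifiers from different intervals to the same person."""
    return hmac.new(daily_key, f"interval-{interval}".encode(), hashlib.sha256).digest()[:16]

# Phones store the identifiers they hear nearby; nothing is uploaded.
heard_ids: set[bytes] = set()

def on_bluetooth_sighting(rid: bytes) -> None:
    heard_ids.add(rid)

def check_exposure(published_keys: list[bytes], intervals: range) -> bool:
    """After a diagnosed user consents to publishing their daily keys, every
    phone re-derives the identifiers those keys would have produced and
    checks for matches locally, so no location or identity is ever shared."""
    return any(rolling_id(key, t) in heard_ids
               for key in published_keys for t in intervals)

# Demo: Alice's phone hears one of Bob's identifiers; Bob later tests positive.
bob_key = new_daily_key()
on_bluetooth_sighting(rolling_id(bob_key, 42))
assert check_exposure([bob_key], range(144))  # 144 ten-minute intervals per day
```

Because matching happens on each user's own device, no central authority learns who met whom; only the daily keys of consenting, diagnosed users are ever published.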
Beyond contact tracing, some countries have used facial data and/or phones' geolocation data to enforce restrictions on movement for people infected with COVID-19 and their contacts. New Zealand, Thailand, Kenya, Taiwan and Singapore, for example, authorized law-enforcement authorities to monitor quarantine orders remotely (Ada Lovelace Institute, 2020; Mello & Wang, 2020), incorporating smartphone location tracking to detect and impose sanctions for quarantine violations (Amit et al., 2020; Cohen, Gostin, & Weitzner, 2020). China, Poland, and Russia went further, using facial recognition software to monitor compliance with quarantine orders (Mello & Wang, 2020).
During the COVID-19 pandemic, the use of QR codes for public health purposes has also risen steeply, particularly to record access to public spaces and services. In Australia and New Zealand, for example, registration using QR codes has been required at all eating, drinking and shopping venues. QR codes were also used for airport entry upon arrival when travelling interstate in Australia. In Taiwan, travelers scan a QR code using their smartphone, which leads to an online travel declaration form that asks for their travel history and flight information, symptoms of fever or respiratory infection, and contact information in Taiwan. On the basis of their health and travel information, travelers are either sent an entry pass by text, asked to undertake home quarantine for 14 days, or instructed to self-isolate at home for 14 days (Mello & Wang, 2020).
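As an illustration of the kind of triage flow just described, the sketch below maps a declaration to one of the three outcomes. The field names, risk list, and decision rules are illustrative assumptions for exposition, not a description of Taiwan's actual system.

```python
from dataclasses import dataclass

@dataclass
class TravelDeclaration:
    travel_history: list[str]  # places visited recently (hypothetical codes)
    has_symptoms: bool         # fever or respiratory symptoms reported
    contact_phone: str         # contact information in the destination country

HIGH_RISK_PLACES = {"region_a", "region_b"}  # hypothetical risk list

def triage(d: TravelDeclaration) -> str:
    """Map a traveler's declaration to one of the three outcomes described
    above; which condition triggers which outcome is assumed here."""
    if d.has_symptoms:
        return "self-isolate at home for 14 days"
    if any(place in HIGH_RISK_PLACES for place in d.travel_history):
        return "home quarantine for 14 days"
    return "entry pass sent by text"

arrival = TravelDeclaration(travel_history=["region_a"], has_symptoms=False,
                            contact_phone="+886 ...")
print(triage(arrival))  # -> home quarantine for 14 days
```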
Design and Implementation of Digital Surveillance Technologies: Equity and Social Justice
Multiple and overlapping dimensions of equity and social justice are described in the philosophical and ethics literatures, relating to power relations, recognition, inclusion, distribution, and rights to self-development and adequate levels of well-being. Accounts of structural justice highlight the importance of reducing unfair power relations such as subordination, exploitation, exclusion, and violence (Powers & Faden, 2019; Young, 1990). Unfair power dynamics create unequal conditions where some have to work much harder and be much luckier to have prospects for a decent life that others are socially positioned to experience very easily (Powers & Faden, 2019). Recognition-focused concepts of justice call for addressing misrecognition: devaluing social groups (disrespect, discrimination) and rendering their knowledge, needs, and perspectives invisible (silencing) (Fraser, 1997; Fricker, 2007; Santos, 2014; Young, 1990). Inclusion encompasses a right to shape a decision-making space, to be present or represented (in diversity and numbers), to raise voice, and to be heard (Cornwall, 2011; Crocker, 2008; Young, 2000). It highlights the importance of ensuring all affected, including those whose voices are often less heard, are included in deliberative democratic decision-making (Young, 1990). Inherent in existing accounts of deliberative democracy are norms such as inclusion, reciprocity, reasonableness, and transparency (Gutmann & Thompson, 2004; Young, 2000). Distributive accounts of equity and social justice address the importance of fairly distributing burdens and benefits and reducing unfair disparities in the distribution of resources (Macklin, 2005). Additional dimensions of equity and social justice address the importance of enabling self-development and human flourishing, including achieving an adequate level of health and wellbeing for all, with debate ongoing as to whether that comprises a basic, sufficient, or optimal level of wellbeing (Nussbaum, 2000; Powers & Faden, 2006; Venkatapuram, 2011; Young, 1990). Ensuring adequate health and wellbeing has both negative and positive ends (Powers & Faden, 2006). Negatively, it means no one is pushed below an adequate level of health and wellbeing and that disparities in health and wellbeing, particularly involving those below an adequate level, are not made worse. Positively, it entails bringing those below an adequate level of health and wellbeing up to such a level and reducing disparities in health and wellbeing (Powers & Faden, 2006).
Commitments to promoting social justice in pandemic contexts prompt careful consideration of the capacity of novel health technologies to both alleviate and exacerbate health and social inequities. There is growing awareness that digital health technologies pose important risks as well as opportunities (Shaw et al., 2020). These risks are particularly pronounced from an equity perspective, where certain communities might be systematically excluded from the benefits of technology. Digital divides occur when certain members of societies (e.g., the elderly and certain socioeconomic or cultural groups) do not have access to the same digital tools as the wider population. As a result, the benefits of many digital health technologies (contact tracing, electronic medical records, telehealth, mHealth) may be unequally distributed amongst patients, with greater benefits accruing to those who are already socially and economically advantaged. For instance, using electronic health records requires cognitive skills and health literacy that not all patients have, or have equally (Spriggs et al., 2012). Equitable implementation means employing strategies to ensure digital health technologies are accessible to such patients (Shaw et al., 2020).
Digital health technologies can even contribute to, and be part of, larger systems of institutional oppression. Technologies powered by bad data, bad algorithmic models, or both lead to "high-tech" discrimination: misclassifications, over-targeting, disqualifications, and flawed predictions that affect some groups, such as historically marginalized ones, more than others (Lupton, 2016). As Char, Shah, and Magnus (2018) note, "health care delivery already varies by race. An algorithm designed to predict outcomes from genetic findings will be biased if there have been few (or no) genetic studies in certain populations. For example, attempts to use data from the Framingham Heart Study to predict the risk of cardiovascular events in non-white populations have led to biased results, with both overestimations and underestimations of risk." Algorithms can misrepresent the health needs of racialized populations and lead to undue harm for them, widening existing health and wellbeing disparities (Shaw et al., 2020). Equitable design of artificial intelligence technologies that rely on data to train algorithms thus requires examining how existing datasets represent underserved groups (Shaw et al., 2020).
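As a minimal sketch of what such an examination might look like in practice, the code below compares each group's share of a training dataset against its share of the population the model is meant to serve. The column names, cohort, and population shares are hypothetical, and a real audit would go well beyond raw representation.

```python
import pandas as pd

def representation_audit(df: pd.DataFrame, group_col: str,
                         population_shares: dict[str, float]) -> pd.DataFrame:
    """Flag groups whose share of the training data falls well below their
    share of the target population; risk estimates for flagged groups are
    likely to be less reliable."""
    observed = df[group_col].value_counts(normalize=True)
    return pd.DataFrame([
        {"group": g,
         "share_in_data": round(float(observed.get(g, 0.0)), 3),
         "share_in_population": p,
         "flagged": float(observed.get(g, 0.0)) < 0.5 * p}
        for g, p in population_shares.items()
    ])

# Hypothetical cohort heavily skewed toward one group.
cohort = pd.DataFrame({"ancestry": ["european"] * 900 + ["african"] * 60 + ["asian"] * 40})
print(representation_audit(cohort, "ancestry",
                           {"european": 0.60, "african": 0.20, "asian": 0.20}))
```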
In the context of emerging digital surveillance and contact tracing formats, decisions made about collecting and sharing digital data, and about choosing not to do so, can affect equity and social justice. Decisions about whether technologies should share data, and how, can have profound beneficial and harmful effects in relation to the recognition and wellbeing dimensions of social justice, such as by making different groups visible and highlighting their needs, or potentially making them targets of discrimination. As an example of the latter, restaurants and venues linked to COVID-19 cases often faced abuse or boycott (Gibert et al., 2020). It is not inevitably the case that digital approaches are more problematic in equity terms than the available non-digital alternatives. The use and sharing of data from manual contact tracing mechanisms involving the collection of names, contact details, and movement information can also be invasive and raise concerns about the equitable sharing and use of data. However, the scale and pace at which digital surveillance mechanisms have been developed and deployed during the COVID-19 pandemic, and the widespread use of personal mobile phones for data capture, do raise important new questions about equitable access to digital surveillance technologies and the equitable use of resulting datasets, which affect the distributive and wellbeing dimensions of equity and social justice.
The Contribution of this Paper
This paper identifies what is thought to comprise equity and social justice in the design and implementation of digital surveillance technologies during epidemics and pandemics. As part of a broader literature review carried out in June and July 2020 on ethical data sharing in epidemics and pandemics, we systematically identified literature relating to the use of digital technologies to collect surveillance data during epidemics and pandemics. That literature was thematically analyzed for ethical norms and ethical concerns related to equity and social justice. In our analysis, each distinct way in which the literature said digital surveillance technologies should be designed or implemented to advance equity and social justice comprised a norm. This paper reports the findings of that literature review and analysis, including identified norms for equitably designing and implementing digital surveillance technologies and concerns related to equity and social justice that arise when such technologies are developed and implemented. We conclude by considering what dimensions of social justice the norms promote and whether identified concerns can be addressed by building the identified norms into technology design and implementation practice.
Methods
Scoping reviews seek to identify literature relevant to the research objective and may include a variety of article formats (Arksey & O'Malley, 2005; Armstrong, Hall, Doyle, & Waters, 2011). A scoping review was performed to gather literature on data sharing in infectious disease outbreaks, epidemics, and pandemics. Two searches were undertaken in June and July 2020. First, literature from the Epidemic Ethics Database was reviewed. Second, a formal literature review was performed to identify any additional articles beyond those in the Epidemic Ethics Database. From these searches, 53 documents that discussed digital surveillance technologies during COVID-19 were identified.
Epidemic Ethics Database
The Epidemic Ethics Network is a global community of bioethicists that aims to help coordinate and support real-time, contextual ethical decision-making in public health emergencies. Led by the World Health Organization, it has developed the Epidemic Ethics Database (https://epidemicethics.tghn.org/resources) to compile resources in relation to the ethical issues arising out of global health emergencies, with a current focus on the COVID-19 pandemic.
Three keyword searches of the Epidemic Ethics Database were performed: (1) data and sample sharing, (2) surveillance & apps & AI, and (3) global health justice. The selection of these terms reflected keyword options within the database itself and the overall study aim of identifying what is thought to comprise equity and social justice in data sharing during infectious disease epidemics and pandemics. The original searches were conducted on 2 June 2020 and repeated on 12 July 2020. No language or publication date limits were applied. In total, 197 articles were identified (Figure 1).
Figure 1.
PRISMA diagram
Formal Literature
To conduct the formal literature review, two categories of search terms were used: (1) data sharing and (2) infectious disease outbreaks/epidemics/pandemics. This search strategy was developed in consultation with an informatics expert at the University of Melbourne Brownless Biomedical Library. The data sharing search terms were initially drawn from those used in a prior systematic scoping review by Bull et al. (2015). The two categories of search terms were further developed for this study through an iterative process, where combinations of controlled vocabulary and key words were piloted in Ovid Medline. The following four databases were searched for relevant studies: Embase (OvidSP) [1974-present], Global Health (OvidSP) [1973-present], MEDLINE(R) (OvidSP) [1946-present], and Science Citation Index (Web of Science Core Collections, Thomson Reuters) [1945-present]. The searches were conducted on 2 July 2020. No language or publication date limits were applied. (The full search strategy is available in Supplemental File 1.) In total, 388 citations were identified in the formal literature after de-duplication between the four searches (Figure 1). References were imported into bibliographic software (Endnote X9).
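De-duplication was handled in the bibliographic software; purely for illustration, the sketch below shows the kind of logic involved, assuming a generic citation export with hypothetical doi, title, and year fields.

```python
def dedupe_citations(records: list[dict]) -> list[dict]:
    """Keep the first record per DOI, falling back to a normalized
    title+year key when no DOI is present."""
    seen, unique = set(), []
    for r in records:
        doi = (r.get("doi") or "").strip().lower()
        key = doi or (r.get("title", "").strip().lower(), r.get("year", ""))
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

exports = [
    {"doi": "10.1000/x1", "title": "Data sharing in outbreaks", "year": "2019"},
    {"doi": "10.1000/X1", "title": "Data sharing in outbreaks", "year": "2019"},  # duplicate
    {"doi": "", "title": "Digital surveillance ethics", "year": "2020"},
]
assert len(dedupe_citations(exports)) == 2
```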
Literature Screening
A matrix of inclusion and exclusion criteria was developed to inform screening (Figure 1). For the Epidemic Ethics Database, titles, abstracts and full-text were screened together because many articles did not have abstracts. Of the 197 articles, 149 were subsequently excluded: most did not meet any of the inclusion criteria, while others could not be found or accessed (n = 1), were multimedia (n = 2), or were not in English (n = 10). All titles and abstracts from the formal literature review were assessed by the first author. In total, 84 articles from the formal literature review were included (as part of a broader study) because they discussed data sharing in infectious disease epidemics. These 84 articles were retrieved and their full-text versions further assessed for eligibility; 79 were subsequently excluded because they did not discuss digital technologies in epidemics or pandemics. To ensure rigor, a sample of 10% of the full-text articles included in the study was co-reviewed by the last author. Ultimately, 53 articles (5 from the formal literature, 48 from the Epidemic Ethics Database) were identified by the scoping review as discussing digital surveillance technologies in COVID-19. The full list of included articles is provided in Supplemental File 2.
Thematic Analysis
The 53 full-text articles that met the inclusion criteria were thematically analyzed using the approach described by Braun and Clarke (2006). That approach involves the following main steps: (1) familiarization with the data, (2) generating a coding framework, (3) searching for categories and sub-categories, and (4) reviewing categories and sub-categories. A broad coding framework was developed relying on a priori categories that were selected based on the study's research aims. The two main categories were: (1) equity and social justice-related norms of digital technologies (how should digital data be shared to advance equity and social justice) and (2) equity and social justice-related concerns raised about digital data sharing. Subcategories within these two categories emerged from the dataset and were consistent with one or more of the aforementioned five dimensions of equity and social justice: power relations, recognition, inclusion, distribution, and rights to self-development and adequate levels of well-being. Norm sub-categories reflected where the authors of the identified literature used normative language to describe ways in which digital surveillance technologies should be designed or implemented to advance equity and social justice. The coding framework was developed by BP and refined following discussion with SB. Coding of full-text articles was performed by the first author using NVivo Version 12.
Results
Digital Technology Norms Related to Equity and Social Justice
Thirteen documents identified by the literature review made normative recommendations about how the design and implementation of digital technologies ought to be undertaken in the COVID-19 context. Of these, ten documents identified eleven norms related to social justice and equity in app design and implementation (see Table 1). Norms for the design of digital technologies addressed the importance of fair and participatory design processes, excluding, where necessary, stakeholders associated with human rights violations. They also highlighted the importance of ensuring that designs sought to ensure equal access and to minimize existing inequities. Norms for implementation echoed requirements for equal access, incorporating requirements for equal availability, inclusive approaches to support understanding and acceptability of digital technologies, and careful consideration of approaches to minimize harm to marginalized groups. These were complemented by norms for monitoring the social impact of interventions, including the distribution of the benefits and burdens of digital surveillance approaches. A final norm explicitly addressed the importance of addressing inequities in power and ensuring that technology companies did not have sole control over the implementation of apps. Many norms unrelated to equity and social justice were also suggested and discussed but are not reported here, including voluntariness, anonymity, security, being time-bound, and proportionality.
Table 1.
Equity-related norms for digital technologies’ design and implementation
| Norms for design | Definition | References |
|---|---|---|
| Fair participatory processes | Meaningful participation of relevant stakeholders, such as experts in public health, data and technology, the social sciences and humanities, and privacy; the public; civil liberties advocates and civil society organizations; and the most marginalized groups. Technology companies alone should not control apps' design. In terms of what meaningful engagement means, Kahn and the Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies (2020) suggest deliberative public engagement efforts should be undertaken and their outputs integrated into apps' design. The interests of less powerful participants should be taken into account. The Ada Lovelace Institute (2020) suggests people across society should be engaged as decision-makers. To build trust, public engagement should "talk directly to citizens (especially low-trust communities) through a (ideally national) conversation about what questions they need answered and which concerns must be allayed before they feel comfortable downloading the app." | Access Now, 2020; Ada Lovelace Institute, 2020; Dubov & Shoptaw, 2020; Human Rights Watch, 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Ranisch et al., 2020; Vaithianathan et al., 2020; WHO, 2020 |
| Value-based design | Robust public engagement activities should identify and incorporate, to the extent possible, a range of values into the design of the technology. These values may include privacy, but also efficiency, equity, liberty, autonomy, economic well-being, companionship, solidarity, or others. Technology design should reflect an appropriate balance and prioritization of identified values. | Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020 |
| Equal access | Apps should be designed to be easy to use and accessible to anyone, irrespective of the technology needed, their level of digital literacy, and type of phone. Potential exclusion of vulnerable groups is considered, including ensuring a digital contact tracing app is accessible for people with disabilities, neurodiverse people, and others who might have difficulty using digital technologies. | Ada Lovelace Institute, 2020; Morley et al., 2020; Ranisch et al., 2020 |
| Design to rectify or avoid exacerbating existing inequities | Design digital contact tracing apps to consider how the technology may affect different population groups (including children, women, people living with disabilities, displaced people, and other vulnerable groups within the population) in different ways. Risk of harms to marginalized groups is identified and mitigated. | Ada Lovelace Institute, 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Ranisch et al., 2020 |
| Human rights protection | Exclude companies with track records of violating or facilitating human rights violations from participating in public calls to provide technological solutions to epidemics or pandemics. | Access Now, 2020 |
| Norms for implementation | | |
| Equal availability | Ensure apps are free and distributed to anyone. | Morley et al., 2020; Ranisch et al., 2020 |
| Equal access | Ensure disparity-driven gaps in access are recognized and provisions are made to address them. | Ada Lovelace Institute, 2020; Dubov & Shoptaw, 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Ranisch et al., 2020 |
| Avoidance/prevention of harm to groups already considered marginalized | Consider the impacts of implementation on vulnerable groups and develop appropriate mitigation/prevention strategies. The risk of apps propagating existing patterns of disadvantage should be addressed, e.g. causing discrimination, punishment, or other rights abuses against racial minorities, people living in poverty, and other marginalized populations. Digital tools should not be used to further marginalize these groups, e.g. by using them for criminal prosecution or immigration enforcement. | Human Rights Watch, 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Parker et al., 2020; Ranisch et al., 2020 |
| Public engagement | Before implementing apps, government, public health, and digital technology leaders must engage effectively with the public, including groups considered marginalized, and other stakeholders to increase their understanding of the acceptability of digital contact tracing apps' design features and uses and to communicate the utility, importance, oversight, and limitations of relevant apps, including their implications for individuals' privacy and civil liberties. | Human Rights Watch, 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; WHO, 2020 |
| Address inequities in power | Ensure technology companies alone do not control the terms, conditions, or capabilities of digital contact tracing apps. | Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020 |
| Monitoring and evaluation to assess the social impact of app implementation, including whether benefits and burdens are equally distributed | Ensure that the benefits and burdens of digital contact tracing apps are distributed fairly. This entails measuring the social impact of apps on individuals and communities, especially those who are considered marginalized. Public engagement is an important tool for assessing apps' social impact. | Ada Lovelace Institute, 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Morley et al., 2020; Parker et al., 2020 |
Concerns About Digital Technologies Related to Equity and Social Justice
Six concerns relating to social justice and equity were discussed regarding sharing digital surveillance data: digital divides, stigma and discrimination, disparate risk of fear and harm, surveillance creep, unfair processes of design, and eroding democracy and public trust. The digital divide, stigma and discrimination, and disparate risk of harm are each linked to potentially widening existing disparities in wellbeing. Stigma and discrimination are also recognition-related injustices and disparate risk of harm raises concern about distributive injustice. Surveillance creep, unfair processes of design, and eroding democracy and public trust relate to the inclusion dimension of social justice.
Digital Divides
Concerns were expressed in the reviewed literature that the most vulnerable members of societies (e.g., the elderly and certain socioeconomic or cultural groups) may often not have access to the same digital tools as the wider population. This presents problems when population-level inferences are made from surveillance data generated by digital technologies primarily used by urban, younger, and higher socioeconomic groups (Chunara & Cook, 2020). Using such data could "skew predictions, diagnoses, risk scores, and decisions about where, or to whom, finite resources and care should be prioritized" (Bayram, Springer, Garvey, & Ozdemir, 2020). In the COVID-19 pandemic, specific concerns arose that assessments of the efficacy of social distancing interventions or measurements of community disease burden could not guide policy appropriately if certain populations were omitted from such analyses (Chunara & Cook, 2020). Instead, decisions could be taken that amplify existing inequities in the allocation of public goods, or favor certain kinds of lives more than others, exacerbating existing health inequities (Buzzell, 2020).
In terms of maximalist and middle-ground contact tracing apps, which share individuals' location and health data during epidemics and pandemics, existing disparities in digital literacy and smartphone ownership within and between countries could result in apps' use and benefits being unevenly distributed. In 2019, two-thirds of the world's population did not own a smartphone, many of whom live in low- and middle-income countries (Gasser, Ienca, Scheibner, Sleigh, & Vayena, 2020). Within countries, evidence has shown that individuals with lower levels of education, individuals living in rural areas, the elderly, minorities, the economically disadvantaged, the homeless, and individuals less fluent in a country's main language can be less likely to have basic digital skills or to own a smartphone (Ada Lovelace Institute, 2020; Dubov & Shoptaw, 2020; Gasser et al., 2020; Luciano, 2020). For example, in the European Union, only about 31% of people with low education levels or no education have at least basic digital skills, and 49% of those living in rural areas have basic digital skills compared with 63% in urban areas (Luciano, 2020). Concerns thus arose in the identified literature that contact tracing apps may widen existing health disparities because many of the groups/countries with less access to smartphones or digital skills already experience worse health than groups/countries with better access (Ada Lovelace Institute, 2020; Buzzell, 2020; Gasser et al., 2020; Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Luciano, 2020).
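A back-of-envelope illustration of why these gaps compound: a contact between two people is digitally traceable only if both carry the app, so within a group the share of traceable contacts falls roughly with the square of that group's adoption rate. The adoption figures below are hypothetical.

```python
# Hypothetical adoption rates by group; a contact is traceable only if
# both parties run the app, so within-group coverage is roughly p**2.
adoption = {"urban, younger": 0.60, "rural": 0.35, "elderly": 0.20}

for group, p in adoption.items():
    print(f"{group}: adoption {p:.0%} -> roughly {p * p:.0%} of contacts traceable")
# urban, younger: adoption 60% -> roughly 36% of contacts traceable
# rural: adoption 35% -> roughly 12% of contacts traceable
# elderly: adoption 20% -> roughly 4% of contacts traceable
```

On these illustrative numbers, a threefold gap in adoption becomes a ninefold gap in the protection the app can actually deliver.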
Stigma and Discrimination
Stigma and discrimination can result from biases within the digital technology itself and from the uses to which the data it collects are put (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Luciano, 2020; Morley, Cowls, Taddeo, & Floridi, 2020; Ranisch, 2020). Stigma and discrimination of individuals can occur when data collected by digital technologies are used to identify individuals as being infected with the disease driving the epidemic and/or as residing in an area designated high-risk (Berman et al., 2020). For example, in South Korea, the government tracked and publicly released detailed location data of people confirmed and suspected to be infected by COVID-19, including when they left for work, whether they wore masks in the subway, the name of the stations where they changed trains, and the names of the clinics where they were tested. As reported in the New York Times, these data were used to identify people by name and harass them (Access Now, 2020). Spatial and group stigmatization can occur when digital data are used to generate maps showing that certain businesses, neighborhoods, or postcodes are hotspots. If specific groups are known to live in those areas or own those businesses, they can be stigmatized as a result (Gibert et al., 2020). Where digital data are stratified by demographics like race, this also runs the risk of generating stigmatization and discrimination of particular groups (Gasser et al., 2020; Labrique, 2020). For example, in China, the government-supported contact tracing app designates threat levels based on where people are from or where they have recently visited. Concern was raised that the app's decision-making metrics may be discriminatory against certain regions or groups within China (Robinson-Greene, 2020).
Disparate Risk of Fear and Harm
Groups that are already marginalized by social norms and institutions may experience greater fear of harm from having their data collected (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020). "For many, institutionalized racism, massive income inequality, lack of legal support or protections, and violence at the hands of police, makes contact tracing measures frightening and dangerous" (Bush & Leins, 2020). In such contexts, a society's use of digital technologies has a disproportionate impact on the psychological well-being of members of marginalized groups. Digital technologies could also expose such groups to greater risk of harm or further marginalize them, for example where digital data are used for criminal prosecution or immigration enforcement (Dubov & Shoptaw, 2020).
Surveillance Creep
Surveillance creep is a related concern that digital technologies will be used for purposes unrelated to public health during and following the pandemic. Such purposes may not be made transparent to the public or may be added after people have already signed up for a contact tracing app (Berman et al., 2020; Gibert et al., 2020; Knight, 2020; Mello & Wang, 2020; Ranisch, 2020). The potential for government officials and private companies to take advantage of the emergency and put surveillance measures enacted in the name of fighting a pandemic to other uses is real. Since 9/11, information acquired via surveillance on national-security grounds has been used to prosecute drug crimes, food-stamp and mortgage fraud, and lying on bank statements. Conversations recorded by an Amazon Echo and heart-rate data tracked by a Fitbit have been used in criminal investigations (Giglio, 2020). In the COVID-19 pandemic, specific concerns have been raised that digital data could be used by law enforcement to track citizens by linking their data to other records, such as their use of the health system or consumer habits; to track the activities of political dissidents; to share data with the private sector for commercial purposes; and/or to monitor people when they are working from home (Sweeney, 2020; Volkin, 2020; Xafis, Schaefer, Labude, Zhu, & Hsu, 2020). This form of tracking could be used in ways that target certain groups or minorities more than others. A heightened risk of "snowballing into state-sponsored mass surveillance" was also flagged as a danger (Access Now, 2020).
Unfair Design Processes
During the COVID-19 pandemic, large technology companies, including Google and Apple, have become major players in disease management, with significant control over digital health technologies (Kahn & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies, 2020; Sweeney, 2020). Such companies have designed contact tracing apps for use by governments in the fight against coronavirus. Concerns have arisen that the design choices made when building an app are value-laden and companies are not the appropriate actors to decide which values and interests are favored. Calls have been made for technologies to be designed and evaluated by democratically elected governments and to ensure that technology companies do not “control the terms, conditions or capabilities” of digital contact tracing (Albergotti, 2020). To date, such apps have tended to favor privacy and users over maximizing public health and utility in their decision-making (Johnson, 2020b).
Who is included and excluded and how decision-making occurs (in public or confidentially, with or without debate and discussion) are related concerns about the overall fairness of the contact tracing app design process. For example, Mello and Wang (2020) highlighted the app design process in the United States as having several shortcomings:
A group of technology companies convened by the White House to discuss potential uses of technology to combat COVID-19 has no evident agenda, public or stakeholder group representation, or set of guiding principles. Ethicists and legal experts do not appear to be involved. No processes (for example, adaptation of the notice-and-comment period used for administrative rule-making) have been created for the public to give input. (p. 954)
Lack of transparency, public discussion, and accountable democratic political debate were also flagged as unfair features of contact tracing app design processes (Sweeney, 2020).
Eroding Democracy and Public Trust
Concerns were voiced that digital technologies may weaken public trust and democracy if they threaten privacy, are poorly designed, and/or are badly communicated to the public. Where digital technologies encroach upon people's privacy too much, they may leave “people feeling as if they’ve lost control not just over their government, but over their personal life—and the ability to think, act, and communicate without the expectation that someone is watching or listening is fundamental to a thriving democracy” (Giglio, 2020).
Discussion
COVID-19 has exacerbated existing inequalities, leading to calls for public health responses to the pandemic to focus on equity and social justice (Jensen, Kelly, & Avendano, 2021; Kelley et al., 2020; Paremoer, Nandi, Serag, & Baum, 2021; Perry, Aronson, & Pescosolido, 2021; The Lancet Public Health, 2021). Contact tracing and digital surveillance have critical roles to play in informing public health responses and should be used in ways that promote collective interests and that are fair and equitable. This study analyzed literature on digital technologies used to collect surveillance data during epidemics and pandemics to investigate what norms have been identified for fairly and equitably designing and implementing such technologies and what concerns related to social justice and equity have been reported. This discussion considers what dimensions of social justice the norms promote and whether the identified concerns can be addressed by building the identified norms into technology design and implementation practice.
The norms identified in this study advance each of the five dimensions of equity and social justice outlined in the introduction: power relations, recognition, inclusion, distribution, and rights to self-development and adequate levels of well-being. Norms calling for equal access, availability, and monitoring and evaluation speak to ensuring a fair distribution of digital technologies' burdens and benefits. To appropriately promote social justice and equity in epidemics and pandemics, care is needed to develop evaluation strategies that are sufficiently granular to identify any differential impacts on specific groups within surveilled populations. It is also important that such technologies are accessible within populations, including to socially marginalized groups, to promote fair distribution of the benefits of such technologies' use. Human rights norms seek to protect the wellbeing of socially marginalized groups by preventing companies with track records of violating or facilitating human rights violations from designing and implementing digital technologies. Norms addressing power seek to curb the power of technology companies in digital technologies' design and implementation and to prevent relations of subordination (where a privileged few determine rules and make decisions for others) from being created or exacerbated in relation to digital technologies (Powers & Faden, 2019).
Norms calling for value-based design and for the avoidance of harm to marginalized groups help promote the two dimensions of recognition: respect and visibility. These norms highlight the importance of designing and implementing digital technologies in ways that avoid generating stigma and discrimination. Value-based design entails robust public engagement activities to make visible and incorporate, to the extent possible, a range of values from different groups into the design of the technology. Finally, norms for fair design processes promote inclusion. They require the meaningful participation of a wide range of stakeholders through deliberative decision-making processes. They call for the participation of members of the public, including socially marginalized groups, in decision-making about digital technologies' design.
The six concerns described in the paper spanned several dimensions of equity and social justice. Each of the six concerns could potentially be addressed by building the identified norms into app design and implementation practice. Where norms for equal access to digital technologies and for monitoring and evaluating the distribution of their implementation's burdens and benefits guide technologies’ design and use, they can help minimize the impact of the digital divide and thus address concerns about it widening disparities. Building in norms for avoiding harming marginalized groups and promoting human rights protection during technologies’ design and implementation can help mitigate the risk of stigma and discrimination. Implementing norms that limit private companies’ power and ensure fair design processes can mitigate concerns raised about procedurally unfair design and implementation of digital technologies. The risk of surveillance creep may be reduced where digital technologies’ design and implementation is guided by norms that limit private companies’ power and public engagement norms that demand transparency regarding how technologies and people's data will be used. Concerns about eroding democracy arise if digital technologies threaten privacy, are poorly designed, and/or are badly communicated to the public. Where norms of fair design processes, value-based design, and public engagement effectively guide digital technologies’ design and implementation, they can reduce such threats.
Best Practices
Given the unprecedented scale and pace of development and implementation of digital surveillance technologies in the COVID-19 pandemic, designing and implementing digital technologies in ways that are equitable is critical now and in future epidemics and pandemics. We recommend that those involved in designing and implementing such technologies seek to uphold the norms identified in this study in their practice. This includes researchers, other developers of digital technologies such as Google and Apple, and the public health officials who deploy them in contact tracing. The eleven norms (Table 1) provide a valuable guide to the equitable design and implementation of digital technologies during infectious disease epidemics and pandemics.
Research Agenda
Although the concerns described in the paper could be addressed by building the identified norms into app design and implementation practice, some norms may still require further specification. In particular, further consideration of norms of fair design process, value-based design, and design to rectify inequities has the potential to help digital technologies in epidemics and pandemics better advance social justice and equity. Norms of fair design process could provide greater direction on several aspects of meaningful inclusion: what design features the public should have a say in, who should lead engagement processes, what stages of the design process the public should be involved in, where public engagement should happen, and what ground rules should govern the process (Pratt, 2019a, 2019b).
In value-based design, robust public engagement activities could identify and incorporate, to the extent possible, a range of values into the design of the technology. Yet there are likely to be competing values and priorities, as evidenced, for example, in other contested policy responses to pandemics (wearing masks, lockdowns, etc.). It is thus important to further consider how public engagement should be conducted and inclusive approaches implemented.
When designing to rectify inequities, careful consideration is needed of what supports may be required for health needs identified by digital surveillance technologies, especially for those who are socially marginalized or vulnerable. These may include, for example, features that assist individuals to self-isolate and access COVID-19 testing. Diversity in the teams who develop digital technologies can also promote epistemic justice by bringing different perspectives and ways of knowing to the table, which in turn can help mitigate the risk of coding bias. Future research can specify norms to avoid coding bias and further investigate whether any additional norms are needed for digital technologies to help advance the various dimensions of social justice and equity during epidemics and pandemics. Considering that epidemic and pandemic situations require data for prompt action and response, it would also be helpful for future studies to delineate new models of operationalizing responsible and transparent digital technology governance, as well as oversight mechanisms, in order to adhere to the norms of design and implementation outlined in the study.
This review found a substantially more comprehensive consideration of equity and social justice norms in the literature on digital surveillance technologies than in the literature on sharing research data in epidemics and pandemics (Pratt & Bull, 2021). For instance, norms of reducing inequities and inclusion have been described for digital surveillance technologies but not for research data sharing. Further research should explore how the equity-related norms for digital surveillance technologies outlined above can inform the identification and articulation of equity-related norms for research data sharing during epidemics and pandemics.
Educational Implications
The findings of this study are pertinent for education programs that train technology developers, digital health professionals, and/or public health professionals and researchers. Bringing together public health, computer science, and digital information technologies, universities are now offering degree programs in digital health. It is important to ensure that these degree programs and other digital health training programs promote norms for the equitable design and implementation of digital surveillance technologies during epidemics and pandemics. Education programs should ensure students understand what the norms mean and how they should be upheld and applied, in addition to being able to identify ethical concerns that may arise in practice.
Sources of Support
BP was supported by a University of Melbourne R Douglas Wright Research Fellowship. This research was partially supported by a Wellcome Trust Strategic Award (096527) and a Department for International Development/Wellcome Epidemic Preparedness - Coronavirus Grant (221559/Z/20/Z). For the purpose of open access, the author has applied a CC BY public copyright licence to any Author Accepted Manuscript version arising from this submission.
Author Biographies
Bridget Pratt is an ethics researcher and the Mater Lecturer in Healthcare Ethics at the Queensland Bioethics Centre at Australian Catholic University. Her research interests include the ethics of global health research and health systems research, with a focus on social and global justice.
Michael Parker is Professor of Bioethics, and Director of the Ethox Centre and the Wellcome Centre for Ethics and Humanities at the University of Oxford. His research interests include ethical questions relating to: collective action, individual responsibility, and the common good in infectious disease response, research, and preparedness; research in global health emergencies; global health justice; conceptions of consent, privacy and confidentiality in data-driven health systems; clinical, research, and public health uses of genomics and genetics; and the roles of commercial and industry partnerships in healthcare innovation.
Susan Bull is Associate Professor in Bioethics at the Ethox Centre and the Wellcome Centre for Ethics and Humanities at the University of Oxford, and Associate Professor in Medical Ethics at the Faculty of Medical and Health Sciences at the University of Auckland. Susan's research focuses on global health ethics, including ethical issues associated with data sharing, genomic research, seeking consent to research, ethical review and oversight of research and ethical issues associated with infectious disease outbreaks, epidemics and pandemics.
Footnotes
Author Contributions: BP co-conceived of the study described in this paper and was primarily responsible for conducting the literature review, screening, and analyses. She was responsible for writing the first draft of the paper and revising the work critically for intellectual content. She approved the final version submitted to JERHRE. MP revised the paper critically for intellectual content and gave final approval of the version to be published. SB co-conceived the study described in this paper and contributed to both data collection and analysis. She contributed to drafting this paper, revised the paper critically for intellectual content, and gave final approval of the version to be published.
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: BP was supported by a University of Melbourne R Douglas Wright Research Fellowship. SB was partially supported by a Wellcome Trust Strategic Award (096527) and a Department for International Development/Wellcome Epidemic Preparedness - Coronavirus Grant (221559/Z/20/Z).
ORCID iDs: Bridget Pratt https://orcid.org/0000-0002-4934-3560
Susan Bull https://orcid.org/0000-0002-9730-091X
Supplemental Material: Supplemental material for this article is available online.
References
- Access Now (2020). Recommendations on Privacy and Data Protection in The Fight Against COVID-19. Retrieved from accessnow.org.
- Ada Lovelace Institute (2020). Exit through the App Store? A rapid evidence review on the technical considerations and societal implications of using technology to transition from the COVID-19 crisis.
- Albergotti R. (2020). European government officials call for tech companies to loosen grip on contact-tracing technology. The Washington Post. Retrieved fromhttps://www.washingtonpost.com/technology/2020/05/29/apple-google-contact-tracing/
- Amit M., Kimhi H., Bader T., Chen J., Glassberg E., Benov A. (2020). Mass-surveillance technologies to fight coronavirus spread: The case of Israel. Nature Medicine, 26, 1167–1169. https://doi.org/https://doi.org/10.1038/s41591-020-0927-z [DOI] [PubMed] [Google Scholar]
- Arksey H., O'Malley L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19–32. 10.1080/1364557032000119616 [DOI] [Google Scholar]
- Armstrong R., Hall B. J., Doyle J., Waters E. (2011). ‘Scoping the scope’ of a cochrane review. Journal of Public Health, 33(1), 147–150. 10.1093/pubmed/fdr015%J.Journal of Public Health [DOI] [PubMed] [Google Scholar]
- Australian Government Department of Health (2021). COVIDSafe app. Retrieved fromhttps://www.health.gov.au/resources/apps-and-tools/covidsafe-app
- Bayram M., Springer S., Garvey C. K., Ozdemir V. (2020). COVID-19 digital health innovation policy: A portal to alternative futures in the making. OMICS, 24(8), 460–469. 10.1089/omi.2020.0089 [DOI] [PubMed] [Google Scholar]
- Berman G., Carter K., Garcia-Herranz M., Sekara V. (2020). Digital contact tracing and surveillance during COVID-19: General and Child-specific Ethical Issues. UNICEF Office of Research. Innocenti.
- Braun V., Clarke V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. 10.1191/1478088706qp063oa [DOI] [Google Scholar]
- Bull S., Roberts N., Parker M. (2015). Views of ethical best practices in sharing individual-level data from medical and public health research: A systematic scoping review. Journal of Empirical Research on Human Research Ethics, 10(3), 225–238. 10.1177/1556264615594767 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bush G., Leins K. (2020). When tools for a health emergency become tools of oppression. Pursuit. Retrieved fromhttps://pursuit.unimelb.edu.au/articles/when-tools-for-a-health-emergency-become-tools-of-oppression
- Buzzell A. (2020). COVID-19 Digital Contact Tracing - Launch it fast and debug it live. What could go wrong? Retrieved fromhttps://ethicalintelligence.co/covid19-dct-wcgw.html
- Char D. S., Shah N. H., Magnus D. (2018). Implementing machine learning in health care - addressing ethical challenges. New England Journal of Medicine, 378(11), 981–983. 10.1056/NEJMp1714229 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chunara R., Cook S. H. (2020). Using digital data to protect and promote the most vulnerable in the fight against COVID-19. Frontiers in Public Health, 8, 296. 10.3389/fpubh.2020.00296 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cohen I. G., Gostin L. O., Weitzner D. J. (2020). Digital smartphone tracking for COVID-19: public health and civil liberties in tension. JAMA, 323(23), 2371–2372. 10.1001/jama.2020.8570 [DOI] [PubMed] [Google Scholar]
- Cornwall A. (2011). Whose voices? Whose choices? Reflections on gender and participatory development. In Cornwall A. (Ed.), The participation reader (pp. 203–223). Zed Books. [Google Scholar]
- Crocker D. A. (2008). Ethics of global development: agency, capability, and deliberative democracy. Cambridge University Press. [Google Scholar]
- Dubov A., Shoptaw S. (2020). The value and ethics of using technology to contain the COVID-19 epidemic. American Journal of Bioethics, 20(7), W7–W11. 10.1080/15265161.2020.1764136
- Floridi L. (2020). Mind the app: Considerations on the ethical risks of COVID-19 apps. Philosophy & Technology, 33, 167–172. 10.1007/s13347-020-00408-5
- Fraser N. (1997). Justice interruptus: Critical reflections on the “postsocialist” condition. Routledge.
- Fricker M. (2007). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.
- Gasser U., Ienca M., Scheibner J., Sleigh J., Vayena E. (2020). Digital tools against COVID-19: Taxonomy, ethical challenges, and navigation aid. Lancet Digital Health, 2, e425–e434. 10.1016/S2589-7500(20)30137-0
- Gibert M., Chicoisne G., Chung R., Dietsch P., Gambs S., Maclure J., Martin D., Tappolet C., Weinstock D. (2020). Ethical issues of pandemic applications. Retrieved from http://www.lecre.umontreal.ca/ethical-issues-of-pandemic-applications/
- Giglio M. (2020). Would you sacrifice your privacy to get out of quarantine? The Atlantic. Retrieved from https://www.theatlantic.com/politics/archive/2020/04/coronavirus-pandemic-privacy-civil-liberties-911/609172/
- Gutmann A., Thompson D. (2004). Why deliberative democracy? Princeton University Press.
- Human Rights Watch. (2020). Joint civil society statement: States use of digital surveillance technologies to fight pandemic must respect human rights. https://www.hrw.org/news/2020/04/02/joint-civil-society-statement-states-use-digital-surveillance-technologies-fight Accessed 2 Nov 2021.
- Illmer A. (2021). Singapore reveals Covid privacy data available to police. BBC News. Retrieved from https://www.bbc.co.uk/news/world-asia-55541001
- Jensen N., Kelly A. H., Avendano M. (2021). The COVID-19 pandemic underscores the need for an equity-focused global health agenda. Humanities and Social Sciences Communications, 8(1), 15. 10.1057/s41599-020-00700-x
- Johnson B. (2020a). The Covid Tracing Tracker: What’s happening in coronavirus apps around the world. MIT Technology Review. Retrieved from https://www.technologyreview.com/2020/12/16/1014878/covid-tracing-tracker
- Johnson B. (2020b). The US’s draft law on contact tracing apps is a step behind Apple and Google. MIT Technology Review. Retrieved from https://www.technologyreview.com/2020/06/02/1002491/us-covid-19-contact-tracing-privacy-law-apple-google/
- Kahn J., & Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies. (2020). Digital contact tracing for pandemic response: Ethics and governance guidance. Johns Hopkins University Press.
- Kelley M., Ferrand R. A., Muraya K., Chigudu S., Molyneux S., Pai M., Barasa E. (2020). An appeal for practical social justice in the COVID-19 global response in low-income and middle-income countries. The Lancet Global Health, 8(7), e888–e889. 10.1016/S2214-109X(20)30249-7
- Knight W. (2020). The value and ethics of using phone data to monitor Covid-19. WIRED. Retrieved from https://www.wired.com/story/value-ethics-using-phone-data-monitor-covid-19/
- Labrique A. B. (2020). It's foolish to worry about privacy when data can help fight coronavirus. CNN Business. Retrieved from https://edition.cnn.com/2020/04/21/perspectives/data-privacy-coronavirus/index.html
- Lupton D. (2016). Digital health technologies and digital data: New ways of monitoring, measuring and commodifying human bodies. In Olleros X., Zhegu M. (Eds.), Research handbook on digital transformations (pp. 85–102). Edward Elgar.
- Macklin R. (2005). Double standards in medical research in developing countries. Cambridge University Press.
- Mello M. M., Wang C. J. (2020). Ethics and governance for digital disease surveillance. Science, 368(6494), 951–954. 10.1126/science.abb9045
- Morley J., Cowls J., Taddeo M., Floridi L. (2020). Ethical guidelines for COVID-19 tracing apps. Nature, 582(7810), 29–31. 10.1038/d41586-020-01578-0
- Nussbaum M. (2000). Women and human development: The capabilities approach. Cambridge University Press.
- Paremoer L., Nandi S., Serag H., Baum F. (2021). COVID-19 pandemic and the social determinants of health. BMJ, 372, n129. 10.1136/bmj.n129
- Parker M. J., Fraser C., Abeler-Dörner L., Bonsall D. (2020). Ethics of instantaneous contact tracing using mobile phone apps in the control of the COVID-19 pandemic. Journal of Medical Ethics, 46(7), 427. 10.1136/medethics-2020-106314
- Perry B. L., Aronson B., Pescosolido B. A. (2021). Pandemic precarity: COVID-19 is exposing and exacerbating inequalities in the American heartland. Proceedings of the National Academy of Sciences, 118(8), e2020685118. 10.1073/pnas.2020685118
- Powers M., Faden R. (2006). Social justice: The moral foundations of public health and health policy. Oxford University Press.
- Powers M., Faden R. (2019). Structural injustice: Power, advantage, and human rights. Oxford University Press.
- Pratt B. (2019a). Constructing citizen engagement in health research priority-setting to attend to dynamics of power and difference. Developing World Bioethics, 19(1), 45–60. 10.1111/dewb.12197
- Pratt B. (2019b). Towards inclusive priority-setting for global health research projects: Recommendations for sharing power with communities. Health Policy and Planning, 34(5), 346–357. 10.1093/heapol/czz041
- Pratt B., Bull S. (2021). Equitable data sharing in epidemics and pandemics. BMC Medical Ethics, 22(1), 136. 10.1186/s12910-021-00701-8
- Ranisch R., Nijsingh N., Ballantyne A., Friedrich O., Hendl T., Hurst S., Marckmann G., Munthe C., Wild V. (2020). Ethics of digital contact tracing apps for the COVID-19 pandemic response. 10.13140/RG.2.2.23149.00485
- Robinson-Greene R. (2020). The quandary of contact tracing tech. Retrieved from https://www.prindlepost.org/2020/05/the-quandary-of-contact-tracing-tech/
- Santos B. (2014). Epistemologies of the South: Justice against epistemicide. Paradigm.
- Shaw J., Chandra S., Gordon D., Bickford J., Fujioka J., Yang R., Griffith J., Gibson J., Bhatia S. (2020). Digital health technologies for more equitable health systems: A discussion paper. Joint Centre for Bioethics at the University of Toronto.
- Spriggs M., Arnold M. V., Pearce C. M., Fry C. (2012). Ethical questions must be considered for electronic health records. Journal of Medical Ethics, 38, 535–539. 10.1136/medethics-2012-101003
- Sweeney Y. (2020). Tracking the debate on COVID-19 surveillance tools. Nature Machine Intelligence, 2(6), 301–304. 10.1038/s42256-020-0194-1
- The Lancet Public Health. (2021). COVID-19 - break the cycle of inequality. The Lancet Public Health, 6(2), e82. 10.1016/S2468-2667(21)00011-6
- Vaithianathan R., Ryan R., Anchugina N., Selvey L., Dare T., Brown A. (2020). Digital contact tracing for COVID-19: A primer for policymakers. Centre for Social Data Analytics, Auckland University of Technology & The University of Queensland.
- Venkatapuram S. (2011). Health justice: An argument from the capabilities approach. Polity Press.
- Volkin S. (2020). Digital contact tracing poses ethical challenges. Johns Hopkins University Hub. Retrieved from https://hub.jhu.edu/2020/05/26/digital-contact-tracing-ethics/
- World Health Organization. (2020). Ethical considerations to guide the use of digital proximity tracking technologies for COVID-19 contact tracing: Interim guidance. https://www.who.int/publications/i/item/WHO-2019-nCoV-Ethics_Contact_tracing_apps-2020.1 Accessed 2 Nov 2021.
- Wymant C., Ferretti L., Tsallis D., Charalambides M., Abeler-Dörner L., Bonsall D., Fraser C. (2021). The epidemiological impact of the NHS COVID-19 app. Nature, 594(7863), 408–412. 10.1038/s41586-021-03606-z
- Xafis V., Schaefer G. O., Labude M. K., Zhu Y., Hsu L. Y. (2020). The perfect moral storm: Diverse ethical considerations in the COVID-19 pandemic. Asian Bioethics Review, 12(2), 65–83. 10.1007/s41649-020-00125-3
- Young I. M. (1990). Justice and the politics of difference. Princeton University Press.
- Young I. M. (2000). Inclusion and democracy. Oxford University Press.
Supplementary Materials
Supplemental material, sj-docx-1-jre-10.1177_15562646221118127 for Equitable Design and Use of Digital Surveillance Technologies During COVID-19: Norms and Concerns by Bridget Pratt, Michael Parker and Susan Bull in Journal of Empirical Research on Human Research Ethics
Supplemental material, sj-docx-2-jre-10.1177_15562646221118127 for Equitable Design and Use of Digital Surveillance Technologies During COVID-19: Norms and Concerns by Bridget Pratt, Michael Parker and Susan Bull in Journal of Empirical Research on Human Research Ethics

