Abstract
The COVID-19 pandemic witnessed a surge in the use of health data to combat the public health threat. As a result, digital technologies for epidemic surveillance showed great potential to collect vast volumes of data and thereby respond more effectively to healthcare challenges. However, the deployment of these technologies raised legitimate concerns over risks to individual privacy. While the ethical and governance debate focused primarily on these concerns, other relevant issues remained in the shadows. Leveraging examples from the COVID-19 pandemic, this perspective article investigates these overlooked issues and their ethical implications. Accordingly, we explore the problem of the digital divide, the role played by tech companies in the public health domain and their power dynamics with the government and public research sector, and the re-use of personal data, especially in the absence of adequate public involvement. Even if individual privacy is ensured, failure to properly engage with these other issues will result in digital epidemiology tools that undermine equity, fairness, public trust, just distribution of benefits, autonomy, and minimization of group harm. By contrast, a better understanding of these issues, a broader ethical and data governance approach, and meaningful public engagement will encourage adoption of these technologies and the use of personal data for public health research, thus increasing their power to tackle epidemics.
Keywords: Digital epidemiology, Big data, COVID-19, Ethics, Public engagement, Privacy
1. Introduction: COVID-19 as the first digital pandemic
The SARS-CoV-2 (COVID-19) pandemic reinforced the role of data as an indispensable resource for fighting public health threats. For the first time in the history of epidemiology, researchers had real-time access to large volumes of health data (Johnson, 2020). Health authorities worldwide have relied on information from digital diagnostics, computer vision tools (such as temperature-sensing cameras), social media-based epidemiological surveillance strategies, and online searches, among other sources. When combined with powerful artificial intelligence (AI) algorithms and machine learning (ML)-based computational models, these datasets have provided valuable insights into regional infection rates and the evolution of the epidemic. Hence, a change occurred in the approach to epidemic response during COVID-19. We shifted from a reactive model, chasing pandemic developments and attempting to mitigate the consequences, to a dynamic one that anticipates the next steps and responds to the epidemic in real time. Such a shift had the potential to inform policymakers regarding timely responses to the health crisis (e.g., concerning which mitigation measures to adopt).
Among other examples, the successful case of the Global.health data repository exemplifies the power of health data to curb COVID-19 (Maxmen, 2021). During 2020 and 2021, this platform stored not aggregate data but rather anonymized individual-level records – date of positive infection test, coronavirus variant, disease symptoms, hospitalization, travel history – from over 150 countries, in a single open-access database. Epidemiologists worldwide had the opportunity to access this pool of highly granular data, comparing findings across projects and more accurately hypothesizing how the virus was spreading (Kraemer et al. 2021). This success is largely due to the involvement of the big tech industry (specifically Google), which invested human and financial resources to curate and standardize the various datasets, making them interoperable.
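To illustrate why curated, individual-level (line-list) data are more useful to epidemiologists than pre-aggregated counts, the sketch below shows a hypothetical record structure loosely inspired by the fields described above. The field names, types, and the toy cross-country comparison are our own assumptions for illustration, not the actual Global.health schema or tooling.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List, Optional, Tuple

# Hypothetical line-list record, loosely inspired by the fields described
# in the text; NOT the actual Global.health schema.
@dataclass
class CaseRecord:
    confirmation_date: date                 # date of positive test
    country: str                            # reporting country, e.g. "CH"
    variant: Optional[str] = None           # e.g. "B.1.1.7", if sequenced
    symptoms: List[str] = field(default_factory=list)
    hospitalized: Optional[bool] = None
    travel_history: List[str] = field(default_factory=list)  # countries visited

def cases_by_variant(records: List[CaseRecord]) -> Dict[Tuple[str, str], int]:
    """Toy example of the kind of cross-country, per-variant comparison that
    individual-level (rather than pre-aggregated) data makes possible."""
    counts: Dict[Tuple[str, str], int] = {}
    for r in records:
        key = (r.country, r.variant or "unknown")
        counts[key] = counts.get(key, 0) + 1
    return counts

if __name__ == "__main__":
    demo = [
        CaseRecord(date(2021, 3, 1), "CH", "B.1.1.7", ["fever"], True, ["IT"]),
        CaseRecord(date(2021, 3, 2), "DE", None, ["cough"], False, []),
    ]
    print(cases_by_variant(demo))
```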
However, collaboration with such high-profile private partners is not always a viable option in biomedical and epidemiological research. Furthermore, the COVID-19 pandemic shed light on other issues that can undermine access to and use of health data in such contexts. Among them is the scarce availability of big data repositories at the level of national research institutes and ministries of health. Even where they exist, the data are often not sufficiently complete, up-to-date, or granular (Oderkirk, 2021). Lack of appropriate infrastructure and technology to share these data, especially across national borders, presents a further challenge. Data governance regulations vary greatly across jurisdictions, interfering with access to and exchange of data between countries for research purposes (OECD, 2021). During COVID-19, researchers in Europe may have favored a risk-averse approach to data sharing to avoid violating the General Data Protection Regulation (GDPR), which sets demanding standards in order to safeguard individual autonomy and privacy (McLennan et al., 2020). Consequently, there were calls to strengthen the open data science approach, to meet the rising demand for health data while ensuring safe data use (Gardner et al. 2021).
To address the hunger for data prompted by the race against COVID-19, a variety of digital technologies were leveraged for epidemic surveillance (Budd et al. 2020), including direct-to-consumer tools (mobile apps, social media, online search engines, wearable trackers), as well as computer vision devices and infrared cameras (Davis and Matsoso, 2020). While the discipline of digital epidemiology was established long before the COVID-19 pandemic (Salathe et al. 2012), over the past three years we have witnessed its full potential to harvest, analyze, and interpret data that were not originally collected for public health purposes. Notably, these data helped to detect COVID-19 infection and related symptoms at an early stage, monitor compliance with social distancing and quarantine obligations, track infected contacts, and provide insights into citizen attitudes about vaccination campaigns (Whitelaw et al., 2020, Mahmood et al., 2020). As epidemic surveillance went digital, actors outside the healthcare environment, such as tech giants and telecommunication companies, emerged as new stakeholders in the data ecosystem (Robinson, 2022). However, such private sector stakeholders often hold interests that diverge from those of national governments, health agencies, or biomedical researchers, and apply different ethical standards for data management and use (Thomason, 2021). This increasingly complex ecosystem of stakeholders, data types, interests, and standards brought to the forefront an issue that, though historically discussed in epidemiological surveillance (Mariner, 2007), received amplified attention during the COVID-19 pandemic: the problem of privacy.
2. Privacy in the spotlight
The COVID-19 pandemic affirmed the paramount value of privacy in the age of digital surveillance, particularly regarding the debate on the development and adoption of digital contact tracing. At a time when vaccines were not yet available, ministries of health and policymakers around the world endorsed the use of these technologies, and thus the processing of personal data, to contain the spread of disease (European Commission, 2020, Sust et al., 2020). Some Asian countries mandated that their citizens download mobile applications that relied on GPS or geolocation from cellular towers, storing the data in centralized government archives (Blasimme, Ferretti, and Vayena, 2021). The public health emergency seemed to justify such intrusive intervention by appealing to exceptional reasons of public interest and security.
On the other hand, the adoption of digital contact tracing in Europe (and other Western countries) was not without friction. Beyond questions and concerns about the reliability of the technology was debate over the risk of harm to citizens through privacy violations. Some scholars have noted that data on one's health status could be re-used by third parties to discriminate against individuals and restrict their rights (including the right to free movement, to study, or to work) (Gasser et al. 2020). Others have pointed out that institutional and government access to personal data (e.g., geospatial data) could inhibit the exercise of basic freedoms, if individuals feel watched as to what they do or with whom they spend time (Gasser et al. 2020).
Because of these potential risks, and despite the urgency of finding quick and effective solutions to curb the pandemic, European advisory committees and governing bodies emphasized the need to safeguard citizens' privacy and data. This precautionary approach aligned with GDPR provisions, as well as with the opinion of the European Data Protection Board (eHealth Network, 2020, EDPB, 2020b). European regulators and policymakers recommended minimal processing of personal data and adherence to technical precautions, to avoid data leakage and cyber-attacks (EDPB, 2020a). As a result, many European countries opted for privacy-preserving systems based on voluntariness, transparency, exchange of unidentifiable Bluetooth data, and decentralized data storage, built on the application programming interface (API) created by Google and Apple.
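The sketch below conveys the decentralized principle underlying this design: devices broadcast short-lived random identifiers, record the identifiers they overhear locally, and match them on the device against identifiers published by confirmed cases. This is a deliberately simplified illustration of the idea, assuming plain random tokens; the actual Google/Apple exposure notification protocol derives identifiers from cryptographic keys and includes many additional safeguards.

```python
import secrets
from typing import List, Set

class Phone:
    """Simplified sketch of decentralized exposure notification:
    rotating random identifiers, contacts stored only on the device,
    and on-device matching against published identifiers of confirmed cases.
    Not the actual Google/Apple protocol."""

    def __init__(self) -> None:
        self._own_ids: List[bytes] = []      # identifiers this phone has broadcast
        self._heard_ids: Set[bytes] = set()  # identifiers overheard via Bluetooth

    def broadcast_id(self) -> bytes:
        rid = secrets.token_bytes(16)        # fresh, unlinkable identifier
        self._own_ids.append(rid)
        return rid

    def record_contact(self, rid: bytes) -> None:
        self._heard_ids.add(rid)             # stored locally, never uploaded

    def ids_to_publish_if_positive(self) -> List[bytes]:
        return list(self._own_ids)           # only the case's own identifiers leave the device

    def check_exposure(self, published: List[bytes]) -> bool:
        return any(rid in self._heard_ids for rid in published)  # on-device matching

# Usage: Alice and Bob meet; later Alice tests positive and publishes her identifiers.
alice, bob = Phone(), Phone()
bob.record_contact(alice.broadcast_id())
alice.record_contact(bob.broadcast_id())
print(bob.check_exposure(alice.ids_to_publish_if_positive()))  # True
```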
While these efforts represented the intention of democratic societies to protect one of their core values (i.e., privacy), they also framed the conversation on digital contact tracing, and on digital tools for epidemic monitoring more generally, in binary terms (Vayena, 2021). The tension between individual privacy and public health has been exemplified by an increasingly polarized, privacy-focused debate. Whereas some scholars have criticized the European approach to digital contact tracing for prioritizing privacy over public health and the safeguarding of human life (O’Connell and O’Keeffe 2021), others worry that these tools will become normalized and support a state of surveillance even in a post-pandemic world (Seberger and Patil, 2021).
While the issue of privacy dominated ethical, technical, and governance debates about digital surveillance during the COVID-19 pandemic, this did not translate into widespread adoption of digital contact tracing technologies. On the contrary, uptake of these technologies was quite modest. Among the empirical studies conducted to date to assess public perceptions and motivations behind this phenomenon, one from the United Kingdom (UK) suggested that lack of trust in digital health surveillance resulted from distrust in the government, and was further exacerbated by scandals involving big data corporations (Samuel et al., 2021). The extent to which the discussion on digital epidemiology overlooked other relevant issues and ethical concerns may also account for this distrust. The next section explores what lay in the shadow of privacy, and considers how neglecting other ethical concerns may not only negatively affect public acceptance of digital technologies, but also undermine their power to manage public health threats.
3. Unaddressed issues and ethical concerns
Individual and collective harm can result from failing to harness the potential of digital epidemiology to stop epidemics. However, the use of data for epidemic surveillance and control can also pose personal and societal harms. While various stakeholders demonstrated the intention and effort to address risks to individual privacy during COVID-19, privacy protection cannot be considered a panacea for data ethics. Data ethics extends far beyond protecting data, ensuring control over one's information, and applying privacy-by-design technological choices (Blasimme and Vayena, 2020). Indeed, further ethical issues exist in relation to how (i.e., in which ways and by whom) and why (i.e., for which purposes) personal data are collected and used in digital epidemiology. If not promptly addressed, concerns about equity, accountability, trust, transparency, risk of group harm, and autonomy will persist and will disproportionately impact the most vulnerable, even when individual privacy is assured.
3.1. The digital divide
The COVID-19 pandemic shed light on two opposing but coexisting forces: on the one hand, the abundance of data that characterized the battle against the pandemic; on the other, the scarcity and poor quality of data from certain population groups (Ibrahim et al. 2021). This duality can be understood in light of the digital divide that affected both advanced and emerging economies during the pandemic. The digital divide was evident in relation to three aspects: access to technology, ability/willingness to use technology, and variety of technologies.
First, digital technologies and communication infrastructure do not reach everyone in the same way. Economic, cultural, and political barriers stand between technological potential and the opportunity to exploit it (Naudé and Vinuesa, 2021). Certainly, this problem is prevalent in low- and middle-income countries, but during the COVID-19 pandemic it also emerged in high-income economies (Eruchalu et al., 2021, Pagliari, 2020). In Europe, for instance, as only newer smartphone models supported the Google-Apple API system, those with older devices were denied access to these apps (Reader, 2020). Second, even in circumstances where access to technology is feasible, some people might not be online, either because of other limitations or by choice (Giansanti and Velcro, 2021). For instance, the older segment of the population may lack sufficient digital literacy and skills to take advantage of digital tools. Others may be unwilling to engage in such digital endeavors (e.g., due to fear of stigma and third-party data access, or due to lack of interest) (Nachega et al. 2021). Finally, the wide array of technologies rolled out during a health crisis is matched by inconsistencies in the quality of the data they collect. The reliability of self-reported or social media datasets about disease symptoms, for example, may be lower and more difficult to corroborate than that of traditional epidemiological datasets (Campos-Castillo and Laestadius, 2020). In addition, inadequate technology validation due to the pressure to address an urgent health threat only increases the likelihood of errors in datasets (Crawford and Serhal, 2020).
Although inequalities in data quantity and quality are not exclusive to the field of digital epidemiology, the magnitude of their impact in this context is severe. The digital divide, data poverty, and biased datasets can impose economic, social, and health burdens on many lives, while potentially exacerbating existing health inequalities amid a public health emergency. In the COVID-19 pandemic, missing or skewed data from vulnerable groups (e.g., elderly people, those in lower income households) resulted in undetected infections and inadequate care while the virus continued to circulate (Blom et al. 2021). Conversely, when areas were incorrectly flagged as high-risk for infection, the closure of schools and businesses affected entire communities (Mello and Wang, 2020). Because the stakes are so high, proper testing of datasets and validation of ML models, alongside technologies designed to be compatible with existing infrastructures and levels of digital literacy (Veinot, Mitchell, and Ancker, 2018), are necessary to ensure fair distribution of the benefits of digital epidemiology. In a pandemic setting, access to these technologies becomes even more crucial. In this regard, some researchers have recently recommended including digital transformation among the determinants of health, lest the most vulnerable be those most negatively affected by health digitization (Kickbusch et al. 2021).
3.2. The role of big tech
The roles and distribution of accountability between national governments and the private sector in the biomedical and health sectors require timely clarification. This urgency arises as the asymmetry of power between these two stakeholders is growing, and during the COVID-19 pandemic sovereign states appeared to be losing ground.
As commercial technology and telecommunications companies increasingly enter the spaces of digital epidemiology, health research, medical services, and healthcare infrastructure, they control an ever-increasing amount of data (Storeng and de Bengy Puyvallée, 2021). Despite dealing with a public good (i.e., safeguarding health), the private sector decides whether or not to make these data available for research and public benefit (Kostkova et al. 2021). Unlike national governments, private companies are not bound by democratic and transparent decision-making processes, and do not meet the same standard of public accountability. For example, pharmaceutical or health insurance companies acquire and control large amounts of health data but shield their strategies for using and managing these data behind non-disclosure agreements. Despite efforts to standardize and update data governance within and across countries, some uses of data may still fall outside the purview of existing oversight mechanisms, particularly when it comes to publicly available data and data generated by the private sector (Ferretti et al. 2020). Yet the question of the private sector's liability and moral standing when sick people's data are exploited for corporate profit must be considered carefully, as it affects trust in government, corporations, and ultimately in health research (Levine, 2021). In this regard, scholars have investigated the limits of informed consent in digital health research, as well as the negative impact of insufficient public engagement and unfair distribution of benefits towards data subjects (Amann et al., 2021, Paterson and McDonagh, 2018, Banks, 2020).
This aside, the heart of the matter of the "Googlisation" of health is that private companies have become indispensable players in various sectors of society, capable of providing platforms that connect the sphere of health to those of communication, marketing, education, transportation, and others (Sharon, 2022). Thanks to this methodological advantage and pervasive network, companies offer services that not even governments can resist. Hence the U.S. government's recent collaboration with numerous dating apps to promote the COVID-19 vaccination campaign among young people (Judd, 2021), which illustrates the growing one-way reliance of lay citizens and governments on the private sector to address (public) health needs.
Strengthened by this power, commercial companies advocate positions that often go beyond their technological expertise. These positions may not only influence the focus of biomedical research, but can also drive changes in society (Sharon, 2021). Google and Apple's prompt offer of digital support in the fight against COVID-19 through their privacy-preserving API system is one example. This case illustrates the power of corporations to set technical standards that can hardly be negotiated. Google and Apple prioritized privacy over the use of sensitive data (e.g., location data), and in doing so determined the balance between privacy protection and data access (Kahn, 2020). What remains to be seen is whether the private sector will assume the responsibility that accompanies such determinations, and how this will affect its power dynamics with national governments and public trust.
3.3. Data (re-)uses without public engagement
During the COVID-19 pandemic, many people were willing to share their data and health information to improve the public health situation. However, this proactive attitude was perhaps at times misunderstood by government authorities, who interpreted it as a free pass to re-use these data as they wished, as long as it was for a "common good". In this regard, we witnessed a series of incidents in which law enforcement agencies accessed health data for investigations without seeking public agreement, even though these data had been collected explicitly to fight COVID-19.
A first case was reported in early 2021, in relation to the Singaporean government's app TraceTogether, a Bluetooth-based contact tracing tool with centralized data storage. Although the app was praised for its effectiveness in monitoring the spread of the virus and enforcing COVID-19 restrictions, some criticized it harshly for failing to adequately protect citizens' privacy. After law enforcement agencies were granted access to citizens' data, the government revised the app's original privacy statement and amended legislation to justify use of these data for serious criminal investigations (Ikeda, 2021). A similar scandal emerged in Fall 2021, when Australian police accessed QR code check-in data for criminal investigations on at least six occasions, even though the data had been collected by digital epidemiology tools for outbreak monitoring (Galloway, 2021). Finally, the recent "Luca app" case caused a stir in the public debate, as the German police successfully petitioned local health authorities to release location data collected via this check-in app, which was used to trace restaurant guests and shop customers (Pannett, 2022). By misusing data originally gathered to protect against infection, the police and prosecutors violated German data protection law. A great deal of media attention followed each of these cases.
There have also been reports of individuals being tracked, privacy being breached, and minorities being stigmatized. Some authors have noted that the misuse of data by law enforcement agencies is likely to exacerbate profiling, policing, discrimination, and criminalization of vulnerable groups and minorities (Sundquist, 2021, Spektor, 2021). Facial recognition software, thermal imaging, and other digital epidemiology tools can lead to human rights infringements, besides being relatively ineffective and inaccurate at detecting communicable diseases like SARS-CoV-2 (Roussi, 2020, Hendl et al., 2020).
The limits of data access should be a matter of public engagement and deliberation (namely, seeking public opinion via different means such as referendums, polls, and co-creative opportunities), during which concerns can be voiced and benefits and risks can be understood and balanced. Lack of engagement can undermine public support and lead to a de-legitimization of public health measures, even if governments have laudable intentions (such as promoting the public good or catching criminals). If data are collected in non-transparent ways, without informing the public or obtaining permission, the public may in response feel spied upon and betrayed, diminishing trust in institutions (Zhao et al. 2021). By accessing data for purposes of which the public does not approve, authorities undermine public trust.
As an example, the Canadian government recently procured aggregate location data from a telecommunications company in order to monitor the prevalence of the pandemic in certain areas (Berendt, 2021). Despite the authorities' good intentions and the fact that such data aggregates may sufficiently protect individual privacy, this decision sparked a public backlash and questions about non-transparent public-private partnerships in digital epidemiology and health research. Similarly, in the UK, concerns were raised about the use of wastewater data to forecast COVID-19 transmission, despite the absence of individual privacy violations (Tubb, 2022). These cases exemplify once again how – regardless of the urgency and exceptional circumstances of a health emergency – compromising transparent communication and adequate involvement of the population can hinder positive outcomes (Gable et al., 2020). Indeed, lack of adequate communication, together with misinformation, may increase mistrust in healthcare authorities and consequently negatively affect the adoption of public health measures.
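For readers unfamiliar with why aggregation can protect individuals, the sketch below shows one common idea: individual location pings are reduced to per-region device counts, and small cells are suppressed so that no sparsely populated area can single anyone out. The threshold, data shapes, and function names are our own illustrative assumptions, not the method used by any specific telecom provider or government.

```python
from typing import Dict, List, Tuple

# Hypothetical illustration of privacy-by-aggregation: only coarse,
# thresholded counts leave the data holder, never device identifiers.
MIN_COUNT = 20  # assumed suppression threshold for small, re-identifiable cells

def aggregate_mobility(pings: List[Tuple[str, str]]) -> Dict[str, int]:
    """Reduce (device_id, region) pings to per-region distinct-device counts,
    suppressing regions with fewer than MIN_COUNT devices."""
    devices_per_region: Dict[str, set] = {}
    for device_id, region in pings:
        devices_per_region.setdefault(region, set()).add(device_id)
    return {
        region: len(devices)
        for region, devices in devices_per_region.items()
        if len(devices) >= MIN_COUNT
    }

if __name__ == "__main__":
    sample = [(f"dev{i}", "region_A") for i in range(25)] + [("dev99", "region_B")]
    print(aggregate_mobility(sample))  # region_B is suppressed: only 1 device
```

As the surrounding cases show, however, even technically sound aggregation does not by itself secure public acceptance if the data sharing arrangement is not transparent.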
End-users may reject digital epidemiology interventions as unacceptable if their opinions and perspectives are not included in the development of those interventions. Notably, research shows that people find more acceptable those tools that involve their active participation at each design phase, as well as those that are aligned with their preferences and expectations (e.g., about data uses) (Perski and Short 2021; Westerlund et al., 2021). These preferences are influenced by context, sociocultural norms, and individual needs, suggesting that a one-size-fits-all approach to the development and implementation of these technologies may be inappropriate. The risk would lie not only in the potential rejection of public health measures, but, more dramatically, in discrimination against those individuals or groups whose voices have not been heard and whose needs have not been addressed (Crawford and Serhal, 2020).
Although a daunting task in a health crisis setting, it is nonetheless crucial to promote open dialogue with stakeholders, co-design of technologies, careful assessment of the enabling context, and meaningful involvement of vulnerable individuals and marginalized groups. As a recent analysis suggested, these strategies may encourage public agency and data sharing for research purposes, while ensuring social acceptance and greater trust in public health technologies (Erikainen et al. 2021).
4. Conclusion: The path towards a more comprehensive ethical approach to digital epidemiology
Our experience with COVID-19 has shown that data for epidemic surveillance must be protected. Certainly, data privacy regulation and privacy-by-design help to limit the frequency of data abuse. In this regard, stakeholders seem to be increasingly aware of privacy issues, as evidenced by efforts to avoid data misuse (Sharon, 2022). Yet critical lessons must still be learned and acted upon to guarantee a more ethically aligned use of digital epidemiology.
A first lesson is that, beyond privacy, there are still unresolved issues to be critically addressed. We need to rethink what it means to use and rely upon digital epidemiology, as even guaranteed data security does not necessarily translate into fair, transparent, and correct use of data. We must redefine the ethical rationale that justifies the implementation of these technologies and the use of personal data. Such ethical appraisal and reflection must be integrated into the process of developing a technology and reiterated at various stages, from conceptualization to deployment.
A second lesson is the need to clarify the relationships and roles of the public and private sectors in public health research and services. The definition of mechanisms to hold governments, private companies, researchers, and technology developers accountable is of paramount importance to ensure the ethical use of digital health technologies. In this regard, harvesting data for the public good might not be reason enough to justify data re-use, notwithstanding individual privacy safeguards.
A third lesson is that public trust and an adequate social license for data use serve to legitimize digital surveillance interventions. Despite claims of seeking to engage with underrepresented voices and integrate their perspectives into data governance and digital technology development, such engagement has yet to materialize (Agrawal, 2021). Hence the call of the World Health Organization, among other institutions, for the adoption of community data oversight (WHO, 2021): both private and public sectors should seek meaningful social engagement when deploying digital health tools and using personal data for health research.
While these issues have been raised since the early days of digital epidemiology (Vayena et al. 2015), we have yet to address them effectively. The pandemic experience should serve as an opportunity to promote a more ethically aligned use of surveillance technology against health threats, in order to unlock its full potential.
Funding
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
CRediT authorship contribution statement
Agata Ferretti: Conceptualization, Writing – original draft, Writing – review & editing. Effy Vayena: Conceptualization, Writing – review & editing, Supervision.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References
- Agrawal, Anurag. 2021. Ungoverned digital advances undermine global healthcare gains. Financial Times, last modified 2021-10-25. https://www.ft.com/content/a3095835-2416-4235-967b-7986d1678601.
- Amann Julia, Sleigh Joanna, Vayena Effy. Digital contact-tracing during the COVID-19 pandemic: an analysis of newspaper coverage in Germany, Austria, and Switzerland. PLoS One. 2021;16(2). doi: 10.1371/journal.pone.0246524.
- Banks Marcus A. Tech giants, armed with wearables data, are entrenching in health research. Nat. Med. 2020;26(1):4–6. doi: 10.1038/s41591-019-0701-2.
- Berendt, Bettina. 2021. Ottawa's use of our location data raises big surveillance and privacy concerns. The Conversation, accessed 12.02.2022. http://theconversation.com/ottawas-use-of-our-location-data-raises-big-surveillance-and-privacy-concerns-175316.
- Blasimme Alessandro, Ferretti Agata, Vayena Effy. Digital contact tracing against COVID-19 in Europe: current features and ongoing developments. Front. Digit. Health. 2021;3:61. doi: 10.3389/fdgth.2021.660823.
- Blasimme Alessandro, Vayena Effy. What's next for COVID-19 apps? Governance and oversight. Science. 2020;370(6518):760–762. doi: 10.1126/science.abd9006.
- Blom Annelies G., Wenz Alexander, Cornesse Carina, Rettig Tobias, Fikel Marina, Friedel Sabine, Möhring Katja, Naumann Elias, Reifenscheid Maximiliane, Krieger Ulrich. Barriers to the large-scale adoption of the COVID-19 contact-tracing app in Germany: survey study. J. Med. Internet Res. 2021;23(3). doi: 10.2196/23362.
- Budd Jobie, Miller Benjamin S., Manning Erin M., Lampos Vasileios, Zhuang Mengdie, Edelstein Michael, Rees Geraint, Emery Vincent C., Stevens Molly M., Keegan Neil. Digital technologies in the public-health response to COVID-19. Nat. Med. 2020;26(8):1183–1192. doi: 10.1038/s41591-020-1011-4.
- Campos-Castillo Celeste, Laestadius Linnea I. Racial and ethnic digital divides in posting COVID-19 content on social media among US adults: secondary survey analysis. J. Med. Internet Res. 2020;22(7). doi: 10.2196/20472.
- Crawford Allison, Serhal Eva. Digital health equity and COVID-19: the innovation curve cannot reinforce the social gradient of health. J. Med. Internet Res. 2020;22(6). doi: 10.2196/19361.
- Davis, Steve and Precious Matsoso. 2020. COVID-19 as a digital pandemic. Think Global Health, accessed 12.02.2022. https://www.thinkglobalhealth.org/article/covid-19-digital-pandemic.
- EDPB. 2020a. Guidelines 03/2020 on the processing of data concerning health for the purpose of scientific research in the context of the COVID-19 outbreak. European Data Protection Board, accessed 15.02.2022. https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202003_healthdatascientificresearchcovid19_en.pdf.
- EDPB. 2020b. Statement on the processing of personal data in the context of the COVID-19 outbreak. European Data Protection Board, accessed 12.02.2022.
- eHealth Network. 2020. Interoperability guidelines for approved contact tracing mobile applications in the EU. Accessed 21.02.2022. https://ec.europa.eu/health/sites/health/files/ehealth/docs/contacttracing_mobileapps_guidelines_en.pdf.
- Erikainen S., Boydell N., Sethi N., Stewart E. 2021. Participatory public engagement in digital health and care: moving beyond conventional engagement methods. NESTA (National Endowment for Science, Technology and the Arts), London, accessed 21.02.2022. https://www.pure.ed.ac.uk/ws/portalfiles/portal/249597691/Participatory_public_engagement_in_digital_health_and_care_EdUni.pdf.
- Eruchalu Chukwuma N., Pichardo Margaret S., Bharadwaj Maheetha, Rodriguez Carmen B., Rodriguez Jorge A., Bergmark Regan W., Bates David W., Ortega Gezzer. The expanding digital divide: digital health access inequities during the COVID-19 pandemic in New York City. J. Urban Health. 2021;98(2):183–186. doi: 10.1007/s11524-020-00508-9.
- European Commission. 2020. Digital solutions during the pandemic. Accessed 25.02.2022. https://ec.europa.eu/info/live-work-travel-eu/coronavirus-response/digital-solutions-during-pandemic_en.
- Ferretti Agata, Ienca Marcello, Hurst Samia, Vayena Effy. Big data, biomedical research, and ethics review: new challenges for IRBs. Ethics Hum. Res. 2020;42(5):17–28. doi: 10.1002/eahr.500065.
- Gable Lance, Ram Natalie, Ram Jeffrey L. Legal and ethical implications of wastewater monitoring of SARS-CoV-2 for COVID-19 surveillance. J. Law Biosci. 2020;7(1):lsaa039. doi: 10.1093/jlb/lsaa039.
- Galloway, Anthony. 2021. Police using QR check-in data to solve crimes. The Sydney Morning Herald, last modified 2021-09-05, accessed 12.02.2022. https://www.smh.com.au/politics/federal/breach-of-trust-police-using-qr-check-in-data-to-solve-crimes-20210903-p58om8.html.
- Gardner Lauren, Ratcliff Jeremy, Dong Ensheng, Katz Aaron. A need for open public data standards and sharing in light of COVID-19. Lancet Infect. Dis. 2021;21(4). doi: 10.1016/S1473-3099(20)30635-6.
- Gasser Urs, Ienca Marcello, Scheibner James, Sleigh Joanna, Vayena Effy. Digital tools against COVID-19: taxonomy, ethical challenges, and navigation aid. Lancet Digit. Health. 2020;2(8):e425–e434. doi: 10.1016/S2589-7500(20)30137-0.
- Giansanti Daniele, Velcro Giulia. The digital divide in the era of COVID-19: an investigation into an important obstacle to the access to the mHealth by the citizen. Healthcare. 2021.
- Hendl Tereza, Chung Ryoa, Wild Verina. Pandemic surveillance and racialized subpopulations: mitigating vulnerabilities in COVID-19 apps. J. Bioethical Inq. 2020;17(4):829–834. doi: 10.1007/s11673-020-10034-7.
- Ibrahim Hussein, Liu Xiaoxuan, Zariffa Nevine, Morris Andrew D., Denniston Alastair K. Health data poverty: an assailable barrier to equitable digital health care. Lancet Digit. Health. 2021;3(4):e260–e265. doi: 10.1016/S2589-7500(20)30317-4.
- Ikeda, Scott. 2021. Police have access to Singapore's TraceTogether app data, in spite of earlier assurances; will trust in contact tracing apps be undermined? CPO Magazine, last modified 2021-01-11, accessed 12.02.2022. https://www.cpomagazine.com/data-privacy/police-have-access-to-singapores-tracetogether-app-data-in-spite-of-earlier-assurances-will-trust-in-contact-tracing-apps-be-undermined/.
- Johnson, Steven. 2020. How data became one of the most powerful tools to fight an epidemic. The New York Times, last modified 2020-06-11, accessed 12.02.2022. https://www.nytimes.com/interactive/2020/06/10/magazine/covid-data.html.
- Judd, Donald. 2021. White House partners with dating apps to encourage Covid vaccinations. CNN Politics, accessed 12.02.2022. https://www.cnn.com/2021/05/21/politics/covid-vaccine-status-dating-apps-white-house/index.html.
- Kahn Jeffrey P. Digital Contact Tracing for Pandemic Response: Ethics and Governance Guidance. Johns Hopkins University Press; 2020.
- Kickbusch Ilona, Piselli Dario, Agrawal Anurag, Balicer Ran, Banner Olivia, Adelhardt Michael, Capobianco Emanuele, Fabian Christopher, Gill Amandeep Singh, Lupton Deborah. The Lancet and Financial Times Commission on governing health futures 2030: growing up in a digital world. Lancet. 2021;398(10312):1727–1776. doi: 10.1016/S0140-6736(21)01824-9.
- Kostkova Patty, Saigí-Rubió Francesc, Eguia Hans, Borbolla Damian, Verschuuren Marieke, Hamilton Clayton, Azzopardi-Muscat Natasha, Novillo-Ortiz David. Data and digital solutions to support surveillance strategies in the context of the COVID-19 pandemic. Front. Digit. Health. 2021;3:89. doi: 10.3389/fdgth.2021.707902.
- Kraemer Moritz U.G., Pybus Oliver G., Fraser Christophe, Cauchemez Simon, Rambaut Andrew, Cowling Benjamin J. Monitoring key epidemiological parameters of SARS-CoV-2 transmission. Nat. Med. 2021;27(11):1854–1855. doi: 10.1038/s41591-021-01545-w.
- Levine, Alexandra. 2021. Suicide hotline shares data with for-profit spinoff, raising ethical questions. Politico, accessed 12.02.2022. https://www.politico.com/news/2022/01/28/suicide-hotline-silicon-valley-privacy-debates-00002617.
- Mahmood Sultan, Hasan Khaled, Colder Carras Michelle, Labrique Alain. Global preparedness against COVID-19: we must leverage the power of digital health. JMIR Public Health Surveill. 2020;6(2). doi: 10.2196/18980.
- Mariner Wendy K. Mission creep: public health surveillance and medical privacy. BUL Rev. 2007;87:347.
- Maxmen, Amy. 2021. Massive Google-funded COVID database will track variants and immunity. Nature.
- McLennan Stuart, Celi Leo Anthony, Buyx Alena. COVID-19: putting the General Data Protection Regulation to the test. JMIR Public Health Surveill. 2020;6(2). doi: 10.2196/19279.
- Mello Michelle M., Wang C. Jason. Ethics and governance for digital disease surveillance. Science. 2020;368(6494):951–954. doi: 10.1126/science.abb9045.
- Nachega Jean B., Atteh Rhoda, Ihekweazu Chikwe, Sam-Agudu Nadia A., Adejumo Prisca, Nsanzimana Sabin, Rwagasore Edson, Condo Jeanine, Paleker Masudah, Mahomed Hassan. Contact tracing and the COVID-19 response in Africa: best practices, key challenges, and lessons learned from Nigeria, Rwanda, South Africa, and Uganda. Am. J. Trop. Med. Hyg. 2021;104(4):1179. doi: 10.4269/ajtmh.21-0033.
- Naudé Wim, Vinuesa Ricardo. Data deprivations, data gaps and digital divides: lessons from the COVID-19 pandemic. Big Data Soc. 2021;8(2):20539517211025545.
- O'Connell James, O'Keeffe Derek T. Contact tracing for Covid-19—a digital inoculation against future pandemics. N. Engl. J. Med. 2021;385(6):484–487. doi: 10.1056/NEJMp2102256.
- Oderkirk, Jillian. 2021. Health Working Paper No. 127. Survey results: national health data infrastructure and governance. OECD, Directorate for Employment, Labour and Social Affairs, Health Division. https://www.oecd.org/sti/survey-results-national-health-data-infrastructure-and-governance-55d24b5d-en.htm.
- OECD. 2021. Enhancing access to research data during crises: lessons learned from the COVID-19 pandemic. Summary note of a GSF-RDA workshop. OECD, Directorate for Science, Technology and Innovation, Committee for Scientific and Technological Policy, accessed 12.02.2022. https://one.oecd.org/document/DSTI/STP/GSF(2021)13/FINAL/en/pdf.
- Pagliari Claudia. The ethics and value of contact tracing apps: international insights and implications for Scotland's COVID-19 response. J. Glob. Health. 2020;10(2):020103. doi: 10.7189/jogh.10.020103.
- Pannett, Rachel. 2022. German police used a tracing app to scout crime witnesses. Some fear that's fuel for covid conspiracists. The Washington Post, last modified 2022-01-13, accessed 22.02.2022. https://www.washingtonpost.com/world/2022/01/13/german-covid-contact-tracing-app-luca/.
- Paterson Moira, McDonagh Maeve. Data protection in an era of big data: the challenges posed by big personal data. Monash Univ. Law Rev. 2018;44(1):1–31.
- Perski Olga, Short Camille E. Acceptability of digital health interventions: embracing the complexity. Transl. Behav. Med. 2021;11(7):1473–1480. doi: 10.1093/tbm/ibab048.
- Reader, Ruth. 2020. Apple and Google's contact tracing apps only work on new phones. That's a problem. Fast Company, last modified 2020-08-24, accessed 12.02.2022. https://www.fastcompany.com/90542415/apple-and-googles-contact-tracing-apps-only-work-on-new-phones-thats-a-problem.
- Robinson Sandra. Cases and traces, platforms and publics: big data and health surveillance. In: Communication and Health. Springer; 2022. pp. 271–289.
- Roussi A. Resisting the rise of facial recognition. Nature. 2020;587(7834):350–353. doi: 10.1038/d41586-020-03188-2.
- Salathe Marcel, Bengtsson Linus, Bodnar Todd J., Brewer Devon D., Brownstein John S., Buckee Caroline, Campbell Ellsworth M., Cattuto Ciro, Khandelwal Shashank, Mabry Patricia L. Digital epidemiology. PLoS Comput. Biol. 2012;8(7):e1002616.
- Samuel Gabrielle, Roberts S.L., Fiske A., Lucivero F., McLennan S., Phillips A., Hayes S., Johnson S.B. COVID-19 contact tracing apps: UK public perceptions. Crit. Public Health. 2021:1–13.
- Seberger John S., Patil Sameer. Post-COVID public health surveillance and privacy expectations in the United States: scenario-based interview study. JMIR mHealth uHealth. 2021;9(10). doi: 10.2196/30871.
- Sharon Tamar. Blind-sided by privacy? Digital contact tracing, the Apple/Google API and big tech's newfound role as global health policy makers. Ethics Inf. Technol. 2021;23(1):45–57. doi: 10.1007/s10676-020-09547-x.
- Sharon, Tamar. 2022. Beyond privacy: there are wider issues at stake over Big Tech in medicine. openDemocracy. https://www.opendemocracy.net/en/technology-and-democracy/beyond-privacy-there-are-wider-issues-at-stake-over-big-tech-in-medicine/.
- Spektor, Michelle. 2021. History offers a cautionary tale for biometric covid tracking systems. The Washington Post, last modified 2022-02-03, accessed 12.02.2022. https://www.washingtonpost.com/outlook/2022/02/03/history-offers-cautionary-tale-biometric-covid-tracking-systems/.
- Storeng Katerini Tagmatarchi, de Bengy Puyvallée Antoine. The smartphone pandemic: how big tech and public health authorities partner in the digital response to Covid-19. Glob. Public Health. 2021;16(8–9):1482–1498. doi: 10.1080/17441692.2021.1882530.
- Sundquist Christian Powell. Pandemic surveillance discrimination. Seton Hall Law Rev. 2021;51(5):4.
- Sust Pol Pérez, Solans Oscar, Fajardo Joan Carles, Peralta Manuel Medina, Rodenas Pepi, Gabaldà Jordi, Eroles Luis Garcia, Comella Adrià, Muñoz César Velasco, Ribes Josuè Sallent. Turning the crisis into an opportunity: digital health strategies deployed during the COVID-19 outbreak. JMIR Public Health Surveill. 2020;6(2). doi: 10.2196/19106.
- Thomason Jane. Big tech, big data and the new world of digital health. Glob. Health J. 2021;5(4):165–168.
- Tubb, Gerard. 2022. COVID-19: technology developed to track spread of coronavirus could be abused, privacy campaigner warns. Sky News, accessed 22.02.2022. https://news.sky.com/story/covid-19-technology-developed-to-track-spread-of-coronavirus-could-be-abused-privacy-campaigner-warns-12516018.
- Vayena Effy. Value from health data: European opportunity to catalyse progress in digital health. Lancet. 2021;397(10275):652–653. doi: 10.1016/S0140-6736(21)00203-8.
- Vayena Effy, Salathé Marcel, Madoff Lawrence C., Brownstein John S. Ethical challenges of big data in public health. PLoS Comput. Biol. 2015;11(2):e1003904.
- Veinot T.C., Mitchell H., Ancker J.S. Good intentions are not enough: how informatics interventions can worsen inequality. J. Am. Med. Inform. Assoc. 2018;25(8):1080–1088. doi: 10.1093/jamia/ocy052.
- Westerlund Mika, Isabelle Diane A., Leminen Seppo. The acceptance of digital surveillance in an age of big data. Technol. Innov. Manag. Rev. 2021;11(3).
- Whitelaw Sera, Mamas Mamas A., Topol Eric, Van Spall Harriette G.C. Applications of digital technology in COVID-19 pandemic planning and response. Lancet Digit. Health. 2020;2(8):e435–e440. doi: 10.1016/S2589-7500(20)30142-4.
- WHO. 2021. Ethics and governance of artificial intelligence for health: WHO guidance. World Health Organization, accessed 21.02.2022. https://apps.who.int/iris/bitstream/handle/10665/341996/9789240029200-eng.pdf.
- Zhao Ivy Y., Ma Ye Xuan, Yu Man Wai Cecilia, Liu Jia, Dong Wei Nan, Pang Qin, Lu Xiao Qin, Molassiotis Alex, Holroyd Eleanor, Wong Chi Wai William. Ethics, integrity, and retributions of digital detection surveillance systems for infectious diseases: systematic literature review. J. Med. Internet Res. 2021;23(10). doi: 10.2196/32328.