Author manuscript; available in PMC: 2023 Jul 1.
Published in final edited form as: Am J Bioeth. 2022 Jul;22(7):68–70. doi: 10.1080/15265161.2022.2075981

Regulatory angels and technology demons? Making sense of evolving realities in health data privacy for the digital age

Vasiliki Rahimzadeh 1,*
PMCID: PMC9748849  NIHMSID: NIHMS1848383  PMID: 35737504

How do we respect the legitimate privacy interests of the individuals and communities to whom data relate while maximizing data’s utility as a fundamental resource of the 21st century? Pyrrho, Cambraia, and de Vasconcelos engage with this question, no doubt among the most enduring for the information economy in which we live and work. In their target article, “Privacy and Health Practices in the Digital Age,” the authors explore philosophical accounts of what privacy is—a human right? a legal obligation? a social promise? some combination?—and what it is not: a statistic, a value-neutral concept, or a technical fix to a socially constructed problem.

I share the authors’ general frustration that health data privacy is seldom discussed as anything other than sacrificial, as if protecting health information that is privately ours will somehow make us publicly sicker. Access to one’s own information is critical, in the authors’ view, to successfully rebelling against opaque company practices that “produce and use aggregated data” about us, largely without our participation or consent. Pyrrho and colleagues argue that regulating the most prolific violators—big tech—is the best option we have for wresting back some control over our private digital lives.

I comment on four of the authors’ positions in their target article. First, the authors’ tour of legal theories omits Helen Nissenbaum’s important contributions to constructing privacy as contextual integrity. I supplement the authors’ background by drawing on some of Nissenbaum’s work to better understand the social implications (intended and unintended) of distinguishing between the public and the private. Nissenbaum’s ideas, along with other feminist scholarship on privacy theory, have shaped questions about whom privacy is principally for, and why some theorists believe only a privileged few can enjoy it.

Next, I explain why relaxing privacy and confidentiality rules during a public health crisis does not signal the State’s devaluation of privacy. Nor do episodic infringements on privacy in emergencies necessarily become gateways to surveillance capitalism. Finally, I discuss why novel proposals to commodify data (sharing) have the potential to introduce new incentive structures that enhance data agency and reinforce stakeholder choices in ways regulatory protections rarely have.

As actors in an increasingly data-driven world, we iteratively appeal to dynamic choice architectures every time we decide what, where, and how information may be broadcast about us. This idea underpins Helen Nissenbaum’s theory of privacy as contextual integrity (Nissenbaum 2004) and fosters the relationship-building that Pyrrho and colleagues claim is part of “what privacy is for.” Nissenbaum suggests that people participate in activities across different spheres and contexts, each governed by distinct norms and practices. Privacy, according to Nissenbaum, is therefore best understood as dynamic flows of information between contexts. One helpful privacy heuristic is whether emerging information technologies “violate any context-relative informational norms” (Nissenbaum 2004). Describing privacy as a set of dynamic choices, as opposed to delimited frontiers between the “individual and the collective sphere” as Pyrrho and colleagues suggest, is more broadly inclusive of community value systems that deny such rigid frontiers exist at all. It can also help explain how some frontiers work against marginalized groups and can in fact veil gender-based harm (Nash 2004).

Second, the authors cite an array of cases that implicate privacy—from social media use and artificial intelligence to genomics and election fraud. The diversity of these cases highlights the challenges some governments face in imposing adequate protections for data that sit at the intersection of two or more regulatory categories (e.g., health data and consumer data). In this vein, the authors propose that the public health surveillance meant to guide the Covid-19 pandemic response demonstrated, in part, how State governments devalue privacy and inadvertently encourage its violation. However, Pyrrho and colleagues level an unfair critique in this regard; their example conflates episodic infringements on privacy during emergencies with unfettered surveillance à la panopticon. Just as privacy protection supports strong “community formation,” infringing on privacy in exceptional circumstances may be necessary to keep those same communities safe. To generate the most public health benefit during a pandemic, health data need to be protected, not private. It is for this reason that privacy and confidentiality rules are relaxed during public health emergencies under the Health Insurance Portability and Accountability Act Privacy Rule, and that the processing of certain categories of personal data to protect against serious cross-border threats to health is explicitly mentioned under Recital 46 of the European General Data Protection Regulation, to name but two regulatory examples.

Another nuance the authors overlook in their arguments is that sharing data generates “common goods” only when those goods are actually distributed to the commons. Tech companies are not the only entities that fall short of such distributive justice targets. Studies in my field of genomics demonstrate, for example, that underrepresentation of diverse genomes is one of many ways precision medicine initiatives yield biased insights and skew bench-to-bedside translation (Ferryman and Pitcan 2018). Thus the authors’ appeal to accept frequent, though perhaps minor, infringements on privacy to support health research is a far harder sell for those still waiting for basic benefits to trickle down (e.g., access to quality healthcare) (Tsosie et al. 2021). Recent efforts to better align benefit-sharing models with research participant values (Bedeker et al. 2022) and to improve data representation (Wang et al. 2022) are promising, but have yet to measurably improve health outcomes for many patient populations.

Pyrrho et al. also criticize tech companies for converting “data into economic value” because they have generated enormous profits seemingly at the expense of our privacy. But what if data contributors, not companies, were the beneficiaries of data sharing moving forward? This reality is no longer hypothetical. Nonfungible token (NFT) marketplaces enable patients to share their health data for research or other purposes in return for crypto rewards (Kostick-Quenet et al. 2022). Blockchain and other distributed ledger technologies are unique in that they make data more secure by making it more accessible, turning the conventional understanding of data security and privacy assumed in the target article on its head. Health data sharing marketplaces that commodify both the sharing process and the data itself need not further disenfranchise communities, as the authors imply. Rather, NFT-based sharing could empower people through value-concordant data contribution and benefit sharing in real time.

Lastly, the authors present privacy as a parallel movement responding to “technological devices that threaten it.” In doing so, they give inadequate credit to the literal quantum leaps in computing that have accelerated innovation in privacy-enhancing technologies (see Aaronson (2013) for a highly entertaining history of the mathematical discoveries and privacy-relevant advances in quantum computing, and Perrier (2021) for the ethical implications of quantum computing’s modern applications in everyday life). Many data repositories entrust human oversight bodies with balancing privacy against other autonomy-related values, for example, but increasingly rely on machines to computationally execute tasks that act on stakeholder values accurately, efficiently, and at scale (Scheibner et al. 2021).

In presenting their lively account of the myriad people, purposes, and policies served by privacy protections in the health research context and beyond, Pyrrho and colleagues convince readers of at least one truth: privacy isn’t one-dimensional. The issues crystallized in this commentary compel new questions to add to the authors’ growing list published in their target article. These additional questions include, but are not limited to: whom is privacy for, what metrics can benchmark privacy loss or gain, and by what means can we protect privacy in line with diverse stakeholder values? To be sure, dialogue on these questions should be anything but private.

Funding

This work was supported by the NIH Division of Loan Repayment; EMBL-Stanford Life Science Alliance; and the Stanford Training Program in ELSI Research Grant T32HG008953.

References

  1. Aaronson S. 2013. Quantum Computing Since Democritus. Cambridge: Cambridge University Press.
  2. Bedeker A, Nichols M, Allie T, Tamuhla T, van Heusden P, Olorunsogbon O, and Tiffin N. 2022. A framework for the promotion of ethical benefit sharing in health research. BMJ Global Health 7(2): e008096. doi: 10.1136/bmjgh-2021-008096.
  3. Ferryman K, and Pitcan M. 2018. Fairness in Precision Medicine. Data & Society.
  4. Kostick-Quenet K, Mandl KD, Minssen T, Cohen IG, Gasser U, Kohane I, and McGuire AL. 2022. How NFTs could transform health information exchange. Science 375(6580): 500–502. doi: 10.1126/science.abm2004.
  5. Nash JC. 2004. From Lavender to Purple: Privacy, Black Women, and Feminist Legal Theory. Cardozo Women’s Law Journal 11: 303.
  6. Nissenbaum H. 2004. Privacy as contextual integrity. Washington Law Review 79: 41.
  7. Perrier E. 2021. Ethical Quantum Computing: A Roadmap. arXiv:2102.00759 [quant-ph].
  8. Scheibner J, Raisaro JL, Troncoso-Pastoriza JR, Ienca M, Fellay J, Vayena E, and Hubaux J-P. 2021. Revolutionizing Medical Data Sharing Using Advanced Privacy-Enhancing Technologies: Technical, Legal, and Ethical Synthesis. Journal of Medical Internet Research 23(2): e25120. doi: 10.2196/25120.
  9. Tsosie KS, Fox K, and Yracheta JM. 2021. Genomics data: the broken promise is to Indigenous people. Nature 591.
  10. Wang T, Antonacci-Fulton L, Howe K, Lawson HA, Lucas JK, Phillippy AM, Popejoy AB, Asri M, Carson C, Chaisson MJP, Chang X, Cook-Deegan R, Felsenfeld AL, Fulton RS, Garrison EP, Garrison NA, Graves-Lindsay TA, Ji H, Kenny EE, Koenig BA, Li D, Marschall T, McMichael JF, Novak AM, Purushotham D, Schneider VA, Schultz BI, Smith MW, Sofia HJ, Weissman T, Flicek P, Li H, Miga KH, Paten B, Jarvis ED, Hall IM, Eichler EE, and Haussler D. 2022. The Human Pangenome Project: a global resource to map genomic diversity. Nature 604(7906): 437–446. doi: 10.1038/s41586-022-04601-8.