The potential psychosocial effects of neurotechnology, particularly deep brain stimulation (DBS), have been at the center of a contentious debate within neuroethics. Many scholars have argued that the field of neurotechnology needs to work hard to address such effects, while others have challenged the largely negative characterization of those effects or framed their importance as exaggerated, suggesting a kind of neuroethics bubble (Gilbert et al. 2018). Zuk et al. (2023) aim to help clarify this debate by adding the voices of an important stakeholder group, namely researchers running next-generation or adaptive DBS (aDBS) studies.
Zuk et al.’s study draws on 23 interviews, in which “the majority of researchers reported being aware of personality, mood, or behavioral (PMB) changes in recipients of DBS/aDBS” (6). Indeed, they note that “nearly all” of the interviewees reported having observed changes in mood or behavior (6). “Personality,” by contrast, was viewed as a vague and difficult-to-measure term, and was typically not part of any standard assessment. While some of these researchers were fairly sanguine about the mood and behavior changes they observed – especially where, e.g., a change in mood is part of the therapeutic aim of the experimental intervention – they also recognized common and potentially more negative side effects (e.g., increased impulsivity) and understood them as matters for concern.
The kinds of negatively valenced side effects they reported included lower risk aversion, mania or hypomania, “feelings of alienation from or lack of identification with mental state” (7), depression, increased anxiety, being bothered by the presence of the device, and even psychosis (8). They also share one extreme case, in which the researcher notes:
I mean certainly, certainly we’ve had cases where someone’s personality has been changed by DBS, or their behavior’s been changed by DBS […] [B]asically, a patient reprogrammed by a fellow who leaves the room, and he just becomes a completely different person […] becomes very hypomanic, and disinhibited, and giggles [like] a little child in a candy store. At some point he jumps on, or tries to grab one [of] the female neuroscientists. Then hides behind the door giggling about it, and that is not his personality. After the device is reprogrammed, we see him coming back to himself, and he becomes this kind of quiet and reserved person, and says, “Oh, I’m feeling much better. That was a really strange feeling.” (8, emphasis in original)
How should such a story impact the way scholars focused on the ethics of neural devices do their work? It might seem reasonable to assert, as Zuk et al. (2023) at least seem to flirt with claiming, that this is an outlier, and given how many researchers also attributed a positive valence to many of the PMB changes they noticed in their participants, perhaps this case deserves little more than raised eyebrows and relief that it is an uncommon experience. As the authors summarize:
On the one hand, nearly all researchers said that DBS and aDBS can cause changes in mood and behavior, and a majority said that it can cause changes in personality. Thus, those in the neuroethics literature who raise concerns about such changes seem to be targeting genuine phenomena. On the other hand, researchers varied widely in their estimates of how often such changes occur and whether the effects of changes are considered positive or negative. (Zuk et al. 2023, 10, emphasis added)
Their “on the other hand” serves to deflate the power of the first claim, and perhaps to assuage the reader’s unease about the PMB changes described, because it implies that if many or most of the changes are positive, the devices generally work well. To be fair, the authors call for urgency in undertaking careful work on how PMB changes are conceptualized and framed (recognizing the theory/observation relationship from philosophy of science, 11), and they recommend close collaboration between scholars from the humanities and social sciences and neuroscientists and engineers to ensure sufficient attention to how such experiences are characterized and measured (11).
Our central aim here is to rethink that “on the other hand” – i.e., to articulate why attending to troubling experiences of neural device users is important for ethicists (and the field of neurotechnology more generally), even if those cases are outliers, or at least not clearly representative of the majority of users. Calling attention to negative psychosocial impacts of neurotechnology, even if they are not widely shared, can have positive effects on how the technology is designed, implemented, distributed and supported. The possibility of significantly negative experience, for instance, should call attention to the new vulnerabilities experienced by people who enroll in neurotechnology studies, or even those who take on therapeutically approved devices in hopes of living better with their conditions. In having a neural device implanted, they are increasing their vulnerability, or at least trading for different vulnerabilities (Goering and Klein 2020). Recognizing this shift in vulnerability could help to motivate a restructuring of the support environment available to device users (in the clinic, or within a study, but also after a study ends), to ensure that such vulnerabilities are acknowledged and well managed.
Focusing on problematic outlier cases can also be a way to highlight the voices of people who are marginalized in a variety of ways, including people already oppressed through systems of racism, sexism, ableism and more. An outlier experience with neural tech might be a one-off oddity, or it might be a kind of canary in the coal mine – a sign that the values and assumptions informing the design of these technologies are not capacious enough to recognize the wide diversity of people who might use them (including, e.g., neurodiverse people, people with multiple diagnostic labels, people who value resistance more than assimilation) (for related arguments, see Benjamin 2019; Costanza-Chock 2020; Goering 2017).
We can imagine some skeptical readers objecting that the kinds of negative side effects we are calling attention to are not particularly different from those that can occur through treatment with SSRIs or dopaminergic drugs like Sinemet – they, too, can cause impulsivity, hallucinations and more in some users. We agree, but note that 1) it may also be important to pay more attention to these drug-related effects, to better understand how and why they affect people so differently; and 2) we often fail to give appropriate attention to the seriousness of psychiatric side effects (whether from pharmaceuticals or neurotechnologies) and to how they affect people’s sense of themselves. Imagine being the person who tried to grab the researcher. They might be so embarrassed that they won’t want to interact with the research team again, or to continue in the study; at the very least, maintaining small talk after a return to their typical PMB state would likely be strained. These side effects are not like feeling dizzy or getting a headache. They might shake a person’s confidence and self-trust in significant ways, even if over time users can figure out how to adjust.
In sum, the outlier – a data point to be dismissed in large-scale quantitative studies designed for statistical significance, but a crucial call for further investigation and attention in small, qualitative studies – can speak volumes regarding what the collective research enterprise is failing to notice. Anyone who reads the “extreme” case cited in Zuk et al. (2023) should take note of how quickly and forcefully a neural device can alter someone’s thinking and behavior. We recognize and support the continued development of neural devices, knowing that they can be lifesaving and can provide incredible opportunities. Nonetheless, this is an area of intense possibility, and it warrants significant caution and care.
Contributor Information
Sara Goering, University of Washington, Seattle, Washington, United States.
Eran Klein, Oregon Health and Science University, Portland, Oregon, United States.
References
- Benjamin, R. 2019. Race After Technology. Cambridge, MA: Polity Press.
- Costanza-Chock, S. 2020. Design Justice: Community-Led Practices to Build the Worlds We Need. Cambridge, MA: MIT Press.
- Gilbert, F., J. N. M. Viaña, and C. Ineichen. 2018. Deflating the “DBS causes personality changes” bubble. Neuroethics. doi:10.1007/s12152-018-9373-8.
- Goering, S. 2017. Thinking Differently: Neurodiversity and Neural Engineering. In The Routledge Handbook of Neuroethics, eds. L. S. M. Johnson and K. Rommelfanger, 37–50. Routledge.
- Goering, S., and E. Klein. 2020. Trading Vulnerabilities: Living with Parkinson’s Disease before and after Deep Brain Stimulation. Cambridge Quarterly of Healthcare Ethics 30(4): 1–8.
- Zuk, P., C. E. Sanchez, K. Kostick-Quenet, K. A. Muñoz, L. Kalwani, R. Lavingia, L. Torgerson, D. Sierra-Mercado, J. O. Robinson, S. Pereira, S. Outram, B. A. Koenig, A. L. McGuire, and G. Lázaro-Muñoz. 2023. Researcher Views on Changes in Personality, Mood, and Behavior in Next-Generation Deep Brain Stimulation. AJOB Neuroscience.
