Abstract
Data from networked sensors, such as those in our phones, are increasingly being explored and used to identify behaviors related to health and mental health. While computer scientists have referred to this field as context sensing, personal sensing, or mobile sensing, medicine has more recently adopted the term digital phenotyping. This paper discusses the implications of these labels in light of privacy concerns, arguing for language that is transparent and meaningful to the people whose data we are acquiring.
Subject terms: Ethics, Medical research
Common networked devices like the smartphone (e.g., GPS, keyboard touches, phone use, and communication patterns) and wearables can provide a continuous stream of data about an individual’s behaviors, psychological states, and environments, forming a picture of their lived experience1. This sensing technology can, with varying degrees of accuracy, estimate sleep patterns, activity, and social engagement, as well as mental health conditions2. The application of sensing technology has enormous potential to improve our understanding of the experience of individuals and our capacity to deliver behavioral health treatments. Behavioral markers inferred from sensed data are beginning to be integrated into apps, making them simpler and more engaging to use3,4. Such sensing apps can be integrated into standard psychological or behavioral treatments5, or delivered as stand-alone or coached interventions6. Passive tracking of populations of at-risk people could facilitate early identification and intervention for behavioral problems. These potential clinical innovations have led to a rapidly growing field of research and are beginning to be developed commercially, thereby supporting their dissemination.
As with any emerging field, many different terms have been used to describe this application of sensing technology. The exploration of the use of phone sensors to estimate behaviors, psychological states, and environmental contexts began more than 15 years ago in computer science, where it has been referred to variously as context sensing, reality mining, mobile sensing, behavioral sensing, and personal sensing2. As medicine entered the field, the name “digital phenotyping” was proposed in 20157, and has rapidly gained currency, becoming the most commonly used term in publications listed in PubMed. The term digital phenotyping has been adopted by funders, including the US National Institutes of Health and the Wellcome Trust. From the research world, the term is spreading into publications for the healthcare industry, as well as into general media such as the New York Times8, and is now used by companies that are commercializing these technologies. As sensing technology for health and mental health is disseminated through commercialization and general media, it is incumbent upon us to consider the implications of the labels we use to describe it.
The language in a name provides information to an audience, thereby framing how that audience understands the product or service. The term digital phenotyping speaks to a medical audience, whose oldest texts, written in Greek, provide terms still used today, such as dyspnea (bad breathing) and melancholia (originally black bile). The term digital phenotyping (to show a type) provides a good description for a medical audience of the aims and processes involved in using digital traces to identify characteristics of an individual. It helps contextualize the field of sensing within medicine, which provides legitimacy, and suggests how to integrate sensing into genetics, diagnosis, and prognosis9.
But what might the term digital phenotyping signal to those whose data are being used? That such sensing is medical and scientific, perhaps? That it is complex? It does not convey to the average person that we are engaging in a sensitive form of surveillance: collecting large amounts of data, and using those data to understand deeply personal things, such as how they sleep, where they go, how and when they communicate with others, or whether they may be experiencing a mental health condition.
Yet, these are the people to whom we most need to explain the risks of participation, and why they should trust us with their data. These data are incredibly revealing, and we are asking research participants and commercial users to be vulnerable to our decision-making. For example, in a study of mobile phone location data from 1.5 million Europeans collected over 15 months, just four spatio-temporal points were enough to uniquely identify 95% of individuals10. As we detect behaviors and mental health conditions using sensed data, which often include GPS, the behaviors and conditions we detect can be linked directly to individuals, even without traditional personal identifiers. This capability is emerging in an environment where some companies in the digital health industry have demonstrated a remarkable lack of regard for privacy. A recent study, which intercepted the network traffic generated in the use of the top 30 mental health and smoking cessation apps, found that more than 80% of the apps shared data for advertising and marketing purposes, but only 28% disclosed this in a privacy policy11. Thus, the field of sensing creates significant vulnerabilities in a context that has tended to exploit rather than protect the people we aim to help.
To earn participant trust, the labels we use should increase, not decrease, the transparency of our intent (what we are doing and why) and practice (how we are getting the data and their nature)12. Among the terms used in computer science, mobile sensing, behavioral sensing, and personal sensing come closest to providing this information. “Sensing” conveys automated, background data collection. “Mobile” suggests the device (as in mobile phone), and that the data gathered are not restricted to a single place. “Behavioral” identifies the target of sensing. “Personal” conveys the intimate nature of the behaviors and states we are attempting to detect.
Terminology in modern medicine has trended toward transparency13. Standard English has often been used in naming more recent innovations. “Bypass surgery” is descriptive and understandable (even if the modifier “coronary” is less so). “Scanning” gives people a general sense of the aims and processes of imaging technologies. The use of a descriptor such as “personal sensing” for emerging data-gathering technologies would be well aligned with this growing use of descriptive English terms in medicine.
Sensing technology targeting health and mental health is making its way into our lives and healthcare systems. It has enormous potential to enhance behavioral healthcare, but it is not without risks. Language that is transparent, to the people whose data we are using, about the intent and practice behind this technology is both ethically responsible and more likely to engender trust over time. For this emerging technology, our colleagues in computer science had it right when they selected terms using standard English descriptors that give average people an understanding of the intent and practice. We urge the field to use terms that are easily understood by the people whose data we are using. Our preference has been for the term “personal sensing.” It conveys the intent and practice, as well as the personal nature of the behaviors and states we are attempting to detect. By using standard English, we demonstrate respect for the people we are trying to serve and support a transparency of practice that has not been uniformly provided in digital health and mental health.
Acknowledgements
This work was supported by a grant from the National Institute of Mental Health (R01 MH111610).
Author contributions
All authors contributed to the writing of this Commentary and approved the completed version.
Competing interests
Dr. Mohr has accepted honoraria and consulting fees from Apple Inc. and Otsuka America Pharmaceutical Inc., and has an ownership interest in Adaptive Health, Inc. Dr. Hotopf is the principal investigator of the RADAR-CNS consortium, a precompetitive public–private partnership jointly funded by the Innovative Medicines Initiative and the European Federation of Pharmaceutical Industries and Associations. As such, he receives research funding and in-kind contributions from five pharmaceutical companies: Janssen, Biogen, UCB, Lundbeck, and Merck Sharp & Dohme. Dr. Shilton has no competing interests.
Footnotes
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Torous J, Kiang MV, Lorme J, Onnela JP. New tools for new research in psychiatry: a scalable and customizable platform to empower data driven smartphone research. JMIR Ment. Health. 2016;3:e16. doi: 10.2196/mental.5165.
- 2. Mohr DC, Zhang M, Schueller SM. Personal sensing: understanding mental health using ubiquitous sensors and machine learning. Annu. Rev. Clin. Psychol. 2017;13:23–47. doi: 10.1146/annurev-clinpsy-032816-044949.
- 3. Rabbi M, Aung MH, Zhang M, Choudhury T. MyBehavior: automatic personalized health feedback from user behaviors and preferences using smartphones. In UbiComp ’15, 707–718 (Association for Computing Machinery (ACM), Osaka, 2015).
- 4. Klasnja P, et al. Efficacy of contextually tailored suggestions for physical activity: a micro-randomized optimization trial of HeartSteps. Ann. Behav. Med. 2019;53:573–582. doi: 10.1093/abm/kay067.
- 5. Lindhiem O, Bennett CB, Rosen D, Silk J. Mobile technology boosts the effectiveness of psychotherapy and behavioral interventions: a meta-analysis. Behav. Modif. 2015;39:785–804. doi: 10.1177/0145445515595198.
- 6. Linardon J, Cuijpers P, Carlbring P, Messer M, Fuller-Tyszkiewicz M. The efficacy of app-supported smartphone interventions for mental health problems: a meta-analysis of randomized controlled trials. World Psychiatry. 2019;18:325–336. doi: 10.1002/wps.20673.
- 7. Jain SH, Powers BW, Hawkins JB, Brownstein JS. The digital phenotype. Nat. Biotechnol. 2015;33:462–463. doi: 10.1038/nbt.3223.
- 8. Singer N. How companies scour our digital lives for clues to our health. New York Times (The New York Times Company, New York, NY, 2018). https://www.nytimes.com/2018/02/25/technology/smartphones-mental-health.html.
- 9. Freimer NB, Mohr DC. Integrating behavioural health tracking in human genetics research. Nat. Rev. Genet. 2019;20:129–130. doi: 10.1038/s41576-018-0078-y.
- 10. de Montjoye Y-A, Hidalgo CA, Verleysen M, Blondel VD. Unique in the crowd: the privacy bounds of human mobility. Sci. Rep. 2013;3:1376. doi: 10.1038/srep01376.
- 11. Huckvale K, Torous J, Larsen ME. Assessment of the data sharing and privacy practices of smartphone apps for depression and smoking cessation. JAMA Netw. Open. 2019;2:e192542. doi: 10.1001/jamanetworkopen.2019.2542.
- 12. Vitak J, Shilton K, Ashktorab Z. Beyond the Belmont principles: ethical challenges, practices, and beliefs in the online data research community. In 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 941–953 (Association for Computing Machinery, San Francisco, CA, 2016).
- 13. Wulff HR. The language of medicine. J. R. Soc. Med. 2004;97:187–188. doi: 10.1177/014107680409700412.