Author manuscript; available in PMC 2022 Jul 8. Published in final edited form as: IEEE Pervasive Comput., vol. 21, no. 2, pp. 41–50, Mar. 2022, doi: 10.1109/MPRV.2022.3141986.

Peer Support Specialists and Service Users’ Perspectives on Privacy, Confidentiality, and Security of Digital Mental Health

Maria D Venegas 1, Jessica M Brooks 2, Amanda L Myers 3, Marianne Storm 4, Karen L Fortuna 5
PMCID: PMC9267391  NIHMSID: NIHMS1782225  PMID: 35814864

Abstract

As the digitalization of mental health systems progresses, the ethical and social debate on the use of these mental health technologies has seldom been explored among end users. This article explores how service users (e.g., patients and users of mental health services) and peer support specialists understand and perceive issues of privacy, confidentiality, and security in digital mental health interventions. Semi-structured qualitative interviews were conducted with service users (n = 17) and peer support specialists (n = 15) from a convenience sample at an urban community mental health center in the United States. We identified technology ownership and use and a lack of technology literacy, including a limited understanding of privacy, confidentiality, and security, as the main barriers to engagement among service users. Peers demonstrated a high level of technology engagement, literacy with digital mental health tools, and a more comprehensive awareness of digital mental health ethics. We recommend peer support specialists as a potential resource to facilitate the ethical engagement of service users with digital mental health interventions. Finally, engaging potential end users in the development cycle of digital mental health support platforms, together with increased privacy regulations, may lead the field to a better understanding of effective uses of technology for people with mental health conditions. This study contributes to the ongoing debate on digital mental health ethics and data justice by providing first-hand accounts of digital ethics from end users’ perspectives.


The COVID-19 pandemic has intensified and enabled a broader acceptance and uptake of mHealth services, helping to meet the needs of an unprecedented number of individuals with new or worsening mental health challenges. mHealth is a subset of the broader field of “digital health” and is defined as the use of mobile and wireless devices (smartphones, tablets, and computers) to deliver services and research.1 Video consultations and the use of smartphones for improving mental health support are now increasingly advocated as alternatives to in-person consultations.2 Digital mental health promises to bring psychological support to areas that are difficult to reach, or to people who lack access to these services. However, although mHealth modalities have the potential to serve and engage a wider range of people, there are still many uncertainties regarding their effectiveness, accessibility, and safety in serving and protecting vulnerable populations.3–5

Recent scholarship has raised ethical concerns regarding the broader landscape of digital mental health technologies, identifying safety, transparency, and privacy as key challenges.3,6,7 Furthermore, research suggests that individuals are still more willing to share health data, and data about beliefs and values, than financial data.8 Thus, the data collected by mHealth applications may raise issues of safety for individuals in vulnerable situations. For example, individuals using mHealth technology to monitor their mental illness symptoms might record details about illicit drug use or other sensitive and private information that can be maliciously accessed or compromised, leading to shame, stigma, and even legal consequences.9

The ability to protect an individual’s data, confidentiality, and security is critical to the uptake of mHealth.10,11 Security and privacy oftentimes overlap when it comes to patient confidentiality, but they are distinct: privacy is the individual’s right to maintain control over, and be free from intrusion into, their private data and communications.7 Security relates to the protection against unauthorized access to data,4 and confidentiality refers to the individual’s autonomous choice to make an informed decision and control which data remain protected.5 Privacy policies should be easily understandable and should not prevent users from making informed decisions.7 Informed consent requires that patients have a clear understanding of the risks and benefits, available alternatives, and relevant facts pertaining to a digital service.6 Yet, users of mHealth, in particular older and marginalized populations, oftentimes lack the technical skills to understand privacy policies or to control privacy settings. Most privacy policies are written at a reading level equivalent to two years of college, while most of the U.S. adult population has completed less than one year of college.12
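As a rough illustration of the readability point above, the short Python sketch below estimates the Flesch-Kincaid grade level of a privacy policy; a score near 14 corresponds to roughly two years of college. This is a minimal sketch for illustration only: the file name is a placeholder, and the syllable counter is a crude heuristic rather than anything used in the study or in reference 12.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels (minimum 1)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

# Hypothetical usage: "privacy_policy.txt" is a placeholder file name.
with open("privacy_policy.txt") as f:
    print(f"Estimated grade level: {fk_grade(f.read()):.1f}")  # ~14 = two years of college
```

Running such a check while drafting a policy is one simple way to test whether the text is written at a level its intended audience can actually read.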

In terms of accessibility, mHealth interventions may not reach those who are most in need of care, either because they have limited literacy or because apps require mobile phones with a fast internet connection and some ability to interact with these phones, thereby excluding low-income groups, individuals with physical disabilities, and older people with fewer technology skills.13 Moreover, most technologies for people with serious mental illness (SMI) are still beleaguered by the lowest levels of service user engagement.14 For example, people with a diagnosis of schizophrenia, bipolar disorder, or major depressive disorder commonly disengage from digital mental health interventions designed for symptom management and recovery before the intervention achieves any outcomes.15,16 Because digital mental health hinges on the disposition of patients and the public to use tools such as apps to monitor or manage their mental health, users’ perspectives on their engagement with digital mental health and their understandings of privacy, confidentiality, and security require careful consideration.

This article explores how service users (e.g., patients and users of mental health services) and peer support specialists understand and perceive issues of privacy, confidentiality, and security in digital mental health interventions. Peer support specialists are people with lived experience of mental health and/or substance use challenges who are employed and accredited by their respective states to offer mental health support services. We identify barriers and facilitators to engagement with mHealth among service users and peer support specialists. We then offer specific facilitators to overcome those barriers and suggestions on how to engage potential users with mHealth in a more pragmatic and safe way. The overall goal of this article is to increase understanding of ethical issues from the perspective of potential users of mHealth technologies for the management of mental illness (e.g., older, underserved, and disenfranchised people with SMI).

METHODS

Study Design and Participants

Thirty-two semi-structured interviews were conducted in person with service user participants (N = 17) and peer support specialists (N = 15). Service user participants and peer support specialists were recruited from one community mental health center on the Northeast coast of the United States. Agency staff reviewed case files of potential service users who met study participant criteria and discussed the study with potential participants. If interested, an in-person meeting with research staff was scheduled on-site at the community mental health center. Peer support specialists within the same agency were approached by agency staff, who discussed the study to gauge interest. If interested, peer support specialists were scheduled for a one-time screening, informed consent, and individual interviews with research staff. Participation was completely voluntary, and informed consent was obtained from all participants.

Service user inclusion criteria included:

  1. adults age 18 or older who have a chart-documented Diagnostic & Statistical Manual of Mental Disorders (DSM-V Axis I) diagnosis of schizophrenia, schizoaffective disorder, bipolar disorder, or persistent major depressive disorder;

  2. have been enrolled in treatment for at least three months;

  3. have been diagnosed with one or more chronic conditions;

  4. speak and read English; and

  5. provide voluntary informed consent for participation.

Peer support specialists’ inclusion criteria included:

  1. Certified peer specialist (self-reports a mental health diagnosis, is in active treatment, and has completed an 80-hour training that includes classes, small-group activities, and homework on the fundamentals of peer support, cross-cultural partnering, and human experience language, and has passed a written examination);

  2. speak and read English; and

  3. must provide voluntary informed consent for participation in the study.

All interviews lasted between 30 and 60 minutes and were audio recorded and transcribed. Participants were compensated $30 for participation. Participants were included based on their interest and willingness to participate in the study. All procedures were conducted in accordance with the ethical standards of the Institutional Review Board and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Qualitative interviews were conducted until saturation of data occurred (i.e., sampling more data would not lead to more information related to the research questions).

The interview guide was codesigned with two peer support specialists using the Peer and Academic Model of Community Engagement.17 The interview guide covered topics related to perspectives on community engagement to inform the software development lifecycle of digital interventions for people with SMI. The interview guide included four broad questions and probes:

  1. In your opinion, what is the role of service users or peer support specialists in developing digital health interventions? If they do not have a role, how do you think they could play a role?

  2. How could you help develop digital health interventions?

  3. Have you ever helped or contributed to developing a digital intervention? If so, what was your role? What was your experience in this role?

  4. How would you envision yourself assisting in the development of digital health interventions?

The interview guide also focused on topics around privacy, confidentiality, and security concerns, including three larger questions:

  1. “Do you have any concerns about digital health interventions for the management of mental illness?”

  2. “What are your thoughts on researchers monitoring your technology use?”

  3. “What are your thoughts on replacing clinicians with technology (e.g., chatbots, digital peer services, and mental health apps)?”

This article focuses on these last three questions, which relate to privacy, confidentiality, and security concerns in using mHealth, specifically digital mental health interventions, among peers and service users.

Data Analysis

Verbatim transcriptions of interview text were analyzed using thematic analysis.18 Initial categories derived directly from the interview guide and then from the interview data. Qualitative data were summarized, distilled, and condensed into aggregates and codes. The first and last authors read the data and incorporated new codes and operational definitions from transcript coding, a validated approach that allows for multiple perspectives.19 Codes were assigned to text and then grouped and checked for themes. Key themes were assessed for within-group consensus or disagreement using “member checking” or respondent validation.20 Member checking via group discussion was employed with four participants to validate the qualitative results and resolve any incongruent findings.
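To make the analysis workflow concrete, the toy Python sketch below illustrates the basic data structures of thematic coding: excerpts are tagged with codes, and codes are grouped into candidate themes. The excerpts, code names, and theme mapping here are invented for illustration; the study’s coding was done by the authors, iteratively and validated through member checking, not by software.

```python
from collections import defaultdict

# Invented excerpts and code names, for illustration only.
coded_excerpts = [
    ("I only use my phone to call my family.", ["technology use"]),
    ("They should ask me to sign something first.", ["consent"]),
    ("I worry someone could read my messages.", ["privacy concern"]),
]

# Codes grouped into candidate themes; in the study this grouping was
# iterative and checked with participants via member checking.
theme_of_code = {
    "technology use": "technology ownership and use",
    "consent": "social media diagnostics, sensors, and monitoring of data",
    "privacy concern": "awareness of privacy, confidentiality, and security",
}

themes = defaultdict(list)
for excerpt, codes in coded_excerpts:
    for code in codes:
        themes[theme_of_code[code]].append(excerpt)

for theme, quotes in themes.items():
    print(f"{theme}: {len(quotes)} supporting excerpt(s)")
```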

RESULTS

Study Sample

Service user participants had a mean age of 51.2 years (SD = 8.8; range 38–75) and were primarily men (70.6%) and White (82.4%). Service user participants included people diagnosed with major depressive disorder (29.4%), schizophrenia spectrum disorders (41.1%), and bipolar disorder (23.5%). Peer support specialists had a mean age of 39.7 years (SD = 12.1; range 24–61); 66.7% were female and 86.7% identified as White. All peer support specialists had completed the certified peer specialist training and were currently employed (see Table 1). We identified a set of eight larger codes relating to general knowledge of technology and digital mental health interventions; experience with technology; knowledge and perceptions of privacy, confidentiality, and security concerns related to mHealth tools; personal preferences; challenges using technology; technology ownership; and technology skills and literacy.

TABLE 1.

Demographic Characteristics of Participants.

                                          Peer support specialists   Service users
Characteristic                            n       %                  n       %
Gender
  Male                                    5       33.3               12      70.6
  Female                                  10      66.7               5       29.4
Age
  20–29                                   4       26.7               1       5.9
  30–39                                   2       13.3               1       5.9
  40–49                                   3       20                 9       52.9
  50–59                                   1       6.7                5       29.4
  60–69                                   2       13.3               1       5.9
  70–79                                   0       0                  0       0
Race/Ethnicity
  White                                   13      86.7               14      82.4
  Black/African American                  1       6.7                1       5.9
  American Indian/Alaskan Native          0       0                  0       0
  Asian                                   0       0                  0       0
  Native Hawaiian/Pacific Islander        0       0                  0       0
  More than 1 Race                        1       6.7                2       11.8
Education
  No Formal Schooling                     0       0                  0       0
  Some Elementary Schooling               0       0                  1       5.9
  Completed 8th Grade                     0       0                  1       5.9
  Some High School                        0       0                  5       29.4
  Completed High School or GED            0       0                  6       35.3
  Some College                            6       40                 2       11.8
  Completed College or Technical School   0       0                  1       5.9
  Completed Associate’s Degree            1       6.7                1       5.9
  Completed Bachelor’s Degree             5       33.3               0       0
  Some Graduate School                    2       13.3               0       0
  Completed Master’s Degree               1       6.7                0       0
  Completed Doctoral Degree               0       0                  0       0
Psychiatric Diagnoses (service users only)
  Bipolar Disorder                        –       –                  4       23.5
  Schizophrenia                           –       –                  4       23.5
  Schizoaffective Disorder                –       –                  3       17.6
  Major Depressive Disorder               –       –                  5       29.4
Smartphone ownership                      15      100                8       47.1

The following themes synthesize our qualitative data analysis:
  1. technology ownership and use;

  2. awareness and knowledge of privacy, confidentiality, and security concerns in using mHealth; and

  3. social media diagnostics, sensors, and monitoring of data.

Technology Ownership and Use

Smartphone

Variability existed in both groups in terms of technology ownership and use. All peer support specialists (100%) owned a smartphone, compared with 47% of service users. Of the service users who owned a smartphone, 41% reported using it “every day,” whereas all peer support specialists reported daily use. The most common use of a smartphone in both groups was to call and communicate with family (87%). Smartphone use among service users was affected not only by ownership but also by other factors such as age-related physical impairments, age-related technology adoption, and psychiatric symptoms (see Table 2 for representative quotes from service users). Smartphone use is also mediated by technology literacy and familiarity with smartphone capabilities, which is often a barrier for older populations. Although service users overall faced physical and financial barriers and were less familiar with smartphone capabilities, 59% of the service users expressed interest in learning about and using technology, specifically apps designed to build networks based on similar goals and experiences. In contrast, peer support specialists reported active use of smartphone apps for different purposes. For example, some of the most widely used apps among peer support specialists included Happy Color, Weight Watchers, Breathe to Relax, CVS, Calm, and Google Maps. Peer support specialists also reported experience with apps specifically designed to help people manage mental health, substance use, and physical conditions (see Table 3 for representative quotes from peer support specialists).

TABLE 2.

Service User Representative Quotes.

Technology ownership and use

“The closest I’ve gotten to [sic] using something like that has been a cell phone. So far. And, um, a TV remote control. And sometimes I- I have to note the letters are- I can read them alright, but uh, they’re hard to- some of them can be hard to understand.”

“I have a hard time hearing on the phone too. Then people tell you to call. My ears aren’t that good and my eyes aren’t that good. I have two cataracts.”

Knowledge of privacy, confidentiality, and ethics in mHealth and telehealth

“How would they have access to that kind of information anyway? I’m kind of concerned, I’m kind of, um, confused about that. It’s kind of like the TV, they probably do this a lot differently now, they have like the Nelson ratings...”

“Privacy being compromised... I don’t know if you watch the news but that’s a big issue with Facebook. I think they got sued for that.”

“Only concern aside from the fact that someone could be listening or actually copying... monitoring, you know like they have on the internet people...”

Social media diagnostics, sensors, and passive monitoring of data

“As long as am informed and aware that my data is being tracked.”

“As long as like, they had, they seek your approval. Having a consent form, yeah. They do that for everything with uh mental health. Like I can’t even get my psychiatrist to talk to my visiting nurse without me signing something.”
TABLE 3.

Peer Support Specialist Representative Quotes.

Technology ownership and use

“I use different apps or what not on my phone, but I think it would have to be put across in simpler terms for some–I think it’s different according to their age. You know older people would benefit from—They definitely would benefit from it, but it just would take a little practice.”

Knowledge of privacy, confidentiality, and ethics in mHealth tools

“Yeah, I have an app that’s called Calm but like for instance, here is an app called Sober Time, like track recovery, track sobriety, stay clean. Recovery. Time loss, twelfth, then there’s AA, there’s all sorts of AA apps. Recovery Box. Twelve steps in action, twelve steps in... oh that’s AA. You know, they go by different names. They target different subjects.”

“With respect to the digital mental health interventions I am concerned with harming privacy from peer-to-peer and concerns confidentiality, especially with interventions that have multiple points of entry, which make them less secure.”

“I assume that they [researchers] have to be HIPAA compliant. I have no concerns as long as the app requires some kind of a passcode.”

“I have concerns about whether/how to intervene if someone posts suicide-related content on a message thread, as well as conflicts that might arise if such reporting breaches a user’s confidentiality rights.”

“No concerns with digital health interventions because these are clients’ own phones; caveat of an intervention like this is that we have a paranoid population who might be hesitant to use anything that involves monitoring for supporting their recoveries. Other ethical concern is how reliance on smartphones and checking social media/communications can be damaging for some people and their recoveries.”

“I think about people who have seeing- seeing visual things in their symptoms. Um, that aren’t really for the rest of us in the world. So, does that exacerbate a symptom or exacerbate a part of their reality that isn’t our reality and is that healthy or not? And I think about like the spectrum of conditions and so while we want to create an app that fits everyone’s needs, like there’s going to be conflicts of interest for like a person with depression versus a person who has a schizophrenia diagnosis because someone might have an altered reality and someone else might just have problems with motivation, and sadness, and isolation.”

Social media diagnostics, sensors, and passive monitoring of data

“It depends on how they’re monitoring technology use; if it’s monitoring being done by companies like Facebook or Amazon it’s not great, but if it’s being done by clinicians or peers for tracking clients in order to help people with recoveries then thinks it’s good. For instance, an app that would allow a peer to have access to a homeless client’s location would be extremely helpful in helping to track down the person to meet with them.”

Social Media

All peer support specialists reported using social media every day. By contrast, approximately 53% of service users used social media, with Facebook being the platform most commonly used or known in both groups (80%). Across both groups, participants reported using smartphones for leisure activities (e.g., reading, watching videos, taking and storing pictures, and keeping up to date with sports scores), organizational activities (e.g., setting an alarm, writing notes), financial activities (e.g., banking and budgeting), and miscellaneous uses (e.g., global positioning system navigation, online shopping, email, and work).

Privacy, Confidentiality, and Security of mHealth and Digital Mental Health Interventions

Security of sensitive data was broadly reported as imperative in mHealth tools that support the management of mental health conditions. Notions and knowledge of privacy, confidentiality, and security with technology tools for the management of mental illness differed considerably between peer support specialists and service users. Generally, service users reported limited knowledge of privacy and breaches of private data, most commonly informed by social media coverage (e.g., Facebook’s past privacy violations). In contrast, peer support specialists reported high awareness and knowledge of privacy, confidentiality, and security issues regarding mHealth in general and digital health interventions in particular. Given their technology literacy, use of social media and apps, and experience of being in recovery, peer support specialists provided views on security and identified potential privacy issues and concerns with emergent mHealth tools for service users.

With respect to digital mental health interventions, some peer support specialists reported being concerned with giving too much information, while others reported not being concerned at all. Peer support specialists were aware of the protection, privacy, and security of health information and of service users’ rights due to their recovery experience and participation in mental health interventions. For example, some peers reported not being concerned with mental health interventions as long as the interventions follow protocols for confidentiality and the Health Insurance Portability and Accountability Act (HIPAA).

Yet, some peers reported being concerned with the consequences of using certain smartphone apps or social media platforms. Other cited concerns included the privacy of content shared by peer support specialists and service users, and how the classification of information may have negative impacts on users. One peer support specialist explained that “gray areas created by social networking features of apps/technology can serve to foster online bullying/posting of harmful content.”

Given the lived experience and spectrum of mental health conditions among service users, peer support specialists identified an ethical dilemma of privacy versus safety with certain digital mental health tools that do not account for the experiences and realities of service users dealing with psychiatric symptoms. One peer support specialist expressed that app designers must be aware of the mental state of the target population, who may not always feel comfortable using phone apps or participating in digital health interventions. Feelings of paranoia, suicidal thoughts, or psychotic episodes can make the user more troubled about privacy and security as well as interfere with the user’s path to recovery (see Table 3 for representative quotes from peer support specialists). Peer support specialists raised highly relevant safety concerns about potential users who cannot or do not want to use digital devices because of the symptoms they are experiencing, and noted that it may be helpful to temporarily suspend anonymity and confidentiality when a user is at risk.

Social Media Diagnostics, Sensors, and Monitoring of Data

Responses and attitudes toward passive monitoring of technology use, such as digital phenotyping, social media diagnostics, and Bluetooth-enabled motion sensor data, were diverse across both peers and service users. Because of their lower engagement with technology and fewer technology skills, service users reported less awareness of social media diagnostics and monitoring of data. For example, some service users cited Alexa, Facebook, and Google as examples of data monitoring. Some service users regarded passive monitoring as an invasion of privacy, while others were comfortable with monitoring if there is an agreement through an informed consent form. Most service users in this study were aware of their rights related to consenting to monitoring. For example, one service user asserted, “as long as am informed and aware that my data is being tracked.” Another service user explained that consumers must be provided with information to make an informed decision about whether to use new technology tools or participate in digital health interventions.

Peer support specialists in this study were more inclined to accept passive monitoring through digital phenotyping, social media diagnostics, and Bluetooth-enabled motion sensor data, and to reflect on its end goal. They were more accepting of passive monitoring if done by clinicians or peers who can help and support users in their recovery, rather than by big tech corporations: “It depends on how they’re monitoring technology use; if it’s monitoring being done by companies like Facebook or Amazon it’s not great, but if it’s being done by clinicians or peers for tracking clients in order to help people with recoveries then thinks it’s good.”

DISCUSSION

This article reports qualitative data from service users (e.g., patients) and peer support specialists on their understandings and perceptions of privacy, confidentiality, and security of digital mental health interventions. Although digital health technologies, including mobile phones, were commonly owned by participants in this study, ownership was far from ubiquitous among service users. Additionally, service users reported limited knowledge and awareness of privacy, confidentiality, and security. The main barriers to engagement with mHealth among service users included technology ownership and use, lack of technology literacy, and other factors hindering the use of technology, such as age-related physical impairments and psychiatric symptoms. These findings are consistent with recent systematic reviews of the factors that affect technology use among older adults21,22 and with previous studies that found low technology ownership and engagement among older service users and people with an SMI diagnosis.23,24 Given service users’ limited experience and engagement with digital health tools, their knowledge of privacy, confidentiality, and safety was limited.

In comparison, peer support specialists demonstrated a high level of technology engagement, literacy of digital mental health tools, and a more comprehensive awareness and knowledge of privacy, confidentiality, and security concerns in using mHealth and in social media diagnostics, sensors, and monitoring of data. For instance, the monitoring of social data interaction was a bigger concern among peer support specialists, who suggested that monitoring may be negatively interpreted by service users as surveillance and can potentially increase psychiatric symptomatology (e.g., increased feelings of paranoia, suicidal thoughts, or psychotic episodes), interfering with the user’s path to recovery. Evaluation studies of specific apps often do not mention adverse events, experiences, or risks associated with their apps.11 This finding points toward the potential broader harms associated with mental health app use and the relevance of incorporating end users’ perspectives and needs.

A critical aspect of privacy, confidentiality, and security in digital mental health interventions is the passive monitoring and collection of data. Passive data collection on digital technologies monitors patient location, activity levels, and social engagements within other smartphone applications. For example, innovative technological and analytical techniques such as digital phenotyping, the moment-by-moment quantification of interactions, behaviors, and cognitions, may present heightened privacy, confidentiality, and general ethical concerns. In this study, service users saw this issue as related to consent and infringement of privacy, while some peers saw the acceptability of passive monitoring as contingent on the situation. Both groups place different values on their data, and everyone has the right to decide how risk averse they choose to be regarding digital mental health tools. However, to meet the standard of “informed” in informed consent, individuals must be given information, including the potential risks, in a way that is genuinely usable and accessible.9,12
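One way to operationalize the “informed” standard discussed above is to gate each passive data stream on explicit, revocable consent so that unconsented samples are never stored. The Python sketch below is a hypothetical illustration of this idea, not a description of any system examined in this study; the ConsentRecord schema and the stream names are invented assumptions.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    """Explicit, revocable opt-in per passive data stream (invented schema)."""
    location: bool = False
    activity: bool = False
    granted_at: Optional[datetime] = None

def collect_sample(consent: ConsentRecord, stream: str, value: object) -> Optional[dict]:
    """Store a passive sample only if the user has opted in to that stream."""
    if consent.granted_at is None or not getattr(consent, stream, False):
        return None  # no consent on record: drop the sample rather than store it
    return {"stream": stream, "value": value, "time": datetime.now().isoformat()}

consent = ConsentRecord(activity=True, granted_at=datetime.now())
print(collect_sample(consent, "activity", 4200))    # stored: user opted in
print(collect_sample(consent, "location", (0, 0)))  # None: stream not consented
```

The design choice here is that consent is checked at the point of collection, not at the point of analysis, so revoking consent immediately stops new data from accumulating.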

In terms of facilitators and potential solutions for the ethical engagement of digital mental health interventions, peer support specialists are a potential resource. Peer support specialists can facilitate the engagement of service users with fewer skills and less technological literacy.25,26 Most peer support specialists own and use smartphones, and they also see the promise of using smartphones to deliver services.27 In many settings, peer support specialists work as part of a team with other behavioral health professionals in mental health programs and recovery centers. Engaging peers and service users in a collaborative partnership can help ensure the benefits of digital mental health tools and provide technical assistance and education. In addition, recent efforts have offered a Digital Peer Support Certification26 to peers, including education, simulation training sessions, and synchronous and asynchronous support services, which increases peers’ capacity to use digital peer support technological features. This places peer support specialists in a unique position as providers with expert knowledge, end users, and digital health promoters.

To be ethically justified, digital mental health tools and interventions must deliver sufficient benefits to balance any risks to consumers.9 A concrete solution is to include end users in the software development lifecycle as partners (not only as subjects in usability studies) in designing mHealth services.17,26 Within the realm of smartphone app interventions, evidence indicates that combining a highly involved participatory research approach with user-centered design throughout the software development lifecycle leads to the highest levels of engagement among people with SMI.27

In addition to partnering with industry to advance the science of peer support in digital psychiatry, other promising means of implementing effective mental health technologies include the use of participatory research techniques in the development of digital mental health apps and peer support interventions.28 The professional engagement and consultation of potential end users during design, development, and deployment is an issue of justice29 and “design justice”13 that involves rethinking other aspects of design practice, including the intended design beneficiaries: the “users” and their needs.

Digital mental health tools developed through academic research projects are held to HIPAA regulations and a standard of ethics, but applications produced by other entities are not held to the same standards and do not require informed consent prior to collecting and sharing personal data. Increased utilization of mental health applications accentuates the need for increased privacy regulations in digital mental health tools, in line with the privacy regulations set forth for traditional protected health information. Finally, a promising direction in the design of digital mental health is to integrate quality assessment frameworks30 and privacy guideline tools further into the digital health development process to facilitate “privacy by design” principles and bring development and design teams closer to compliance with regulatory frameworks.7
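As a hypothetical illustration of what “privacy by design” can mean in practice, the sketch below applies two widely used techniques, data minimization and pseudonymization, before any record leaves the device. The field names, the allowed-field list, and the salt handling are invented for illustration; a real deployment would align these choices with the applicable regulatory framework (e.g., HIPAA).

```python
import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a direct identifier with a salted hash before upload."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:16]

def minimize(sample: dict) -> dict:
    """Data minimization: keep only the fields the intervention actually needs."""
    allowed = {"mood_score", "sleep_hours"}
    return {k: v for k, v in sample.items() if k in allowed}

# Invented record: the device may capture far more than the intervention needs.
raw = {"user": "jane@example.com", "mood_score": 6,
       "sleep_hours": 7.5, "gps": (43.70, -72.29)}

record = minimize(raw)  # email and GPS dropped before the record leaves the device
record["uid"] = pseudonymize(raw["user"], salt="per-deployment-secret")
print(record)
```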

While these findings advance our understanding of digital mental health, it is important to note limitations. As with most qualitative research, our findings are not broadly generalizable to populations at large. Recruitment and retention of ethnic, racial, and gender minorities was a challenge in our study given the geographic location and demographics of our study site. However, the individuals in this study come from diverse socioeconomic and educational backgrounds, SMI diagnoses, and lived experiences, which provide unique perspectives on ethical, privacy, and security concerns in using digital mental health tools and engaging in digital mental health interventions. Finally, the diagnoses of peer support specialists are not reported (due to their employment status and protection under the Americans with Disabilities Act);30 as such, it is not possible to conduct subanalyses based on mental health diagnosis.

Overall, our findings suggest that service users experience unique barriers to engagement with, literacy in, and ownership of digital technologies and digital mental health interventions due to age-related physical impairments, psychiatric symptoms, and lack of technology literacy. Our study identifies peer support specialists as potential facilitators of technology training for older service users, as well as of establishing social networks and mediating relationships with broader institutions. Partnering with mental health researchers, industry developers, and potential end users to evaluate promising digital peer support platforms may lead the field to a better understanding of the effective uses of technology for people with mental health conditions.

ACKNOWLEDGMENTS

The work of K. L. Fortuna was supported by a K01 Award from the National Institute of Mental Health under Grant K01MH117496.

Biography

MARIA D. VENEGAS is a medical anthropologist with the New England Geriatric Research, Education, and Clinical Center (GRECC), ENRM Veterans Hospital. She completed a Postdoctoral Fellowship with the Centers for Health and Aging, Department of Psychiatry, Dartmouth Hitchcock Medical Center. Her primary research interests include ethnographically informed studies of mental health, mHealth and telehealth for the management of complex chronic conditions, medication adherence, and care utilization among economically disadvantaged, minority, and older populations. She is the corresponding author of this article. Contact her at maria.venegas@va.gov.

JESSICA M. BROOKS is a licensed psychologist in New York State, Texas, and Wisconsin. Her primary affiliation is with the Department of Psychiatry, University of Wisconsin-Madison and UW Health. For over a decade, she has been involved in clinical care and foundation and federally funded mental health promotion research, and her research interests include peer support services, vocational rehabilitation, and integrated mental and physical health services development. She is a member of the American Psychological Association (APA). Contact her at jbrooks3@uwhealth.org.

AMANDA L. MYERS is a research associate with Brandeis University’s Institute for Behavioral Health within the Heller School for Social Policy and Management in Waltham, Massachusetts. She has worked in equal partnership with people with a lived experience of serious mental illness worldwide to coproduce and empirically test digital technologies and related trainings. Her research interests revolve around working with individuals with serious mental illness, utilizing user-centered design and community-engagement methods to develop mHealth technologies, and working with vulnerable populations. She is a member of Collaborative Design for Recovery and Health. Contact her at amanda@digitalpeersupport.org.

MARIANNE STORM is affiliated with the University of Stavanger, Department of Health Sciences. She is a registered nurse and a researcher interested in patient and user involvement, care coordination, social innovation, health promotion, telecare, and digital health. Contact her at marianne.storm@uis.no.

KAREN L. FORTUNA is an assistant professor of psychiatry with Dartmouth College, Hanover, NH, USA. She works in equal partnership with peer support specialists in the United States, Canada, Europe, Australia, and New Zealand in coproducing and empirically testing digital peer support technologies and trainings. She serves on the APA’s Expert Advisory Panel on smartphone app development and PCORI’s Advisory Panel on Patient Engagement. She serves as editor of the Journal of Participatory Medicine. Contact her at Karen.L.Fortuna@Dartmouth.edu.

Contributor Information

Maria D. Venegas, Department of Veterans Affairs GRECC, Bedford, MA, 01730, USA.

Jessica M. Brooks, University of Wisconsin-Madison, Madison, WI, 53701, USA.

Amanda L. Myers, Rivier University, Nashua, NH, 03060, USA.

Marianne Storm, University of Stavanger, 4007 Stavanger, Norway.

Karen L. Fortuna, Dartmouth College, Hanover, NH, 03755, USA.

REFERENCES

1. Fortuna KL, Venegas M, Umucu E, Mois G, Walker R, and Brooks JM, “The future of peer support in digital psychiatry: Promise, progress, and opportunities,” Psychiatric Quart., vol. 89, no. 4, pp. 947–956, 2019.
2. Torous J and Keshavan M, “COVID-19, mobile health and serious mental illness,” Schizophrenia Res., vol. 218, pp. 36–37, 2020.
3. Martinez-Martin N and Kreitmair K, “Ethical issues for direct-to-consumer digital psychotherapy apps: Addressing accountability, data protection, and consent,” JMIR Ment. Health, vol. 5, no. 2, 2018, Art. no. e32. [Online]. Available: https://mental.jmir.org/2018/2/e32
4. Bauer M, Glenn T, Monteith S, Bauer R, Whybrow PC, and Geddes J, “Ethical perspectives on recommending digital technology for patients with mental illness,” Int. J. Bipolar Disord., vol. 5, 2017, Art. no. 6.
5. Lucivero F and Jongsma KR, “A mobile revolution for healthcare? Setting the agenda for bioethics,” J. Med. Ethics, vol. 44, pp. 685–689, 2018.
6. Nebeker C et al., “Engaging research participants to inform the ethical conduct of mobile imaging, pervasive sensing, and location tracking research,” Transl. Behav. Med., vol. 6, no. 4, pp. 577–586, 2016.
7. Nurgalieva L, O’Callaghan D, and Doherty G, “Security and privacy of mHealth applications: A scoping review,” IEEE Access, vol. 8, pp. 104247–104268, 2020, doi: 10.1109/ACCESS.2020.2999934.
8. Karampela M, Ouhbi S, and Isomursu M, “Exploring users’ willingness to share their health and personal data under the prism of the new GDPR: Implications in healthcare,” in Proc. 41st Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2019, pp. 6509–6512, doi: 10.1109/EMBC.2019.8856550.
9. Cvrkel T, “The ethics of mHealth: Moving forward,” J. Dent., vol. 74, no. 1, pp. S15–S20, 2018.
10. Dorsey ER et al., “The use of smartphones for health research,” Acad. Med., vol. 92, pp. 157–160, 2017.
11. Huckvale K, Nicholas J, Torous J, and Larsen M, “Smartphone apps for the treatment of mental health conditions: Status and considerations,” Curr. Opin. Psychol., vol. 23, pp. 65–70, 2020.
12. Glenn T and Monteith S, “Privacy in the digital world: Medical and health data outside of HIPAA protections,” Current Psychiatry Rep., vol. 16, 2014, Art. no. 494, doi: 10.1007/s11920-014-0494-4.
13. Costanza-Chock S, “Design practices: ‘Nothing about us without us’,” in Design Justice: Community-Led Practices to Build the Worlds We Need. Cambridge, MA, USA: MIT Press, 2020.
14. Yeager C and Benight C, “If we build it, will they come? Issues of engagement with digital health interventions for trauma recovery,” mHealth, vol. 4, 2018, Art. no. 37.
15. Eysenbach G, “The law of attrition,” J. Med. Internet Res., vol. 7, no. 1, 2005, Art. no. e11, doi: 10.2196/jmir.7.1.e11.
16. Larsen ME, Nicholas J, and Christensen H, “A systematic assessment of smartphone tools for suicide prevention,” PLoS One, vol. 11, no. 4, 2016, Art. no. e0152285, doi: 10.1371/journal.pone.0152285.
17. Fortuna K et al., “Application of community-engaged research to inform the development and implementation of a peer-delivered mobile health intervention for adults with serious mental illness,” J. Participatory Med., vol. 11, no. 1, 2019, Art. no. e12380, doi: 10.2196/12380.
18. Braun V and Clarke V, “Using thematic analysis in psychology,” Qualitative Res. Psychol., vol. 3, no. 2, pp. 77–101, 2006, doi: 10.1191/1478088706qp063oa.
19. Martin PY and Turner BA, “Grounded theory and organizational research,” J. Appl. Behav. Sci., vol. 22, pp. 141–157, 1986.
20. Birt L, Scott S, Cavers D, Campbell C, and Walter F, “Member checking: A tool to enhance trustworthiness or merely a nod to validation?,” Qualitative Health Res., vol. 26, no. 13, pp. 1802–1811, 2016, doi: 10.1177/1049732316654870.
21. Kavandi H and Jaana M, “Factors that affect health information technology adoption by seniors: A systematic review,” Health Social Care Community, vol. 28, pp. 1827–1842, 2020.
22. Rocheleau JN et al., “Factors affecting information technology use from the perspective of aging persons with cognitive disabilities: A scoping review of qualitative research,” Technol. Disabil., vol. 32, pp. 1–13, 2020.
23. Depp C, Moore R, Perivoliotis D, and Granholm E, “Technology to assist and support self-management in serious mental illness,” Dialogues Clin. Neurosci., vol. 18, no. 2, pp. 171–183, 2016.
24. Naslund JA, Aschbrenner KA, and Bartels SJ, “How people with serious mental illness use smartphones, mobile apps, and social media,” Psychiatr. Rehabil. J., vol. 39, no. 4, pp. 364–367, 2016.
25. Unertl K et al., “Integrating community-based participatory research and informatics approaches to improve the engagement and health of underserved populations,” J. Amer. Med. Informat. Assoc., vol. 23, no. 1, pp. 60–73, 2016.
26. Fortuna K, Myers A, Walsh D, Walker R, Mois G, and Brooks J, “Strategies to increase peer support specialists’ capacity to use digital technology in the era of COVID-19: Pre-post study,” JMIR Mental Health, vol. 7, no. 7, 2020, Art. no. e20429, doi: 10.2196/20429.
27. Fortuna K et al., “Digital peer support mental health interventions for people with a lived experience of a serious mental illness: Systematic review,” JMIR Mental Health, vol. 7, no. 4, 2020, Art. no. e16460.
28. Fortuna KL et al., “Peer and non-peer academic scientists and peer support specialist community of practice: Stakeholder engagement to advance the science of peer support,” in Proc. IEEE Glob. Humanitarian Technol. Conf., 2021, pp. 188–194.
29. Nunes-Vilaza G and McCashin D, “Is the automation of digital mental health ethical? Applying an ethical framework to chatbots for cognitive behaviour therapy,” Front. Digit. Health, vol. 3, 2021, Art. no. 689736.
30. Carlo AD, Hosseini Ghomi R, Renn B, and Arean PA, “By the numbers: Ratings and utilization of behavioral health mobile applications,” NPJ Digit. Med., vol. 2, 2019, Art. no. 54.
31. Health and Human Services, “Summary of the HIPAA privacy rule,” 2003. [Online]. Available: https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html
