Published in final edited form as: Behav Ther. 2019 Aug 8;51(1):1–14. doi: 10.1016/j.beth.2019.08.001

Cognitive-Behavioral Therapy in the Digital Age: Presidential Address

Sabine Wilhelm, Hilary Weingarden, Ilana Ladis, Valerie Braddick, Jin Shin, and Nicholas C. Jacobson

In any given year in the United States, one in five adults suffers from a mental illness (National Institute of Mental Health, 2019; Substance Abuse and Mental Health Services Administration, 2018). Similarly, across the globe, one in five persons suffers from a mood, anxiety, and/or substance use disorder in any given year (Steel et al., 2014). Moreover, globally one in seven children and adolescents meets criteria for a mental disorder (Polanczyk, Salum, Sugaya, Caye, & Rohde, 2015). Half of all chronic mental illnesses begin by age 14, and nearly three quarters have their onset by age 24 (Kessler, Chiu, Demler, Merikangas, & Walters, 2005). Thus, mental illnesses affect a sizable portion of the population, and they affect people beginning at an early age.

In addition to being highly prevalent, mental illnesses result in a substantial financial burden to society. In the United States, mental disorders account for over $200 billion in annual health care expenditures and cost more than any other health condition, exceeding expenditures for heart conditions, trauma, and cancer (Roehrig, 2016). This includes productivity losses, as mental illness is also the leading cause of disability worldwide (Friedrich, 2017; Whiteford et al., 2013).

Despite the great need to treat mental illness, the majority (59%) of individuals who meet criteria for a current psychiatric diagnosis have not received treatment in the past year (Wang, Berglund, et al., 2005). Moreover, only about one in seven persons with a past-year diagnosis received minimally adequate care (i.e., having any appointments with a psychotherapist, social worker, counselor, therapist, or mental health nurse in the past year; Wang, Lane, et al., 2005). Relatedly, there is a large gap between illness onset and the receipt of care, with a median delay ranging from 6 to 23 years depending on the disorder (Cullen et al., 2008; Marques et al., 2010; Pinto, Mancebo, Eisen, Pagano, & Rasmussen, 2006; Stengler et al., 2013; Thompson, Issakidis, & Hunt, 2012; Wang, Berglund, et al., 2005). Compounding these issues, when seeking treatment, most individuals do not receive care in a timely manner (Trusler, Doherty, Mullin, Grant, & McBride, 2006), increasing the no-show rate and exacerbating symptom severity (DiMino & Blau, 2012; Folkins, Hersch, & Dahlen, 1980; Hicks & Hickman, 1994; Williams, Latta, & Conversano, 2008). Altogether, these data suggest that most individuals with a mental illness do not receive treatment, and most who do receive treatment only do so after suffering for years without care.

In exploring obstacles to receiving mental health care, we identified many patient-level barriers. For instance, individuals often report logistical barriers, including a lack of transportation to attend appointments, an inability to take time off from work to attend appointments, and/or a lack of child care (Harvey & Gumport, 2015). Although some studies have failed to find a relationship between stigma and a delay in seeking treatment (Green, Hunt, & Stain, 2012; Johnson & Coles, 2013), the stigma of walking into the office of a mental health care provider may also present a formidable barrier to obtaining treatment (Hepworth & Paxton, 2007; Hinshaw & Stier, 2008; Jung, von Sternberg, & Davis, 2017; Link & Phelan, 2006; Schreiber, Maercker, & Renneberg, 2010). Moreover, many persons cannot afford the high costs of treatment (Green et al., 2012; Hepworth & Paxton, 2007; Ho, Hunt, & Li, 2008; Kremer & Gesten, 2003), which often costs thousands of dollars in out-of-pocket expenses for a given individual (Crow et al., 2013; Daley, Morin, LeBlanc, Gregoire, & Savard, 2009; Kass et al., 2017; Otto, Pollack, & Maki, 2000). Therefore, logistical barriers, stigma, and cost represent tangible patient-level barriers to receiving timely treatment.

In addition to patient-level barriers, broader system-level barriers also impede patients’ ability to obtain treatment. Based on American Psychological Association statistics released in 2016, there are only 14.7 licensed psychologists in the United States per 100,000 people (Lin, Stamm, & Christidis, 2016). Thus, even if all psychologists provided psychotherapy to 50 clients per week, only about 3,185 patients per 100,000 people would receive care in a given year, whereas prevalence estimates suggest that 26,400 per 100,000 meet criteria for a psychological disorder yearly (Demyttenaere et al., 2004). Unfortunately, supply problems are only compounded elsewhere in the world. In other regions, such as Africa, Southeast Asia, the Eastern Mediterranean, and the Western Pacific, the number of mental health care workers of any type averages less than 10 per 100,000 (World Health Organization, 2015). Hence, even in resource-rich countries, most persons with psychiatric diagnoses do not have access to care. Accordingly, such evidence suggests that the current care system cannot be scaled to treat patients at the population level.
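To make the arithmetic behind these supply figures concrete, a brief illustrative calculation is sketched below. It assumes roughly 12-week treatment episodes; that assumption is introduced here only for illustration and is not stated in the cited sources.

    # Back-of-the-envelope supply calculation, per 100,000 people.
    # Assumption (for illustration only): a course of therapy lasts ~12 weeks,
    # so each weekly client slot can serve about 52/12 patients per year.
    psychologists_per_100k = 14.7          # Lin, Stamm, & Christidis (2016)
    weekly_client_slots = 50               # hypothetical full caseload
    episodes_per_slot_per_year = 52 / 12   # ~4.3 twelve-week episodes

    treated_per_100k = psychologists_per_100k * weekly_client_slots * episodes_per_slot_per_year
    prevalence_per_100k = 26_400           # ~26.4% 12-month prevalence (Demyttenaere et al., 2004)

    print(round(treated_per_100k))                           # ~3,185 patients treated
    print(round(prevalence_per_100k / treated_per_100k, 1))  # roughly 8 times more need than capacity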

Using the Internet to Close the Access Gap

While technology in general has a negative reputation when it comes to its influence on mental health (e.g., Twenge & Campbell, 2018), it can provide innovative solutions that may address many individual- and system-level treatment barriers. The growth of the Internet presented the first major opportunity to increase access to care. Specifically, Internet availability increased by approximately 685% between 2000 and 2018, with 48% of the world’s population now having Internet access (Hilbert & López, 2011; International Telecommunications Union, 2018); major technology companies also continue to expand access in less developed countries (e.g., West, 2015). Internet cognitive-behavioral therapy (I-CBT), a first-wave technology-based movement, transfers what was formerly considered self-help material to a Web-based format (Andersson, Carlbring, & Lindefors, 2016). Using I-CBT, patients can sit down at their home computers, read psychoeducational material, and practice CBT exercises and worksheets. Often, a therapist or coach spends a circumscribed amount of time guiding patients through the modules, although the required therapist time is typically only a fraction of that required by traditional face-to-face methods (Enander et al., 2016). Many pioneers have made great strides in developing these treatments and evaluating the efficacy of I-CBT (e.g., Andersson, 2009; Andersson, Carlbring, Berger, Almlöv, & Cuijpers, 2009; Andersson et al., 2016; Batterham et al., 2015; Bergström et al., 2010; Carlbring, Ekselius, & Andersson, 2003; Glozier et al., 2013; Griffiths & Christensen, 2007; Kaldo et al., 2015). In addition to the potential for I-CBT to be more accessible to patients, meta-analyses have shown I-CBT to be superior to wait-list control groups, and randomized controlled trials have shown I-CBT to have approximately equivalent efficacy to face-to-face CBT (Andrews et al., 2018; Carlbring, Andersson, Cuijpers, Riper, & Hedman-Lagerlöf, 2018). Taken together, these meta-analyses suggest that I-CBT is a safe and effective method to decrease access barriers.

Using Phones to Provide Care In Situ

Although I-CBT is effective and addresses key access barriers, most of its practice occurs within one setting (i.e., at home, in front of one’s computer). However, symptoms can occur anywhere, at any time (e.g., Moskowitz & Young, 2006; Newman et al., 2019). Given the ubiquity of cell phones in daily life, more recent technology-based treatments have relied on smartphones to close this access gap. Two of the most prominent national carriers have independently estimated their coverage rates to be 99% in the United States, and current 4G coverage rates (averaging 4.5–6.9 Mbps) range from 92 to 97% (Global Wireless Solutions, 2017). Moreover, 81% of those in the United States own a smartphone (Pew Research Center, 2019). Worldwide, the number of mobile-cellular network subscriptions exceeds one phone line per person, and it averages 70 lines per 100 people even in the least developed countries (International Telecommunications Union, 2018). Thus, phones present a viable platform to address symptoms as they occur within daily environments.

Indeed, the field has been increasingly moving toward smartphone-based treatments, principally focused on mobile phone applications (apps). Currently, thousands of mental health apps are available on the market (Larsen et al., 2019). Mobile health apps are optimally designed for brief, frequent use throughout the day (Mohr et al., 2017). Apps may have greater ecological validity than prior treatments (Schembre et al., 2018), and it is possible that treatment apps can result in better generalization to the real world, since they are accessible at any time and place that symptoms arise. Mental health apps approach treatment from different angles. Some apps focus on symptom monitoring (Mehdizadeh, Asadi, Mehrvar, Nazemi, & Emami, 2019) and allow users to repeatedly log their symptoms with rating scales over time (Nicholas, Larsen, Proudfoot, & Christensen, 2015). Other apps focus on a single skill, like cognitive restructuring or mindfulness (Mohr, Tomasino, et al., 2017). Some apps are intended to be used in conjunction with a live therapist (Bry, Chou, Miguel, & Comer, 2018; Gindidis, Stewart, & Roodenburg, 2018), such as an app that sends food diaries to clinicians for patients with eating disorders (Lindgreen, Clausen, & Lomborg, 2018). Finally, there are apps designed as full standalone treatments, which incorporate many techniques from CBT (Bakker, Kazantzis, Rickwood, & Rickard, 2018; Bry et al., 2018; Gindidis et al., 2018; Wilhelm et al., 2019)—nevertheless, these apps may still be combined with in-person treatments (Ventura & Chung, 2019). Thus, apps present a range of viable treatment options at varying levels of care.

Randomized controlled trials of therapy apps’ efficacy conducted to date have shown them to be quite promising. Meta-analyses for anxiety and depression have shown that app-based treatments are superior to control conditions (Firth, Torous, Nicholas, Carney, Pratap, et al., 2017; Firth, Torous, Nicholas, Carney, Rosenbaum, et al., 2017). In particular, apps have shown superiority to inactive control conditions with moderate effect sizes (g = 0.45 for anxiety, g = 0.56 for depression; Firth, Torous, Nicholas, Carney, Pratap, et al., 2017; Firth, Torous, Nicholas, Carney, Rosenbaum, et al., 2017). In addition, apps have shown superiority to active control conditions with small effect sizes (g = 0.19 for anxiety, g = 0.22 for depression; Firth, Torous, Nicholas, Carney, Pratap, et al., 2017; Firth, Torous, Nicholas, Carney, Rosenbaum, et al., 2017). Although meta-analyses have yet to be conducted in other domains, systematic reviews suggest that apps show promising uptake and potential efficacy in treating those with serious mental illnesses (Batra et al., 2017; Berry, Lobban, Emsley, & Bucci, 2016; Firth & Torous, 2015). Taken together, this early evidence suggests that certain apps may provide promising platforms for symptom reduction. However, while there is cause for optimism, we need to remain cautious: as will be described below, many mental health apps available for public download are not based on scientific principles, and their outcomes have not been assessed.

In addition to the strong early outcomes demonstrated by apps grounded in empirical principles, smartphone-based treatments address many of the key barriers to accessing mental health care. In particular, the ubiquitous nature of smartphones, coupled with the scalability of smartphone-based interventions, may help to address the access problem. Unlike traditional in-person care, with its long wait times, smartphone-based treatments are available in seconds and consequently can provide treatment immediately as symptoms develop. Moreover, smartphone-based treatments are often designed to be utilized within daily life—as such, they may be less time-consuming and disruptive to engage in, present no transportation or child care issues, and require no time off work. Smartphone-based treatments may also decrease access barriers related to shame, as persons can easily receive treatment delivered in a less clinical or pathologizing format (Garnett et al., 2018). In particular, given that utilizing health and wellness apps has become so commonplace, smartphone-based treatments may feel less stigmatizing than in-person treatments (Kasckow et al., 2014). Furthermore, apps are often available at a considerably lower cost than in-person treatment. Last, although the fidelity of in-person therapy to validated techniques can vary dramatically in outpatient settings (Weisz et al., 2013; Zima et al., 2005), app-based treatments have the potential to deliver standardized care across persons.

Challenges of Introducing New Technologies in Mental Health Care

While it is evident that technology offers many opportunities for mental health care, integration of technology into research, assessment, and treatment also introduces challenges of its own. Considering the fast pace with which new technologies are developed and integrated into mental health care, it is essential that we identify and thoughtfully address these challenges as early as possible. Perhaps the three most central challenges we face to date include (a) issues of low engagement with digital CBT tools, (b) a lack of sufficient evidence for many digital mental health tools, and (c) poor understanding related to security issues for technology-based tools.

Engagement

In traditional, face-to-face therapy, we experience challenges of engagement and retention. It is no surprise, therefore, that digital CBT tools such as smartphone-based treatments face even more significant hurdles related to patient engagement—that is, users’ uptake and adherence with a digital tool (Torous, Nicholas, Larsen, Firth, & Christensen, 2018). In fact, many mental health apps and computer-based treatments struggle to keep users engaged at all. For example, out of roughly 150,000 downloads of the well-known app PTSD Coach, only 15.6% of its users had opened the app the week after download, and only 37% ever accessed its primary content (Owen et al., 2015). A second study investigated naturalistic engagement rates with the Intellicare suite of CBT apps during its first year of public availability (Lattie et al., 2016). The modal number of uses for each app was one (mean number of uses ranged by app from 3 to 17; Lattie et al., 2016).

A Need for Additional Support?

Low levels of engagement suggest that digital interventions need to incorporate additional features that bring individuals back to the intervention. For example, as users might need to feel accountable to the program and cared for (Newman, Szkodny, Llera, & Przeworski, 2011), experts in the space of digital mental health advise that incorporating mental health clinicians or coaches into technology-delivered treatments is integral to their success (Mohr, Cuijpers, & Lehman, 2011; Torous et al., 2018). In fact, several I-CBT and smartphone-based therapies currently involve therapist or coach contact, ranging from very minimal interactions all the way to using an app only as an adjunct to face-to-face therapy. Although the amount of support required for engagement may vary depending on the type of clinical issues being addressed (Newman et al., 2011), it appears that only a small amount of therapist time may be necessary to meaningfully bolster engagement. For example, in our team’s open pilot trial of 12-week smartphone-based CBT for body dysmorphic disorder (BDD; N = 10), attrition rates were 0% and mean number of minutes spent on the app per user was 398 (SD = 310.25; Wilhelm et al., 2019). We hypothesize that strong preliminary engagement rates may be attributed to a combination of an extensive user-centered design process (described below) as well as circumscribed interactions with a clinician. Specifically, the clinician spent an average of approximately 1 hour communicating with each user across the whole 12-week program, via a combination of phone calls and asynchronous in-app messaging (Wilhelm et al., 2019). Importantly, it may not be necessary for human interactions to occur with a licensed clinician—rather, lay coaches with some training may be just as effective for keeping users engaged (Mohr, Tomasino, et al., 2017). Ultimately, however, most published studies of technology-based treatments to date do not include details on the level of human support provided (Hollis et al., 2017), underscoring the importance of further research that specifically seeks to elucidate how much and what kind of human support is necessary to efficiently and effectively enhance engagement.

Beyond using coach or therapist support, there are emerging, cutting-edge approaches to keeping users engaged that seek to mimic or replace the role of the therapist. Replacing some or all of the time required from a trained coach or highly trained clinician likely enhances the scalability and cost-effectiveness of digital interventions. For example, chatbots, or “fully automated conversational agents,” use natural language processing to interact with users and deliver support, enhance motivation, or even teach CBT skills via fluid text conversations (Fitzpatrick, Darcy, & Vierhile, 2017). One example of a chatbot is Woebot, which uses a text platform to engage with users and provide CBT skills. A randomized controlled trial of Woebot (N = 70) showed that depression symptoms decreased more in nonclinical college students who interacted with Woebot than in students in an information-only control condition, with a medium effect (Fitzpatrick et al., 2017). Moreover, Woebot had high usage rates (it was checked an average of 12 times over 2 weeks) and satisfaction ratings (mean = 4.3/5; Fitzpatrick et al., 2017). Substantial additional research is needed to understand whether chatbots may be effective for teaching CBT skills in clinical samples and across psychiatric diagnoses.

Likewise, avatars can be used to enhance digital interventions in a diversity of ways, one of which is to mirror telehealth therapist contact (Rehm et al., 2016). Avatars are “digital self-representations, which enable individuals to interact with each other in computer-based virtual environments” (Rehm et al., 2016). Rehm and colleagues describe a study comparing 91 adults’ interactions with an avatar clinician to 140 adults’ interactions with a live (face-to-face) clinician. Results showed that most participants who interacted with the avatar reported feeling comfortable sharing information (Rizzo et al., 2016). However, participants gave higher ratings for rapport and listening skills to the live therapist, compared to the avatar (Rizzo et al., 2016). More work is needed to address barriers that may arise when interacting with avatar clinicians and to understand whether incorporation of avatar therapists into digital tools enhances engagement.

Finally, some digital tools have incorporated peer support platforms, which typically allow for anonymous, moderated interactions with other people who share symptoms or experiences with the user (Torous et al., 2018). A pioneer in the space of peer support is the no-longer-active Panoply (Morris, Schueller, & Picard, 2015). Panoply was a Web-based platform that allowed users to post negative thoughts and receive crowdsourced reappraisal suggestions from others (Morris et al., 2015). Another currently widely used platform is 7 Cups, one feature of which is peer-support chatrooms to connect with others experiencing similar symptoms. By nature, peer-support platforms may be especially useful for destigmatizing mental health issues, compared to other approaches to integrating human support into digital interventions.

Thus, a key challenge is ensuring that patients do not drop out of treatment and continue to use digital interventions, such as computer- or phone-based treatments, regularly. Brief interactions with therapists or even trained bachelor’s-level coaches have been shown to boost engagement. Perhaps even chatbots, avatar therapists, and peer-support platforms could ultimately be combined with other tools, such as computer- or smartphone-based treatments, to improve engagement. Research in this area remains at a very early stage, and significant work is needed to understand each of these approaches’ potential for enhancing engagement. Moreover, it is not clear whether these approaches will be acceptable or effective across demographic groups or clinical presentations.

A Need for Stakeholder Engagement and Interdisciplinary Collaboration

Another key barrier to engagement is the poor usability of many mental health apps (Torous et al., 2018). Usability refers to an app’s ease of use, the extent to which it meets users’ needs, how enjoyable it is to interact with, and the attractiveness of its interface (Torous et al., 2018). By and large, neither mental health researchers nor app developers working in isolation have the necessary expertise to develop a digital service for mental health that is at once easy to use, attractive, fun to use, and relevant to patients. Indeed, mental health apps are commonly described by users as “buggy” and “clunky” (Torous et al., 2018), and often lack the core features that patients desire (Torous et al., 2018). Nicholas, Fogarty, Boydell, and Christensen (2017) conducted a qualitative analysis of 2,173 user reviews of 48 apps for bipolar disorder. Results showed that existing apps often did not meet the needs of users or contain the features that patients valued. In fact, roughly a quarter of reviews contained negative comments, often related to poor usability (Nicholas et al., 2017). To address engagement issues that stem from poor usability, it is imperative that we collaborate from the start with key stakeholders. In particular, digital service development should involve collaboration between patients, clinicians, designers, and engineers, and representatives from health care systems.

Collaboration With Patients

In contrast to typically low rates of engagement, when app development involves patient stakeholders from the start, engagement can become quite high. For example, Torous and colleagues (2018) describe a suicide prevention app called iBobbly. Its developers obtained input from patients at each stage of development, from design to implementation, and adherence with the iBobbly app was a remarkable 97% (Torous et al., 2018). In our own work developing a smartphone-based CBT treatment for BDD (Perspectives), we implemented an in-depth, user-centered design protocol (Wilhelm et al., 2019) that included seeking individual input from five patient consultants who had recently undergone face-to-face CBT for BDD. This allowed us to learn about each patient’s personal experience with BDD and CBT, and what they would find most useful (or unhelpful) in a digital service. We also asked our patient consultants to test the app prototype and give feedback on each module over the course of a week, and then to test and give feedback on a functional beta version of Perspectives over 12 days (see Wilhelm et al., 2019, for a detailed description). We iteratively improved the app at each of these stages based on consultant input. Subsequently, feedback was collected from BDD patients who participated in an open pilot trial, and further changes were made based on this input. As noted above, we hypothesize that obtaining substantial patient input across development and initial testing may have contributed to strong engagement rates in our initial open pilot trial—however, more research is needed to confirm results from this preliminary study.

Multidisciplinary Partnerships Between Clinicians and Industry

Often, mental health apps are created unilaterally by industry developers, without involvement from clinical experts (Schueller, Muñoz, & Mohr, 2013). Indeed, a review of apps for anxiety and worry found that 67.3% of apps on the market were developed without having obtained input from clinicians (Sucala et al., 2017). Unfortunately, digital interventions are unlikely to be as potent, evidence based, or effective in the absence of substantial involvement from clinical experts.

Likewise, academics and clinicians are bound to fail in creating fun or attractive apps when working in isolation (Torous et al., 2018). Clinicians typically have no training in user-interface design or in gamifying content—areas where technology and industry partners bring great benefits. In particular, gamifying app content, or “using game design elements in non-game contexts” (Sardi, Idri, & Fernandez-Aleman, 2017), has garnered attention for its potential to enhance engagement by incorporating features like competition and rewards. However, a 2017 review of gamification in health-related apps found that only three studies of mental health apps had utilized gamification (Sardi et al., 2017). The low number of gamified mental health apps very likely reflects the paucity of multidisciplinary collaborations used to create these apps. Altogether, we are likely to build the most usable and engaging tools if clinical experts and industry developers adopt a true, multidisciplinary partnership approach (Schueller et al., 2013).

Involving Health Care System Stakeholders in Development

Usability issues also plague the transition from research to practice (Mohr, Lyon, Lattie, Reddy, & Schueller, 2017)—that is, even when digital interventions show strong engagement in efficacy trials, engagement almost invariably plummets upon real-world implementation (Mohr, Lyon, et al., 2017). This can likely be attributed to failing to involve key health care system stakeholders in the development process. If we want our apps to be used, we must not only involve patients, providers, and technology experts but also consider how these apps might ultimately be used and reimbursed (Powell, Bowman, & Harbin, 2019). For example, a self-help app downloaded from the app store or offered as part of an employee assistance program might need to look quite different from a treatment that will be reimbursed by a commercial or Medicaid payer. Different parts of our health care system have unique and complex regulatory restrictions, requirements, and processes. Failing to learn what different stakeholders require, or not designing digital services with their needs in mind, will yield apps that have poor usability in their real-world contexts. Armontrout, Torous, Cohen, McNiel, and Binder (2018) describe a smartphone app to help with recovery from alcohol use disorder (A-CHESS; Ford et al., 2015). Armontrout et al. (2018) note that the app had strong clinical outcomes within an efficacy trial—however, when it was deployed to real-world clinics, only 3 of 14 clinics continued using the app after 2 years, due to challenges of integrating it into their unique systems (Armontrout et al., 2018). This example highlights the importance of involving health care stakeholders early, in addition to conducting externally valid research, if we hope for our work to have a major, disseminable impact (Mohr, Lyon, et al., 2017).

Evidence and Security

In addition to making engaging digital treatments, our field has a long way to go in terms of gathering empirical support for new, technology-based treatments. At present, our enthusiasm for digital treatments is outpacing not only their research base but also our understanding of key issues related to security and data protection when developing and recommending digital services to patients.

Lack of Evidence Base

The large majority of mental health apps available for public download are neither grounded in evidence-based principles, nor do they have efficacy data. Leigh and Flatt (2015) note that 1,536 depression apps were available for download in 2013, whereas only 32 papers about depression apps had been published at that time. A 2017 review of apps for anxiety and worry likewise highlights that only 3.8% of available apps provided efficacy data, and most (63.5%) lacked information about their theoretical or treatment approach (Sucala et al., 2017). Moreover, a review of claims made in app store descriptions of 73 top-ranking mental health apps also showed that most (64%) app store descriptions included statements about the app’s effectiveness for diagnosing or improving symptoms or self-management of a mental illness (Larsen et al., 2019). However, when these descriptions referenced specific treatment approaches, only about half described methods supported in the literature. Furthermore, only two descriptions cited research about the app itself to back up effectiveness claims (Larsen et al., 2019). Taken together, despite having strong evidence-based treatments for many mental illnesses, we cannot assume that publicly available digital treatments offer those evidence-based tools.

At times, mental health apps may even be unsafe for our patients. Many apps contain misinformation about mental illness and its treatment, and this occasionally includes dangerous misinformation (Neary & Schueller, 2018). Moreover, only 23% of health apps follow up appropriately when users indicate possible suicide risk (Singh et al., 2016). Therefore, we must assume not only that unvetted, publicly available apps may lack an evidence base but also that they have the potential to harm our patients. Altogether, there is a clear need for guidance and resources that can help both health care providers and patient consumers decipher which apps offer appropriate, evidence-based tools and which do not.

Poor Understanding Surrounding Issues of Security

At present, the overwhelming majority of mental health apps are not formally regulated (Torous et al., 2018). Despite this, mental health apps frequently collect, transmit, and store users’ sensitive health information. Often, those health data are sold to third parties—likely unbeknown to the user (Foster & Torous, 2019; Torous et al., 2018; Torous & Roberts, 2017).

The fact that mental health apps are handling and sharing users’ health data underscores the importance of establishing appropriate data security and privacy policies and informing users of those policies. However, most mental health apps fail to do this. In fact, only 24% of apps for bipolar disorder (Foster & Torous, 2019; Torous et al., 2018; Torous & Roberts, 2017), 29% of apps for suicide prevention (Torous et al., 2018), 46% of apps for dementia (Rosenfeld, Torous, & Vahia, 2017), and 49% of apps for depression (O’Loughlin, Neary, Adkins, & Schueller, 2019) provide any sort of privacy policy. Apps that do provide privacy policies frequently fail to include relevant information (O’Loughlin et al., 2019; Rosenfeld et al., 2017), present privacy policies at too high a reading level (O’Loughlin et al., 2019), or provide policies only after soliciting information from the user first (O’Loughlin et al., 2019).

Resources and Guidance

Altogether, we have few official requirements for privacy and security of digital mental health services—moreover, there are no formalized standards for how or when clinicians should evaluate an app before referring a patient to use it. It is important to note that reliance on app store “star ratings” is not sufficient, as star ratings do not correlate strongly with an app’s clinical utility (Singh et al., 2016; Torous & Roberts, 2017).

Whereas we lack formal standards, excellent resources have been developed in recent years to address emerging evidence, security, and privacy issues. Stoyanov et al. (2015) developed the Mobile App Rating Scale (MARS), now the most widely used approach to app rating. MARS is based on the premise that to properly evaluate an app’s utility, the app must be assessed objectively across multiple dimensions (as noted above, apps should strive to be at once engaging, evidence based, and secure; Neary & Schueller, 2018).

PsyberGuide is a non-profit-funded project that evaluates smartphone apps and other digital mental health products. PsyberGuide provides three ratings for each app it evaluates: (a) a transparency rating that reviews the app’s privacy policy with regard to clarity of data collection and storage procedures, (b) a credibility rating, which provides information on the scientific foundation for the app’s content, and (c) a user-experience rating based on MARS, which reviews the app’s user-interface design, accessibility, and level of engagement (PsyberGuide, 2018). Some apps in PsyberGuide also have expert reviews (Neary & Schueller, 2018). To be most effective, PsyberGuide has partnered with other leading organizations, including the Association for Behavioral and Cognitive Therapies (ABCT) and the Anxiety and Depression Association of America (ADAA). Similar to PsyberGuide, the American Psychiatric Association (2018) has put together a framework for how to select and recommend apps. Neary and Schueller (2018) provide a useful summary of these resources.

Recently, privacy regulations have evolved in both the European Union (which issued the General Data Protection Regulation [GDPR]) and the United States (with evolving policies by the Food and Drug Administration [FDA] on mobile medical apps). The FDA has distinguished between three categories of health apps based on the claims of the app and the app’s level of risk. These include (a) apps that are not classified as medical devices (e.g., medical dictionaries and references), and which will therefore have no FDA oversight, (b) apps that may be classified as medical devices but which are determined to be of lower risk, over which the FDA will use enforcement discretion (e.g., an app that provides a diagnosed patient with a “skill of the day”), and (c) apps that are classified as medical devices and which, based on their potential risk to patients, the FDA intends to oversee (e.g., an app that assesses the cancer risk of a skin lesion by analyzing an image; Armontrout et al., 2018; O’Loughlin et al., 2019; U.S. Department of Health and Human Services Food and Drug Administration, 2015). The first smartphone-delivered psychotherapy to be cleared by the FDA is reSET by Pear Therapeutics, a prescription-only digital service that offers CBT for substance use disorders, in conjunction with outpatient therapy (U.S. Food and Drug Administration, 2017). At present, it remains unclear whether increasing FDA oversight of smartphone-based treatments as mobile medical devices will hinder growth and innovation, or whether it will provide useful safety and efficacy regulation.

Future Directions

Building from smartphone-based mental health treatments, cutting-edge approaches to using technology within psychology continue to emerge. The newest approaches tend to incorporate machine learning and include the use of wearable or smartphone sensors to enhance assessment and treatment, as well as the integration of multiple technology-based tools into one sophisticated assessment and treatment-delivery platform.

Sensors to Enhance Assessment and Treatment

Although frequent clinical assessment in the context of mental health care is extremely beneficial for treatment planning, monitoring deterioration, and measuring improvements across treatment, we often fail to assess our patients as frequently as we should. This is unsurprising, as assessments can be burdensome and time-consuming. Our assessment data are also limited by relying primarily on retrospective reports, collected either via clinician interview or self-report. Retrospective reports collapse information across a period of time—days, months, or longer—a process that risks collecting data that are systematically biased by recall, that overlook acute or brief changes in well-being, or that omit critical contextual factors. Unfortunately, incomplete and outdated clinical assessments will undermine optimal treatment planning. To this end, perhaps one of the most promising recent advances is the growing ability to use smartphone sensors and wearables (e.g., smart watches) to collect continuous, passive (i.e., unobtrusive, collected in the background) data from patients, with their consent. For example, social information may be deduced from phone communication logs, mobility patterns (e.g., amount of time spent at home) can be inferred from the Global Positioning System (GPS), and activity and sleep patterns may be detected using accelerometer or screen-use data. Thus, sensor-based assessment presents an opportunity for objective, clinically relevant, time-sensitive, and context-rich data collection that requires no extra effort on the part of our patients.

A review of research that used sensor data to detect depression severity found that studies used 17 unique types of sensor data to infer information about participants’ social interactions, physical activity, location, smartphone use patterns, individual characteristics (e.g., fitness, sleep), environment, and physiology (Rohani, Faurholt-Jepsen, Kessing, & Bardram, 2018). Across studies, certain features, including amount of time spent at home, location entropy, amount of time the screen was active, and duration of sleep, correlated consistently with mood (Rohani et al., 2018), underscoring the potential benefits of supplementing gold-standard clinical assessments with low-burden sensor data.
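As a concrete illustration of how raw sensor streams might be reduced to the kinds of features described above, the sketch below computes two commonly reported metrics (time spent at home and location entropy) from hypothetical, already-clustered GPS samples; the data, cluster labels, and sampling scheme are invented for illustration and are not taken from the studies cited here.

    import math
    from collections import Counter

    # Hypothetical, pre-clustered GPS samples for one day: each entry is the
    # location cluster a participant occupied during a 30-minute window.
    # In practice, "home" would itself be inferred (e.g., the most common
    # nighttime cluster) rather than labeled by hand.
    windows = ["home"] * 30 + ["work"] * 14 + ["gym"] * 2 + ["home"] * 2

    def time_at_home_fraction(windows, home_label="home"):
        """Fraction of sampled windows spent at the home cluster."""
        return sum(w == home_label for w in windows) / len(windows)

    def location_entropy(windows):
        """Shannon entropy over visited clusters; higher values indicate more varied movement."""
        counts = Counter(windows)
        total = len(windows)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(f"Time at home: {time_at_home_fraction(windows):.0%}")
    print(f"Location entropy: {location_entropy(windows):.2f} bits")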

Sensor-based assessments also have the potential to inform intervention selection (see the sketch after this paragraph). For example, if sensors show that an individual recently stopped leaving the home and started sleeping excessively, a linked intervention app might suggest a depression prevention or treatment strategy (e.g., behavioral activation strategies, such as visiting a friend or going to the gym). Or, if a patient had previously improved in therapy and, a few weeks after the end of treatment, sensors indicated a consistent decrease in mobility, an intervention app might suggest relevant relapse prevention strategies. Thus, sensor-based assessments hold great promise to help us optimize treatment, as they might ultimately be able to trigger a personalized treatment strategy at the time when it is most beneficial for an individual. However, despite the clear potential for sensor-based assessment and associated intervention strategies, it is important to note that we are in the very early stages of this research. In particular, research on sensor-based assessment suffers from issues of methodological inconsistency (Mohr, Zhang, & Schueller, 2017). For example, across 46 depression studies using passive sensor data, mood was assessed using 19 different methods (Rohani et al., 2018). Moreover, our field is still adapting to statistical methodologies that can appropriately accommodate big data with intensive longitudinal properties (Barnett, Torous, Staples, Keshavan, & Onnela, 2018). A review of sensor-based assessment research showed that roughly half of studies used inappropriate analytic methods (Saeb, Lonini, Jayaraman, Mohr, & Kording, 2017).
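A minimal sketch of what such a sensor-triggered, just-in-time rule could look like is shown below; the feature names, thresholds, and suggested prompts are hypothetical placeholders rather than validated clinical cut-offs, and a deployed system would require empirical calibration and clinician oversight.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class WeeklySensorSummary:
        # Hypothetical weekly aggregates derived from passive sensing.
        time_at_home_fraction: float   # 0-1, inferred from GPS
        sleep_hours_per_night: float   # inferred from accelerometer/screen-off data
        places_visited: int            # distinct location clusters visited

    def suggest_prompt(summary: WeeklySensorSummary) -> Optional[str]:
        """Toy rule: flag possible withdrawal or oversleeping and suggest a CBT strategy."""
        if summary.time_at_home_fraction > 0.9 and summary.sleep_hours_per_night > 10:
            return "Behavioral activation: schedule one pleasant activity outside the home."
        if summary.places_visited <= 1:
            return "Relapse prevention: review your plan and consider contacting your clinician."
        return None  # no prompt this week

    print(suggest_prompt(WeeklySensorSummary(0.95, 11.0, 1)))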

There are also issues that extend beyond our research methods, to the technology itself. First, both our phones and our individual phone usage patterns vary substantially. Differences exist between the two primary operating systems (Android and iOS), as well as between operating system versions and phone manufacturers (Mohr, Zhang, et al., 2017). On top of this, individuals use phones differently from one another. For example, women may be more likely to carry their phone in a purse, whereas men may keep phones in their pockets. Any statistical algorithms we develop that rely on unstable variables may cease to work when those key variables naturally shift over time or across populations (Mohr, Zhang, et al., 2017).

Another hurdle we face related to implementing sensor-based assessment is clinician skepticism (Bourla et al., 2018). In a survey that assessed 515 psychiatrists’ attitudes about new technologies in the field, the majority rated wristband-based sensor data as having only moderate (46.8%) to low (34.9%) acceptability in terms of risk, and perceived this technology to have moderate (58.4%) to low (26.3%) acceptability in terms of utility (Bourla et al., 2018). Thus, as sensor-based assessment becomes more reliable and useful over time, clinician education will be critical in the dissemination and adoption process.

Taken together, there is much reason for enthusiasm in the space of sensor-based assessment and its potential for precision treatment in mental health care. However, as with the other cutting-edge technologies reviewed here, our field’s enthusiasm is outpacing our expertise; while we can remain optimistic, we first need to build a solid research foundation prior to widespread clinical deployment.

Comprehensive Mental Health Platform: Putting It All Together

Ultimately, each of the individual technology tools described here will be most powerful if integrated with other relevant health care information. Assessment data can be pooled across traditional clinician and self-report evaluations, one’s electronic medical record, passive sensor data, and the “digital exhaust” patients leave on social media. Of course, we should seek patients’ consent prior to collecting and using these data, and their privacy would have to be protected. Pooled in this way, these sources would provide rich, real-time information about a patient.

Using machine learning, we can distill these big data to suggest personalized treatment packages to patients. Ideally, an assessment algorithm could inform both the optimal type of treatment and the level of care needed. In certain cases, an optimized treatment could then be delivered to the individual immediately via technology, such as through a smartphone-based treatment that incorporates a personally tailored level of support (e.g., support via an in-app coach or a peer network). Ongoing sensor-based assessment could enhance treatment by triggering just-in-time interventions and alerting a clinician in cases of possible clinical deterioration. Of course, this individualized, stepped-care approach also implies that technology-based interventions will not be appropriate for everyone; there will always be individuals for whom face-to-face interventions are the most acceptable or appropriate option.
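As one illustration of the kind of model that could sit behind such a platform, the sketch below trains a simple multinomial classifier to map pooled assessment features onto a recommended level of care; the features, labels, and synthetic data are invented for illustration and do not correspond to any validated triage algorithm or to the studies cited here.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Synthetic training data: rows are patients, columns are pooled features
    # (e.g., self-reported severity, time-at-home fraction, prior treatment response).
    rng = np.random.default_rng(0)
    X = rng.random((200, 3))
    # Hypothetical labels: 0 = self-guided app, 1 = app plus coach, 2 = face-to-face care.
    y = np.digitize(X @ np.array([2.0, 1.0, 1.5]), bins=[1.5, 2.8])

    model = LogisticRegression(max_iter=1000).fit(X, y)

    new_patient = np.array([[0.8, 0.9, 0.7]])    # e.g., high severity, mostly homebound
    print(model.predict(new_patient))            # suggested level of care (0, 1, or 2)
    print(model.predict_proba(new_patient).round(2))

In any real system, such a recommendation would be only one input to a stepped-care decision made jointly with the patient and clinician, not an automated assignment.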

This comprehensive assessment and treatment platform is still a vision for the future, where each of its component parts remains at a very early development stage. One of the closest current models of a comprehensive technology-based mental health platform is the Intellicare suite (Mohr, Tomasino, et al., 2017), which involves 13 individual apps that provide single CBT skills or other small interventions (CBITS Intellicare, n.d.). Intellicare also has a “hub” app that, when downloaded, suggests specific Intellicare skills apps to the user. In the future, the Intellicare team seeks to develop an algorithm for the hub app by drawing from multimodal assessments (e.g., usage data, user ratings, self-report assessments), to tailor and optimize its intervention suggestions (Lattie et al., 2016).

In summary, the mental health field currently faces major challenges. We urgently need scalable interventions to address the global mental health crisis, and in particular, we need clinical services that can reach individuals with major treatment access barriers (e.g., rural communities). Moreover, we have to develop treatment tools that not only reduce symptom severity and enhance functioning and quality of life but are also acceptable to patients and clinicians. Technology-based tools such as smartphone apps might provide powerful solutions for assessment, prevention, and treatment. They might ultimately be able to match patients with a high-quality, precisely timed personalized intervention. However, currently there are thousands of apps to choose from and we need to educate our patients and ourselves, to ensure we have the knowledge to select services that are firmly grounded in science, engaging to use, and guided by ethical principles that protect patients’ rights and privacy. As technology-based clinical services can be automated, they might not only enhance the quality and scalability of health care but could also potentially reduce its cost. Thus, while the mental health challenges we are facing are significant, if we take advantage of recent developments in technology, we might be able to fundamentally change health care.

Footnotes

This article is based on the presidential address delivered by Sabine Wilhelm at ABCT’s 52nd Annual Convention, November 17, 2018.

Declaration of Competing Interest

Drs. Wilhelm and Weingarden have received salary support from Telefonica Alpha, Inc. Dr. Wilhelm is a presenter for the Massachusetts General Hospital Psychiatry Academy in educational programs supported through independent medical education grants from pharmaceutical companies. Dr. Wilhelm has received royalties from Elsevier Publications, Guilford Publications, New Harbinger Publications, and Oxford University Press. Dr. Wilhelm has also received speaking honoraria from various academic institutions and foundations, including the International Obsessive Compulsive Disorder Foundation and the Tourette Association of America. In addition, she received payment from the Association for Behavioral and Cognitive Therapies for her role as Associate Editor for the journal Behavior Therapy, as well as from John Wiley & Sons, Inc. for her role as Associate Editor for the journal Depression & Anxiety. Mr. Jacobson is the owner of a free application published on the Google Play Store entitled “Mood Triggers”. He does not receive any direct or indirect revenue from his ownership of the application (i.e., the application is free, there are no advertisements, and the data are used only for research purposes).

References

1. American Psychiatric Association. (2018). App evaluation model [Position statement]. Retrieved from https://www.psychiatry.org/psychiatrists/practice/mental-health-apps/app-evaluation-model
2. Andersson G (2009). Using the internet to provide cognitive behaviour therapy. Behaviour Research and Therapy, 47, 175–180. https://doi.org/10.1016/j.brat.2009.01.010
3. Andersson G, Carlbring P, Berger T, Almlöv J, & Cuijpers P (2009). What makes internet therapy work? Cognitive Behaviour Therapy, 38, 55–60. https://doi.org/10.1080/16506070902916400
4. Andersson G, Carlbring P, & Lindefors N (2016). History and current status of ICBT. In Lindefors N, & Andersson G (Eds.), Guided internet-based treatments in psychiatry (pp. 1–16). Cham, Switzerland: Springer International.
5. Andrews G, Basu A, Cuijpers P, Craske MG, McEvoy P, English CL, & Newby JM (2018). Computer therapy for the anxiety and depression disorders is effective, acceptable and practical health care: An updated meta-analysis. Journal of Anxiety Disorders, 55, 70–78. https://doi.org/10.1016/j.janxdis.2018.01.001
6. Armontrout JA, Torous J, Cohen M, McNiel DE, & Binder R (2018). Current regulation of mobile mental health applications. Journal of the American Academy of Psychiatry and the Law, 46, 204–211. https://doi.org/10.29158/jaapl.003748-18
7. Bakker D, Kazantzis N, Rickwood D, & Rickard N (2018). Development and pilot evaluation of smartphone-delivered cognitive behavior therapy strategies for mood- and anxiety-related problems: MoodMission. Cognitive and Behavioral Practice, 25, 496–514. https://doi.org/10.1016/j.cbpra.2018.07.002
8. Barnett I, Torous J, Staples P, Keshavan M, & Onnela JP (2018). Beyond smartphones and sensors: Choosing appropriate statistical methods for the analysis of longitudinal data. Journal of the American Medical Informatics Association, 25, 1669–1674. https://doi.org/10.1093/jamia/ocy121
9. Batra S, Baker RA, Wang T, Forma F, DiBiasi F, & Peters-Strickland T (2017). Digital health technology for use in patients with serious mental illness: A systematic review of the literature. Medical Devices (Auckland, N. Z.), 10, 237–251. https://doi.org/10.2147/MDER.S144158
10. Batterham PJ, Sunderland M, Calear AL, Davey CG, Christensen H, Teesson M, & Krouskos D (2015). Developing a roadmap for the translation of e-mental health services for depression. Australian and New Zealand Journal of Psychiatry, 49, 776–784. https://doi.org/10.1177/0004867415582054
11. Bergström J, Andersson G, Ljótsson B, Rück C, Andréewitch S, Karlsson A, & Lindefors N (2010). Internet- versus group-administered cognitive behaviour therapy for panic disorder in a psychiatric setting: A randomised trial. BMC Psychiatry, 10, 54. https://doi.org/10.1186/1471-244X-10-54
12. Berry N, Lobban F, Emsley R, & Bucci S (2016). Acceptability of interventions delivered online and through mobile phones for people who experience severe mental health problems: A systematic review. Journal of Medical Internet Research, 18, e121. https://doi.org/10.2196/jmir.5250
13. Bourla A, Ferreri F, Ogorzelec L, Peretti CS, Guinchard C, & Mouchabac S (2018). Psychiatrists’ attitudes toward disruptive new technologies: Mixed-methods study. JMIR Mental Health, 5, e10240. https://doi.org/10.2196/10240
14. Bry LJ, Chou T, Miguel E, & Comer JS (2018). Consumer smartphone apps marketed for child and adolescent anxiety: A systematic review and content analysis. Behavior Therapy, 49, 249–261. https://doi.org/10.1016/j.beth.2017.07.008
15. Carlbring P, Andersson G, Cuijpers P, Riper H, & Hedman-Lagerlöf E (2018). Internet-based vs. face-to-face cognitive behavior therapy for psychiatric and somatic disorders: An updated systematic review and meta-analysis. Cognitive Behaviour Therapy, 47, 1–18. https://doi.org/10.1080/16506073.2017.1401115
16. Carlbring P, Ekselius L, & Andersson G (2003). Treatment of panic disorder via the internet: A randomized trial of CBT vs. applied relaxation. Journal of Behavior Therapy and Experimental Psychiatry, 34, 129–140. https://doi.org/10.1016/S0005-7916(03)00026-0
17. CBITS Intellicare. (n.d.). Intellicare. Retrieved from http://cbits.northwestern.edu/portfolio/intellicare/
18. Crow SJ, Agras WS, Halmi KA, Fairburn CG, Mitchell JE, & Nyman JA (2013). A cost effectiveness analysis of stepped care treatment for bulimia nervosa. International Journal of Eating Disorders, 46, 302–307. https://doi.org/10.1002/eat.22087
19. Cullen B, Samuels JF, Pinto A, Fyer AJ, McCracken JT, Rauch SL, & Nestadt G (2008). Demographic and clinical characteristics associated with treatment status in family members with obsessive-compulsive disorder. Depression and Anxiety, 25, 218–224. https://doi.org/10.1002/da.20293
20. Daley M, Morin CM, LeBlanc M, Gregoire JP, & Savard J (2009). The economic burden of insomnia: Direct and indirect costs for individuals with insomnia syndrome, insomnia symptoms, and good sleepers. Sleep, 32, 55–64. https://doi.org/10.5665/sleep/32.1.55
21. Demyttenaere K, Bruffaerts R, Posada-Villa J, Gasquet I, Kovess V, Lepine JP, & Kikkawa T (2004). Prevalence, severity, and unmet need for treatment of mental disorders in the World Health Organization World Mental Health Surveys. JAMA, 291, 2581–2590. https://doi.org/10.1001/jama.291.21.2581
22. DiMino J, & Blau G (2012). The relationship between wait time after triage and show rate for intake in a nonurgent student population. Journal of College Student Psychotherapy, 26, 241–247. https://doi.org/10.1080/87568225.2012.685857
23. Enander J, Andersson E, Mataix-Cols D, Lichtenstein L, Alström K, Andersson G, & Rück C (2016). Therapist guided internet based cognitive behavioural therapy for body dysmorphic disorder: Single blind randomised controlled trial. BMJ, 352, i241. https://doi.org/10.1136/bmj.i241
24. Firth J, & Torous J (2015). Smartphone apps for schizophrenia: A systematic review. JMIR mHealth and uHealth, 3, e102. https://doi.org/10.2196/mhealth.4930
25. Firth J, Torous J, Nicholas J, Carney R, Pratap A, Rosenbaum S, & Sarris J (2017). The efficacy of smartphone-based mental health interventions for depressive symptoms: A meta-analysis of randomized controlled trials. World Psychiatry, 16, 287–298. https://doi.org/10.1002/wps.20472
26. Firth J, Torous J, Nicholas J, Carney R, Rosenbaum S, & Sarris J (2017). Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. Journal of Affective Disorders, 218, 15–22. https://doi.org/10.1016/j.jad.2017.04.04
27. Fitzpatrick KK, Darcy A, & Vierhile M (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4, e19. https://doi.org/10.2196/mental.7785
28. Folkins C, Hersch P, & Dahlen D (1980). Waiting time and no-show rate in a community mental health center. American Journal of Community Psychology, 8, 121–123. https://doi.org/10.1007/bf00892287
29. Ford JH 2nd, Alagoz E, Dinauer S, Johnson KA, Pe-Romashko K, & Gustafson DH (2015). Successful organizational strategies to sustain use of A-CHESS: A mobile intervention for individuals with alcohol use disorders. Journal of Medical Internet Research, 17, e201. https://doi.org/10.2196/jmir.3965
30. Foster KR, & Torous J (2019). The opportunity and obstacles for smartwatches and wearable sensors. IEEE Pulse, 10, 22–25. https://doi.org/10.1109/mpuls.2018.2885832
31. Friedrich MJ (2017). Depression is the leading cause of disability around the world. JAMA, 317(15), 1517. https://doi.org/10.1001/jama.2017.3826
32. Garnett C, Perski O, Tombor I, West R, Michie S, & Brown J (2018). Predictors of engagement, response to follow up, and extent of alcohol reduction in users of a smartphone app (Drink Less): Secondary analysis of a factorial randomized controlled trial. JMIR mHealth and uHealth, 6, e11175. https://doi.org/10.2196/11175
33. Gindidis S, Stewart S, & Roodenburg J (2018). A systematic scoping review of adolescent mental health treatment using mobile apps. Advances in Mental Health, 1. https://doi.org/10.1080/18387357.2018.1523680
34. Global Wireless Solutions. (2017, November 17). First ever test of mobile networks on US highways—which operator was in the fast lane? Retrieved from https://news.gwsolutions.com/2017/11/27/first-ever-test-of-mobile-networks-on-us-highways-which-operator-was-in-the-fast-lane/
35. Glozier N, Christensen H, Naismith S, Cockayne N, Donkin L, Neal B, & Hickie I (2013). Internet-delivered cognitive behavioural therapy for adults with mild to moderate depression and high cardiovascular disease risks: A randomised attention-controlled trial. PLoS ONE, 8, e59139. https://doi.org/10.1371/journal.pone.0059139
36. Green AC, Hunt C, & Stain HJ (2012). The delay between symptom onset and seeking professional treatment for anxiety and depressive disorders in a rural Australian sample. Social Psychiatry and Psychiatric Epidemiology, 47, 1475–1487. https://doi.org/10.1007/s00127-011-0453-x
37. Griffiths KM, & Christensen H (2007). Internet-based mental health programs: A powerful tool in the rural medical kit. Australian Journal of Rural Health, 15, 81–87. https://doi.org/10.1111/j.1440-1584.2007.00859.
38. Harvey AG, & Gumport NB (2015). Evidence-based psychological treatments for mental disorders: Modifiable barriers to access and possible solutions. Behaviour Research and Therapy, 68, 1–12. https://doi.org/10.1016/j.brat.2015.02.004
39. Hepworth N, & Paxton SJ (2007). Pathways to help-seeking in bulimia nervosa and binge eating problems: A concept mapping approach. International Journal of Eating Disorders, 40, 493–504. https://doi.org/10.1002/eat.20402
  40. Hicks C, & Hickman G (1994). The impact of waiting-list times on client attendance for relationship counselling. British Journal of Guidance and Counselling, 22, 175–182. 10.1080/03069889408260312 [DOI] [Google Scholar]
  41. Hilbert M, & López P (2011). The world’s technological capacity to store, communicate, and compute information. Science, 332, 60–65. 10.1126/science.1200970 [DOI] [PubMed] [Google Scholar]
  42. Hinshaw SP, & Stier A (2008). Stigma as related to mental disorders. Annual Review of Clinical Psychology, 4, 367–393. 10.1146/annurev.clinpsy.4.022007.141245 [DOI] [PubMed] [Google Scholar]
  43. Ho KP, Hunt C, & Li S (2008). Patterns of help-seeking behavior for anxiety disorders among the Chinese speaking Australian community. Social Psychiatry and Psychiatric Epidemiology, 43, 872–877. 10.1007/s00127-008-0387-0 [DOI] [PubMed] [Google Scholar]
  44. Hollis C, Falconer CJ, Martin JL, Whittington C, Stockton S, Glazebrook C, & Davies EB (2017). Annual research review: Digital health interventions for children and young people with mental health problems—a systematic and meta-review. Journal of Child Psychology and Psychiatry, 58, 474–503. 10.1111/jcpp.12663 [DOI] [PubMed] [Google Scholar]
  45. International Telecommunications Union. (2018). Time Series of the International Telecommunications Union Data for the World. Retrieved from, https://www.itu.int/en/ITU-D/Statistics/Pages/stat/default.aspx.
  46. Johnson EM, & Coles ME (2013). Failure and delay in treatment-seeking across anxiety disorders. Community Mental Health Journal, 49, 668–674. 10.1007/s10597-012-9543-9 [DOI] [PubMed] [Google Scholar]
  47. Jung H, von Sternberg K, & Davis K (2017). The impact of mental health literacy, stigma, and social support on attitudes toward mental health help-seeking. International Journal of Mental Health Promotion, 19, 252–267. 10.1080/14623730.2017.1345687 [DOI] [Google Scholar]
  48. Kaldo V, Jernelöv S, Blom K, Ljótsson B, Brodin M, Jörgensen M, & Lindefors N (2015). Guided internet cognitive behavioral therapy for insomnia compared to a control treatment—a randomized trial. Behaviour Research and Therapy, 71, 90–100. 10.1016/j.brat.2015.06.001 [DOI] [PubMed] [Google Scholar]
  49. Kasckow J, Felmet K, Appelt C, Thompson R, Rotondi A, & Haas G (2014). Telepsychiatry in the assessment and treatment of schizophrenia. Clinical Schizophrenia and Related Psychoses, 8, 21–27a. 10.3371/csrp.kafe.021513 [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Kass AE, Balantekin KN, Fitzsimmons-Craft EE, Jacobi C, Wilfley DE, & Taylor CB (2017). The economic case for digital interventions for eating disorders among United States college students. International Journal of Eating Disorders, 50, 250–258. 10.1002/eat.22680 [DOI] [PMC free article] [PubMed] [Google Scholar]
  51. Kessler RC, Chiu WT, Demler O, Merikangas KR, & Walters EE (2005). Prevalence, severity, and comorbidity of 12-month DSM-IV disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry, 62, 617–627. 10.1001/archpsyc.62.6.617 [DOI] [PMC free article] [PubMed] [Google Scholar]
  52. Kremer TG, & Gesten EL (2003). Managed mental health care: The client’s perspective. Professional Psychology: Research and Practice, 34, 187–196. 10.1037/0735-7028.34.2.187 [DOI] [Google Scholar]
  53. Larsen ME, Huckvale K, Nicholas J, Torous J, Birrell L, Li E, & Reda B (2019). Using science to sell apps: Evaluation of mental health app store quality claims. npj Digital Medicine, 2, 18. 10.1038/s41746-019-0093-1
  54. Lattie EG, Schueller SM, Sargent E, Stiles-Shields C, Tomasino KN, Corden ME, & Mohr DC (2016). Uptake and usage of IntelliCare: A publicly available suite of mental health and well-being apps. Internet Interventions, 4, 152–158. 10.1016/j.invent.2016.06.003
  55. Leigh S, & Flatt S (2015). App-based psychological interventions: Friend or foe? Evidence-Based Mental Health, 18, 97–99. 10.1136/eb-2015-102203
  56. Lin L, Stamm K, & Christidis P (2016). 2015 county-level analysis of U.S. licensed psychologists and health indicators. APA Center for Workforce Studies. Retrieved from http://www.apa.org/workforce/publications/15-county-analysis/index.aspx.
  57. Lindgreen P, Clausen L, & Lomborg K (2018). Clinicians’ perspective on an app for patient self-monitoring in eating disorder treatment. International Journal of Eating Disorders, 51, 314–321. 10.1002/eat.22833
  58. Link BG, & Phelan JC (2006). Stigma and its public health implications. The Lancet, 367, 528–529. 10.1016/s0140-6736(06)68184-1
  59. Marques L, LeBlanc NJ, Weingarden HM, Timpano KR, Jenike M, & Wilhelm S (2010). Barriers to treatment and service utilization in an internet sample of individuals with obsessive-compulsive symptoms. Depression and Anxiety, 27, 470–475. 10.1002/da.20694
  60. Mehdizadeh H, Asadi F, Mehrvar A, Nazemi E, & Emami H (2019). Smartphone apps to help children and adolescents with cancer and their families: A scoping review. Acta Oncologica. 10.1080/0284186X.2019.1588474
  61. Mohr DC, Cuijpers P, & Lehman K (2011). Supportive accountability: A model for providing human support to enhance adherence to eHealth interventions. Journal of Medical Internet Research, 13, e30. 10.2196/jmir.1602
  62. Mohr DC, Lyon AR, Lattie EG, Reddy M, & Schueller SM (2017). Accelerating digital mental health research from early design and creation to successful implementation and sustainment. Journal of Medical Internet Research, 19, e153. 10.2196/jmir.7725
  63. Mohr DC, Tomasino KN, Lattie EG, Palac HL, Kwasny MJ, Weingardt K, & Schueller SM (2017). IntelliCare: An eclectic, skills-based app suite for the treatment of depression and anxiety. Journal of Medical Internet Research, 19, e10. 10.2196/jmir.6645
  64. Mohr DC, Zhang M, & Schueller SM (2017). Personal sensing: Understanding mental health using ubiquitous sensors and machine learning. Annual Review of Clinical Psychology, 13, 23–47. 10.1146/annurev-clinpsy-032816-044949
  65. Morris RR, Schueller SM, & Picard RW (2015). Efficacy of a web-based, crowdsourced peer-to-peer cognitive reappraisal platform for depression: Randomized controlled trial. Journal of Medical Internet Research, 17, e72. 10.2196/jmir.4167
  66. Moskowitz DS, & Young SN (2006). Ecological momentary assessment: What it is and why it is a method of the future in clinical psychopharmacology. Journal of Psychiatry and Neuroscience, 31, 13–20.
  67. National Institute of Mental Health. (2019). Prevalence of any mental illness (AMI). Retrieved from http://www.nimh.nih.gov/health/statistics/prevalence/any-mental-illness-ami-among-adults.shtml.
  68. Neary M, & Schueller SM (2018). State of the field of mental health apps. Cognitive and Behavioral Practice, 25, 531–537. 10.1016/j.cbpra.2018.01.002
  69. Newman MG, Jacobson NC, Zainal NH, Shin KE, Szkodny LE, & Sliwinski MJ (2019). The effects of worry in daily life: An ecological momentary assessment study supporting the tenets of the contrast avoidance model. Clinical Psychological Science, 7, 794–810. 10.1177/2167702619827019
  70. Newman MG, Szkodny LE, Llera SJ, & Przeworski A (2011). A review of technology-assisted self-help and minimal contact therapies for anxiety and depression: Is human contact necessary for therapeutic efficacy? Clinical Psychology Review, 31, 89–103. 10.1016/j.cpr.2010.09.008
  71. Nicholas J, Fogarty AS, Boydell K, & Christensen H (2017). The reviews are in: A qualitative content analysis of consumer perspectives on apps for bipolar disorder. Journal of Medical Internet Research, 19, e105. 10.2196/jmir.7273
  72. Nicholas J, Larsen ME, Proudfoot J, & Christensen H (2015). Mobile apps for bipolar disorder: A systematic review of features and content quality. Journal of Medical Internet Research, 17, e198. 10.2196/jmir.4581
  73. O’Loughlin K, Neary M, Adkins EC, & Schueller SM (2019). Reviewing the data security and privacy policies of mobile apps for depression. Internet Interventions, 15, 110–115. 10.1016/j.invent.2018.12.001
  74. Otto MW, Pollack MH, & Maki KM (2000). Empirically supported treatments for panic disorder: Costs, benefits, and stepped care. Journal of Consulting and Clinical Psychology, 68, 556–563. 10.1037/0022-006X.68.4.556
  75. Owen JE, Jaworski BK, Kuhn E, Makin-Byrd KN, Ramsey KM, & Hoffman JE (2015). mHealth in the wild: Using novel data to examine the reach, use, and impact of PTSD Coach. JMIR Mental Health, 2(1), e7. 10.2196/mental.3935
  76. Pew Research Center. (2019). Mobile fact sheet. Retrieved from http://www.pewinternet.org/fact-sheet/mobile/.
  77. Pinto A, Mancebo MC, Eisen JL, Pagano ME, & Rasmussen SA (2006). The Brown Longitudinal Obsessive Compulsive Study: Clinical features and symptoms of the sample at intake. Journal of Clinical Psychiatry, 67, 703–711.
  78. Polanczyk GV, Salum GA, Sugaya LS, Caye A, & Rohde LA (2015). Annual research review: A meta-analysis of the worldwide prevalence of mental disorders in children and adolescents. Journal of Child Psychology and Psychiatry, 56, 345–365. 10.1111/jcpp.12381
  79. Powell AC, Bowman MB, & Harbin HT (2019). Reimbursement of apps for mental health: Findings from interviews. JMIR Mental Health, 6(8), e14724. 10.2196/14724
  80. PsyberGuide. (2018). Retrieved from https://psyberguide.org.
  81. Rehm IC, Foenander E, Wallace K, Abbott JM, Kyrios M, & Thomas N (2016). What role can avatars play in e-mental health interventions? Exploring new models of client-therapist interaction. Frontiers in Psychiatry, 7, 186. 10.3389/fpsyt.2016.00186
  82. Rizzo A, Shilling R, Forbell E, Scherer S, Gratch J, & Morency L-P (2016). Autonomous virtual human agents for healthcare information support and clinical interviewing. In Luxton DD (Ed.), Artificial intelligence in behavioral and mental health care (pp. 53–79). San Diego, CA: Academic Press.
  83. Roehrig C (2016). Mental disorders top the list of the most costly conditions in the United States: $201 billion. Health Affairs, 35, 1130–1135. 10.1377/hlthaff.2015.1659
  84. Rohani DA, Faurholt-Jepsen M, Kessing LV, & Bardram JE (2018). Correlations between objective behavioral features collected from mobile and wearable devices and depressive mood symptoms in patients with affective disorders: Systematic review. JMIR mHealth and uHealth, 6, e165. 10.2196/mhealth.9691
  85. Rosenfeld L, Torous J, & Vahia IV (2017). Data security and privacy in apps for dementia: An analysis of existing privacy policies. American Journal of Geriatric Psychiatry, 25, 873–877. 10.1016/j.jagp.2017.04.009
  86. Saeb S, Lonini L, Jayaraman A, Mohr DC, & Kording KP (2017). The need to approximate the use-case in clinical machine learning. Gigascience, 6, 1–9. 10.1093/gigascience/gix019
  87. Sardi L, Idri A, & Fernandez-Aleman JL (2017). A systematic review of gamification in e-Health. Journal of Biomedical Informatics, 71, 31–48. 10.1016/j.jbi.2017.05.011
  88. Schembre SM, Liao Y, O’Connor SG, Hingle MD, Shen S-E, Hamoy KG, & Boushey CJ (2018). Mobile ecological momentary diet assessment methods for behavioral research: Systematic review. JMIR mHealth and uHealth, 6, e11170. 10.2196/11170
  89. Schreiber V, Maercker A, & Renneberg B (2010). Social influences on mental health help-seeking after interpersonal traumatization: A qualitative analysis. BMC Public Health, 10, 634. 10.1186/1471-2458-10-634
  90. Schueller SM, Muñoz RF, & Mohr DC (2013). Realizing the potential of behavioral intervention technologies. Current Directions in Psychological Science, 22, 478–483. 10.1177/0963721413495872
  91. Singh K, Drouin K, Newmark LP, Lee J, Faxvaag A, Rozenblum R, & Bates DW (2016). Many mobile health apps target high-need, high-cost populations, but gaps remain. Health Affairs, 35, 2310–2318. 10.1377/hlthaff.2016.0578
  92. Steel Z, Marnane C, Iranpour C, Chey T, Jackson JW, Patel V, & Silove D (2014). The global prevalence of common mental disorders: A systematic review and meta-analysis 1980–2013. International Journal of Epidemiology, 43, 476–493. 10.1093/ije/dyu038
  93. Stengler K, Olbrich S, Heider D, Dietrich S, Riedel-Heller S, & Jahn I (2013). Mental health treatment seeking among patients with OCD: Impact of age of onset. Social Psychiatry and Psychiatric Epidemiology, 48, 813–819. 10.1007/s00127-012-0544-3
  94. Stoyanov SR, Hides L, Kavanagh DJ, Zelenko O, Tjondronegoro D, & Mani M (2015). Mobile App Rating Scale: A new tool for assessing the quality of health mobile apps. JMIR mHealth and uHealth, 3, e27. 10.2196/mhealth.3422
  95. Substance Abuse and Mental Health Services Administration. (2018). Key substance use and mental health indicators in the United States: Results from the 2017 National Survey on Drug Use and Health (HHS Publication No. SMA 18–5068, NSDUH Series H-53). Retrieved from https://www.samhsa.gov/data/sites/default/files/cbhsq-reports/NSDUHFFR2017/NSDUHFFR2017.pdf.
  96. Sucala M, Cuijpers P, Muench F, Cardos R, Soflau R, Dobrean A, & David D (2017). Anxiety: There is an app for that. A systematic review of anxiety apps. Depression and Anxiety, 34, 518–525. 10.1002/da.22654
  97. Thompson A, Issakidis C, & Hunt C (2012). Delay to seek treatment for anxiety and mood disorders in an Australian clinical sample. Behaviour Change, 25, 71–84. 10.1375/bech.25.2.71
  98. Torous J, Nicholas J, Larsen ME, Firth J, & Christensen H (2018). Clinical review of user engagement with mental health smartphone apps: Evidence, theory and improvements. Evidence-Based Mental Health, 21, 116–119. 10.1136/eb-2018-102891
  99. Torous J, & Roberts LW (2017). Needed innovation in digital health and smartphone applications for mental health: Transparency and trust. JAMA Psychiatry, 74, 437–438. 10.1001/jamapsychiatry.2017.0262
  100. Trusler K, Doherty C, Mullin T, Grant S, & McBride J (2006). Waiting times for primary care psychological therapy and counselling services. Counselling and Psychotherapy Research, 6, 23–32. 10.1080/14733140600581358
  101. Twenge JM, & Campbell WK (2018). Associations between screen time and lower psychological well-being among children and adolescents: Evidence from a population-based study. Preventive Medicine Reports, 12, 271–283. 10.1016/j.pmedr.2018.10.003
  102. U.S. Department of Health and Human Services Food and Drug Administration. (2015). Mobile medical applications: Guidance for industry and Food and Drug Administration staff. Rockville, MD: Author. Retrieved from https://www.fda.gov/media/80958/download.
  103. U.S. Food and Drug Administration. (2017, September 14). reSET de novo DEN160018 approval. Retrieved from https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfpmn/denovo.cfm?ID=DEN160018.
  104. Ventura J, & Chung J (2019). The Lighten Your Life Program: An educational support group intervention that used a mobile app for managing depressive symptoms and chronic pain. Journal of Psychosocial Nursing and Mental Health Services. 10.3928/02793695-20190221-01
  105. Wang PS, Berglund P, Olfson M, Pincus HA, Wells KB, & Kessler RC (2005). Failure and delay in initial treatment contact after first onset of mental disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry, 62, 603–613. 10.1001/archpsyc.62.6.603
  106. Wang PS, Lane M, Olfson M, Pincus HA, Wells KB, & Kessler RC (2005). Twelve-month use of mental health services in the United States: Results from the National Comorbidity Survey Replication. Archives of General Psychiatry, 62, 629–640. 10.1001/archpsyc.62.6.629
  107. Weisz JR, Kuppens S, Eckshtain D, Ugueto AM, Hawley KM, & Jensen-Doss A (2013). Performance of evidence-based youth psychotherapies compared with usual clinical care: A multilevel meta-analysis. JAMA Psychiatry, 70, 750–761. 10.1001/jamapsychiatry.2013.1176
  108. West DM (2015). Digital divide: Improving internet access in the developing world through affordable services and diverse content. Retrieved from https://www.brookings.edu/wp-content/uploads/2016/06/West_Internet-Access.pdf.
  109. Whiteford HA, Degenhardt L, Rehm J, Baxter AJ, Ferrari AJ, Erskine HE, & Vos T (2013). Global burden of disease attributable to mental and substance use disorders: Findings from the Global Burden of Disease Study 2010. The Lancet, 382, 1575–1586. 10.1016/S0140-6736(13)61611-6
  110. Wilhelm S, Weingarden H, Greenberg J, McCoy TH, Ladis I, Summers BJ, & Harrison O (2019). Development and pilot testing of a cognitive behavioral therapy digital service for body dysmorphic disorder. Behavior Therapy. 10.1016/j.beth.2019.03.007
  111. Williams ME, Latta J, & Conversano P (2008). Eliminating the wait for mental health services. Journal of Behavioral Health Services and Research, 35, 107–114. 10.1007/s11414-007-9091-1
  112. World Health Organization. (2015). Mental health atlas 2014. Geneva, Switzerland: Author. Retrieved from http://apps.who.int/iris/bitstream/handle/10665/178879/9789241565011_eng.pdf;jsessionid=1436E3F816F93FC5FC8E8FBF6074AA9E?sequence=1.
  113. Zima BT, Hurlburt MS, Knapp P, Ladd H, Tang L, Duan N, & Wells KB (2005). Quality of publicly-funded outpatient specialty mental health care for common childhood psychiatric disorders in California. Journal of the American Academy of Child and Adolescent Psychiatry, 44, 130–144. 10.1097/00004583-200502000-00005
