Abstract
Background:
Understanding how individuals utilize and perceive digital mental health interventions may help improve engagement and effectiveness. To support intervention improvement, participant feedback was obtained and app use patterns were examined for a randomized clinical trial evaluating a smartphone-based intervention for individuals with bipolar disorder.
Methods:
App use and coaching engagement were examined (n = 124). Feedback was obtained via exit questionnaires (week 16, n = 81) and exit interviews (week 48, n = 17).
Results:
On average, over 48 weeks, participants used the app for 4.4 hours and engaged with the coach for 3.9 hours. Participants spent the most time monitoring target behaviors and receiving adaptive feedback and the least time viewing self-assessments and skills. Participants reported that the daily check-in helped increase awareness of target behaviors but expressed frustration with the repetitiveness of the monitoring and feedback content. Participants liked personalizing their wellness plan, but its use did not facilitate skills practice. App use declined over time, which participants attributed to clinical stability, content mastery, and the time commitment. Participants found the coaching supportive and motivating for app use.
Limitations:
App engagement based on viewing time may overestimate engagement. The delay between intervention delivery and the exit interviews and low exit interview participation may introduce bias.
Conclusion:
Utilization patterns and feedback suggest that digital mental health engagement and efficacy may benefit from adaptive personalization of targets monitored combined with adaptive monitoring and feedback to support skills practice and development. Increasing engagement with supports may also be beneficial.
Keywords: bipolar disorder, self-management, engagement, smartphone, digital mental health, behavior change
1. Introduction
Mental health problems are the leading cause of disability worldwide (Vigo et al., 2016). Each year in the USA, 21% of adults aged 18 or older (52.9 million people) experience a mental health disorder, and 5.6% (14.2 million people) experience a serious mental illness such as schizophrenia, bipolar disorder, or major depression (Ruggeri et al., 2000; Substance Abuse and Mental Health Services Administration, 2019). Although existing evidence-based treatments reduce symptoms and improve quality of life, less than half of adults in the USA receive treatment for their mental health problems, and the average delay between the onset of problems and the start of treatment is 11 years (Wang et al., 2004).
Digital technologies provide a promising means to reduce this treatment gap by increasing access to empirically based treatment strategies while providing real-time assessments and feedback to improve treatment. The growing deployment of digital mental health interventions has led to the recognition that human support helps maintain engagement with and improves the effectiveness of these interventions (Bernstein et al., 2022). A mental health professional or a trained non-professional can provide this human support by carrying out specific roles (coach) (Bernstein et al., 2022). Despite the widespread proliferation of digital mental health interventions and their human support, evidence for improved access and outcomes remains variable (Firth et al., 2017a; Firth et al., 2017b). Understanding how individuals utilize and perceive digital mental health interventions and their human support is important for improving them (Bartholomew et al., 1998; Hekler et al., 2016).
Within this context, we obtained feedback and examined participant engagement patterns during a randomized controlled trial (RCT) of LiveWell, a smartphone-based self-management intervention for people with bipolar disorder (Goulding et al., 2022b). The primary aim of the intervention was to decrease mood episode relapse and secondarily to reduce symptom burden and improve quality of life. The intervention aimed to achieve these outcomes by assisting individuals with attending to target behaviors proposed to underlie the impact of existing empirically supported therapies. The main behaviors targeted were taking psychiatric medications as prescribed, obtaining adequate sleep duration, maintaining regular routines, and managing signs and symptoms (i.e., identifying early warning signs, making action and coping plans, monitoring for early warning signs, enacting plans, re-evaluating and revising plans as needed) (Dopke et al., 2021; Goulding et al., 2022b; Jonathan et al., 2021a; Jonathan et al., 2021b; Miklowitz et al., 2008; Miklowitz and Scott, 2009). LiveWell also assisted participants in strengthening social support, managing stressors, and engaging in healthy habits regarding diet, exercise, and substance use.
The RCT evaluating LiveWell did not detect a reduction in the primary outcome of relapse to any mood episode (Goulding et al., 2022a). However, participants were stratified at baseline by relapse risk (low risk: asymptomatic recovery; high risk: continued symptomatic, prodromal, recovering, symptomatic recovery), and decreased relapse rates and reduced manic symptom severity were observed for low-risk but not high-risk participants (Goulding et al., 2022b). The RCT also demonstrated decreased depressive symptom severity and improved relational quality of life.
Understanding how individuals utilize and perceive digital mental health technologies and human support is essential for improving engagement and efficacy (Bartholomew et al., 1998; Hekler et al., 2016). This paper describes the frequency, duration, and patterns of app use and coaching engagement for intervention arm participants during an RCT evaluating LiveWell (Goulding et al., 2022b). Additionally, user feedback was obtained via exit questionnaires and interviews to provide insights into participants’ perceptions of the intervention.
2. Methods
2.1. Overview
The study was approved by the Northwestern University Institutional Review Board (IRB, STU00202860) and registered at ClinicalTrials.gov (NCT03088462). The analysis presented here uses data from LiveWell, a 48-week randomized controlled trial of a coach-guided, smartphone-based intervention (Goulding et al., 2022a). Participants gave informed consent before participation. Of the 205 randomized participants, 124 were randomly assigned to usual care plus the intervention (Supplement 1, Supplemental Table 1) (Goulding et al., 2022a). The intervention design, its empirical and theoretical framework, and the study protocol for the RCT have been described in detail elsewhere (Dopke et al., 2021; Goulding et al., 2021; Goulding et al., 2022b; Jonathan et al., 2021a; Jonathan et al., 2021b).
2.2. Intervention Description
The LiveWell intervention included a smartphone app, a website, and human support provided by a coach (Dopke et al., 2021; Goulding et al., 2021; Goulding et al., 2022b). The smartphone app had five primary components: foundations, toolbox, wellness plan, daily check-in, and daily review. The foundations and toolbox provided psychoeducation about bipolar disorder and self-management. These app components gave participants a rationale for self-management of the target behaviors and addressed the impact of motivational determinants, such as beliefs, environmental resources, and social support, on self-management of target behaviors. The foundations and toolbox sections also discussed the importance of volitional determinants of engaging in target behaviors, including setting clear and realistic target goals (goal setting), making detailed plans for accomplishing goals and overcoming obstacles (wellness plan), monitoring target behaviors (daily check-in), evaluating if goals were being met, and adjusting behavior as needed (daily review). Additionally, the toolbox offered self-assessments to help identify skills to practice (e.g., behavioral activation techniques) to aid in staying well.
Participants were encouraged to use the core of the intervention (daily check-in) to monitor the main target behaviors (sleep, medication adherence, routine, early warning sign monitoring). Based on their self-report information, they received automated adaptive feedback from an expert system (daily review) (Goulding et al., 2021). They could also review topics not selected for feedback using a supplemental section (review something else). In addition, a weekly check-in was used to help participants communicate their clinical status to providers via self-report measures common in treatment and research settings (Patient Health Questionnaire 8, Altman Self-Rating Mania Scale).
Based on chronic disease self-management models, the coach used a motivational interviewing approach to support app use adherence, identification and use of strategies included in the app, and communication with mental health professionals to assist individuals in their self-management of bipolar disorder (Dopke et al., 2021). Participants randomized to the intervention arm were asked to attend a face-to-face app training visit with their coach, complete six scheduled coach calls, read two lessons per week for four weeks, and complete daily and weekly check-ins for 16 weeks. Over the initial four weeks, users worked through the Foundations and Toolbox to develop a personalized Wellness Plan that addressed lifestyle skills for reducing risk, coping skills for managing signs and symptoms, and resources for staying well. Development of this plan included personalizing plans for managing symptoms across different wellness levels and for managing sleep, medication adherence, attending to diet, exercise, and substance use, maintaining a routine, managing stressors, and engaging social support. At week 16, users were asked to read a wrap-up lesson and complete the sixth and final scheduled coach call, which ended the active intervention. Participants continued to have access to the app and ad hoc support for 32 additional weeks.
In addition to scheduled calls, coaches contacted participants after receiving alerts from the expert system concerning potential app use non-adherence, technical problems, or worsening symptom severity (Goulding et al., 2021). When a technical/non-adherence (TNA) data alert was received, the coach called the participant to assess for non-adherence with the app, smartphone, smartwatch use, or technical problems with data collection (Goulding et al., 2022b). When a symptom severity alert was received, the coach conducted a suicidality assessment and functional impairment evaluation (SAFE) (Dopke et al., 2021).
The website and a secure server gave participants a place to review their daily and weekly check-in data across multiple weeks, in contrast to the weekly summaries available in the app. However, the website’s main purpose was to provide information from participants’ self-report data to their mental health providers. To this end, the expert system sent email alerts to mental health providers and coaches to facilitate review of participant data and communication about clinical care (Dopke et al., 2021; Goulding et al., 2021; Goulding et al., 2022b).
2.4. Procedures
After the coaching portion of the intervention was completed at week 16, an exit questionnaire assessing the technology and coaching support was delivered via a web survey. In addition, at 48 weeks, the first individuals completing the study assessments were invited to participate in an exit interview.
2.5. Measures of App Use and Coach Engagement
Measures of app use include (1) minutes spent engaged in overall app use, (2) minutes and percentages spent viewing each app section, (3) minutes and percentages spent viewing each app subsection (e.g., foundation lesson, daily review feedback categories), and (4) percentage of participants viewing the app and app sections by week and overall.
Measures of coach engagement include (1) minutes engaged in coach interactions overall, (2) minutes and percentages spent during the coach in-person visit and scheduled coach calls, (3) total number of ad hoc calls, (4) number of TNA and SAFE calls, (5) average number of TNA and SAFE calls per participant, (6) percentage of completed TNA and SAFE calls, (7) frequency and type of TNA and SAFE calls, and (8) total number of calls to providers due to symptom severity.
2.6. Analysis
In examining app use and coaching participation, all 124 participants randomized to the intervention arm were considered, regardless of whether they initiated the intervention by attending the face-to-face coach visit (n = 117). Of the participants who attended this visit, some withdrew due to the time commitment (n = 8), watch/phone issues (n = 4), difficulty with assessments (n = 2), or other reasons (n = 3), or were lost to follow-up (n = 13).
2.6.1. Analysis of App Use
App use data was stored in and queried from a PostgreSQL database. For each participant, time spent viewing the application sections was calculated using the date-time stamps of entry to and exit from each page; these viewing times were then used to calculate the app time and percentage use measures.
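As an illustration, the viewing-time calculation can be sketched as follows. The page-view rows, section names, and timestamp format below are hypothetical; the actual database schema is not described in this paper.

```python
from datetime import datetime

# Hypothetical page-view log: (section, entry timestamp, exit timestamp),
# analogous to the entry/exit date-time stamps queried from the database.
page_views = [
    ("daily_check_in", "2020-03-02 08:00:10", "2020-03-02 08:03:40"),
    ("daily_review",   "2020-03-02 08:03:40", "2020-03-02 08:05:10"),
    ("daily_check_in", "2020-03-03 08:10:00", "2020-03-03 08:12:30"),
]

def section_minutes(views):
    """Sum minutes viewed per app section from entry/exit timestamps."""
    fmt = "%Y-%m-%d %H:%M:%S"
    totals = {}
    for section, entry, exit_ in views:
        delta = datetime.strptime(exit_, fmt) - datetime.strptime(entry, fmt)
        totals[section] = totals.get(section, 0.0) + delta.total_seconds() / 60
    return totals

minutes = section_minutes(page_views)
total = sum(minutes.values())
percentages = {s: 100 * m / total for s, m in minutes.items()}
```

Each page view contributes its exit-minus-entry duration to its section's total, and section totals are then expressed as minutes and as percentages of overall app time.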
2.6.2. Analysis of Coach Calls
The dates for all coach call attempts and completions were recorded in and queried from REDCap (Research Electronic Data Capture) databases (Harris et al., 2009). For the coach visit and scheduled calls, the duration of each completed interaction was recorded in REDCap databases and used to calculate the mean times and 95% confidence intervals for the visits and calls. The duration of the TNA and SAFE calls was not recorded in the REDCap databases, so the duration of participant time spent during these calls was estimated (Supplement 1, Supplemental Figure 1). The percent of participants activating and completing TNA and SAFE calls was calculated from the queried data. Distinguishing whether missing check-in and sensor data were due to non-adherence issues such as not keeping the phone charged or using the app or due to technical problems with software and data transmission was not always possible, especially if the participant did not respond to contact attempts.
2.6.3. Analysis of Exit Questionnaires
After participants completed the active intervention portion of the study at week 16, they were asked to complete an 81-question exit questionnaire regarding the intervention overall, app sections, the website, app use reminders, technical problems, and working with their psychiatrist and coach (Supplement 2). Questions were adapted from usability satisfaction questionnaires to measure (1) usefulness – extent to which the app or support is useful and increases effectiveness and productivity, (2) ease of use – extent to which the app or support is simple to use, (3) ease of learning – ease of learning how to accomplish and complete tasks with the app, (4) satisfaction – overall satisfaction with the app or support, (5) interface quality – whether the app’s interface was satisfying and pleasant, and (6) privacy – ability to understand and successfully use privacy controls and the sense that privacy is protected (Abran et al., 2003; Lewis, 1995; Lund, 2001). Data from the 7-point question response scales was summarized based on the agree/strongly agree (ASA) and strongly disagree/disagree (SDD) response distributions (Supplement 1, Supplemental Figure 2) and summarized based on the usability constructs (e.g., usefulness) (Supplement 2).
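The response-scale summarization amounts to collapsing the 7-point scale into the combined SDD and ASA bins. A minimal sketch, using hypothetical response counts for a single item:

```python
# Hypothetical response counts for one 7-point item; the category names
# follow the response scale described in the text.
responses = {
    "strongly disagree": 2, "disagree": 3, "somewhat disagree": 4,
    "neither agree nor disagree": 6, "somewhat agree": 20,
    "agree": 30, "strongly agree": 16,
}

def summarize(counts):
    """Collapse 7-point responses into the combined SDD and ASA bins,
    returned as whole percentages of all responses to the item."""
    total = sum(counts.values())
    sdd = counts["strongly disagree"] + counts["disagree"]
    asa = counts["agree"] + counts["strongly agree"]
    return {"SDD_pct": round(100 * sdd / total),
            "ASA_pct": round(100 * asa / total)}

summary = summarize(responses)
```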
2.6.4. Analysis of Exit Interviews
The first participants exiting the intervention arm were asked to participate in exit interviews, which were transcribed verbatim and used for thematic analysis (Braun, 2006). The preplanned goal of obtaining exit interviews for 20% of participants was reduced as no additional themes were identified after the first 15 of 17 total interviews were completed; hence, our sample size was ultimately guided by expectations for thematic saturation, which research suggests is typically achieved after 12 to 18 interviews (Guest et al., 2006). Initial codes were developed using deductive coding guided by the exit interview script (Supplement 3). Three researchers independently conducted a preliminary round of coding, during which transcripts were partitioned into excerpts (transcript lines conveying an understandable unit) and exported to Microsoft Excel (Supplement 4). Coders used nominal group consensus, meeting with a moderator to discuss and clarify differences in coding and finalize codes (Potter et al., 2004). To account for participants who discussed a coded element frequently, ranking scores were assigned at each level of coding (themes, subthemes, subtheme divisions) to provide a metric of how often participants discussed a given code (code count, CC) weighted by the number of participants (participant count, PC) who discussed the code: Rank Score (RS) = 10^(log(CC*PC)/max(log(CC*PC))), range = 1–10. Processing of the Excel spreadsheets to obtain counts and scores was completed using MATLAB (R2021b, MathWorks).
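The rank score formula above can be sketched directly; the code counts (CC) and participant counts (PC) below are hypothetical, used only to show how the normalization maps codes onto the 1–10 range.

```python
import math

# Hypothetical (code count CC, participant count PC) per coded theme.
codes = {"monitoring": (40, 15), "coaching": (12, 9), "wellness_plan": (3, 2)}

def rank_scores(codes):
    """RS = 10^(log(CC*PC) / max(log(CC*PC))), giving scores in [1, 10]
    for CC*PC >= 1 (base-10 logs, matching the formula in the text)."""
    logs = {k: math.log10(cc * pc) for k, (cc, pc) in codes.items()}
    max_log = max(logs.values())
    return {k: 10 ** (v / max_log) for k, v in logs.items()}

rs = rank_scores(codes)
```

Because RS is normalized by the maximum log(CC*PC), the most-discussed code always receives a score of 10, and a code with CC*PC = 1 receives a score of 1.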
3. Results
3.1. Participant Characteristics
The demographic characteristics of all participants randomized to the intervention arm, including those who completed the exit questionnaire or exit interview, are summarized in Supplement 1, Supplemental Table 1. Intervention arm participants could opt to allow their mental health providers secure access to a website summarizing their self-reported data, with 83 (67%) agreeing to enroll 1 (n = 61) or 2 (n = 22) mental health providers. Of the 105 mental health providers for whom participants agreed to offer enrollment, only 32 (30%) agreed to participate (21 psychiatrists and 11 therapists); 43 (41%) declined to enroll, and 30 (29%) did not respond to enrollment outreach efforts.
3.2. App and Website Use
During weeks 0 to 48, most participants viewed the primary sections of the application at least once, but only half the participants viewed the review something else section at least once (Figure 1, Supplement 1, Supplemental Table 2). The percentage of participants viewing the app declined over time, with viewing the foundations, toolbox, and wellness plan declining more quickly than viewing the daily and weekly check-ins and daily review (Figure 1). The average percentage of daily check-ins completed each week by individual participants and the average percentage of weekly check-ins completed every four weeks are provided in Supplement 1, Supplemental Figure 3. While most participants viewed all the lessons in part (92%), only a quarter of the participants read all pages of all the lessons (27%) (Supplement 1, Supplemental Table 2). Viewing of the website summary data was low; only 32 (27%) of the 117 intervention arm participants who initiated the intervention and 10 (31%) of the 32 enrolled providers viewed the website at least once.
Figure 1.

Percentage of Participants Viewing App and App Sections
The percent of participants viewing the app declined over time (90% week 1, 68% week 16), daily check in (90% week 1, 64% week 16), daily review (89% week 1, 57% week 16), weekly check in (74% week 1, 53% week 16), foundations (63% week 1, 23% week 16), toolbox (61% week 1, 10% week 16) and wellness plan (71% week 1, 23% week 16). Over weeks 0–48 for intervention arm participants (n=124), 94% viewed the daily check in, 94% viewed the weekly check in, 94% viewed the daily review, 94% viewed the wellness plan, 92% viewed the foundations, 88% viewed the toolbox and 56% viewed the review something else section.
3.3. Intervention Contact Time
During weeks 0 to 48, participants spent (mean, (95% confidence interval)) 496 (448, 544) minutes engaged in app use and coach interactions, with 264 (228, 301) minutes spent using the app and 232 (214, 249) minutes spent interacting with the coach. In terms of app use, participants spent the most time viewing the daily check-in, followed by the weekly check-in and daily review, then the wellness plan and foundations, with less time spent viewing the toolbox and the least time spent viewing review something else (Figure 2; Supplement 1, Supplemental Table 2). In general, the amount of time spent viewing sections of the application declined over time (Supplement 1, Supplemental Figure 4) except for the wellness plan, which exhibited a peak in viewing at user week 4 when participants were scheduled to collaborate with the coach on their wellness plan (Supplement 1, Supplemental Figure 5). In terms of coaching, participants spent 171 (158, 184) minutes engaged in scheduled interactions (visits and calls), 48 (42, 53) minutes engaged in non-adherence/technical calls, and 13 (10, 17) minutes engaged in SAFE calls (Figure 2). The time spent using the app was correlated with the time spent engaged in coaching (Pearson correlation coefficient, r = 0.55; Supplement 1, Supplemental Figure 6).
Figure 2.

Intervention Time Budgets
During weeks 0–48, total app use time was 4.4 hours. The average number of minutes (95% confidence interval) participants viewed the daily check in was 90 (78, 102), the weekly check in was 47 (41, 54), the daily review was 45 (36, 54), the wellness plan was 34 (29, 39), the foundations was 31 (26, 36), the toolbox was 14 (11, 17) and review something else was 3 (2, 4). The daily review time budget includes the daily review and review something else time. During weeks 0–48, total coach time was 3.9 hours. For the coach visit, the average number of minutes was 70 (65, 74). For the coach scheduled calls, the average number of minutes at week 1 was 17 (15, 18), at week 2 was 15 (14, 17), at week 3 was 13 (12, 15), at week 4 was 34 (30, 38), at week 6 was 11 (10, 12), at week 16 was 11 (9, 12). For the ad hoc calls, the average number of minutes was 48 (42, 53) for technical/non-adherence (TNA) calls and was 13 (10, 17) for suicidality assessment/functional evaluation (SAFE) calls.
Within the foundations section, participants spent 2 to 6 minutes viewing each lesson, with the most time spent viewing lifestyle skills (i.e., content related to keeping a healthy lifestyle around sleep, medication, regular routines, diet, exercise, substances, and healthy relationships) and the least time spent viewing wrapping up (i.e., content prompting participants to reflect on overall use of the program and future plans) (Supplement 1, Supplemental Table 2; Supplement 5). Regarding the daily review, most participants viewed feedback about staying well (received when meeting all primary target goals) and problems with irregular routines (Supplement 5). In addition, about half of the participants viewed feedback about sleeping too little, too much, or erratically, as well as about early warning signs of depression and problems with medication adherence (Supplement 5).
3.4. Coach Scheduled Interactions
Of the 124 intervention arm participants, 117 (94%) attended the coach visit and initiated app use (Supplement 1, Supplemental Table 3). Scheduled call completion declined over time (Supplement 1, Supplemental Table 3), but 85 (68%) participants completed all the coach scheduled calls, and coaches completed 618 (83%) of the 744 scheduled calls planned for participants (6 calls for 124 participants).
3.5. Coach Ad Hoc Interactions
In addition to the coach scheduled visits and calls, 115 (93%) participants activated 642 ad hoc call attempts due to alerts for technical/non-adherence (TNA) problems, with (median, (interquartile range)) 5 (3, 7) activated per participant. Coaches were able to contact the participants for 451 (70%) of these alerts and clarified that 341 (53%) were related to technical problems (transmission issues with sensor data, self-report data, or both) and 79 (12%) were related to daily or weekly check-in adherence issues in the absence of technical problems.
For 68 (55%) of the participants, 193 ad hoc call attempts were activated by symptom severity alerts, with (median, (interquartile range)) 1 (0, 2) activated per participant. These symptom severity alerts led to 36 providers being contacted for 21 (17%) participants. Most of the activations were related to potential new onset of a depressive (53%) or manic (27%) episode based on a change in the weekly check-in score. The remaining calls were triggered by participants’ daily check-in wellness rating (3%), coach concerns about clinical status during a scheduled visit (7%), call (4%), or at another point of contact with the coach (7%).
Coaches were able to contact the participants for 161 (83%) of the symptom severity alerts and completed 161 suicidality assessments (SA) and 154 functional evaluations (FE). During the completed suicidality assessments, most of the participants denied thoughts of wanting to die or kill themselves, but some participants reported suicidal ideation (Table 1). During the completed functional evaluations, most participants reported changes in two areas (mode = 2 problems; 0 problems = 20%, 1 problem = 20%, 2 problems = 30%, 3 problems = 18%, 4 problems = 12%, 5 problems = 1%) with the majority reporting changes in self-care followed by social changes, family problems, changes in impulse control, and work problems (Table 1).
Table 1.
Suicidality Assessment and Functional Evaluation Activations
| Suicidality Assessment | N (%), n = 161 | Functional Evaluation | N (%), n = 154 |
|---|---|---|---|
| SA - Low | 141 (88%) | FE - Self-care changes | 111 (72%) |
| SA - Mild, Provider Aware | 10 (6%) | FE - Sociability changes | 71 (46%) |
| SA - Mild, Provider Unaware | 7 (4%) | FE - Impulsivity changes | 35 (23%) |
| SA - Moderate, Provider Aware | 0 (0%) | FE - Family problems | 44 (29%) |
| SA - Moderate, Provider Unaware | 2 (1%) | FE - Work problems | 28 (18%) |
| SA - Severe | 1 (1%) | | |
SA: suicidality assessment;
FE: functional evaluation;
Low: no thoughts of wanting to die or kill self;
Mild: thoughts of wanting to die but no thoughts of killing self; thoughts of killing self and feel able to control actions and no thoughts of method, no intent or plan to act;
Moderate: thoughts of killing self with method but feel able to control actions and no intent or plan to act;
Severe: thoughts of killing self without method but do not feel able to control actions; thoughts of killing self with method and do not feel able to control actions; thoughts of killing self with intent or plan to act.
3.6. Exit questionnaires
The exit questionnaire was sent to all 92 participants in the intervention arm who remained enrolled at user week 16, and 81 (88%) completed the questionnaire (74% of the 124 randomized intervention arm participants). Overall, most participants found it easy to learn how to use the app (74%, Agree Strongly Agree). They found the app easy to use (70%) and useful (64%) and were satisfied with their app use (68%). They found the coach easy to work with (83%) and useful (78%) (Supplement 2, Usability Summary). Table 2 summarizes individual exit questionnaire responses; the complete questionnaire response data and usability summary are available in Supplement 2.
Table 2.
Exit Questionnaire Responses
| Sections | Usability Type | Questions | DSD | SD | NAD | SA | ASA |
|---|---|---|---|---|---|---|---|
| Overall | Ease of Use | I was able to complete my tasks in a reasonable amount of time. | 3 | 4 | 1 | 10 | 83 |
| | Ease of Use | Overall the application is easy to use. | 6 | 1 | 5 | 11 | 76 |
| | Ease of Use | It was easy to move from one page to another. | 9 | 4 | 4 | 12 | 70 |
| | Ease of Learning | Terminology used in this application is clear. | 3 | 3 | 1 | 15 | 79 |
| | Ease of Learning | The overall organization of the application is easy to understand. | 6 | 11 | 2 | 24 | 57 |
| | Satisfaction | The content of the application met my expectations. | 5 | 4 | 6 | 20 | 65 |
| | Satisfaction | I would be likely to use this application in the future. | 10 | 4 | 6 | 12 | 68 |
| Foundations | Ease of Learning | I found the lessons easy to understand. | 1 | 0 | 1 | 10 | 88 |
| Toolbox | Usefulness | I found skills that I practiced routinely. | 4 | 9 | 15 | 30 | 43 |
| Wellness Plan | Satisfaction | I found the idea of a Wellness Plan interesting. | 1 | 1 | 4 | 16 | 78 |
| | Usefulness | I actively used the ideas and plans that I developed for managing symptoms in my Awareness & Action section. | 4 | 4 | 5 | 28 | 59 |
| | Usefulness | I actively used the ideas and plans that I developed for my lifestyle skills (SMARTS). | 6 | 5 | 7 | 33 | 48 |
| Daily Check In | Ease of Use | I found the Daily Check In easy to use. | 4 | 0 | 0 | 5 | 91 |
| | Usefulness | I found using the Daily Check In helpful. | 4 | 1 | 4 | 10 | 82 |
| | | Using the Daily Check In made me more aware of how much I was sleeping. | 1 | 1 | 1 | 4 | 93 |
| | | Using the Daily Check In made me more aware of my routine. | 4 | 1 | 3 | 10 | 83 |
| | | Using the Daily Check In made me more aware of symptoms and early warning signs. | 4 | 5 | 0 | 14 | 78 |
| | | Using the Daily Check In made me more aware of my medication use. | 9 | 1 | 15 | 5 | 70 |
| Daily Review | Ease of Learning | I found the Daily Review easy to understand. | 1 | 1 | 1 | 9 | 88 |
| | Usefulness | I often followed up and practiced skills suggested in the Daily Review. | 6 | 14 | 10 | 24 | 47 |
| Weekly Check In | Ease of Learning | I found the Weekly Survey easy to understand. | 1 | 1 | 4 | 11 | 83 |
| | Ease of Use | I (did not have) difficulty remembering or reflecting on if I had symptoms and what symptoms I experienced in the past week. | 28 | 24 | 10 | 11 | 27 |
| Website | Ease of Learning | I found the LiveWell Clinical Status Summary easy to understand. | 10 | 5 | 30 | 16 | 40 |
| | Usefulness | I looked at the LiveWell Clinical Status Summary when I was having problems. | 31 | 14 | 15 | 11 | 30 |
| | | I found the LiveWell Clinical Status Summary useful. | 9 | 9 | 35 | 20 | 28 |
| | | I looked at the LiveWell Clinical Status Summary on a regular basis. | 33 | 16 | 14 | 17 | 20 |
| Reminders | Usefulness | I relied on the reminders to complete my daily LiveWell activities. | 15 | 9 | 21 | 11 | 44 |
| Technical | Ease of Use | The battery life of the phone was adequate. | 31 | 14 | 6 | 10 | 40 |
| | | I used the study phone as I would my primary phone. | 33 | 6 | 17 | 7 | 36 |
| | | I (did not) experience technical issues that impeded my use of the application or study equipment. | 20 | 26 | 19 | 6 | 30 |
| | | I was able to problem solve technical issues that came up on my own. | 11 | 9 | 28 | 24 | 28 |
| Coach | Ease of Use | I was able to schedule the coach calls at times that were convenient for me. | 3 | 0 | 5 | 4 | 89 |
| | Usefulness | I found the coach supportive. | 3 | 0 | 4 | 6 | 88 |
| | | Having the coach calls motivated me to read the lessons. | 3 | 1 | 6 | 11 | 79 |
| | | I found the coach’s role beneficial to my use of the application. | 4 | 0 | 7 | 10 | 79 |
| | | I got more out of the application by working with the coach. | 5 | 3 | 5 | 9 | 79 |
| Psychiatrist | Privacy | I was comfortable with my psychiatrist having access to the data collected by the application. | 3 | 1 | 11 | 5 | 80 |
| | Usefulness | Using LiveWell helped me communicate with my psychiatrist about how I was doing. | 9 | 1 | 17 | 21 | 52 |
| | | Using LiveWell led me to make changes in how I worked with my psychiatrist. | 15 | 9 | 20 | 20 | 37 |
Participants responded to exit questions using a 7-point Likert scale: strongly disagree, disagree, somewhat disagree, neither agree nor disagree, somewhat agree, agree, strongly agree. To summarize the data, the responses disagree and strongly disagree were combined, and the responses agree and strongly agree were combined. The distribution of these combined responses was examined; strongly negatively or positively endorsed questions (Supplement 1) are summarized here, and responses to all questions are available in Supplement 2.
DSD – disagree/strongly disagree,
SD – somewhat disagree,
NAD – neither agree nor disagree,
SA – somewhat agree,
ASA – agree/strongly agree.
Participants reported that the app-related tasks could be completed reasonably and that the application terminology was clear (Table 2). However, responses were more mixed regarding the overall organization of the app, ease of moving from page to page, and satisfaction with the app. Participants did report that the foundation lessons, daily review, and weekly check-in were easy to understand and that the daily check-in was easy to use and helpful. They also reported that the daily check-in made them more aware of their sleep, routine, symptoms, and early warning signs, but increased awareness about medication use was not as strongly endorsed.
Participants found the idea of developing a wellness plan interesting but did not strongly endorse that it led to active use of their plans for lifestyle skills or managing symptoms (Table 2). In addition, less than half of the participants reported finding skills in the toolbox that they practiced routinely or following up and practicing skills suggested by the daily review. Participants also had mixed responses about whether they could reliably recall their symptoms for the weekly check-in and the usefulness of the smartphone app reminders. Some participants reported technical problems with using the app and smartphone. Regarding the website, participants did not find the content easy to understand, did not report looking at the website regularly or when having problems, and did not find the website useful (Table 2).
Participants found working with the coach convenient and supportive, beneficial for app use, and motivating for reading the lessons. They also reported that they got more out of the application by working with the coach (Table 2). Most participants were also comfortable with allowing their psychiatrist access to the data collected by the app. However, participants had mixed responses regarding the use of the app improving communication with their psychiatrist or leading to changes in how they worked with them.
3.7. Exit interviews
The first 34 individuals completing the intervention arm were approached by study staff to complete exit interviews, and 17 (50%) completed interviews (14% of the 124 randomized intervention arm participants). Thematic analysis of usability issues identified seven themes: ease of use, ease of learning, usefulness, satisfaction, barriers, technical limitations, and suggestions (Tables 3 and 4).
Table 3.
Exit Interview Themes and Subthemes
| Theme | RS^a | PP^b | CP^c | Subtheme^d | RS | PP | CP | Subtheme Division^d | RS | PP | CP |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Barriers | 10.0 | 94 | 24.1 | Memorability | 10.0 | 88.2 | 10.3 | Website | 10.0 | 70.6 | 2.9 |
| Toolbox | 6.3 | 41.2 | 1.9 | ||||||||
| Wellness Plan – RR^e | 2.9 | 17.6 | 1.0 ||||||||
| Repetitive | 7.8 | 76.5 | 6.1 | Daily Review | 10.0 | 70.6 | 2.9 | ||||
| Daily Check In | 5.3 | 35.3 | 1.6 | ||||||||
| Decreased Use | 5.3 | 47.1 | 3.5 | Mastery | 4.4 | 29.4 | 1.3 | ||||
| Stable | 2.9 | 17.6 | 1.0 | ||||||||
| Time Commitment | 2.9 | 17.6 | 1.0 | ||||||||
| Navigation | 3.6 | 23.5 | 2.6 | Device – S & W^f | 3.4 | 23.5 | 1.0 ||||
| Privacy | 2.3 | 17.6 | 1.0 | ||||||||
| Usefulness | 9.9 | 100.0 | 21.9 | Yes | 8.6 | 76.5 | 8.0 | Generic Positive | 5.8 | 35.3 | 1.9 |
| Daily Check In | 3.9 | 23.5 | 1.3 | ||||||||
| Daily Review | 3.9 | 23.5 | 1.3 | ||||||||
| Reminders | 7.6 | 70.6 | 6.1 | Ensure Completion | 6.3 | 41.2 | 1.9 | ||||
| Reflection | 4.4 | 29.4 | 1.3 | ||||||||
| Routine | 2.9 | 17.6 | 1.0 | ||||||||
| Personalized | 6.1 | 47.1 | 5.1 | Wellness Plan | 3.9 | 23.5 | 1.3 | ||||
| Wellness Plan – MR^g | 3.4 | 17.6 | 1.3 ||||||||
| No | 4.2 | 35.3 | 2.6 | ||||||||
| Suggestions | 9.8 | 100.0 | 21.2 | Monitoring | 7.6 | 64.7 | 6.8 | Daily Check In Times | 4.8 | 35.3 | 1.3 |
| Daily Check In Notes | 4.4 | 29.4 | 1.3 | ||||||||
| Healthy Habits | 3.8 | 29.4 | 1.0 | ||||||||
| Support | 4.9 | 47.1 | 2.9 | Psychiatrist | 4.4 | 29.4 | 1.3 | ||||
| Family | 3.9 | 23.5 | 1.3 | ||||||||
| Interactive | 4.3 | 41.2 | 2.3 | ||||||||
| Feedback | 3.8 | 29.4 | 2.3 | ||||||||
| Knowledge | 3.6 | 29.4 | 1.9 | Foundations | 4.9 | 29.4 | 1.6 | ||||
| Navigation | 2.7 | 17.6 | 1.6 | ||||||||
| Data Viewing | 2.3 | 17.6 | 1.0 | ||||||||
| Technical Limitations | 8.6 | 100.0 | 14.5 | Battery Life | 6.2 | 64.7 | 3.9 | Need for Charge | 4.9 | 29.4 | 1.6 |
| Purple Robot | 3.9 | 23.5 | 1.3 | ||||||||
| Data Transmission | 6.1 | 52.9 | 4.5 | Sensor & Self Report | 6.7 | 35.3 | 2.6 | ||||
| Reminders | 4.8 | 35.3 | 1.3 | ||||||||
| Watch | 3.6 | 29.4 | 1.9 | ||||||||
| Not iPhone Compatible | 2.7 | 17.6 | 1.6 ||||||||
| Satisfaction | 7.0 | 70.6 | 10.6 | Yes | 8.4 | 64.7 | 8.7 | Generic Positive | 5.3 | 35.3 | 1.6 |
| Coach | 2.9 | 17.6 | 1.0 | ||||||||
| Toolbox | 2.9 | 17.6 | 1.0 | ||||||||
| Watch | 2.9 | 17.6 | 1.0 | ||||||||
| No | 3.6 | 29.4 | 1.9 | ||||||||
| Ease of Use | 4.2 | 41.2 | 3.9 | Convenient | 4.2 | 35.3 | 2.6 | Quick Reference | 6.7 | 47.1 | 1.9 |
| Time-Sufficient | 2.5 | 17.6 | 1.3 | Right Amount of Time | 2.9 | 17.6 | 1.0 | ||||
| Ease of Learning | 4.0 | 35.3 | 3.9 | Straightforward | 4.9 | 35.3 | 3.9 | Daily Check In | 3.8 | 29.4 | 1.0 |
^a RS – rank score,
^b PP – percent of participants interviewed (n = 17),
^c CP – percent of all codes.
^d Subthemes and subtheme divisions were included here if 3 or more participants discussed them in the interviews. See Supplement 4 for all subthemes and subtheme divisions.
^e RR – reduce risk,
^f S & W – smartphone and watch,
^g MR – my resources.
Table 4.
Exit Interview Quotes
| Theme | Subtheme | Quote |
|---|---|---|
| Usefulness | Overall | So, I think the app did help me keep more of a balanced. 5043 |
| | | I think I feel healthier overall. I think my mental health is better. 3028 |
| | Daily Check In and Review | Checking in every day I found very helpful. 3035 |
| | | The daily checking in, and then, that coordination with the watch [Daily Review feedback], to really kind of get a picture of what the day has looked like and then reflecting on that the next morning. I think that’s been the most helpful. 5037 |
| | | And then because routine is so important to my good mental health. The fact that it [Daily Review] was flagged when I was off on something was helpful. 5054 |
| | Reminders | Well [the reminders] were useful because it kept me on task to complete my daily routine. 3211 |
| | | It was like having, kinda like your little electronic buddy reminding you to check in. So, I thought it was good. I don’t think the LiveWell application would be as good without that. 5040 |
| | Personalization | I liked being able to put in which symptoms would I be at each one [Awareness & Action Wellness levels]. That was really nice to be able to look at my own words on the screen. 3027 |
| | | I think the wellness plan is a great tool… Especially since it’s customized for the patient. 3063 |
| Satisfaction | Yes | I think this is a good program. I’m gonna recommend it to everyone I know that have mental illness. 3022 |
| | | I see a lot of benefit… my participation did make a difference in what I’m doing now in my life and finding a little more place of peace where the anxiety and stress don’t cause me to swerve off course. 3211 |
| | No | It [Daily Review] was nagging me a little… To me it was, a little extra that I didn’t need. (Not useful) 5037 |
| | | No. I mean, I already know what I need to do to reduce risk. These kind of things are very simple. I mean, they’re already ingrained in my brain. (Not useful) 3027 |
| | | I had some discontent with the application, so I just sort of continued to do things on my own. (Not satisfied) 3157 |
| Ease of Use | Convenient | You know, just kind of sit there and look at it [LiveWell] and spend some time with it because it was convenient to have that resource. 3036 |
| | Time-Sufficient | [The daily check in questions] seemed the proper amount. Like not too many questions and just general enough that you didn’t feel overburdened by it. 3035 |
| Ease of Learning | Straightforward | I think this a pretty good setup. It’s [daily check in] straightforward. 3022 |
| | | Um, it [awareness and action plan] was very straightforward and easy to follow. 3027 |
| Barriers | Memorability | Verified that I could sign in [to the website]. And, then I never did again...I think I forgot. 5037 |
| | | The toolbox, I’m just trying to remember… maybe I did not use that as much. 3028 |
| | Repetitive | [The check in] almost gets to the point where you just click through it. 5032 |
| | | [The daily review] became redundant. I haven’t really accessed it too much. Maybe three or four times. 3211 |
| | Decreased Use | My utilization of the app was intensive early on… I used to check in, say, “Hey, this is my sleep, this is my meds.” I haven’t been referencing, that guidance for a while now. 3211 |
| | | In terms of the [wellness] scale, I don’t go too far in either direction. So, my use of it was limited. 3028 |
| | | I guess I grew less interested in using the app. But when you needed...it was helpful. When you’re being active to do it and you still have a life going on. 3035 |
| | Navigation | I’m not really techno-savvy… If I was a little bit more into technology, I would’ve been able to navigate some of the... More apps on the, on the phone. 5043 |
| | Privacy | I know there was a lot of data… that was just kind of a fleeting concern about data. 3211 |
| Technical Limitations | Battery Life | The monitoring… it drains your battery… so that’s a tad bit of a problem. When I was traveling I had to worry about that. 3035 |
| | Data Transmission | I had some issues with my phone not syncing and it took us a couple weeks to get that resolved and then I just got out of the habit. 5037 |
| | Not iPhone Compatible | I have an iPhone and it would have been convenient for me to have it just on my iPhone. 3036 |
| Suggestions | Monitoring | If I could kind of just check in every three days and then kind of think back onto how my day went, I could just log in for you know, the past two days or the past three days and check in like that. 5030 |
| | | I thought maybe it could’ve used quarter-hour increments on the getting to sleep and how many hours of sleep did you get...I like to be more precise. 5032 |
| | | I wish I had had the opportunity to enter a note...if I’m down maybe say, pain today. ‘Cause I have a bad shoulder, I had surgery... and my pain level can be through the roof sometimes... Even though I’m okay it’s the pain affecting the mood. So if I could put a little note in there that would help me track it better. 3027 |
| | | I think caffeine and alcohol would be something that would have helped me track...what are you doing with booze, how many drinks that day, and when you check in. 3211 |
| | Support | Something that you could give your psychiatrist to say this is an app I’m using. Like some kind of info card so you don’t have to explain it to them. Because as a patient it’s sometimes complicated to explain to a doctor. 3035 |
| | | I would like to have like the family more involved in it. As far as significant other. Instead of just us doing it by ourselves because I feel that mental illness is not just an individual problem. It’s also a family problem. 3183 |
In terms of usefulness and satisfaction, participants found the app useful and specifically mentioned that the daily check-in and daily review were helpful for monitoring their target behaviors, such as sleep and routine. Participants shared that the reminders motivated them to complete their daily check-in while encouraging self-reflection, and they found the app’s personalization useful with regard to the wellness plan, daily check-in, and daily review (Supplement 4). Most participants were satisfied with the app and with working with the coach. However, about a third of participants expressed that they did not find the app useful and were dissatisfied. In terms of ease of use and learning, participants said that app use was easy, convenient, and took the right amount of time, and that the app was easy to learn because it was straightforward, with most mentioning this in the context of the daily check-in (Table 4, Supplement 4).
Regarding barriers, participants primarily focused on several subthemes, including memorability of the website and toolbox and repetitiveness of the daily check-in and daily review (Tables 3 and 4). Participants also discussed reasons why their use decreased over time, including feeling as though they had mastered the materials, that their symptoms had reached a place of stability, or that the ongoing time commitment made it difficult to maintain use. Several participants also found navigation of the smartphone and watch difficult, and a few had privacy concerns. Participants discussed technical limitations that got in the way of use, such as problems with battery life, data transmission, the watch, and the smartphone. Participants also expressed a desire for the app to be compatible with iPhones.
Participants suggested app improvements related to monitoring, support, interactivity, feedback, and information provided (knowledge) (Tables 3 and 4). For monitoring, participants shared that they would like to change the frequency of the daily check-in and the time increments offered for monitoring their bedtimes and rise times (routine). Others requested that future iterations of the intervention include an opportunity to write notes in the daily check-in to help them remember days or instances when their mood may have been context-dependent. Additionally, participants suggested that it would be helpful to monitor healthy habits such as the consumption of caffeine and alcohol. Participants also suggested that the intervention might do more to involve their supports, including care providers and family.
4. Discussion
Understanding how individuals utilize and perceive digital mental health interventions and associated human support can improve engagement and efficacy (Bartholomew et al., 1998; Hekler et al., 2016). To this end, participants’ feedback was obtained, and frequency, duration, and patterns of app use and coaching engagement were examined during an RCT evaluating LiveWell.
During app use, participants spent most of their time using the daily and weekly check-ins and the daily review. Participants reported that the daily check-in was useful for enhancing awareness about sleep, routine, and early warning sign management, consistent with prior research demonstrating that app-based monitoring tools increase self-awareness (Kauer et al., 2012; Morris et al., 2010). Participants also found the daily review helpful but expressed frustration with the repetitiveness of its content and daily check-in monitoring.
The check-ins supported self-monitoring, and similar to other adaptive interventions, the daily review adaptively delivered personalized content and suggested tools and skills for use based on self-report data (Collins et al., 2004; Heron and Smyth, 2010; Nahum-Shani et al., 2018). These app sections support behavioral control processes (self-monitoring of target behaviors, comparison of current behavior with behavioral goals (evaluation), and making adjustments based on evaluation) that have previously been identified as playing an important role in health behavior change (Michie et al., 2008; Michie et al., 2013). In addition, among individuals with bipolar disorder, self-monitoring, evaluation, and adjusting behaviors have been associated with reduced relapse rates and symptom burden, improved quality of life, modification of maladaptive behaviors, and development of coping strategies (Colom et al., 2003; Faurholt-Jepsen et al., 2014; Miklowitz et al., 2008; Miklowitz et al., 2007; Perry et al., 1999).
The foundations and toolbox provided information about bipolar disorder and its treatment to facilitate setting goals and developing wellness plans. The toolbox provided participants with self-assessments and skills to practice. Despite the toolbox’s potential, participants did not believe it translated into practical skill practice, a known facilitator of skill generalization in real-world situations (Miklowitz et al., 2008; Miklowitz et al., 2007). This is significant, as studies have shown that individuals with bipolar disorder who engage in skills training exhibit greater reductions in depressive and manic symptoms compared to those who do not (Firth et al., 2015; Lam et al., 2005).
App use declined over time, comparable to the declines (58–92%) reported for other app-based interventions for individuals with bipolar disorder (Patoz et al., 2021). The decline in overall use was similar to the change in use of the daily check-in, daily review, and weekly check-in. In contrast, use of the foundations, toolbox, and wellness plan declined more quickly, corresponding with the request to read the lessons and review them with the coach during the first four weeks. Participants attributed their decline in app use to clinical stability, content mastery, and the time commitment required. Problems with memorability of the toolbox and wellness plan and the repetitive nature of the daily check-in and review likely also played a role in declining app use. Differentially addressing these various reasons for declining use should improve app engagement. For individuals who decreased use due to clinical stability, identifying when to support reengagement with the app and human support will be important. In contrast, declining use due to mastery and repetitiveness should stimulate further efforts to develop adaptive personalization of target and skills monitoring combined with adaptive feedback supporting personalized target achievement and skills development.
The predominant use of the daily check-in and review and participant feedback that they found these tools useful may provide an avenue to increase sustained app engagement. For instance, to reduce the repetitiveness of the daily check-in and review and as suggested by participants, targets for daily monitoring could be adaptively personalized by providing options to add additional targets for monitoring, such as healthy habits (diet, exercise, substance use) and by incorporating entry of freehand notes. To reduce repetitiveness, a behavior target participants consistently achieve (e.g., medication adherence) might be monitored less frequently to allow participants to focus on new self-selected goals. The daily check-in and review might be adaptively personalized to enhance skills practice and development by allowing users to add skills they want to track, like reframing negative thoughts (Basco and Rush, 2005), fostering interactive self-assessment, and skills practice. In addition, as individuals with bipolar disorder who develop a wellness plan report better medication adherence, self-care, and fewer symptoms than those without plans (Janney et al., 2014; Murray et al., 2011), better integration of wellness plans into adaptive app feedback may improve the use of personalized plans.
Similar to other digital interventions, participants’ coaching engagement was substantial. Participants spent a similar amount of time engaging with the coach (3.9 hours) and using the app (4.4 hours). While a recent review of human support in app-based treatment reports that weekly support time per participant ranges from less than 10 to 60 minutes, it is difficult to compare the present findings to other studies because the duration of human support is the most commonly omitted data element in app-based treatment trials (Bernstein et al., 2022). To our knowledge, our study is the first to report total coaching duration alongside time spent using the app; it is critical that similar studies report these findings moving forward.
Consistent with prior findings, participants indicated that coaches were supportive, beneficial, and motivated app use (Bernstein et al., 2022; Firth et al., 2017a; Mohr et al., 2011). About half of the participants triggered SAFE calls, most reporting functional changes in multiple domains (e.g., self-care, sociability, family and work relations). This information prompted increased communication with mental health providers. However, to address these functional changes and clinical follow-up, further integration of app content and tools may be necessary (e.g., having coaches directly support participants in using app components or engagement with their mental health providers to address a specific functional change). Considering the limited engagement of mental health providers, it might be necessary to explore options for participants to access additional support beyond providers, like integrating family and friends through a web portal or establishing an online peer support forum.
Addressing problems elicited during SAFE calls highlights the importance of provider engagement. Very few providers enrolled in the study, and of those who did, very few used the website to view their patient’s data. This lack of provider involvement may reflect a person-centered design process that focused primarily on developing the app with participants rather than developing the website with providers. Future iterations of the intervention might emphasize co-designing the website with providers and shifting the website information into the app to make it more accessible for participants (Eyles et al., 2016). Participants also reported that the intervention did not change how they interacted with their care teams. Investigating how to increase provider engagement and improve patient-provider communication will be important for improving outcomes. Participants also expressed interest in improvements to LiveWell that would enhance the involvement of friends and family members, which may also improve digital mental health intervention efficacy.
4.1. Limitations
There were several limitations to assessing participants’ engagement with and views regarding the intervention. Smartphone engagement was defined as time spent viewing app content and completing self-reports, which may overestimate engagement if participants were not attending to the content being viewed or the responses being entered. Only half of the participants approached for exit interviews completed an interview; thus, self-selection may have resulted in missed perspectives and bias in the thematic analysis. In addition, while the active intervention ended at 16 weeks, the exit interviews were conducted at 48 weeks, potentially introducing recall bias. Regarding engagement, the durations of app use and scheduled coach calls were directly measured, but the durations of ad hoc coach calls (SAFE, TNA) were estimated. Finally, although participants identified coaching as beneficial, the extent to which coaching is necessary to aid in and benefit from app use, and for whom it is necessary, cannot be determined from the current study, in which all participants had the opportunity to engage with a coach.
5. Conclusion
The present study is consistent with prior work indicating that participants are interested in and willing to use app components to enhance behavioral control processes, such as self-monitoring tools and adaptive feedback supporting evaluation and adjustment (Jonathan et al., 2021a; Jonathan et al., 2021b). These components likely drive app use, but use still declines over time for reasons such as clinical stability, content mastery, and time commitment. As participant app use is focused on behavioral control components, developing real-time adaptive identification of current personally relevant targets for monitoring and feedback will likely increase sustained app use. In addition, because of the important role of identifying and building skills for managing mental health problems (Farkas and Anthony, 2010; Lyman et al., 2014), adaptively incorporating current personally relevant skills for monitoring and feedback combined with providing more interactive tools to support skills practice and development may improve sustained app use and intervention efficacy. Identifying when to support reengagement with app use and human support for individuals who decrease use due to clinical stability is also important to address.
Consistent with prior work, this study indicates the importance of human support in enhancing engagement with and benefits derived from app use, and it highlights the need to determine the best methods for increasing provider engagement and facilitating communication between providers and users to improve digital mental health intervention efficacy. Similarly, additional work enhancing the utility of digital mental health interventions in helping participants build and engage friends and family as supports will likely be useful. In summary, information about user opinions and engagement with technology and human support should be obtained while developing and deploying digital mental health interventions. This information should be made available to create a knowledge base that enhances understanding of how to successfully implement these interventions in real-world settings.
Supplementary Material
Highlights.
App use predominantly involved monitoring using the daily and weekly check-ins
Check-ins encouraged awareness around sleep, routine, early warning signs and symptoms
App use declined due to clinical stability, content mastery, and time constraints
Time spent with coaches was substantial and comparable to app use
Suicidality and functional evaluation calls highlight the need for provider involvement
Acknowledgements
We want to thank all individuals who agreed to participate in the LiveWell study.
Role of the Funding Source
This study was funded by grant R01 MH110626 from the National Institute of Mental Health (Dr. Goulding).
Footnotes
Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
Declaration of interest
Dr. Goulding reported receiving grants from the National Institute of Mental Health, honoraria from Otsuka Pharmaceuticals outside the submitted work. Dr. Rossom reported receiving grants from Otsuka Pharmaceuticals outside the submitted work. Dr. Mohr reported receiving grants from the National Institute of Mental Health; honoraria/consulting fees from Apple Inc, Otsuka Pharmaceuticals, Optum Behavioral Health, Pear Therapeutics, Centerstone Research Institute, and OneMind Foundation; royalties from Oxford Press Royalties; an equity stake from Adaptive Health Inc; and having a patent for US 2022/0084683 A1 issued. Dr. Kwasny reported receiving grants from the National Institutes of Health during the conduct of the study. No other disclosures were reported.
References
- Abran A, Khelifi A, Suryn W, Seffah A, 2003. Usability meanings and interpretations in ISO standards. Software Quality Journal 11, 325–338.
- Bartholomew LK, Parcel GS, Kok G, 1998. Intervention mapping: a process for developing theory- and evidence-based health education programs. Health Educ Behav 25, 545–563.
- Basco MR, Rush AJ, 2005. Cognitive-Behavioral Therapy for Bipolar Disorder. Guilford Press.
- Bernstein EE, Weingarden H, Wolfe EC, Hall MD, Snorrason I, Wilhelm S, 2022. Human Support in App-Based Cognitive Behavioral Therapies for Emotional Disorders: Scoping Review. J Med Internet Res 24, e33307.
- Braun V, Clarke V, 2006. Using Thematic Analysis in Psychology. Qualitative Research in Psychology 3, 77–101.
- Collins LM, Murphy SA, Bierman KL, 2004. A conceptual framework for adaptive preventive interventions. Prev Sci 5, 185–196.
- Colom F, Vieta E, Martinez-Aran A, Reinares M, Goikolea JM, Benabarre A, Torrent C, Comes M, Corbella B, Parramon G, Corominas J, 2003. A randomized trial on the efficacy of group psychoeducation in the prophylaxis of recurrences in bipolar patients whose disease is in remission. Archives of General Psychiatry 60, 402–407.
- Dopke CA, McBride A, Babington P, Jonathan GK, Michaels T, Ryan C, Duffecy J, Mohr DC, Goulding EH, 2021. Development of Coaching Support for LiveWell: A Smartphone-Based Self-Management Intervention for Bipolar Disorder. JMIR Form Res 5, e25810.
- Eyles H, Jull A, Dobson R, Firestone R, Whittaker R, Te Morenga L, Goodwin D, Mhurchu CN, 2016. Co-design of mHealth delivered interventions: a systematic review to assess key methods and processes. Current Nutrition Reports 5, 160–167.
- Farkas M, Anthony WA, 2010. Psychiatric rehabilitation interventions: a review. International Review of Psychiatry 22, 114–129.
- Faurholt-Jepsen M, Frost M, Vinberg M, Christensen EM, Bardram JE, Kessing LV, 2014. Smartphone data as objective measures of bipolar disorder symptoms. Psychiatry Research 217, 124–127.
- Firth J, Torous J, Nicholas J, Carney R, Pratap A, Rosenbaum S, Sarris J, 2017a. The efficacy of smartphone-based mental health interventions for depressive symptoms: a meta-analysis of randomized controlled trials. World Psychiatry 16, 287–298.
- Firth J, Torous J, Nicholas J, Carney R, Rosenbaum S, Sarris J, 2017b. Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. Journal of Affective Disorders 218, 15–22.
- Firth N, Barkham M, Kellett S, 2015. The clinical effectiveness of stepped care systems for depression in working age adults: a systematic review. Journal of Affective Disorders 170, 119–130.
- Goulding EH, Dopke CA, Michaels T, Martin CR, Khiani MA, Garborg C, Karr C, Begale M, 2021. A Smartphone-Based Self-management Intervention for Individuals With Bipolar Disorder (LiveWell): Protocol Development for an Expert System to Provide Adaptive User Feedback. JMIR Form Res 5, e32932.
- Goulding EH, Dopke CA, Rossom R, Jonathan G, Mohr D, Kwasny MJ, 2022a. Effects of a Smartphone-Based Self-management Intervention for Individuals With Bipolar Disorder on Relapse, Symptom Burden, and Quality of Life: A Randomized Clinical Trial. JAMA Psychiatry.
- Goulding EH, Dopke CA, Rossom RC, Michaels T, Martin CR, Ryan C, Jonathan G, McBride A, Babington P, Bernstein M, Bank A, Garborg CS, Dinh JM, Begale M, Kwasny MJ, Mohr DC, 2022b. A Smartphone-Based Self-management Intervention for Individuals With Bipolar Disorder (LiveWell): Empirical and Theoretical Framework, Intervention Design, and Study Protocol for a Randomized Controlled Trial. JMIR Res Protoc 11, e30710.
- Guest G, Bunce A, Johnson L, 2006. How many interviews are enough? An experiment with data saturation and variability. Field Methods 18, 59–82.
- Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG, 2009. Research electronic data capture (REDCap): a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 42, 377–381.
- Hekler EB, Klasnja P, Riley WT, Buman MP, Huberty J, Rivera DE, Martin CA, 2016. Agile science: creating useful products for behavior change in the real world. Transl Behav Med 6, 317–328.
- Heron KE, Smyth JM, 2010. Ecological momentary interventions: incorporating mobile technology into psychosocial and health behaviour treatments. Br J Health Psychol 15, 1–39.
- Janney CA, Bauer MS, Kilbourne AM, 2014. Self-management and bipolar disorder: a clinician’s guide to the literature 2011–2014. Curr Psychiatry Rep 16, 485.
- Jonathan GK, Dopke CA, Michaels T, Bank A, Martin CR, Adhikari K, Krakauer RL, Ryan C, McBride A, Babington P, Frauenhofer E, Silver J, Capra C, Simon M, Begale M, Mohr DC, Goulding EH, 2021a. A Smartphone-Based Self-management Intervention for Bipolar Disorder (LiveWell): User-Centered Development Approach. JMIR Ment Health 8, e20424.
- Jonathan GK, Dopke CA, Michaels T, Martin CR, Ryan C, McBride A, Babington P, Goulding EH, 2021b. A smartphone-based self-management intervention for individuals with bipolar disorder (LiveWell): qualitative study on user experiences of the behavior change process. JMIR Mental Health 8, e32306.
- Kauer SD, Reid SC, Crooke AHD, Khor A, Hearps SJC, Jorm AF, Sanci L, Patton G, 2012. Self-monitoring using mobile phones in the early stages of adolescent depression: randomized controlled trial. Journal of Medical Internet Research 14, e1858.
- Lam DH, Hayward P, Watkins ER, Wright K, Sham P, 2005. Relapse prevention in patients with bipolar disorder: cognitive therapy outcome after 2 years. American Journal of Psychiatry 162, 324–329.
- Lewis JR, 1995. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction 7, 57–78.
- Lund AM, 2001. Measuring Usability with the USE Questionnaire. STC Usability SIG Newsletter 8.
- Lyman DR, Kurtz MM, Farkas M, George P, Dougherty RH, Daniels AS, Ghose SS, Delphin-Rittmon ME, 2014. Skill building: assessing the evidence. Psychiatric Services 65, 727–738.
- Michie S, Johnston M, Francis J, Hardeman W, Eccles M, 2008. From theory to intervention: Mapping theoretically derived behavioural determinants to behaviour change techniques. Applied Psychology-an International Review-Psychologie Appliquee-Revue Internationale 57, 660–680. [Google Scholar]
- Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE, 2013. The Behavior Change Technique Taxonomy (v1) of 93 Hierarchically Clustered Techniques: Building an International Consensus for the Reporting of Behavior Change Interventions. Annals of Behavioral Medicine 46, 81–95. [DOI] [PubMed] [Google Scholar]
- Miklowitz DJ, Goodwin GM, Bauer MS, Geddes JR, 2008. Common and specific elements of psychosocial treatments for bipolar disorder: a survey of clinicians participating in randomized trials. Journal of psychiatric practice 14, 77–85. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Miklowitz DJ, Otto MW, Frank E, Reilly-Harrington NA, Wisniewski SR, Kogan JN, Nierenberg AA, Calabrese JR, Marangell LB, Gyulai L, Araga M, Gonzalez JM, Shirley ER, Thase ME, Sachs GS, 2007. Psychosocial treatments for bipolar depression: a 1-year randomized trial from the Systematic Treatment Enhancement Program. Archives of general psychiatry 64, 419–426. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Miklowitz DJ, Scott J, 2009. Psychosocial treatments for bipolar disorder: cost-effectiveness, mediating mechanisms, and future directions. Bipolar disorders 11 Suppl 2, 110–122. [DOI] [PubMed] [Google Scholar]
- Mohr DC, Cuijpers P, Lehman K, 2011. Supportive accountability: a model for providing human support to enhance adherence to eHealth interventions. J Med Internet Res 13, e30. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Morris ME, Kathawala Q, Leen TK, Gorenstein EE, Guilak F, DeLeeuw W, Labhard M, 2010. Mobile therapy: case study evaluations of a cell phone application for emotional self-awareness. Journal of medical Internet research 12, e1371. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Murray G, Suto M, Hole R, Hale S, Amari E, Michalak EE, 2011. Self-management strategies used by ‘high functioning’ individuals with bipolar disorder: from research to clinical practice. Clinical psychology & psychotherapy 18, 95–109. [DOI] [PubMed] [Google Scholar]
- Nahum-Shani I, Smith SN, Spring BJ, Collins LM, Witkiewitz K, Tewari A, Murphy SA, 2018. Just-in-Time Adaptive Interventions (JITAIs) in Mobile Health: Key Components and Design Principles for Ongoing Health Behavior Support. Ann Behav Med 52, 446–462. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Patoz M-C, Hidalgo-Mazzei D, Pereira B, Blanc O, de Chazeron I, Murru A, Verdolini N, Pacchiarotti I, Vieta E, Llorca P-M, 2021. Patients’ adherence to smartphone apps in the management of bipolar disorder: a systematic review. International journal of bipolar disorders 9, 19. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Perry A, Tarrier N, Morriss R, McCarthy E, Limb K, 1999. Randomised controlled trial of efficacy of teaching patients with bipolar disorder to identify early symptoms of relapse and obtain treatment. BMJ (Clinical research ed 318, 149–153. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Potter M, Gordon S, Hamer P, 2004. The nominal group technique: a useful consensus methodology in physiotherapy research. New Zealand Journal of Physiotherapy 32, 126–130. [Google Scholar]
- Ruggeri M, Leese M, Thornicroft G, Bisoffi G, Tansella M, 2000. Definition and prevalence of severe and persistent mental illness. The British Journal of Psychiatry 177, 149–155. [DOI] [PubMed] [Google Scholar]
- Substance Abuse and Mental Health Services Administration, 2019. Key substance use and mental health indicators in the United States: results from the 2019 National Survey on Drug Use and Health. Center for Behavioral Health Statistics and Quality, Substance Abuse and Mental Health Services Administration, Rockville, MD. [Google Scholar]
- Vigo D, Thornicroft G, Atun R, 2016. Estimating the true global burden of mental illness. Lancet Psychiatry 3, 171–178. [DOI] [PubMed] [Google Scholar]
- Wang PS, Berglund PA, Olfson M, Kessler RC, 2004. Delays in initial treatment contact after first onset of a mental disorder. Health services research 39, 393–416. [DOI] [PMC free article] [PubMed] [Google Scholar]