Published in final edited form as: Behav Res Ther. 2022 Feb 15;151:104065. doi: 10.1016/j.brat.2022.104065

Lessons Learned from Designing an Asynchronous Remote Community Approach for Behavioral Activation Intervention for Teens

Jessica L Jenness 1, Arpita Bhattacharya 2, Julie A Kientz 3, Sean A Munson 3, Ria Nagar 4

Abstract

Adolescent depression is common; however, over 60% of depressed adolescents do not receive mental health care. Digitally delivered evidence-based psychosocial interventions (EBPIs) may provide an opportunity to improve access to and engagement in mental health care. We present a case study that reviews lessons learned from using the Discover - Design - Build - Test (DDBT) model to create, develop, and evaluate a high-fidelity prototype of an app to deliver an EBPI for depression, behavioral activation (BA), on an Asynchronous Remote Communities (ARC) platform (referred to as ActivaTeen). We review work at each stage of the DDBT framework, including initial formative work, iterative design and development work, and an initial feasibility study. We engaged teens with depression, mental health clinicians, and expert evaluators throughout the process. We found that the DDBT model supported the research team in understanding the requirements for our prototype system, ActivaTeen, and in conceiving and developing specific ideas for implementation. Our work contributes a case study of how the DDBT framework can be applied to adapting an EBPI to a new, scalable digital format. We provide lessons learned from engaging teens and clinicians with an asynchronous approach to EBPIs, as well as human-centered design considerations for teen mental health.

Keywords: Digital Mental Health, Discover-Design-Build-Test, Asynchronous Remote Communities, Adolescent Depression, Depression Treatment, Behavioral Activation

Introduction

Approximately 9–11% of U.S. adolescents are diagnosed with depression each year (Mojtabai et al., 2016; SAMHSA, 2016). Adolescent depression increases risk for lifelong negative outcomes such as psychosocial disability and suicide (Clayborne et al., 2019). Despite the rise of depression in adolescence, only 1% of all U.S. youth receive outpatient care for depression each year, with low engagement among those treated (Olfson et al., 2003, 2009). Furthermore, uptake of evidence-based psychosocial interventions (EBPIs) into usual care has been slow given training and implementation challenges (Fixsen et al., 2009; Williams & Beidas, 2019). It is critical to investigate innovations that capitalize on the near-ubiquitous use of technology platforms among adolescents to improve the usability of and engagement in EBPIs. Here we present a case study reviewing lessons learned from applying the Discover, Design, Build, Test (DDBT) implementation model (Lyon et al., 2019) to understand how a specific EBPI, Behavioral Activation (BA) (McCauley, Schloredt, et al., 2016), may be improved by delivery on an asynchronous online platform that provides technological and peer support.

Technology-Mediated Interventions to Address Adolescent Depression

Traditional EBPIs are greatly limited by implementation cost (Eiraldi et al., 2015), time burden, and lack of administrative support (Langley et al., 2010). Digital mental health innovations have the potential to transform EBPIs in ways that increase accessibility, engagement, and scalability, especially for traditionally underserved individuals (Comer, 2015). Previous digital adolescent depression treatments have included a range of options, including multi-modal (e.g., videos, online workbooks and modules, animations) self- and therapist-guided computer-based cognitive behavioral therapy (CBT), video game-based EBPI learning, and self-guided smartphone applications or text-based therapy (see Grist et al., 2019 and Hollis et al., 2017 for reviews). In systematic reviews of digital health interventions, computer-based CBT was found effective for adolescents with anxiety and depression (Grist et al., 2019; Hollis et al., 2017). Pilot randomized controlled trials have demonstrated the acceptability and feasibility of websites and mobile applications in delivering BA modules to adolescents with depression (Davidson et al., 2014; Rohani et al., 2020). While self-administered digital approaches, such as smartphone apps and online modules, may reduce certain barriers related to access, maintaining user engagement over time continues to be a significant limitation (Fritz et al., 2014; Tatara et al., 2013). Similar to in-person EBPI delivery approaches, digital interventions have also been critiqued for poor, unresponsive design, reducing their potential for real-world impact (Grist et al., 2019). Further, only 10% of CBT/BA mobile applications use core EBPI components, and many do not address safety and privacy needs crucial to safe platform engagement (Huguet et al., 2016). Therefore, our work aimed to address the need for creative use and design of technology to engage teens in EBPIs for mental health management.

The Asynchronous Remote Communities (ARC) method is a promising technology-based approach that capitalizes on the improved reach and usability of technology while providing support, social interaction, and peer-driven motivation, features that have increased engagement in digital health platforms (Epstein et al., 2016). Specifically, ARC platforms, such as Slack, use private online spaces to both collect data and deliver weekly tasks to groups of target users or other stakeholders through automated chatbots and human moderators. Participants in ARCs may respond with multimedia options (e.g., de-identified photos, videos) and collaborate among themselves to generate ideas, problem-solve challenges, and provide supportive coaching. ARC platforms were originally used as a data collection tool in qualitative research, allowing for asynchronous discussions by target users (e.g., clinicians, patients) on design and user needs for integrating technology into healthcare. Recently, work from our group has evaluated whether ARC may be a feasible delivery tool for modified EBPIs (Bhattacharya et al., 2021).

Specifically, the same ARC attributes that make them valuable for research (flexibility, transportability, and scalability) may also support their use to improve EBPI usability, reach, and engagement for adolescents who have limited access to mental health facilities. For example, ARC platforms could either supplement or reduce the need for synchronous therapy interactions through technology-based symptom and behavioral data collection and homework completion. Alternatively, ARC could serve as a fully self-directed tool by delivering EBPI modules within a community of peer learners. To determine the feasibility of ARC-delivered EBPIs, we adapted the ARC method to (1) deliver early prototypes based on the weekly evidence-based strategies of behavioral activation, a straightforward and adaptable intervention for depression management (Tindall et al., 2017), and (2) facilitate remote participation and feedback from teenagers and clinicians in the design and testing process.

BA is based on a functional analytic model of depression that highlights the transactional associations among environmental stress, behavior, and mood (McCauley, Schloredt, et al., 2016). BA addresses depression through a number of behavioral strategies (e.g., mood-activity logging, goal-setting, problem-solving barriers) with the goal of (1) increasing the experience of positive reinforcement (rewarding experiences) to help improve mood and (2) decreasing avoidance of reinforcing activities (McCauley et al., 2016). Importantly, BA is easily tailored to fit each individual’s values and goals based on their context. BA has been successfully delivered across many platforms including self-directed online courses and clinician-directed approaches in primary care, telephone, and brief-group therapy (Dimidjian, 2011).

Our previous work examining teen stress management using ARC indicated that teens wanted digital tools to (1) support reflection on and logging of factors that may impact mood management, (2) provide social and peer support, and (3) support planning for goal completion (Bhattacharya et al., 2019). Indeed, features such as mood and activity logging and visualizations, as well as text reminders for skill use, were feasible and acceptable and improved platform engagement (Davidson et al., 2014; Rohani et al., 2020). While previous digital mental health tools have examined each of these aspects independently or in limited combinations (Grist et al., 2019; Huguet et al., 2016), maintaining user engagement (e.g., time spent using components of the digital tool, frequency of skill use/treatment recommendations in situ) over time continues to be a significant limitation (Grist et al., 2019). The unique aspect of using the ARC approach for mental health interventions is its combination of social connection with interactive support tools, such as chatbots, photos, and videos, on portable platforms across multiple devices. Specifically, the core components of ARC platforms that may improve patient engagement in BA and, ultimately, mental health and functional outcomes are: (1) between-session therapy homework and goal completion support through interactive chatbots, reminders, and access to online worksheets; (2) ecological momentary assessment (EMA) logging and visualizations of symptoms and behavior that can be viewed and shared by both patient and clinician; (3) asynchronous therapist coaching through direct messaging; and (4) scaffolded peer communities that collectively respond to moderated engagement prompts to share homework goals and challenges, provide asynchronous peer coaching, and offer general peer support. Therefore, we sought to extend our previous work by developing an app to deliver BA on an ARC platform (referred to as ActivaTeen).

Human-Centered Design for Mental Health and the Discover Design Build Test Model

Human-centered design (HCD) describes methods for learning about people and contexts to develop engaging, usable, and effective products and services (Holtzblatt et al., 2005; Holtzblatt & Beyer, 1997; Norman & Draper, 1987; Zomerdijk & Voss, 2010), as well as design patterns that can be translated from one system or intervention to another (Dearden & Finlay, 2006; Fincher et al., 2003). Intervention developers, implementers, and designers have increasingly applied HCD methods to support the development and refinement of a range of mental health services and systems (e.g., Alexopoulos et al., 2015; Graham et al., 2019; Lyon, Dopp, et al., 2020; Lyon & Bruns, 2019; Mohr et al., 2013, 2017; Murnane et al., 2018; Rohani et al., 2020; J. Suh et al., 2020). For example, the HCD technique of journey mapping (understanding a user's current or intended path through use of a service and/or products) has been used to support staff and clinician understanding of patient experiences (Percival & McGregor, 2016), and Lyon et al. (2021) applied the HCD technique of cognitive walkthroughs (having domain experts go through a prototype system according to scripted scenarios to identify barriers to effective use) to identify implementation barriers that could be addressed before implementation. HCD methods aim to improve feasibility, acceptability, and effectiveness by shifting intervention design and implementation away from top-down processes toward methods that engage the patients and clinicians who will be receiving and delivering the interventions. These methods aim to tailor interventions to the people and contexts in which they will be used and might be led by design researchers or practitioners or may be even more participatory (e.g., Björling & Rose, 2019; Wadley et al., 2013).

The NIMH-funded ALACRITY Center at the University of Washington supported the work discussed in this paper and seeks to support and further develop the use of HCD for mental health interventions and implementations. This center adapted the HCD process and aspects of implementation science into a Discover - Design - Build - Test (DDBT) model (Lyon et al., 2019) that scaffolds the application of HCD to mental health services (Figure 1). While the overall model is linear, results in any phase can indicate a need to return to an earlier phase.

Figure 1: Summary of the overall DDBT process and the methods our study used at each phase of the DDBT approach for designing mental health interventions.

Using DDBT in Designing ActivaTeen

We organized our research agenda, and the remainder of this paper, according to the DDBT model. We review lessons learned from the Discover and Design phases, as well as the Building and Testing of preliminary interactive prototypes of ActivaTeen to assess the feasibility and acceptability of our design concepts. In the Discover phase, we invited teens with depression and low mood and mental health clinicians to begin using an ARC platform (Slack) and gave them design activities to understand how components of BA (McCauley et al., 2016) and ARC could be used with these stakeholder groups. In the Design and Build phases, we designed and built an ActivaTeen prototype that included a chatbot, digital homework completion, and ecological momentary assessment (EMA) logging and visualizations of mood and behavior, and we conducted expert review and usability evaluation to iterate on the designs and ensure they were suitable for testing. Finally, in the Test phase, we evaluated our prototypes of ActivaTeen with a new set of teens experiencing depression or low mood and mental health clinicians to understand whether the overall approach was acceptable and feasible and to conduct a more holistic evaluation of the user experience. We leave developing and testing a scalable, robust ActivaTeen platform for future work.

This paper presents a case study reviewing our experience and lessons learned from applying the DDBT model's methods to designing a technology-based mental health intervention (Figure 1). We describe our use of the DDBT framework in practice by summarizing the methods and results of our work within each phase of the DDBT model (see Table 1 for a broad overview). We end by reflecting on our findings related to using ARC as a new platform for mental health intervention delivery and on how DDBT allowed us to identify and shape the intervention's possibilities in collaboration with teens diagnosed with depression and their clinicians.

Table 1:

Summary of the design of our prototype through the different phases of DDBT, and takeaways from the empirical work with teens, clinicians, and experts that informed each subsequent phase

Module Discover Phase Design & Build Phase Test Phase

BA Psychoeducational Video Prototype: NIMH BA video and examples of animated educational videos Prototype: Storyboard with hand-drawn panels outlining the video script and visuals, reviewed by experts and turned into an animated video Prototype: Animated BA video showing the journey of a teen experiencing depression and BA treatment
Takeaways:
- Make the video interactive
- Use tangible examples for teens
- Use animations
Takeaways:
- Minor edits by experts (child clinical psychologists) to the content and story of the storyboard and video to align with BA principles
Takeaways:
- Add content explaining the upcoming modules as well as what to expect in the coming weeks

Module 1: Mood and Activity Logging Prototype: Mock-up of mood and activity logging in survey format Prototype: ActivaTeen programmed to send an automatic prompt to log activity, emotion, and intensity every 3 hours between 9 am and 9 pm for 3 weeks. The app provided a summary for each log. Expert designers were asked to reflect on their summaries and what they would like to learn from these data. Prototype: ActivaTeen sends an automatic prompt to log activity, emotion, intensity, and journaling context every 3–6 hours between 9 am and 9 pm. Visualization of data provided to teen and clinician. Direct message between teen and clinician two weeks after logging to discuss data.
Takeaways:
- Add reminders to the tracking prompts
- Ensure in-person review with a clinician
- Allow for data review to check on progress
- Consider the number of notifications to avoid teens ignoring prompts
Takeaways:
- Provide more context for elaborating on emotion through a free-text journaling box
- Frame the prompt to allow for tracking of multiple activities
- Increase the interval between prompts (max 3 daily)
- Consider randomly sending prompts through the day to get better sampling of activity/mood
Takeaways:
- Account for difficulty in identifying negative emotions by setting expectations for logging a range of emotions, not just positive ones
- Show real-time visualizations for reflection to surface what teens understand from these data and to motivate logging
- Reduce guilt and an overwhelming backlog when teens want to return to logging after a lapse
- Provide space for both logging in the moment and logging later
- Continue to allow for clinician-teen reflection on data

Module 2: Upward and Downward Spiral Prototype: Survey format with prompts for reflecting on upward and downward spirals in mood and action Prototype: ActivaTeen programmed with an interactive chatbot with prompts Prototype: Interactive chatbot with prompts
Takeaways:
- Teens noted the survey was clear and simple and liked that it supported reflection on situation-mood-activity spirals
Takeaways:
- Add a prompt to think about what they would want to do next after experiencing an emotion
Takeaways:
- No changes: teens and clinicians continued to note that the prompts supported reflection

Module 3: Plan SMART Goals Prototype: Survey format for individually planning a SMART goal and mini-steps and setting reminders. Illustrated screenshot mock-up of chatbot format. Evaluated a direct messaging format in which participants paired up with a peer and a researcher moderator. Participants were asked to share SMART goals and provide feedback on each other's goals. Prototype: ActivaTeen programmed with an interactive chatbot that allowed teens to generate a SMART goal and mini-steps. Included suggestion cards to help generate SMART goals. Prototype: Added a visual example of a SMART goal that defined the SMART acronym. Added a prompt to direct message with clinicians the week after planning their SMART goal and mini-steps
Takeaways:
- Preference for the interactive chatbot format over the survey
- Consider what to do when direct messaging falls through because one teen in the pairing does not respond
Takeaways:
- Include a visual explanation of what a SMART goal is, with an example
- Include more support when choosing and setting SMART goals
Takeaways:
- Clinicians said the quality of SMART goals would be improved by synchronous support to help teens plan SMART goals ahead of time instead of after
- Most teens found it helpful to plan and break down goals to reduce avoidance
- Teens did not find the Slack reminder tool useful
- Some teens used sticky notes to remind themselves of mini-steps; others did not
- Mini-steps should include some tracking of progress as each step is completed and should enable setting bulk reminders for regular activities
- All teens found messaging with clinicians helpful

Module 4: Overcoming Barriers to SMART Goals Prototype: Survey format for assessing barriers. Illustrated mock-up of chatbot format with prompts to overcome barriers. Evaluated direct messaging with the paired peer from week 8 to allow for follow-up and sharing of barriers. Prototype: ActivaTeen programmed with an interactive chatbot to help identify internal and external barriers, suggest strategies to overcome them, and re-plan the SMART goal to account for overcoming barriers Prototype: No changes from the Design/Build phase
Takeaways:
- Preference for the interactive chatbot format over the survey
- Moderator and researchers provided direct message support if a teen's peer partner did not engage
Takeaways:
- No changes suggested by expert evaluators
Takeaways:
- Some teens found it difficult to identify barriers
- Certain external barriers outside of teens' control were unpredictable and hard to plan for, and the chatbot did not help (e.g., waiting to hear back from a friend or for medication to take effect)

Discover

In the Discover phase, we sought to understand the needs of our target users, worked closely within our interdisciplinary team of psychologists and HCD experts, and involved target users in obtaining feedback on the early design of our ActivaTeen activities and prototype. This section reviews our Discover phase work as relevant to the scope of the paper in explaining the aspects that informed the DDBT process. In-depth details on methods and results are available in our prior publication (Bhattacharya et al., 2021).

Method

We enrolled target users, including teens experiencing depression and adolescent mental health clinicians, in an ARC-based research study in order to understand their needs around depression management and obtain their feedback on low-fidelity prototypes for ActivaTeen (Bhattacharya et al., 2020). Specifically, we conducted two separate ARC studies on Slack across 10 weeks with ten mental health clinicians (31–50 years of age, including therapists, primary care providers, and school counselors) who work with teens with depression, and eight teen participants (between 15 and 19 years of age) who were currently experiencing mild-to-moderate symptoms of depression based on the PHQ-8 (Richardson et al., 2010; Wu et al., 2019). Please see Bhattacharya et al. (2021), Tables 1 and 2, for further demographic details. We gave them weekly prompts sent to a Slack channel to evaluate target user feedback on several topics, including (1) current technology use in therapy, (2) preferences on how to use ARC with therapy, and (3) feedback on prototypes for adapting paper-based BA content and homework to a digital format for delivery on an ARC platform (see Bhattacharya et al., 2021 for examples of prompts). Users were directed to respond to the prompt in the channel but also had the option to direct message study team members if they had questions or thoughts they wanted to share privately. The team created prompts prior to study start but adapted the planned prompt each week based on target user responses from the previous week. Specifically, our group moderator monitored and summarized all participant responses as they came in, and our research team met weekly to discuss the prompts and target user feedback to inform adaptations to the upcoming week's prompt. In developing and adapting these prompts, we drew on our team's clinician investigator's experience and knowledge of areas for potential improvement in the study design, activity design, and prototype design.

Table 2.

Overview of the alignment between the BA protocol content (McCauley et al., 2016) and ActivaTeen app delivered via an ARC platform

Sessions | BA Protocol Modules and Content | ActivaTeen Support via ARC
All | Overarching features | Direct message therapist; prompts to share successes and challenges with the peer community
1–2 | Intro to BA: BA Model of Depression, Activity/Symptom Monitoring | Peer community introductions, BA psychoeducation video, daily EMA mood-activity tracking
3–4 | Activation/Goal-Directed Behavior: Situation-Activity-Mood Connection, Regulating Mood with Activity, Linking Behavior to Short- & Long-term Outcomes, Making the Most of a Good Feeling | ActivaTeen homework support for tracking upward and downward mood spirals; mood-activity EMA tracking
5–8 | Core Skill Building: Problem Solving, Goal Setting, Identifying Internal & External Barriers, Overcoming Avoidance, Using Alternative Coping | ActivaTeen goal-setting, problem-solving, and overcoming-barriers chatbot support; BA goal and mini-step tracking; EMA tracking
9–11 | Practice/Application: Review and Integrate Content, Practice Skills | ActivaTeen goal-setting, problem-solving, and overcoming-barriers chatbot support; goal, mini-step, and mood EMA tracking
12 | Relapse Prevention & Termination: Strategies for Preventing Relapse, Identify Strengths & Challenges for Moving Forward | Continued access to ActivaTeen chatbots, therapist messaging, and peer community

BA content and homework prototypes included mood and activity logging and chatbots for planning goals to improve mood and for overcoming barriers, including avoidance. We created low-fidelity prototypes to show how some BA activities might be adapted to an interactive asynchronous remote platform and showed them to our clinician and teen participants to obtain feedback. Prototypes included low-fidelity drawings or online survey-based mock-ups and screenshots so that the research team could easily incorporate feedback and iterate on the design before investing resources into building the tool. Teens and clinicians did not interact with each other in the Discover phase. Teen participants interacted with each other by commenting on other teens' posts and using emoji reactions to each other's posts. There were also structured activities that involved moderated direct message discussions between teens (see Bhattacharya et al., 2021 for more details on weekly activities). After the online activities ended, 9 clinicians and 5 teens completed interviews with us about their experience in the study and completed exit surveys (Bhattacharya et al., 2021). Participants were compensated $5 per week (teens), $10 per week (clinicians), and $20 for completing the exit interviews.

Data Analysis

Our study data included Slack transcripts of discussions among all participants, including polls and DMs, interview transcripts, and survey responses. Researchers inductively analyzed qualitative data (participants' weekly responses on Slack and the exit interview transcripts). Three researchers initially read and discussed four interviews with the lead coder, then generated a preliminary codebook based on that discussion. Three researchers then used this codebook to code the initial four interviews, discussed, and revised the codebook. After discussion, the researchers divided the remaining interviews and data from weekly activities to code independently. Because the weekly activities sometimes reflected different themes than the interviews, the coders periodically discussed additional themes and updated the codebook, recoding the data as needed. Because the researchers started by coding the exit interviews, which spanned experiences in the entire study, changes from the initial codebook were minimal. We quantitatively evaluated target user feedback using the Acceptability, Intervention Appropriateness, and Feasibility of Intervention Measure (Weiner et al., 2017) and six constructs of the User Burden Scale (Suh et al., 2016): physical, mental & emotional, time & social, financial, difficulty of use, and privacy.

Summary of Results

Across both qualitative and quantitative measures, we found that ARC is feasible for engaging teens and clinicians in the early phases of the design process and obtaining their feedback on incorporating technical functionalities and BA concepts into the prototype. The key themes (see Table 1 for more details) we found from the Discover phase included: (1) clinicians and teens perceived benefits in using the ARC platform to support logging mood and activities and engaging with BA content; (2) both teen and clinician participants wanted interactive online support as a supplement to (versus a replacement for) in-person therapy; (3) clinicians highlighted concerns about managing boundaries around expectations of constant asynchronous access and crisis support; and (4) both teens and clinicians raised the importance of privacy and data security (Bhattacharya et al., 2021). To summarize the design insights gained from this study, teens did not want a chatbot to replace or emulate a human but envisioned its function as an interactive tool for self-reflection and completion of therapy goals.

Design and Build

The Design and Build phases of the DDBT model are typically intertwined in an iterative process until usability scores and/or expert consensus indicate that the prototype is suitable for the testing phase. In our process, we applied findings from our Discover phase to design and develop the ActivaTeen prototype (Table 1), evaluated its usability with HCD and clinical psychology experts, and iteratively refined aspects of our prototype before testing it with target users in the next phase.

Methods

Building on Discover phase results, which highlighted the need for a prototype designed to supplement face-to-face therapy between teens and clinicians, we further refined the ARC method and identified the technical and usability requirements for translating BA components into an ARC format.

Video for Psychoeducation on BA:

Clinicians and teens highlighted a need for interactive training and education components for individuals new to BA (Bhattacharya, 2020). To introduce BA to teens, we created an animated video (Nagar & Kemp, 2020) in the style of real-time whiteboard drawings. The content for this video was prepared through iterative storyboarding led by our teen volunteer. We obtained feedback from two clinicians who are experts in BA, and the visual design and video production were outsourced to a freelance undergraduate designer.

ActivaTeen:

Based on feedback from teens in the Discover phase, we designed an application on the Slack platform, called ActivaTeen, in the format of an interactive smart diary. We did not aim to simulate human conversation through ActivaTeen but intended for teens to reflect through interactive prompts sent by the bot. This expectation was set for the teens at the start of the study. We incorporated four main modules of BA into ActivaTeen: (1) logging prompts and data visualizations for activity and mood tracking, (2) an interactive, conversational-style chatbot module to prompt teens to reflect on upward and downward spirals in mood and behavior, (3) an interactive module for planning a SMART (Specific, Measurable, Appealing, Realistic, Time-bound) goal and mini-steps, and (4) an interactive, conversational-style chatbot module to reflect on and problem-solve barriers to SMART goals.

We built the backend of the application (i.e., the parts of the system that retrieve, store, and manipulate data, but with which the user does not directly interact) using the Slack API, JavaScript, and Node.js, building the modular functions for each BA module on the open-source Botkit library. Our approach of using an existing platform allowed us to prototype key features of ActivaTeen while relying on the platform's implementation of necessary-but-not-novel features, such as user authentication, profiles, and discussion channels, a form of piggyback prototyping (Grevet & Gilbert, 2015). We parsed and cleaned the JSON data using Python scripts and prepared the visualizations summarizing teens' logging data using Tableau.
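The stack described above amounts to a thin layer of glue code around the platform. As a rough illustration (not our production code), the sketch below shows how a single module-style prompt could be wired up with Botkit v4 and its Slack adapter; the environment variable names, trigger word, and prompt text are illustrative assumptions.

```javascript
// Minimal sketch (not the ActivaTeen source): one module-style prompt
// wired up with Botkit v4 and its Slack adapter. Env var names, the
// trigger word, and the prompt text are illustrative assumptions.
const { Botkit } = require('botkit');
const { SlackAdapter } = require('botbuilder-adapter-slack');

const adapter = new SlackAdapter({
  botToken: process.env.SLACK_BOT_TOKEN,
  clientSigningSecret: process.env.SLACK_SIGNING_SECRET,
});

const controller = new Botkit({ adapter });

// Mood-activity logging module: when a teen sends "log" in a direct
// message, reply with the module's logging prompt.
controller.hears('log', 'direct_message', async (bot, message) => {
  await bot.reply(
    message,
    'What activity were you doing in the past 3–6 hours, and how were you feeling?'
  );
});
```

Because the bot runs on Slack itself, authentication, profiles, and channels come for free, which is the piggyback-prototyping benefit noted above.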

Preliminary Testing with Design Students:

In February 2020, we invited one undergraduate and three Master's students in Human Centered Design & Engineering, as users with design expertise, to conduct preliminary user testing and cognitive walkthroughs of the ActivaTeen Slack app for 7 weeks (Doherty et al., 2010). Each week, we held an in-person session in which two researchers provided the prompts and observed the designers as they tried the respective activity on Slack for 20 minutes. After all students finished responding, researchers asked them to elaborate on what went well, what was challenging, and why, and the group discussed and brainstormed how to resolve the challenges within the limits of technical feasibility. Changes were prioritized based on what the group considered important, and we iterated on the prototype weekly to get feedback on the changes until everyone was satisfied. Design students were also asked to log their mood and activity for 3 weeks while providing intermediate feedback on design changes each week. Key changes were made to the interface, including adding a freeform text box for users to provide context for logged activities and mood, support for understanding psychoeducational content, and visual representations of mood logging data. We then conducted a 40-minute focus group to discuss the designers' experience, challenges, and changes they wanted in the design. We also asked our volunteer teen, two additional teens, and a clinician from prior studies to remotely test and provide feedback on the four ActivaTeen modules via a survey or email. As is standard practice when involving expert evaluators in the design and build process, we did not obtain demographic information on expert evaluators (Doherty et al., 2010).

Summary of Results

The final design and functionality of the modules are explained below. Table 2 shows the alignment between ActivaTeen modules and the BA protocol content.

  1. The mood-activity logging module prompts teens to log an activity from the past 3–6 hours, the type of activity, how they were feeling during the activity, and the intensity of the feeling (Figure 2a). The nine types of activities included everyday activities, hobby, physical activity, relaxation, school, socializing, work, and other. After week 6, when teens planned a SMART goal, a "SMART goal" category was added. Options for feelings were based on the circumplex model of emotions (Russell, 1980), which included positive (content, relaxed, focused, happy, excited), neutral, and negative (angry, overwhelmed, depressed, anxious, tired, sad, bored) valences. Teens could log the intensity of their feeling from 1 (least intense) to 10 (most intense). Based on feedback from expert designers, we added a text box for teens to freely journal any additional context relevant to the activity or how they were feeling (the data captured by this module is sketched after this list).

  2. The reflecting on upward and downward mood spirals module prompts teens to think about a time in the week when they noticed their mood was “down” and a time it was “up”. In a conversational style, the chatbot app prompts the teens to respond to: what happened, how they felt about it, and what they did as a result of feeling that way.

  3. The planning SMART goals module presents teens with a digital form to support setting a SMART goal and mini-steps. To support teens with generating ideas for goals, we also created examples of SMART goals and mini-steps and added suggestion cards for 6 types of activities: hobby, everyday activities or chores, socializing, relaxation, do less of something, and physical activity (Figure 2b).

  4. The overcoming barriers module uses an interactive chatbot to ask teens what barrier(s) they encountered in completing their mini-steps (Figure 2c), educate teens about internal and external types of barriers with examples, and ask them to plan how they would overcome these barriers. Suggestion cards were designed for both internal and external barriers with the intent of helping teens brainstorm ideas and adapt them to what was feasible in their context. Teens could then re-plan their SMART goal by being redirected to module 3 with revised mini-steps for overcoming barriers.
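To make the logging module concrete, the sketch below restates the fields described in item 1 as a simple data structure. The field names and sample entry are illustrative only; they are not the deployed schema.

```javascript
// Illustrative sketch of the data one mood-activity log captures,
// per the module description above. Field names and the sample entry
// are hypothetical, not the deployed ActivaTeen schema.
const ACTIVITY_TYPES = [
  'everyday activities', 'hobby', 'physical activity', 'relaxation',
  'school', 'socializing', 'work', 'other',
  'SMART goal', // category added after week 6
];

// Feeling options grouped by valence, per the circumplex model (Russell, 1980).
const FEELINGS = {
  positive: ['content', 'relaxed', 'focused', 'happy', 'excited'],
  neutral: ['neutral'],
  negative: ['angry', 'overwhelmed', 'depressed', 'anxious', 'tired', 'sad', 'bored'],
};

// Example entry a teen might submit in response to a logging prompt.
const sampleLog = {
  pseudonym: 'blue-heron',           // hypothetical; participants chose pseudonyms
  activity: 'walked the dog',
  activityType: 'physical activity',
  feeling: 'relaxed',
  intensity: 7,                      // 1 = least intense, 10 = most intense
  journal: 'Nice weather; felt calmer afterwards.', // free-text context box
  loggedAt: new Date().toISOString(),
};
```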

Figure 2: Four modules of BA supported by ActivaTeen: (a) logging activity: teens log feeling, intensity, and journaling context, (b) planning SMART goals and mini-steps, and (c) overcoming barriers to SMART goals (author name used as the username for anonymity)

The key outcome of this stage was the implemented and refined prototype for the Test phase. Through this development and refinement, we also identified ideas for improvement and new features. Changes included adding a space for free reflection (the context field in the EMA logs), framing the interactive app as a smart diary instead of a chatbot, and revising the framing of the weekly activities and ActivaTeen prompts. We also identified questions that participants would like to answer based on their logs, which helped us develop visualizations to support reflection. We stopped making changes to the prototype when the expert designers, the design team, and teen volunteers agreed that all identified concerns that could interfere with an informative Test phase had been addressed. This qualitative approach and feedback were valuable for rapid iteration and testing during the early stages of prototyping, allowing us to prioritize changes before testing with teen and clinician participants. These reflections supported the team in deciding to proceed to the next phase.

Test

Once our Design and Build phase prototype met the requirements identified in the Discover phase, we moved onto the Test phase to assess the feasibility and acceptability of the prototype with teens and clinicians. In this phase, we sought to inform future planned clinical trials of the proposed system.

Methods

We started the Test phase of the study during the onset of the COVID-19 pandemic in the US. To recruit teens between February and March 2020, we advertised our study in online groups and sent messages and flyers to clinicians and to a mailing list of parents of teenagers. We directed interested teen participants to fill out a screener survey with contact information and enrolled teens experiencing mild to moderately severe depression symptoms based on the PHQ-8 questionnaire (Richardson et al., 2010; Wu et al., 2019). If a teen was experiencing moderately severe depression, we also required that they have a current therapist. We recruited clinicians through the Seattle Children's Hospital Mood and Anxiety Disorders outpatient clinic (the child psychiatry clinic connected to the University of Washington) via emailed flyers. Participants were compensated $10 for completing each week's activity and $20 for completing the exit interviews. All study activities were conducted between May and August 2020.

Eleven teens (Table 3) consented and joined Slack. We present the study recruitment and retention flow in Figure 3 and summarize the demographic information of teens and clinicians in Tables 3 and 4, respectively. Clinicians and teens dropped out of the study due to personal life challenges, transitions, and difficulties with keeping up (which may have been aggravated by the pandemic). Please see Figure 3 and Table 5 for a count of participants who participated each week.

Table 3:

Summary of teen participants’ demographic information in study 3 (n=9)

Age 13–19 years (mean= 16 years)
Gender Female (n=4), Male (n=4), Non-binary (n=1)
Education level Some high school (n=5), Some college (n=1), No response (n=3)
Race White (n=5), More than one race (unspecified) (n=1), no response (n=3)
Therapy experience Currently seeing therapist (n=5), past therapy (n=2), no previous therapy (n=5)
Pre-study PHQ M= 14.44, SD= 3.94, range= 6–19

Figure 3: Study recruitment and retention flowchart.

Table 4:

Summary of clinician participants’ demographic information in study 3 (n=2)

Age 29–41 years
Gender Female (n=3)
Number of years of clinical experience treating teens 5–12 years

Table 5:

Number of activities completed by teens in each week in Study 3 (total number of teens who joined the Slack group n=11, number of teens who continued after week 1 n=9)

Activity type in Test Phase Number of teens who completed
Week 1: Introductions and Ice Breakers 10
Week 2: Review of BA psychoeducation video & Begin Mood-Activity Logging 6
Week 3: Upward/Downward Spiral Chatbot 6
Week 4: Clinician-Teen DM to review logging 9
Week 5: Reflection on Clinician-Teen DM 6
Week 6: SMART Goal Planning 7
Week 7: Sharing on group about SMART goal 5
Week 7: Peer feedback on SMART goals 0
Week 8: Clinician-Teen DM SMART goal barriers 7
Week 8: Barriers to SMART goals 6
Mood and activity Logging 9
Interviews 7
End of study survey 5
Note: DM = direct message

We conducted the online activities for 8 weeks (Table 5). We designed each week's activity to guide teens through BA content and corresponding supports in our ARC, and to learn about their experiences with those supports. Similar to our Discover phase process, we discussed progress and preliminary data each week to finalize the next week's activity for clinicians and teens. The content for each module is explained in Appendix A.

Online group activities:

In Weeks 1 and 2, clinicians and teens were enrolled in the same Slack group. They were given a short tutorial on how to use Slack features, asked to introduce themselves with ice-breaker questions, and asked for feedback on the BA psychoeducation video. The clinician and teen groups were separated in week 3 because we wanted the teens to be able to reflect of their own accord, without the influence of clinicians.

Both teens and clinicians were asked to track their mood and activities in week 3 and reflect on their experience in week 4 in their respective groups. The teens additionally tracked their activities and mood for 5 weeks (weeks 3, 4, 6, 7, and 8). We asked teens to log mood and activity on ActivaTeen in response to 112 reminders over 5 weeks (35 days) of the study, sent at 3–6-hour intervals between 9 am and 9 pm. We changed the timing and frequency of the reminders based on preliminary analysis of the times at which participants logged and on participant feedback. Participants were asked to let us know at any time if they wanted us to change the frequency of the reminders or stop them.
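As a rough reconstruction of this cadence (not our actual scheduler), the sketch below generates one day's prompt times with random 3–6-hour gaps between 9 am and 9 pm; at this rate, 35 days of logging yields a prompt count on the order of the 112 reminders we sent.

```javascript
// Reconstruction of the stated reminder cadence, for illustration only:
// prompts at random 3–6-hour intervals between 9 am and 9 pm.
function dailyPromptTimes() {
  const times = [];
  let minutes = 9 * 60;                                  // first prompt at 9:00 am
  while (minutes <= 21 * 60) {                           // no prompts after 9:00 pm
    const h = Math.floor(minutes / 60);
    const m = String(minutes % 60).padStart(2, '0');
    times.push(`${h}:${m}`);
    minutes += Math.round((3 + Math.random() * 3) * 60); // next prompt 3–6 hours later
  }
  return times;
}

console.log(dailyPromptTimes()); // e.g., [ '9:00', '13:24', '17:02', '20:41' ]
```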

Each ActivaTeen module was interleaved with one week of reflection and feedback on the module. Teens completed the ActivaTeen module on upward/downward spirals in week 3, the SMART goal planning module in week 6, and the overcoming barriers module in week 8. The clinicians also piloted these activities and provided feedback. In weeks 4 and 8, we created a direct message (DM) group in which a teen, a clinician, and one researcher (present as a moderator) interacted with each other in mock therapy scenarios. For example, the clinicians were asked to initiate the conversation and were provided example prompts to help teens reflect on their logging data in week 4 and to review SMART goals, brainstorm how to overcome barriers, and review data in week 6. Across all weeks, teen participants interacted with each other by commenting on other teens' posts and using emoji reactions to each other's posts. The creation of a clinician-facing ActivaTeen "dashboard" to provide data visualizations and a summary of patient homework completion is a next step for our work, so summaries and visualizations were created by research staff and sent to clinicians via direct message on the ActivaTeen platform.

Exit Interviews and Surveys:

Nine interviews (7 teens, 2 clinicians) were conducted. Eight took place on Zoom, where participants were given the choice to turn off their video. One teen (T31) chose to respond via text chat only, with her audio and video turned off, while the interviewer asked questions verbally and remained available on video. The ninth interview was conducted over the phone with a clinician. Interviews lasted between 30 and 60 minutes. All interviews were transcribed automatically via Zoom, and we read through and edited each transcript for errors while watching the corresponding video.

For each interview, we prepared a slide deck with screenshots of the four main modules of ActivaTeen, summary graphs for each teen participant, and SMART goals (as available). The graphs included representations of the valence of emotions across categories of activities, time, and frequency of logging, and a summary table of specific activities across time and valence (Appendix B). For the clinician interviews, we used graphs from two teens as examples. During each interview, we shared our screen to co-view the slide deck, including screenshots of the ActivaTeen user interface (UI), and asked participants for feedback on the design. Seven participants (5 teens, 2 clinicians) also completed a post-study PHQ-8 survey (Richardson et al., 2010; Wu et al., 2019).

Ethical Considerations:

All phases of this study are part of a larger project for which we obtained emergency contact information from all participants; parental permission was not formally obtained. During recruitment in our prior work (Bhattacharya et al., 2019), we found that some interested teens were unable to participate because their parental contact was not responsive or because teens were unwilling to provide a parent's contact information. To reduce such barriers to teen participation, we consulted with our university's Institutional Review Board (IRB) and obtained a complete waiver of parental permission. In the Test phase, the majority of the teens were nonetheless recruited through parents. All participants chose anonymous pseudonyms, and no identifiable information was shared in the groups or direct messages. Teens were informed during the consent process that clinicians would be in the Slack group and would be able to see their mood and activity logging data (under their pseudonym). Given the joint teen and clinician involvement, researchers made sure to clarify to participants that they were enrolling in a feasibility, acceptability, and usability study, not a treatment study. Adversity mitigation protocols from prior studies remained in place, and emergency contact information was obtained for all participants. Three researchers moderated the group and responded to participants within 24 hours. There were no adverse or crisis events and no need to contact emergency contacts throughout any phase of our project. This study was approved as minimal risk by the University's Human Subjects Division.

Data Analysis

The coders first read and inductively coded a portion of the data (four teen interview transcripts) line by line and wrote their codes in a cumulative document. The coders then met for weekly discussions to prepare a codebook. We distributed the weekly Slack data and interview transcripts among three coders, and the remainder of the data was coded using the final codebook (Appendix C). All coders read and shared memos with each other and met for discussions to resolve discrepancies in coding. We conducted affinity modeling (Holtzblatt et al., 2004), an open-ended process by which researchers iteratively discuss and cluster data, to identify emergent themes. Three main themes emerged from our analysis. The research team further iterated on these themes through discussion and while writing. Below we summarize the results, focusing on the insights we gained for iterating on our design; a detailed empirical analysis can be found in Bhattacharya (2020, Chapter 5).

To provide a descriptive characterization of our sample, we calculated average PHQ-8 scores pre- and post-study. We calculated an average score for each item of the Acceptability, Intervention Appropriateness, and Feasibility of Intervention Measure for the clinician group and teen group, respectively. For the User Burden Scale (Suh et al., 2016), we computed average scores for the teen and clinician groups separately across each of the 6 constructs: physical, mental & emotional, time & social, financial, difficulty of use, and privacy. On the Intervention Usability Scale (IUS), items are rated on a Likert scale from 0 (strongly disagree) to 4 (strongly agree), with half of the items reverse-scored (Lyon, Pullmann, et al., 2020). A total score is normally calculated by multiplying the sum of these scores by 2.78 (range: 0–100). Drawing a parallel from scoring the System Usability Scale (SUS), a score above 68 is considered above average and anything below 68 is below average (Lewis, 2018).
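For concreteness, this scoring rule can be expressed as a short function. The ×2.78 multiplier implies nine 0–4 items (36 × 2.78 ≈ 100); treating the even-positioned items as the reverse-scored half is an assumption for illustration, and the actual scoring key is given in Lyon, Pullmann, et al. (2020).

```javascript
// Sketch of the IUS scoring rule described above: nine items rated 0–4,
// half reverse-scored, total = adjusted sum * 2.78 (range 0–100).
// Which items are reverse-scored is assumed here for illustration.
function scoreIUS(responses) {
  if (responses.length !== 9) throw new Error('expected 9 item ratings (0–4)');
  const adjusted = responses.map((r, i) => (i % 2 === 0 ? r : 4 - r));
  const total = adjusted.reduce((sum, r) => sum + r, 0) * 2.78;
  return { total, aboveAverage: total > 68 }; // 68 cutoff borrowed from SUS norms
}

// Best possible ratings (4 on positively worded items, 0 on reverse-scored items):
console.log(scoreIUS([4, 0, 4, 0, 4, 0, 4, 0, 4])); // { total: 100.08, aboveAverage: true }
```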

Summary of Results

ActivaTeen was found to be feasible, usable, and acceptable across qualitative and quantitative measures. Specifically, three main themes emerged from our qualitative analysis of the weekly feedback and interview data based on our goals. Based on our empirical analysis, participants reported that ActivaTeen supported teens in moving from avoidance to action by (1) providing structure for data collection and reflection on the self-tracked mood-behavior data to better set and follow through on meaningful goals; in particular, the data collected from BA homework modules and ActivaTeen logging, along with visualizations of mood and behavior that were viewed and shared by both patient and clinician, supported use of and engagement with BA content; (2) scaffolding asynchronous interactions with clinicians through direct messaging; and (3) scaffolding peer communities to share insights, goals, and challenges and to provide general peer support (Bhattacharya, 2020).

Due to our sample size, quantitative data were collected with the aim of describing our sample and providing pilot feasibility, usability, and acceptability data. All teens engaged in the mood-activity logging module, with the total number of logs ranging from 3 to 125 in response to 112 logging prompts (M= 52.13, SD= 45.53) and the number of days logged ranging from 3 to 37 (M= 12.71, SD= 10.34) across 35 days of logging (two teens continued to log after reminders were stopped). Five teens and two clinicians responded to the end-of-study survey with the usability metrics. On average, teen participants reported depression scores in the moderate range pre-study (M= 14.44, SD= 3.94) and the mild range post-study (M= 9.60, SD= 6.27). Overall, participants indicated low burden and high acceptability/feasibility for the online intervention. On the User Burden Scale (0: not at all burdensome, 4: very burdensome) (Suh et al., 2016), participants scored an average of 0.6 (clinicians) and 0.6 (teens) on difficulty of using Slack and 0–0.2 on all other types of burden. Target users reported high acceptability, intervention appropriateness, and feasibility of intervention (1: Strongly Disagree, 5: Strongly Agree) (Weiner et al., 2017), with average scores between 3.4 and 3.5 for teens and 3.6 and 4 for clinicians. The mean IUS score was > 68 (above average; Lewis, 2018; Lyon, Pullmann, et al., 2020).

Discussion

In our application of DDBT, we combined the clinical and human-centered design expertise of our research team with teens' and clinicians' experiences to understand the design needs for successful use of ARC-supported BA (Discover phase). This also facilitated the generation of ideas about what additional resources could support teens and clinicians in their use of BA (a combination of Discover and Design) and the iterative building and testing of the advanced prototype of our ARC app, ActivaTeen. Below, we reflect on design considerations gained from this work, applications of DDBT to innovate on mental health care approaches, and future directions.

Reflection on Design Considerations

Our work led to several general design considerations that may also apply to the design of other digitally delivered EBPIs. Teens and clinicians reported that ActivaTeen's design supported teens in reducing avoidance by providing structure through logging and by adjusting their perception of control through planning SMART goals and breaking goals down into smaller tasks. Using reminders and planning tools, technology can intervene in the avoidant thought process more directly and in situ, with psychoeducation and reminders for action supporting teens in breaking down activities into manageable steps. Further, technologies can help teens keep track of progress on their mini-steps and provide intermediate rewards for small achievements.

Understanding one's behavior and mood patterns through reflection is crucial for teens, who are undergoing psychosocial transitions and developing routines and habits that may last a lifetime (Galván, 2017; McCauley, Gudmundsen, et al., 2016). The levels of reflection demonstrated by teens in this study ranged from merely describing what they noticed in the data, to elaborating on why they observed certain patterns in their routines and making connections, to ultimately taking transformative action to change their activities and plan goals (Baumer, 2015; Fleck & Fitzpatrick, 2010; see Appendix C). For digitally delivered EBPIs, asking participants early on to reflect on why they are logging, and providing intermediate summaries that support potential outcomes, would be helpful practices to revisit intermittently.

As we developed ActivaTeen, we needed to make design changes to meet teens' temporal and structural expectations, such as personalizing the timing and frequency of logging reminders based on participants' routines and the times they felt most comfortable logging. Rohani et al. (2019) recommended providing options not only for in-the-moment logging but also designing to support participants in retroactively logging multiple activities. To enable accurate interpretation and account for recall bias, these logs need to be differentiated by visual indicators showing which data were logged in the moment versus logged later. Teens also reported using non-digital tools such as sticky notes or retaining the information without reminders. It is important to identify, and be inclusive of, what meaningful engagement means for the teen, even when it does not involve interacting with the technology, and what engagement would mean for a therapist-client interaction in a particular evidence-based practice. For example, for ActivaTeen we conceptualize meaningful engagement as (1) compliance or adherence, that is, whether participants act on what the treatment recommends, and (2) a proxy for whether participants can receive support for the intervention in digital form when they need it in the absence of a therapist.

Consistent with research showing that the inclusion of even minimal therapist contact improves outcomes of digitally delivered treatments (Grist et al., 2019), both teens and clinicians wanted to preserve the human connection with a clinician, online or in person. Though teens preferred to have asynchronous direct messaging in addition to reviewing the data during therapy, clinicians spoke about the burden of their caseloads, which did not leave them additional time outside of weekly synchronous therapy sessions. Both teens and clinicians found value in reflecting on the data summaries and graphs and discussing them together. Clinicians wanted to use the data summaries and visualizations to identify activities and contexts that triggered negative emotions and the intensity of those emotions, identify positively reinforcing activities, and plan future goals and recommendations with teen clients during synchronous online or in-person therapy sessions. Research suggests that visualizations summarizing logged data based on personally meaningful criteria support meaningful reflection (Epstein et al., 2016). Relatedly, our findings show that problem-solving and goal-planning discussions between patients and therapists were supported by data visualizations displaying patterns and triggers associated with positive and negative changes in mood. Data visualization tools may also give teens agency by allowing them to share what they think is valuable to their treatment with their clinicians. Using the app as a central repository of psychoeducational information, with examples and suggestion cards, can support teens in further reflection when reviewing and generating ideas for SMART goals.

Our work also highlights the value of including a peer community component in digitally supported EBPIs to potentially improve engagement in and effectiveness of care. Built-in peer support may help maintain user engagement by increasing feelings of social belongingness and decreasing stigma around mental health concerns. Social belonging has been identified as an important component of effective mental health outcomes (Allen et al., 2018; Newman et al., 2007; Nuttman-Shwartz, 2019): it is inversely correlated with behavior problems in adolescents (Newman et al., 2007), provides protection against stress for adolescents (Nuttman-Shwartz, 2019), and may be an essential component in reducing feelings of isolation (Allen et al., 2018). An integrated peer support network also works to combat mental health stigma, a common barrier for adolescents seeking mental health treatment (Chandra & Minkovitz, 2006; Gulliver et al., 2010). Indeed, reductions in mental health stigma have been observed for individuals interacting with other treatment-seeking peers (Pinfold et al., 2005; Wade et al., 2011).

Reflection on Using the DDBT Framework

As we consider the next steps in our research, we anticipate questions that might support further development of the DDBT model and provide guidance to other research teams applying DDBT in their own work. First, while DDBT emphasizes the iteration between the Design and Build phases, the design process is iterative overall. For example, in our work we used Slack because it was a flexible, user-friendly platform for exploring different designs and connecting with participants as we conducted our initial Discover, Design, Build, and Test phases. As we move toward implementing tools that providers can adopt in their day-to-day work, HIPAA regulations and various organizational factors mean that Microsoft Teams will be an ARC platform better suited to our intended use context. This work necessitates backing up and revisiting certain aspects of the design: not every design choice that worked well on Slack is appropriate to copy directly to Microsoft Teams, and so we must redesign some features.

In our Discover and Test phases, the ARC was both a platform for collecting input from participants and a platform for rapidly prototyping design ideas, leading to conflation between aspects of the research study and aspects of the intervention. Integrating ARC as both a research tool for gathering information and a delivery method for BA may be most usable for participants, but it risks conflating our qualitative research with the intervention itself. This is a tradeoff we and others have noted in previous uses of ARC in research (Bhattacharya et al., 2019; Maestre et al., 2018). While we deemed it most beneficial to combine the research platform with the prototyping platform at this stage of the research, future work oriented toward evaluating ActivaTeen's efficacy will need to test the prototype on its own, without the research components intertwined.

Finally, the DDBT framework felt familiar to the members of the research team trained in human centered design, but teams can borrow from and apply other frameworks and methods from human centered design and other fields. Recent work has described the potential of HCD for mental and behavioral health (e.g., Graham et al., 2019; Lyon, Brewer, et al., 2020; Lyon, Dopp, et al., 2020) as well as the alignment between HCD and implementation science (e.g., Dopp et al., 2019, 2020). Others may see similarities to the plan-do-study-act (PDSA) cycles common in quality improvement work (Langley et al., 2009), though DDBT focuses more on initial discovery and more open-ended design phases in pursuit of new solutions, while PDSA focuses more on continuous improvement in the field. Each framework has different emphases and may also support buy-in from different stakeholders based on their prior experiences with that framework.

Limitations and Future Work

Teens with depression are difficult to reach for participation in research. During the Test phase, the onset of the pandemic delayed recruitment: we could no longer reach teens through in-person recruitment methods; participants' schedules and activities changed (e.g., no school, limited or no access to peers, migration to online activities, and cancellation of camps or part-time work); and the coping strategies teens could feasibly use were limited by social distancing. Due to recruitment challenges related to the pilot nature of our project and the COVID-19 pandemic, our sample was skewed toward people who could be reached through researchers' contacts, mailing lists, clinics, and snowball sampling and who were willing to participate during the pandemic. Our sample is not racially or ethnically diverse, which limits the generalizability of our findings. We also could not evaluate usability with real-world clinician-client dyads and could only ask participants to speculate about how this type of guided ARC system would work with their real-life therapists or patients. Recruiting real-world client-therapist pairs and a diverse, representative group of participants is the next step for our work, providing more helpful insights into how a system like this would fit into clinicians' workflows, influence the therapeutic relationship, and address cultural barriers to care.

While both teens and clinicians indicated during our Discover phase that they preferred ARC as a supplement to, rather than a replacement for, therapy, this phase was conducted before the COVID-19 pandemic, and these views may have changed. Indeed, the pandemic was a catalyst for mental health care systems to migrate to online telehealth using teleconferencing tools (e.g., Torous et al., 2020; Torous & Wykes, 2020). Because social distancing requirements had continued for over a year in the United States at the time of writing, we speculate that mental health clinicians and teens may have become more accepting of online technologies for therapy than our data reflect and may have devised creative strategies for compartmentalizing their online time, incorporating contextual information from clients' homes, and using available mental health apps and digital tools. Clinicians might want to continue using these digital tools when they return to in-person therapy, creating more opportunities for investigating hybrid in-person and ARC formats of therapy.

Additional options for integrating ARC into clinical work include using ARC as a platform for the client-clinician dyad alone and/or creating a peer group in which multiple patients at a clinic join the same group (either patients attending group therapy or those seeing individual clinicians at the same clinic). Understanding the feasibility of this in context would require more intensive recruitment efforts, using a HIPAA-compliant system such as Microsoft Teams, and connecting with administrative and technological stakeholders at clinics. Though technology has the potential to increase access, we acknowledge that ARC approaches exclude teens who do not have regular and consistent access to digital tools and internet connectivity. Indeed, a nationally representative sample of teens shows that, while the vast majority of teens in both low- and high-income households own a smartphone, disparities in smartphone ownership persist (74% vs. 89%, respectively) (Rideout & Robb, 2019). Therefore, it is important that efforts to improve access to care also consider approaches that do not rely solely on technology.

In our future work, we aim to evaluate mental health and usability outcomes in a larger-scale RCT comparing synchronous BA therapy with and without the guided ARC enhancement. As part of this work, we plan to further collaborate with clinical stakeholders to examine whether and how the ARC approach increases the feasibility and effectiveness of hybrid synchronous and asynchronous care. It is essential that researchers work with organizational decision-makers and stakeholders in adolescent clinical settings and with industry partners to create innovative solutions and work toward real-world deployment, improving the scalability and accessibility of evidence-based mental health care for teens. Given the varied components of ARC platforms that may improve engagement in skill use, clinical outcomes, and functioning, we also plan to use platform engagement metrics (e.g., time spent with chatbot teaching modules, frequency of EMA logging, frequency of peer and patient-therapist direct messaging) to explore which aspects of ActivaTeen are most associated with improvements in symptoms and outcomes, as sketched below.
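
As a minimal sketch of how such engagement metrics might be derived, the example below aggregates a hypothetical event log into per-user metrics. The event names, schema, and values are illustrative assumptions, not ActivaTeen's actual telemetry.

```python
# A minimal sketch of per-user engagement metrics computed from a
# hypothetical event log; event names and schema are assumptions,
# not ActivaTeen's actual telemetry.
import pandas as pd

events = pd.DataFrame({
    "user": ["t01", "t01", "t01", "t02", "t02"],
    "event": ["chatbot_module", "ema_log", "dm_sent", "ema_log", "dm_sent"],
    # Duration is recorded only for chatbot module sessions in this sketch.
    "minutes": [12.0, None, None, None, None],
})

metrics = pd.DataFrame({
    # Time spent with chatbot teaching modules.
    "chatbot_minutes": events[events["event"] == "chatbot_module"].groupby("user")["minutes"].sum(),
    # Frequency of EMA logging.
    "ema_logs": events[events["event"] == "ema_log"].groupby("user").size(),
    # Frequency of direct messages (peer or patient-therapist).
    "dms_sent": events[events["event"] == "dm_sent"].groupby("user").size(),
}).fillna(0)

print(metrics)
# Per-user metrics like these could then be related to symptom change
# (e.g., pre-post depression scores) in the planned RCT analyses.
```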

Conclusion

Using the DDBT framework, we designed, developed, and studied the feasibility of integrated peer, clinician, and interactive ActivaTeen app support for teens experiencing depression. Our work demonstrated that the DDBT model was a useful frame for understanding target user needs when adapting BA to an ARC format, for building an advanced prototype, and for empirically understanding how engagement with this prototype varied across teens and mental health clinicians. Our work provides a platform for future research to investigate the use of ARC-supported BA, and other EBPIs, in real-world clinical contexts, with the ultimate aim of improving access to, engagement in, and scalability of mental health care for teens.

Supplementary Material

Highlights.

  • Digital platforms may improve teen access to and engagement in mental health care.

  • This is a case study of the Discover-Design-Build-Test (DDBT) implementation model.

  • We applied DDBT to adapt a treatment to an Asynchronous Remote Communities platform.

  • The DDBT model supported our understanding of prototype requirements.

  • Future work will test a robust digital platform to improve access and engagement in care.

Acknowledgements:

We thank all participants. We are grateful for constructive discussions with colleagues at the University of Washington and for feedback provided by Drs. Schueller and Cohen. This work is funded by the University of Washington ALACRITY Center under NIMH Award #1P50MH115837. This project was supported by the National Center for Advancing Translational Sciences of the National Institutes of Health under Award Number UL1 TR002319 and the National Institute of Mental Health (K23MH112872). The content is solely the responsibility of the authors and does not necessarily represent the official views of funding agencies.

Footnotes

Conflict of Interest: The authors declare no conflict of interest.

Author Credit Statement:

Jessica L. Jenness: Conceptualization; methodology; resources; writing – original draft, review & editing; visualization; supervision; project administration; funding acquisition

Arpita Bhattacharya: Conceptualization; methodology; resources; writing – original draft, review & editing; visualization; data curation; investigation; formal analysis; project administration

Julie A. Kientz: Conceptualization; methodology; resources; writing – original draft, review & editing; supervision; project administration; funding acquisition

Sean A. Munson: Conceptualization; methodology; resources; writing – original draft, review & editing; visualization; supervision; project administration; funding acquisition

Ria Nagar: Data curation; investigation; formal analysis; project administration

1

In prototyping, fidelity refers to the level of a prototype's detail and finish, with respect to both functionality and look and feel. Low-fidelity prototypes facilitate understanding of how a system might work and allow initial evaluations before investing in the development of high-fidelity prototypes.

2

The allowable age range of 13 to 19 years old was the same across all phases of our work. The presented age ranges reflect the ages of enrolled participants.


References

  1. Agapie E, Avrahami D, & Marlow J (2016). Staying the Course: System-Driven Lapse Management for Supporting Behavior Change. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 1072–1083. 10.1145/2858036.2858142
  2. Alexopoulos GS, Raue PJ, Kiosses DN, Seirup JK, Banerjee S, & Arean PA (2015). Comparing engage with PST in late-life major depression: A preliminary report. The American Journal of Geriatric Psychiatry, 23(5), 506–513.
  3. Allen K, Kern ML, Vella-Brodrick D, Hattie J, & Waters L (2018). What schools need to know about fostering school belonging: A meta-analysis. Educational Psychology Review, 30(1), 1–34.
  4. Baumer EP (2015). Reflective informatics: Conceptual dimensions for designing technologies of reflection. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 585–594.
  5. Bhattacharya A (2020). Designing Guided Asynchronous Remote Communities to Support Teen Mental Health [Unpublished doctoral dissertation].
  6. Bhattacharya A, Nagar R, Jenness JL, Munson S, & Kientz JA (2021). Designing Asynchronous Remote Interventions to Support Teens with Depression Management using Behavioral Activation. JMIR Formative Research, 5(7), e20969.
  7. Bhattacharya A, Liang C, Zeng EY, Shukla K, Wong MER, Munson SA, & Kientz JA (2019). Engaging Teenagers in Asynchronous Online Groups to Design for Stress Management. Proceedings of the 18th ACM International Conference on Interaction Design and Children, 26–37. 10.1145/3311927.3323140
  8. Björling EA, & Rose E (2019). Participatory research principles in human-centered design: Engaging teens in the co-design of a social robot. Multimodal Technologies and Interaction, 3(1), 8.
  9. Chandra A, & Minkovitz CS (2006). Stigma starts early: Gender differences in teen willingness to use mental health services. Journal of Adolescent Health, 38(6), 754.e1.
  10. Clayborne ZM, Varin M, & Colman I (2019). Systematic review and meta-analysis: Adolescent depression and long-term psychosocial outcomes. Journal of the American Academy of Child & Adolescent Psychiatry, 58(1), 72–79.
  11. Comer JS (2015). Introduction to the special series: Applying new technologies to extend the scope and accessibility of mental health care. Cognitive and Behavioral Practice, 22(3), 253–257.
  12. Davidson TM, Yuen EK, Felton JW, McCauley J, Gros KS, & Ruggiero KJ (2014). Feasibility assessment of a brief, web-based behavioral activation intervention for adolescents with depressed mood. The International Journal of Psychiatry in Medicine, 48(1), 69–82.
  13. Dearden A, & Finlay J (2006). Pattern languages in HCI: A critical review. Human–Computer Interaction, 21(1), 49–102.
  14. Dimidjian S, Barrera M Jr, Martell C, Muñoz RF, & Lewinsohn PM (2011). The origins and current status of behavioral activation treatments for depression. Annual Review of Clinical Psychology, 7, 1–38.
  15. Dopp AR, Parisi KE, Munson SA, & Lyon AR (2019). Integrating implementation and user-centred design strategies to enhance the impact of health services: Protocol from a concept mapping study. Health Research Policy and Systems, 17(1), 1–11.
  16. Dopp AR, Parisi KE, Munson SA, & Lyon AR (2020). Aligning implementation and user-centered design strategies to enhance the impact of health services: Results from a concept mapping study. Implementation Science Communications, 1(1), 1–13.
  17. Doherty G, Coyle D, & Matthews M (2010). Design and evaluation guidelines for mental health technologies. Interacting with Computers, 22(4), 243–252.
  18. Eikey EV, & Reddy MC (2017). “It’s Definitely Been a Journey”: A Qualitative Study on How Women with Eating Disorders Use Weight Loss Apps. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 642–654. 10.1145/3025453.3025591
  19. Eiraldi R, Wolk CB, Locke J, & Beidas R (2015). Clearing hurdles: The challenges of implementation of mental health evidence-based practices in under-resourced schools. Advances in School Mental Health Promotion, 8(3), 124–140.
  20. Epstein DA, Caraway M, Johnston C, Ping A, Fogarty J, & Munson SA (2016). Beyond abandonment to next steps: Understanding and designing for life after personal informatics tool use. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 1109–1113.
  21. Fincher S, Finlay J, Greene S, Jones L, Matchen P, Thomas J, & Molina PJ (2003). Perspectives on HCI patterns: Concepts and tools. CHI’03 Extended Abstracts on Human Factors in Computing Systems, 1044–1045.
  22. Fixsen DL, Blase KA, Naoom SF, Van Dyke M, & Wallace F (2009). Implementation: The missing link between research and practice. NIRN Implementation Brief, 1, 218–227.
  23. Fleck R, & Fitzpatrick G (2010). Reflecting on reflection: Framing a design landscape. Proceedings of the 22nd Conference of the Computer-Human Interaction Special Interest Group of Australia on Computer-Human Interaction, 216–223.
  24. Fritz T, Huang EM, Murphy GC, & Zimmermann T (2014). Persuasive technology in the real world: A study of long-term use of activity sensing devices for fitness. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 487–496.
  25. Galván A (2017). The neuroscience of adolescence. Cambridge University Press.
  26. Graham AK, Wildes JE, Reddy M, Munson SA, Barr Taylor C, & Mohr DC (2019). User-centered design for technology-enabled services for eating disorders. International Journal of Eating Disorders, 52(10), 1095–1107.
  27. Grevet C, & Gilbert E (2015). Piggyback prototyping: Using existing, large-scale social computing systems to prototype new ones. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 4047–4056.
  28. Grist R, Croker A, Denne M, & Stallard P (2019). Technology delivered interventions for depression and anxiety in children and adolescents: A systematic review and meta-analysis. Clinical Child and Family Psychology Review, 22(2), 147–171.
  29. Gulliver A, Griffiths KM, & Christensen H (2010). Perceived barriers and facilitators to mental health help-seeking in young people: A systematic review. BMC Psychiatry, 10(1), 1–9.
  30. Hetrick SE, Robinson J, Burge E, Blandon R, Mobilio B, Rice SM, Simmons MB, Alvarez-Jimenez M, Goodrich S, & Davey CG (2018). Youth codesign of a mobile phone app to facilitate self-monitoring and management of mood symptoms in young people with major depression, suicidal ideation, and self-harm. JMIR Mental Health, 5(1), e9.
  31. Hollis C, Falconer CJ, Martin JL, Whittington C, Stockton S, Glazebrook C, & Davies EB (2017). Annual Research Review: Digital health interventions for children and young people with mental health problems – a systematic and meta-review. Journal of Child Psychology and Psychiatry, 58(4), 474–503.
  32. Holtzblatt K, & Beyer H (1997). Contextual design: Defining customer-centered systems. Elsevier.
  33. Holtzblatt K, Wendell JB, & Wood S (2005). Rapid contextual design: A how-to guide to key techniques for user-centered design (Vol. 2005).
  34. Huguet A, Rao S, McGrath PJ, et al. (2016). A systematic review of cognitive behavioral therapy and behavioral activation apps for depression. PLoS ONE, 11(5), e0154248.
  35. Kelley C, Lee B, & Wilcox L (2017). Self-Tracking for Mental Wellness: Understanding Expert Perspectives and Student Experiences. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 629–641. 10.1145/3025453.3025750
  36. Langley AK, Nadeem E, Kataoka SH, Stein BD, & Jaycox LH (2010). Evidence-based mental health programs in schools: Barriers and facilitators of successful implementation. School Mental Health, 2(3), 105–113.
  37. Langley GJ, Moen RD, Nolan KM, Nolan TW, Norman CL, & Provost LP (2009). The improvement guide: A practical approach to enhancing organizational performance. John Wiley & Sons.
  38. Lewis JR (2018). The system usability scale: Past, present, and future. International Journal of Human–Computer Interaction, 34(7), 577–590.
  39. Lyon AR, Brewer SK, & Areán PA (2020). Leveraging human-centered design to implement modern psychological science: Return on an early investment. American Psychologist, 75(8), 1067.
  40. Lyon AR, & Bruns EJ (2019). User-centered redesign of evidence-based psychosocial interventions to enhance implementation – Hospitable soil or better seeds? JAMA Psychiatry, 76(1), 3–4.
  41. Lyon AR, Coifman J, Cook H, McRee E, Liu FF, Ludwig K, … & McCauley E (2021). The Cognitive Walkthrough for Implementation Strategies (CWIS): A pragmatic method for assessing implementation strategy usability. Implementation Science Communications, 2(1), 1–16.
  42. Lyon AR, Dopp AR, Brewer SK, Kientz JA, & Munson SA (2020). Designing the future of children’s mental health services. Administration and Policy in Mental Health and Mental Health Services Research, 47(5), 735–751.
  43. Lyon AR, Munson SA, Renn BN, Atkins DC, Pullmann MD, Friedman E, & Areán PA (2019). Use of human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: Protocol for studies applying a framework to assess usability. JMIR Research Protocols, 8(10), e14990.
  44. Lyon AR, Pullmann MD, Jacobson J, Osterhage K, Al Achkar M, Renn BN, Munson SA, & Areán PA (2020). Assessing the usability of complex psychosocial interventions: The Intervention Usability Scale.
  45. MacLeod H, Jelen B, Prabhakar A, Oehlberg L, Siek K, & Connelly K (2016). Asynchronous remote communities (ARC) for researching distributed populations. 10th EAI International Conference on Pervasive Computing Technologies for Healthcare.
  46. Maestre JF, MacLeod H, Connelly CL, Dunbar JC, Beck J, Siek KA, & Shih PC (2018). Defining Through Expansion: Conducting Asynchronous Remote Communities (ARC) Research with Stigmatized Groups. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1–13. 10.1145/3173574.3174131
  47. McCauley E, Gudmundsen G, Schloredt K, Martell C, Rhew I, Hubley S, & Dimidjian S (2016). The adolescent behavioral activation program: Adapting behavioral activation as a treatment for depression in adolescence. Journal of Clinical Child & Adolescent Psychology, 45(3), 291–304.
  48. McCauley E, Schloredt KA, Gudmundsen GR, Martell CR, & Dimidjian S (2016). Behavioral activation with adolescents: A clinician’s guide. Guilford Publications.
  49. Mohr DC, Burns MN, Schueller SM, Clarke G, & Klinkman M (2013). Behavioral intervention technologies: Evidence review and recommendations for future research in mental health. General Hospital Psychiatry, 35(4), 332–338.
  50. Mohr DC, Weingardt KR, Reddy M, & Schueller SM (2017). Three problems with current digital mental health research… And three things we can do about them. Psychiatric Services, 68(5), 427–429.
  51. Mojtabai R, Olfson M, & Han B (2016). National trends in the prevalence and treatment of depression in adolescents and young adults. Pediatrics, 138(6).
  52. Munson SA (2017). Rethinking assumptions in the design of health and wellness tracking tools. Interactions, 25(1), 62–65.
  53. Murnane EL, Walker TG, Tench B, Voida S, & Snyder J (2018). Personal informatics in interpersonal contexts: Towards the design of technology that supports the social ecologies of long-term mental health management. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1–27.
  54. Nagar R, & Kemp N (2020). BA Video. https://youtu.be/Le10_EyamDw
  55. Newman BM, Lohman BJ, & Newman PR (2007). Peer group membership and a sense of belonging: Their relationship to adolescent behavior problems. Adolescence, 42(166).
  56. Norman D, & Draper S (1987). User centered system design: New perspectives on human-computer interaction. Journal of Educational Computing Research, 3(1), 129–134.
  57. Nuttman-Shwartz O (2019). The moderating role of resilience resources and sense of belonging to the school among children and adolescents in continuous traumatic stress situations. The Journal of Early Adolescence, 39(9), 1261–1285.
  58. Olfson M, Gameroff MJ, Marcus SC, & Waslick BD (2003). Outpatient treatment of child and adolescent depression in the United States. Archives of General Psychiatry, 60(12), 1236–1242.
  59. Olfson M, Mojtabai R, Sampson NA, Hwang I, Druss B, Wang PS, Wells KB, Pincus HA, & Kessler RC (2009). Dropout from outpatient mental health care in the United States. Psychiatric Services, 60(7), 898–907.
  60. Percival J, & McGregor C (2016). An evaluation of understandability of patient journey models in mental health. JMIR Human Factors, 3(2), e20.
  61. Pinfold V, Thornicroft G, Huxley P, & Farmer P (2005). Active ingredients in anti-stigma programmes in mental health. International Review of Psychiatry, 17(2), 123–131.
  62. Richardson LP, McCauley E, Grossman DC, McCarty CA, Richards J, Russo JE, Rockhill C, & Katon W (2010). Evaluation of the Patient Health Questionnaire-9 Item for detecting major depression among adolescents. Pediatrics, 126(6), 1117–1123. 10.1542/peds.2010-0852
  63. Rideout V, & Robb M (2019). The Common Sense Census: Media Use by Tweens and Teens (2019). Common Sense Media. https://www.commonsensemedia.org/research/the-commonsense-census-media-use-by-tweens-and-teens-2019
  64. Rohani DA, Quemada Lopategui A, Tuxen N, Faurholt-Jepsen M, Kessing LV, & Bardram JE (2020). MUBS: A personalized recommender system for behavioral activation in mental health. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–13.
  65. Rohani DA, Tuxen N, Lopategui AQ, Faurholt-Jepsen M, Kessing LV, & Bardram JE (2019). Personalizing mental health: A feasibility study of a mobile behavioral activation tool for depressed patients. Proceedings of the 13th EAI International Conference on Pervasive Computing Technologies for Healthcare, 282–291. 10.1145/3329189.3329214
  66. Russell JA (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161.
  67. Substance Abuse and Mental Health Services Administration (SAMHSA). (2016). 2015 National survey on drug use and health. https://www.samhsa.gov/data
  68. Suh H, Shahriaree N, Hekler EB, & Kientz JA (2016). Developing and Validating the User Burden Scale: A Tool for Assessing User Burden in Computing Systems. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 3988–3999. 10.1145/2858036.2858448
  69. Suh J, Williams S, Fann JR, Fogarty J, Bauer AM, & Hsieh G (2020). Parallel Journeys of Patients with Cancer and Depression: Challenges and Opportunities for Technology-Enabled Collaborative Care. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW1), 1–36.
  70. Tatara N, Årsand E, Skrøvseth SO, & Hartvigsen G (2013). Long-term engagement with a mobile self-management system for people with type 2 diabetes. JMIR mHealth and uHealth, 1(1), e1.
  71. Thrul J, & Ramo DE (2017). Cessation strategies young adult smokers use after participating in a Facebook intervention. Substance Use & Misuse, 52(2), 259–264.
  72. Tindall L, Mikocka-Walus A, McMillan D, Wright B, Hewitt C, & Gascoyne S (2017). Is behavioural activation effective in the treatment of depression in young people? A systematic review and meta-analysis. Psychology and Psychotherapy: Theory, Research and Practice, 90(4), 770–796. 10.1111/papt.12121
  73. Torous J, Myrick KJ, Rauseo-Ricupero N, & Firth J (2020). Digital mental health and COVID-19: Using technology today to accelerate the curve on access and quality tomorrow. JMIR Mental Health, 7(3), e18848.
  74. Torous J, & Wykes T (2020). Opportunities from the coronavirus disease 2019 pandemic for transforming psychiatric care with telehealth. JAMA Psychiatry, 77(12), 1205–1206.
  75. Wade NG, Post BC, Cornish MA, Vogel DL, & Tucker JR (2011). Predictors of the change in self-stigma following a single session of group counseling. Journal of Counseling Psychology, 58(2), 170.
  76. Wadley G, Lederman R, Gleeson J, & Alvarez-Jimenez M (2013). Participatory design of an online therapy for youth mental health. Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, 517–526.
  77. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, Boynton MH, & Halko H (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12(1), 108.
  78. Williams NJ, & Beidas RS (2019). Annual research review: The state of implementation science in child psychology and psychiatry: A review and suggestions to advance the field. Journal of Child Psychology and Psychiatry, 60(4), 430–450.
  79. Wu Y, Levis B, Riehm KE, Saadat N, Levis AW, Azar M, Rice DB, Boruff J, Cuijpers P, Gilbody S, et al. (2019). Equivalency of the diagnostic accuracy of the PHQ-8 and PHQ-9: A systematic review and individual participant data meta-analysis. Psychological Medicine.
  80. Zomerdijk LG, & Voss CA (2010). Service design for experience-centric services. Journal of Service Research, 13(1), 67–82.
