Abstract
Schools are well positioned to provide access to youth mental health services, but implementing effective programs that promote emotional and behavioral functioning in school settings is complicated by the poor fit between interventions developed in research settings and complex school contexts. The current study formed a research-practice partnership with two urban public schools and the mental health providers employed by those schools (N = 6; 100% female, 33% Black/African American, 67% White/Caucasian) to adapt a depression prevention intervention, Act & Adapt. The intervention was modified by decreasing meeting time and streamlining session content, increasing flexibility, making intervention materials more similar to academic curricula, and increasing the focus on managing disruptive behavior within group sessions. In an open trial, 6th grade students (N = 22; 59% boys; 32% Hispanic, 27% Black/African American, 23% White/Caucasian, 5% Asian, 14% other race/ethnicity) at both schools who were identified as clinically “at risk” reported improvements from baseline to post-intervention and at one-year follow-up on measures of emotional and behavioral difficulties and coping strategies, with parallel results by caregiver report. The providers reported satisfaction with the intervention, and qualitative analyses of provider focus groups suggested both barriers and facilitators to research-practice collaborations to implement mental health interventions in schools.
Keywords: School-based mental health, co-design, depression, prevention, youth mental health
Schools are naturally positioned to provide youth mental health intervention (Weist et al., 2003), and school-based services have long been recommended as a means of addressing unmet mental health needs (Adelman & Taylor, 2010; Hoagwood et al., 2007; Mufson et al., 2004; Weist et al., 2019). Though schools are recognized as the “de facto mental health system” for children (Jacob & Coustasse, 2008), the behavioral health services they typically offer are varied, and evidence of effectiveness has been limited (Green et al., 2013; Kutash et al., 2006). Transporting effective interventions for youth into school settings remains a challenging enterprise, as many mental health interventions designed and implemented under ideal circumstances fare less well when the conditions of client, setting, and provider are under less optimal control (Weisz et al., 2014). The drop in effectiveness as interventions move into complex ecosystems such as schools (Weisz et al., 2013) highlights the need to identify strategies that support high-quality implementation (Williams & Beidas, 2019). The fit of the intervention with its intended system and population is suggested as a key factor in implementation and sustainment (Damschroder et al., 2009; Lyon & Koerner, 2016).
As others have noted (Lyon & Koerner, 2016), existing implementation frameworks stress the importance of intervention fit to end-user needs but offer little guidance regarding how to achieve that goal. Interventions can be adapted using a variety of methodologies, which range in the centrality of community-member perspectives and the intensity of the adaptation (e.g., the surface-structure adaptations described by Resnicow et al., 1999 vs. the participatory action research models described by Goodyear-Smith et al., 2015). Approaches may be informed by the extent to which intervention components are seen as essential and non-modifiable versus inessential and subject to change (Chorpita et al., 2011), and by the implementation outcomes of interest (Proctor et al., 2011).
Co-design, or co-creation, is one approach used to promote social innovation that entails active participation of end-users (providers or service recipients) in “various stages of the production process” (Voorberg et al., 2014, p. 3). As described by Chorpita et al. (2011), co-design of mental health interventions entails treatment researchers and end-users each contributing expertise, drawing on both the empirical evidence base for treatment and “practice-based evidence” (Margison et al., 2000, p. 123). Under this definition of treatment co-design, researchers contribute a priori recommendations about the ingredients that treatment should entail, while end-user clinicians provide contextual information to personalize the treatment and enhance its fit with the intended setting, providers, and recipients. For example, although researchers might identify pleasant-events scheduling as a component of effective treatments for depression, local providers will know what activities are available within the community that can be recommended for their student population. Likewise, providers can suggest structural aspects of the intervention (length, frequency, duration) as well as a delivery format that may improve implementation within the end-user setting. Co-design principles have been used to adapt and test the effectiveness of evidence-based practices in numerous studies (Brookman-Frazee et al., 2016); for example, for youth referred to mental health clinics (Weisz et al., 2017) and for parents in a universal prevention program (Morris et al., 2019). This approach has also been used with students to inform the development of a measurement-based care model for mental health within schools (Mayworm et al., 2020).
Recommendations for partnering with end-user clinicians to support the co-design process and the implementation of effective interventions in community settings include developing clear goals among researchers and stakeholders, ensuring equitable knowledge exchange (Brookman-Frazee et al., 2012), and tailoring intervention strategies for end-user contexts using jointly developed prototypes that explicitly consider the environmental constraints of the intended setting (Lyon & Koerner, 2016; Metz & Bartley, 2017). In theory, this process may yield interventions that are appropriate (e.g., “the perceived fit, relevance, or compatibility of the . . . practice for a given practice setting, provider, or consumer”) (Proctor et al., 2011, p. 69). Improving appropriateness may in turn influence consumer and provider experiences of acceptability (e.g., satisfaction) and influence feasibility, the extent to which an intervention can be used within an intended setting (Proctor et al., 2011).
Developing appropriate, acceptable, and feasible school-based interventions for depression is an important goal. Youth depression predicts lower grades, academic underachievement (Hishinuma et al., 2012; Jaycox et al., 2009), and school absenteeism (Wood et al., 2012), as well as higher rates of disciplinary problems, poor social adjustment, substance abuse, and a three-fold increase in the risk of suicide (Armstrong & Costello, 2002; Field et al., 2012; Rohde et al., 1991). Many depression interventions have been found beneficial in school settings (e.g., Benas et al., 2019; Brunstein-Klomek et al., 2017; Tompson et al., 2017; Eiraldi et al., 2016; McCarty et al., 2013; Michael et al., 2016), but these trials have used protocols developed by researchers and delivered or co-delivered by members of the research team. A recent review of school-based depression interventions found larger effect sizes for interventions delivered by external providers such as researchers than for those delivered by school staff (Werner-Seidler et al., 2017). Moreover, despite evidence from large intervention trials in schools, school-based providers rarely use these interventions (Evans et al., 2013). This signals a need to develop models that both capitalize on the evidence base for youth interventions and attend to the needs of end-users that may influence implementation.
With this in mind, the current study utilized a research-practice partnership and treatment co-design principles to modify an existing coping skills program and improve its appropriateness for implementation within an urban middle school setting. The intervention, Act & Adapt, was selected because 1) it was developed for group implementation in schools (Bearman et al., 2009), 2) it includes the most frequently occurring practice elements found in efficacious treatments of youth depression (Chorpita & Daleiden, 2009), 3) it has evidence of clinical benefit in schools (Eiraldi et al., 2016), and 4) implementation challenges related to session length and duration, as well as the cost and timing of screening procedures, had been identified during efficacy testing (Polo et al., 2006).
The current study had two goals: to describe the co-design process, including school provider perspectives on the process, and to assess the acceptability and feasibility of the modified intervention with the providers, students, and caregivers in the partner schools. We utilized a mixed-methods design in which qualitative data collection about providers’ experiences with the co-design process was embedded alongside the student outcomes from the open trial, in order to deepen understanding of treatment co-design in schools and inform future efforts. Based on the prior work by Polo et al. (2006), we anticipated that providers in the partner schools would recommend adaptations related to the structure (length, frequency, duration) and delivery format of the intervention, and would provide recommendations related to the presentation of the a priori content. With regard to acceptability, we hypothesized that providers, students, and caregivers would report satisfaction with the intervention. Feasibility indicators included recruitment, retention, and benefit for students. We hypothesized that we would identify, recruit, and retain a sample of students identified as at risk for depression, and that these students would show declines from pre- to post-test on a measure of emotional and behavioral problems and maintain these gains through the one-year follow-up period. Hypotheses for acceptability and feasibility outcomes were based on prior research suggesting that end-user participation in treatment development enhances relevance for stakeholders (Evans et al., 2005) and, conversely, that poorly matched intervention-level characteristics dampen both the acceptability and feasibility of evidence-based psychosocial interventions (Damschroder et al., 2009; Lyon & Koerner, 2016). Because the intervention targets the extent to which students use effective coping strategies, we further hypothesized that students would report increases in these coping strategies from pre- to post-test and through the follow-up period.
The coping skills program adapted in this study was previously tested in another school-based trial in which research staff co-led the intervention with school providers (Eiraldi et al., 2016). Because the current study adapted the content of the program using co-design, utilized only school-based providers employed by the schools, and used a briefer provider training and a less rigorous student screening process, we made benchmark comparisons (Minami et al., 2008) between our results and those of this prior trial to provide a point of reference for interpreting the present findings.
Method
Participants
The current study included two groups of participants: School-based mental health provider participants (“providers”) and student participants, both recruited from the two participating schools. Informed consent was obtained from all individual participants. Schools were urban public middle schools in a large Northeastern city. School A had a student population of 1,345 that identified as Hispanic (23%), Black/African American (8%), Asian (28%) and White Non-Hispanic (39%). Fifty-nine percent of students in School A received free or reduced lunch services. School B had a student population of 298 that identified as Hispanic (22%), Black/African American (71%), Asian (2%) and White Non-Hispanic (4%). Ninety-five percent of students in School B received free or reduced lunch services.
Provider participants were school-based behavioral health providers (N = 6) who were employed by the Department of Education and embedded within the two participating schools. The research team initially reached out to school principals via emails, calls, and outreach to administrators within the Department of Education, and arranged meetings with principals and other members of leadership teams at interested schools (N = 7) to describe the project. The two final partner schools were selected based on interest, capacity, and student population. Providers who participated in the co-design and led the intervention were identified by the principals at each school and then approached by study staff. For School A, they were the mental health providers assigned to the grade being targeted (6th grade). For School B, they represented the entire behavioral health workforce at the school, broadly defined. All of the providers were women (100%), with an average age of 45.94 (SD = 10.45), who identified as Black/African-American (33.3%) or White Non-Hispanic (66.7%). Most (83.3%) had obtained a master’s degree or higher, and they represented a number of allied health disciplines, including psychology (33.3%), counseling (33.3%), and social work (16.7%). Professionally, their roles were school psychologist (33.3%), guidance counselor (33.3%), social worker (16.7%), and substance abuse prevention specialist (16.7%). They had an average of 18.83 (SD = 11.14) years of experience. Provider characteristics are reported in Table 1.
Table 1.
Provider Characteristics
| Characteristic | N or M | % or SD |
|---|---|---|
| Female | 6 | 100.0 |
| Age (years) | 45.9 (M) | 10.5 (SD) |
| Race/Ethnicity | | |
| White/Non-Hispanic | 4 | 66.7 |
| Black/African-American | 2 | 33.3 |
| Primary discipline | | |
| Guidance | 2 | 33.3 |
| Psychology | 2 | 33.3 |
| Social Work | 1 | 16.7 |
| Substance Abuse Counselor | 1 | 16.7 |
| Highest degree earned | | |
| Bachelor’s | 1 | 16.7 |
| Master’s | 4 | 66.7 |
| Doctorate in Psychology | 1 | 16.7 |
| Years of professional/clinical experience | 18.8 (M) | 11.1 (SD) |
Student participants were 6th grade students (N = 22) attending one of the two urban public middle schools who were identified via school-wide screening as clinically “at risk” based on exceeding a clinical cutoff on a broad measure of emotional and behavioral functioning. Students were largely boys (59.1%), with a mean age of 11.48 (SD = 0.68). They identified as Hispanic (31.8%), Black/African American (27.3%), Asian (4.5%), White/Caucasian (22.7%), and Other (13.6%). Participating children lived in households with an average of 2.32 (SD = 0.82) adults (18 years and older) and an average of 2.40 (SD = 1.19) children (including the participant); 45.5% lived with married biological parents. Families reported annual household incomes between $5,000 and $30,000 (45.5%), $30,000 and $50,000 (13.6%), $50,000 and $75,000 (4.5%), $75,000 and $100,000 (9.1%), and over $100,000 (9.1%); eighteen percent did not report annual income. Student characteristics are reported in Table 2.
Table 2.
Student Characteristics
| Characteristic | N | % |
|---|---|---|
| Sex | ||
| Male | 13 | 59.1 |
| Female | 11 | 40.9 |
| Age | 11.5 (M) | 0.7 (SD) |
| Race/Ethnicity | ||
| Hispanic | 7 | 31.8 |
| Black/African-American | 6 | 27.3 |
| White/Caucasian | 5 | 22.7 |
| Asian | 1 | 4.5 |
| Other | 3 | 13.6 |
| Adults living in household | 2.3 (M) | 0.8 (SD) |
| Children living in household | 2.4 (M) | 1.2 (SD) |
| Living with married parents | 10 | 45.5 |
| Household income | ||
| $5,000–$10,000 | 4 | 18.2 |
| $10,000–$20,000 | 3 | 13.6 |
| $20,000–$30,000 | 3 | 13.6 |
| $30,000–$40,000 | 2 | 9.1 |
| $40,000–$50,000 | 1 | 4.5 |
| $50,000–$75,000 | 1 | 4.5 |
| $75,000–$100,000 | 2 | 9.1 |
| Over $100,000 | 2 | 9.1 |
| Not reported | 4 | 18.2 |
| Dependents on income | 4.2 (M) | 1.8 (SD) |
| Biological mother employed | 12 | 54.5 |
Procedures
The Intervention
Primary and Secondary Control Enhancement Training (PASCET) is an existing cognitive behavioral coping skills program that has been previously tested in community clinics (Weisz et al., 1997; Weisz et al., 2009). It was adapted for video-guided delivery in schools and tested for efficacy in public middle schools in Boston and Los Angeles (Bearman et al., 2009; Polo et al., 2006). The school-based version of PASCET has been nicknamed “Act & Adapt” and focuses on strengthening coping strategies for vulnerable youth. The program emphasizes primary control coping skills to change objective conditions, such as using relaxation to soothe physiological sensitivity or enhancing social skills to improve interpersonal interactions, and secondary control coping skills to adjust one’s thoughts or expectations to minimize the impact of a situation, such as reappraising the meaning attached to an event. The original Act & Adapt was developed as a group program with thirteen 90-minute sessions led by two group leaders, who were members of the research team. The protocol in the current trial was adapted (as described below, in Results) following recommendations by the participating providers, resulting in a flexible manual with 12 skill units designed to be delivered in 35–45-minute sessions.
Treatment Co-design
The model of research-community partnership for co-design used in this study comprises four stages and closely follows that used in the Southern California BRIDGE Collaborative study (Stahmer et al., 2011): formation/initiation, program revision activities, pilot planning, and process evaluation and feedback. Through these stages, participants generated shared goals and decided upon a meeting structure, examined and evaluated the existing program, suggested ways to modify the Act & Adapt program to better fit their school environment, and reviewed the resulting products (e.g., program protocol, recruitment process).
Provider participation began with a series of three three-hour meetings during the Spring academic semester between members of the research team and providers from both schools. The first meeting focused on the roles of the providers in their schools and established goals for the co-design process. At the second meeting, the research team reviewed the a priori content of the existing intervention, including the core practice elements theorized to increase primary and secondary control (“Act” and “Adapt” skills) and the video content and activities used to convey those practice elements. Providers were informed that these core practice elements should be retained in the co-designed product, and were asked for their input about the procedures and content of the treatment protocol and video vignettes, as well as their ideas for activities, ways to convey the core elements, and any additions to the core practice elements. At the final meeting, the research team shared a prototype of an adapted version of the intervention based on feedback from the prior meeting and received additional feedback from the providers. Prior to the start of the following Fall academic semester, the providers and the research team reconvened for a six-hour meeting to review the content of the revised intervention. This meeting consisted of didactic presentation, modeling, and role-play of intervention content.
Open Trial
At the start of the Fall academic semester, all sixth-grade students at both schools (N = 556) participated in a universal mental health screening initiated by the principals at both schools. A portion of the students’ caregivers (19.0%) opted not to share the results of their children’s screening with the research team. Of the remaining 450 students, 46 (10.2%) had scores corresponding with either the borderline or the abnormal range on the screening measure (SDQ ≥ 14), thus qualifying as “at risk.” Other inclusion criteria for students included being English-speaking and having an IQ in the average range (determined by school providers). Exclusion criteria included a prior diagnosis of autism spectrum disorder, a psychotic disorder, or bipolar disorder, or any current or recent (past six months) suicide attempt or hospitalization due to mental health difficulties. Twenty-two of the students who met criteria (47.8%) received parental consent to participate in the intervention. Of the 24 remaining students with elevated SDQ scores, 12 (50%) declined participation for various reasons, including that they were already receiving services or that parents did not want them to be pulled from class. An additional eight students (33%) either did not meet all inclusion criteria (e.g., did not speak English) or met some exclusion criteria. Two students (8.3%) changed schools and were not able to participate, and research staff were unable to contact caregivers for an additional two students (8.3%). Figure 1 illustrates study enrollment. One child changed schools midway through the intervention and did not complete the study.
Figure 1.
Student Participant Enrollment
School A had one coed group (n = 9) led by two providers, whereas School B chose to have one girls’ group (n = 6) and one boys’ group (n = 7) led by two different provider dyads. All groups met during non-instructional time, during regular periods in the students’ schedules, as determined by the providers. The average group session duration was 36 minutes. School providers completed baseline and post-treatment questionnaires. Students and their caregivers completed questionnaires about students at baseline, post-treatment, and 12-month follow-up. To account for missing data, we conducted multiple imputation, which generates predicted values based on the observed data in a manner that supports valid statistical inference (Rubin, 1987).
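The published materials do not include analysis code. As a point of reference only, the sketch below shows one minimal way to generate multiply imputed datasets and pool an estimate using Rubin’s (1987) rules; the imputer choice, function name, and data layout are illustrative assumptions, not the study’s actual procedure.

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def pooled_mean(df: pd.DataFrame, column: str, m: int = 20):
    """Illustrative multiple imputation: impute m times, then pool the
    mean of `column` with Rubin's rules (hypothetical column names)."""
    estimates, within_vars = [], []
    for seed in range(m):
        # sample_posterior=True draws imputations from the predictive
        # distribution, so each completed dataset differs.
        imputer = IterativeImputer(sample_posterior=True, random_state=seed)
        completed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
        estimates.append(completed[column].mean())
        # squared standard error of the mean within this completed dataset
        within_vars.append(completed[column].var(ddof=1) / len(completed))
    q_bar = np.mean(estimates)                 # pooled point estimate
    u_bar = np.mean(within_vars)               # within-imputation variance
    b = np.var(estimates, ddof=1)              # between-imputation variance
    total_var = u_bar + (1 + 1 / m) * b        # Rubin's total variance
    return q_bar, np.sqrt(total_var)           # estimate and pooled SE
```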
Focus Groups
Providers participated in two one-hour focus group meetings following the completion of the co-design process and after the open trial. The focus groups were facilitated by an interviewer (the fourth author) who had not been involved with the intervention adaptation or implementation and who had not received training in the intervention. Providers were assigned code numbers and assured their names would not be associated with their responses, and that the research team would not listen to the audio and would only view their responses in transcribed form. The focus groups were conducted using a structured list of open-ended questions followed by queries or probes to elaborate the responses about the providers’ impressions of the co-design process and resulting modified program as well as their experiences throughout the collaborative process (see Appendix). All meetings were audio-recorded and transcribed verbatim by research assistants who were not involved in prior study phases.
All transcribed data from the focus groups were analyzed in the following manner, consistent with thematic analysis: 1) The entire body of text was read and discussed by the first and fourth authors to develop familiarity with the broad concepts related to the study’s aim of understanding the feasibility of co-design (Palinkas et al., 2011). 2) The first and fourth authors independently reviewed the text line-by-line and highlighted relevant text related to the broad concepts. Relevant text was highlighted when it contained similar words or phrases used by more than one source, or when it reflected a priori concepts based on an extensive literature review of school-based mental health programs and barriers to implementation. While highlighting relevant text, memos were created to record preliminary groupings and track emergent themes. For example, relevant text related to educational tasks that consumed provider time was initially given the notation “Competing Academic and Mental Health Priorities” based on the extant literature on school-based mental health. 3) The first and fourth authors reviewed all relevant text and corresponding memos to identify repeating ideas (similar concepts expressed in relevant text from two or more research participants) and to refine, combine, and disaggregate repeating ideas as needed. For example, text notated as “Competing Academic and Mental Health Priorities” was later merged with text notated as “Multiple Roles for Providers,” ultimately becoming “Competing Work Roles and Responsibilities.” The repeating ideas and corresponding exemplar text were collected in a preliminary coding manual. 4) Using the preliminary coding manual, trained qualitative coders used QSR NVivo (Fraser, 2000) to independently organize the relevant text into repeating ideas and the repeating ideas into themes. 5) As suggested by Auerbach and Silverstein (2003), the organization of the data was made “transparent” as both coders discussed all coding discrepancies, provided their rationale, and reorganized the data accordingly; this process was consistent with the principle of constant comparison (Maykut & Morehouse, 1994). All disagreements were resolved in discussion among the coders and the first author. Sixty-seven percent of the data were double coded, and agreement on final coded text ranged from 73.6% to 100% (M = 98.34, SD = 6.97).
Measures
Provider Measures
School Provider Background Questionnaire.
This is a brief questionnaire that collects information regarding school provider age, gender, race/ethnic status, professional training/experience, and treatment orientation.
School Provider Satisfaction Index.
This 16-item questionnaire, adapted from Addis and Krasnow (2000), assesses provider agreement with statements related to treatment satisfaction (“I liked using this treatment approach”) on a five-point scale (1 = strongly disagree, 3 = somewhat agree, 5 = strongly agree). Internal consistency in the current study was good (α = .84).
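The internal consistency values reported here and for the measures below are Cronbach’s alpha. As a point of reference, a minimal sketch of the standard computation, assuming a DataFrame with one row per respondent and one column per scale item (all names illustrative):

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a scale (rows = respondents, columns = items)."""
    items = items.dropna()
    k = items.shape[1]                               # number of items
    item_var = items.var(axis=0, ddof=1).sum()       # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)
```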
Student Measures
Demographic Questionnaire.
This is a brief questionnaire that collects information regarding the age, gender, and race/ethnic status of all student participants.
Strengths and Difficulties Questionnaire (SDQ).
The SDQ (Goodman, 2001) is a brief behavioral screening self-report questionnaire for adolescents 11–17 years of age, comprising 25 statements regarding psychological attributes and behaviors, some positive and others negative, divided among five scales: emotional symptoms, conduct problems, hyperactivity/inattention, peer relationship problems, and prosocial behavior. The problem scales combine to form the total difficulties score used in the current study. Total difficulties scores range from 0 to 40; scores of 17 or above are considered abnormal, scores of 14 to 16 are considered in the borderline range for difficulties, and scores below 14 are considered to reflect normal functioning. In the current study, internal consistency was .31 at pre-treatment and .68 at post-treatment.
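For reference, the banding rule described above maps directly to code; a minimal sketch (function name is illustrative):

```python
def sdq_band(total_difficulties: int) -> str:
    """Classify an SDQ total difficulties score (0-40) using the bands above."""
    if not 0 <= total_difficulties <= 40:
        raise ValueError("SDQ total difficulties must be between 0 and 40")
    if total_difficulties >= 17:
        return "abnormal"
    if total_difficulties >= 14:
        return "borderline"  # 14-16; the study's 'at risk' screening threshold
    return "normal"
```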
Perceived Control Scale for Children (PCSC).
The PCSC (Weisz et al., 1998) is a 24-item measure that assesses beliefs about one’s ability to exert control over outcomes in academic, social, and behavioral domains (“I can get good marks for my homework if I really work at it,” “If I try to behave, I can keep myself out of trouble.”). The measure has demonstrated internal consistency ratings of .88 and test-retest reliability of .57 (Weisz et al., 2001). In the current sample internal consistency was .70.
Secondary Control Scale for Children (SCSC).
The SCSC is a 20-item measure that assesses how children perceive their ability to influence the personal psychological impact of objective conditions by adjusting oneself to fit those conditions (“When something bad happens, I can find a way to think about it that makes me feel better,” “When bad things happen to me that I can’t control, there are lots of things I can do to feel better.”). The SCSC has demonstrated good internal consistency (α = .89), and test-retest stability and convergent and discriminant validity (Weisz et al., 2010). Internal consistency was .64 in the current sample.
Satisfaction with Treatment.
This is a 5-item questionnaire that assesses student agreement with statements related to satisfaction with the services provided during the program, measured on a 5-point Likert scale (1 = strongly disagree, 3 = somewhat agree, 5 = strongly agree). The highest possible score is 25. In the current sample, internal consistency was .67.
Caregiver Measures
Caregiver Demographics.
This questionnaire assesses the ethnic/racial background of both the child’s mother and father, as well as reports on caregiver employment, educational level achieved, and household income.
Strengths and Difficulties Questionnaire (SDQ).
The SDQ (Goodman, 2001) is a brief behavioral screening questionnaire for parents of children 11–17 years of age. Like the youth self-report, the parent SDQ comprises 25 statements regarding psychological attributes and behaviors, some positive and others negative, divided among five scales: emotional symptoms, conduct problems, hyperactivity/inattention, peer relationship problems, and prosocial behavior; the problem scales form a composite total difficulties score. Total difficulties scores range from 0 to 40; scores of 17 or above are considered abnormal, scores of 14 to 16 are in the borderline range, and scores below 14 are considered normal. In the current study, internal consistency for the total difficulties score was .99 at pre-treatment and .76 at post-treatment.
Satisfaction with Treatment.
This is a 5-item questionnaire that assesses caregiver agreement with statements related to satisfaction with their child’s services, measured on a 5-point Likert scale at post-treatment (1 = strongly disagree, 3 = somewhat agree, 5 = strongly agree). The highest possible score is 25. In the current sample, internal consistency was .44.
Results
Treatment Co-Design Adaptations
During the initial meeting, members introduced themselves and shared personal professional goals for the collaboration, as well as goals for their school and student communities. An important goal for all was professional development and learning new skills to enhance their work with students. Providers recommended that the training carry professional development credits (CEUs) and include a certification process. All agreed that the majority of their time at work was spent reacting to student concerns rather than preventing problems, and that a proactive approach to helping students was an important end goal. The first author described the existing intervention, which consisted of thirteen 90-minute meetings. Participants unanimously stated that this format was not feasible for the school context and agreed to review the intervention to consider how to adapt its structure and content for better fit.
At the second meeting, the process of program co-design was the primary focus. The participants reviewed the previously tested version of Act & Adapt and made recommendations for content that could be cut or that should be retained. They recommended shortening much of the video-guided content, decreasing repetition (review sessions), and having fewer “school like” activities (e.g., completion of worksheets in the meetings). The participants also discussed the student population they most wanted to target, with consensus that sixth graders were most in need. They set guidelines for the optimal group size (3–6 students and two leaders), duration of meetings (30 minutes of content, 10–15 minutes for transitions), and frequency (once a week, with flexibility during testing and holidays). Finally, the recruitment and assessment schedule was discussed. Providers preferred a universal screening approach (vs. targeted referral) and selected a free, easily scored measure (the SDQ) for screening and outcome evaluation. One suggestion was to structure the intervention manual more like a classroom curriculum, by identifying the main content to be covered and making recommendations for activities, with room for flexibility and options. Providers suggested they have “decision points” where they could decide that a content area had been covered sufficiently and move to the next topic, or could continue to review the original content area.
In the last meeting, decisions from the prior meetings were reviewed. A prototype of the intervention was shared, with suggestions emerging to give options for the level of detail and scripting. Ultimately, a decision was made to provide “Key Content/Main Steps” in a format that could easily be scanned, with more details and activities provided in a separate column. One provider raised concerns about the cost of providing rewards for students, and the group brainstormed ideas for incentives that could be provided by schools at low or no cost or through existing programs. Another provider suggested that the intervention needed to address student non-compliance or disruptive behavior. Lastly, the group recommended that more content be made “optional,” with clearer opportunities to skip content depending on the group’s needs. The timeline for study procedures was reviewed, with providers recommending that they reach out to parents about student participation once those at risk were identified from the universal screening. The group agreed to review additional prototypes over the summer and reconvene for a training once the school year began. The program manual was finalized over the summer; the modifications reduced the manual from 162 pages to 104 and the video-guided content from 110 minutes to 60 minutes. The training was reduced from a day and a half to one day. All modifications made during the co-design process are summarized in Table 3.
Table 3.
Summary of Co-Design Adaptations made to Act & Adapt Intervention
| Adaptation | Example |
|---|---|
| Shortened meeting duration | • Length of weekly group meetings reduced from 90 minutes to 45 minutes, with 30 minutes of content and 10–15 minutes for transitions |
| Shortened video content | • Cut 50 minutes of video content that was perceived as redundant or extraneous (for example, a second video vignette of the problem-solving skill) |
| Shortened group leader manual content | • Increased brevity of phrasing and cut content related to review sessions resulting in 58 fewer pages in manual |
| Reorganized group leader manual | • Manual restructured to parallel academic curriculum with clearly defined “main content” section followed by suggested activities |
| Added flexibility | • Clearly defined “decision points” added to highlight group leader choice about when to move on to new content versus continue with current topic |
| Fewer prescribed components | • Activities labelled “optional” • Rewards system not fixed as in prior protocol |
| Recommendations for managing disruptive behavior | • Added content related to managing group behavior with an emphasis on addressing non-compliance or disruptive behavior in group |
| Participant Screening | • Used brief, free universal screener rather than multi-step clinical assessment |
| Provider Training Length | • Provider training decreased from 1.5 days to 1 day |
Qualitative Results
The two focus group meetings occurred after the completion of the co-design process and after the program had been fully implemented. All six providers participated in both focus groups. Although queried, providers did not make any recommendations for further adaptations to the content of the intervention during either focus group, but they did make recommendations regarding the timing of implementation. The qualitative analysis of the focus groups revealed two broad themes related to the co-design process: challenges and facilitative processes. Several sub-themes emerged within these overarching themes. Qualitative results are reported by theme and sub-theme and are summarized, along with exemplar quotes, in Table 4.
Table 4.
Qualitative Data Analysis for Focus Groups: Categories and Themes
| Category | % Participants who endorsed (N) | Exemplar response |
|---|---|---|
| 1. Challenges | 100.0% (6) | |
| 1.1. Competing work roles and responsibilities | 100.0% (6) | There were the constraints of, our kids here and whether I’m supposed to be testing . . . or whatever it is, we were spending some time on it, not crazy amounts, but there was a period or two that got spent on e-mailing discussing whatever, I could have, would have been helpful for other things. |
| 1.2. Unpredictable crises or school events | 50.0% (3) | I work in the middle school two days a week and elementary school three days a week. It’s supposed to be separate but they’ve been calling us from elementary school when there’s a crisis, which they say they try not to do. |
| 1.3. Academic calendar | 66.7% (4) | I wish we could have done this like the first week back to school when kids weren’t in the building and when my time was a little bit more flexible. |
| 2. Facilitators | 100.0% (6) | |
| 2.1. Opportunities for professional development | 83.3% (5) | [It was] something to maybe take into another venue whether it’s like private practice or something that I find that could be useful whether I ran a group privately or if it was just skills that I use in individual therapy with someone privately. |
| 2.2. Increase service capacity for students | 100.0% (6) | Selfishly it worked out for me because we basically took the kids who weren’t mandated, put them in this group, so they were getting services. |
Note. Text included in brackets has been included by the author for clarification or to replace identifying information.
Challenges
All six providers (100%) endorsed experiencing challenges related to participating in the co-design process and the intervention implementation. Within this theme, school providers noted three sub-themes: (1) competing work roles and responsibilities, (2) unpredictable crises or school events, and (3) the challenges of the academic calendar.
Competing Work Roles and Responsibilities.
All of the providers (100%) noted that competing work roles and responsibilities made their participation in the co-design process challenging, describing a number of demands that limited their time. For example, one provider explained, “We do all of the mandated counseling and at risk counseling for 8th grade. We do the whole high school articulation process, we do all of the crisis intervention through the 8th grade.” Another noted, “At the very beginning [of the year] we have to, we help with registration,” making this a challenging time to meet. The co-design process introduced additional work that sometimes conflicted with other tasks, as described by this provider:
There were the constraints of our kids were here and whether I’m supposed to be testing . . . or whatever it is, we were spending some time on it, not crazy amounts, but there was a period or two that got spent on e-mailing, discussing, whatever—I could have, would have been helpful for other things.
These demands sometimes tested their commitment to the project: “Yeah, like you want to do it and you see the need for it, but you have those other responsibilities.” One provider said, “I had to invest a bunch of time that I didn’t have to do it. I’m happy I did, so there’s no complaints, but like if you don’t have somebody who’s willing to do that then it’s not going to work.”
Unpredictable Crises or School Events.
Relatedly, three of the providers (50.0%) noted that unplanned clinical crises or events in the school day sometimes challenged their participation. Stated one provider, “My least favorite part is just when there’s like five crises happening at the same time and you can’t really give the intervention that the kids deserve.” Another provider noted that she was expected to be available to address unscheduled clinical issues: “Just random kids that walk in through our door looking to talk to someone or for help.” Said another:
I work in middle school two days a week and elementary school three days a week. It’s supposed to be separate but they’ve been calling us from elementary school when there’s a crisis, which they say they try not to do.
The Academic Calendar.
The challenges posed by the academic calendar were also noted by four providers (66.7%). Reflecting upon the timeline of the co-design, one provider said, “I wish we could have done this like the first week back to school when kids weren’t in the building and when my time was a little bit more flexible.” Another agreed that the timing of the co-design meetings was challenging: “Plus, I think we started more towards the end of the school year, right? Which is kind of very hectic anyway.” Another provider recommended using natural breaks in the academic calendar for parts of the co-design process: “I might be open to trying to do that over the summer. That might be something that I would rather have done then come back to be like, ‘oh that needs to be ready for the second day of school.’”
Facilitators
All of the providers (100%) also noted a number of features of the co-design process that fostered their participation. Two sub-themes related to this theme emerged: 1) opportunities for professional development and 2) increasing service capacity and skills.
Opportunities for Professional Development.
Most of the providers (83.3%) noted that they were motivated by opportunities for professional advancement and networking afforded by the co-design process. One provider stated, “For me it was just like an investment in this year, of that maybe we could learn some sort of therapeutic intervention that would be beneficial for me professionally whether it’s inside school or not.” Said another provider,
[It’s] something to maybe take into another venue whether it’s like private practice or something that I find that could be useful whether I ran a group privately or if it was just skills that I use in individual therapy with someone privately.
A different provider thought the experience might be helpful if she sought additional professional training, and said, “If personally, if I’m thinking about going for higher level work, it’s hard to find research opportunities when you’re not still in school and it’s hard to get picked up by Ph.D. programs without the research experience.” Another indicated a similar desire to list this partnership on her resume: “I think what would be helpful would be if we could have a sort of…that we’ve been through this training.”
Increasing Service Capacity and Skills.
All of the providers (100.0%) discussed that their participation was motivated by a desire to increase the school’s ability to meet the mental health needs of their students. One provider explicitly stated this goal: “Selfishly it worked out for me because we basically took the kids who weren’t mandated, put them in this group, so they were getting services and now I’m doing it with my mandated groups.” Said another, “This is going to help us pick up on some of those kids that we miss, that are not having crises.” Another provider noted that participation in the co-design process yielded increased mental health resources: “Some type of maybe educational resources to help us share with teachers how they can best work with kids if the depression is manifesting itself.” Providers also identified that they were driven to expand their own clinical skills in order to better serve students. Said one provider, “I wanna be more effective so that’s what I wanted to get from it.” Agreed another, “I was thinking of practical tools that we can use. Yeah because now there is such a variety of things you’re seeing.” Several providers described feeling encouraged by their expanded skillset, as one put it: “The goals came true; they came into fruition for us. We wanted to take a break from putting out fires and actually be in a room and you know do something therapeutic.”
Open Trial Outcomes
Provider Outcomes
Providers expressed overall satisfaction with the treatment approach at post-treatment (M = 65.75, SD = 3.20; Range: 63–69). On average, responses corresponded to “agree” on satisfaction items.
Student Outcomes
Students reported general satisfaction with the treatment program at post-treatment (M = 20.57, SD = 3.16; Range: 15–25), with average responses corresponding to “agree” with positive statements about the treatment. On average, students reported difficulties in the abnormal range at pre-treatment (M = 18.71, SD = 3.39), in the borderline range at post-treatment (M = 15.29, SD = 4.61), and in the normal range at one-year follow up (M = 13.78, SD = 5.02).
One-way repeated measures ANOVAs were conducted to compare the effect of participation in the group on self-reported total child difficulties, primary coping skills, and secondary coping skills at pre-treatment, post-treatment, and one-year follow-up. Results indicated a significant effect of participation in the group on total child difficulties as measured by the SDQ (refer to Table 5 for statistical results). Post hoc tests using the Bonferroni correction revealed that child difficulties reduced by an average of 3.43 units from pre-treatment to post-treatment (p = .022) and by an average of 4.93 units from pre-treatment to one-year follow-up (p = .002). With regard to risk classification on the SDQ, at pre-treatment 68.2% of students were in the borderline range of risk for mental health problems and 31.8% were in the clinically abnormal range of risk (Goodman, 1997), per student self-report. Immediately following treatment, 52.4% of students reported SDQ scores corresponding to the normal range, 28.6% remained in the borderline range, and 19.0% were in the abnormal range. At one-year follow-up, 71.4% of students reported SDQ scores that corresponded with the normal range of functioning, 9.5% with the borderline range, and 19.0% with the abnormal range.
Self-reported primary control skills increased significantly between time points (see Table 5), with post hoc tests using the Bonferroni correction indicating that coping skills increased by an average of 3.01 units from pre-treatment to one-year follow-up (p = .024). Students reported a similar, non-significant trend for secondary control skills across time points.
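For readers replicating these analyses, the sketch below shows one way to run a one-way repeated measures ANOVA with Bonferroni-corrected post hoc paired comparisons; the data frame, values, and column names are illustrative stand-ins, not study data.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Illustrative long-format data: one row per student per assessment.
# In the study, this would hold each student's SDQ total difficulties score.
df_long = pd.DataFrame({
    "student":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "time":      ["pre", "post", "followup"] * 4,
    "sdq_total": [19, 15, 13, 18, 16, 14, 20, 14, 12, 17, 15, 15],
})

# One-way repeated measures ANOVA across the three time points
anova = AnovaRM(data=df_long, depvar="sdq_total",
                subject="student", within=["time"]).fit()
print(anova.anova_table)

# Bonferroni-corrected post hoc paired t-tests
wide = df_long.pivot(index="student", columns="time", values="sdq_total")
pairs = [("pre", "post"), ("pre", "followup"), ("post", "followup")]
for a, b in pairs:
    t, p = stats.ttest_rel(wide[a], wide[b])
    print(f"{a} vs {b}: adjusted p = {min(p * len(pairs), 1.0):.3f}")
```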
Caregiver Outcomes
Caregivers were generally satisfied with the treatment their child received (M = 20.57, SD = 2.48; Range: 17–25), with average responses corresponding to “agree” with positive statements about the treatment. A repeated measures ANOVA was conducted to test the effect of group participation on parent-reported child difficulties and indicated a statistically significant effect (see Table 5). Post hoc tests using the Bonferroni correction revealed that child difficulties reduced by an average of 4.39 units from pre-treatment to one-year follow-up (p = .003). In terms of risk classification, at pre-treatment 36.4% of parents reported that their student was in the normal range on the SDQ, 31.8% reported the borderline range of risk, and 31.8% reported scores corresponding with the clinically abnormal range of risk. After treatment, 59.1% of parents reported SDQ scores for their students corresponding to the normal range, 13.6% reported scores within the borderline range, and 27.3% reported scores in the abnormal range. At one-year follow-up, 86.4% of parents reported SDQ scores within the normal range of functioning, 0.0% reported scores corresponding with the borderline range, and 13.6% reported scores in the abnormal range.
Benchmarking Comparisons
We compared these results to another trial of the video-guided, group-based version of PASCET for middle-school students (Eiraldi et al., 2016). Eiraldi and colleagues tested PASCET in a sample of 22 youths with sessions co-delivered by members of the research team and school providers. Diagnostic risk status was classified using the NIMH Diagnostic Interview Schedule for Children, Computer Version, 4th Edition (NIMH C-DISC-IV), which generated three levels of diagnostic severity for internalizing disorders: positive (has a diagnosis), intermediate (at risk), or negative (no disorder). Per self-report, from pre- to post-test, two youths (9.1%) increased their level of diagnostic severity, eight youths (36.4%) maintained the same level of diagnostic severity, and 12 youths (54.5%) improved (i.e., changed from a positive to an intermediate or no diagnosis, or from an intermediate to no diagnosis). For comparison, the current study sample was classified using the “abnormal,” “borderline,” and “normal” criteria for the SDQ. In the current sample, by self-report, three youths (14.3%) increased their risk status at post-treatment, six youths (28.6%) maintained the same risk status, and 12 youths (57.1%) improved (i.e., changed from abnormal to borderline or normal, or from borderline to normal).
Discussion
This study used a treatment co-design model to adapt a previously tested coping skills intervention and to test its acceptability and feasibility in two urban middle schools with local providers. Advancing the use of effective, sustainable treatments in schools is critical for improving access to services and quality of care for youth. Increasing the appropriateness of services to address depression would capitalize on natural patterns of service use, since students with mood disorders most commonly report receiving school-based, rather than clinic-based, services (Mendenhall, 2012). The strong evidence base for school-based depression interventions is somewhat qualified by the fact that most studies have relied on research team members to deliver interventions (Owens et al., 2014). The current study responds to recommendations for overcoming school implementation challenges by adapting the intervention to fit its intended context through a research-practice partnership and by supporting school providers in implementation (Cook et al., 2019; Eiraldi et al., 2015; Lyon & Bruns, 2019). Study aims included documenting the process of adaptation and evaluating the acceptability and feasibility of the intervention with providers and students. Qualitative data were embedded within the study to provide additional insight into the school providers’ experiences with the co-design process.
During the initial stages of our partnership, providers identified goals that were both aspirational (enhanced skill development, a proactive approach to managing mental health concerns) and practical (using the training to obtain needed professional development credits; finding a school-wide screening measure that was free and sustainable). As expected, a number of the changes recommended by the providers during the intervention co-design process were structural. Providers suggested modifications to improve the appropriateness of the intervention to the school context: decreasing the duration of meetings and making the manual more user-friendly and similar to other school materials (e.g., academic curricula). Based on their experiences with the student populations, they also requested a focus on managing disruptive behavior that had not previously been included. Providers further recommended increasing flexibility by introducing “decision points” where they could choose to continue a topic or move on to the next, depending on their judgment of student needs.
Following the co-design process, the program appeared acceptable for implementation within the school ecology. Providers, students, and caregivers reported high levels of satisfaction with the intervention, comparable to a treatment trial of a co-designed intervention for mental health treatment (Weisz et al., 2017). Satisfaction is a key metric of acceptability and is theorized to be an important aspect of treatment adoption and sustainment (Damschroder et al., 2009). Providers further reported that they were using aspects of the program with individual students (outside of the formal groups) and intended to run the groups again in the following academic year.
We used information about recruitment, retention, and effectiveness as metrics of the intervention’s feasibility. Ten percent of students screened reported elevated scores on the brief screening measure selected by the schools, and 47.8% of those were eligible and consented to participate in the intervention, similar to other school-based intervention studies using universal screening approaches (Sweeney et al., 2015; Young et al., 2019). Related to retention, all but one student completed the intervention. In terms of clinical benefit, students reported significant improvements, with group means moving from clinically elevated scores on the SDQ to borderline elevations by the end of treatment and continued benefit at the one-year follow-up assessment. Parent report of student difficulties also showed a significant change over time with a large effect size, though parents reported less initial clinical elevation. These findings may reflect research indicating that non-White children self-report more mental health symptoms than their parents report about them (Lau et al., 2004). Other school-based intervention trials have also found that youth report is particularly responsive to change (Fox & Masia Warner, 2017).
Students also reported significant increases in primary control coping strategies, with a large effect size, and a parallel non-significant trend for secondary control coping strategies, with a moderate effect size. It was encouraging that students continued to show improvements over time in their report of primary and secondary coping strategies. In previous research, improvements in coping skills appear to mediate the beneficial effects of interventions for at-risk youth and predict better mental health outcomes at follow-up (Compas et al., 2010; Tein et al., 2006). Our findings therefore suggest that the increases in coping following the current intervention may mitigate risk for these students.
Benchmarking comparisons with another school-based trial of the same intervention showed remarkably similar impact on youth functioning, with similar percentages showing changes in risk status, albeit with different outcome measures (clinical diagnosis versus a broadband measure of emotional and behavioral functioning). This is especially notable because the prior study (Eiraldi et al., 2016) used more rigorous clinician training (a day and a half versus a day in the current study), more rigorous screening for inclusion (diagnostic interviews by the research team), and members of the research team as intervention co-leaders.
Qualitative data from the provider focus groups also yielded information related to the feasibility of end-user co-design via a research-practice partnership. Barriers were consistent with those identified in other studies (Forman et al., 2009; Langley et al., 2010; McGoey et al., 2014): multiple roles of the providers, limited time, and the competing demands of the academic setting. Some of these barriers appear to be inherent to the school setting, while others might be addressed in future implementation efforts, for example, by utilizing the natural breaks in the academic calendar to plan trainings or other events. Providers reported that the barriers were balanced somewhat by the potential for professional development, expanded service capacity for their students, and opportunities to develop new clinical expertise. Because every new context presents unique characteristics, the co-design process described here may be a way not only to increase intervention fit but also to facilitate provider engagement, for Act & Adapt and for other interventions.
Limitations
This study attempted to address some well-documented challenges of introducing mental health interventions developed under rigorous research conditions into the complex ecology of the school setting. A critical purpose was to support school provider co-design of the existing intervention, and to test the acceptability and feasibility of this revised version of Act & Adapt as delivered by those same school providers. The student symptom and coping outcomes, though encouraging, should be viewed in the context of both a small sample size and an open-trial design, which limit statistical power and the interpretation of effects. Without a control group, it is impossible to rule out rival hypotheses that might explain the changes in outcomes (e.g., passage of time, expectancies, attention). Likewise, a comparison to the original version of Act & Adapt would be necessary to draw inferences about comparative effectiveness and implementability. Though the inclusion of provider perspectives in the co-design process is a strength of the study, we used a small convenience sample from only two schools and cannot be certain that their perspectives are generalizable. We attempted to minimize bias by using an interviewer who was not involved in the intervention adaptation or implementation process and by assuring participants that all responses would be de-identified, but it is possible that the providers felt pressured to make positive statements about the experience. Including other end-users in co-design (e.g., school leaders, student service recipients) might bolster the fit of the intervention in schools. We were also not able to collect diagnosis-specific symptom data due to school district policy that prohibited the collection of these data. However, given that schools may be more likely to use broad measures of student functioning than diagnostic scales, the use of a brief, free, and potentially sustainable measure may increase the relevance of these results for schools.
Future Directions and Implications
A useful next step to extend this research would be a randomized trial that provides a rigorous test of the adapted intervention and permits probing of potential moderators, including school climate. The organizational culture and climate of a workplace are critical factors in the implementation of interventions (Williams & Beidas, 2019) and were related to school provider fidelity in a recent trial (Williams et al., 2019). Strategies to improve school-based implementation may need to go beyond individual providers and address system-level factors.
The current study provides a blueprint of a co-design process that is often recommended for overcoming common implementation challenges in schools but has rarely been enacted (Eiraldi et al., 2015; Owens et al., 2014). Local school providers shared valuable insights about how Act & Adapt could be adapted to fit within their complex school contexts, and the results of the open trial of the adapted intervention were comparable to those of a trial of the same intervention with higher levels of researcher implementation support. Our findings highlight the utility of research-practice partnerships for co-designing interventions that are acceptable and feasible within the school setting, without any evidence of compromised clinical benefit. Such approaches may help overcome implementation challenges in schools like those previously noted for Act & Adapt (Polo et al., 2006), increasing the capacity to serve a diverse population of youth at risk for mental health problems. At the same time, the process of adaptation and implementation can be lengthy and effortful (Wolk et al., 2019), and the barriers and facilitators to participation identified in this study can inform future efforts to partner with school providers and implement effective mental health programs to the benefit of students.
Supplementary Material
Table 5.
One-Way Repeated Measures Analysis of Variance of Student Outcomes
| Outcome | Pre-Treatment M (SD) | Post-Treatment M (SD) | 1-Year FU M (SD) | F | p | η² |
|---|---|---|---|---|---|---|
| Child SDQ | 18.71a (3.39) | 15.29b (4.61) | 13.78bc (5.02) | 9.05 | .001 | .31 |
| Parent SDQ | 13.83 (4.79) | 12.06 (5.31) | 9.43 (5.41) | 6.39 | .005 | .23 |
| Primary Coping | 29.48a (5.23) | 30.57ab (4.44) | 32.49b (4.29) | 3.62 | .039 | .15 |
| Secondary Coping | 23.38 (4.26) | 23.71 (5.48) | 25.75 (5.61) | 1.77 | .186 | .08 |
Note. Post hoc comparisons using Bonferroni's correction were conducted when omnibus F ratios were statistically significant. Within rows, means with differing subscripts differ significantly at p < .05.
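For readers who want to see how the statistics in Table 5 fit together, below is a minimal sketch in Python (statsmodels and SciPy) of the same analytic sequence: an omnibus one-way repeated measures ANOVA followed by Bonferroni-corrected pairwise comparisons. This is not the authors' analysis code; the data are simulated, and the column names and long-format layout are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(seed=0)
n = 22  # students in the open trial

# Long-format data: one row per student per assessment point. "score" stands
# in for an outcome such as the child-report SDQ total difficulties score.
df = pd.DataFrame({
    "student": np.tile(np.arange(n), 3),
    "time": np.repeat(["pre", "post", "fu1"], n),
    "score": np.concatenate([
        rng.normal(18.7, 3.4, n),  # simulated values loosely echoing Table 5
        rng.normal(15.3, 4.6, n),
        rng.normal(13.8, 5.0, n),
    ]),
})

# Omnibus one-way repeated measures ANOVA across the three time points.
anova = AnovaRM(df, depvar="score", subject="student", within=["time"]).fit()
print(anova.anova_table)

# Partial eta squared can be recovered from the omnibus F and its degrees of
# freedom: eta_p^2 = F * df1 / (F * df1 + df2).
row = anova.anova_table.loc["time"]
f, df1, df2 = row["F Value"], row["Num DF"], row["Den DF"]
print(f"partial eta squared = {(f * df1) / (f * df1 + df2):.2f}")

# Bonferroni-corrected pairwise comparisons, conducted when the omnibus F is
# significant (mirroring the note to Table 5): three tests, so each p * 3.
pairs = [("pre", "post"), ("pre", "fu1"), ("post", "fu1")]
for a, b in pairs:
    x = df.loc[df["time"] == a, "score"].to_numpy()
    y = df.loc[df["time"] == b, "score"].to_numpy()
    t, p = stats.ttest_rel(x, y)
    p_bonf = min(p * len(pairs), 1.0)  # Bonferroni adjustment, capped at 1
    print(f"{a} vs. {b}: t({n - 1}) = {t:.2f}, adjusted p = {p_bonf:.3f}")
```

The Bonferroni correction referenced in the table note simply multiplies each pairwise p value by the number of comparisons (here, three), so the familywise error rate stays at the nominal .05 level.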
Acknowledgments
This research was supported by funding received by Sarah Kate Bearman from the National Institute of Mental Health (MH083887). We are most grateful to the students, teachers, and mental health providers at Wagner Middle School and Ebbets Field Middle School.
Footnotes
Alison Bellevue is now with CBT/DBT Associates, New York, NY.
References
- Addis ME, & Krasnow AD (2000). A national survey of practicing psychologists' attitudes toward psychotherapy treatment manuals. Journal of Consulting and Clinical Psychology, 68(2), 331.
- Adelman HS, & Taylor L (2010). Mental health in schools: Engaging learners, preventing problems, and improving schools. Corwin Press.
- Armstrong TD, & Costello EJ (2002). Community studies on adolescent substance use, abuse, or dependence and psychiatric comorbidity. Journal of Consulting and Clinical Psychology, 70(6), 1224.
- Auerbach C, & Silverstein LB (2003). Qualitative data: An introduction to coding and analysis. NYU Press.
- Bearman SK, Weisz JR, & Essau CA (2009). Primary and secondary control enhancement training (PASCET): Applying the deployment-focused model of treatment development and testing. In Treatments for adolescent depression: Theory and practice.
- Benas JS, McCarthy AE, Haimm CA, Huang M, Gallop R, & Young JF (2019). The depression prevention initiative: Impact on adolescent internalizing and externalizing symptoms in a randomized trial. Journal of Clinical Child and Adolescent Psychology, 48(Suppl 1), S57–S71.
- Brookman-Frazee L, Stahmer AC, Lewis K, Feder JD, & Reed S (2012). Building a research-community collaborative to improve community care for infants and toddlers at-risk for autism spectrum disorders. Journal of Community Psychology, 40(6), 715–734.
- Brookman-Frazee L, Stahmer A, Stadnick N, Chlebowski C, Herschell A, & Garland AF (2016). Characterizing the use of research-community partnerships in studies of evidence-based interventions in children's community services. Administration and Policy in Mental Health and Mental Health Services Research, 43(1), 93–104.
- Brunstein-Klomek A, Kopelman-Rubin D, Apter A, Argintaru H, & Mufson L (2017). A pilot feasibility study of interpersonal psychotherapy in adolescents diagnosed with specific learning disorders, attention deficit hyperactive disorder, or both with depression and/or anxiety symptoms (IPT-ALD). Journal of Psychotherapy Integration, 27(4), 526.
- Chorpita BF, & Daleiden EL (2009). Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology, 77(3), 566.
- Chorpita BF, Rotheram-Borus MJ, Daleiden EL, Bernstein A, Cromley T, Swendeman D, & Regan J (2011). The old solutions are the new problem: How do we better use what we already know about reducing the burden of mental illness? Perspectives on Psychological Science, 6(5), 493–497.
- Compas BE, Champion JE, Forehand R, Cole DA, Reeslund KL, Fear J, Hardcastle EJ, Keller G, Rakow A, Garai E, Merchant MJ, & Roberts L (2010). Coping and parenting: Mediators of 12-month outcomes of a family group cognitive-behavioral preventive intervention with families of depressed parents. Journal of Consulting and Clinical Psychology, 78(5), 623–634. 10.1037/a0020459
- Cook CR, Lyon AR, Locke J, Waltz T, & Powell BJ (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prevention Science.
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50.
- Eiraldi R, Power TJ, Schwartz BS, Keiffer JN, McCurdy BL, Mathen M, & Jawad AF (2016). Examining effectiveness of group cognitive-behavioral therapy for externalizing and internalizing disorders in urban schools. Behavior Modification, 40(4), 611–639.
- Eiraldi R, Wolk CB, Locke J, & Beidas R (2015). Clearing hurdles: The challenges of implementation of mental health evidence-based practices in under-resourced schools. Advances in School Mental Health Promotion, 8(3), 124–140.
- Evans SW, Green AL, & Serpell ZN (2005). Community participation in the treatment development process using community development teams. Journal of Clinical Child and Adolescent Psychology, 34(4), 765–771.
- Evans SW, Koch R, Brady C, Meszaros P, & Sadler J (2013). Community and school mental health professionals' knowledge and use of evidence-based substance use prevention programs. Administration and Policy in Mental Health and Mental Health Services Research, 40, 319–330.
- Field T, Diego M, Pelaez M, Deeds O, & Delgado J (2012). Depression and related problems in university students. College Student Journal, 46(1), 193–203.
- Forman SG, Olin SS, Hoagwood KE, Crowe M, & Saka N (2009). Evidence-based intervention in schools: Developers' views of implementation barriers and facilitators. School Mental Health: A Multidisciplinary Research and Practice Journal, 1(1), 26–36.
- Fox JK, & Masia Warner C (2017). Assessing clinical improvement in school-based treatment for social anxiety disorder: Agreement between adolescents, parents, and independent evaluators. Child Psychiatry & Human Development, 48(5), 721–727.
- Fraser D (2000). NVivo: Reference guide.
- Goodman R (1997). The Strengths and Difficulties Questionnaire: A research note. Journal of Child Psychology and Psychiatry, 38(5), 581–586.
- Goodman R (2001). Psychometric properties of the Strengths and Difficulties Questionnaire. Journal of the American Academy of Child & Adolescent Psychiatry, 40, 1337–1345.
- Goodyear-Smith F, Jackson C, & Greenhalgh T (2015). Co-design and implementation research: Challenges and solutions for ethics committees. BMC Medical Ethics, 16(1), 78.
- Green JG, McLaughlin KA, Alegría M, Costello EJ, Gruber MJ, Hoagwood K, Leaf PJ, Olin S, Sampson NA, & Kessler RC (2013). School mental health resources and adolescent mental health service use. Journal of the American Academy of Child & Adolescent Psychiatry, 52(5), 501–510.
- Hishinuma ES, Chang JY, McArdle JJ, & Hamagami F (2012). Potential causal relationship between depressive symptoms and academic achievement in the Hawaiian high schools health survey using contemporary longitudinal latent variable change models. Developmental Psychology, 48(5), 1327.
- Hoagwood KE, Olin S, Kerker BD, Kratochwill TR, Crowe M, & Saka N (2007). Empirically based school interventions targeted at academic and mental health functioning. Journal of Emotional and Behavioral Disorders, 15(2), 66–92.
- Jacob S, & Coustasse A (2008). School-based mental health: A de facto mental health system for children. Journal of Hospital Marketing & Public Relations, 18(2), 197–211.
- Jaycox LH, Stein BD, Paddock S, Miles JN, Chandra A, Meredith LS, Tanielian T, Hickey S, & Burnam MA (2009). Impact of teen depression on academic, social, and physical functioning. Pediatrics, 124(4), e596–e605.
- Kutash K, Duchnowski AJ, & Lynn N (2006). School-based mental health: An empirical guide for decision-makers.
- Langley AK, Nadeem E, Kataoka SH, Stein BD, & Jaycox LH (2010). Evidence-based mental health programs in schools: Barriers and facilitators of successful implementation. School Mental Health: A Multidisciplinary Research and Practice Journal, 2(3), 105–113.
- Lau AS, Garland AF, Yeh M, McCabe KM, Wood PA, & Hough RL (2004). Race/ethnicity and inter-informant agreement in assessing adolescent psychopathology. Journal of Emotional and Behavioral Disorders, 12(3), 145–156.
- Lyon AR, & Bruns EJ (2019). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11(1), 106–114.
- Lyon AR, & Koerner K (2016). User-centered design for psychosocial intervention development and implementation. Clinical Psychology: Science and Practice, 23(2), 180–200.
- Margison FR, Barkham M, Evans C, McGrath G, Clark JM, Audin K, & Connell J (2000). Measurement and psychotherapy: Evidence-based practice and practice-based evidence. The British Journal of Psychiatry, 177, 123–130.
- Maykut P, & Morehouse R (1994). Qualitative data analysis: Using the constant comparative method. In Beginning qualitative research: A philosophic and practical guide (pp. 126–149).
- Mayworm AM, Kelly BM, Duong MT, & Lyon AR (2020). Middle and high school student perspectives on digitally-delivered mental health assessments and measurement feedback systems. Administration and Policy in Mental Health and Mental Health Services Research, 1–14.
- McCarty CA, Violette HD, Duong MT, Cruz RA, & McCauley E (2013). A randomized trial of the positive thoughts and action program for depression among early adolescents. Journal of Clinical Child & Adolescent Psychology, 42(4), 554–563.
- McGoey KE, Rispoli KM, Venesky LG, Schaffner KF, McGuirk L, & Marshall S (2014). A preliminary investigation into teacher perceptions of the barriers to behavior intervention implementation. Journal of Applied School Psychology, 30(4), 375–390.
- Mendenhall AN (2012). Predictors of service utilization among youth diagnosed with mood disorders. Journal of Child and Family Studies, 21(4), 603–611.
- Metz A, & Bartley L (2017). Co-creating the conditions to sustain the use of research evidence in public child welfare. Child Welfare, 94, 115–139.
- Michael KD, George MW, Splett JW, Jameson JP, Sale R, Bode AA, & Weist MD (2016). Preliminary outcomes of a multi-site, school-based modular intervention for adolescents experiencing mood difficulties. Journal of Child and Family Studies, 25(6), 1903–1915.
- Minami T, Serlin RC, Wampold BE, Kircher JC, & Brown GJ (2008). Using clinical trials to benchmark effects produced in clinical practice. Quality and Quantity, 42(4), 513.
- Morris H, O'Connor A, Cummins J, Valentine C, Dwyer A, Goodyear M, & Skouteris H (2019). A pilot efficacy study of Parents Building Solutions: A universal parenting program using co-design and strength-based approaches. Children and Youth Services Review, 105, 104447.
- Mufson LH, Dorta KP, Olfson M, Weissman MM, & Hoagwood K (2004). Effectiveness research: Transporting interpersonal psychotherapy for depressed adolescents (IPT-A) from the lab to school-based health clinics. Clinical Child and Family Psychology Review, 7(4), 251–261.
- Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, & Wagner M (2014). Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health, 6(2), 99–111.
- Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, & Landsverk J (2011). Mixed method designs in implementation research. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 44–53.
- Polo AJ, Bearman SK, Short KH, Ho A, & Weisz JR (2006). Strengthening school-research collaborations while developing effective youth depression programs. Emotional & Behavioral Disorders in Youth, 6, 27–46.
- Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, & Hensley M (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76.
- Resnicow K, Baranowski T, Ahluwalia JS, & Braithwaite RL (1999). Cultural sensitivity in public health: Defined and demystified. Ethnicity & Disease, 9(1), 10–21.
- Rohde P, Lewinsohn PM, & Seeley JR (1991). Comorbidity of unipolar depression: II. Comorbidity with other mental disorders in adolescents and adults. Journal of Abnormal Psychology, 100(2), 214.
- Rubin DB (1987). Multiple imputation for nonresponse in surveys. New York: John Wiley & Sons.
- Stahmer AC, Brookman-Frazee L, Lee E, Searcy MK, & Reed MS (2011). Parent and multidisciplinary provider perspectives on earliest intervention for children at risk for autism spectrum disorders. Infants and Young Children, 24(4), 344.
- Sweeney C, Masia Warner C, Brice C, Stewart C, Ryan J, Loeb KL, & McGrath RE (2015). Identification of social anxiety in schools: The utility of a two-step screening process. Contemporary School Psychology, 19(4), 268–275.
- Tein JY, Sandler IN, Ayers TS, & Wolchik SA (2006). Mediation of the effects of the Family Bereavement Program on mental health problems of bereaved children and adolescents. Prevention Science, 7(2), 179–195.
- Tompson MC, Sugar CA, Langer DA, & Asarnow JR (2017). A randomized clinical trial comparing family-focused treatment and individual supportive therapy for depression in childhood and early adolescence. Journal of the American Academy of Child & Adolescent Psychiatry, 56(6), 515–523.
- Voorberg WH, Bekkers VJJM, & Tummers LG (2014). A systematic review of co-creation and co-production: Embarking on the social innovation journey. Public Management Review. 10.1080/14719037.2014.930505
- Weist MD, Evans SW, & Lever NA (2003). Advancing mental health practice and research in schools. In Weist MD, Evans SW, & Lever NA (Eds.), Handbook of school mental health: Advancing practice and research (pp. 1–7). New York: Kluwer Academic/Plenum.
- Weist MD, Hoover S, Lever N, Youngstrom EA, George M, McDaniel HL, Fowler J, Bode A, Bradley WJ, Taylor LK, & Chappelle L (2019). Testing a package of evidence-based practices in school mental health. School Mental Health, 1–15.
- Weisz JR, Bearman SK, Santucci LC, & Jensen-Doss A (2017). Initial test of a principle-guided approach to transdiagnostic psychotherapy with children and adolescents. Journal of Clinical Child & Adolescent Psychology, 46(1), 44–58.
- Weisz JR, Francis SE, & Bearman SK (2010). Assessing secondary control and its association with youth depression symptoms. Journal of Abnormal Child Psychology, 38(7), 883–893.
- Weisz JR, Ng MY, & Bearman SK (2014). Odd couple? Reenvisioning the relation between science and practice in the dissemination-implementation era. Clinical Psychological Science, 2(1), 58–74.
- Weisz JR, Southam-Gerow MA, Gordis EB, Connor-Smith JK, Chu BC, Langer DA, McLeod BD, Jensen-Doss A, Updegraff A, & Weiss B (2009). Cognitive-behavioral therapy versus usual clinical care for youth depression: An initial test of transportability to community clinics and clinicians. Journal of Consulting and Clinical Psychology, 77(3), 383.
- Weisz JR, Southam-Gerow MA, & McCarty CA (2001). Control-related beliefs and depressive symptoms in clinic-referred children and adolescents: Developmental differences and model specificity. Journal of Abnormal Psychology, 110(1), 97.
- Weisz JR, Southam-Gerow MA, & Sweeney L (1998). The perceived control scale for children. Los Angeles: University of California, Los Angeles.
- Weisz JR, Thurber CA, Sweeney L, Proffitt VD, & LeGagnoux GL (1997). Brief treatment of mild-to-moderate child depression using primary and secondary control enhancement training. Journal of Consulting and Clinical Psychology, 65(4), 703.
- Weisz JR, Ugueto AM, Cheron DM, & Herren J (2013). Evidence-based youth psychotherapy in the mental health ecosystem. Journal of Clinical Child and Adolescent Psychology, 42(2), 274–286.
- Werner-Seidler A, Perry Y, Calear AL, Newby JM, & Christensen H (2017). School-based depression and anxiety prevention programs for young people: A systematic review and meta-analysis. Clinical Psychology Review, 51, 30–47.
- Williams NJ, & Beidas RS (2019). Annual research review: The state of implementation science in child psychology and psychiatry: A review and suggestions to advance the field. Journal of Child Psychology and Psychiatry, 60(4), 430–450.
- Williams NJ, Frank HE, Frederick L, Beidas RS, Mandell DS, Aarons GA, Green P, & Locke J (2019). Organizational culture and climate profiles: Relationships with fidelity to three evidence-based practices for autism in elementary schools. Implementation Science, 14(1), 15.
- Wolk CB, Stewart RE, Eiraldi R, Cronholm P, Salas E, & Mandell DS (2019). The implementation of a team training intervention for school mental health: Lessons learned. Psychotherapy, 56(1), 83–90. 10.1037/pst0000179
- Wood JJ, Lynne-Landsman SD, Langer DA, Wood PA, Clark SL, Eddy JM, & Ialongo N (2012). School attendance problems and youth psychopathology: Structural cross-lagged regression models in three longitudinal data sets. Child Development, 83(1), 351–366.
- Young JF, Jones JD, Sbrilli MB, Benas JS, Spiro CN, Haimm CA, Gallop R, Mufson L, & Gillham JE (2019). Long-term effects from a school-based trial comparing Interpersonal Psychotherapy-Adolescent Skills Training to group counseling. Journal of Clinical Child & Adolescent Psychology, 48(Suppl 1), S362–S370.