Abstract
Implementation science has grown considerably in school mental health over the past decade, driven by the persistent gaps between knowledge about “what works” to promote student well-being and what is typically delivered in schools. In 2014, Owens and colleagues outlined an implementation research agenda for the field of school mental health, emphasizing professional development and coaching, evidence-based practice fidelity, and sustainment. This paper provides a 10-year progress review across these domains and articulates an updated research agenda focused on: (1) pragmatic implementation strategies, (2) implementation mechanisms explaining the connection between implementation strategies and outcomes, (3) strategic intervention redesign to promote implementation, and (4) de-implementation of low-value practices. We propose critical research questions in these areas and conclude by highlighting contemporary study designs and methods (i.e., hybrid trial designs; sequential, multiple assignment, randomized trials; rapid qualitative approaches; economic analyses) that can be leveraged to investigate implementation research questions in school mental health over the next decade. In doing so, we hope to promote shared understanding of evolving implementation research priorities and opportunities to promote equitable, pragmatic, and sustainable implementation efforts in school mental health and drive synergy across projects.
Keywords: Implementation research design, School mental health, Implementation strategies, Implementation mechanisms, Intervention redesign, De-implementation
Introduction
For decades, researchers have focused on “what works” to address student mental health concerns, resulting in an array of evidence-based practices (EBP) to be delivered by school-based professionals (Evans et al., 2023; Fabiano & Evans, 2019). However, methods to ensure use of evidence in authentic educational contexts have been slower to develop, leading to underutilization of these practices (Lyon & Bruns, 2019a). Limitations in the reach, effectiveness, and equity of our school mental health service systems have been made especially apparent by the growing children’s mental health crisis, exacerbated by the onset of the COVID-19 pandemic (Bussières et al., 2021). Efforts to close the research-to-practice gap in school mental health can be conceptualized through the lens of implementation science—the study of methods to promote the uptake of research findings and EBP into routine service delivery (Eccles & Mittman, 2006). In 2014, driven by a growing wave of implementation-explicit work across healthcare and other sectors, Owens et al. (2014) articulated a research agenda intended to launch school mental health into the choppy “high seas” of implementation. Ten years later, we hope to highlight progress that has been made relevant to the research questions posed at that time and offer an updated perspective with more precise “charting” of a research agenda to help steer the field of school mental health on its continuing implementation voyage.
Implementation science is founded on repeated observations that EBP are inconsistently adopted, delivered with fidelity, and/or sustained over time in "real-world" delivery settings. In school mental health, substantial research has documented this phenomenon across the prevention and intervention spectrum, as well as the barriers and facilitators (i.e., implementation determinants) that drive it (e.g., Connors et al., 2021; Langley et al., 2010; Locke et al., 2019; Stahmer et al., 2015). For conceptual clarity, it is important to distinguish implementation from dissemination, or the targeted distribution of information to a specific audience (NIH, 2019). Although interest in dissemination science and the identification of dissemination strategies (i.e., intentional methods to communicate strategically about an EBP to specific partners to alter antecedents of behavior change) is growing, it has received far less attention than implementation across sectors and will not be an explicit focus of the current paper. Readers are referred to Baker et al. (2021) for a comprehensive review of dissemination science as it relates to school mental health. Further, as the field of implementation has continued to evolve and expand, distinct and complementary research and practice arms have emerged (Lyon et al., 2020a). A critically important counterpart to implementation research is implementation practice, which involves conducting applied work to improve research evidence use and service quality rather than producing generalizable knowledge about implementation (Franks & Bory, 2015). While this paper is focused explicitly on implementation research, the research agenda presented prioritizes questions with relevance for implementation practitioners.
School-based research has become increasingly oriented toward addressing implementation gaps, especially surrounding the provision of social, emotional, and behavioral supports and services. Core "building blocks" of implementation research include the identification and prioritization of implementation determinants, selection and testing of implementation strategies (i.e., active methods and techniques that address determinants to improve implementation), and measurement of implementation outcomes (e.g., adoption, fidelity, sustainment; Lyon & Bruns, 2019a). Further, because access to and utilization of school mental health services equal or exceed those of any other child-serving sector (Duong et al., 2020)—especially for marginalized and minoritized populations (e.g., Kataoka et al., 2011; Lyon et al., 2013; Wilk et al., 2022)—ensuring high-quality services also becomes an ethical and social justice imperative. Although schools improve access to services, there remain numerous avenues through which school mental health services still experience (and can perpetuate) inequities (Cipriano et al., 2022; Liu et al., 2022b). By considering both accessibility and quality, implementation science provides one prominent pathway to achieve the promise of school mental health. The need to explicitly center equity is also inherent in both school mental health (due to its population focus and orientation toward an accessible public sector service delivery system; Weist et al., 2023) and implementation science (due to its emphasis on contextually appropriate improvements in service quality; Baumann et al., 2023).
Mirroring the broader implementation field (Adsul et al., 2024; Shelton & Brownson, 2023), studies at the intersection of school mental health and implementation science have begun to explicitly incorporate equity through pathways, such as the Adapting Strategies to Promote Implementation Reach and Equity (ASPIRE) framework (Gaias et al., 2022) and the emergence of equity-explicit implementation strategies that target barriers like implicit bias (Liu et al., 2022a; Owens et al., 2024).
Ten years out from Owens and colleagues’ (2014) research agenda, school mental health research and practice is situated within a social and policy context shaped by the COVID-19 pandemic and a contentious sociopolitical climate. Unmet youth mental health needs are rising, particularly for young people who experience marginalization related to racism, poverty, and/or discrimination based on LGBTQ+ status (Murthy, 2022; Ormiston & Williams, 2022). There is a growing recognition that schools are a key setting to help address this unmet need (IES, 2023); at the same time, school systems and personnel are experiencing tremendous stress, disruption, and staff turnover (Steiner & Woo, 2021), which pose heightened implementation challenges. Recent investments in schools, including federal COVID-19 relief funding, have offered opportunities to expand and improve the reach of comprehensive school mental health services (e.g., via enhanced EBP and school–community partnerships), but concerns remain about continued resources, implementation, and sustainment (Hoover, 2024).
Situated within the context of the advancing field and the current sociopolitical climate, the purpose of the current paper is to (1) review progress on the research questions originally outlined by Owens et al. (2014) and (2) present an updated agenda for implementation research in school mental health that reflects these advances. We will not provide broad definitions or overviews of implementation concepts, as those were addressed in Owens et al. (2014) and can also be found elsewhere (cf. Forman et al., 2013; Lyon & Bruns, 2019a; Sanetti & Collier-Meek, 2019). The updated agenda focuses on four domains: (1) pragmatic implementation strategies, (2) implementation mechanisms, (3) strategic intervention redesign for implementation, and (4) de-implementation of low-value practices. We conclude by featuring example research designs and methods that can be applied to the four domains and illustrate how each design or method could link to a proposed implementation research question.
Review of Progress
The primary goal of Owens et al. (2014) was to present a blueprint with relevance to funders, research–practice partnerships, and researchers. In doing so, the authors first described an array of school-specific contextual issues relevant to implementation (e.g., school calendars, multiple types of relevant professionals). Then, they outlined a research agenda—and specific research questions—surrounding three core implementation components: (1) professional development (PD) and coaching, (2) EBP integrity/fidelity, and (3) EBP sustainment. We provide an update on progress for each research question posed within each domain, with particular attention to PD and coaching strategies given their prominence in school mental health research and notable strides since 2014. In identifying updated literature, we (1) reviewed papers cited in the original Owens et al. (2014) paper and searched for updated works from those research teams, (2) conducted forward searches on the key articles cited in the relevant sections of the 2014 paper, and (3) gathered a group of established experts in school mental health and implementation science to identify rigorously conducted and well-cited works. We scoped our search to focus on PreK-12 school mental health promotion, prevention and intervention, including studies involving teachers, administrators, school mental health providers, and other professionals involved in supporting student mental health and social–emotional well-being.
Professional Development (PD) and Coaching
Among the three areas outlined in the 2014 research agenda, we identified notable strides over the past decade, particularly in the domain of PD (Admiraal et al., 2019; Stormont et al., 2015). Here we highlight selected work to illustrate what has been learned about the questions posed ten years ago. As noted in the prior paper, effective PD extends beyond onetime training to include ongoing consultation, coaching, and other related implementation strategies (Cook et al., 2019) that target practice change at the level of the school-based implementer (e.g., teacher, school clinician). The initial training component of PD, or "conducting educational meetings," includes meetings targeted toward different groups of professionals (e.g., teachers, principals, central administrators, other organizational leaders, and community and family members) to teach them about the new practices (Cook et al., 2019; Lyon et al., 2019b). Onetime training continues to be associated with gains in knowledge and improved attitudes, but not necessarily with practice change in schools (Anderson et al., 2019) or other settings (Valenstein-Mah et al., 2020). Despite this evidence, we observe that educational meetings and onetime training remain the prevailing PD modality used in schools (Langreo, 2022; Wilkinson et al., 2020).
Beyond initial training, a robust school mental health implementation research base supports the value of ongoing, post-training consultation or coaching (Ennis et al., 2019; Erchul, 2023; Pas et al., 2023). Consultation or coaching can be provided by someone internal or external to the school system, focused on the implementers who actually provide the EBP (Cappella et al., 2015). Some coaching or consultation is more expert-driven and hierarchical, whereas other models are more collaborative and guided by the teacher's expertise and goals (Pas et al., 2023). The term "consultation" has been used more historically in the education literature, including consultation models dating back to the 1970s (e.g., Behavioral Consultation; Bergan, 1977). "Consultation" and "coaching" represent similar processes, and we use the more recent term, "coaching," hereafter (Erchul, 2023; Pas et al., 2023). Research since 2014 has evaluated coaching as an implementation strategy to promote educator practices including classroom management (e.g., Bradshaw et al., 2018; Fabiano et al., 2018; Owens et al., 2020; Reinke et al., 2008; Shernoff et al., 2018; Sutherland et al., 2015) and classroom climate (e.g., Allen et al., 2015; Reinke et al., 2008), as well as to promote school mental health clinicians' use of EBP (Lyon et al., 2022; Meyer et al., 2022). Progress made on the PD and coaching-related research questions from Owens et al. (2014) is reviewed below.
Question 1. How Much Coaching (Dose) is Needed Following Initial PD to Enable Providers to Deliver an EBP or Its Components with High Integrity and Produce Desired Student Outcomes?
Inherent in this question is the hypothesis that "more is better" with respect to coaching dosage. Evidence from the past decade provides mixed support for this hypothesis. An observational study of coaching dose within the coaching arm of a randomized controlled trial found that, among 192 preschool teachers implementing My Teaching Partner across two school years, higher coaching dosage was associated with higher teacher implementation quality of emotional support practices. This implementation outcome mediated the relation between coaching dosage and increases in student classroom engagement (Pianta et al., 2022). A second observational study of in-person classroom coaching with 210 teachers, focused on enhancing implementation of a Tier 1 mental health promotion program (PAX Good Behavior Game), found that more frequent coaching was associated with more frequent implementation but not with better implementation quality (Pas et al., 2015). However, these observational studies did not account for variables that may have contributed to higher doses of coaching, such as teacher preferences or needs, necessitating experimental manipulation of coaching dose. In this vein, Lyon et al. (2022) randomly assigned 75 school-based mental health clinicians to 2, 4, or 8 weeks of coaching following initial training in implementation of measurement-based care with Tier 3 interventions. Although some coaching was critical to implementation, they found no evidence of additional benefits from higher coaching dosages. These mixed results illustrate the importance of empirical studies to identify the impact of coaching dose, as well as the intervention, implementer, and contextual factors that may moderate the relation between coaching dose and outcomes.
Some factors that may impact that relation include the quality of the coaching relationship (e.g., coach–implementer fit; Johnson et al., 2016; Partee et al., 2022); implementer motivation, interest, and openness to receiving coaching; intervention complexity and usability; teacher experience and burnout (Holdaway & Owens, 2015); and teacher stress, workload demands, and other priorities aside from the intervention (Owens et al., 2018). Measurement of these factors would likely necessitate collection and comparison of multi-reporter data from coaches and implementers on the coaching experience and on multi-level contextual factors (such as from a determinants checklist adapted for schools) potentially related to implementation outcomes. For instance, receptivity to coaching and teacher motivation have been related to growth and change in teacher practices, respectively (Owens et al., 2017b, 2021). Some of this work is reflected in Question 2, as various implementer-level determinants may operate as mechanisms of implementation or intervention fidelity. Of note, findings to date on the relation between coaching dose and outcomes have come from research studies with voluntary implementer participation, so future work is warranted to understand this relation in naturalistic implementation efforts in schools outside the confines of a research study.
Question 2. What Strategies (or Combinations of Strategies) are Most Likely to Enhance Provider Knowledge, Skills and Beliefs, Intervention Integrity, and Student Outcomes?
A large body of evidence from the past decade supports modeling, practice, observation, and performance feedback as core components of coaching that predict implementation of academic, social, emotional, and behavioral interventions (e.g., Glover et al., 2023; Kraft & Blazar, 2018; Noell & Gansle, 2014). However, intervention fidelity varies considerably within study samples based on implementer and school context factors, motivating the more recent question of what coaching strategies work best for whom. Some research suggests that tailored consultation based on implementer needs and preferences is desirable to school mental health partners (Larson, 2022). In addition, there is emerging evidence supporting the effectiveness of tailored consultation. Namely, a randomized trial with 58 elementary school teachers found that, among teachers with lower baseline knowledge, skills, and intervention-supportive beliefs, those receiving an individually tailored, multi-component consultation package made greater gains in their use of effective classroom management strategies compared with those who received a standard problem-solving consultation package (Owens et al., 2017a). Yet teachers without these implementer-level barriers showed equal benefit across individualized and standardized consultation conditions and were satisfied with the consultation received. This finding suggests that individualization may be most important for those who experience implementation challenges. Further, a small multiple baseline study with four teachers who received consultation within a Response to Intervention (RtI) model used predetermined implementation decision rules for transitioning teachers between tiers of consultation support and found that teachers needed varying levels and types of consultation support to improve their implementation outcomes (LaBrot et al., 2020).
Together, these findings point toward future opportunities to study the effectiveness and scalability of tailored or multi-tiered levels of coaching strategies to suit individual-level determinants.
In summary, research in the past decade has advanced our understanding of how implementer knowledge, skills, beliefs, and preferences influence the effectiveness of coaching dosage and strategies. Although tailored coaching based on individual implementer needs and preferences has been recommended by school mental health stakeholders (Larson, 2022), questions remain about how coaching should be tailored, when, and for whom to achieve optimal implementation outcomes. Although used frequently in school mental health implementation research (Baffsky et al., 2023), coaching is often tested as a packaged implementation strategy. Thus, future research is needed to summarize, distill, and evaluate coaching components (e.g., goal setting, action planning, modeling, observation, performance feedback, role play) in isolation and/or in combination.
Question 3. What Models of Coaching (External vs. Internal Coach) are Most Likely to be Adopted and Best Promote the Skills Taught Within PD?
The differential effectiveness of internal (e.g., district-employed) versus external (e.g., EBP purveyor) coaching is still a nascent area in school mental health implementation research. Trade-offs among costs and required funding sources, coach capacity or bandwidth, and diversity of training and expertise have repeatedly been noted as important considerations for school systems (Flannery et al., 2018; Giordano et al., 2020). Evidence about effective educator PD and coaching models for academic instruction more broadly is a useful guidepost as we develop evidence on school mental health implementation coaching. Specifically, a multi-site qualitative study of secondary schools found that external coaches were most effective at providing training and influencing early adoption, whereas internal coaches provided more in vivo classroom modeling, problem-solving, and feedback for continued implementation (Thompson, 2022). A mixed methods study of internal and external coaches supporting 63 teachers on professional learning teams in Belgium found more positive teacher satisfaction with internal coaches yet better team effectiveness with external coaches (Compen & Schelfhout, 2020). More specific to school mental health, Giordano and colleagues (2020) found that, among 15 Head Start teachers receiving coaching for social–emotional classroom practices, those with internal peer coaches demonstrated better intervention fidelity and student social skills compared with those with external coaches. External district- and internal building-level coaches are commonly used in positive behavioral interventions and supports (PBIS; Amaya & Amundson, 2024) and in some comprehensive school behavioral health models in the field (e.g., Battal et al., 2020; Marx et al., 2020).
Overall, this is still an emerging area in school mental health implementation research; recent studies of school-based coaching more broadly point to strengths and limitations of both external and internal coaching models, and findings on their differential effectiveness remain mixed. It may also be the case that internal and external coaches can work together in tiered coaching models to build internal coaches' capacity for longer-term workforce sustainability within the school system (Frazier et al., 2019; Ouellette et al., 2024). Although little research has explicitly attended to equity in coaching, it will be important to explore practical systems issues, such as how to ensure that under-resourced school systems can access effective, tailored, and multi-level (as needed) internal and external coaching models. Finally, beyond examining models of coaching, little research to date has empirically examined which components of coaching (internal or external) are essential to effectiveness.
Question 4. To What Extent are Implementer Motivation and Perceptions Malleable, and Does Change in These Factors Facilitate Implementer Skill Development?
Implementer motivation, perceptions, and engagement in PD and coaching for new initiatives have continued to be an important area of focus (Holmes et al., 2021), alongside growing interest in the effects of teacher stress and burnout on classroom practice and on teacher and student outcomes (Herman et al., 2020). Emerging research on the Beliefs and Attitudes for Successful Implementation in Schools (BASIS; Lyon et al., 2019a) implementation strategy, designed to target implementer motivation and enhance the effects of training and coaching, points to the potential malleability of implementer perceptions and beliefs. BASIS is a pragmatic and theoretically driven strategy that applies strategic education, social influence techniques, motivational interviewing, action planning, and problem-solving planning to improve EBP uptake. Clinician-focused and teacher-focused versions have been developed and tested. BASIS has demonstrated promise in enhancing participants' attitudes, subjective norms, self-efficacy, and EBP adoption (Cook et al., 2015; Larson et al., 2021; Lyon et al., 2019a; Merle et al., 2023), and additional large-scale RCT work is ongoing (Lyon et al., 2021, 2024). Additional evidence is accumulating on the use of motivational interviewing techniques within coaching models to improve teacher self-efficacy in classroom behavior management interventions, reduce work-related stress (Bradshaw et al., 2018; Lee et al., 2019), and improve implementation (Owens et al., 2021). Overall, these advancements suggest that implementer motivation and perceptions may be malleable and point to the promise of targeting these characteristics to enhance implementation. However, enhancing implementer motivation is still an early and open area of inquiry, with motivational interviewing approaches as the predominant method to date.
EBP Fidelity
The 2014 research agenda also outlined three critical research questions related to the multiple dimensions of intervention integrity, including adherence and competence (questions are restated below using Owens et al.'s original terminology but discussed in this paper using the term "fidelity"). Addressing these questions requires measuring multiple dimensions of intervention fidelity, particularly within naturalistic trials in which school personnel implement interventions.
Question 1. What Level of Integrity (Adherence and Competence) is Good Enough to Produce the Intended Student Outcomes?
Several studies in the past decade have examined the relation between intervention fidelity and student outcomes. Although a number of studies have found positive relations between aspects of fidelity (e.g., fidelity to school mental health models; fidelity to aspects of schoolwide positive behavioral interventions and supports) and student outcomes (e.g., Barclay et al., 2022; Reinke et al., 2021), other findings have been more mixed (see Rojas-Andrade & Bahamodes, 2018 for a review). Regarding the specific research question posed by Owens et al. (2014), there has been less work specifically examining benchmarks for "good enough" levels of fidelity that might be needed to produce intended student outcomes. The few studies that have examined fidelity benchmarks have focused on teacher fidelity to behavioral classroom practices. Using weekly observation data, Owens et al. (2018) found that disruptive student behavior was significantly lower in classrooms where teachers responded effectively to a minimum threshold of disruptive behavior (51% of rule violations). Similarly, teachers who reached a minimum benchmark of responding effectively to at least 51% of rule violations saw progressively lower rates of challenging behavior classwide and among students with or at risk for ADHD over time (Owens et al., 2020). In contrast, disruptive behavior worsened over time in classrooms where teachers did not meet this threshold. Although additional benefits were observed when teachers responded effectively to even higher proportions of rule violations (e.g., 60%, 70%, 80%), these data provide preliminary evidence for a "good enough" level of fidelity to support desired student outcomes within the context of teacher-delivered behavior management approaches.
Additional research with teacher and school mental health professionals is needed to identify minimum thresholds for fidelity that produce desired outcomes, as such minimum thresholds may be more achievable and feasible to sustain than thresholds requiring near perfection (e.g., 80–100%). Additionally, given the mixed results relating fidelity to student outcomes (i.e., only about 40% of the studies reviewed by Rojas-Andrade & Bahamodes (2018) observed an association between components of fidelity and outcomes), it is also important to identify factors that may moderate the relationship between fidelity and outcomes.
Question 2. Are the Multiple Dimensions of Integrity Differentially Predictive of Student Outcomes?
The past decade has seen progress as well as a number of challenges related to addressing this question. As emphasized by the special series in this journal, Advancing the Science of Integrity Measurement in School Mental Health Research, many researchers have led innovative and important work regarding the measurement of the multiple dimensions of fidelity, but this literature is still limited (Sutherland & McLeod, 2022). There has been an increase in the number of school-based intervention studies assessing at least one dimension of fidelity in the past decade; however, only a minority of studies from this time period reported on dimensions other than adherence (Sanetti et al., 2020). Developing and administering measures of the multiple dimensions of fidelity that are feasible, reliable, valid, and appropriate to the stage of the intervention development pipeline is a necessary step to advance the literature about the relations between fidelity dimensions and student outcomes (Sutherland & McLeod, 2022).
Studies that have tested the relations between multiple dimensions of fidelity and student outcomes have found mixed results. In a trial with school-based providers, Husabo et al. (2021) examined therapist adherence and competence in two school-based CBT programs and found no relationship between either dimension of fidelity and student outcomes. Sutherland et al. (2018) found that teacher competence, but not teacher adherence, partially mediated the effect of a teacher coaching model on student outcomes (i.e., externalizing problems). Other recent studies examining the relationship between single dimensions of fidelity and outcomes have found positive relationships between student outcomes and treatment dosage (DuPaul et al., 2024; Girio-Herrera et al., 2021) as well as session attendance (Margherio et al., 2023). Additionally, Pas et al. (2022) found mixed relationships between latent profiles of fidelity (using indicators of adherence, dosage, quality, and participant responsiveness) to a teacher coaching model and student outcomes. Taken together, the past decade of school mental health research has produced some evidence that multiple dimensions of fidelity may differentially predict student outcomes, but the primary conclusion from the current body of work is that additional measurement development work may be needed before researchers can fully answer this question.
Question 3. What Accountability Structures (e.g., Administration or Peer Networks) and/or Incentive Programs (Used in Combination with PD) Reduce Variability in Integrity?
There has also been increasing interest in the role of accountability structures and incentive programs in supporting treatment fidelity, although we are not aware of published studies that directly address this question. McLeod et al. (2022) wrote about the promise of "learning school systems," which use an efficient, data-driven process to collect data and provide feedback about treatment fidelity and/or student outcomes to support the continuous improvement of EBP implementation. Learning school systems—like learning healthcare systems—can be realized through plan–do–study–act (PDSA) cycles, a rapid learning approach to (a) plan a small test of change based on a hypothesis about why a problem exists, (b) enact the change, (c) observe the result, and (d) evaluate and monitor whether the plan worked (Taylor et al., 2014). Several recent papers have described the use of PDSA cycles to support quality improvement via targeted implementation efforts in school mental health systems, finding promising results in terms of quality improvement outcomes (Bohnenkamp et al., 2023; Connors et al., 2020, 2022c; Heatly et al., 2023). These processes can be used by school teams to collect, review, and make quality improvement and implementation decisions based on fidelity data (McLeod et al., 2022). Nevertheless, although PDSAs have been associated with systemwide quality improvement and implementation outcomes in school mental health (Bohnenkamp et al., 2023; Connors et al., 2022c), we are not aware of studies directly testing the impact of PDSA cycles on reducing variability in fidelity in schools.
Furthermore, although incentive-based implementation strategies have emerged in community mental health (e.g., Beidas et al., 2017), we are unaware of published studies in the past decade that test incentive programs to support intervention fidelity in schools. The school context may have unique considerations related to the acceptability and feasibility of incentives (Lyon et al., 2018; Thayer et al., 2022). It is important for research–practice partnerships to consider these contextual factors in order to advance work on accountability structures and incentive programs in schools.
EBP Sustainment
Sustainment has been broadly conceptualized as the continuation of effective practices and programs after the initial implementation stage or after program funding or external support has ended (Stirman et al., 2012). This includes studies of (1) the extent to which core program elements continue to be implemented with sufficient fidelity or intensity, (2) the extent to which program outcomes and benefits are maintained, (3) the impact, extent, or characteristics of modifications made to the program, and (4) the capacity to continue to use the program (Stirman et al., 2012). The 2014 research agenda outlined two core research questions related to sustainment.
Question 1. What Does it Mean for an EBP to be Sustained in SMH? What are the Critical Dimensions of Sustainment and How are They Best Measured?
There has been notable growth in research examining different definitions of sustainment in school mental health, depending on whether the intervention was sustained in whole or in part, the length of follow-up, and whether sustainment was assessed within or outside the context of grant-funded research. For instance, several recent studies have examined the extent to which EBPs were sustained following the end of research trials or implementation efforts. Measuring partial or full program continuation, these studies have typically found moderate to low levels of sustainment at the district, school, and individual implementer levels. A survey study of 77 evidence-based prevention programs funded through a statewide seed grant initiative across schools and community organizations found that about 69% of programs were continuing in some capacity two years or more after initial funding. However, 60% of these programs reported significantly scaling back their efforts (e.g., fewer schools, fewer people served; Cooper et al., 2015). In another study, conducted two years after an effectiveness trial of an intervention for behaviorally at-risk students in 48 elementary schools across five districts, only one of the districts continued to implement the program (Woodbridge et al., 2014). Another post-trial study of a peer victimization and bullying prevention program in eight schools found that five schools reported full implementation and three reported that they sustained the program only in part (Leadbeater et al., 2015). In contrast, a study of a trauma-informed universal mental health intervention found that no schools sustained the program at one- to two-year follow-up (Arnold et al., 2021).
Other sustainment studies have focused on the extensiveness and quality of practice implementation by individual teachers involved in research trials. For instance, in a study of MOSAIC—a classroom intervention with universal and targeted supports to facilitate positive peer relationships and academic success—Kassab et al. (2023) compared observed usage of MOSAIC practices during the follow-up year to the trial year and found that teachers showed good sustainment of 75% of strategies (operationalized as observed usage of 80% or higher), with greater use of some strategies in the follow-up year than in the randomized trial year. In another study of the sustainment of the Research-based Developmentally Informed (REDI) project among 37 preschool teachers, Bierman et al. (2013) observed no significant changes in the observed implementation quality of the Promoting Alternative Thinking Strategies (PATHS) social–emotional learning curriculum at a 1-year follow-up.
There are fewer studies of sustainment as part of implementation efforts that occurred outside of research trials. Findings from a survival analysis study of a statewide effort to implement evidence-based mental health interventions with community clinicians suggest that school-specific EBP may be particularly vulnerable to discontinuation compared to EBP in other sectors (Brookman-Frazee et al., 2018). Another study of a national dissemination effort of the LifeSkills Training (LST) middle school program in 158 districts found that 51% of districts sustained the program in at least one participating school 2 years after grant-supported implementation; of the 419 schools across these districts, 30% reported implementing LST at the same level or higher, 11% at a reduced level, and 59% had discontinued use (Combs et al., 2023). One study of a non-grant-funded, school district-led effort to implement a small group trauma intervention found that once a district-level mandate to implement the program was lifted, about half the schools stopped implementing the program (Nadeem & Ringle, 2016).
Relatively less progress has been made in operationally defining and measuring sustainment in school mental health. Across the sustainment studies reported above, most used semi-structured interview data or surveys to identify the number of schools, classrooms, or teachers using a specific practice (e.g., Combs et al., 2023; Nadeem & Ringle, 2016). One study used health claims data to assess implementation by clinicians employed by mental health agencies (Brookman-Frazee et al., 2018). To identify practice-level sustainment, some studies have used observational measures focused on the number of strategies or quality of implementation (e.g., Bierman et al., 2013; Kassab et al., 2023) or self-report checklists (e.g., Kassab et al., 2023). Given the limited advancement in clearly measuring sustainment in school mental health, some direction may be taken from the broader implementation science literature, where sustainment measurement is more robust (see Moulin et al., 2020, for a review of sustainment measures).
Question 2. What are the Essential Ingredients (e.g., Coaching, Integrity, Adaptation, Student Outcomes, Contextual Characteristics) for Successful Sustainment (Predictors/Processes)?
Recent research points to factors at multiple levels (e.g., intervention characteristics, implementer perceptions, initial level of implementation support provided, school characteristics) that influence successful sustainment. One emergent theme is that school personnel appear to sustain practices that are easier to use and that they perceive to be effective or useful. For example, a qualitative study of 13 teachers using BEST in Class, a coaching program designed to promote effective teaching practices that foster positive teacher–child interactions for students at risk for an emotional disability classification, revealed that half of the teachers used practices such as making rules, praise, and relationship building. Practices that required more intentional implementation and planning (e.g., opportunities to respond), and were therefore presumably more difficult to use, were applied less often (Washington-Nortey et al., 2023). Similarly, in their study of sustainment of MOSAIC practices, Kassab and colleagues (2023) found that teachers who received a year of intensive coaching were observed to have strong sustainment (less than a 20% decline) and tended to implement strategies that were feasible and easy to use (i.e., greetings, reinforcing expectations for behavior and inclusiveness) more frequently in the follow-up year than teachers who did not receive such coaching. This study also offers preliminary evidence that the strength of initial implementation strategies may affect the strength of sustainment. Finally, Bierman et al. (2013) found higher sustained use of the PATHS social–emotional learning (SEL) curriculum relative to the literacy curriculum, which the authors attributed to feedback from teachers that PATHS filled a particular void for them in teaching SEL.
Together, these studies offer evidence that when teachers perceive strategies as easier, needed, and/or more effective than others, it may reinforce teachers’ usage of such strategies over time.
There is also emerging literature on the impact of broader contextual characteristics on sustainment, such as system-level implementation support structures and community and school characteristics. In a large national sample of schools implementing schoolwide PBIS, McIntosh et al. (2016) found that state-level support systems appeared to facilitate sustained high-quality implementation (e.g., a statewide PBIS leadership team, state-level trainers, standardized trainings and curricula, recognition of schools with strong implementation, and use of national assessment and guidance tools). This study also found that elementary schools were more likely to sustain PBIS compared to middle and high schools, and that schools with higher numbers of students receiving free and reduced lunch (FRL) were less likely to sustain at high quality (McIntosh et al., 2016). Another study of a behavioral health curriculum found that sustainment was higher in school districts with strong administrator support and lower in districts serving higher proportions of ethnic minority students (Combs et al., 2023). Similar contextual factors have emerged across qualitative studies, pointing to multi-level influences akin to those found in studies of initial implementation. These include policies, mandates, funding, district-level leadership support and guidance, resource constraints, staffing stability, shared leadership between leaders and implementers, school-level training and PD, data and feedback, and efforts to establish buy-in (e.g., Leadbeater et al., 2015; Nadeem & Ringle, 2016; Scaletta & Tejero Hughes, 2021; Woodbridge et al., 2016). The emerging evidence that sustainment is more challenging for schools and districts with higher proportions of minoritized students and students receiving FRL warrants attention (e.g., Combs et al., 2023; McIntosh et al., 2016).
These findings raise the need to attend to equity issues in sustainment research and practice, especially on sustainment strategies conducted with school communities serving high numbers of minoritized students. For example, Nadeem et al. (2023) piloted a sustainment planning process for a trauma-focused EBP that included community organizations, school leaders, providers, funders, advocates, and others serving the same city school district. Such work highlights the potential need for increased and longer periods of implementation funding, innovative funding models, additional resources for dedicated program and leadership staffing, and ongoing expert consultation for school and district leaders.
Lastly, there appear to be pre-implementation factors and processes that have bearing not only on implementation but also on sustainment. Specifically, the MOSAIC study found that teachers with higher pre-intervention teaching quality had better sustainment than those with lower pre-intervention quality (Kassab et al., 2023). With respect to processes that support sustainment, a study of schoolwide PBIS found that schools that moved more quickly through early implementation steps were more likely to sustain than those that did not (McIntosh et al., 2016), suggesting that early-stage team collaboration processes have later implications for sustaining new practices. Across studies, baseline implementation quality has been found to predict sustainment, and sustained implementation in one program domain predicts sustained implementation in other program domains (e.g., Bierman et al., 2013; Kassab et al., 2023). Finally, intentional and ongoing sustainment planning processes involving key personnel and leadership from the beginning of implementation appear important for program continuation (e.g., Leadbeater et al., 2015). Future research can continue to build on the sustainment progress described above, ultimately working toward the development and testing of effective sustainment strategies.
Updated Research Agenda
Much progress has been made related to Owens et al.’s original research agenda, and now is an appropriate time to continue to push the field forward and chart a new course for the next decade. Although contemporary implementation research in schools tends to emphasize PD and coaching approaches (Baffsky et al., 2023) above the multitude of other implementation strategies (e.g., incentives, leadership supports; see Cook et al., 2019) and elevates fidelity above other implementation outcomes (Sanetti & Collier-Meek, 2019), we envision a future that builds on these discoveries and extends them into the broader domains of implementation science.
Owens et al. (2014) emphasized the importance of contextual factors unique to implementation of EBP in schools, including organizational and leadership factors, the diverse workforce of school-based professionals as implementers, and the complexities of academic calendars. Additional factors that can vary widely across implementation projects include, but are not limited to, implementation initiative funding (e.g., federal service or grant; research grant; foundation funding; local, state, or district funds), implementer characteristics, incentives or costs for individual implementers or school systems to take part, and school system expectations or mandates. These remain highly relevant to the updated research agenda described below and align with a growing emphasis in the broader field of implementation on clearly specifying and studying context in research studies and trial designs (Kemp et al., 2019; Pinnock et al., 2017). Further, we have observed in the literature that schools—which, at their core, are not health service delivery organizations—present unique contextual factors (such as those detailed by Owens et al., 2014, above) that tend to yield lower adoption rates for mental health EBP than are observed in implementation research conducted in healthcare settings such as specialty mental health clinics or hospitals (e.g., Connors et al., 2021; Nadeem et al., 2018). This may be especially true at more intensive levels of mental health service delivery (e.g., Tier 3), which are often seen as deviating from, or in addition to, the core educational mission of schools (Atkins et al., 2017). Given the unique contextual factors influencing implementation of mental health EBP in schools, research–practice partnerships in which school administrators, implementers, and recipients work together to carefully consider implementation expectations and operationalizations of success are especially critical.
A recent special issue of School Mental Health devoted to these partnerships underscores their importance (Lawson & Owens, 2024).
Below, we outline an updated agenda for implementation research in schools focused on the four priority domains mentioned earlier. First, we describe the value of developing and testing pragmatic (i.e., rigorous, relevant, efficient) implementation strategies that are usable for implementation researchers and practitioners alike. Second, we emphasize the importance of identifying and measuring the proximal variables that explain the effects of implementation strategies on outcomes (i.e., implementation mechanisms). Third, we present frameworks and methods that can drive research on usability and contextual appropriateness of our best EBP through systematic, proactive redesign. Fourth, we explore the knowledge and studies we need to understand the best ways to prune or deliberately de-implement low-value or ineffective school mental health practices. The paper concludes with a discussion of methods and study designs that can be leveraged to address different aspects of the agenda.
Pragmatic Implementation Strategies
Definition and Rationale
In recent years, significant progress has been made in the identification and categorization of implementation strategies specific to advancing implementation science and practice in schools (Cook et al., 2019; Lyon et al., 2019b). This lays the foundation for more advanced implementation research that tailors and tests implementation strategies and their impacts on implementation and student outcomes. However, testing the impact of implementation strategies on implementation outcomes has been relatively slow to advance across sectors (Proctor et al., 2023) and the impact of this work will be limited if the strategies developed and studied are not practical in real-world school settings. Therefore, similar to Owens et al.’s (2014) original recommendation that researchers should work to identify efficient, “good enough” approaches to implementation, future research should prioritize developing and testing pragmatic implementation strategies. Pragmatic approaches are those that are relevant and meaningful to practitioners and feasible and efficient to use in the local context (Glasgow, 2013). Unfortunately, there has been a tendency for implementation researchers across settings to develop complex and high-intensity implementation strategies in an effort to maximize theoretical effects (Beidas et al., 2022), which can render the strategies difficult to use in the real world outside of the context of a research study. Prioritizing and testing truly pragmatic implementation strategies will help school mental health researchers avoid the potential pitfall of “recreating the research-to-practice gap” (Beidas et al., 2022; Westerlund et al., 2019) by developing strategies that cannot be used in real-world school settings.
State of the Science
Although research on pragmatic implementation strategies in school settings is currently limited, there have been recent advances in methods to adapt and optimize implementation strategies to be more pragmatic. For example, the Cognitive Walkthrough for Implementation Strategies (CWIS; Lyon et al., 2021) is a method to evaluate complex implementation strategies for usability issues that might inhibit their use. The CWIS has been applied to adapt Leadership and Organizational Change for Implementation (LOCI)—a complex implementation strategy to support implementation leadership and climate, developed in mental health and substance use treatment settings (Aarons et al., 2015, 2017)—for elementary school principals (Collins et al., 2024). Applying processes like cognitive walkthroughs to design implementation strategies or adapt them for the school context is one example of a promising method to move the field toward implementation strategies that are usable, feasible, and pragmatic within the school context. Additional methods relevant for developing pragmatic implementation strategies can be found in the review of methods and designs later in this paper.
Deploying pragmatic implementation strategies in schools also requires identifying the minimal dose needed to effect change: a central question in intervention research (Glasgow et al., 2014). Aligned with this goal, methods to adapt implementation strategies to the local context (e.g., individual schools, classrooms, or implementers) can support strategy pragmatism by optimizing them to the necessary intensity (Kilbourne et al., 2024). For example, a recent study evaluating implementation strategies to support school professionals’ delivery of CBT found that the most effective strategy was an adaptive approach that began with a relatively low-intensity implementation strategy and augmented it with a higher-intensity strategy for slow responder schools (Smith et al., 2022). These results signal the promise of adaptive implementation strategies as an approach to provide the most appropriate level of implementation support for a given context, thus improving their efficiency.
Critical Research Questions
Question 1: What partner-engaged methods can be used most efficiently and effectively to develop or select implementation strategies that are pragmatic within the school context?
Question 2: Which pragmatic implementation strategies are effective to support the delivery of mental health EBP in schools? Under what circumstances?
Question 3: Can adaptive approaches improve the efficiency of school mental health implementation strategies? Can strategies be effectively adapted to the individual implementer, in addition to the individual school?
Implementation Mechanisms
Definition and Rationale
Identification and evaluation of the pathways through which implementation strategies operate is a promising avenue for pursuing strategy pragmatism, as well as investigating other important implementation research questions in school mental health. Implementation mechanisms are the processes or events through which implementation strategies operate to bring about positive implementation outcomes (Lewis et al., 2018). Specification of mechanisms can allow for investigation of null effects, replication of positive findings, identification of which strategy components are inert or unnecessary, and adaptation of strategies to fit new contexts (Lewis et al., 2021b).
Ideally, implementation mechanisms are grounded in theory, such as adult learning or behavior change theories. With a strong theoretical foundation, the study of implementation mechanisms can facilitate theory building and revision. For instance, the theory of planned behavior (TPB)—which states that behavioral intentions are driven primarily by attitudes, perceived social norms, and self-efficacy (Ajzen, 1991)—is the most commonly applied socio-cognitive theory for developing and testing implementation strategies (Godin et al., 2008; Lewis et al., 2020). The TPB closely relates to the original Owens et al. (2014) recommendation that implementation research in schools evaluate the extent to which implementer motivation and perceptions are malleable, and there are multiple instances of the TPB being applied to identify implementation determinants and design implementation strategies in school mental health (e.g., Hugh et al., 2022; Lyon et al., 2021; Zhang et al., 2023). When applying implementation strategies based on the TPB, measurement of attitudes, social norms, and self-efficacy can shed light on the validity of the theory or the functioning of the strategy (e.g., if mechanisms change but are not associated with behavior change, the theoretical link might not be supported). Understanding implementation mechanisms also can allow for tailoring and streamlining of strategies to maximize the impact of components on these proximal mediators. Strategy components found not to operate on their putative mechanisms could be redesigned to increase their impact or considered for removal from a multifaceted implementation package.
Importantly, even effective implementation strategies have the potential to create implementation disparities when impacts vary across schools, and it is often necessary to adapt strategies to ensure consistent effects on implementation outcomes. For instance, a coaching strategy might effectively increase the adoption and high-fidelity delivery of an evidence-based SEL program in a well-resourced school but be less effective at improving implementation in a less-resourced school in the same district. The ASPIRE framework (Gaias et al., 2022), mentioned earlier, provides a series of concrete steps for strategy adaptation to reduce the likelihood that an implementation strategy will exacerbate health disparities by promoting equitable implementation. However, as detailed in ASPIRE, an understanding of implementation mechanisms is critical to facilitate this process because it allows for rapid confirmation that a strategy (e.g., coaching with performance feedback) continues to influence the proximal outcomes identified in its theory of change (e.g., fluency building; Massar et al., 2023).
State of the Science
Although the study of mechanisms is closely aligned with the interests of some sponsors of school mental health research—such as the National Institute of Mental Health’s Experimental Therapeutics approach (Frank et al., 2022)—systematic reviews of implementation mechanisms in mental and behavioral health indicate that this type of inquiry is rare (Lewis et al., 2020; Williams, 2016). Mechanistic implementation research in school mental health is particularly rare but increasing, with most examples coming from protocol papers for active projects. For instance, Kwak et al. (2022) are conducting a study to evaluate the mechanisms through which a variety of implementation strategies (e.g., teaming, training, and PDSA cycles) will operate to support the implementation of a mental health prevention program for school personnel. Turner et al. (2022) are conducting an evaluation of strategies for implementing PBIS with educators in rural schools (e.g., teaming, systems support, training, ongoing technical assistance, feedback) that attends explicitly to the mechanisms through which strategies operate (organizational readiness, school team functioning, psychological safety).
As school mental health researchers increasingly develop and test implementation strategies, they should explicitly specify and report their strategies (Moore et al., 2021) and conceptualize and evaluate the mechanisms through which those strategies promote implementation outcomes. To achieve this, the field would benefit from more consistent operationalization of mechanisms. We argue that mechanisms are best defined as the intrapersonal or interpersonal variables that change when a strategy is applied and not the application of strategy components themselves. For instance, for a coaching strategy that involves feedback to school clinicians, the provision of feedback might be a component of strategy fidelity (i.e., delivery as intended) but would not be considered a mechanism through which the strategy is effective. Instead, processes such as awareness of discrepancies between current states of practice and goal states might be considered. Building from this perspective, we propose a series of high-priority research questions.
Critical Research Questions
Question 1: Through which putative mechanisms do the most common implementation strategies in schools (e.g., training, coaching) operate?
Question 2: Where are there gaps between identified implementation mechanisms and available measurement instruments to evaluate those mechanisms?
Question 3: When effective, do implementation strategies operate through the same mechanisms across diverse school contexts?
Strategic Intervention Redesign
Definition and Rationale
Across service sectors, evidence suggests that not all EBP are equal with regard to their implementation potential (i.e., implementability). For instance, a study by Ahuna et al. (2023) found that 75% of the variability in youth mental health therapists’ intentions to implement specific EBP was accounted for by characteristics of the intervention rather than individual and organizational characteristics. Similarly, another study found that 57% of the variability in program start-up for 27 EBP focused on child, family, and adult well-being implemented across 1287 sites could be attributed to the programs’ characteristics (Alley et al., 2023). In schools, Eisman et al. (2020c) reported that teacher ratings (n = 171) of the design and packaging quality of a mental health curriculum were related to the fidelity with which the curriculum was delivered, especially for teachers with less experience. These findings underscore how intervention-level determinants are critical to implementation success. Despite this, innovation characteristics are an underemphasized “forgotten domain” in contemporary implementation science, receiving considerably less attention than individual and organizational determinants (Lewis et al., 2021a).
Although relatively little implementation research focuses on the innovation level, strategic intervention redesign, which addresses intervention-level determinants by making adaptations (i.e., planned, proactive modifications; Stirman et al., 2019) to create a more implementable intervention, is an important, yet underutilized, implementation strategy (Lyon & Bruns, 2019b). In school mental health, the innovations being implemented may range from specific components of evidence-based interventions (e.g., measurement-based care; Connors et al., 2022a) to complex, multi-level service models (Moore et al., 2024) such as positive behavioral interventions and supports (Sugai & Horner, 2006). Regardless of their complexity, many of these interventions could benefit from prospective approaches that evaluate the extent to which they meet user needs and make changes to improve contextual fit, usability, and implementation potential (Lyon et al., 2020b). While many school mental health researchers do not attend explicitly to intervention design or redesign (Lyon & Bruns, 2019a), we have observed that purveyors of school-focused prevention and intervention programs that may not have a strong evidence base often attend carefully to the surface look and feel of their interventions, creating engaging and visually appealing materials. This can provide a competitive advantage for non-EBP that often drives building- and district-level adoption decisions.
State of the Science
While “there is no implementation without adaptation” (Lyon & Bruns, 2019b), adaptations of EBP often occur reactively (vs. prospectively), which may affect the extent to which they are fidelity-consistent (Pettersson et al., 2024; Stirman et al., 2019). Many intervention adaptation or redesign models and approaches have emerged in implementation science, such as the Model for Adaptation Design and Impact (MADI; Kirk et al., 2020), the Adaptive Designs Accelerating Promising Trials Into Treatments (ADAPT-IT; Meurer et al., 2012), Making Optimal Decisions for Intervention Flexibility during Implementation (MODIFI; Brewer et al., 2024), and the Discover, Design/Build, Test (DDBT) framework (Lyon et al., 2019c). Other models and approaches have been developed to support professionals in deciding whether to adapt an intervention and in evaluating an adaptation process (Miller et al., 2020). The Framework for Reporting Adaptations and Modifications to Evidence-based interventions (FRAME) is the most prominent example of a descriptive approach (Stirman et al., 2019). Given the recent emergence of most of these models, we look forward to research evaluating their utility. One framework that has been applied repeatedly in school mental health to increase implementability and fit (e.g., Bearss et al., 2022; Liu et al., 2022a; Pfiffner et al., 2022) is the DDBT framework (Lyon et al., 2019c). Evaluation of the impact of this framework is underway in multiple school- and community-based projects supported by the NIMH-funded University of Washington ALACRITY Center (Munson et al., 2022).
Some of the most compelling approaches to redesign interventions draw heavily from the field of human-centered design (HCD), which uses information about the settings and individuals who will ultimately use a product to inform its (re)development (Norman & Draper, 1986). HCD and implementation science are increasingly being integrated in health services research (Chen et al., 2021; Dopp et al., 2019a; Lyon et al., 2023). Although it is common to associate HCD methods with digital innovations, application of those methods need not be limited to that medium. In the context of youth mental health services, Lyon et al. (2020b) detailed how HCD processes also can be brought to bear on other health services research products, including psychosocial EBP and implementation strategies in service delivery settings such as schools. Specifically, leveraging HCD can allow for implementation-focused intervention redesign to ensure EBP fit with schools’ and districts’ existing priorities, initiatives, and constraints. Due to the methods through which it elevates user voices (Rose, 2016), HCD also may create opportunities to help ensure that adaptations are culturally appropriate and equitable, although it has not historically been used in this way and cannot be assumed to achieve equity outcomes without careful reflection and evaluation (Stiles-Shields et al., 2022). Nonetheless, HCD models for psychosocial intervention redesign have great potential to improve contextual appropriateness through deep engagement with end users. In light of the growing focus on strategic intervention redesign in implementation science in general—and in school-based implementation research, in particular—we argue that future research in the school mental health space should increasingly explore the impact of prospective adaptations to improve fit. Below, we articulate specific research questions for considering how intervention design and redesign can advance implementation in school mental health.
Critical Research Questions
Question 1: To what degree do EBP in school mental health vary with regard to their usability and implementability? Are there common design problems that consistently inhibit large-scale adoption?
Question 2: What redesigns of EBP are needed to improve usability, adoption, and impact?
Question 3: Can HCD-driven approaches to redesign improve equitable implementation and outcomes?
De-Implementation and Pruning
Definition and Rationale
De-implementation is typically thought of as a systematic process focused on stopping the use of low-value, ineffective, or harmful practices (e.g., Johnson et al., 2018; Norton & Chambers, 2020). McKay et al. (2018) outlined three primary cases in which interventions should be de-implemented: (a) the intervention lacks evidence or is harmful, (b) alternative interventions are more effective or efficient, or (c) the issue of concern is resolved. Similar factors may apply in school mental health. In addition, the reality is that implementation is often a zero-sum game. Schools need to strategically make room for new initiatives rather than continually adding them; otherwise, they risk diluting efforts and overwhelming staff. This process of de-implementation (a.k.a. pruning; Cook et al., 2019) differs from a failure to sustain in that it is purposeful, with leaders weighing the value of existing practices, current priorities, staffing and financial capacity, and the expertise of existing personnel and teams.
State of the Science
In health care more broadly, there has been growing interest in de-implementation, but this work is still in its infancy, focused on identifying existing frameworks (Nilsen et al., 2020), characterizing and developing de-implementation strategies (Ingvarsson et al., 2022; McKay et al., 2018), and defining outcomes (Prusaczyk et al., 2020). Within the context of school mental health, research to date focuses primarily on describing the processes through which schools make de-implementation decisions. The theoretical concept of “escalation of commitment” describes a decision-making bias in which leaders continue to use low-value or ineffective programs. In such cases, administrators may continue implementation due to a range of external and internal pressures, and may attribute poor program performance not to the program itself but to implementation issues, leadership, or limitations of the performance indicators (Barrett et al., 2023). In other cases, districts may choose to cut programs that had previously been well supported due to limited resources. A case study of the de-implementation of a school-based trauma program over a two-year period highlighted the role of severe budget and staffing cuts and a district-level decision to refocus staff on providing mandated services (Nadeem & Ringle, 2016). The role of racial equity and culture in de-implementation decisions is understudied (Clinkscales et al., 2024). However, there is some evidence that schools do consider equity issues as they make de-implementation decisions. For instance, the increased uptake of restorative practices and culturally responsive, equity-driven social–emotional learning and classroom practices is often coupled with the de-implementation of exclusionary discipline practices that have disproportionate negative impact on minoritized students (e.g., Farr et al., 2020; Gregory et al., 2021).
As the science of de-implementation in school mental health evolves, it will be critical to examine how well emergent theories, strategies, and outcomes applied in health care translate to schools; identify unique factors that are inherent to the educational context; and design and test strategies that can support targeted de-implementation processes. Such work will help the field develop theory and practice about what should drive de-implementation decisions and how to apply decision-making frameworks (e.g., how to assess evidence, fit with the context, and implementation and resource burden). To further understand de-implementation, mixed methods and rapid qualitative approaches (see Designs and Methods, below) may be useful to observe and characterize what decision-making looks like in routine practice (e.g., level at which decisions are made, who decides, perceptions about the intervention or program, what outcomes matter to decision makers, role of costs), and what de-implementation strategies are already being used.
Critical Research Questions
Question 1: How are schools and districts making decisions to de-implement existing programs and initiatives (e.g., factors considered, characteristics of programs de-implemented, decision-making processes, key decision makers)?
Question 2: What role do resources and costs play in de-implementation decisions? Specifically, what is the cost–benefit of keeping a current practice versus de-implementing it (i.e., costs of existing practices, opportunity costs, ongoing maintenance costs, costs of de-implementation)?
Question 3: What models or strategies might enhance decision-making and processes related to strategic de-implementation (e.g., measurement of existing practices before implementation and de-implementation)?
Designs and Methods to Support the Research Agenda
Over the past decade, a range of study designs and methods have been advanced in the broader implementation science field that can be leveraged to improve school mental health intervention adoption, implementation, and sustainment and to address the priority questions in each domain. We review here several of these methods—hybrid trial designs; sequential, multiple assignment, randomized trials (SMARTs); rapid qualitative approaches; and economic analyses—to provide additional direction for investigating some of the research questions articulated above (see Table 1). All of these designs can, and should, be applied in the context of strong research–practice partnerships to ensure relevance and representation of community voice.
Table 1.
Examples of Study Design or Method to Advance Implementation Science in SMH Research
| Study design or method | Example research question | Example application |
|---|---|---|
| Hybrid Effectiveness–Implementation Trial (Design) | Implementation Mechanisms, Question 3: When effective, do implementation strategies operate through the same mechanisms across diverse school contexts? | Hybrid Type 3 multi-school site trial to primarily assess implementation strategy mechanisms across buildings and contexts and secondarily assess effectiveness of EBP on student outcomes |
| SMART Trial (Design) | Pragmatic Implementation Strategies, Question 3: Can adaptive approaches improve the efficiency of school mental health implementation strategies? Can strategies be effectively adapted to the individual implementer, in addition to the individual school? | Study designed to develop a multi-level adaptive implementation strategy (MAISY) and empirically evaluate the relative implementation outcomes associated with different combinations of strategies based on timing or implementer response to initial strategies provided |
| Rapid Qualitative Approaches | Strategic Redesign, Question 2: What redesigns of EBP are needed to improve usability, adoption, and impact? | Exploratory study using a rapid ethnographic toolkit from www.impscimethods.org (a website fueled by the work of research centers funded by the National Institutes of Health) to understand the reasons why an intervention was adapted when implemented and inform potential redesign efforts |
| Cost/Economic Analysis | De-Implementation, Question 2: What role do resources and costs play in de-implementation decisions? Specifically, what is the cost–benefit of keeping a current practice versus de-implementing it? | Formative study applying the Cost of Implementing New Strategies (COINS) method for documenting costs associated with implementation strategies (Saldana et al., 2014) and recent resources provided by IES (https://ies.ed.gov/blogs/research/post/have-a-cost-analysis-to-plan-or-execute-we-have-a-module-for-that) prior to initiating a de-implementation effort |
Hybrid Trials
Hybrid trial designs, briefly mentioned and recommended by Owens et al. (2014), simultaneously evaluate intervention effectiveness and implementation within the same study to shorten the translational research pipeline (Curran et al., 2022). A continuum of hybrid Type 1, 2, and 3 designs is available to suit the appropriate balance of emphasis on effectiveness (e.g., student outcomes) versus implementation outcomes (e.g., adoption, fidelity, feasibility). On one end of the continuum, Type 1 trials focus primarily on testing the effectiveness of an intervention, while also gathering information about implementation outcomes. On the other end of the continuum, Type 3 trials focus mainly on testing implementation outcomes associated with an implementation strategy or strategies, with a secondary goal of collecting effectiveness outcomes associated with the intervention. Between these, Type 2 trials have a dual focus: testing interventions and implementation strategies simultaneously (Curran et al., 2022). Increasingly, hybrid Type 1 trials are not considered “true” implementation research studies since implementation strategies and outcomes are observed, not empirically evaluated. Consistent with the National Institutes of Health’s Science of Implementation in Health and Healthcare (SIHH) study section and the journal Implementation Science, which do not review Type 1 trials (NIH, 2024; Wensing et al., 2021), we focus our discussion here on Type 2 and 3 hybrid trial designs as they have the most explicit implementation focus.
Hybrid trials have largely become the industry standard in implementation science, but are still underutilized in school-based research. Aside from a few notable examples of recently completed school-based Type 2 trials in the domains of computer-assisted interventions for autism (Pellecchia et al., 2020) and cognitive behavioral therapy (Eiraldi et al., 2024), nearly all hybrid trial examples we identified are study protocols for ongoing work. These include Type 2 trials testing a positive school climate intervention with an implementation facilitation strategy (Eisman et al., 2020a); testing a scalable implementation model for an evidence-based early childhood mental health intervention in Ugandan schools (Huang et al., 2022); testing a pragmatic, motivational enhancement implementation strategy (i.e., BASIS) for Trauma-Focused Cognitive Behavioral Therapy delivered in schools (Lyon et al., 2021); and testing a team science intervention alongside tailored implementation strategies (Kuriyan et al., 2021). Relevant Type 3 study protocols in school mental health include effectiveness–implementation trials of a co-designed, multi-component implementation strategy to support implementation of the PAX Good Behavior Game (Baffsky et al., 2022); implementation of a proactive classroom management strategy (Positive Greetings at the Door) via a teacher-focused version of BASIS (Lyon et al., 2024); and a comparison of implementation strategies for PBIS in rural settings (Turner et al., 2022).
SMART Designs
Another study design emerging in the field of implementation science in school mental health is the sequential, multiple assignment, randomized trial (SMART). These experimental designs can be applied—among other uses—to empirically develop an adaptive intervention using multiple stages of randomization to learn how the intervention can be optimized or tailored for service recipients (Advancing the science of adaptive interventions, 2024; Lei et al., 2012). SMARTs have been identified as an ideal fit for the future of school mental health intervention research, especially within a multi-tiered system of support (August et al., 2018). Application of SMART designs to school mental health implementation research is still in its infancy, but a notable example is the Adaptive School-based Implementation of Cognitive Behavioral Therapy (ASIC) study, mentioned above in the section on pragmatic strategies (Kilbourne et al., 2018). In that SMART, augmentation decisions were based on the response status of schools after eight weeks, and results indicated that an adaptive strategy that included training/TA with facilitation resulted in the highest average CBT delivery (Smith et al., 2022).
A variation of the SMART design that school mental health implementation scientists may want to follow is the multi-level implementation SMART, which can be used to construct an optimized multi-level adaptive implementation strategy (MAISY). MAISYs involve randomizations at multiple levels to learn how to sequence and adapt implementation strategies at the system (e.g., school district), unit (e.g., school), and implementer (e.g., teacher, school practitioner) levels (Solving implementation challenges, 2024). Although MAISYs are just emerging in outlets such as conference proceedings (e.g., Connors et al., 2022b), a wide range of applied implementation research questions about which implementation strategies are most effective, when, and for whom (based on student, school implementer, and/or school setting characteristics) can be pursued by applying these innovative SMART designs in school mental health.
Rapid Qualitative Approaches
Qualitative research and mixed methods are central to implementation science (Hamilton & Finley, 2020; Palinkas, 2014). Yet, the time lag and resources required to conduct traditional qualitative research can run counter to the goal of implementation research to close the evidence-to-practice gap. Rapid qualitative analyses vary in technique, but typically involve a team of researchers conducting iterative data collection and analysis and/or rapid synthesis of results to increase efficiency (Vindrola-Padros & Johnson, 2020). Importantly, rapid qualitative approaches reduce time and cost without compromising rigor or value. Comparisons of rapid to traditional qualitative approaches have found a range of modest to substantial time saved in the interpretation and analysis phases (due to variation by research teams), large savings in transcription costs, and very similar amounts of data collected, results, and recommendations (Gale et al., 2019; Nevedal et al., 2021; Taylor et al., 2018). We struggled to find examples of rapid qualitative approaches applied to school mental health implementation research; however, within the broader implementation science literature there are numerous examples of success using these methods in resource-constrained settings (c.f., Moran et al., 2023; Sperber et al., 2019).
Economic/Cost Evaluation
Economic evaluations include cost-minimization, cost-effectiveness, cost-utility, and cost–benefit analyses (Kim et al., 2022). Economic evaluations have been increasingly emphasized in healthcare implementation research (Eisman et al., 2020b) and are likely to have great value in school implementation research given the resource constraints and intervention overload in most school systems. To our knowledge, cost analyses have been conducted in academic intervention research (Barrett et al., 2020) as well as school mental health research (Abou Jaoude et al., 2024; Bradshaw et al., 2020; Malik et al., 2021), but this is still an emerging method in education research, including in school mental health implementation science. The Institute of Education Sciences (IES), a primary funder of school mental health research, recently began requiring cost analyses in nearly all of its grant mechanisms as a component of its Standards for Excellence in Education Research (SEER) principles (IES, 2022). Researchers interested in learning more about these methods may be interested in published guidance for implementation scientists’ use of mixed methods findings to inform economic evaluations (Dopp et al., 2019b).
Conclusion
In this paper, we sought to reflect on the decade that has passed since Owens and colleagues (2014) articulated an agenda for implementation science in school mental health. Although some significant progress has been made, much work remains as the field continues its shift from the identification of implementation determinants to developing, testing, and understanding the mechanisms of implementation strategies, including those that are pragmatic and adaptive. As with the original research agenda, full consensus surrounding a blueprint for future implementation research in school mental health is neither the goal nor the expectation of this paper. Nevertheless, it is our hope that some shared understanding of evolving research priorities and novel designs/methods can continue to drive complementarity and synergy across projects.
Throughout this paper, we have attempted to underscore opportunities to advance equitable implementation in school mental health. While implementation of EBP in schools has the potential to support equity, all implementation processes carry risks of maintaining or inadvertently exacerbating inequities (Liu et al., 2019), which necessitates attention. As with many other fields, equity-explicit and anti-racist implementation science has rapidly accelerated over the past five years (Baumann et al., 2023; Shelton et al., 2021). This includes the development of multiple equity-oriented implementation frameworks (e.g., Eslava-Schmalbach et al., 2019; Woodward et al., 2019), processes to surface and address equity concerns in implementation strategies (Gaias et al., 2022), and guidance about how to incorporate equity into the evaluation of context and measurement of implementation outcomes (Brownson et al., 2021). Relatively few but outstanding examples of this work can be found within the school mental health implementation literature. For instance, Renshaw and Phan (2023) recently proposed an expansion of implementation reporting recommendations for school-based mindfulness interventions to center equity (e.g., through explicit attention to principles that research teams might employ to center diversity, equity, and inclusion during intervention adaptation). The previously mentioned ASPIRE framework (Gaias et al., 2022) for school-based implementation strategy adaptation serves as another great example. As school mental health implementation research continues to evolve over the next decade, we hope to see expanded emphasis on ways to promote equitable implementation.
The field of school mental health sits squarely at the intersection of two worlds: health and education. The pace with which implementation science in health has advanced continues to far exceed that in education, a fact evidenced by the number of healthcare-related citations in the current paper, especially in more recently emerging areas of inquiry (e.g., strategic intervention redesign, implementation mechanisms). Nevertheless, it is important to acknowledge that there is strong interest in implementation science within the larger education research community (Gaias et al., 2023) and that examples in education outside of the mental health domain also are rapidly expanding (e.g., Komesidu & Hogan, 2023; Moir, 2018; Olswang & Prelock, 2015). Training programs such as the IES-funded Research Institute for Implementation Science in Education (RIISE) (https://ies.ed.gov/funding/grantsearch/details.asp?ID=4646) have emerged to match this interest and support educational scholars in developing rigorous implementation research portfolios. As these efforts continue to advance, it will be important that implementation research in school mental health—and education more generally—does not simply adopt the health-oriented conceptualization of implementation wholesale and uncritically. Instead, we envision continued development of an education-specific science of implementation that recognizes the unique constraints and opportunities in schools and leverages the tremendous wealth of information that has been generated in health care, while avoiding some of the potential pitfalls (Beidas et al., 2022). Overall, we are highly optimistic about the future of implementation science in school mental health and wish both budding and established researchers in this space, “fair winds and smooth sailing.”
Acknowledgements
We would like to thank Aislyn Gordon and Sofia Redondo for their reference management and manuscript formatting support.
Funding
While writing this article, Dr. Lawson’s research was funded by the National Institute of Mental Health under Grant K23MH122577; Dr. Owens was funded by the Institute of Education Sciences under Grants (R324A190154; R305A210224; R305A200423); and Dr. Lyon was funded by the Institute of Education Sciences (R305A200023; R305B210004; R305A220481; R305A230391) and the National Institute of Mental Health (P50MH115837; R01MH119148; R34MH128386).
Footnotes
Declarations
Competing interests Dr. Owens is an Associate Editor of School Mental Health. All decisions on this paper were made by another editor. The authors declare that they have no competing interests.
References
- Aarons GA, Ehrhart MG, Farahnak LR, & Hurlburt MS (2015). Leadership and organizational change for implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10, 11. 10.1186/s13012-014-0192-y
- Aarons GA, Ehrhart MG, Moullin JC, Torres EM, & Green AE (2017). Testing the leadership and organizational change for implementation (LOCI) intervention in substance abuse treatment: A cluster randomized trial study protocol. Implementation Science, 12, 29. 10.1186/s13012-017-0562-3
- Abou Jaoude GJ, Leiva-Granados R, Mcgranahan R, Callaghan P, Haghparast-Bidgoli H, Basson L, Ebersöhn L, Gu Q, & Skordis J (2024). Universal primary school interventions to improve child social-emotional and mental health outcomes: A systematic review of economic evaluations. School Mental Health, 16, 291–313. 10.1007/s12310-024-09642-0
- Admiraal W, Schenke W, De Jong L, Emmelot Y, & Sligte H (2019). Schools as professional learning communities: What can schools do to support professional development of their teachers? Professional Development in Education, 47(4), 684–698. 10.1080/19415257.2019.1665573
- Adsul P, Shelton RC, Oh A, Moise N, Iwelunmor J, & Griffith DM (2024). Challenges and opportunities for paving the road to global health equity through implementation science. Annual Review of Public Health. 10.1146/annurev-publhealth-060922-034822
- Advancing the science of adaptive interventions. (2024). D3center; Institute for Social Research, University of Michigan. https://d3c.isr.umich.edu/advancing-the-science-of-adaptive-interventions/#smart
- Ahuna JK, Becker KD, & Chorpita BF (2023). Predicting therapists’ intention to use innovations: Comparing the role of individual, organizational, and innovation characteristics. Administration and Policy in Mental Health and Mental Health Services Research, 50, 946–965. 10.1007/s10488-023-01295-6
- Ajzen I (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. 10.1016/0749-5978(91)90020-T
- Allen JP, Hafen CA, Gregory AC, Mikami AY, & Pianta R (2015). Enhancing secondary school instruction and student achievement: Replication and extension of the My Teaching Partner-Secondary intervention. Journal of Research on Educational Effectiveness, 8(4), 475–489. 10.1080/19345747.2015.1017680 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Alley ZM, Chapman JE, Schaper H, & Saldana L (2023). The relative value of pre-implementation stages for successful implementation of evidence-informed programs. Implementation Science, 18, 30. 10.1186/s13012-023-01285-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Amaya AJ, & Amundson L (2024). The changing role of the district PBIS coordinator throughout the stages of implementation. Journal of Positive Behavior Interventions, 26(3), 157–167. 10.1177/10983007231215531 [DOI] [Google Scholar]
- Anderson M, Werner-Seidler A, King C, Gayed A, Harvey SB, & O’Dea B (2019). Mental health training programs for secondary school teachers: A systematic review. School Mental Health, 11, 489–508. 10.1007/s12310-018-9291-2 [DOI] [Google Scholar]
- Arnold KT, Pollack Porter KM, Frattaroli S, Durham RE, Clary LK, & Mendelson T (2021). Multilevel barriers and facilitators to sustainability of a universal trauma-informed school-based mental health intervention following an efficacy trial: A qualitative study. School Mental Health, 13, 174–185. 10.1007/s12310-020-09402-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Atkins MS, Cappella E, Shernoff EE, Mehta TG, & Gustafson EL (2017). Schooling and children’s mental health: Realigning resources to reduce disparities and advance public health. Annual Review of Clinical Psychology, 13, 123–147. 10.1146/annurev-clinpsy-032816-045234 [DOI] [PubMed] [Google Scholar]
- August GJ, Piehler TF, & Miller FG (2018). Getting “SMART” about implementing multi-tiered systems of support to promote school mental health. Journal of School Psychology, 66, 85–96. 10.1016/j.jsp.2017.10.001 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Baffsky R, Ivers R, Cullen P, Batterham PJ, Toumbourou J, Calear AL, Werner-Seidler A, McGillivray L, & Torok M (2022). A cluster randomised effectiveness-implementation trial of an intervention to increase the adoption of PAX Good Behaviour Game, a mental health prevention program, in Australian primary schools: Study protocol. Contemporary Clinical Trials Communications. 10.1016/j.conctc.2022.100923 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Baffsky R, Ivers R, Cullen P, Wang J, McGillivray L, & Torok M (2023). Strategies for enhancing the implementation of universal mental health prevention programs in schools: A systematic review. Prevention Science, 24(2), 337–352. 10.1007/s11121-022-01434-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Baker EA, Brewer SK, Owens JS, Cook CR, & Lyon AR (2021). Dissemination science in school mental health: A framework for future research. School Mental Health, 13, 791–807. 10.1007/s12310-021-09446-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Barclay CM, Castillo J, & Kincaid D (2022). Benchmarks of equality? School-wide positive behavioral interventions and supports and the discipline gap. Journal of Positive Behavior Interventions, 24(1), 4–16. 10.1177/10983007211040097 [DOI] [Google Scholar]
- Barrett CA, Pas ET, & Johnson SL (2020). A cost analysis of the innovation-decision process of an evidence-based practice in schools. School Mental Health, 12, 638–649. 10.1007/s12310-020-09372-z [DOI] [Google Scholar]
- Barrett CA, Sleesman DJ, Spear SE, Clinkscales A, & Amin T (2023). Sticking with programs that do not work: The role of escalation of commitment in schools. Prevention Science, 24, 567–576. 10.1007/s11121-023-01510-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Battal J, Pearrow MM, & Kaye AJ (2020). Implementing a comprehensive behavioral health model for social, emotional, and behavioral development in an urban district: An applied study. Psychology in the Schools, 57, 1475–1491. 10.1002/pits.22420 [DOI] [Google Scholar]
- Baumann AA, Shelton RC, Kumanyika S, & Haire-Joshu D (2023). Advancing healthcare equity through dissemination and implementation science. Health Services Research, 58(53), 327–344. 10.1111/1475-6773.14175 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bearss K, Tagavi D, Lyon AR, & Locke J (2022). Interactive redesign of a caregiver-mediated intervention for use in educational settings. Autism, 26(3), 666–677. 10.1177/13623613211066644 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, Becker-Haimes EM, Adams DR, Skriner L, Stewart RE, Wolk CB, Buttenheim AM, Williams NJ, Inacker P, Richey E, & Marcus SC (2017). Feasibility and acceptability of two incentive-based implementation strategies for mental health therapists implementing cognitive-behavioral therapy: A pilot study to inform a randomized controlled trial. Implementation Science. 10.1186/s13012-017-0684-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, Dorsey S, Lewis CC, Lyon AR, Powell BJ, Purtle J, Saldana L, Shelton RC, Stirman SW, & Lane-Fall MB (2022). Promises and pitfalls in implementation science from the perspective of US-based researchers: Learning from a pre-mortem. Implementation Science, 17, 55. 10.1186/s13012-022-01226-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bergan JR (1977). Behavioral consultation psychology series. Merrill Pub Co. [Google Scholar]
- Bierman KL, Sanford DeRousie RM, Heinrichs B, Domitrovich CE, Greenberg MT, & Gill S (2013). Sustaining high-quality teaching and evidence-based curricula: Follow-up assessment of teachers in the REDI project. Early Education & Development, 24, 1194–1213. 10.1080/10409289.2013.755457 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bohnenkamp JH, Patel C, Connors E, Orenstein S, Ereshefsky S, Lever N, & Hoover S (2023). Evaluating strategies to promote effective, multidisciplinary team collaboration in school mental health. Journal of Applied School Psychology, 39(2), 130–150. 10.1080/15377903.2022.2077875 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bradshaw CP, Debnam KJ, Player D, Bowden B, & Johnson SL (2020). A mixed-methods approach for embedding cost analysis within fidelity assessment in school-based programs. Behavioral Disorders, 48(3), 174–186. 10.1177/0198742920944850 [DOI] [Google Scholar]
- Bradshaw CP, Pas ET, Bottiani JH, Debnam KJ, Reinke WM, Herman KC, & Rosenberg MS (2018). Promoting cultural responsivity and student engagement through double check coaching of classroom teachers: An efficacy study. School Psychology Review, 47(2), 118–134. 10.17105/spr-2017-0119.v47-2 [DOI] [Google Scholar]
- Brewer SK, Corbin CM, Baumann AA, Stirman SW, Jones JM, Pullmann MD, & Lyon AR (2024). Development of a method for making optimal decisions for intervention flexibility during implementation (MODIFI): A modified Delphi study. Implementation Science Communications, 5, 64. 10.1186/s43058-024-00592-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Brookman-Frazee L, Zhan C, Stadnick N, Sommerfeld D, Roesch S, Aarons GA, Innes-Gomberg D, Bando L, & Lau AS (2018). Using survival analysis to understand patterns of sustainment within a system-driven implementation of multiple evidence-based practices for children’s mental health services. Frontiers Public Health. 10.3389/fpubh.2018.00054 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Brownson RC, Kumanyika SK, Kreuter MW, & Haire-Joshu D (2021). Implementation science should give higher priority to health equity. Implementation Science, 16, 28. 10.1186/s13012-021-01097-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bussières E, Malboeuef-Hurtubise C, Meilleur A, Mastine T, Hérault E, Chadi N, Montreuli M, Généreux M, Camden C, PRISME-COVID Team. (2021). Consequences of the COVID-19 Pandemic on children’s mental health: A meta-analysis. Frontiers in Psychiatry. 10.3389/fpsyt.2021.691659 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cappella E, Jackson DR, Kim HY, Bilal C, Holland S, & Atkins MS (2015). Implementation of teacher consultation and coaching in urban schools: A mixed method study. School Mental Health, 8, 222–237. 10.1007/s12310-015-9165-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chen E, Neta G, & Roberts MC (2021). Complementary approaches to problem solving in healthcare and public health: Implementation science and human-centered design. Translational Behavioral Medicine, 11(5), 1115–1121. 10.1093/tbm/ibaa079 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cipriano C, Naples LH, Eveleigh A, Cook A, Funaro M, Cassidy C, McCarthy MF, & Rappolt-Schlichtmann G (2022). A systematic review of student disability and race representation in universal school-based social and emotional learning interventions for elementary school students. Review of Educational Research, 93(1), 73–102. 10.3102/00346543221094079 [DOI] [Google Scholar]
- Clinkscales A, Barrett CA, & Endres B (2024). How does culture fit into de-implementation? A scoping review of empirical research. Psychology in the Schools, 61(9), 3589–3611. 10.1002/pits.23244 [DOI] [Google Scholar]
- Collins VK, Corbin CM, Locke JJ, Cook CR, Ehrhart MG, Hatch KD, & Lyon AR (2024). Centering school leaders’ expertise: Usability evaluation of a leadership-focused implementation strategy to support tier 1 programs in schools. School Mental Health. 10.1007/s12310-024-09635-z [DOI] [Google Scholar]
- Combs KM, Drewelow KM, Lain MA, Habesland M, Ippolito A, & Finigan-Carr N (2023). Sustainment of an evidence-based, behavioral health curriculum in schools. Prevention Science, 24, 541–551. 10.1007/s11121-022-01454-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Compen B, & Schelfhout W (2020). The role of external and internal team coaches in teacher design teams. A mixed methods study. Education Sciences, 10(10), 263. 10.3390/educsci10100263 [DOI] [Google Scholar]
- Connors EH, Lyon AR, Garcia K, Sichel CE, Hoover S, Weist MD, & Tebes JK (2022a). Implementation strategies to promote measurement-based care in schools: Evidence from mental health experts across the USA. Implementation Science Communications, 3, 67. 10.1186/s43058-022-00319-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Connors EH, Martin JK, Aarons GA, Barwick M, Bunger AC, Bustos TE, Comtois KA, Crane ME, Frank HE, Frank TL, Graham AK, Johnson C, Larson MF, Kim B, McHugh SM, Merle JL, Mettert K, Patel SR, Swindle T, & Powell BJ (2022b). Proceedings of the sixth conference of the society for implementation research collaboration (SIRC) 2022: From implementation foundations to new frontiers. Implementation Research and Practice. 10.1177/26334895231173514 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Connors EH, Moffa K, Carter T, Crocker J, Bohnenkamp JH, Lever NA, & Hoover SA (2022c). Advancing mental health screening in schools: Innovative, field-tested practices and observed trends during a 15-month learning collaborative. Psychology in the Schools, 59(6), 1135–1157. 10.1002/pits.22670 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Connors EH, Prout J, Vivrette R, Padden J, & Lever N (2021). Trauma-focused cognitive behavioral therapy in 13 urban public schools: Mixed methods results of barriers, facilitators, and implementation outcomes. School Mental Health, 13, 772–790. 10.1007/s12310-021-09445-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Connors EH, Smith-Millman M, Bohnenkamp JH, Carter T, Lever N, & Hoover S (2020). Can we move the needle on school mental health quality through systematic quality improvement collaboratives? School Mental Health, 12(3), 478–492. 10.1007/s12310-020-09374-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cook CR, Lyon AR, Kubergovic D, Browning Wright D, & Zhang Y (2015). A supportive beliefs intervention to facilitate the implementation of evidence-based practices within a multitiered system of supports. School Mental Health, 7, 49–60. 10.1007/s12310-014-9139-3 [DOI] [Google Scholar]
- Cook CR, Lyon AR, Locke J, Waltz T, & Powell BJ (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prevention Science, 20, 914–935. 10.1007/s11121-019-01017-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cooper BR, Bumbarger BK, & Moore JE (2015). Sustaining evidence-based prevention programs: Correlates in a large-scale dissemination initiative. Prevention Science, 16, 145–157. 10.1007/s11121-013-0427-1 [DOI] [PubMed] [Google Scholar]
- Curran GM, Landes SJ, McBain SA, Payne JM, Smith JD, Fernandez ME, Chambers DA, & Mittman BS (2022). Reflections on 10 years of effectiveness-implementation hybrid studies. Frontiers in Health Services. 10.3389/frhs.2022.1053496 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dopp AR, Mundey P, Beasley LO, Silovsky JF, & Eisenberg D (2019a). Mixed-method approaches to strengthen economic evaluations in implementation research. Implementation Science, 14, 2. 10.1186/s13012-018-0850-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dopp AR, Parisi KE, Munson SA, & Lyon AR (2019b). A glossary of user-centered design strategies for implementation experts. Translational Behavioral Medicine, 9(6), 1057–1064. 10.1093/tbm/iby119 [DOI] [PubMed] [Google Scholar]
- Duong MT, Bruns EJ, Lee K, Cox S, Coifman J, Mayworm A, & Lyon AR (2020). Rates of mental health service utilization by children and adolescents in schools and other common service settings: A systematic review and meta-analysis. Administration and Policy in Mental Health and Mental Health Services Research, 48, 420–439. 10.1007/s10488-020-01080-9 [DOI] [PubMed] [Google Scholar]
- DuPaul GJ, Evans SW, Cleminshaw-Mahan CL, & Fu Q (2024). School-based intervention for adolescents with ADHD: Predictors of effects on academic, behavioral, and social functioning. Behavior Therapy, 55(4), 680–697. 10.1016/j.beth.2024.01.010 [DOI] [PubMed] [Google Scholar]
- Eccles MP, & Mittman BS (2006). Welcome to implementation science. Implementation Science. 10.1186/1748-5908-1-1 [DOI] [Google Scholar]
- Eiraldi R, Lawson GM, Glick HA, Khanna MS, Beidas R, Fishman J, Rabenau-McDonnell Q, Wilson T, Comly R, Schwartz BS, & Jawad AF (2024). Implementation fidelity, student outcomes, and cost-effectiveness of train-the-trainer strategies for Masters-level therapists in urban schools: Results from a cluster randomized trial. Implementation Science, 19, 4. 10.1186/s13012-023-01333-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Eisman AB, Heinze J, Kilbourne AM, Franzen S, Melde C, & McGarrell E (2020a). Comprehensive approaches to addressing mental health needs and enhancing school security: A hybrid type II cluster randomized trial. Health & Justice. 10.1186/s40352-020-0104-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Eisman AB, Kilbourne AM, Dopp AR, Saldana L, & Eisenberg D (2020b). Economic evaluation in implementation science: Making the business case for implementation strategies. Psychiatry Research. 10.1016/j.psychres.2019.06.008 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Eisman AB, Kilbourne AM, Greene D, Walton M, & Cunningham R (2020c). The user-program interaction: How teacher experience shapes the relationship between intervention packaging and fidelity to a state-adopted health curriculum. Prevention Science, 21, 820–829. 10.1007/s11121-020-01120-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ennis RP, Royer DJ, Lane KL, & Dunlap KD (2019). The impact of coaching on teacher-delivered behavior-specific praise in Pre-K–12 settings: A systematic review. Behavioral Disorders, 45(3), 148–166. 10.1177/0198742919839221 [DOI] [Google Scholar]
- Erchul WP (2023). As we coach, so shall we consult: A perspective on coaching research in education. Journal of School Psychology, 96, 88–94. 10.1016/j.jsp.2022.10.004 [DOI] [PubMed] [Google Scholar]
- Eslava-Schmalbach J, Garzón-Orjuela N, Elias V, Reveiz L, Tran N, & Langlois EV (2019). Conceptual framework of equity-focused implementation research for health programs (EquIR). International Journal for Equity in Health, 18, 80. 10.1186/s12939-019-0984-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Evans SW, Owens JS, Bradshaw CP, & Weist MD (Eds.). (2023). Handbook of school mental health: Research, training, practice, and policy. Springer. [Google Scholar]
- Fabiano GA, & Evans SW (2019). Introduction to the special issue of school mental health on best practices in effective multi-tiered intervention frameworks. School Mental Health, 11, 1–3. 10.1007/s12310-018-9283-2 [DOI] [Google Scholar]
- Fabiano GA, Reddy LA, & Dudek CM (2018). Teacher coaching supported by formative assessment for improving classroom practices. School Psychology Quarterly. 10.1037/spq0000223 [DOI] [PubMed] [Google Scholar]
- Farr B, Gandomi M, & DeMatthews DE (2020). Implementing restorative justice in an urban elementary school: A principal’s commitment and experiences eliminating exclusionary discipline. Journal of Cases in Educational Leadership, 23(3), 48–62. 10.1177/1555458920922888 [DOI] [Google Scholar]
- Flannery KB, Hershfeldt P, & Freeman J (2018). Lessons learned on implementation of PBIS in high schools: Current trends and future directions. Center for Positive Behavioral Interventions and Supports (funded by OSEP, US Department of Education), University of Oregon Press. [Google Scholar]
- Forman SG, Shapiro ES, Codding RS, Gonzales JE, Reddy LA, Rosenfield SA, Sanetti LMH, & Stoiber KC (2013). Implementation science and school psychology. School Psychology Quarterly, 28(2), 77–100. 10.1037/spq0000019 [DOI] [PubMed] [Google Scholar]
- Frank HE, Kemp J, Benito KG, & Freeman JB (2022). Precision implementation: An approach to mechanism testing in implementation research. Administration and Policy in Mental Health and Mental Health Services Research, 49, 1084–1094. 10.1007/s10488-022-01218-x [DOI] [PubMed] [Google Scholar]
- Franks RP, & Bory CT (2015). Who supports the successful implementation and sustainability of evidence-based practices? Defining and understanding the roles of intermediary and purveyor organizations. New Directions for Child and Adolescent Development, 2015(149), 41–56. 10.1002/cad.20112 [DOI] [PubMed] [Google Scholar]
- Frazier SL, Chou T, Ouellette RR, Helseth SA, Kashem ER, & Cromer KD (2019). Workforce support for urban afterschool programs: Turning obstacles into opportunities. American Journal of Community Psychology, 63, 430–443. 10.1002/ajcp.12328 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gaias LM, Arnold KT, Lui FF, Pullmann MD, Duong MT, & Lyon AR (2022). Adapting strategies to promote implementation reach and equity (ASPIRE) in school mental health services. Psychology in the Schools, 59(12), 2471–2485. 10.1002/pits.22515 [DOI] [Google Scholar]
- Gaias LM, Cook CR, Brewer SK, Bruns EJ, & Lyon AR (2023). Addressing the “last mile” problem in educational research: Educational researchers’ interest, knowledge, and use of implementation science constructs. Educational Research and Evaluation, 28(7–8), 205–233. 10.1080/13803611.2023.2285440 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gale RC, Wu J, Erhardt T, Bounthavong M, Reardon CM, Damschroder LJ, & Midboe AM (2019). Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implementation Science, 14, 11. 10.1186/s13012-019-0853-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Giordano K, Eastin S, Calcagno B, Wilhelm S, & Gil A (2020). Examining the effects of internal versus external coaching on preschool teachers’ implementation of a framework of evidence-based social-emotional practices. Journal of Early Childhood Teacher Education, 42(4), 423–436. 10.1080/10901027.2020.1782545 [DOI] [Google Scholar]
- Girio-Herrera E, Egan TE, Owens JS, Evans SW, Coles EK, Holdaway AS, Mixon CS, & Kassab HD (2021). Teacher ratings of acceptability of a daily report card intervention prior to and during implementation: Relations to implementation integrity and student outcomes. School Mental Health, 13, 69–83. 10.1007/s12310-020-09400-y [DOI] [Google Scholar]
- Glasgow RE (2013). What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Education & Behavior, 40(3), 257–265. 10.1177/1090198113486805 [DOI] [PubMed] [Google Scholar]
- Glasgow RE, Fisher L, Strycker LA, Hessler D, Toobert DJ, King DK, & Jacobs T (2014). Minimal intervention needed for change: Definition, use, and value for improving health and health research. Translational Behavioral Medicine, 4(1), 26–33. 10.1007/s13142-013-0232-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Glover TA, Reddy LA, & Crouse K (2023). Instructional coaching actions that predict teacher classroom practices and student achievement. Journal of School Psychology, 96, 1–11. 10.1016/j.jsp.2022.10.006 [DOI] [PubMed] [Google Scholar]
- Godin G, Bélanger-Gravel A, Eccles M, & Grimshaw J (2008). Healthcare professionals’ intention and behaviours: A systematic review of studies based on social cognitive theories. Implementation Science, 3, 36. 10.1186/1748-5908-3-36 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gregory A, Osher D, Bear GG, Jagers RJ, & Sprague JR (2021). Good intentions are not enough: Centering equity in school discipline reform. School Psychology Review, 50(2–3), 206–220. 10.1080/2372966X.2020.1861911 [DOI] [Google Scholar]
- Hamilton AB, & Finley EP (2020). Qualitative methods in implementation research: An introduction. Psychiatry Research. 10.1016/j.psychres.2019.112516 [DOI] [PubMed] [Google Scholar]
- Heatly MC, Nichols-Hadeed C, Stiles AA, & Alpert-Gillis L (2023). Implementation of a school mental health learning collaborative model to support cross-sector collaboration. School Mental Health, 15, 384–401. 10.1007/s12310-023-09578-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Herman KC, Reinke WM, & Eddy CL (2020). Advances in understanding and intervening in teacher stress and coping: The coping-competence-context theory. Journal of School Psychology, 78, 69–74. 10.1016/j.jsp.2020.01.001 [DOI] [PubMed] [Google Scholar]
- Holdaway AS, & Owens JS (2015). The effect of training and consultation condition on teachers’ self-reported likelihood of adoption of a daily report card. Journal of Educational Psychology, 107(1), 222–235. 10.1037/a0037466 [DOI] [Google Scholar]
- Holmes SR, Reinke WM, Herman KC, & David K (2021). An examination of teacher engagement in intervention training and sustained intervention implementation. School Mental Health, 14, 63–72. 10.1007/s12310-021-09457-3 [DOI] [Google Scholar]
- Hoover S (2024). Investing in school mental health: Strategies to wisely spend federal and state funding. Psychiatric Services, 75(8), 801–806. 10.1176/appi.ps.20230553 [DOI] [PubMed] [Google Scholar]
- Huang KY, Nakigudde J, Kisakye EN, Sentongo H, Dennis-Tiwary TA, Tozan Y, Park H, & Brotman LM (2022). Advancing scalability and impacts of a teacher training program for promoting child mental health in Ugandan primary schools: Protocol for a hybrid-type II effectiveness-implementation cluster randomized trial. International Journal of Mental Health Systems, 16, 28. 10.1186/s13033-022-00538-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hugh ML, Johnson LD, & Cook C (2022). Preschool teachers’ selection of social communication interventions for children with autism: An application of the theory of planned behavior. Autism, 26(1), 188–200. 10.1177/13623613211024795 [DOI] [PubMed] [Google Scholar]
- Husabo E, Haugland BSM, McLeod BD, Baste V, Haaland ÅT, Bjaastad JF, Hoffart A, Raknes S, Fjermestad KW, Rapee RM, Ogden T, & Wergeland GJ (2021). Treatment fidelity in brief versus standard-length school-based interventions for youth with anxiety. School Mental Health, 14, 49–62. 10.1007/s12310-021-09458-2 [DOI] [Google Scholar]
- Institute of Education Sciences. (2022). Standards for excellence in education research. US Department of Education. https://ies.ed.gov/seer/. [Google Scholar]
- Institute of Education Sciences. (2023). School Pulse Panel. US Department of Education. https://ies.ed.gov/schoolsurvey/spp/. [Google Scholar]
- Johnson C, Wiltsey Stirman S, & La Bash H (2018). De-implementation of harmful, pseudo-scientific practices: An underutilized step in implementation research. The Behavior Therapist, 41(1), 32–35. [Google Scholar]
- Johnson SR, Pas ET, & Bradshaw CP (2016). Understanding and measuring coach–teacher alliance: A glimpse inside the ‘black box.’ Prevention Science, 17, 439–449. 10.1007/s11121-016-0633-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kassab HD, Owens JS, Evans SW, Everly EL, & Mikami AY (2023). Exploring intervention sustainment and intervention spread following a randomized clinical trial of the MOSAIC program. School Mental Health, 15, 402–415. 10.1007/s12310-022-09555-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kataoka S, Jaycox LH, Wong M, Nadeem E, Langley A, Tang L, & Stein BD (2011). Effects on school outcomes in low-income minority youth: Preliminary findings from a community-partnered study of a school-based trauma intervention. Ethnicity & Disease, 21(3, Suppl. 1), S1-71–S1-77. [PMC free article] [PubMed] [Google Scholar]
- Kemp CG, Wagenaar BH, & Haroz EE (2019). Expanding hybrid studies for implementation research: Intervention, implementation strategy, and context. Frontiers in Public Health. 10.3389/fpubh.2019.00325 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kilbourne A, Smith SN, Choi SY, Koschmann E, Liebrecht C, Rusch A, Abelson JL, Eisenberg D, Himle JA, Fitzgerald K, & Almirall D (2018). Adaptive school-based implementation of CBT (ASIC): Clustered-SMART for building an optimized adaptive implementation intervention to improve uptake of mental health interventions in schools. Implementation Science, 13, 119. 10.1186/s13012-018-0808-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kilbourne A, Chinman M, Rogal S, & Almirall D (2024). Adaptive designs in implementation science and practice: Their promise and the need for greater understanding and improved communication. Annual Review of Public Health. 10.1146/annurev-publhealth-060222-014438 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kim Y, Kim Y, Lee HJ, Lee S, Park SY, Oh SH, Jang S, Lee T, Ahn J, & Shin S (2022). The primary process and key concepts of economic evaluation in healthcare. Journal of Preventive Medicine & Public Health, 55(5), 415–423. 10.3961/jpmph.22.195 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kirk MA, Moore JE, Stirman SW, & Birken SA (2020). Towards a comprehensive model for understanding adaptations’ impact: The model for adaptation design and impact (MADI). Implementation Science, 15, 56. 10.1186/s13012-020-01021-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Komesidou R, & Hogan TP (2023). A generic implementation framework for school-based research and practice. Language, Speech, and Hearing Services in Schools, 54(4), 1165–1172. 10.1044/2023_LSHSS-22-00171 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kraft M, & Blazar D (2018). Taking teacher coaching to scale: Can personalized training become standard practice? Education Next, 18(4), 68. [Google Scholar]
- Kuriyan A, Kinkler G, Cidav Z, Kang-Yi C, Eiraldi R, Salas E, & Wolk CB (2021). Team strategies and tools to enhance performance and patient safety (TeamSTEPPS) to improve collaboration in school mental health: Protocol for a mixed methods hybrid effectiveness-implementation study. JMIR Research Protocols. 10.2196/26567 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kwak L, Toropova A, Powell BJ, Lengnick-Hall R, Jensen I, Bergström G, Elinder LS, Stigmar K, Wåhlin C, & Björklund C (2022). A randomized controlled trial in schools aimed at exploring mechanisms of changes of a multifaceted implementation strategy for promoting mental health at the workplace. Implementation Science, 17, 56. 10.1186/s13012-022-01230-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- LaBrot ZC, Dufrene BA, Whipple H, McCargo M, & Pasqua JL (2020). Targeted and intensive consultation for increasing head start and elementary teachers’ behavior-specific praise. Journal of Behavioral Education, 29, 717–740. 10.1007/s10864-019-09342-9 [DOI] [Google Scholar]
- Langley AK, Nadeem E, Kataoka SH, Stein BD, & Jaycox LH (2010). Evidence-based mental health programs in schools: Barriers and facilitators of successful implementation. School Mental Health, 2, 105–113. 10.1007/s12310-010-9038-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Langreo L (2022). One-time PD is not effective. Why do districts still rely on it? Education Week. https://www.edweek.org/leadership/one-time-pd-is-not-effective-why-do-districts-still-rely-on-it/2022/10. [Google Scholar]
- Larson M (2022). Advancing tailored implementation of evidence-based practice in School Mental Health (Order No. 29208465). [Doctoral dissertation, University of Minnesota]. Available from ProQuest Dissertations & Theses Global. (2705415571). https://www.proquest.com/dissertations-theses/advancing-tailored-implementation-evidence-based/docview/2705415571/se-2. [Google Scholar]
- Larson M, Cook CR, Brewer SK, Pullmann MD, Hamlin C, Merle JL, Duong M, Gaias L, Sullivan M, Morrell N, Kulkarni T, Weeks M, & Lyon AR (2021). Examining the effects of a brief, group-based motivational implementation strategy on mechanisms of teacher behavior change. Prevention Science, 22, 722–736. 10.1007/s11121-020-01191-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lawson GM, & Owens JS (2024). Practice partnerships for the development of school mental health interventions: An introduction to the special issue. School Mental Health, 16, 593–600. [Google Scholar]
- Leadbeater BJ, Gladstone EJ, & Sukhawathanakul P (2015). Planning for sustainability of an evidence-based mental health promotion program in Canadian elementary schools. American Journal of Community Psychology, 56, 120–133. 10.1007/s10464-015-9737-8 [DOI] [PubMed] [Google Scholar]
- Lee J, Frey A, Warner Z, & Kelley L (2019). Coaching to improve motivation in early childhood practitioners and parents. In Saracho O (Ed.), Contemporary perspectives on research in motivation in early childhood education (pp. 219–228). Information Age Publishing Inc. [Google Scholar]
- Lei H, Nahum-Shani I, Lynch K, Oslin D, & Murphy SA (2012). A “SMART” design for building individualized treatment sequences. Annual Review of Clinical Psychology, 8, 21–48. 10.1146/annurev-clinpsy-032511-143152 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, Aarons GA, Weiner BJ, & Chambers DA (2020). A systematic review of empirical studies examining mechanisms of implementation in health. Implementation Science, 15, 21. 10.1186/s13012-020-00983-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, Walsh-Bailey C, & Weiner B (2018). From classification to causality: Advancing understanding of mechanisms of change in implementation science. Frontiers in Public Health, 6, 136. 10.3389/fpubh.2018.00136 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lewis CC, Mettert K, & Lyon AR (2021a). Determining the influence of intervention characteristics on implementation success requires reliable and valid measures: Results from a systematic review. Implementation Research and Practice. 10.1177/2633489521994197 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lewis CC, Powell BJ, Brewer SK, Nguyen AM, Schriger SH, Vejnoska SF, Walsh-Bailey C, Aarons GA, Beidas RS, Lyon AR, Weiner B, Williams N, & Mittman B (2021b). Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration: Protocol for generating a research agenda. British Medical Journal Open. 10.1136/bmjopen-2021-053474 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu FF, Coifman J, McRee E, Stone J, Law A, Gaias L, Reyes R, Lai CK, Blair IV, Yu C, Cook H, & Lyon AR (2022a). A brief online implicit bias intervention for school mental health clinicians. International Journal of Environmental Research and Public Health, 19(2), 679. 10.3390/ijerph19020679 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Liu FF, Cruz RA, Rockhill CM, & Lyon AR (2019). Mind the gap: Considering disparities in implementing measurement-based care. Journal of the American Academy of Child & Adolescent Psychiatry, 58(4), 459–461. 10.1016/j.jaac.2018.11.015 [DOI] [PubMed] [Google Scholar]
- Liu FF, McRee E, Coifman J, Stone J, Lai CK, Yu C, & Lyon AR (2022b). School mental health professionals’ knowledge of stereotypes and implicit bias toward Black and Latinx youths. Psychiatric Services, 73, 1308–1311. 10.1176/appi.ps.202100253 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Locke J, Lawson GM, Beidas RS, Aarons GA, Xie M, Lyon AR, Stahmer A, Seidman M, Frederick L, Oh C, Spaulding C, Dorsey S, & Mandell DS (2019). Individual and organizational factors that affect implementation of evidence-based practices for children with autism in public schools: A cross-sectional observational study. Implementation Science. 10.1186/s13012-019-0877-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, & Bruns EJ (2019a). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11, 106–114. 10.1007/s12310-018-09306-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, & Bruns EJ (2019b). User-centered redesign of evidence-based psychosocial interventions to enhance implementation: Hospitable soil or better seeds? JAMA Psychiatry, 76(1), 3–4. 10.1001/jamapsychiatry.2018.3060 [DOI] [PubMed] [Google Scholar]
- Lyon AR, Coifman J, Cook H, McRee E, Liu FF, Ludwig K, Dorsey S, Koerner K, Munson SA, & McCauley E (2021). The Cognitive Walkthrough for Implementation Strategies (CWIS): A pragmatic method for assessing implementation strategy usability. Implementation Science Communications, 2, 78. 10.1186/s43058-021-00183-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Comtois KA, Kerns SEU, Landes SJ, & Lewis CC (2020a). Closing the science–practice gap in implementation before it widens. In Albers B, Shlonsky A, & Mildon R (Eds.), Implementation science 3.0 (pp. 295–313). Springer. 10.1007/978-3-030-03874-8_12 [DOI] [Google Scholar]
- Lyon AR, Cook CR, Brown EC, Locke J, Davis C, Ehrhart M, & Aarons GA (2018). Assessing organizational implementation context in the education sector: Confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science. 10.1186/s13012-017-0705-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Cook CR, Duong MT, Nicodimos S, Pullmann MD, Brewer SK, Gaias LM, & Cox S (2019a). The influence of a blended, theoretically-informed pre-implementation strategy on school-based clinician implementation of an evidence-based trauma intervention. Implementation Science, 14, 54. 10.1186/s13012-019-0905-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Cook CR, Larson M, Hugh ML, Dopp A, Hamlin C, Reinke P, Bose M, Law A, Goosey R, Goerdt A, Morrell N, Wackerle-Hollman A, & Pullmann MD (2024). Protocol for a hybrid type 3 effectiveness-implementation trial of a pragmatic individual-level implementation strategy for supporting school-based prevention programming. Implementation Science, 19, 2. 10.1186/s13012-023-01330-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Cook CR, Locke J, Davis C, Powell BJ, & Waltz TJ (2019b). Importance and feasibility of an adapted set of implementation strategies in schools. Journal of School Psychology, 76, 66–77. 10.1016/j.jsp.2019.07.014 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Dopp AR, Brewer SK, Kientz JA, & Munson SA (2020b). Designing the future of children’s mental health services. Administration and Policy in Mental Health and Mental Health Services Research, 47, 735–751. 10.1007/s10488-020-01038-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Liu FF, Connors EH, King KM, Coifman JI, Cook H, McRee E, Ludwig K, Law A, Dorsey S, & McCauley E (2022). How low can you go? Examining the effects of brief online training and post-training consultation dose on implementation mechanisms and outcomes for measurement-based care. Implementation Science Communications, 3, 79. 10.1186/s43058-022-00325-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Ludwig KA, Stoep AV, Gudmundsen G, & McCauley E (2013). Patterns and predictors of mental healthcare utilization in schools and other service sectors among adolescents at risk for depression. School Mental Health, 5, 155–165. 10.1007/s12310-012-9097-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Munson SA, Reddy M, Schueller SM, Agapie E, Yarosh S, Dopp A, von Thiele Schwarz U, Doherty G, Graham AK, Kruzan KP, Kornfield R (2023). Bridging HCI and implementation science for innovation adoption and public health impact. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (CHI EA ‘23), April 23–28, 2023, Hamburg, Germany: (p. 7). ACM. 10.1145/3544549.3574132. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lyon AR, Munson SA, Renn BN, Atkins DC, Pullmann MD, Friedman E, & Areán PA (2019c). Use of human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: Protocol for studies applying a framework to assess usability. JMIR Research Protocols, 8(10), e14990. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Malik K, Michelson D, Doyle AM, Weiss HA, Greco G, Sahu R, James EJ, Mathur S, Sudhir P, King M, Cuijpers P, Chorpita B, Fairburn CG, & Patel V (2021). Effectiveness and costs associated with a lay counselor-delivered, brief problem-solving mental health intervention for adolescents in urban, low-income schools in India: 12-month outcomes of a randomized controlled trial. PLoS Medicine, 18(9), e1003778. 10.1371/journal.pmed.1003778 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Margherio SM, Evans SW, DuPaul GJ, Allan DM, & Owens JS (2023). Effects of compliance to a training intervention for high school students with ADHD. Journal of Clinical Child & Adolescent Psychology, 53(3), 429–443. 10.1080/15374416.2023.2292030 [DOI] [PubMed] [Google Scholar]
- Marx T, Klein E, Colpo A, Walden-Doppke M, Reinhardt E, & Butler C (2020). Coaching for ongoing professional learning within tiered support models. National Center on Intensive Intervention, Office of Special Education Programs, U.S. Department of Education. [Google Scholar]
- Massar MM, Horner RH, Kittelman A, & Conley KM (2023). Mechanisms of effective coaching: Using prompting and performance feedback to improve teacher and student outcomes. Journal of Positive Behavior Interventions, 25(3), 169–184. 10.1177/10983007221133524 [DOI] [Google Scholar]
- McIntosh K, Mercer SH, Nese RNT, Strickland-Cohen MK, & Hoselton R (2016). Predictors of sustained implementation of school-wide positive behavioral interventions and supports. Journal of Positive Behavior Interventions, 18(4), 209–218. 10.1177/1098300715599737 [DOI] [Google Scholar]
- McKay VR, Morshed AB, Brownson RC, Proctor EK, & Prusaczyk B (2018). Letting go: Conceptualizing intervention de-implementation in public health and social service settings. American Journal of Community Psychology, 62(1–2), 189–202. 10.1002/ajcp.12258 [DOI] [PMC free article] [PubMed] [Google Scholar]
- McLeod BD, Cook CR, Sutherland KS, Lyon AR, Dopp A, Broda M, & Beidas RS (2022). A theory-informed approach to locally managed learning school systems: Integrating treatment integrity and youth mental health outcome data to promote youth mental health. School Mental Health, 14, 88–102. 10.1007/s12310-021-09413-1 [DOI] [Google Scholar]
- Merle JL, Cook CR, Pullmann MD, Larson MF, Hamlin CM, Hugh ML, Brewer SK, Duong MT, Bose M, & Lyon AR (2023). Longitudinal effects of a motivationally focused strategy to increase the yield of training and consultation on teachers’ adoption and fidelity of a universal program. School Mental Health, 15, 105–122. 10.1007/s12310-022-09536-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- Meurer WJ, Lewis RJ, Tagle D, Fetters MD, Legocki L, Berry S, Connor J, Durkalski V, Elm J, Zhao W, Frederiksen S, Silbergleit R, Palesch Y, Berry DA, & Barsan WG (2012). An overview of the adaptive designs accelerating promising trials into treatments (ADAPT-IT) project. Annals of Emergency Medicine, 60(4), 451–457. 10.1016/j.annemergmed.2012.01.020 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Meyer AE, Rodriguez-Quintana N, Miner K, Bilek EL, Vichich J, Smith SN, & Koschmann E (2022). Developing a statewide network of coaches to support youth access to evidence-based practices. Implementation Research and Practice. 10.1177/26334895221101 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Miller CJ, Wiltsey-Stirman S, & Baumann AA (2020). Iterative Decision-making for Evaluation of Adaptations (IDEA): A decision tree for balancing adaptation, fidelity and intervention impact. Journal of Community Psychology, 48(4), 1163–1177. 10.1002/jcop.22279 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Moir T (2018). Why is implementation science important for intervention design and evaluation within educational settings? Frontiers in Education. 10.3389/feduc.2018.00061 [DOI] [Google Scholar]
- Moore SA, Arnold KT, Beidas RS, & Mendelson T (2021). Specifying and reporting implementation strategies used in a school-based prevention efficacy trial. Implementation Research and Practice. 10.1177/26334895211047841 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Moore SA, Cooper JM, Malloy J, & Lyon AR (2024). Core components and implementation determinants of multilevel service delivery frameworks across child mental health service settings. Administration and Policy in Mental Health and Mental Health Services Research, 51, 172–195. 10.1007/s10488-023-01320-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Moran L, Koester KA, Le Tourneau N, Coffey S, Moore K, Broussard J, Crouch P, VanderZander L, Schneider J, Lynch E, Roman J, & Christopoulos KA (2023). The rapid interaction: A qualitative study of provider approaches to implementing Rapid ART. Implementation Science Communications, 4, 78. 10.1186/s43058-023-00464-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Moullin JC, Sklar M, Green A, Dickson KS, Stadnick NA, Reeder K, & Aarons GA (2020). Advancing the pragmatic measurement of sustainment: A narrative review of measures. Implementation Science Communications, 1, 76. 10.1186/s43058-020-00068-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Munson SA, Friedman EC, Osterhage K, Allred R, Pullmann MD, Areán PA, Lyon AR, UW ALACRITY Center Researchers. (2022). Usability issues in evidence-based psychosocial interventions and implementation strategies: Cross-project analysis. Journal of Medical Internet Research. 10.2196/37585 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Murthy VH (2022). The mental health of minority and marginalized young people: An opportunity for action. Public Health Reports, 137(4), 613–616. 10.1177/00333549221102390 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nadeem E, McNamee E, Lang JM, Perry D, & Lich KH (2023). Novel application of system support mapping for sustainment of trauma-focused mental health intervention in school-based health centers: A case study. Evidence-Based Practice in Child and Adolescent Mental Health, 8(2), 286–302. 10.1080/23794925.2022.2056928 [DOI] [Google Scholar]
- Nadeem E, & Ringle VA (2016). De-adoption of an evidence-based trauma intervention in schools: A retrospective report from an urban school district. School Mental Health, 8, 132–143. 10.1007/s12310-016-9179-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nadeem E, Saldana L, Chapman J, & Schaper H (2018). A mixed method study of the stages of implementation for an evidence-based trauma intervention in schools. Behavior Therapy, 49(4), 509–524. 10.1016/j.beth.2017.12.004 [DOI] [PMC free article] [PubMed] [Google Scholar]
- National Institutes of Health (2019). PAR-19–274: Dissemination and implementation research in health (R01 Clinical Trial Optional). U.S. Department of Health and Human Services. [Google Scholar]
- National Institutes of Health (2024). Science of Implementation in Health and Healthcare–SIHH. National Institutes of Health Center for Scientific Review. https://public.csr.nih.gov/StudySections/DABP/HSS/SIHH. [Google Scholar]
- Nevedal AL, Reardon CM, Widerquist MAO, Jackson GL, Cutrona SL, White BS, & Damschroder LJ (2021). Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). Implementation Science, 16, 67. 10.1186/s13012-021-01111-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nilsen P, Ingvarsson S, Hasson H, von Thiele Schwarz U, & Augustsson H (2020). Theories, models, and frameworks for de-implementation of low-value care: A scoping review of the literature. Implementation Research and Practice. 10.1177/2633489520953762 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Noell GH, & Gansle KA (2014). Research examining the relationships between consultation procedures, treatment integrity, and outcomes. In Erchul WP& Sheridan SM(Eds.), Handbook of research in school consultation (2nd ed., pp. 386–408). Routledge. [Google Scholar]
- Norman DA, & Draper SW (1986). User centered system design: New perspectives on human–computer interaction. CRC Press. 10.1201/9780367807320 [DOI] [Google Scholar]
- Norton WE, & Chambers DA (2020). Unpacking the complexities of de-implementing inappropriate health interventions. Implementation Science, 15, 2. 10.1186/s13012-019-0960-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Olswang LB, & Prelock PA (2015). Bridging the gap between research and practice: Implementation Science. Journal of Speech, Language, and Hearing Research, 58(6), S1818–S1826. 10.1044/2015_JSLHR-L-14-0305 [DOI] [PubMed] [Google Scholar]
- Ormiston CK, & Williams F (2022). LGBTQ youth mental health during COVID-19: Unmet needs in public health and policy. The Lancet, 399, 501–503. 10.1016/s0140-6736(21)02872-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ouellette RR, Strambler MJ, Genovese MA, Selino S, Joyner L, Sevin S, Granzow E, & Connors EH (2024). Selecting, adapting and implementing classroom kernels for student social and emotional development and resilience in local elementary schools: A Community–university partnership approach. School Mental Health. 10.1007/s12310-024-09639-9 [DOI] [Google Scholar]
- Owens JS, Allan DM, Hustus C, & Erchul WP (2018). Examining correlates of teacher receptivity to social influence strategies within a school consultation relationship. Psychology in the Schools, 55(9), 1041–1055. 10.1002/pits.22163 [DOI] [Google Scholar]
- Owens JS, Coles EK, Evans SW, Himawan LK, Girio-Herrera E, Holdaway AS, Zoromski AK, Schamberg T, & Schulte AC (2017a). Using multi-component consultation to increase the integrity with which teachers implement behavioral classroom interventions: A pilot study. School Mental Health, 9, 218–234. 10.1007/s12310-017-9217-4 [DOI] [Google Scholar]
- Owens JS, Evans SW, Coles EK, Holdaway AS, Himawan LK, Mixon CS, & Egan TE (2020). Consultation for classroom management and targeted interventions: Examining benchmarks for teacher practices that produce desired change in student behavior. Journal of Emotional and Behavioral Disorders, 28(1), 52–64. 10.1177/1063426618795440 [DOI] [Google Scholar]
- Owens JS, Exner-Cortens D, Cappella E, DeShazer M, May N, Seipp J, Claussen C, Zieg N, & Garcia N (2024). Partnering with educators to iteratively co-create tools to support teachers’ use of equity-focused positive behavioral supports. School Mental Health. 10.1007/s12310-024-09653-x [DOI] [Google Scholar]
- Owens JS, Lee M, Kassab H, Evans SW, & Coles EC (2021). Motivational ruler ratings among teachers receiving coaching in classroom management: Measurement and relationship to implementation integrity. Prevention Science, 22(6), 769–774. 10.1007/s11121-020-01111-9 [DOI] [PubMed] [Google Scholar]
- Owens JS, Lyon AR, Brandt NE, Masia Warner C, Nadeem E, Spiel C, & Wagner M (2014). Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health, 6, 99–111. 10.1007/s12310-013-9115-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Owens JS, Schwartz ME, Erchul WP, Himawan LK, Evans SE, Coles EK, & Schulte AC (2017b). Teacher perceptions of school consultant social influence strategies: Replication and expansion. Journal of Educational and Psychological Consultation, 27(4), 411–436. 10.1080/10474412.2016.275649 [DOI] [Google Scholar]
- Palinkas LA (2014). Qualitative and mixed methods in mental health services and implementation research. Journal of Clinical Child & Adolescent Psychology, 43(6), 851–861. 10.1080/15374416.2014.910791 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Partee A, Williford A, & Whittaker J (2022). Implementing banking time with teachers and preschoolers displaying disruptive behaviors: Links between consultant-teacher relationship quality, implementation fidelity and dosage, and dyadic teacher–child interactions. School Mental Health, 14, 1–16. 10.1007/s12310-021-09467-1 [DOI] [Google Scholar]
- Pas ET, Borden L, Debnam KJ, De Lucia D, & Bradshaw CP (2022). Exploring profiles of coaches’ fidelity to Double Check’s Motivational Interviewing-embedded coaching: Outcomes associated with fidelity. Journal of School Psychology, 92, 285–298. 10.1016/j.jsp.2022.04.003 [DOI] [PubMed] [Google Scholar]
- Pas ET, Bradshaw CP, Becker KD, Domitrovich C, Berg J, Musci R, & Ialongo NS (2015). Identifying patterns of coaching to support the implementation of the Good Behavior Game: The role of teacher characteristics. School Mental Health, 7, 61–73. 10.1007/s12310-015-9145-0 [DOI] [Google Scholar]
- Pas ET, Kaiser LT, & Owens JS (2023). Innovative approaches to coaching teachers in implementing Tier 1 and Tier 2 classroom interventions. In Evans SW, Owens JS, Bradshaw CP, & Weist MD(Eds.), Handbook of school mental health: Innovations in science and practice (pp. 343–355). Springer. 10.1007/978-3-031-20006-9_23 [DOI] [Google Scholar]
- Pellecchia M, Marcus SC, Spaulding C, Seidman M, Xie M, Rump K, Reisinger EM, & Mandell DS (2020). Randomized trial of a computer-assisted intervention for children with autism in schools. Journal of the American Academy of Child & Adolescent Psychiatry, 59(3), 373–380. 10.1016/j.jaac.2019.03.029 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pettersson K, Liedgren P, Lyon AR, Hasson H, & von Thiele Schwarz U (2024). Fidelity-consistency and deliberateness of modifications in parenting programs. Implementation Science Communications, 5, 13. 10.1186/s43058-024-00545-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pfiffner LJ, Dvorsky MR, Friedman LM, Haack LM, Chung S, Charalel JM, Hawkey E, & Spiess M (2022). Development of a web-based training platform for school clinicians in evidence-based practices for ADHD. School Mental Health, 15, 49–66. 10.1007/s12310-022-09556-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pianta RC, Lipscomb D, & Ruzek E (2022). Indirect effects of coaching on pre-K students’ engagement and literacy skill as a function of improved teacher–student interaction. Journal of School Psychology, 91, 65–80. 10.1016/j.jsp.2021.12.003 [DOI] [PubMed] [Google Scholar]
- Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A, & Taylor SJC (2017). Standards for reporting implementation studies (StaRI) statement. BMJ, 356, i6795. 10.1136/bmj.i6795 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Proctor EK, Bunger AC, Lengnick-Hall R, Gerke DR, Martin JK, Phillips RJ, & Swanson JC (2023). Ten years of implementation outcomes research: A scoping review. Implementation Science, 18, 31. 10.1186/s13012-023-01286-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- Prusaczyk B, Swindle T, & Curran G (2020). Defining and conceptualizing outcomes for de-implementation: Key distinctions from implementation outcomes. Implementation Science Communications, 1, 43. 10.1186/s43058-020-00035-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Reinke WM, Herman KC, Thompson A, Copeland C, McCall CS, Holmes S, & Owens SA (2021). Investigating the longitudinal association between fidelity to a large-scale comprehensive school mental health prevention and intervention model and student outcomes. School Psychology Review, 50(1), 17–29. 10.1080/2372966x.2020.1870869 [DOI] [Google Scholar]
- Reinke WM, Lewis-Palmer T, & Merrell K (2008). The classroom check-up: A classwide teacher consultation model for increasing praise and decreasing disruptive behavior. School Psychology Review, 37(3), 315–332. [PMC free article] [PubMed] [Google Scholar]
- Renshaw TL, & Phan ML (2023). Using implementation reporting to advance culturally sensitive and equity-focused mindfulness programs in schools. Mindfulness, 14, 307–313. 10.1007/s12671-023-02068-w [DOI] [Google Scholar]
- Rojas-Andrade R, & Bahamondes LL (2018). Is implementation fidelity important? A systematic review on school-based mental health programs. Contemporary School Psychology, 23, 339–350. 10.1007/s40688-018-0175-0 [DOI] [Google Scholar]
- Rose EJ (2016). Design as advocacy: Using a human-centered approach to investigate the needs of vulnerable populations. Journal of Technical Writing and Communication, 46(4), 427–445. 10.1177/0047281616653494 [DOI] [Google Scholar]
- Saldana L, Chamberlain P, Bradford WD, Campbell M, & Landsverk J (2014). The cost of implementing new strategies (COINS): A method for mapping implementation resources using the stages of implementation completion. Children and Youth Services Review, 39, 177–182. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sanetti LMH, Charbonneau S, Knight A, Cochrane WS, Kulcyk MCM, & Kraus KE (2020). Treatment fidelity reporting in intervention outcome studies in the school psychology literature from 2009 to 2016. Psychology in the Schools, 57, 901–922. 10.1002/pits.22364 [DOI] [Google Scholar]
- Sanetti LMH, & Collier-Meek MA (2019). Increasing implementation science literacy to address the research-to-practice gap in school psychology. Journal of School Psychology, 76, 33–47. 10.1016/j.jsp.2019.07.008 [DOI] [PubMed] [Google Scholar]
- Scaletta M, & Tejero Hughes M (2021). Sustained positive behavioral interventions and supports implementation: School leaders discuss their processes and practices. Journal of Positive Behavior Interventions, 23(1), 30–41. 10.1177/1098300720924350 [DOI] [Google Scholar]
- Shelton RC, Adsul P, Oh A, Moise N, & Griffith DM (2021). Application of an antiracism lens in the field of implementation science (IS): Recommendations for reframing implementation research with a focus on justice and racial equity. Implementation Research and Practice. 10.1177/26334895211049482 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shelton RC, & Brownson RC (2023). Enhancing impact: A call to action for equitable implementation science. Prevention Science. 10.1007/s11121-023-01589-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shernoff ES, Frazier S, Lisetti CL, Buche C, Lunn SJ, Brown C, Delmarre A, Chou T, Gabbard JL, & Morgan E (2018). Early career teacher professional development: Bridging simulation technology with evidence-based behavior management. The Journal of Technology and Teacher Education, 26(2), 299–326. [Google Scholar]
- Smith SN, Almirall D, Choi SY, Koschmann E, Rusch A, Bilek E, Lane A, Abelson JL, Eisenberg D, Himle JA, Fitzgerald KD, Liebrecht C, & Kilbourne AM (2022). Primary aim results of a clustered SMART for developing a school-level, adaptive implementation strategy to support CBT delivery at high schools in Michigan. Implementation Science, 17, 42. 10.1186/s13012-022-01211-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Solving implementation challenges. (2024). D3center; Institute for Social Research, University of Michigan. https://d3c.isr.umich.edu/solving-implementation-challenges/#mismart [Google Scholar]
- Sperber NR, Bruening RA, Choate A, Mahanna E, Wang V, Powell BJ, Damush T, Jackson GL, Van Houtven CH, Allen K, & Hastings SN (2019). Implementing a mandated program across a regional health care system: A rapid qualitative assessment to evaluate early implementation strategies. Quality Management in Health Care, 28(3), 147–154. 10.1097/QMH.0000000000000221 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stahmer AC, Reed S, Lee E, Reisinger EM, Connell JE, & Mandell DS (2015). Training teachers to use evidence-based practices for autism: Examining procedural implementation fidelity. Psychology in the Schools, 52(2), 181–195. 10.1002/pits.21815 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Steiner ED, & Woo A (2021). Job-related stress threatens the teacher supply: Key findings from the 2021 state of the U.S. teacher survey. RAND Corporation. 10.7249/RRA1108-1 [DOI] [Google Scholar]
- Stiles-Shields C, Cummings C, Montague E, Plevinsky JM, Psihogios AM, & Williams KDA (2022). A call to action: Using and extending human-centered design methodologies to improve mental and behavioral health equity. Frontiers in Digital Health. 10.3389/fdgth.2022.848052 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stirman SW, Baumann AA, & Miller CJ (2019). The FRAME: An expanded framework for reporting adaptations and modifications to evidence-based interventions. Implementation Science, 14, 58. 10.1186/s13012-019-0898-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, & Charns M (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7, 17. 10.1186/1748-5908-7-17 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stormont M, Reinke WM, Newcomer L, Marchese D, & Lewis C (2015). Coaching teachers’ use of social behavior interventions to improve children’s outcomes: A review of the literature. Journal of Positive Behavior Interventions, 17(2), 69–82. 10.1177/1098300714550657 [DOI] [Google Scholar]
- Sugai G, & Horner RH (2006). A promising approach for expanding and sustaining school-wide positive behavior support. School Psychology Review, 35(2), 245–259. [Google Scholar]
- Sutherland KS, Conroy MA, McLeod BD, Algina J, & Wu E (2018). Teacher competence of delivery of BEST in CLASS as a mediator of treatment effects. School Mental Health, 10, 214–225. 10.1007/s12310-017-9224-5 [DOI] [Google Scholar]
- Sutherland KS, Conroy MA, Vo A, & Ladwig C (2015). Implementation integrity of practice-based coaching: Preliminary results from the BEST in CLASS efficacy trial. School Mental Health, 7, 21–33. 10.1007/s12310-014-9134-8 [DOI] [Google Scholar]
- Sutherland KS, & McLeod BD (2022). Advancing the science of integrity measurement in school mental health research. School Mental Health, 14, 1–6. 10.1007/s12310-021-09468-0 [DOI] [Google Scholar]
- Taylor B, Henshall C, Kenyon S, Litchfield I, & Greenfield S (2018). Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open, 8(10), e019993. 10.1136/bmjopen-2017-019993 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, & Reed JE (2014). Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Quality & Safety, 23, 290–298. 10.1136/bmjqs-2013-001862 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Thayer AJ, Cook CR, Davis C, Brown EC, Locke J, Ehrhart MG, Aarons GA, Picozzi E, & Lyon AR (2022). Construct validity of the school-implementation climate scale. Implementation Research and Practice. 10.1177/26334895221116065 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Thompson KM (2022). The Role of Internal and External Coaches in Supporting Implementation of Proficiency Based Learning at the Secondary Level (Order No. 29385963). [Doctoral Dissertation, University of Maine]. Available from ProQuest Dissertations & Theses Global. (2724233723). https://www.proquest.com/dissertations-theses/role-internal-external-coaches-supporting/docview/2724233723/se-2. [Google Scholar]
- Turner L, Calvert HG, Fleming CM, Lewis T, Siebert C, Anderson N, Castleton T, Havlick A, & McQuilkin M (2022). Study protocol for a cluster-randomized trial of a bundle of implementation support strategies to improve the fidelity of implementation of schoolwide Positive Behavioral Interventions and Supports in rural schools. Contemporary Clinical Trials Communications. 10.1016/j.conctc.2022.100949 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Valenstein-Mah H, Greer N, McKenzie L, Hansen L, Strom TQ, Wiltsey Stirman S, Wilt TJ, & Kehle-Forbes SM (2020). Effectiveness of training methods for delivery of evidence-based psychotherapies: A systematic review. Implementation Science. 10.1186/s13012-020-00998-w [DOI] [PMC free article] [PubMed] [Google Scholar]
- Vindrola-Padros C, & Johnson GA (2020). Rapid techniques in qualitative research: A critical review of the literature. Qualitative Health Research, 30(10), 1596–1604. 10.1177/1049732320921835 [DOI] [PubMed] [Google Scholar]
- Washington-Nortey M, Granger K, Sutherland KS, Conroy M, Kaur N, & Hetrick A (2023). Sustaining BEST in CLASS: Teacher-reported evidence-based practice use with students at risk for emotional and behavioral disorders amidst the COVID-19 Pandemic. School Mental Health, 15, 470–483. 10.1007/s12310-022-09561-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Weist MD, Hoover S, Daly BP, Short KH, & Bruns EJ (2023). Propelling the global advancement of school mental health. Clinical Child and Family Psychology Review, 26, 851–864. 10.1007/s10567-023-00434-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wensing M, Sales A, Wilson P, Armstrong R, Kislov R, Rankin NM, Ramaswamy R, & Xu D (2021). Implementation Science and Implementation Science Communications: A refreshed description of the journals’ scope and expectations. Implementation Science, 16, 103. 10.1186/s13012-021-01175-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Westerlund A, Nilsen P, & Sundberg L (2019). Implementation of implementation science knowledge: The research-practice gap paradox. Worldviews on Evidence-Based Nursing, 16(5), 332–334. 10.1111/wvn.12403 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wilk AS, Hu J-C, Wen H, & Cummings JR (2022). Recent trends in school-based mental health services among low-income and racial and ethnic minority adolescents. JAMA Pediatrics, 176(8), 813–815. 10.1001/jamapediatrics.2022.1020 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wilkinson S, Freeman J, Simonsen B, Sears S, Byun SG, Xu X, & Luh HJ (2020). Professional development for classroom management: A review of the literature. Educational Research and Evaluation, 26(3–4), 182–212. 10.1080/13803611.2021.1934034 [DOI] [Google Scholar]
- Williams NJ (2016). Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Administration and Policy in Mental Health and Mental Health Services Research, 43, 783–798. 10.1007/s10488-015-0693-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Woodbridge MW, Sumi CW, Thornton PS, Fabrikant N, Rouspil KM, Langley AK, & Kataoka SH (2016). Screening for trauma in early adolescence: Findings from a diverse school district. School Mental Health, 8, 89–105. 10.1007/s12310-015-9169-5 [DOI] [Google Scholar]
- Woodbridge MW, Sumi CW, Yu J, Rouspil K, Javitz HS, Seeley JR, & Walker HM (2014). Implementation and sustainability of an evidence-based program: Lessons learned from the PRISM applied to first step to success. Journal of Emotional and Behavioral Disorders, 22(2), 95–106. 10.1177/1063426613520456 [DOI] [Google Scholar]
- Woodward EN, Matthieu MM, Uchendu US, Rogal S, & Kirchner JE (2019). The health equity implementation framework: Proposal and preliminary study of hepatitis C virus treatment. Implementation Science, 14, 26. 10.1186/s13012-019-0861-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Zhang Y, Cook CR, Azad GF, Larson M, Merle JL, Thayer J, Pauls A, & Lyon AR (2023). A pre-implementation enhancement strategy to increase the yield of training and consultation for school-based behavioral preventive practice: A triple-blind randomized controlled trial. Prevention Science, 24, 552–566. 10.1007/s11121-022-01464-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
