Abstract
Implementation strategies are methods or techniques used to enhance the adoption, implementation, and sustainment of a new program or practice. Recent studies have facilitated implementation strategy prioritization by mapping strategies based on their feasibility and importance, but these efforts have not been replicated across distinct service delivery contexts. The aim of the current project was to evaluate the feasibility and importance of an education-adapted taxonomy of implementation strategies and to directly compare feasibility and importance ratings to the original Expert Recommendations for Implementing Change (ERIC) taxonomy, the leading compilation of implementation strategies in healthcare. A sample of 200 school-based consultants who support social, emotional, and mental health services provided ratings of feasibility and importance for each of the 75 strategies included in the adapted School Implementation Strategies, Translating ERIC Resources (SISTER) compilation. Results identified strategies rated as: (a) both feasible and important, (b) important but not feasible, (c) feasible but not important, and (d) neither feasible nor important. When mapped onto scatterplots using feasibility and importance ratings, comparison of ERIC and SISTER ratings indicated that approximately one third of the strategies shifted from one quadrant of the feasibility and importance plot to another. Findings demonstrate the value of efforts to adapt and generalize existing implementation products to novel service settings, such as schools. Additionally, findings assist implementation researchers and practitioners in prioritizing the selection of actionable and practically relevant implementation strategies to advance the quality of school mental health services.
Keywords: Implementation strategies, adaptation, replication, schools, education
The persistent gaps between evidence-based practices (EBP) and routine services across healthcare and social service sectors, such as schools (National Research Council, 2012; See, Gorard, & Siddiqui, 2016), have underscored the importance of identifying specific strategies that can support EBP adoption, effective delivery, and sustainment. As a field, implementation science has moved purposefully away from characterizing barriers and facilitators and toward identifying, testing, and comparing active implementation strategies, defined as methods or techniques used to enhance the adoption, implementation, and sustainment of a program or practice (Proctor, Powell, & McMillen, 2013). Strategies vary widely and may be designed to impact multiple system levels, including front line practitioners (e.g., via training and coaching), organizational leadership and climate (e.g., by selecting or preparing leaders to strategically support implementation climate), aspects of the outer setting (e.g., via policy changes), or specific features of the interventions applied (e.g., improving design quality). These efforts have begun to yield critical information about effective implementation strategies and strategy tracking methods (Boyd, Powell, Endicott, & Lewis, 2017; Bunger et al., 2017) and have even facilitated emerging work surrounding the mechanisms through which strategies impact implementation outcomes (Lewis et al., 2018; Williams, 2016). Nevertheless, advancement has not been uniform across service sectors. Although “pockets of excellence” have advanced the field considerably within particular service settings or domains (e.g., cancer care, Mitchell & Chambers, 2017; mental health, Lewis et al., 2016), these advancements often require adaptation and replication in novel contexts and populations of service providers and recipients (Aarons, Sklar, Mustanski, Benbow, & Brown, 2017; Cook, Lyon, Locke, Waltz, & Powell, 2019).

The current study sought to replicate findings from the most comprehensive effort to date to compile implementation strategies in healthcare (Powell et al., 2012, 2015; Waltz et al., 2014, 2015) and extend them to the delivery of mental health services in schools. Implementation research in healthcare is considerably more advanced than in many other service sectors, including schools (Sanetti & Collier-Meek, 2019), and provides a foundation upon which other sectors may build their own rich implementation literatures.

Schools are the most common setting for youth mental health service delivery (Merikangas et al., 2011), but one in which EBP are used inconsistently, if at all (Owens et al., 2014). Indeed, consistent findings from schools and other service sectors that standard “train and hope” approaches to professional development (which lack follow-up supports) are ineffective in producing behavior change (Herschell et al., 2010; Joyce & Showers, 2002) suggest that additional deliberate strategies are needed to ensure the impact of services for youth and adults. Replication of research from other service sectors in schools provides an opportunity to inform school-based implementation as well as to evaluate the extent to which existing implementation research products generalize beyond traditional health service settings to novel contexts.
Implementation Strategies
Recent improvements in classification and reporting standards have fueled advancements in the identification, tracking, and testing of implementation strategies (Pinnock et al., 2017; Proctor et al., 2013). To a large extent, contemporary implementation science has become focused on the identification of implementation strategies (Powell et al., 2015) that can address malleable determinants (i.e., barriers and facilitators) of successful implementation (Krause et al., 2014) with the goal of bringing about favorable implementation outcomes (e.g., acceptability, appropriateness, adoption, cost, feasibility, fidelity, penetration, sustainment; Proctor et al., 2011). According to Powell and colleagues (2012), implementation strategies can be discrete or multifaceted. Discrete strategies involve single actions or processes, such as reminder systems or conducting educational meetings (e.g., trainings). Multifaceted strategies combine two or more discrete strategies, such as initiatives that provide both initial educational meetings and ongoing post-training consultation or coaching. Some multifaceted strategies have been formally branded (e.g., the Leadership and Organizational Change for Implementation intervention; Aarons, Ehrhart, Farahnak, & Hurlburt, 2015). As noted earlier, different strategies often target different levels of the destination setting (e.g., outer setting, inner setting, individuals, intervention; Damschroder et al., 2009).
The Expert Recommendations for Implementing Change (ERIC) Project has provided one of the most significant contributions to a common language for describing implementation strategies in health (Waltz et al., 2014). ERIC built upon previous work in which implementation researchers developed an initial taxonomy of implementation strategies (Powell et al., 2012). The taxonomy was then refined with input from a variety of implementation researchers and practitioners, yielding 73 unique strategies (Powell et al., 2015), placed into nine categories (Engage Consumers, Use Evaluative and Iterative Strategies, Change Infrastructure, Adapt and Tailor to Context, Develop Stakeholder Interrelationships, Utilize Financial Strategies, Support Clinicians, Provide Interactive Assistance, Train and Educate Stakeholders), and organized based on stakeholder ratings of strategy feasibility and importance (Waltz et al., 2015). Feasibility and importance were selected by the ERIC researchers because, collectively, they cover two critical features of strategies that are likely to drive perceptions of implementation strategy acceptability – and, ultimately, likelihood of use – by implementation stakeholders (Waltz et al., 2015). Indeed, ratings of feasibility and importance provide key inputs into implementation strategy selection, such that those with high feasibility and high importance represent a subset of high priority strategies that may be easier to deliver and more likely to have an impact on implementation outcomes.
Despite their innovation and potential for impact, stakeholder ratings such as those compiled in the ERIC project have not been replicated outside of traditional healthcare settings. Nevertheless, service delivery contexts vary considerably in the ways that they facilitate or inhibit implementation, necessitating studies that can help to determine the extent to which existing implementation strategies are feasible and important in novel settings. Replication of key components of the ERIC project in other settings is a critical next step in evaluating the extent to which implementation strategies, and their relative utility, are context-dependent or generalizable.
School Context
Up to 80% of youth who engage in mental healthcare in the United States receive those services in schools (Costello, He, Sampson, Kessler, & Merikangas, 2014; Farmer, Burns, Phillips, Angold, & Costello, 2003; Langer et al., 2015; Lyon, Ludwig, VanderStoep, Gudmundsen, & McCauley, 2013; Merikangas et al., 2011). Furthermore, schools have been found to facilitate service access for historically underserved populations, such as youth from ethnic minority backgrounds (Kataoka, Stein, Nadeem, & Wong, 2007; Lyon et al., 2013). Unfortunately, the services delivered are rarely evidence-based, representing a substantial missed opportunity for improving public health and necessitating the identification of contextually appropriate implementation strategies (Gottfredson & Gottfredson, 2002; Owens et al., 2014; Rones & Hoagwood, 2000).
A diverse set of personnel engage in social, emotional, and mental health service delivery in schools (Owens et al., 2014). Among them are school-based consultants who, although they may deliver direct services themselves, frequently operate as implementation intermediaries or local champions tasked with supporting the delivery of EBP across multiple levels of care (Bruns et al., 2016). The perspectives of these personnel, who are responsible for conducting much of the practice of implementation in schools, are particularly critical when selecting or prioritizing implementation strategies.
Implementation strategies in schools.
Given that the school context has been characterized by several unique implementation determinants – including seasonal educational timelines, professional characteristics (i.e., diverse and largely non-healthcare employees), policies, and organizational constraints (Forman et al., 2013; Owens et al., 2014) – strategies designed to support clinical practice in more traditional healthcare settings (e.g., primary care, specialty mental health) may require adaptation. Depending on the specifics of the implementation effort, these unique determinants may hinder or facilitate success and may accordingly need to be overcome or leveraged via implementation strategies. For instance, implementing a traditional mental health intervention with practitioners who do not have clinical backgrounds – such as parent advocates – may require additional post-training supports and supervision. However, adapting and streamlining the intervention to facilitate its delivery by paraprofessional staff could leverage parent advocates’ access to families and greatly enhance large-scale penetration into the service system. In a recent study, Cook, Lyon, Locke, Waltz, and Powell (2019) adapted the ERIC strategy compilation for use in schools via an iterative process of review and revision by a panel of experts in implementation and school-based mental health. The School Implementation Strategies, Translating ERIC Resources (SISTER) project (Cook et al., 2019) reviewed the 73 ERIC strategies, made surface-level changes (i.e., changes to wording or terminology) to 52 strategies, made deeper modifications (i.e., adaptations that changed the core meaning) to 5 strategies, deleted 5 strategies due primarily to contextual inappropriateness, and added 7 new strategies. Deep modifications and deletions were most common in the Financial Strategies category, which has previously been identified as incongruent with standard organizational practices in educational settings (Lyon et al., 2018). No other category required as much strategy adaptation, suggesting greater applicability in the education context. The resulting 75 SISTER strategies were adapted to increase their relevance to implementation research and practice in schools, but their importance and feasibility have not yet been evaluated.
Current Study Aims
To help prioritize strategies that might have the most utility for real-world implementation efforts in schools, it is critical to evaluate the adapted SISTER strategies from the perspectives of field-based practitioners with experience supporting school-based mental health program implementation. The aim of the current project was to evaluate the feasibility and importance of the adapted strategies, replicating a key component of the ERIC project (Waltz et al., 2015) in schools to inform future implementation research and practice in that sector. Many of the strategies contained in the compilation are currently used routinely in the school context (e.g., conduct ongoing training; audit and feedback; create a professional learning collaborative), whereas others may be less common (e.g., change accreditation or membership requirements, model and simulate change, use mass media). The goal of the SISTER project is to build a common language via a broader compilation of more and less commonly used strategies, which can help advance the field of implementation in education by identifying new, potentially impactful strategies to evaluate in school settings. Furthermore, the project was designed to allow for direct comparisons between ERIC and SISTER ratings to determine which strategies exhibited the greatest degree of change across compilations and, more generally, to inform a broader understanding of the feasibility and importance of implementation strategies across service sectors.
Method
Sample
Participants (n = 200) included members of a state-sponsored initiative in the western United States focused on the delivery of EBP for youth mental health concerns in schools. The initiative was established approximately 20 years ago following federal and state legislation calling for educators to implement high-quality, evidence-based behavior intervention plans for students with challenging behaviors. It has since evolved to focus more broadly on dissemination and implementation activities involving a continuum of universal, targeted, and intensive supports for social, emotional, and mental health problems. Initiative members are site-based, district, or regional school-based consultants who have been nominated by directors of regional special education agencies based on their commitment to EBP and their roles as implementation intermediaries (i.e., district or region) and/or embedded EBP champions (i.e., site-based) within their school systems. The regional agencies provide full geographic coverage of the state, including rural, suburban, and urban school systems. Membership is maintained through participation in annual forums focused on providing training on facilitating EBP implementation and in ongoing state-wide research and evaluation activities. Evaluation of participant demographics (see Table 1) indicated that the sample was 81% female and 73% non-Hispanic white. Ninety percent of participants had Master’s degrees in psychology or education. Regarding whether participants functioned internally or externally to a school building, 42% were site-based, 33% functioned at the district level, and 25% provided consultative support at a regional level. Additionally, most participants held positions as school psychologists (32.5%) or behavior specialists (29%), while 26.5% of participants held administrative positions as program specialists/coordinators or directors/assistant directors. Over 70% of participants had at least 11 years of experience working in schools, with 22% having over 20 years of experience.
Table 1.
Demographics of survey respondents.
Characteristic | n | % |
---|---|---|
Gender | | |
Male | 39 | 19.5 |
Female | 161 | 80.5 |
Ethnicity | | |
American Indian or Alaska Native | 1 | 0.5 |
Asian | 11 | 5.5 |
Black or African American | 9 | 4.5 |
Multirace | 11 | 5.5 |
Native Hawaiian or other Pacific Islander | 2 | 1.0 |
Other | 20 | 10.0 |
White/Non-Hispanic | 145 | 72.5 |
Prefer not to disclose | 1 | 0.5 |
Highest degree earned | | |
BA/BS | 1 | 0.5 |
Master’s degree | 179 | 89.5 |
Doctoral degree (PhD, EdD, PsyD) | 19 | 9.5 |
Prefer not to disclose | 1 | 0.5 |
Years Experience in School-Based Services | | |
0 to 3 years | 0 | 0.0 |
3 to 5 years | 14 | 7.0 |
6 to 10 years | 42 | 21.0 |
11 to 20 years | 98 | 49.0 |
> 20 years | 45 | 22.5 |
Prefer not to disclose | 1 | 0.5 |
Consultation level | | |
Site-based | 84 | 42.0 |
District | 66 | 33.0 |
Regional | 50 | 25.0 |
Position | | |
School psychologist | 65 | 32.5 |
Behavior specialist | 58 | 29.0 |
Mental health provider | 24 | 12.0 |
Program specialist or coordinator | 38 | 19.0 |
Director or Assistant Director | 15 | 7.5 |
Procedures
Data were collected via an online survey, distributed by email, which included basic demographic questions and the implementation strategies survey described below. In the fall of 2016, members of the state initiative were sent an e-mail asking them to complete a survey examining their perceptions of strategies to support implementation of EBP for student social, emotional, and mental health problems. Bi-weekly email reminders were sent for up to one month to recruit as many respondents as possible. These administration procedures resulted in a response rate of 94.3% (200 out of 212). The first author’s Human Subjects Institutional Review Board determined that the study was exempt. No consent forms were collected, but information disclosures were presented to all participants prior to survey completion. Leadership approval was also obtained from each participating state-wide organization.
School implementation strategies survey.
Replicating the ERIC processes (Waltz et al., 2015), our survey asked participants to rate each of the 75 school-adapted SISTER strategies (Cook et al., 2019) for importance and then for feasibility, with responses ranging from 1 to 5. Items were presented in a consistent order to all participants. Response options replicated those used by Waltz et al. (2015). Specifically, response options for importance were: 1 (relatively unimportant), 2 (somewhat important), 3 (moderately important), 4 (very important), and 5 (extremely important). Responses for feasibility were: 1 (not at all feasible), 2 (somewhat feasible), 3 (moderately feasible), 4 (very feasible), and 5 (extremely feasible). In addition, importance was defined for participants as, “The impact of the strategy and how critical it is to a successful implementation effort.” Feasibility was defined as, “The extent to which a strategy is practical and can be successfully used to support implementation.” Strategies were presented via the secure Qualtrics online survey platform with definitions and auxiliary materials, all of which had been systematically adapted to increase appropriateness to schools (see Supplementary File 1 for a full list of SISTER strategies and definitions in numerical order). Prior to completing the ratings, participants were given these directions: “The following questions will assess your perceptions of the importance and feasibility of specific implementation strategies. Implementation strategies refer to methods or techniques that are used to enhance the uptake, use, and sustainment of EBPs. Please select a number from 1 to 5 for each discrete implementation strategy to provide a rating in terms of how important and feasible you think it is. Keep in mind that we are looking for relative importance and relative feasibility; use all the values in the rating scale to make distinctions.” Although the original ERIC project also included a card-sort procedure (Waltz et al., 2015), this was not feasible to complete in the current study.
Data Analysis
Summary statistics for the importance and feasibility dimensions of each SISTER strategy were calculated. Strategies were also plotted in a graph according to importance and feasibility ratings. This produced a scatterplot that was divided into four quadrants, or Go-Zones (i.e., I, II, III, IV), using the mean of each dimension as the midpoint of each axis. For example, quadrant I represents those strategies with ratings above the means on both dimensions, indicating high importance and high feasibility. Quadrant II contains strategies that are high on feasibility but low on importance. Quadrant III contains strategies that are low on both dimensions, and quadrant IV contains strategies that are high on importance but low on feasibility (Waltz et al., 2015).
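To make this classification concrete, the following minimal sketch (in Python; the frame layout and column names are our illustration, not the study’s analysis code) assigns Go-Zone quadrants using the dimension means as axis midpoints. The three example strategies and their ratings are taken from Table 2.

```python
import pandas as pd

# Illustrative subset; the full analysis used all 75 strategies (Table 2).
df = pd.DataFrame({
    "num": [39, 53, 59],
    "strategy": ["Conduct ongoing training", "Remind school personnel",
                 "Use mass media"],
    "importance": [4.59, 3.09, 2.75],
    "feasibility": [3.37, 3.46, 2.60],
})

# The midpoint of each axis is the mean of that dimension across strategies.
imp_mid = df["importance"].mean()
feas_mid = df["feasibility"].mean()

def go_zone(importance: float, feasibility: float) -> int:
    """I: high on both; II: feasible but not important;
    III: low on both; IV: important but not feasible."""
    if importance >= imp_mid:
        return 1 if feasibility >= feas_mid else 4
    return 2 if feasibility >= feas_mid else 3

df["quadrant"] = [go_zone(i, f)
                  for i, f in zip(df["importance"], df["feasibility"])]
print(df)  # quadrants 1, 2, 3 for these three strategies
```

Applied to the full set of 75 strategies, this rule should reproduce the SISTER quadrant column reported in Table 2, since the quadrants are defined by the dimension means.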
Finally, Euclidean distances (Danielsson, 1980) were calculated for strategies common to both ERIC and SISTER to quantify the difference, or degree of movement, on the scatterplots between the two studies. The Euclidean distance between two data points reflects the linear distance between them. In the present study, participant ratings of implementation strategy importance and feasibility were used to create scatterplots in which the x-axis reflects the average importance rating for a strategy and the y-axis reflects the average feasibility rating for a strategy. The findings from the present study of strategies tailored to fit educational contexts were compared to findings from the original study, which examined strategies in a more general healthcare implementation context. In both cases, participants rated the perceived importance and feasibility of the implementation strategies using the same 5-point rating scales. These common metrics thus provided the opportunity to conduct a direct comparison between the perceived importance and feasibility of implementation strategies as rated generally (i.e., ERIC) and as tailored to an educational context by relevant stakeholders (i.e., SISTER).
The Euclidean distance for each strategy was calculated by taking the square root of the sum of the squared mean differences for each of the two ratings:

$$d = \sqrt{\left(\bar{I}_{\mathrm{SISTER}} - \bar{I}_{\mathrm{ERIC}}\right)^{2} + \left(\bar{F}_{\mathrm{SISTER}} - \bar{F}_{\mathrm{ERIC}}\right)^{2}}$$

where $\bar{I}$ and $\bar{F}$ denote a strategy’s mean importance and feasibility ratings in each study. As the present study and Waltz et al. (2015) used the same measures, and these measures both had scales ranging from 1 to 5, raw Euclidean distances were used. These distances can intuitively be interpreted in relation to the measures. For example, a distance of 1 means that on average there was a 1-point difference (on the 1-to-5-point scales) between the importance and feasibility means obtained in the present project (SISTER) and the means obtained in Waltz et al. (2015; ERIC). In other words, if the difference in importance ratings between the SISTER and ERIC data sets was 0 and the difference between the feasibility ratings was 1, the Euclidean distance would equal 1 (i.e., $\sqrt{0^2 + 1^2} = 1$). A distance of 0 reflects no difference at all, and a distance of approximately 5.7 reflects the maximum difference possible (i.e., if the obtained mean coordinates on the scatterplot were (5, 1) for ERIC and (1, 5) for SISTER, then $\sqrt{(5-1)^2 + (1-5)^2} = \sqrt{32} \approx 5.7$).
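The calculation and the bounds discussed above can be verified in a few lines (a sketch; the function name and example coordinates are ours):

```python
import math

def euclidean_distance(sister_imp: float, sister_feas: float,
                       eric_imp: float, eric_feas: float) -> float:
    """Linear distance between a strategy's (importance, feasibility)
    coordinates in the SISTER and ERIC data sets."""
    return math.sqrt((sister_imp - eric_imp) ** 2
                     + (sister_feas - eric_feas) ** 2)

# The worked example from the text: an importance difference of 0 and a
# feasibility difference of 1 yield a distance of exactly 1.
assert euclidean_distance(4.0, 3.0, 4.0, 2.0) == 1.0

# Maximum possible movement on two 1-to-5 scales: sqrt(4**2 + 4**2) = sqrt(32).
print(round(euclidean_distance(5, 1, 1, 5), 2))  # 5.66, i.e., approximately 5.7
```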
Results
Table 2 displays summary statistics for importance and feasibility ratings of each SISTER strategy, their respective Go-Zone quadrants, and Euclidean distances from their corresponding ERIC strategies, when relevant. All participants rated all strategies. The five strategies rated by participants as most important included Conduct ongoing training (M = 4.59, SD = 0.60), Make training dynamic (M = 4.46, SD = 0.73), Provide ongoing consultation/coaching (M = 4.46, SD = 0.65), Monitor the progress of the implementation effort (M = 4.38, SD = 0.73), and Improve implementers’ buy-in (M = 4.38, SD = 0.70). The five most feasible strategies included Make training dynamic (M = 3.72, SD = 0.98), Distribute educational materials (M = 3.60, SD = 0.92), Remind school personnel (M = 3.46, SD = 1.05), Facilitation/problem-solving (M = 3.44, SD = 0.90), and Capture and share local knowledge (M = 3.42, SD = 0.84).
Table 2.
A summary of the 75 implementation strategies with mean importance and feasibility ratings, Go-Zone quadrants, and Euclidean distances from the SISTER (N = 200) and ERIC (N = 35) samples.
# | Item | Importance x̄ ± SD (possible range: 1–5) | Feasibility x̄ ± SD (possible range: 1–5) | SISTER Go-Zone quadrant | ERIC Go-Zone quadrant | Euclidean distance |
---|---|---|---|---|---|---|
39 | Conduct ongoing training | 4.59 ± 0.60 | 3.37 ± 0.97 | 1 | 1 | 0.65 |
43 | Make training dynamic | 4.46 ± 0.73 | 3.72 ± 0.98 | 1 | 1 | 0.84 |
44 | Provide ongoing consultation/coaching | 4.46 ± 0.65 | 3.18 ± 0.93 | 1 | 1 | 0.54 |
9 | Monitor the progress of the implementation effort | 4.38 ± 0.73 | 3.27 ± 0.95 | 1 | 1 | 0.76 |
51 | Improve implementers’ buy-in | 4.38 ± 0.70 | 2.88 ± 0.75 | 4 | - | - |
21 | Build partnerships (i.e., coalitions) to support implementation | 4.36 ± 0.72 | 3.22 ± 0.94 | 1 | 1 | 0.72 |
57 | Involve students, family members, and other staff | 4.30 ± 0.83 | 3.24 ± 0.90 | 1 | 1 | 0.59 |
30 | Model and simulate change | 4.29 ± 0.75 | 3.36 ± 0.99 | 1 | 3 | 1.00 |
6 | Develop and organize a quality monitoring system | 4.29 ± 0.70 | 3.15 ± 0.89 | 1 | 1 | 0.22 |
12 | Facilitation/problem-solving | 4.27 ± 0.75 | 3.44 ± 0.90 | 1 | 1 | 0.35 |
2 | Audit and provide feedback | 4.27 ± 0.77 | 3.14 ± 0.96 | 1 | 1 | 1.00 |
40 | Create a professional learning collaborative | 4.24 ± 0.77 | 3.22 ± 0.87 | 1 | 2 | 1.16 |
8 | Obtain and use student and family feedback | 4.23 ± 0.75 | 3.27 ± 0.92 | 1 | 1 | 0.77 |
14 | Provide practice-specific supervision | 4.23 ± 0.77 | 2.97 ± 0.94 | 1 | 4 | 0.42 |
56 | Intervene/communicate with students, families, and other staff to enhance uptake and fidelity | 4.22 ± 0.82 | 3.15 ± 0.96 | 1 | 4 | 0.72 |
1 | Assess for readiness and identify barriers and facilitators | 4.20 ± 0.69 | 3.21 ± 0.90 | 1 | 1 | 1.41 |
37 | Conduct educational meetings | 4.17 ± 0.76 | 3.36 ± 0.91 | 1 | 2 | 1.45 |
50 | Facilitate relay of intervention fidelity and student data to school personnel | 4.14 ± 0.80 | 3.34 ± 0.92 | 1 | 1 | 0.10 |
62 | Alter student or school personnel obligations to enhance participation in or delivery of new practice, respectively | 4.10 ± 0.80 | 2.57 ± 0.85 | 4 | 3 | 1.59 |
16 | Promote adaptability | 4.08 ± 0.80 | 3.05 ± 0.94 | 1 | 1 | 0.55 |
7 | Develop instruments to monitor and evaluate core components of the innovation/new practice | 4.07 ± 0.82 | 3.15 ± 0.96 | 1 | 1 | 0.57 |
55 | Increase demand and expectations for implementation | 4.06 ± 0.80 | 2.81 ± 0.88 | 4 | 3 | 0.90 |
22 | Capture and share local knowledge | 4.05 ± 0.81 | 3.42 ± 0.84 | 1 | 1 | 0.61 |
32 | Organize school personnel implementation team meetings | 4.05 ± 0.82 | 2.92 ± 0.89 | 4 | 1 | 0.62 |
65 | Make implementation easier by removing burdensome documentation tasks | 4.04 ± 0.98 | 3.03 ± 0.99 | 1 | 3 | 1.68 |
72 | Develop local policy that supports implementation | 4.03 ± 0.94 | 2.58 ± 0.92 | 4 | - | - |
4 | Conduct local needs assessment | 4.00 ± 0.87 | 3.24 ± 0.96 | 1 | 1 | 1.13 |
68 | Change/alter environment | 4.00 ± 0.96 | 3.01 ± 0.89 | 1 | 3 | 1.59 |
58 | Prepare families and students to be active participants | 4.00 ± 0.81 | 2.85 ± 0.85 | 4 | 3 | 0.63 |
42 | Distribute educational materials | 3.99 ± 0.89 | 3.60 ± 0.92 | 1 | 1 | 1.27 |
46 | Use train-the-trainer strategies | 3.99 ± 0.90 | 3.35 ± 1.03 | 1 | 2 | 0.67 |
36 | Visit other sites | 3.97 ± 0.91 | 3.27 ± 1.00 | 1 | 2 | 0.93 |
41 | Develop educational materials | 3.97 ± 0.87 | 3.26 ± 0.91 | 1 | 1 | 1.58 |
5 | Develop a detailed implementation plan or blueprint | 3.94 ± 0.88 | 3.18 ± 0.90 | 1 | 1 | 1.34 |
15 | Provide local technical assistance | 3.93 ± 0.93 | 2.86 ± 0.94 | 4 | 1 | 0.34 |
73 | Mandate for change | 3.92 ± 0.95 | 2.63 ± 0.95 | 4 | 3 | 0.69 |
60 | Access new funding | 3.90 ± 0.96 | 2.37 ± 0.83 | 4 | 4 | 0.33 |
26 | Identify and prepare champions | 3.88 ± 0.93 | 3.14 ± 0.93 | 1 | 1 | 0.70 |
64 | Fund and contract for the new practices | 3.85 ± 1.01 | 2.28 ± 0.91 | 4 | 4 | 0.24 |
27 | Identify early adopters | 3.81 ± 1.00 | 3.28 ± 0.99 | 1 | 1 | 0.43 |
24 | Develop academic partnerships | 3.81 ± 0.94 | 2.96 ± 0.82 | 1 | 2 | 1.07 |
52 | Pre-correction prior to implementation | 3.80 ± 0.93 | 2.95 ± 0.87 | 4 | - | - |
13 | Peer-Assisted Learning | 3.76 ± 0.87 | 2.95 ± 0.87 | 4 | - | - |
3 | Conduct cyclical small tests of change (piloting or trialing the practice first) | 3.76 ± 0.81 | 2.91 ± 0.94 | 4 | 1 | 1.13 |
49 | Develop resource sharing agreements | 3.75 ± 0.97 | 3.13 ± 0.95 | 1 | 3 | 0.68 |
38 | Conduct educational outreach visits | 3.75 ± 0.96 | 3.09 ± 0.93 | 1 | 2 | 1.17 |
47 | Work with educational institutions | 3.72 ± 1.03 | 3.03 ± 1.02 | 2 | 2 | 1.02 |
28 | Inform local opinion leaders | 3.70 ± 1.00 | 3.00 ± 0.96 | 2 | 1 | 1.05 |
18 | Test-drive and select practices | 3.69 ± 0.93 | 2.88 ± 0.88 | 3 | - | - |
48 | Create new practice teams | 3.68 ± 0.92 | 2.85 ± 0.86 | 3 | 3 | 1.03 |
19 | Use data experts | 3.68 ± 0.98 | 2.71 ± 0.95 | 3 | 3 | 0.62 |
11 | Centralize technical assistance | 3.66 ± 0.94 | 2.94 ± 0.91 | 3 | 3 | 0.94 |
31 | Obtain formal commitments | 3.65 ± 1.00 | 2.82 ± 0.97 | 3 | 4 | 0.37 |
33 | Promote network weaving | 3.61 ± 1.03 | 2.86 ± 0.88 | 3 | 3 | 0.91 |
29 | Involve governing organizations | 3.53 ± 1.12 | 2.63 ± 0.90 | 3 | 2 | 1.15 |
25 | Develop an implementation glossary | 3.46 ± 1.05 | 3.36 ± 1.02 | 2 | 2 | 1.34 |
23 | Conduct local consensus discussions | 3.41 ± 1.02 | 2.90 ± 0.89 | 3 | 1 | 1.19 |
69 | Change record systems | 3.35 ± 1.06 | 2.63 ± 0.88 | 3 | 3 | 0.65 |
35 | Use advisory boards and workgroups | 3.34 ± 1.02 | 2.69 ± 0.92 | 3 | 2 | 1.18 |
20 | Use data warehousing techniques | 3.28 ± 1.12 | 2.54 ± 0.96 | 3 | 3 | 0.42 |
67 | Change ethical and professional standards of conduct | 3.27 ± 1.19 | 2.51 ± 0.95 | 3 | 3 | 1.83 |
34 | Recruit, designate and train for leadership | 3.25 ± 0.73 | 2.81 ± 0.93 | 3 | 4 | 0.79 |
17 | Tailor strategies | 3.23 ± 0.68 | 3.19 ± 0.90 | 2 | 1 | 1.40 |
54 | Targeting/improving implementer wellbeing | 3.19 ± 0.73 | 2.90 ± 0.93 | 3 | - | - |
45 | Shadow other experts | 3.12 ± 0.73 | 2.95 ± 0.93 | 3 | 2 | 0.49 |
53 | Remind school personnel | 3.09 ± 0.79 | 3.46 ± 1.05 | 2 | 2 | 0.34 |
61 | Alter and provide individual- and system-level incentives | 2.97 ± 0.80 | 2.65 ± 0.97 | 3 | 3 | 0.46 |
71 | Create or change credentialing and/or professional development standards | 2.96 ± 1.27 | 2.08 ± 0.91 | 3 | 3 | 0.95 |
10 | Stage implementation scale up | 2.86 ± 0.72 | 2.83 ± 0.89 | 3 | 1 | 1.45 |
74 | Pruning competing initiatives | 2.77 ± 0.84 | 2.39 ± 0.90 | 3 | - | - |
59 | Use mass media | 2.75 ± 1.17 | 2.60 ± 1.07 | 3 | 3 | 0.59 |
63 | Develop disincentives | 2.71 ± 1.16 | 2.44 ± 1.06 | 3 | 3 | 0.62 |
75 | Start a dissemination/implementation organization | 2.66 ± 0.86 | 2.67 ± 0.94 | 3 | 3 | 0.83 |
70 | Change school or community sites | 2.66 ± 1.27 | 2.16 ± 0.92 | 3 | 3 | 0.46 |
66 | Change accreditation or membership requirements | 2.62 ± 1.21 | 2.32 ± 0.97 | 3 | 3 | 0.69 |
Figure 1 presents the Go-Zone scatterplot of the 75 SISTER implementation strategies, numbered consistently with Table 2 to facilitate cross-referencing. As is apparent in the figure, a positive linear relationship (r = .63) was observed between feasibility and importance, with strategies rated as more feasible also tending to be rated as more important. As a result, most strategies fell either in quadrant I (high feasibility, high importance) or quadrant III (low feasibility, low importance). Nevertheless, some strategies (n = 13; 17%) were rated as important but not feasible (e.g., Fund and contract for the new practices; Access new funding; Alter student or school personnel obligations). Fewer strategies (n = 5; 7%) were viewed as feasible but not important (e.g., Remind school personnel; Tailor strategies; Develop an implementation glossary).
Figure 1.
Go-Zone Plot of SISTER Strategies. Horizontal and vertical lines in the plot indicate the mean values for the importance and feasibility scales. The upper-right quadrant is Zone 1: relatively high importance and feasibility. The upper-left quadrant is Zone 2: relatively low importance and relatively high feasibility. The lower-left quadrant is Zone 3: relatively low importance and feasibility. The lower-right quadrant is Zone 4: relatively high importance and relatively low feasibility. The numbers in the plot map on to the strategies listed in Table 2.
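For readers reproducing a figure of this kind, a minimal plotting sketch follows, continuing the illustrative `df`, `imp_mid`, and `feas_mid` from the Data Analysis sketch (styling choices such as colors and line styles are arbitrary, not taken from the published figure):

```python
import matplotlib.pyplot as plt

# x = importance, y = feasibility; dashed lines at the dimension means
# mark the Go-Zone boundaries described in the caption above.
fig, ax = plt.subplots()
ax.scatter(df["importance"], df["feasibility"])
ax.axvline(imp_mid, color="gray", linestyle="--")
ax.axhline(feas_mid, color="gray", linestyle="--")
for _, row in df.iterrows():
    # Label each point with its strategy number, as in Figure 1 / Table 2.
    ax.annotate(str(row["num"]), (row["importance"], row["feasibility"]))
ax.set_xlabel("Importance (mean rating)")
ax.set_ylabel("Feasibility (mean rating)")
ax.set_title("Go-Zone plot of SISTER strategies (illustrative subset)")
plt.show()
```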
Direct comparisons between ERIC and SISTER strategy ratings revealed that, on average, SISTER strategies received a narrower range of scores for both importance (2.62–4.59 [SISTER] vs. 1.87–4.19 [ERIC]) and feasibility (2.08–3.72 [SISTER] vs. 1.33–4.83 [ERIC]). Table 2 also displays the Go-Zone quadrant for each SISTER strategy and its corresponding ERIC strategy, when relevant. Approximately one third of the strategies shifted quadrants between ERIC and SISTER. Euclidean distances, calculated for all strategies reflected in both compilations, captured total movement in both importance and feasibility across the two studies (mean movement = .85). New SISTER strategies and deleted ERIC strategies were omitted from the comparison. Table 3 displays the 12 strategies with Euclidean distances more than one standard deviation above the mean (> 1.25). These strategies were evenly split regarding whether higher ratings (on importance and feasibility) were received in the ERIC or SISTER study.
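The quadrant-shift tally and the distance screen used for Table 3 are straightforward to express computationally. The sketch below uses a hypothetical merged frame containing three strategies drawn from Table 2; the column names are our assumptions, not the study’s code:

```python
import pandas as pd

# Hypothetical merged frame; values for all common strategies are in Table 2.
merged = pd.DataFrame({
    "strategy": ["Model and simulate change", "Conduct educational meetings",
                 "Conduct ongoing training"],
    "sister_quadrant": [1, 1, 1],
    "eric_quadrant": [3, 2, 1],
    "distance": [1.00, 1.45, 0.65],
})

# Count strategies that changed Go-Zone quadrant between compilations.
shifted = merged["sister_quadrant"] != merged["eric_quadrant"]
print(f"{shifted.sum()} of {len(merged)} common strategies changed quadrant")

# Flag movement more than 1 SD above the mean distance (the > 1.25 cutoff
# used for Table 3 arises from the full set of common strategies).
d = merged["distance"]
print(merged.loc[d > d.mean() + d.std(), ["strategy", "distance"]])
```

On this three-row subset the screen flags Conduct educational meetings (distance 1.45), which also appears in Table 3 of the full analysis.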
Table 3.
A comparison of implementation strategies from the SISTER (N = 200) and ERIC (N = 35) samples with Euclidean distances on importance and feasibility ratings more than one standard deviation above the mean, including which version of the item received the higher rating.
# | SISTER item | SISTER importance x̄ ± SD | SISTER feasibility x̄ ± SD | ERIC item | ERIC importance x̄ ± SD | ERIC feasibility x̄ ± SD | Euclidean distance | Higher rating |
---|---|---|---|---|---|---|---|---|
67 | Change ethical and professional standards of conduct | 3.27 ± 1.19 | 2.51 ± 0.95 | Change liability laws | 1.87 ± 0.94 | 1.33 ± 0.55 | 1.83 | SISTER |
65 | Make implementation easier by removing burdensome documentation tasks | 4.04 ± 0.98 | 3.03 ± 0.99 | Make billing easier | 2.93 ± 1.26 | 1.77 ± 0.73 | 1.68 | SISTER |
62 | Alter student or school personnel obligations to enhance participation in or delivery of new practice, respectively | 4.10 ± 0.80 | 2.57 ± 0.85 | Alter patient/consumer fees | 2.60 ± 0.97 | 2.03 ± 0.76 | 1.59 | SISTER |
68 | Change/alter environment | 4.00 ± 0.96 | 3.01 ± 0.89 | Change physical structure and equipment | 2.60 ± 0.89 | 2.27 ± 0.94 | 1.59 | SISTER |
41 | Develop educational materials | 3.97 ± 0.87 | 3.26 ± 0.91 | Develop educational materials | 3.80 ± 1.19 | 4.83 ± 0.38 | 1.58 | Split |
37 | Conduct educational meetings | 4.17 ± 0.76 | 3.36 ± 0.91 | Conduct educational meetings | 3.27 ± 1.14 | 4.50 ± 0.68 | 1.45 | Split |
10 | Stage implementation scale up | 2.86 ± 0.72 | 2.83 ± 0.89 | Stage implementation scale up | 3.97 ± 1.00 | 3.77 ± 0.90 | 1.45 | ERIC |
1 | Assess for readiness and identify barriers and facilitators | 4.20 ± 0.69 | 3.21 ± 0.90 | Assess for readiness and identify barriers and facilitators | 4.60 ± 0.67 | 4.57 ± 0.63 | 1.41 | ERIC |
17 | Tailor strategies | 3.23 ± 0.68 | 3.19 ± 0.90 | Tailor strategies | 4.37 ± 0.81 | 4.00 ± 0.98 | 1.40 | ERIC |
25 | Develop an implementation glossary | 3.46 ± 1.05 | 3.36 ± 1.02 | Develop an implementation glossary | 2.87 ± 1.43 | 4.57 ± 0.77 | 1.34 | Split |
5 | Develop a detailed implementation plan or blueprint | 3.94 ± 0.88 | 3.18 ± 0.90 | Develop a formal implementation blueprint | 4.30 ± 0.88 | 4.47 ± 0.78 | 1.34 | ERIC |
42 | Distribute educational materials | 3.99 ± 0.89 | 3.60 ± 0.92 | Distribute educational materials | 3.50 ± 1.36 | 4.77 ± 0.43 | 1.27 | Split |
Discussion
This study sought to: (a) evaluate the feasibility and importance of an adapted compilation of implementation strategies for schools; and (b) assess the degree of change in ratings on these dimensions between the original and adapted compilations. The current findings can assist implementation researchers and practitioners in prioritizing the selection of implementation strategies to advance service quality in school-based mental health programming. The strategies rated as most important were (a) Conduct ongoing training, (b) Make training dynamic, (c) Provide ongoing consultation/coaching, (d) Monitor the progress of the implementation effort, and (e) Improve implementers’ buy-in. Three of the top five explicitly reference training and consultation (Conduct ongoing training, Make training dynamic, Provide ongoing consultation/coaching), which have previously been identified as cornerstone (i.e., commonly occurring and essential) implementation strategies in a review of a statewide EBP initiative (Lyon, Pullmann, Walker, & D’Angelo, 2017). These strategies are also commonly explored in school psychology and special education research (Noell et al., 2014; Reinke, Lewis-Palmer, & Merrell, 2008; Sanetti, Collier-Meek, Long, Byron, & Kratochwill, 2015). Although not explicitly identified in the review cited above, monitoring implementation progress (including implementation strategies and outcomes) is also critical and consistent with the data-driven decision-making approaches commonly discussed in schools, such as response to intervention (Gresham, 2005) and Positive Behavioral Interventions and Supports (Sugai & Horner, 2006). This alignment may be partially responsible for the strong importance ratings. Further, although targeted efforts to improve implementers’ buy-in in schools are only just emerging, participants rated them as highly important. One version of this strategy can be found in a recent study by Cook, Lyon, Kubergovic, Wright, and Zhang (2015), in which an implementation strategy designed to change front-line practitioner beliefs and attitudes was associated with adoption and high-fidelity delivery of educator-delivered behavioral support practices.
The five most feasible strategies were (a) Make training dynamic, (b) Distribute educational materials, (c) Remind school personnel, (d) Facilitation/problem-solving, and (e) Capture and share local knowledge. The first three are relatively simple practices that may be applied in an implementation effort without significant resource allocation, so their high feasibility ratings are relatively unsurprising. Indeed, training efforts in schools are already likely to be dynamic and to involve the distribution of guidelines, manuals, or toolkits. It is also a best practice for implementation of frameworks such as Positive Behavioral Interventions and Supports to include reminders in the form of visual cues posted throughout the school building (e.g., behavioral expectations). Facilitation is a somewhat more complex and amorphous process that relies on strong relationships with implementation sites (Kirchner et al., 2014). Despite this complexity, respondents’ perceptions of its feasibility may have been driven by their experience implementing EBP in schools, which inevitably involves relationship building. Finally, capturing and sharing local knowledge involves gathering information from other sites on successful implementation to drive decision-making. It may be that the respondents, who were part of a statewide implementation-focused initiative, were particularly familiar with the successes and failures of other schools and districts.
Overall, the most positively rated SISTER strategies (i.e., those in Go-Zone quadrant I; n = 33 strategies) were primarily (28 strategies; 85%) those that had undergone surface-level changes (i.e., changes to wording/terminology, referents, etc.; Cook et al., 2019). This is higher than the overall percentage of SISTER strategies that underwent surface-level adaptations (69%), suggesting that strategies that retained their full original meaning while being revised to improve their contextual appropriateness tended to be viewed as most important and feasible, whereas those that underwent more substantive adaptations from one context to the next often were rated as less compelling. The reasons for this are unclear, but it may be that the strategies flagged for deep modification were already less important and feasible and that modification to the school context did not fully overcome that issue.
There were many notable similarities between the ratings for the ERIC and SISTER compilations, including a strong linear relationship between importance and feasibility, with most strategies (approximately two thirds) falling in the same Go-Zone quadrant. Nevertheless, SISTER strategies were rated within a more restricted range. This could be due to the single type of stakeholder respondent in SISTER (school-based implementation consultants), relative to the two types of stakeholders who participated in ERIC (implementation researchers and clinical practice leads), which might have yielded more diverse response sets. Further, although results suggested many commonalities across the projects, approximately one third of the strategies shifted Go-Zone quadrants, indicating notable differences in perceived importance and feasibility across contexts. More strategies were classified as important but not feasible in SISTER (quadrant IV), whereas ERIC had more strategies classified as feasible but not important (quadrant II), suggesting a potentially important role of context. Although it is unknown exactly why feasibility ratings were generally lower in SISTER, it may be because, in contrast to the ERIC sample (the majority of whom were affiliated with the US Department of Veterans Affairs), schools are not healthcare delivery settings. As such, they may be unlikely to prioritize – and thus make more feasible – implementation of mental health services above their core academic priorities (Owens et al., 2014).
Furthermore, it is important to highlight the potential for this study’s findings to be bound to the respondents’ job roles. For example, although the respondents in this study served in diverse consultative capacities (e.g., site, district, or regional agency), the strategies rated as most important and feasible may be those that are most under the control and influence of school-based consultants (e.g., providing ongoing training, monitoring an implementation effort, and improving buy-in). Gathering feasibility and importance ratings from different groups of educators – such as school administrators – may have revealed different findings more closely aligned with their purview of influence (e.g., use mass media, change record systems, inform local opinion leaders). The potential for an interaction between job roles and ratings points toward an important avenue for future research.
Limitations
Although this study is the first to replicate the findings from one of the most influential initiatives to identify and evaluate implementation strategies in healthcare and extend them to a novel service context, there are notable limitations. Specifically, although the SISTER project included ratings from a substantially larger set of implementation professionals (n = 200) than the ERIC project (n = 35), participants reflected just one type of role in their respective organizations. As indicated above, perceptions of importance and feasibility could vary by type of respondent, and this may have resulted in the more restricted range of ratings observed in the current study. Nevertheless, the current participants were likely to be proximal to school-based implementation and to have a diverse set of experiences surrounding implementation of different types of school-based EBP. Moreover, it is also possible that findings would have been different if the sample had included a randomly selected group of school-based consultants who may have been less committed to translating EBPs as part of regular school-based service delivery. One of the advantages of this study, however, is that members of the state-sponsored initiative were nominated as having a strong ‘pulse’ on implementation of EBPs in their respective school systems and were committed to bridging the science-to-practice gap. Nonetheless, this introduces response bias, and future research should compare findings across respondents who are and are not committed to translating EBPs into everyday practice.

Relatedly, we did not collect information about the schools and districts in which respondents worked, which would have allowed for exploration of variables that might have moderated perceptions of importance and feasibility. It may be that strategies such as Change school or community sites are more feasible in larger urban districts that contain many more possible school sites. Furthermore, the current study was entirely quantitative, providing no opportunities to explore reasons why specific strategies were rated high or low on feasibility and importance. Next, although this study focused on perceptions of importance and feasibility, strategies rated as low importance and/or low feasibility may still be useful and effective in some implementation scenarios and should not be discounted. It may also be the case that participants’ views of strategies as more feasible led them to rate those strategies as more important. In addition, the fact that some strategies were reworded between ERIC and SISTER introduces a potential confound. Moreover, participants completed ratings of all strategies in a single session, so it is possible that fatigue impacted later ratings.

Finally, although participants were recruited from one region within the United States, school-based mental health is increasingly a global phenomenon (Weist, Short, McDaniel, & Bode, 2016). Future work should assess the international relevance of implementation strategies in educational settings.
Implications for Research and Practice
The current study has several implications for future implementation research and practice in schools, as well as for other service sectors relevant to the delivery of mental health services for youth and families. Indeed, research to confirm or adapt the ERIC or SISTER compilations for use in a range of social service contexts, such as juvenile justice or global mental health, is needed to continue to “test the limits” of existing compilations and to determine whether the strategies that required the most adaptation in the SISTER project, or that received the lowest ratings on importance and feasibility, are also the most likely to require adaptation in additional novel settings. In schools, future research might complete the final step of the work reported by Waltz et al. (2015) and engage participants in a card-sort task to determine whether empirically derived groupings of implementation strategies are similar to those identified in other settings. Perhaps more importantly, school-based implementation research studies should explicitly test whether strategies that are rated as more feasible and important are also more likely to be delivered with fidelity (i.e., implementation strategy fidelity) and more likely to impact implementation outcomes (e.g., adoption, intervention fidelity, sustainment) of EBPs.
Future research should examine the extant evidentiary support for prioritized strategies in schools (e.g., conduct ongoing training) via systematic review methods (Higgins & Green, 2011). Research should also begin evaluating the effectiveness of prioritized strategies that have limited to no evidence in schools, and explore the specific mechanisms through which strategies operate on implementation outcomes (Lewis et al., 2018). Strategies that have undergone only surface-level adaptations (i.e., the majority of those in Go-Zone quadrant I) should be the most likely to continue to operate through the same mechanisms as originally designed.
Given the growing potential for a gap between implementation science and implementation practice that parallels the unfortunate gap between intervention science and practice (Lyon, Comtois, Kerns, Landes, & Lewis, in press), attention to implications for implementation practice in schools is particularly warranted. Implementation practitioners (i.e., intermediaries and local champions) were selected as the participants in the current study to elevate their perspectives and to ensure, to the extent possible, that the strategy ratings reflect the reality of implementing programs and practices in authentic educational settings. Across service delivery contexts, feasible and pragmatic strategies (i.e., those that emphasize relevance to the local context and stakeholder perspectives with the goal of accelerating and broadening impact on practice; Glasgow, 2013) are both sorely needed and in short supply (Lyon et al., in press). The current findings highlight several strategies that demonstrate high feasibility and may be more accessible to implementation practitioners in schools. Such feasible strategies might be prioritized to avoid replicating the research-to-practice gap in intervention science. Although the results are preliminary, feasibility and importance ratings such as those reported in the current paper could provide a foundation for a tailored implementation approach that uses conjoint analysis to both prioritize strategies and link them to identified implementation determinants (Lewis et al., 2018b; Powell et al., 2017).
Finally, the findings also have specific implications for school psychology research and practice, especially considering that the largest proportion of participants were school psychologists. School psychologists function as change agents who work with and through others to produce improvements in both practice quality and outcomes (Reddy, Kettler, & Kurz, 2015). It is through system- or individually-focused consultation that school psychologists are able to effect meaningful changes in practice that correspond to improved student outcomes (Castillo & Curtis, 2014; National Association of School Psychologists, 2010). The SISTER strategies are relevant to system change work (e.g., audit and feedback) as well as individual consultation with practitioners (e.g., provide reminders, educational materials). The SISTER compilation can also increase school psychology researchers’ and practitioners’ awareness of strategies beyond those traditionally studied in research and emphasized in pre-service training, such as training (Brock et al., 2017), performance-based feedback (Noell et al., 2014), and gathering data to monitor fidelity of an intervention (Sanetti & Kratochwill, 2009).
Conclusion
The current study extends influential findings from traditional healthcare settings (Powell et al., 2015; Waltz et al., 2015) to mental health services in schools and can serve as a model for the generalization of implementation research products to novel service contexts. Findings suggest that a subset of the adapted SISTER strategies may be prioritized due to high feasibility and importance. As the field of implementation continues its rapid expansion in healthcare and social services, education is well positioned to both benefit from and contribute to the science and practice of promoting the use of evidence in routine service delivery.
Supplementary Material
Acknowledgements
We thank Shanon Cox and Elissa Picozzi for checking manuscript references and formatting.
Funding
This publication was supported in part by funding from grants K08MH095939 (PI: Lyon), K01MH100199 (PI: Locke), and K01MH113806 (PI: Powell) awarded from the National Institute of Mental Health. BJP is also supported by National Institutes of Health awards UL1TR001111 (PI: Buse), P30AI050410 (PI: Golin), R25MH080916 (PI: Proctor), and R01MH103310 (PI: Lewis). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
References
- Aarons GA, Ehrhart MG, Farahnak LR, & Hurlburt MS (2015). Leadership and organizational change for implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10, 11.
- Aarons GA, Sklar M, Mustanski B, Benbow N, & Brown CH (2017). “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implementation Science, 12, 111. https://doi.org/10.1186/s13012-017-0640-6
- Boyd MR, Powell BJ, Endicott D, & Lewis CC (2017). A method for tracking implementation strategies: An exemplar implementing measurement-based care in community behavioral health clinics. Behavior Therapy. https://doi.org/10.1016/j.beth.2017.11.012
- Brock ME, Cannella-Malone HI, Seaman RL, Andzik NR, Schaefer JM, Page EJ, … Dueker SA (2017). Findings across practitioner training studies in special education: A comprehensive review and meta-analysis. Exceptional Children, 84, 7–26.
- Bruns EJ, Duong MT, Lyon AR, Pullmann MD, Cook CR, Cheney D, & McCauley E (2016). Fostering SMART partnerships to develop an effective continuum of behavioral health services and supports in schools. American Journal of Orthopsychiatry, 86, 156–170. https://doi.org/10.1037/ort0000083
- Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, & Shea C (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15, 15. https://doi.org/10.1186/s12961-017-0175-y
- Castillo JM, & Curtis MJ (2014). Best practices in systems-level change. In Harrison PL & Thomas A (Eds.), Best practices in school psychology: Systems-level services (pp. 11–28). Bethesda, MD: National Association of School Psychologists.
- Cook CR, Lyon AR, Kubergovic D, Wright DB, & Zhang Y (2015). A supportive beliefs intervention to facilitate the implementation of evidence-based practices within a multi-tiered system of supports. School Mental Health, 7, 49–60.
- Cook CR, Lyon AR, Locke J, Waltz TJ, & Powell BJ (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prevention Science, 20, 914–935.
- Costello EJ, He J, Sampson NA, Kessler RC, & Merikangas KR (2014). Services for adolescents with psychiatric disorders: 12-month data from the National Comorbidity Survey–Adolescent. Psychiatric Services, 65, 359–366.
- Danielsson P-E (1980). Euclidean distance mapping. Computer Graphics and Image Processing, 14, 227–248. https://doi.org/10.1016/0146-664X(80)90054-4
- Farmer EM, Burns BJ, Phillips SD, Angold A, & Costello EJ (2003). Pathways into and through mental health services for children and adolescents. Psychiatric Services, 54, 60–66.
- Forman SG, Shapiro ES, Codding RS, Gonzales JE, Reddy LA, Rosenfield SA, … Stoiber KC (2013). Implementation science and school psychology. School Psychology Quarterly, 28, 77.
- Glasgow RE (2013). What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Education & Behavior, 40, 257–265.
- Gottfredson DC, & Gottfredson GD (2002). Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime and Delinquency, 39, 3–35.
- Gresham FM (2005). Response to intervention: An alternative means of identifying students as emotionally disturbed. Education & Treatment of Children, 28, 328–344.
- Herschell AD, Kolko DJ, Baumann BL, & Davis AC (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30, 448–466.
- Higgins JP, & Green S (2011). Cochrane handbook for systematic reviews of interventions (Vol. 4). John Wiley & Sons.
- Joyce BR, & Showers B (2002). Student achievement through staff development (3rd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.
- Kataoka S, Stein BD, Nadeem E, & Wong M (2007). Who gets care? Mental health service use following a school-based suicide prevention program. Journal of the American Academy of Child & Adolescent Psychiatry, 46, 1341–1348. https://doi.org/10.1097/chi.0b013e31813761fd
- Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, & Fortney JC (2014). Outcomes of a partnered facilitation strategy to implement primary care–mental health. Journal of General Internal Medicine, 29, 904–912.
- Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. (2014). Identifying determinants of care for tailoring implementation in chronic diseases: An evaluation of different methods. Implementation Science, 9, 102.
- Langer DA, Wood JJ, Wood PA, Garland AF, Landsverk J, & Hough RL (2015). Mental health service use in schools and non-school-based outpatient settings: Comparing predictors of service use. School Mental Health, 7, 161–173.
- Lewis CC, Klasnja P, Powell B, Tuzzio L, Jones S, Walsh-Bailey C, & Weiner B (2018a). From classification to causality: Advancing understanding of mechanisms of change in implementation science. Frontiers in Public Health, 6, 136.
- Lewis CC, Puspitasari A, Boyd MR, Scott K, Marriott BR, Hoffman M, … Kassab H (2018b). Implementing measurement based care in community mental health: A description of tailored and standardized methods. BMC Research Notes, 11, 76.
- Lewis C, Darnell D, Kerns S, Monroe-DeVita M, Landes SJ, Lyon AR, et al. (2016). Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: Advancing efficient methodologies through community partnerships and team science. Implementation Science, 11, 85. https://doi.org/10.1186/s13012-016-0428-0
- Lyon AR, Comtois KA, Kerns SE, Landes SJ, & Lewis CC (in press). Closing the science–practice gap in implementation before it widens. The Science of Implementation.
- Lyon AR, Cook CR, Brown EC, Locke J, Davis C, Ehrhart M, & Aarons GA (2018). Assessing organizational implementation context in the education sector: Confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science, 13, 5.
- Lyon AR, Ludwig KA, VanderStoep A, Gudmundsen G, & McCauley E (2013). Patterns and predictors of mental healthcare utilization in schools and other service sectors among adolescents at risk for depression. School Mental Health, 5, 155–165. https://doi.org/10.1007/s12310-012-9097-6
- National Research Council (2012). Using science as evidence in public policy. Washington, DC: National Academies Press.
- Lyon AR, Pullmann MD, Walker SC, & D’Angelo G (2017). Community-sourced intervention programs: Review of submissions in response to a statewide call for “promising practices.” Administration and Policy in Mental Health and Mental Health Services Research, 44, 16–28. 10.1007/s10488-015-0650-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Merikangas KR, He J, Burstein M, Swendsen J, Avenevoli S, Case B, … Olfson M (2011). Service utilization for lifetime mental disorders in U.S. adolescents: Results of the national comorbidity survey–adolescent supplement (NCS-A). Journal of the American Academy of Child & Adolescent Psychiatry, 50, 32–45. 10.1016/j.jaac.2010.10.006 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mitchell SA, & Chambers DA (2017). Leveraging implementation science to improve cancer care delivery and patient outcomes. Journal of Oncology Practice, 13, 523–529. 10.1200/JOP.2017.024729 [DOI] [PMC free article] [PubMed] [Google Scholar]
- National Association of School Psychologists. (2010). Model for comprehensive and integrated school psychological services. Washington, DC: Author. Retrieved 11/5/2018 from www.nasponline.org. [Google Scholar]
- Noell GH, Gansle KA, Mevers JL, Knox RM, Mintz JC, & Dahir A (2014). Improving treatment plan implementation in schools: A meta-analysis of single subject design studies. Journal of Behavioral Education, 23, 168–191. [Google Scholar]
- Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, & Wagner M (2014). Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health, 6, 99–111. 10.1007/s12310-013-9115-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, … Taylor SJC (2017). Standards for Reporting Implementation Studies (StaRI) statement. BMJ, 356, i6795 10.1136/bmj.i6795 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, & Mandell DS (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44, 177–194. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, … York JL (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69, 123–157. 10.1177/1077558711430690 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, … Kirchner JE (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, 21 10.1186/s13012-015-0209-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Proctor EK, Powell BJ, & McMillen JC (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8, 139 10.1186/1748-5908-8-139 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, … Hensley M (2011). Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health, 38(2), 65–76. 10.1007/s10488-010-0319-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Reddy LA, Kettler RJ, & Kurz A (2015). School-wide educator evaluation for improving school capacity and student achievement in high-poverty schools: Year 1 of the school system improvement project. Journal of Educational and Psychological Consultation, 25, 90–108. [Google Scholar]
- Reinke WM, Lewis-Palmer T, & Merrell K (2008). The classroom check-up: A classwide teacher consultation model for increasing praise and decreasing disruptive behavior. School Psychology Review, 37, 315. [PMC free article] [PubMed] [Google Scholar]
- Rones M, & Hoagwood K (2000). School-based mental health services: A research review. Clinical Child and Family Psychology Review, 3, 223–241. 10.1023/A:1026425104386 [DOI] [PubMed] [Google Scholar]
- Sanetti LMH, Collier-Meek MA, Long ACJ, Byron J, & Kratochwill TR (2015). Increasing teacher treatment integrity of behavior support plans through consultation and implementation planning. Journal of School Psychology, 53, 209–229. [DOI] [PubMed] [Google Scholar]
- Sanetti LMH, Knight AP, Cochrane W, & Minster M (manuscript in prepartion). Reporting of implementation fidelity of interventions with children in the school psychology literature from 2009 to 2017. [Google Scholar]
- Sanetti LMH, & Kratochwill TR (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38, 445–459. [Google Scholar]
- See BH, Gorard S, & Siddiqui N (2016). Teachers’ use of research evidence in practice: A pilot study of feedback to enhance learning. Educational Research, 58, 56–72. 10.1080/00131881.2015.1117798 [DOI] [Google Scholar]
- Sugai G, & Horner RH (2006). A promising approach for expanding and sustaining school wide positive behavior support. School Psychology Review, 35, 246–259. [Google Scholar]
- Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, … Kirchner JE (2014). Expert recommendations for implementing change (ERIC): protocol for a mixed methods study. Implementation Science, 9, 39 10.1186/1748-5908-9-39 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, … Kirchner JE (2015). Else of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10, 109 10.1186/s13012-015-0295-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Weist MD, Short K, McDaniel H, & Bode A (2016). The School Mental Health International Leadership Exchange (SMHILE): Working to advance the field through opportunities for global networking. International Journal of Mental Health Promotion, 18, 1–7. 10.1080/14623730.2015.1079420 [DOI] [Google Scholar]
- Williams NJ (2016). Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Administration and Policy in Mental Health & Mental Health Services Research, 43, 783–798. [DOI] [PMC free article] [PubMed] [Google Scholar]