Author manuscript; available in PMC: 2022 Mar 24.
Published in final edited form as: Prev Sci. 2019 Aug;20(6):914–935. doi: 10.1007/s11121-019-01017-1

Adapting a Compilation of Implementation Strategies to Advance School-Based Implementation Research and Practice

Clayton R Cook 1, Aaron R Lyon 2, Jill Locke 2, Thomas Waltz 3, Byron J Powell 4
PMCID: PMC8943907  NIHMSID: NIHMS1580419  PMID: 31152328

Abstract

Schools, like other service sectors, are confronted with an implementation gap: the slow adoption and uneven implementation of evidence-based practices (EBP) as part of routine service delivery undermines efforts to promote better youth behavioral health outcomes. Implementation researchers have undertaken systematic efforts to publish taxonomies of implementation strategies (i.e., methods or techniques that are used to facilitate the uptake, use, and sustainment of EBP), such as the Expert Recommendations for Implementing Change (ERIC) Project. The 73-strategy ERIC compilation was developed in the context of healthcare and largely informed by research and practice experts who operate in that service sector. Thus, the comprehensibility, contextual appropriateness, and utility of the existing compilation to other service sectors, such as the educational setting, remain unknown. The purpose of this study was to initiate the School Implementation Strategies, Translating ERIC Resources (SISTER) Project to iteratively adapt the ERIC compilation to the educational sector. A seven-step adaptation process resulted in 75 school-adapted strategies. Surface-level changes were made to the majority of the original ERIC strategies (52 out of 73), while five of the strategies required deeper modifications for adaptation to the school context. Six strategies were deleted and seven new strategies were added based on existing school-based research. The implications of this study’s findings for prevention scientists engaged in implementation research (e.g., creating a common nomenclature for implementation strategies) and limitations are discussed.

Keywords: Implementation science, implementation strategies, school-based mental and behavioral health, evidence-based practices


Research continues to produce a steady stream of innovations that can improve routine care for youth with behavioral health problems, such as anxiety, depression, trauma, and disruptive behavior problems (Weisz & Kazdin, 2010). Despite the promise of such research, these findings often are not successfully translated into everyday service settings in which youth naturally exist (Dingfelder & Mandell, 2011; Owens et al., 2014). Implementation research across different service sectors has shown that without deliberate efforts to bridge the science-to-practice gap through the use of implementation strategies, there is likely to be uneven uptake, use, and sustainment of research findings as part of routine practice (Proctor, Powell, & McMillen, 2013; Powell et al., 2015). In fact, research from the broader field of implementation science has estimated that two-thirds of implementation efforts fail (Burnes, 2004; Damschroder, Aron, Keith, Kirsh, Alexander, & Lowery, 2009) and most have no impact on service recipient outcomes (Powell, Proctor, & Glass, 2014).

There has been a strong push among researchers and policymakers to strategically increase the availability of evidence-based practices (EBP) as part of routine service delivery in the main settings in which youth function (Lester & Kelman, 1997; Fixsen, Blase, Metz, & Van Dyke, 2013). Schools continue to be one of these settings, as they are the primary venue in which youth receive behavioral health supports (Farmer et al., 2003). Due to greater access, reduced stigma, and the availability of professionals who can deliver needed services, schools are an ideal setting to integrate behavioral health services with academic supports (Owens et al., 2014). Researchers have developed and evaluated numerous EBP that span multiple tiers of prevention (universal, targeted, and intensive) for implementation in schools. For example, school-wide positive behavior interventions and supports (Bradshaw, Mitchell, & Leaf, 2010) and social-emotional learning curricula (Brackett & Rivers, 2014) prevent behavioral health problems and promote success enabling factors (Cook et al., 2015). Moreover, targeted small-group interventions grounded in cognitive behavior therapy have been shown to decrease mental health problems and promote better academic-related outcomes (e.g., Lochman & Wells, 2002). Finally, more intensive forms of school-based treatment, such as individualized function-based behavior intervention plans (Walker, Chung, & Bonnet, 2017) and therapeutic interventions (e.g., Morina, Koerssen, & Pollet, 2016), have been linked to reduced problem behavior and improvements in social, emotional, and academic functioning among high-risk youth. For these reasons, schools are under immense pressure from policymakers and stakeholders to deliver a continuum of EBP that target preventing and ameliorating behavioral health problems (Bruns et al., 2016).

Implementation Gap

Schools are confronted with an implementation gap, with the slow adoption of EBP into routine practice ultimately limiting their effects on youth outcomes (Gottfredson & Gottfredson, 2002; Owens et al., 2014; Ringwalt et al., 2009). Even when EBP are selected for adoption in schools, they are infrequently implemented with fidelity or sustained over time (Gottfredson & Gottfredson, 2002; Ringwalt et al., 2004). This is problematic given the demonstrated link between high-quality implementation and changes in youth social, emotional, and academic outcomes (Durlak & DuPre, 2008; St. Peter Pipkin, Vollmer, & Sloman, 2010). Addressing this gap between research and practice is a critical aspect of translational prevention science, requiring the field to move beyond development and efficacy studies to dissemination and implementation research that seeks to realize the potential public health impact of prevention science (Fishbein, Ridenour, Stahl, & Sussman, 2016).

Implementation Strategies

Successful implementation efforts depend on the strategic use of implementation strategies: methods or techniques that are used to facilitate the adoption, use, and sustainment of EBP (Proctor et al., 2013). These methods and techniques target putative contextual and individual-level mechanisms that influence implementation processes and outcomes (e.g., acceptability, appropriateness, fidelity, penetration) (Lewis et al., 2018). Implementation outcomes, the targets of implementation strategies, are distinct from service outcomes, reflect the primary dependent variables in implementation research, and are defined as the effects of deliberate and purposeful efforts to influence implementation (Proctor et al., 2009). Increasingly, implementation strategies are being developed and tested to promote the adoption, delivery, and sustainment of EBP in routine service settings (Powell et al., 2017).

Implementation research and frameworks point to a wealth of strategies that are pertinent to different phases (e.g., exploration, preparation, implementation, and sustainment; Aarons, Hurlburt, & Horwitz, 2011) and multiple levels (e.g., outer setting, inner setting, individual implementers, the innovation/practice itself) of the implementation process (Leeman, Birken, Powell, Rohweder, & Shea, 2017). For example, high-quality professional development that involves dynamic training and follow-up consultation/coaching has been shown to successfully increase providers’ delivery of EBP (Herschell et al., 2010; Sholomskas et al., 2005). Moreover, assessing readiness by examining barriers to and facilitators of implementation can inform strategic planning that targets specific implementation outcomes, such as appropriateness (i.e., the suitability or fit of a particular practice to the context) and acceptability (i.e., satisfaction with a particular practice; Weiner et al., 2017). Additionally, monitoring implementation and providing data-driven, performance-based feedback can serve as an effective means for continuous improvement of implementation outcomes, such as intervention fidelity and reach (McHugh & Barlow, 2010). Finally, there is general consensus among implementation scientists that a core aspect of implementation practice is the selection and tailoring of implementation strategies to address the barriers present within a given service setting (Powell et al., 2017).
Tailoring implementation strategies typically involves an assessment of determinants that are likely to influence implementation outcomes, such as features of the intervention or practice (e.g., the Good Behavior Game as a classroom management practice; Barrish, Saunders, & Wolf, 1969), context-specific determinants associated with the school setting in which the practice will be implemented (e.g., supportive leadership, protected time, connection between the practice and performance evaluations), or individual-level factors associated with those expected to implement the practice (e.g., self-efficacy, beliefs and attitudes, intentions to implement). Prior research has established guides to inform the assessment of these factors (Flottorp et al., 2013; Wensing & Grol, 2005), with resulting data informing the selection and tailoring of implementation strategies to context-specific barriers associated with a given setting (Wensing et al., 2011).

Keeping track of all of the implementation strategies represents an information management problem that is likely to stall both research and practice, especially when inconsistent terminology and inadequate definitions are used in research (Proctor et al., 2013). Researchers have, therefore, focused on identifying, specifying, defining, and revising a taxonomy of implementation strategies that could inform future implementation research, as well as real-world implementation practice efforts focused on bridging the science-to-practice gap.

Implementation research in the healthcare sector is more advanced than in other service sectors, including schools (e.g., Sanetti, Knight, Cochrane, & Minster, 2017). In fact, organized efforts have been undertaken to generate a taxonomy of implementation strategies that could be utilized within healthcare research. The Expert Recommendations for Implementing Change (ERIC) project (Waltz et al., 2015) built upon work conducted by a smaller group of implementation researchers who systematically developed an initial taxonomy of implementation strategies (Powell et al., 2012). This list was refined via a larger panel of implementation experts (Powell et al., 2015) and analyzed to examine the feasibility and importance of each implementation strategy (Waltz et al., 2015). The ERIC compilation (Powell et al., 2015) has provided a much-needed common language for implementation researchers and practitioners and allowed for better tracking and reporting of implementation strategies within and across studies (Proctor et al., 2013).

As it stands, no comparable effort has occurred to support implementation in schools. Given that the education sector has a number of unique implementation challenges, including educational timelines, professional characteristics, policies, and organizational constraints (Forman et al., 2013; Owens et al., 2014), it is likely that strategies designed to support clinical practice in more traditional healthcare settings (e.g., primary care, specialty mental health) will require adaptation for use in schools. In fact, Waltz et al. (2015, pp. 4–5) have argued that there is a need to adapt strategies via expert consensus to ensure “a common nomenclature for implementation strategy terms, definitions, and categories that can be used to guide implementation research and practice.”

Adaptation to the School Context

Adaptation is a process of making changes to a method, program, practice, or finding to increase its suitability for use with a particular target population (e.g., school-based researchers and practitioners) or within a given organizational context (e.g., the educational sector; McKleroy et al., 2006). Adaptation is a critical means of improving the appropriateness or contextual fit of a particular innovation (Proctor et al., 2013), and some researchers have concluded that implementation does not occur without adaptation (Lyon & Bruns, 2019). In intervention science, multiple models have been proposed to facilitate the adaptation of EBP, including making cultural and contextual changes to EBP to improve the appropriateness and relevance of the practice to the service recipients (Bernal, Jimenez-Chafey, Domenech, & Rodriguez, 2009; Bernal, Domenech & Rodriguez, 2012). Most of these models share common features with regard to the level or depth of adaptation made to an existing practice (Barrera & Gonzalez-Castro, 2006; Bernal, Bonilla, & Bellido, 1995; Leong, 1996). One level of adaptation represents surface changes that alter the label, referents, terminology, and/or examples used to describe the practice to ensure the language facilitates comprehension, contextual appropriateness, and usability by the intended end users of the innovation who operate in a specific context (Resnicow, Baranowski, Ahluwalia, & Braithwaite, 1990). In the education sector, these end users are school-based implementation researchers and practitioners whose work focuses on the translation of EBP into routine service delivery through the use of implementation strategies.
Another level of adaptation refers to deeper changes made to the substance of the practice, altering its meaning in a way that departs from the original content in order to increase its relevance and appropriateness within the specific context in which it will be deployed (McKleroy et al., 2006; Resnicow et al., 1990). Although many implementation strategies are generic and applicable across contexts, including schools, the educational sector is a unique service setting with different nomenclature used to communicate information, as well as contextual constraints (e.g., professional roles, scheduling) and features (e.g., teacher unions, school boards), rendering particular strategies more or less relevant and appropriate. Considering the above, to enhance the comprehension, contextual appropriateness, and utility of the ERIC strategy compilation in the educational sector, there is a need to engage in an iterative adaptation process.

Purpose of this Study

The purpose of this study was to initiate the School Implementation Strategies, Translating ERIC Resources (SISTER) Project by iteratively adapting the ERIC strategy compilation to derive a taxonomy of implementation strategies with relevance to the education sector. Consistent with the initial study procedures used to inform the eventual development of the ERIC compilation (Powell et al., 2012), a small group of implementation experts convened over multiple occasions to systematically and iteratively adapt the existing compilation for use in schools. The aim was to produce a SISTER strategy compilation that could inform subsequent research examining and comparing the feasibility and importance of the implementation strategies for use in the school context, as well as investigations exploring the mechanisms through which particular strategies work (Lewis et al., under review). Additionally, a sub-aim of this study was to model a process that could be used by implementation researchers working in other sectors to successfully leverage existing implementation products and resources and adapt them to their targeted setting.

Method

Expert participants

Consistent with the process used to generate the original implementation strategy compilation (Powell et al., 2012) that informed the development of the refined ERIC compilation (Powell et al., 2015), this study included a small subset of implementation experts to develop a school-adapted taxonomy of implementation strategies that is applicable to the education sector. The participants in this study included three PhD-level experts with extensive experience conducting implementation research in schools and two of the lead researchers from the ERIC project. These five experts engaged in an iterative adaptation process, with multiple rounds of revisions and feedback. The three school-based implementation experts took the lead on making changes to the original ERIC strategy compilation to enhance the comprehensibility and appropriateness of the strategies, while the two lead ERIC researchers provided feedback on the changes made to the implementation strategies to maintain conceptual consistency with the original strategies. This process was repeated three times until consensus was reached.

Original ERIC Strategy Compilation

The refined ERIC strategy compilation (Powell et al., 2015) was used to develop a school-adapted taxonomy of implementation strategies—the SISTER compilation. The revised ERIC compilation was generated based on input from an expert panel of implementation researchers and practitioners, with nearly two-thirds of the experts being affiliated with the Veterans Administration healthcare system. A modified Delphi approach involving three rounds of iterative revision was applied to Powell et al.’s (2012) published taxonomy of 68 strategies to develop a revised compilation based on expert consensus. Consistent with the Delphi approach, experts engaged in structured conversations across multiple rounds to iteratively adapt and reach consensus on adaptations to the original compilation (Dalkey & Helmer, 1963). This process resulted in the expert panel reaching consensus on a final compilation of 73 implementation strategies. These 73 implementation strategies span multiple levels of the service delivery context (inner setting, outer setting) and stages/phases of the implementation process (exploration, preparation, implementation, sustainment), and target different stakeholders involved in the uptake and use of EBP (leaders, implementers, recipients, other stakeholders). The revised ERIC compilation has informed subsequent research examining the feasibility and importance of implementation strategies for use in particular service sectors, classification of strategies into conceptual categories, and linking strategies to specific barriers to advance tailored implementation (Powell et al., 2017; Waltz et al., 2015).

Procedure

Prior to conducting this study, IRB approval was sought, and the university IRB determined that this study was exempt. An iterative adaptation process was developed to systematically examine and make changes to the revised ERIC compilation of 73 implementation strategies to create the SISTER strategy compilation. A key aspect of the adaptation process included the recruitment and participation of two of the developers from the ERIC project to serve as independent experts who provided feedback at specific points. All changes to extant ERIC strategies considered the common language and unique constraints and features of the school context to increase comprehension, contextual appropriateness, and utility for school-based implementation researchers and practitioners. The adaptation protocol proceeded systematically along a series of seven sequential steps: (1) school-based implementation experts reviewed existing implementation strategies to make revisions to the language, referents, and terminology to be consistent with the school context; (2) modification or expansion of examples to increase comprehension regarding how each strategy is applicable to EBP implementation in the school context; (3) removal of implementation strategies determined to be contextually inappropriate to the school context or redundant with other strategies as they manifest in schools; (4) addition of novel implementation strategies not included in the ERIC compilation that have evidence to enhance EBP implementation in schools; (5) review and feedback by ERIC investigators on the school-adapted compilation to ensure conceptual consistency with the original strategies; (6) additional revision, based on feedback from ERIC developers, to ensure conceptual consistency with the original strategies and increase the comprehension, contextual appropriateness, and utility to the school context; and (7) re-review by ERIC developers and finalization of an initial set of school-adapted implementation strategies.

The analytic approach consisted of categorizing the nature of revisions made to each of the strategies as either no change, surface change, or deep change. Further, we recorded the specific features of the strategy that were modified, including (a) changes to the strategy label, (b) changes to the referents used to contextualize the strategy (e.g., replacing agency with school or district, or replacing clinician with teacher), (c) changes to the terminology used within the definition of the strategy, and/or (d) changes to the examples used to illustrate the strategy. No change referred to strategies that remained unaltered and, therefore, included the same label, referents, terminology, and examples as the original ERIC strategy. Surface-level changes reflected relatively minor changes that did not depart from the meaning of the original strategy but were made to increase contextual appropriateness for school-based researchers and practitioners. Specific surface-level changes were recorded, which included changes to the strategy label, referents (e.g., school personnel instead of clinician, or school instead of clinic or agency), terminology (e.g., new practice instead of clinical innovation), or parenthetical and non-parenthetical examples in the strategy description. Deep change was used to categorize strategies that underwent more substantial modifications that altered the meaning of the strategy in a way that departed from the original ERIC strategy. For all strategies that underwent deep changes, the specific reason for the deep change was recorded.
Additionally, strategies that were deleted from or added to the ERIC taxonomy were recorded in order to tabulate the number of strategies that were deemed irrelevant and inappropriate to the school context, as well as those novel strategies that were not included in the ERIC compilation but that educational research has identified as methods or procedures that could impact the successful uptake and use of EBP in schools. After completing the iterative adaptation process, to examine patterns in the types of modifications, we synthesized the different changes (no change, surface, deep, deleted, added) according to each of the strategy categories derived from Waltz et al. (2015). Waltz and colleagues (2015) used expert ratings and concept mapping (Kane & Trochim, 2007) to derive the following nine strategy categories: (1) use evaluative and iterative strategies; (2) provide interactive assistance; (3) adapt and tailor to context; (4) develop stakeholder relationships; (5) train and educate stakeholders; (6) support educators (the word “clinicians” in the original was replaced with “educators”); (7) engage consumers; (8) utilize financial strategies; and (9) change infrastructure. These categories were used to organize a side-by-side comparison of the ERIC and SISTER compilations, as well as to examine patterns in the types of modifications made to the original strategies.

Results

The results of the adaptation process are depicted in Tables 1–9, which include a side-by-side comparison of the ERIC strategy compilation and the adapted SISTER compilation for each of the nine conceptual categories. The strategies are organized in alphabetical order within each of the conceptual categories. After applying the iterative adaptation process to the ERIC strategy compilation, the final SISTER compilation included a total of 75 unique implementation strategies. Below is a detailed account of the adaptations that were made to generate the 75 strategies included in the SISTER compilation.

Table 1.

Adaptations to Strategies Falling Under Use Evaluative and Iterative Strategies

1. ERIC strategy: Assess for readiness and identify barriers and facilitators
Definition: Assess various aspects of an organization to determine its degree of readiness to implement, barriers that may impede implementation, and strengths that can be used in the implementation effort.
SISTER strategy: Assess for readiness and identify barriers and facilitators
Definition: Assess various aspects of the school context to determine the degree to which it and the school personnel within it are ready to implement, barriers that may impede implementation, and strengths or facilitators (such as coaches, professional learning communities, whole-staff training) that can be used/leveraged in the implementation effort.
Change: Surface (R & T)

2. ERIC strategy: Audit and provide feedback
Definition: Collect and summarize clinical performance data over a specified time period and give it to clinicians and administrators to monitor, evaluate, and modify provider behavior.
SISTER strategy: Audit and provide feedback
Definition: Collect and summarize data regarding implementation of the new program or practice over a specified time period and give it to administrators and school personnel to monitor, evaluate, and support implementer behavior.
Change: Surface (R & T)

3. ERIC strategy: Conduct cyclical small tests of change
Definition: Implement changes in a cyclical fashion using small tests of change before taking changes system-wide. Tests of change benefit from systematic measurement, and results of the tests of change are studied for insights on how to do better. This process continues serially over time, and refinement is added with each cycle.
SISTER strategy: Conduct cyclical small tests of change (piloting or trialing the practice first)
Definition: Implement changes in a cyclical fashion using small tests of change before taking changes system-wide. Tests of change benefit from systematic measurement, and results of the tests of change are studied for insights on how to better implement. This process continues over time, and refinements are made with each cycle to incrementally adjust the new practice to make it more feasible and appropriate for the school context.
Change: Surface (L, T, & E)

4. ERIC strategy: Conduct local needs assessment
Definition: Collect and analyze data related to the need for the innovation.
SISTER strategy: Conduct local needs assessment
Definition: Collect and analyze data related to the need for new practices.
Change: Surface (T)

5. ERIC strategy: Develop a formal implementation blueprint
Definition: Develop a formal implementation blueprint that includes all goals and strategies. The blueprint should include: (1) aim/purpose of the implementation; (2) scope of the change (e.g., what organizational units are affected); (3) timeframe and milestones; and (4) appropriate performance/progress measures. Use and update this plan to guide the implementation effort over time.
SISTER strategy: Develop a detailed implementation plan or blueprint
Definition: Develop a detailed implementation plan or blueprint that includes the intended goals/outcomes to be achieved via the implementation effort as well as the process and strategies that will be used to achieve those goals. The blueprint should include: (1) aim/purpose of the implementation; (2) scope of the change (e.g., who and what settings will be affected); (3) goals/outcomes to be achieved; (4) timeframe and milestones; (5) appropriate performance/progress measures; and (6) specific strategies that will be used to attain goals/outcomes. Use and update this plan to guide the implementation effort over time.
Change: Surface (L, R, T, & E)

6. ERIC strategy: Develop and organize quality monitoring systems
Definition: Develop and organize systems and procedures that monitor clinical processes and/or outcomes for the purpose of quality assurance and improvement.
SISTER strategy: Develop and organize quality monitoring systems
Definition: Develop and organize systems and procedures that monitor implementation and/or student outcomes for the purpose of quality assurance and improvement.
Change: Surface (R & T)

7. ERIC strategy: Develop and implement tools for quality monitoring
Definition: Develop, test, and introduce into quality-monitoring systems the right input—the appropriate language, protocols, algorithms, standards, and measures (of processes, patient/consumer outcomes, and implementation outcomes) that are often specific to the innovation being implemented.
SISTER strategy: Develop instruments to monitor and evaluate core components of the innovation/new practice
Definition: Develop, validate, and integrate measurement instruments or tools to monitor and evaluate the extent to which school personnel are implementing the core components of the intervention (i.e., with fidelity).
Change: Deep (Surface: L, R, & T; Deep: change to substance to purposefully narrow the strategy to be more appropriate to the school context)

8. ERIC strategy: Obtain and use patients/consumers and family feedback
Definition: Develop strategies to increase patient/consumer and family feedback on the implementation effort.
SISTER strategy: Obtain and use student and family feedback
Definition: Develop strategies to increase student and family feedback on the implementation effort.
Change: Surface (L & R)

9. ERIC strategy: Purposely reexamine the implementation
Definition: Monitor progress and adjust clinical practices and implementation strategies to continuously improve the quality of care.
SISTER strategy: Monitor the progress of the implementation effort
Definition: Monitor the progress of key implementation outcomes (fidelity, reach of the intervention, acceptability) and adjust practices and implementation strategies as needed to continuously improve the quality of delivery.
Change: Surface (L, T, & E)

10. ERIC strategy: Stage implementation scale up
Definition: Phase implementation efforts by starting with small pilots or demonstration projects and gradually moving to a system-wide rollout.
SISTER strategy: Stage implementation scale up
Definition: Phase implementation efforts by starting with small pilots or demonstration projects and gradually moving to a system-wide rollout.
Change: None

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Table 9.

Adaptations to Strategies Falling Under Change Infrastructure

Strategy #66. Change accreditation or membership requirements (label unchanged)
ERIC definition: Strive to alter accreditation standards so that they require or encourage use of the clinical innovation. Work to alter membership organization requirements so that those who want to affiliate with the organization are encouraged or required to use the clinical innovation.
SISTER definition: Strive to alter accreditation standards so that they require or encourage use of the specific new practice (e.g., proactive classroom management practices, school-wide PBIS, social-emotional learning curriculum). Work to alter membership organization requirements so that those who want to affiliate with the organization are encouraged or required to use new practices.
Change: Surface (R, T, & E)

Strategy #67. Change liability laws → Change ethical and professional standards of conduct
ERIC definition: Participate in liability reform efforts that make clinicians more willing to deliver the clinical innovation.
SISTER definition: Participate in efforts to reform ethical and professional standards of conduct that encourage school personnel to view delivery of new practices as an ethical responsibility consistent with the expectations for professional conduct.
Change: Deep (Surface: L, R, & T; Deep: change in substance given the general absence of liability laws in education, e.g., no educational malpractice)

Strategy #68. Change physical structure and equipment → Change/alter environment
ERIC definition: Evaluate current configurations and adapt, as needed, the physical structure and/or equipment (e.g., changing the layout of a room, adding equipment) to best accommodate the targeted innovation.
SISTER definition: Evaluate the current environment and, as needed, alter or change aspects of it (e.g., changing the layout of a classroom, master scheduling, repurposing space) to best accommodate new practices.
Change: Surface (L, R, & E)

Strategy #69. Change record systems (label unchanged)
ERIC definition: Change records systems to allow better assessment of implementation or clinical outcomes.
SISTER definition: Change data collection systems to allow better assessment of implementation or relevant outcomes.
Change: Surface (T)

Strategy #70. Change service sites → Change school or community sites
ERIC definition: Change the location of clinical service sites to increase access.
SISTER definition: Change the location of services to enable students to have increased access to new practices.
Change: Surface (L, R, & T)

Strategy #71. Create or change credentialing and/or licensure standards → Create or change credentialing and/or professional development standards
ERIC definition: Create an organization that certifies clinicians in the innovation or encourage an existing organization to do so. Change governmental professional certification or licensure requirements to include delivering the innovation. Work to alter continuing education requirements to shape professional practice toward the innovation.
SISTER definition: Create an organization that certifies school personnel in new practices or encourage an existing organization to do so. Change governmental professional certification or licensure requirements to include delivering the new practices. Work to alter continuing education requirements to shape professional practice toward new practices.
Change: Surface (L, R, & T)

Strategy #72. Develop local policy that supports implementation (new SISTER strategy)
SISTER definition: Develop local school system policy that establishes rules, expectations, and guidelines for implementation of new practices.
Change: Addition (new strategy added given literature indicating the importance of policy-practice alignment to support implementation efforts in schools)

Strategy #73. Mandate change → Mandate for change
ERIC definition: Have leadership declare the priority of the innovation and their determination to have it implemented.
SISTER definition: Have leadership declare the priority of new practices (i.e., top-down) and their determination to have them implemented.
Change: Surface (T)

Strategy #74. Pruning competing initiatives (new SISTER strategy)
SISTER definition: Take away or reduce other implementation efforts to reduce implementation overload and enable school personnel to focus their energy and effort on delivering an identified program or practice.
Change: Addition (strategy added in light of recent evidence on the importance of de-prioritizing implementation activities or initiatives to make room for the new practice)

Strategy #75. Start a dissemination organization → Start a dissemination/implementation organization
ERIC definition: Identify or start a separate organization that is responsible for disseminating the clinical innovation. It could be a for-profit or non-profit organization.
SISTER definition: Identify or start a separate organization that is responsible for disseminating and implementing new practices. It could be a for-profit or non-profit organization.
Change: Surface (L & T)

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Strategy Adaptation

No Change.

Out of the 73 ERIC strategies, 11 remained unaltered with no surface level changes made to the label, referents, terminology, and/or examples. Representative example strategies that were deemed to generalize well to the educational sector in their current form included: Access new funding: Access new or existing money to facilitate the implementation (strategy #60); Visit other sites: Visit sites where a similar implementation effort has been considered successful (strategy #36); and Develop academic partnerships: Partner with a university or academic unit for the purposes of shared training and bringing research skills to an implementation project (strategy #24).

Overall Changes.

Results from the coding indicated that changes were made to 57 (78%) of the original ERIC strategies. Changes included the following: (a) 28 strategies with label changes, (b) 39 strategies with changes to the referent (e.g., teacher instead of clinician or school instead of agency), (c) 50 strategies with changes to terminology used to describe the strategy, and (d) 17 strategies with changes to the examples used to illustrate the strategy.

Surface Change.

For 52 of the 57 adapted strategies, only surface-level changes were made. In total, 147 unique surface-level changes were made to the labels, referents, terminology, or examples to increase the comprehensibility and appropriateness of these 52 ERIC strategies. On average, roughly 2.5 surface-level changes were made to each adapted strategy, ranging from one surface-level change (e.g., a terminology change to strategy #45, shadow other experts) to four (e.g., label, referent, terminology, and example changes to strategy #50, facilitate relay of clinical data to providers). Specifically, changes to the label were made to 33 of the strategies, with examples including changing "Remind clinicians" to "Remind school personnel" and changing "Facilitate relay of clinical data to providers" to "Facilitate relay of intervention fidelity and student data to school personnel." Further, changes to the referents (implementers, service recipients, or service setting) were made to 40 ERIC strategies. The most common referent changes consisted of replacing "clinician" with "school personnel" (13 times), "sites or organizations" with "school or district" (25 times), and "consumer/patient" with "students and/or families" (25 times). The most common surface-level changes overall were modifications to the terminology used to describe a strategy, with a total of 55 of the ERIC strategies undergoing terminology changes, such as using "new practice" instead of "clinical innovation" and adding terminology that reflects common language used by school-based researchers and practitioners. Last, changes or additions to examples in the definition (parenthetical and non-parenthetical) were applied to 19 of the ERIC strategies to increase understanding of how a strategy could be applied in the school context.
For example, for strategy #38, an expanded parenthetical example was provided to better describe the type of trained person who could conduct an educational outreach visit to support the implementation of a new practice.

Deep Change.

Deep changes were made to five of the ERIC strategies, resulting in modifications that altered the core meaning of the adapted strategy in a way that departed from the original. These deep changes were made in addition to the surface-level changes described above. Three of the deep changes were made to strategies involving the use of financial mechanisms to influence implementation outcomes, which the school-based implementation experts agreed were inappropriate to the school context. However, these financial strategies had parallels in the school context and, thus, underwent deep changes rather than deletion. For example, develop disincentives (strategy #63) was preserved but altered to remove reference to financial penalties and instead describe disincentives that are more appropriate to the school context, such as a write-up in one's professional file, a meeting with the administrator to discuss insufficient implementation, and additional professional development for failure to implement or use the new practices. Moreover, make billing easier (strategy #65) was maintained but substantially altered to make implementation easier by removing burdensome documentation tasks, as the latter is a more contextually appropriate strategy for reducing burdens that impede educators' implementation efforts. Another strategy underwent deep change because it involved changing liability laws (strategy #67), which currently do not exist in education to the extent they do in healthcare (e.g., there is no educational malpractice statute as there is in medicine). Thus, the strategy was altered to reflect changes in ethical and professional standards of practice, an implementation strategy that is more appropriate to how schools operate.
Last, the ERIC strategy develop and implement tools for quality monitoring (strategy #7) underwent deep changes because it included a diffuse set of recommendations (changes to language, protocols, algorithms, standards, and measures of processes, patient/consumer outcomes, and implementation outcomes) that would limit its appropriateness and usability in the school context. Thus, deep changes were made to narrow the focus of the strategy and make it more appropriate to the school context.

Deleted.

A total of five ERIC strategies were deleted and not included in the final SISTER strategy compilation due to consensus that they were not appropriate to the school context. Three of the five strategies were deleted because they involved methods or techniques targeting the manipulation of financial structures to facilitate implementation outcomes. Due to the unique constraints of educational settings, such as school boards, compulsory attendance, and educational policy, financial strategies such as fee-for-service, use capitated payments, and use other payment schemes are neither applicable nor appropriate to the school context (Lyon, Cook et al., in press). One strategy, use an implementation advisor, was deleted due to redundancy, given its overlap with and lack of distinction from other strategies. Last, revising professional roles was removed from the SISTER compilation because revising educators' roles is contextually inappropriate in schools. Teachers, for example, have highly prescriptive roles and credentials that prohibit shifting or exchanging roles with other educators (e.g., a teacher with a school counselor, or a special education teacher with a general education teacher; Herlihy & Corey, 2006; Urbach et al., 2015).

Added.

A deliberate scan of the ERIC compilation to identify missing strategies resulted in a total of seven new strategies being added to the SISTER compilation: (a) Develop local policy that supports implementation (strategy #72), (b) Improve implementers' buy-in (strategy #51), (c) Peer-assisted learning (strategy #13), (d) Pre-correction prior to implementation (strategy #52), (e) Pruning competing initiatives (strategy #74), (f) Targeting/improving implementer wellbeing (strategy #54), and (g) Test-drive and select practices (strategy #18). These strategies were included based on findings from school-based research on methods and techniques used across multiple levels (e.g., policy to individual implementers) to facilitate implementation. Expanded definitions of each newly added strategy are included in the tables.

Strategy Changes by Category.

The types of modifications made within each of the nine Waltz et al. (2015) conceptual strategy categories are depicted in Table 10. Proportionally, the category of Financial strategies underwent the most significant modifications, with two-thirds of its strategies undergoing deep changes to modify meaning (n = 3, 33%) or deletion from the SISTER compilation (n = 3, 33%). Strategies were deleted from only three of the nine categories (Develop stakeholder relationships, Support educators, and Financial strategies), while new strategies were added to four of the nine categories (Provide interactive assistance, Adapt & tailor to context, Support educators, and Change infrastructure). Three of the categories included strategies that required deep changes altering their meaning from the original ERIC strategy (Use evaluative & iterative strategies, n = 1, 10%; Financial strategies, n = 3, 33%; Change infrastructure, n = 1, 12%).

Table 10.

Types of Modifications According to Established Conceptual Strategy Categories (Powell et al., 2015)

Category | # of ERIC strategies | No change | Surface change only | Deep change | Deleted | Added | # of SISTER strategies
Use Evaluative & Iterative Strategies | 10 | 1 (10%)a | 8 (80%) | 1 (10%) | 0 | 0 | 10
Provide Interactive Assistance | 4 | 1 (25%) | 3 (75%) | 0 | 0 | 1 | 5
Adapt & Tailor to Context | 4 | 2 (50%) | 2 (50%) | 0 | 0 | 1 | 5
Develop Stakeholder Relationships | 17 | 4 (24%) | 12 (71%) | 0 | 1 (5%) | 0 | 16
Train & Educate Stakeholders | 11 | 2 (18%) | 9 (82%) | 0 | 0 | 0 | 11
Support Educators | 5 | 0 | 4 (80%) | 0 | 1 (20%) | 3 | 7
Engage Consumers | 5 | 0 | 5 (100%) | 0 | 0 | 0 | 5
Financial Strategies | 9 | 1 (11%) | 2 (22%) | 3 (33%) | 3 (33%) | 0 | 6
Change Infrastructure | 8 | 0 | 7 (88%) | 1 (12%) | 0 | 2 | 10
Totals | 73 | 11 | 52 | 5 | 5 | 7 | 75

Note: a Percentages indicate the proportion of original strategies within the conceptual category that underwent the specified modification.

Discussion

The identification, deployment, and testing of implementation strategies is critical to advancing implementation science and practice. This study iteratively adapted the refined ERIC strategy compilation (Powell et al., 2015) for use by school-based implementation researchers and practitioners. Application of the iterative adaptation process resulted in 11 of the 73 ERIC strategies requiring no modification, 52 undergoing surface-level changes only, and five needing deep changes. Five strategies were deleted and seven new strategies were added, resulting in a total of 75 unique school-based implementation strategies.

Dissemination of this study's findings is important to ensure that school-based implementation researchers and practitioners become aware of the full range of implementation strategies available to support the uptake, delivery, and sustainment of EBP, given that the majority of efforts to change routine practice fail (Burnes, 2004; Damschroder et al., 2009). Dissemination of implementation strategies is also critical to establish a common nomenclature among prevention scientists engaged in school-based research and to develop a generalizable knowledge base capable of answering key questions, such as which strategy worked under what conditions and how it worked. Akin to intervention science, clear labels and definitions of implementation strategies will facilitate more precise assessment and reproducibility in research and practice (Proctor, Powell, & McMillen, 2013). For example, the SISTER compilation may enable prevention scientists to more accurately identify and track the core implementation strategies deployed in efficacy studies (e.g., conduct ongoing training, provide local technical assistance, provide ongoing consultation) to support the successful uptake and delivery of EBP with fidelity, strategies that otherwise go unreported, resulting in a greater likelihood of replication across studies and investigative groups (Boyd, Powell, Endicott, & Lewis, 2017; Bunger et al., 2017).
Further, capturing the types of strategies that are needed to promote effective implementation (e.g., identify and prepare champions, alter and provide system- and individual-level incentives, provide practice-specific supervision) will be critical for supporting both indigenous school personnel (e.g., school psychologists, social workers) and EBP purveyors (e.g., external organizations that provide training and technical assistance on a given EBP) in translating EBP into everyday practice when strict oversight and control by researchers is lessened or unavailable (i.e., effectiveness research).

Emerging Patterns by Strategy Category

When examining patterns in the types of modifications made to strategies according to the Waltz et al. (2015) conceptual categories, several interesting findings emerged. First, consistent with the above, the strategy category with the most substantial modifications was Financial strategies, with two-thirds of the strategies (6 out of 9) either deeply modified or deleted from the SISTER compilation. Financial strategies are largely inappropriate for use in schools due to unique policy, collective bargaining arrangements (i.e., unions and contracts), and compensation schemes (Lyon et al., in press). These findings suggest that certain types of implementation strategies may be more bound to a specific service sector and, thus, less transferable across contexts that differ in how services are accessed (e.g., fee for service) and how providers are incentivized to implement new practices. Some of the financial strategies did, however, have parallels in the school context. For example, although financial disincentives are inappropriate for use in schools, the broader notion of creating disincentives for lackluster implementation is applicable. Indeed, creating situations that educators want to avoid (e.g., a teacher meeting with the site administrator at an inconvenient time to discuss lackluster implementation) as a way of promoting greater uptake and delivery of EBP has been found to be effective in schools (DiGennaro, Martens, & McIntyre, 2005).

Second, four strategy categories (Provide interactive assistance, Adapt & tailor to context, Train & educate stakeholders, and Engage consumers) underwent minimal modifications to increase their comprehensibility, contextual appropriateness, and utility for implementation researchers and practitioners operating in schools. Strategies that fall under these categories may be agnostic to the service delivery context and, therefore, more generalizable across implementation scenarios, settings, and providers. For example, there is consensus among researchers and practitioners across service sectors that the category of train and educate stakeholders is relevant and necessary whether one is operating within healthcare, behavioral health, or education (Beidas & Kendall, 2010; Grol, 2001; Lyon, Pullmann, Walker, & D'Angelo, 2017; Stahmer, Reed, & Lee, 2015), as stakeholders need to know why the EBP is needed, what the EBP entails, and what implementation looks like. Moreover, it is clear across service contexts, including schools, that providing interactive assistance, in the form of ongoing technical assistance, facilitation, and supervision, is critical for supporting frontline providers' (e.g., nurses, mental health providers, or teachers) uptake and delivery of EBP (Cook & Odom, 2013; Lyon et al., 2017; Stetler et al., 2006).

Last, the seven newly generated strategies were classified into only four of the nine conceptual strategy categories, with most additions falling under Support educators (n = 3) and Change infrastructure (n = 2). This finding speaks to the overall representativeness of the refined ERIC strategy compilation (Powell et al., 2015), as relatively few new strategies were generated, and those that were fell into a small subset of the conceptual categories. It also indicates that certain strategy categories, like Support educators and Change infrastructure, may have greater room for innovation in generating additional individual- and contextual-level strategies to support implementation. The generation of additional strategies for inclusion in strategy compilations should continue to be guided by consensus-driven procedures using the best available evidence, including efforts to classify new strategies under existing conceptual categories to clarify how they fit within the more comprehensive collection of strategies.

Addition of New Strategies

The rationale for including additional unique school-relevant strategies that were missing from the ERIC compilation warrants further discussion. Develop local policy that supports implementation was added based on research findings related to universal prevention efforts, such as school-wide positive behavior interventions and supports, suggesting that changes to school discipline policy lead to changes in how educators respond to problem behavior (Horner, Sugai, & Fixsen, 2017). Improve implementers' buy-in was included based on emerging evidence linking changes in educator beliefs and attitudes to implementation intentions and behaviors (Cook et al., 2015). Peer-assisted learning was added in light of research suggesting that peer learning networks or collaborative frameworks facilitate reflective practice and provide educators with a form of peer accountability that enhances the implementation of academic and behavioral supports (Kohler, Crillery, Shearer, & Good, 1997; Vescio, Ross, & Adams, 2008). Pre-correction prior to implementation was generated based on evidence that antecedent strategies delivered temporally before an implementation opportunity facilitate educators' successful delivery of an EBP (Cook et al., 2017). Pruning competing initiatives reflects strategic de-adoption practices to offset the potential for implementation overload, and was included as a strategy to make room for frontline providers to prioritize the implementation of a new program or practice (Abrahamson, 2004; Nadeem & Ringle, 2016). Targeting/improving implementer wellbeing has recently emerged as an implementation strategy, with findings showing that reductions in stress and burnout lead to improved intentions to implement and actual use of EBP by teachers (Cook et al., 2017; Larson et al., 2017).
Test-drive and select practices is a way of incorporating implementer choice/preference in the selection of an EBP, and has shown promise as a technique for improving fidelity among educators who are initially resistant to adopt and deliver a new practice (Dart, Cook, Gresham, & Chenier, 2012; Johnson et al., 2014).

Although these additions were identified with the school context in mind, most are likely applicable to other service sectors. For example, efforts to promote implementer buy-in prior to and during an implementation effort are likely to facilitate implementation outcomes in other service sectors focused on promoting youth behavioral health, such as healthcare, child welfare, juvenile justice, and public health (e.g., Russ et al., 2015). Moreover, stress and burnout among implementers are not barriers unique to schools (e.g., Khamisa, Peltzer, & Oldenburg, 2013). Thus, efforts targeting stress and burnout reduction are likely to promote providers' wellbeing and may increase their intentions to adopt and deliver clinical innovations (Damian, Gallo, Leaf, & Mendelson, 2017). In the multidisciplinary spirit of implementation science, strategies that facilitate implementation outcomes in one context may ultimately prove appropriate and useful beyond the setting in which they were originally developed.

Implications

This study has notable implications for prevention scientists dedicated to improving youth access to high-quality behavioral health services in schools. First, although implementation science is far less advanced in the educational sector than in other fields (Sanetti, Knight, Cochrane, & Minster, 2017), lagging behind can be viewed as an opportunity for strategic adaptation of established implementation tools and resources. Service sectors with lagging research, such as education, are well positioned to take advantage of extant findings from other sectors, such as healthcare, by strategically adapting them for use in a novel context. As highlighted in this study, such strategic selection and adaptation capitalizes on the trailblazing work of implementation scientists and practitioners operating in other service sectors. To support these advancements, school-based prevention scientists must stay informed of implementation research outside their own discipline to identify findings that could be strategically adapted for use in their specific context. In the area of measurement, for example, researchers in child welfare and youth mental health have developed pragmatic tools to assess key factors of the inner organizational context (i.e., the microsystem in which implementation happens) that are most proximal to providers' implementation behaviors (Aarons, Ehrhart, & Farahnak, 2014; Ehrhart, Aarons, & Farahnak, 2014, 2015), and these measures have been adapted for use in school-based implementation research and practice (Lyon et al., in press).

Establishing an adapted compilation of implementation strategies also has implications for deepening understanding of which strategies are most commonly needed, feasible to deploy, and effective across implementation efforts. Implementation strategies are not necessarily equal: some require more resources (i.e., time, money, and energy) to deploy, some are more or less effective, and some are needed more frequently. Thus, there is a need to examine pragmatic dimensions of strategies that affect their likely use among implementation practitioners. Ultimately, school-based implementation research should guard against replicating the very divide it seeks to address between what research indicates works and what gets adopted in everyday service settings (Lyon, Comtois, Kerns, Landes, & Lewis, in press). Similar to work undertaken with the ERIC compilation (Waltz et al., 2015), researchers should examine experts' and practitioners' perceptions of the feasibility and impact of strategies to identify those that are low burden to deploy yet likely to influence EBP implementation.

The SISTER strategy compilation, as well as other published taxonomies, has implications for identifying the subset of strategies that are most frequently needed by implementation practitioners within a given service setting. One starting place is to link strategies to the most commonly encountered malleable determinants (i.e., barriers or facilitators) that affect successful EBP implementation, as one approach to tailored implementation involves targeting strategies to specific barriers identified in a given context. Pareto's Law of the Vital Few (Bookstein, 1990), which captures the natural distribution of problems for particular phenomena in order to distill them to a core set, suggests that a smaller subset of barriers (e.g., 20%) likely accounts for the majority of implementation issues encountered (e.g., 80%). This may prove quite useful, given that 601 plausible determinants of implementation have been identified (Krause et al., 2014). If this law holds, then researchers and practitioners need to identify the vital determinants that account for the majority of implementation failures. Barriers that are frequently encountered and malleable could be ideal targets for developing more pragmatic approaches to tailoring strategies to a given setting (Locke et al., 2016). Researchers have identified four methodologies that could help inform the tailoring of implementation strategies to context: concept mapping, group model building, conjoint analysis, and intervention mapping (Powell et al., 2017).
These methodologies can help provide greater guidance on how to link implementation strategies more precisely to: (a) stages of the implementation process (e.g., exploration, preparation, implementation, and sustainment; Aarons et al., 2011), (b) determinants that serve as barriers to implementation (e.g., insufficient knowledge of, or motivation to implement, the new innovation), and (c) measures that monitor specific implementation outcomes (e.g., appropriateness, intervention fidelity, penetration/reach) to inform data-driven improvement decisions. Further streamlining of implementation strategies may come from emerging efforts to detail the mechanisms through which strategies influence implementation outcomes (e.g., Lewis, 2017; Williams, 2015). Similar to the push to identify mechanisms of action in intervention science (Kazdin, 2007), identifying and testing implementation mechanisms holds promise for eliminating strategies (or strategy components) that do not operate through the strongest pathways of action. Researchers have also begun to outline methodologies for developing and testing specific strategy-mechanism-outcome linkages (e.g., Lyon et al., 2016), which have relevance to work in the education sector. Research on how to tailor implementation strategies to a given context will hopefully yield more efficient and effective approaches to implementation.
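To make the "vital few" logic concrete, the following is a minimal illustrative sketch (not from any of the studies cited above; all barrier names and counts are entirely hypothetical) of how one might tally barrier reports from a needs assessment and identify the small subset accounting for roughly 80% of reported implementation issues:

```python
# Illustrative Pareto ("vital few") analysis of implementation barriers.
# NOTE: barrier labels and frequencies below are invented for demonstration
# only; real data would come from a local needs assessment.
from collections import Counter

barrier_reports = Counter({
    "insufficient time": 40,
    "low staff buy-in": 25,
    "competing initiatives": 15,
    "limited training": 10,
    "unclear leadership support": 5,
    "scheduling conflicts": 3,
    "space constraints": 2,
})

total = sum(barrier_reports.values())  # 100 reports in this toy example
cumulative = 0
vital_few = []
# Walk barriers from most to least frequent, accumulating their share of
# all reports until the 80% threshold is reached.
for barrier, count in barrier_reports.most_common():
    cumulative += count
    vital_few.append(barrier)
    if cumulative / total >= 0.80:
        break

print(vital_few)
# → ['insufficient time', 'low staff buy-in', 'competing initiatives']
```

In this toy dataset, 3 of the 7 barriers (about 43%) account for 80% of reports; such a list would then be the candidate target set for tailored strategy selection.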

We believe that existing taxonomies, like the original ERIC compilation, need to be adapted to the specific service sector in which they will be used, as adaptation helps ensure that products and ideas are comprehensible and appropriate to the stakeholders (e.g., researchers, practitioners, and policymakers) operating in that sector (Bernal et al., 2012). In children's mental and behavioral health, education, community mental health, juvenile justice, primary care, and child welfare represent the main sectors in which children receive services; thus, we anticipate that the number of adapted compilations will mirror the number of child-serving sectors. It is also important that findings stemming from adaptation efforts, like SISTER, be fed back to the original source to potentially expand and refine the ERIC compilation.

Limitations and Directions for Future Research

This study has several limitations. First, this initial study did not include as comprehensive a group of experts as the original ERIC project, which drew on a total of 71 implementation research and practice experts. Future research on the SISTER compilation will seek to expand the representativeness of the research and practitioner experts who provide input on the compilation and on recommendations to inform its pragmatic use in real-world implementation efforts. This would ideally include input from implementation practitioners or intermediaries (e.g., external organizations or individuals who are EBP champions and use the science of implementation to support real-world implementation efforts; Franks & Bory, 2018) working in educational settings. Second, the adaptation process employed was not predicated on a widely established approach; rather, the seven-step process was constructed for the purposes of this study due to the lack of a widely accepted method for adapting existing research findings for use in novel contexts. Researchers may use the adaptation process in this study as a starting point for establishing a more rigorous approach through expert consensus-driven procedures. Third, although there are systematic reviews of the school-based literature examining the use and effects of consultation and coaching on implementation, there are no comprehensive reviews of implementation strategies; such research will be an important follow-up to the work presented in this paper. Last, this study provides no guidance to facilitate decision making regarding the selection and use of strategies in response to particular implementation scenarios. The lack of empirical guidance is especially noteworthy in school-based behavioral health relative to other service sectors (Novins et al., 2013), as there are few experimental studies or comparisons of implementation strategies.

Conclusion

Implementation strategies are essential to effectively incorporate EBP into school-based behavioral health service delivery and improve outcomes for youth. This study established the initial school-adapted SISTER strategy compilation, which will hopefully provide common language and stimulate future implementation research in the education sector. The SISTER compilation provides a useful starting place for moving school-based behavioral health forward. Eventually, we hope to arrive at a place of greater understanding among implementation researchers and practitioners regarding when and how to select implementation strategies for new circumstances. Nevertheless, it is unlikely that the current SISTER compilation reflects the full set of potentially relevant and useful implementation strategies in the education sector. As our research and collaborations in this area continue to advance, along with the field of implementation science more generally, we anticipate further revisions to this list. Moreover, we are hopeful that prevention scientists will scale out this work by adapting it to novel child-serving sectors for use by researchers and practitioners seeking to advance EBP implementation.

Table 2.

Adaptations to Strategies Falling Under Provide Interactive Assistance

Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Centralize technical assistance
Develop and use a centralized system to deliver technical assistance focused on implementation issues.
Centralize technical assistance
Develop and use a centralized system within a district, region, or state to deliver and facilitate access to technical assistance focused on implementation issues.
Surface Surface: R & T 11
Facilitation
A process of interactive problem solving and support that occurs in a context of a recognized need for improvement and a supportive interpersonal relationship.
Facilitation/problem-solving
A process of interactive problem solving and support that occurs in a context of a recognized need for improvement in the implementation of a specific practice and a non-evaluative but informative and supportive interpersonal relationship.
Surface Surface: L & E 12
Peer-Assisted Learning
Pair school personnel together, provide them with a training and a validated rubric to observe one another, and have them schedule a debrief session to share findings.
Addition Strategy added in light of findings indicating impact of pairing/linking school personnel to support implementation. 13
Provide clinical supervision
Provide clinicians with ongoing supervision focusing on the innovation. Provide training for clinical supervisors who will supervise clinicians who provide the innovation.
Provide practice-specific supervision
Provide school personnel with supervision focusing on new practices. Supervisors are in a position of authority and support school personnel who deliver new practices with evaluative feedback via performance assessment. Supervision is typically differentiated from consultation/coaching, which may be provided by an internal or external individual who may or may not have authority over the implementer.
Surface Surface: L, R, T, & E 14
Provide local technical assistance
Develop and use a system to deliver technical assistance focused on implementation issues using local personnel.
Provide local technical assistance
Develop and use a system to deliver technical assistance focused on implementation issues using local personnel.
None 15

Table 3.

Adaptations to Strategies Falling Under Adapt and Tailor to Context

Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Promote adaptability
Identify the ways a clinical innovation can be tailored to meet local needs and clarify which elements of the innovation must be maintained to preserve fidelity.
Promote adaptability
Identify the ways a new practice can be tailored or adapted to best fit the school/classroom context and meet local needs, and clarify which elements of the new practice must be maintained to preserve fidelity.
Surface Surface: R & T 16
Tailor strategies
Tailor the implementation strategies to address barriers and leverage facilitators that were identified through earlier data collection.
Tailor strategies
Tailor the implementation strategies to address barriers and leverage facilitators that were identified through earlier data collection.
None 17
Test-Drive and Select Practices
Support school personnel to try out various practices in small doses and have them choose/select the one they find most acceptable and appropriate.
Addition Strategy added in light of findings indicating importance of allowing implementers to choose/select EBP based experiential preferences. 18
Use data experts
Involve, hire, and/or consult experts to inform management on the use of data generated by implementation efforts.
Use data experts
Involve, hire, and/or consult experts to inform management and use of data generated by implementation efforts.
None 19
Use data warehousing techniques
Integrate clinical records across facilities and organizations to facilitate implementation across systems.
Use data warehousing techniques
Integrate educational and administrative data within and between schools and with outside community organizations to facilitate implementation internally and/or across different schools or service settings.
Surface Surface: R & T 20

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Table 4.

Adaptations to Strategies Falling Under Develop Stakeholder Interrelationships

Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Build a coalition
Recruit and cultivate relationships with partners in the implementation effort.
Build partnerships (i.e., coalitions) to support implementation
Recruit and cultivate relationships with partners external and/or internal to the school who help facilitate the implementation effort.
Surface Surface: L & R 21
Capture and share local knowledge
Capture local knowledge from implementation sites on how implementers and clinicians made something work in their setting and then share it with other sites.
Capture and share local knowledge
Capture local knowledge from other school sites on how school personnel were able to implement the new practice effectively in their setting and then share it with other sites.
Surface Surface: R 22
Conduct local consensus discussions
Include local providers and other stakeholders in discussions that address whether the chosen problem is important and whether the clinical innovation to address it is appropriate.
Conduct local consensus discussions
Include local teachers, staff, and other stakeholders in discussions that address whether the identified problem/need is important and whether the new practices to address the identified problem are appropriate.
Surface Surface: R & T 23
Develop academic partnerships
Partner with a university or academic unit for the purposes of shared training and bringing research skills to an implementation project.
Develop academic partnerships
Partner with a university or academic unit for the purposes of shared training and bringing research skills to an implementation project.
None 24
Develop an implementation glossary
Develop and distribute a list of terms describing the innovation, implementation, and the stakeholders in the organizational change.
Develop an implementation glossary
Develop and distribute a list of terms describing the new practice and its core components, implementation, and the stakeholders who will be involved in implementation effort.
Surface Surface: T 25
Identify and prepare champions
Identify and prepare individuals who dedicate themselves to supporting, marketing, and driving through an implementation, overcoming indifference or resistance that the intervention may provoke in an organization.
Identify and prepare champions
Identify and prepare individuals who dedicate themselves to supporting, marketing, and driving through an implementation, overcoming indifference or resistance that the intervention may provoke in a school or district.
Surface Surface: R 26
Identify early adopters
Identify early adopters at the local site to learn from their experiences with the practice innovation.
Identify early adopters
Identify early adopters within the school or district to learn from their experiences with the implementation of the new practice.
Surface Surface: R & T 27
Inform local opinion leaders
Inform providers identified by colleagues as opinion leaders or ‘educationally influential’ about the clinical innovation in the hopes that they will influence colleagues to adopt it.
Inform local opinion leaders
Inform school personnel identified by colleagues as opinion leaders who are ‘educationally influential’ and can advocate for and support colleagues to adopt and implement new practices.
Surface Surface: R & T 28
Involve executive boards
Involve existing governing structures (e.g., boards of directors, medical staff boards of governance) in the implementation effort, including the review of data on implementation processes.
Involve governing organizations
Involve existing governing structures (e.g., school boards, state-level compliance teams) in the implementation effort, including the review of data on implementation processes.
Surface Surface: L, R, & T 29
Model and simulate change
Model or simulate the change that will be implemented prior to implementation.
Model and simulate change
Model or simulate the change that will be implemented prior to implementation.
None 30
Obtain formal commitments
Obtain written commitments from key partners that state what they will do to implement the innovation.
Obtain formal commitments
Obtain written commitments from key partners that state what they will do to implement new practices.
Surface Surface: T 31
Organize clinician implementation team meetings
Develop and support teams of clinicians who are implementing the innovation and give them protected time to reflect on the implementation effort, share lessons learned, and support one another’s learning.
Organize school personnel implementation team meetings
Develop and support teams of school personnel who are implementing new practices and give them protected time to reflect on the implementation effort, share lessons learned, and support one another’s learning.
Surface Surface: L, R, & T 32
Promote network weaving
Identify and build on existing high quality working relationships and networks within and outside the organization, organizational units, teams, etc. to promote information sharing, collaborative problem-solving, and a shared vision/goal related to implementing the innovation.
Promote network weaving
Identify and build on existing high quality working relationships and networks within and outside the school, organizational units, teams, etc. to integrate and expand social networks and promote information sharing, collaborative problem-solving, and a shared vision/goal related to implementing new practices.
Surface Surface: R & T 33
Recruit, designate, and train for leadership
Recruit, designate, and train leaders for the change effort.
Recruit, designate and train for leadership
Recruit, designate, and train leaders for the change effort so they can effectively engage in leadership behaviors that support others to adopt and deliver the new practice.
Surface Surface: T & E 34
Use advisory boards and workgroups
Create and engage a formal group of multiple kinds of stakeholders to provide input and advice on implementation efforts and to elicit recommendations for improvements.
Use advisory boards and workgroups
Create and engage a formal group of multiple kinds of stakeholders to provide input and advice on implementation efforts and to elicit recommendations for improvements.
None 35
Use an implementation advisor
Seek guidance from experts in implementation.
Deletion Redundant with other ERIC strategies (#s 24, 12, 44, 19)
Visit other sites
Visit sites where a similar implementation effort has been considered successful.
Visit other sites
Visit sites where a similar implementation effort has been considered successful.
None 36

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Table 5.

Adaptations to Strategies Falling Under Train and Educate Stakeholders

Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Conduct educational meetings
Hold meetings targeted toward different stakeholder groups (e.g., providers, administrators, other organizational stakeholders, and community, patient/consumer, and family stakeholders) to teach them about the clinical innovation.
Conduct educational meetings
Hold meetings targeted toward different stakeholder groups (e.g., teachers, principals, central administrators, other organizational stakeholders, and community, and family stakeholders) to teach them about the new practices.
Surface Surface: R & T 37
Conduct educational outreach visits
Have a trained person meet with providers in their practice settings to educate providers about the clinical innovation with the intent of changing the provider’s practice.
Conduct educational outreach visits
Have a trained person (i.e., person who has developed the intervention, received certified training in the practice, and/or extensive experience implementing the practice) meet with school personnel in their practice settings to educate them about new practices with the intent of changing the school personnel’s practice.
Surface Surface: R, T & E 38
Conduct ongoing training
Plan for and conduct training in the clinical innovation in an ongoing way.
Conduct ongoing training
Plan for and conduct training in new practices in an ongoing way.
None 39
Create a learning collaborative
Facilitate the formation of groups of providers or provider organizations and foster a collaborative learning environment to improve implementation of the clinical innovation.
Create a professional learning collaborative
Facilitate the formation of groups of school personnel within or between school systems to foster a collaborative learning environment to improve implementation of new practices.
Surface Surface: L, R, & T 40
Develop educational materials
Develop and format manuals, toolkits, and other supporting materials in ways that make it easier for stakeholders to learn about the innovation and for clinicians to learn how to deliver the clinical innovation.
Develop educational materials
Develop and format manuals, toolkits, and other supporting materials in ways that make it easier for stakeholders to learn about new practices and for school personnel to learn how to deliver the new practices with fidelity.
Surface Surface: R & T 41
Distribute educational materials
Distribute educational materials (including guidelines, manuals and toolkits) in person, by mail, and/or electronically.
Distribute educational materials
Distribute educational materials (including guidelines, manuals and toolkits) in person, by mail, and/or electronically.
None 42
Make training dynamic
Vary the information delivery methods to cater to different learning styles and work contexts, and shape the training in the innovation to be interactive.
Make training dynamic
Vary the information delivery methods to cater to different learning styles, structures for professional development, and shape the training in new practices to be interactive.
Surface Surface: T 43
Provide ongoing consultation
Provide ongoing consultation with one or more experts in the strategies used to support implementing the innovation.
Provide ongoing consultation/coaching
Provide ongoing consultation/coaching with one or more experts in the strategies used to support implementing new practices.
Surface Surface: L & T 44
Shadow other experts
Provide ways for key individuals to directly observe experienced people engage with or use the targeted practice change/innovation.
Shadow other experts
Provide ways for key individuals to directly observe experienced people engage with or use new practices.
Surface Surface: T 45
Use train-the-trainer strategies
Train designated clinicians or organizations to train others in the clinical innovation.
Use train-the-trainer strategies
Train designated school personnel to train others in new practices.
Surface Surface: R & T 46
Work with educational institutions
Encourage educational institutions to train clinicians in the innovation.
Work with educational institutions
Encourage educational institutions to train school personnel in new practices on a pre- and/or in-service basis.
Surface Surface: R, T, & E 47

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Table 6.

Adaptations to Strategies Falling Under Support Clinicians

Support Educators
Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Create new clinical teams
Change who serves on the clinical team, adding different disciplines and different skills to make it more likely that the clinical innovation is delivered (or is more successfully delivered).
Create new practice teams
Change who serves on the team supporting the practice or implementation effort, adding different disciplines (e.g., counselor, school psychologist, behavior specialist, school-based mental health provider) and different skills to make it more likely that the new practices are delivered (or are more successfully delivered).
Surface Surface: L, R, T, & E 48
Develop resource sharing agreements
Develop partnerships with organizations that have resources needed to implement the innovation.
Develop resource sharing agreements
Develop partnerships with organizations that have resources needed to implement new practices.
Surface Surface: T 49
Facilitate relay of clinical data to providers
Provide as close to real-time data as possible about key measures of process/outcomes using integrated modes/channels of communication in a way that promotes use of the targeted innovation.
Facilitate relay of intervention fidelity and student data to school personnel
Provide as close to real-time data as possible about key measures of intervention fidelity and student outcomes using integrated modes/channels of communication (e.g., email, social media, face- to-face notes) in a way that promotes use of the targeted new practices.
Surface Surface: L, R, T, & E 50
Improve implementers’ buy-in
Engage school personnel in activities or discussions that attempt to increase their buy-in and motivation to adopt and use the new practice.
Addition Strategy added due to findings indicating that strategies that increase practitioner buy-in and motivation lead to better implementation outcomes. 51
Pre-correction prior to implementation
Pre-correction is a frontloaded strategy that involves instruction and/or reminders about how to deliver core components of the intervention immediately prior to delivery.
Addition Strategy added in light of findings indicating effects of proactive supports in temporal proximity to the delivery of a practice. 52
Remind clinicians
Develop reminder systems designed to help clinicians to recall information and/or prompt them to use the clinical innovation.
Remind school personnel
Develop reminder systems (e.g., email prompts or visual cues) designed to help school personnel recall information and/or prompt them to deliver core components of new practices.
Surface Surface: L, R, & E 53
Revise professional roles
Shift and revise roles among professionals who provide care, and redesign job characteristics.
Deletion Revising professional roles is inappropriate in the school context
Targeting/improving implementer wellbeing
Supporting school personnel to reduce stress and burnout in order to promote their wellbeing and behavioral intentions to implement new practices.
Addition Strategy added in light of findings indicating stress/burnout among educators as a determinant of adoption and delivery of school-based EBP. 54

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Table 7.

Adaptations to Strategies Falling Under Engage Consumers

Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Increase demand
Attempt to influence the market for the clinical innovation to increase competition intensity and to increase the maturity of the market for the clinical innovation.
Increase demand and expectations for implementation
Attempt to influence the demand and expectations for new practices, relative to other practices, by educating key stakeholders about the new practice and its associated outcomes.
Surface Surface: L & T 55
Intervene with patients/consumers to enhance uptake and adherence
Develop strategies with patients to encourage and problem solve around adherence.
Intervene/communicate with students, families, and other staff to enhance uptake and fidelity
Develop strategies with students, families, and other staff who may not directly be involved in delivering the new practice to encourage and problem solve around intervention adoption and fidelity.
Surface Surface: L, R, & T 56
Involve patients/consumers and family members
Engage or include patients/consumers and families in the implementation effort.
Involve students, family members, and other staff
Engage or include students, families, and other staff in the implementation effort who may not directly be involved in delivering the new practice but are associated with it.
Surface Surface: L, R, & T 57
Prepare patients/consumers to be active participants
Prepare patients/consumers to be active in their care, to ask questions, and specifically to inquire about care guidelines, the evidence behind clinical decisions, or about available evidence-supported treatments.
Prepare families and students to be active participants
Prepare families and/or students to create “pull” (i.e., motivation or pressure to implement) for the delivery of the new practice by asking relevant questions, advocating for the new practice, and inquiring about guidelines for implementation, the evidence and rationale behind decisions, or about other effective new practices that could be implemented.
Surface Surface: L, R, T, & E 58
Use mass media
Use media to reach large numbers of people to spread the word about the clinical innovation.
Use mass media
Use media to reach large numbers of people to spread the word about new practices.
Surface Surface: T 59

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Table 8.

Adaptations to Strategies Falling Under Use Financial Strategies

Original ERIC Strategy and Definition | SISTER Adapted Strategy and Definition | Change Type | Change Details | #
Access new funding
Access new or existing money to facilitate the implementation.
Access new funding
Access new or existing money to facilitate the implementation.
None 60
Alter incentive/allowance structures
Work to incentivize the adoption and implementation of the clinical innovation.
Alter and provide individual- and system-level incentives
Work to provide individual- (e.g., recognition and acknowledgment, gift cards) and/or system-level incentives to districts or schools to participate (e.g., grant money, free training and consultative support) and engage in an implementation effort involving a new practice.
Surface Surface: L, R, T, & E 61
Alter patient/consumer fees
Create fee structures where patients/consumers pay less for preferred treatments (the clinical innovation) and more for less-preferred treatments.
Alter student or school personnel obligations to enhance participation in or delivery of new practice, respectively
Create structures where students or school personnel are relieved of a particular obligation for participating in or delivering more preferred practices/supports (i.e., new practices) than less-preferred practices/supports.
Deep | Surface: L, R, & T; Deep: changes in substance due to the inappropriateness of fees-for-service in the school context | 62
Develop disincentives
Provide financial disincentives for failure to implement or use the clinical innovations.
Develop disincentives
Provide disincentives (e.g., write up in professional file, meeting with the administrator to discuss insufficient implementation, participation in additional professional development) for failure to implement or use the new practices.
Deep | Surface: T & E; Deep: changes in substance due to the inappropriateness of using financial disincentives in the school context | 63
Fund and contract for the clinical innovation
Governments and other payers of services issue requests for proposals to deliver the innovation, use contracting processes to motivate providers to deliver the clinical innovation, and develop new funding formulas that make it more likely that providers will deliver the innovation.
Fund and contract for the new practices
State departments of education, regional educational networks, local school districts, and other payers of services issue requests for proposals to schools to provide resources for them to deliver new practices, use contracting processes to motivate school personnel to deliver new practices, and develop new funding formulas that make it more likely that school personnel will adopt and deliver new practices.
Surface Surface: L, R, & T 64
Make billing easier
Make it easier to bill for the clinical innovation.
Make implementation easier by removing burdensome documentation tasks
Make it easier to implement the new practice by removing or alleviating burdensome tasks or documentation (e.g., completing unnecessary and unused data forms, completing rubrics that are not used to inform decisions, reports, etc.).
Deep | Surface: L, T, & E; Deep: changes in substance due to the irrelevance of billing to most school-based services | 65
Place innovation on fee for service lists/formularies
Work to place the clinical innovation on lists of actions for which providers can be reimbursed (e.g., a drug is placed on a formulary, a procedure is now reimbursable).
Deletion Financial arrangements are inappropriate to the school context
Use capitated payments
Pay providers or care systems a set amount per patient/consumer for delivering clinical care.
Deletion Financial arrangements are inappropriate to the school context
Use other payment schemes
Introduce payment approaches (in a catch-all category).
Deletion Financial arrangements are inappropriate to the school context

Note: L = Label change; R = Referent change; T = Terminology change; E = Example change

Acknowledgments

Funding: This publication was supported in part by funding from the Institute of Education Sciences (R305A160114 PIs - Lyon and Cook; R305A170292 PIs - Cook and Lyon). BJP was supported by K01MH113806, R25MH080916, and UL1TR001111. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or Institute of Education Sciences.

Ethical approval: Exempt status was obtained from the university IRB prior to conducting the study.

Informed consent: No informed consent was necessary as part of this study.

Footnotes

Disclosure of potential conflicts of interest: None of the authors have conflicts of interest to report with regard to this research.

References

  1. Aarons GA, Hurlburt M, & Horwitz SM (2011). Advancing a conceptual model of evidence-based practice implementation in child welfare. Administration and Policy in Mental Health and Mental Health Services Research, 38, 4–23. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Aarons GA, Ehrhart MG, & Farahnak LR (2014). The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implementation Science, 9, 45. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Abrahamson E. 2004. Change Without Pain: How Managers Can Overcome Initiative Overload, Organizational Chaos, and Employee Burnout. Boston, MA: Harvard Business School Press. [Google Scholar]
  4. Barrera M, & González-Castro F. (2006). A heuristic framework for the cultural adaptation of interventions. Clinical Psychology: Science and Practice, 13, 311–316. [Google Scholar]
  5. Beidas RS, & Kendall PC (2010). Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17, 1–30. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Bernal G, Bonilla J, & Bellido C. (1995). Ecological validity and cultural sensitivity for outcome research: issues for the cultural adaptation and development of psychosocial treatments with Hispanics. Journal of Abnormal Child Psychology, 23, 67–82. [DOI] [PubMed] [Google Scholar]
  7. Bernal G, Jimenez-Chafey MI, & Domenech-Rodríguez MM (2009). Cultural adaptation of treatments: a resource for considering culture in evidence-based practice. Professional Psychology Research and Practice, 40, 361–68 [Google Scholar]
  8. Bookstein Abraham (1990), “Informetric distributions, part I: Unified overview. Journal of the American Society for Information Science, 41 (5): 368–375, [Google Scholar]
  9. Boyd MR, Powell BJ, Endicott D, & Lewis CC (2017). A method for tracking implementation strategies: An exemplar implementing measurement-based care in community behavioral health clinics. Behavior Therapy. doi: 10.1016/j.beth.2017.11.012 [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Brackett MA, & Rivers SE (2014). Transforming students’ lives with social and emotional learning. International handbook of emotions in education, 368. [Google Scholar]
  11. Bradshaw CP, Mitchell MM, & Leaf PJ (2010). Examining the effects of schoolwide positive behavioral interventions and supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12(3), 133–148. [Google Scholar]
  12. Bruns EJ, Duong MT, Lyon AR, Pullmann MD, Cook CR, Cheney D, McCauley E. (2016) Fostering SMART partnerships to develop an effective continuum of behavioral health services and supports in schools. American Journal of Orthopsychiatry. 86(2), 156–170. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15(15), 1–12. 
 [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Burnes B. (2004). Emergent change and planned change: competitors or allies? The case of XYZ construction. International Journal of Operations & Production Management, 24, 886–902. [Google Scholar]
  15. Cook BG, & Odom SL (2013) Evidence-based practices and implementation science in special education. Exceptional Children 79, 135–144 [Google Scholar]
  16. Cook CR, Miller F, Fiat A, Renshaw TL, Frye M, & Joseph G. (2017). Promoting secondary teachers’ well-being and intentions to implement evidence-based practices: Randomized evaluation of the ACHIEVER Resilience Curriculum. Psychology in the Schools, 54, 13–28.
  17. Cook CR, Pauling S, McCaslin S, Larson M, Thayer AJ, & Fiat A. (2017). Brief reminders as a low threshold strategy to facilitate teacher delivery of evidence-based classroom practices. Manuscript in preparation.
  18. Damian J, Gallo J, Leaf P, & Mendelson T. (2017). Organizational and provider level factors in implementation of trauma-informed care after a city-wide training: An explanatory mixed methods assessment. BMC Health Services Research, 17, 750.
  19. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 50.
  20. Dart E, Cook CR, Gresham FM, & Chenier J. (2012). Test driving to increase treatment integrity and student outcomes. School Psychology Review, 41, 467–481.
  21. DiGennaro FD, Martens BK, & McIntyre LL (2005). Increasing treatment integrity through negative reinforcement: Effects on teacher and student behavior. School Psychology Review, 34, 220–231.
  22. Dingfelder HE, & Mandell DS (2011). Bridging the research-to-practice gap in autism intervention: An application of diffusion of innovation theory. Journal of Autism and Developmental Disorders, 41, 597–609.
  23. Durlak JA, & DuPre EP (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350.
  24. Ehrhart MG, Aarons GA, & Farahnak LR (2014). Assessing the organizational context for EBP implementation: The development and validity testing of the Implementation Climate Scale (ICS). Implementation Science, 9, 157.
  25. Ehrhart MG, Aarons GA, & Farahnak LR (2015). Going above and beyond for implementation: The development and validity testing of the Implementation Citizenship Behavior Scale (ICBS). Implementation Science, 10, 65.
  26. Farmer EM, Burns BJ, Phillips SD, Angold A, & Costello EJ (2003). Pathways into and through mental health services for children and adolescents. Psychiatric Services, 54(1), 60–66.
  27. Fishbein DH, Ridenour TA, Stahl M, & Sussman S. (2016). The full translational spectrum of prevention science: Facilitating the transfer of knowledge to practices and policies that prevent behavioral health problems. Translational Behavioral Medicine, 6(1), 5–16.
  28. Flottorp SA, Oxman AD, Krause J, et al. (2013). A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implementation Science, 8(35), 1–11.
  29. Gottfredson D, & Gottfredson G. (2002). Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime & Delinquency, 39, 3–35.
  30. Grol R. (2001). Successes and failures in the implementation of evidence-based guidelines for clinical practice. Medical Care, 39, 46–54.
  31. Herlihy B, & Corey G. (2006). Boundary issues in counseling: Multiple roles and responsibilities. Alexandria, VA: American Counseling Association.
  32. Herschell AD, Kolko DJ, Baumann BL, & Davis AC (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30, 448–466.
  33. Horner RH, Sugai G, & Fixsen DL (2017). Implementing effective educational practices at scales of social importance. Clinical Child & Family Psychology Review, 20, 25–35.
  34. Johnson LD, Wehby JH, Symons FJ, Moore TC, Maggin DM, & Sutherland KS (2014). An analysis of preference relative to teacher implementation of intervention. The Journal of Special Education, 48(3), 214–224.
  35. Kane M, & Trochim WMK (2007). Concept mapping for planning and evaluation. Thousand Oaks, CA: Sage Publications.
  36. Kazdin AE (2007). Mediators and mechanisms of change in psychotherapy research. Annual Review of Clinical Psychology, 3, 1–27.
  37. Kendall PC, & Beidas RS (2007). Smoothing the trail for dissemination of evidence-based practices for youth: Flexibility within fidelity. Professional Psychology: Research and Practice, 38, 13–20.
  38. Khamisa N, Peltzer K, & Oldenburg B. (2013). Burnout in relation to specific contributing factors and health outcomes among nurses: A systematic review. International Journal of Environmental Research and Public Health, 10, 2214–2240.
  39. Kohler F, Crilley K, Shearer D, & Good G. (1997). Effects of peer coaching on teacher and student outcomes. Journal of Educational Research, 90, 240–250.
  40. Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, et al. (2014). Identifying determinants of care for tailoring implementation in chronic diseases: An evaluation of different methods. Implementation Science, 9, 102.
  41. Larson M, Cook CR, Fiat A, & Lyon AR (under review). Examining the interplay between teacher work-related stress and evidence-based practice implementation fidelity.
  42. Leong FT (1996). Toward an integrative model of cross-cultural counseling and psychotherapy. Applied and Preventive Psychology, 5, 189–209.
  43. Lewis CC (2017). Systematic review of D&I mechanisms in health. Keynote address at the 4th Biennial Society for Implementation Research Collaboration Conference.
  44. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, Walsh-Bailey C, & Weiner B. (under review). From confusion to causality: Advancing understanding of mechanisms of change in implementation science.
  45. Lyon AR, Lewis CC, Melvin A, Boyd M, Nicodimos S, Liu FF, & Jungbluth N. (2016). Health Information Technologies—Academic and Commercial Evaluation (HIT-ACE) methodology: Description and application to clinical feedback systems. Implementation Science, 11(1), 128.
  46. Lyon AR, Pullmann MD, Walker SC, & D’Angelo G. (2017). Community-sourced intervention programs: Review of submissions in response to a statewide call for “promising practices.” Administration and Policy in Mental Health and Mental Health Services Research, 44(1), 16–28.
  47. Lyon A, Cook CR, Brown E, Locke J, Ehrhart M, Davis C, & Aarons G. (2017). Assessing the organizational implementation context in the education sector: Confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science.
  48. Lyon AR, Comtois KA, Kerns SEU, Landes SJ, & Lewis CC (in press). Closing the science-practice gap in implementation before it widens. In Shlonsky A, Mildon R, & Albers B. (Eds.), The Science of Implementation. Springer.
  49. McHugh RK, & Barlow DH (2010). The dissemination and implementation of evidence-based psychological treatments: A review of current efforts. American Psychologist, 65(2), 73–84.
  50. McKleroy VS, Galbraith JS, Cummings B, Jones P, Harshbarger C, et al. (2006). Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Education & Prevention, 18, 59–73.
  51. Morina N, Koerssen R, & Pollet TV (2016). Interventions for children and adolescents with posttraumatic stress disorder: A meta-analysis of comparative outcome studies. Clinical Psychology Review, 47, 41–54.
  52. Nadeem E, & Ringle V. (2016). De-adoption of an evidence-based trauma intervention in schools: A retrospective report from an urban school district. School Mental Health, 8, 132–143.
  53. Novins DK, Green AE, Legha RK, et al. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52(10), 1009–1025.
  54. Owens J, Lyon AR, Brandt NE, Masia Warner C, Nadeem E, Spiel C, & Wagner M. (2014). Implementation science in school mental health: Key constructs and a proposed research agenda. School Mental Health, 6, 99–111.
  55. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69, 123–157.
  56. Powell BJ, Proctor EK, & Glass JE (2014). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24, 192–212.
  57. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, et al. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, 1–14.
  58. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, & Mandell DS (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44, 177–194.
  59. Proctor EK, Powell BJ, & McMillen JC (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8, 1–11.
  60. Resnicow K, Baranowski T, Ahluwalia J, & Braithwaite R. (1999). Cultural sensitivity in public health: Defined and demystified. Ethnicity and Disease, 9, 10–21.
  61. Ringwalt CL, Vincus A, Ennett S, Johnson R, & Rohrbach LA (2004). Reasons for teachers’ adaptation of substance use prevention curricula in schools with non-white student populations. Prevention Science, 5(1), 61–67.
  62. Russ SJ, Sevdalis N, Moorthy K, et al. (2014). A qualitative evaluation of the barriers and facilitators toward implementation of the WHO surgical safety checklist across hospitals in England: Lessons from the “surgical checklist implementation project”. Annals of Surgery, 261, 81–91.
  63. Sanetti LMH, Knight A, Cochrane W, & Minster M. (2017). Reporting of implementation fidelity of interventions with children in the school psychology literature from 2009 to 2017. Manuscript in preparation.
  64. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, & Carroll KM (2005). We don’t train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting and Clinical Psychology, 73, 106–115.
  65. Stahmer AC, Reed S, & Lee E. (2015). Training teachers to use evidence-based practices for autism: Examining procedural implementation fidelity. Psychology in the Schools, 52, 181–195.
  66. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, & Guihan M. (2006). Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science, 1, 23.
  67. St. Peter Pipkin C, Vollmer TR, & Sloman KN (2010). Effects of treatment integrity failures during differential reinforcement of alternative behavior: A translational model. Journal of Applied Behavior Analysis, 43, 47–70.
  68. Urbach J, Moore BA, Klingner JK, Galman S, Haager D, Brownell MT, & Dingle M. (2015). “That’s my job”: Comparing the beliefs of more and less accomplished special educators related to their roles and responsibilities. Teacher Education and Special Education, 38(4), 323–336.
  69. Vescio V, Ross D, & Adams A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24(1), 80–91.
  70. Walker VL, Chung YC, & Bonnet LK (2017). Function-based intervention in inclusive school settings: A meta-analysis. Journal of Positive Behavior Interventions.
  71. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10(109), 1–8.
  72. Weiner BJ, Lewis CC, Stanick CF, Powell BJ, Dorsey CN, Clary AS, Boynton MH, & Halko H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12(108), 1–12.
  73. Weisz J, & Kazdin A. (Eds.). (2017). Evidence-based psychotherapies for children and adolescents. Guilford Press.
  74. Wensing M, Oxman A, Baker R, et al. (2011). Tailored implementation for chronic diseases (TICD): A project protocol. Implementation Science, 6(103), 1–8.
  75. Wensing M, & Grol R. (2005). Methods to identify implementation problems. In Grol R, Wensing M, & Eccles M. (Eds.), Improving patient care: The implementation of change in clinical practice (pp. 109–120). Edinburgh, Scotland: Elsevier.
  76. Williams NJ (2016). Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Administration and Policy in Mental Health and Mental Health Services Research, 43(5), 783–798.
