Abstract
Background
The implementation strategies used to enhance intervention delivery during efficacy and effectiveness studies are rarely reported. Tracking and reporting implementation strategies during these phases has the potential to improve future research studies and real-world implementation. We present an exemplar of how this might be executed by specifying and reporting the implementation strategies that were used during a school-based efficacy trial, Project POWER, which tested a trauma-informed prevention program delivered by a university research team, community members, and school staff facilitators in 29 schools.
Methods
Following the conclusion of the 4-year trial, core Project POWER research team members identified the implementation strategies that supported intervention delivery during the trial using an established taxonomy of school-based implementation strategies. The actors, actions, action targets, temporality, dose, and implementation outcomes were specified using established implementation strategies reporting guidelines.
Results
The research team identified 37 implementation strategies that were used during the Project POWER trial. Most strategies fell within the categories of Train and Educate Stakeholders, Use Evaluative and Iterative Strategies, and Develop Stakeholder Interrelationships. Actors included members of the research team and partner schools. Strategies were used multiple times during the preparation and implementation phases. Action targets were most often characteristics of individuals, implementation process, and characteristics of the inner setting. Strategies predominantly targeted the implementation outcomes of fidelity, acceptability, feasibility, and adoption.
Conclusions
This study provided evidence that implementation strategies are used and can be identified in efficacy research using a retrospective approach. Identifying and specifying implementation strategies used during the initial phases of the translational research pipeline can inform the implementation strategies that are carried forward, adapted, or discontinued in future trials and routine practice to improve implementation and effectiveness outcomes.
Plain Language Abstract
Intervention development and testing often occur separately from implementation planning. However, evaluating an intervention without considering how it will subsequently be used in real-world settings is a major factor contributing to the research-to-practice gap. During the rigorous testing of interventions, research teams invest significant effort and resources to ensure their program is delivered as intended so that its potential benefits can be assessed. However, the methods or techniques used to support implementation (i.e., implementation strategies) are often neither measured and specified for use and evaluation in later research nor included with the intervention materials distributed to stakeholders; this is a missed opportunity. This study identifies and describes the implementation strategies used during a large school-based research trial of a universal trauma-informed prevention program delivered by a university research team, community members, and school staff. In collaboration with the trial’s research team, we identified 37 implementation strategies that were used during the trial and defined how each strategy was used, including: the actions (i.e., things done), the people who carried out the strategies, the targets of the actions, when and how often during the implementation process the strategies were used, and which implementation outcome(s) each strategy was expected to impact. Explicating implementation strategies during early phases of intervention research in schools can inform which implementation supports to carry forward, adapt, or discontinue in future studies and routine practice.
Keywords: Consolidated Framework for Implementation Research (CFIR), efficacy trial, implementation strategies, schools, SISTER implementation strategy taxonomy
Introduction
In response to the established gap between intervention development and uptake into practice across clinical and public health settings (Ennett et al., 2003; Gottfredson & Gottfredson, 2002; Hicks et al., 2014; Shelton et al., 2018), there has been an increased focus on identifying implementation strategies to support adoption, implementation, and sustainment of evidence-based practices (EBPs) in real-world settings (Powell et al., 2019a). Implementation strategies are the methods or techniques used to enhance the adoption, implementation, and sustainability of a given intervention (Proctor et al., 2013). They include discrete (i.e., single component) and multifaceted strategies targeting implementation factors at multiple levels (Powell et al., 2019a). Existing taxonomies describe over 70 implementation strategies for use in health care (i.e., Expert Recommendations for Implementing Change [ERIC]; Powell et al., 2015; Waltz et al., 2015) and educational settings (i.e., School Implementation Strategies, Translating ERIC Resources [SISTER]; Cook et al., 2019). Established reporting guidelines further inform the operationalization of implementation strategies to support replication (Proctor et al., 2013).
Within the traditional translational research pipeline (efficacy, effectiveness, dissemination and implementation), implementation strategies are emphasized and investigated as part of implementation studies after an intervention has demonstrated efficacy and effectiveness (Brown et al., 2017; Lane-Fall et al., 2019). Existing work that has applied the criteria for specifying and reporting implementation strategies has been exclusively situated within the implementation phase of translational research (e.g., Boyd et al., 2018; Bunger et al., 2017; Huynh et al., 2018; Perry et al., 2019; Rogal et al., 2017). However, limiting implementation strategies research to the latter phases of the translational research pipeline may contribute to delays in interventions achieving public health impact (Rudd et al., 2020). Rigid adherence to the traditional translational research process may perpetuate the research-to-practice gap, particularly when efficacious interventions are later found to be incompatible with real-world service delivery. Scholars increasingly urge that interventions be designed and tested with future dissemination and implementation in mind (Lane-Fall et al., 2019).
Interventions that are developed and tested with later implementation as a priority may be more readily adopted and implemented to scale (Rudd et al., 2020). The goal of clinical efficacy and effectiveness research is to produce an evidence-based intervention that is successfully disseminated to, adopted by, and implemented by stakeholders in real-world settings. However, few published clinical research studies sufficiently report the information needed to subsequently implement interventions (Premachandra & Lewis, 2021); this information is typically not prioritized until implementation trials or hybrid effectiveness-implementation studies (Curran et al., 2012). Identifying implementation outcomes and the implementation strategies used to achieve these outcomes earlier in the translational research pipeline (i.e., during the efficacy phase) is aligned with calls to design interventions for future implementation (Lane-Fall et al., 2019) and may be valuable for enhancing real-world implementation (Arnold et al., 2020; Rudd et al., 2020). Despite their underreporting in the literature, implementation strategies are still often used during efficacy and effectiveness studies to achieve implementation, service, and health outcomes (Curran, 2020; Stevens et al., 2020). For example, it is common practice to track and evaluate the fidelity of program implementation during intervention trials to ensure interventions have been delivered as intended (e.g., Gould et al., 2014). Yet, the extensive resources (i.e., implementation strategies) directed toward achieving intervention fidelity are not explicated as part of standard research trials or in the resulting literature; this is a missed opportunity. Investigating implementation strategies during the efficacy phase of intervention research has potential to improve future effectiveness, hybrid effectiveness-implementation, and implementation trials, especially in educational settings in which school partners may be involved in the implementation process during early phases of research.
Schools are recognized as a prominent mental healthcare system for children in the United States (Duong et al., 2020; Jacob & Coustasse, 2008), particularly children of color and children from families with low income who often have less access to mental health services (Alegria et al., 2010). Young people spend most of their time in school, and mental health interventions can be integrated within school curricula or made available to identified students with particular challenges (Atkins et al., 2010; Domitrovich et al., 2010; Masten, 2003). Educational settings—characterized by principals’ organizational leadership, professionals in varied roles working to achieve common or related goals, and a unique calendar that influences all aspects of service delivery—are rich contexts in which to conduct implementation research (Owens et al., 2014). Fidelity is the most commonly assessed and reported aspect of implementation in school-based mental health research (Rojas-Andrade & Bahamondes, 2019). Some researchers have evaluated school-based mental health intervention adoption (Arnold et al., 2020), teacher-based program delivery (Franklin et al., 2012; Han & Weiss, 2005), and sustainability (Arnold et al., 2021; Herlitz et al., 2020), as well as characterized facilitators and barriers to successful implementation (Beidas et al., 2012; Eiraldi et al., 2015; Locke et al., 2017; Powell et al., 2019b).
Implementation scientists have advocated for an increased focus on identifying and testing implementation strategies in settings where mental health services are delivered to children (Novins et al., 2013; Powell et al., 2014). Across mental health service contexts, however, few studies have assessed or reported the extent to which the range of implementation strategies identified in common implementation taxonomies, such as ERIC and SISTER, is employed. None of this research has been conducted within the context of an efficacy study or in school settings (e.g., Boyd et al., 2018; Bunger et al., 2017). Identifying the implementation strategies used within school-based intervention research is imperative, however, to realize recommendations to tailor implementation strategies to their intended contexts and thereby advance implementation, service, and child outcomes (Powell et al., 2017; Boyd et al., 2018).
Current study
Recognizing the opportunities present in educational settings to implement EBPs that prevent and address youth mental health needs (Lyon & Bruns, 2019) and to model designing for implementation during early phases of the translational research pipeline (Lane-Fall et al., 2019), this study specifies and reports the implementation strategies used during the testing of a universal trauma-informed prevention program for middle school youth. This study employed the SISTER implementation strategy taxonomy (Cook et al., 2019) and Proctor and colleagues’ (2013) reporting guidelines to identify, describe, and operationalize the implementation strategies used during the school-based Project POWER efficacy trial (Mendelson et al., 2020). This study provides a unique examination of the use of implementation strategies during school-based efficacy research, with a goal of informing recommendations for investigating and reporting implementation strategies during the initial phases of the translational research process.
Method
Study context
Project POWER (Promoting Options for Wellness and Emotion Regulation) was a randomized controlled school-based prevention trial conducted in 29 Baltimore City public schools (Mendelson et al., 2020). The goal of the trial was to test the effect of a 12-session universal trauma-informed group intervention (RAP Club) compared to a health education active control group (Healthy Topics) on the mental health and academic outcomes of 8th-grade students who volunteered to participate. Students were randomized within schools to receive RAP Club or Healthy Topics, with approximately 10–15 students assigned to each group; approximately 600 8th graders were enrolled across all schools.
RAP Club was adapted as a school-based prevention program from Structured Psychotherapy for Adolescents Responding to Chronic Stress (SPARCS; DeRosa et al., 2006; DeRosa & Pelcovitz, 2009), an evidence-based group trauma treatment. The core components of SPARCS and RAP Club are evidence-based mindfulness and cognitive behavioral therapy strategies, augmented by psychoeducation about the effects of stress and trauma. RAP Club was adapted to have a prevention rather than treatment focus and included young adult community members as program cofacilitators (i.e., “mentors”) to enhance trust and buy-in from participants. Pilot research conducted in two Baltimore City Public Schools supported RAP Club’s feasibility, acceptability, and potential benefits (Mendelson et al., 2015).
Although the research team delivered both programs, a unique feature of the Project POWER trial was that the team engaged school stakeholders in training, program delivery, and supervision to build the school’s capacity to continue delivering the programs after study participation ended. The research team partnered with each participating school for one year and worked with 7–9 schools each year for four academic years (2016–2017 to 2019–2020). School mental health personnel (e.g., psychologists, social workers, or counselors) and/or teachers who were selected by the principal received training in the RAP Club curriculum immediately prior to the start of the school year. They attended and assisted with RAP Club sessions at their school and were invited to join weekly phone supervision sessions with the group leaders and project staff. Teachers with interest or expertise in health (e.g., health or physical education teachers) received training in the Healthy Topics curriculum and were engaged in the same manner with program delivery and supervision as the RAP Club trainees. The role of school staff during the intervention trial was to observe the modeling of program delivery by research staff and participate in weekly supervision. Throughout this article we refer to partnering school staff as “cofacilitators” in recognition of Project POWER’s goal to equip these stakeholders with knowledge and skills to support their continued use of the RAP Club and Healthy Topics interventions following the trial; the amount of cofacilitation varied across school staff members.
Data collection procedures
The Project POWER trial was approved by the Institutional Review Board (IRB) at Johns Hopkins University. Procedures for this study, which involved discussions with team members and review of study documents, were executed within the parent IRB as part of ongoing research team operations. Data for this study were obtained through meetings with 10 Project POWER trial research team members, each with varied years of experience with the trial. Four team members were involved in the trial from project initiation (2016–2020), three for the final two years of the trial (2018–2020), and three for the final year of the trial (2019–2020). Team members represented multiple roles, including principal investigator, project scientist, project coordinator, research assistant or associate, intervention group leader or mentor, and data manager. The number of team members present at these meetings fluctuated between four and ten, depending on their availability and expertise regarding implementation across school sites. The first authors (SM, KA) of this study participated in the Project POWER trial as a Healthy Topics intervention group leader (SM) and a RAP Club mentor (KA) for two years and one year, respectively. Our knowledge of the trial and each intervention’s delivery enhanced our understanding and coding of the research team’s data.
We used a group consensus building process following the conclusion of the Project POWER trial. The final cohort of Project POWER schools completed the implementation of RAP Club and Healthy Topics in November 2019. In December 2019, two weeks prior to the first meeting with the trial’s research team, the first authors distributed the SISTER taxonomy (Cook et al., 2019) to senior research team members (n = 6) along with the definition of each school implementation strategy and ERIC ancillary material to reference (Powell et al., 2015). Each team member was asked to record the implementation strategies that they thought were used during the trial and to come to the team meeting prepared to discuss the strategies with other team members.
From mid-December 2019 to mid-February 2020, the first authors met five times with the Project POWER research team members to name, define, and operationalize the implementation strategies that were used during the Project POWER trial. The first two sessions were focused on naming and defining the implementation strategies that were used based on the SISTER taxonomy. The name and definition of each of the 75 strategies from the SISTER taxonomy were presented individually to team members. Individuals endorsed whether the strategy was used during the trial, and group consensus regarding strategy use was reached through moderated discussion facilitated by the first authors. Team members discussed the activities performed during the trial; once a majority of team members agreed that those activities were congruent with the target strategy’s definition, the strategy was recorded as having been used during the trial.
The three subsequent meetings with the trial’s research team were focused on operationalizing the identified strategies using the implementation strategy reporting guidelines developed by Proctor and colleagues (2013). The first authors recorded notes from each meeting into an Excel spreadsheet containing the name and definition of each identified SISTER strategy and columns for the seven implementation strategy reporting domains (i.e., actor, action, action target, temporality, implementation outcome, dose, justification; Proctor et al., 2013). The database was populated during group discussions and displayed in real time for team members to view and correct for accuracy.
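To illustrate how such a database can be structured, the sketch below shows one possible layout for the specification spreadsheet, with columns for the strategy name, its SISTER definition, and the seven reporting domains. It is a minimal sketch assuming a Python/pandas workflow; the example row, column labels, and file name are hypothetical and are not drawn from the actual Project POWER database.

```python
import pandas as pd

# Illustrative (hypothetical) template for the strategy-specification spreadsheet
# described above. Column names mirror the seven Proctor et al. (2013) reporting
# domains; the example row and file name are assumptions for demonstration only.
REPORTING_DOMAINS = [
    "actor", "action", "action_target", "temporality",
    "dose", "implementation_outcome", "justification",
]
COLUMNS = ["sister_strategy", "sister_definition"] + REPORTING_DOMAINS

example_row = {
    "sister_strategy": "Provide ongoing consultation/coaching",
    "sister_definition": "Provide ongoing consultation/coaching with one or more experts...",
    "actor": "Group leaders; project coordinator",
    "action": "Weekly group supervision calls; individual check-ins with cofacilitators",
    "action_target": "Knowledge and beliefs about the intervention (CFIR: characteristics of individuals)",
    "temporality": "Implementation phase",
    "dose": "Weekly, multiple times per year",
    "implementation_outcome": "Fidelity; acceptability",
    "justification": "",  # not recoverable in a retrospective review
}

template = pd.DataFrame([example_row], columns=COLUMNS)
template.to_csv("strategy_specification_template.csv", index=False)  # hypothetical file name
print(template.T)  # transposed view: one strategy per column, easier to read in a console
```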
Each meeting lasted for 1 h, except for one meeting that lasted for 2 h (6 h total). The resulting implementation strategies and operational definitions were reviewed by all ten research team members prior to data analysis. Finally, we reviewed the Project POWER grant proposal to determine whether any of the implementation strategies that were identified by the research team were also outlined in the project proposal.
Data analysis
The first authors individually cleaned and summarized the data collected from the trial’s research team for each identified implementation strategy and determined a coding scheme for the action target, temporality, and implementation outcome domains. We coded each strategy’s conceptual action target using constructs from the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009). The first authors’ training in implementation determinant frameworks and experience with Project POWER implementation were used during the coding process to link each strategy to conceptual targets within the five CFIR domains. We coded temporality using the four established stages of implementation (i.e., exploration, preparation, implementation, sustainment) and developed final codes via consensus for each domain. Data were recoded as needed throughout the iterative coding process. The resulting data file, including all codes, was sent to all research team members to review and verify for accuracy and completeness before any further analysis; no team members disputed the accuracy of the data or codes. We summarized each strategy’s actions but could not assess the frequency with which each occurred. Descriptive statistics were used to explore and describe the identified SISTER strategies’ actors, action targets, temporality, implementation outcomes, and dose. Identified strategies were also categorized into one of four categories based on the importance (i.e., the impact of the strategy and how critical it is for implementation) and feasibility (i.e., how practical the strategy is to use in supporting implementation) ratings reported by school implementation leaders in Lyon et al.’s (2019) study: both important and feasible, important but not feasible, feasible but not important, and neither feasible nor important.
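As a concrete illustration of these descriptive analyses, the sketch below (assuming a Python/pandas workflow with hypothetical records and ratings) shows how CFIR action-target frequencies and the Lyon et al. (2019) importance-by-feasibility categories might be tabulated.

```python
import pandas as pd

# Minimal sketch (with hypothetical records) of the descriptive analyses described
# above: tabulating CFIR action-target domains across strategies and assigning each
# strategy to a Lyon et al. (2019) importance-by-feasibility category.
strategies = pd.DataFrame([
    {"strategy": "Provide ongoing consultation/coaching",
     "cfir_targets": ["characteristics of individuals", "implementation process"],
     "important": True, "feasible": True},
    {"strategy": "Obtain formal commitments",
     "cfir_targets": ["inner setting"],
     "important": False, "feasible": False},
])

# Frequency of CFIR action-target domains across all coded strategies.
target_counts = strategies.explode("cfir_targets")["cfir_targets"].value_counts()

def lyon_category(row: pd.Series) -> str:
    """Map importance/feasibility ratings to one of the four Lyon et al. categories."""
    if row["important"] and row["feasible"]:
        return "important and feasible"
    if row["important"]:
        return "important but not feasible"
    if row["feasible"]:
        return "feasible but not important"
    return "neither important nor feasible"

strategies["lyon_category"] = strategies.apply(lyon_category, axis=1)

print(target_counts)
print(strategies["lyon_category"].value_counts(normalize=True))
```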
Results
Congruent with Hooley et al.’s (2020) recommendations, an implementation strategy description table that includes each implementation strategy identified in this study and its operational definition as specified using the Proctor et al. (2013) categories is included as Supplemental File 1. Data and codes generated from this table were used in this study’s analyses.
Summary of school implementation strategies used during the Project POWER trial
The Project POWER research team reported that 37 of the 75 SISTER strategies were used during the four-year trial (Table 1). Most of the employed strategies were in the SISTER categories of Train and Educate Stakeholders (n = 7), Use Evaluative and Iterative Strategies (n = 7), and Develop Stakeholder Interrelationships (n = 5). Four strategies each were within the Adapt and Tailor to Context and Support Educators categories. The categories of Change Infrastructure, Engage Consumers, and Use Financial Strategies each had three strategies, and only one strategy from the Provide Interactive Assistance category was identified.
Table 1.
Implementation strategies used during Project POWER efficacy trial.
| SISTER category/strategy | SISTER definition (Cook et al., 2019) | Action |
| Train and educate stakeholders (n = 7) | ||
| Conduct educational meetings | Hold meetings targeted toward different stakeholder groups (e.g., teachers, principals, central administrators, other organizational stakeholders and community, and family stakeholders) to teach them about the new practices. | Back to school nights and classroom visits during recruitment used to educate families and students about intervention; kickoff meetings used to educate principals and administrators about intervention. |
| Create a professional learning collaborativea,b,c | Facilitate the formation of groups of school personnel within or between school systems to foster a collaborative learning environment to improve implementation of new practices. | In years (Y) 1–3, weekly phone supervision with research team’s group leaders and mentors and school staff across school sites; calls replaced with individual contacts with school staff in Y4 to increase participation. Weekly meetings with community mentors and intervention group leaders to discuss program implementation within and across schools. |
| Develop educational materialsa | Develop and format manuals, toolkits, and other supporting materials in ways that make it easier for stakeholders to learn about new practices and for school personnel to learn how to deliver the new practices with fidelity. | Developed culturally appropriate training and intervention materials, including facilitator manuals (i.e., intervention curriculum) to guide implementation. Developed student activity booklets and 1-page parent/student project-information sheet. |
| Distribute educational materialsa | Distribute educational materials (including guidelines, manuals, and toolkits) in person, by mail, and/or electronically. | Group leaders, mentors, and school staff received a program facilitators’ manual. Parents/students were given information about intervention during recruitment. Students were given binder of resources, handouts, and homework during intervention. |
| Make training dynamic | Vary the information delivery methods to cater to different learning styles, structures for professional development, and shape the training in new practices to be interactive. | Training included mixture of lecture, role-play, discussion, and activities and exercises to increase participation and sustain attention. |
| Provide ongoing consultation/coachinga,b | Provide ongoing consultation/coaching with one or more experts in the strategies used to support implementing new practices. | Weekly group supervision with group leaders, mentors, and school cofacilitators (by phone) and individual check-ins with group leaders and school cofacilitators (Y1-3). Changed to 1-1 meetings (Y4). Program developers available for consultation during implementation. |
| Shadow other expertsa | Provide ways for key individuals to directly observe experienced people engage with or use new practices. | School cofacilitators shadowed/observed group leaders model program delivery at each session to learn program content. Community mentors shadowed group leaders to learn from their intervention delivery style. |
| Use evaluative and iterative strategies (n = 7) | ||
| Assess for readiness and identify barriers and facilitatorsb | Assess various aspects of the school context to determine the degree to which it and the school personnel within it are ready to implement, barriers that may impede implementation, and strengths or facilitators (such as, coaches, professional learning communities, whole staff training) that can be used/leveraged in the implementation effort. | Identified readiness/barriers through series of meetings: Initial meetings with principals to determine school’s fit for the project (Y1-4); meetings with school personnel to be trained in program delivery (Y1-4); and meetings with teachers who would provide ratings on student participants (Y1-3 only). During meetings, school staff were asked what would help them get ready to implement the program, anticipated barriers, and how the research team could address barriers to implementation. School partnerships discontinued prior to program delivery if significant barriers were uncovered. |
| Audit and provide feedbacka,b | Collect and summarize data regarding implementation of the new program or practice over a specified time period and give it to administrators and school personnel to monitor, evaluate, and support implementer behavior. | Implementation fidelity and student and teacher (Y4) attendance and engagement data collected for each session. Data used to inform intervention delivery, supervision, and communication with school stakeholders. |
| Develop a detailed implementation plan or blueprinta | Develop a detailed implementation plan or blueprint that includes the intended goals/outcomes to be achieved via the implementation effort as well as the process and strategies that will be used to achieve those goals. The blueprint should include (1) aim/purpose of the implementation, (2) scope of the change (e.g., who and what settings will be affected), (3) goals/outcomes to be achieved, (4) time frame and milestones, (5) appropriate performance/progress measures, and (6) specific strategies that will be used to attain goals/outcomes. Use and update these plans to guide the implementation effort over time. | Developed two grant proposals (IES, NICHD) with detailed implementation plan/blueprint, which guided study implementation after funding. Plans were updated as needed during implementation. |
| Develop and organize quality monitoring systema,b | Develop and organize systems and procedures that monitor implementation and/or student outcomes for the purpose of quality assurance and improvement. | Program implementation quality assessed for each session by group leaders rating session fidelity, student attendance, and student engagement and by videotaping sessions for observational coding. Recorded data were discussed during group supervision and team meetings with group leaders and mentors to address program implementation challenges. |
| Develop instruments to monitor and evaluate core components of the innovation/new practicea | Develop, validate, and integrate measurement instruments or tools to monitor and evaluate the extent to which school personnel are implementing the core components of the intervention (i.e., with fidelity). | Fidelity ratings for each session were developed for group leaders to monitor implementation adherence and quality. Sessions also videotaped for observational coding of fidelity and to verify accuracy of group leaders’ ratings. |
| Monitor the progress of the implementation efforta | Monitor the progress of key implementation outcomes (fidelity, reach of the intervention, acceptability) and adjust practices and implementation strategies as needed to continuously improve the quality of delivery. | Fidelity of implementation for program delivery monitored using self-report and observational methods. Intervention dosage monitored via attendance logs. Responsiveness measured with facilitator ratings. Group leader logs informed supervision discussions and adjustments to implementation as needed. |
| Obtain and use student and family feedbacka | Develop strategies to increase student and family feedback on the implementation effort. | Student focus groups in Y1 and Y3 assessed their experiences of the program. Facilitators assessed retention of program material each session to guide implementation. Post-test surveys assessed student satisfaction with program. |
| Develop stakeholder interrelationships (n = 5) | ||
| Build partnerships (i.e., coalitions) to support implementationa,b | Recruit and cultivate relationships with partners external and/or internal to the school who help facilitate the implementation effort. | Research team partnered with two local youth-serving organizations to recruit young adult mentors to serve as program cofacilitators, and with schools as sites for program delivery. Team members partnered with school staff to assist with implementation and build their capacity for sustainable implementation. |
| Capture and share local knowledgea,c | Capture local knowledge from other school sites on how school personnel were able to implement the new practice effectively in their setting and then share it with other sites. | Group leader meetings, mentor meetings, and group supervision used to share implementation challenges, successes, and program activity adaptations for use across different project schools. |
| Identify and prepare championsa,c | Identify and prepare individuals who dedicate themselves to supporting, marketing, and driving through an implementation, overcoming indifference or resistance that the intervention may provoke in a school or district. | Principals identified school liaisons to assist with implementation and school staff to be trained in the intervention. Research team members engaged and further prepared some of these school staff to be champions in separate meetings. |
| Obtain formal commitmentsa,b | Obtain written commitments from key partners that state what they will do to implement new practices. | Commitments and letters of support obtained from school district and principals (Y1-4). School personnel signed a contract outlining their role/responsibilities as program cofacilitators during first meeting with group leaders (Y4 only). |
| Recruit, designate, and train for leadershipa,c | Recruit, designate, and train leaders for the change effort so they can effectively engage in leadership behaviors that support others to adopt and deliver the new practice. | Individuals designated/trained to implement the program include research staff (group leaders), young people from the community (mentors), and school staff (cofacilitators). Group leaders delivered interventions, monitored implementation, and served as liaison between the larger research team and school partners. Mentors supported intervention implementation and youth engagement. School cofacilitators were designated by the principal as building leaders—they supported implementation and student recruitment/ engagement. They received training in the intervention alongside mentors and group leaders with a goal of also providing leadership in future delivery of groups. |
| Adapt and tailor to context (n = 4) | ||
| Promote adaptability | Identify the ways a new practice can be tailored or adapted to best fit with the school/classroom context, meet local needs, and clarify which elements of the new practice must be maintained to preserve fidelity. | Research team members made minor adaptations to program activities each year based on feedback from students, group leaders, and school staff without altering core program content. Length of intervention sessions adjusted to fit each school’s schedule. |
| Tailor strategiesb | Tailor the implementation strategies to address barriers and leverage facilitators that were identified through earlier data collection | Tailoring of strategies included: a) augmenting quality monitoring system—began tracking school staff program attendance/participation on intervention session logs (added Y4; due to research-team reports of variable staff participation that could influence student participation); b) altering cofacilitator incentives—linked school staff incentives to their session attendance/participation (Y4); c) changing supervision model for school staff from group calls (Y1-3) to one-on-one (Y4) to address barriers to their participation; and d) modifying student incentives (Y2 and Y3) in response to student feedback and providing incentives for session attendance (began Y3). |
| Test-drive and select practicesc | Support school personnel to try out various practices in small doses and have them choose/select the one they find most acceptable and appropriate. | School staff cofacilitators were encouraged to select preferred intervention activities/skills to teach in session and practice with students. |
| Use data expertsa | Involve, hire, and/or consult experts to inform management and use of data generated by implementation efforts. | Data manager hired to assist with project data; coinvestigator experts guided quantitative and qualitative data collection and analyses. |
| Support educators (n = 4) | ||
| Develop resource sharing agreementsa | Develop partnerships with organizations that have resources needed to implement new practices. | Letters of support provided by school district and participating schools. Research team provided group leaders, trainings, and materials for the intervention; school partners provided staff to be trained, programming space, and access to students and families. Principal investigator contracted with intervention developer to lead trainings and partnered with two community organizations to recruit young adults as mentors. |
| Improve implementers’ buy-ina | Engage school personnel in activities or discussions that attempt to increase their buy-in and motivation to adopt and use the new practice. | Attempted to get buy-in from principals via initial meetings and from school staff via initial meetings, program trainings, and ongoing communication (e.g., discussing benefits of intervention for students, developing knowledge and skills to implement, soliciting input on program topics, connecting curriculum to student strengths). Incentives for participation offered to school staff to increase their buy-in. |
| Precorrection prior to implementation | Precorrection is a frontloaded strategy that involves instruction and/or reminders about how to deliver core components of the intervention immediately prior to delivery. | Group leaders communicated with mentors and school staff cofacilitators after each group or before the next session to plan program delivery for the next session, including review of core content and learning objectives. |
| Remind school personnel | Develop reminder systems (e.g., email prompts or visual cues) designed to help school personnel recall information and/or prompt them to deliver core components of new practices. | Email, texts, and phone calls used to remind school staff cofacilitators and administrators about project components, such as consent visits, assessments, program sessions, and data collection. |
| Change infrastructure (n = 3) | ||
| Change record systemsb | Change data collection systems to allow better assessment of implementation or relevant outcomes. | Fidelity logs changed to record school staff participation/engagement (Y4); school staff given new option to complete baseline and outcome ratings of students online (starting Spring Y3); two measures added to student survey (Y4). |
| Change/alter environmenta,c | Evaluate current environment and, as needed, alter or change aspects of it (e.g., changing the layout of a classroom, master scheduling, repurposing space) to best accommodate new practices. | Schools adjusted their scheduling and space (e.g., classroom configuration, offered intervention during resource time in school schedule) to accommodate program delivery to participating 8th graders. Program timing and space also adjusted as needed in response to holidays and school activities. |
| Mandate for changec | Have leadership declare the priority of new practices (i.e., top down) and their determination to have it implemented. | Principals identified participating school staff and required and supported their participation in the project. Principals also helped communicate the importance of project/programming to students and parents. |
| Engage consumers (n = 3) | ||
| Increase demand and expectations for implementationa,b | Attempt to influence the demand and expectations for new practices, relative to other practices, by educating key stakeholders about the new practice and its associated outcomes. | To increase demand for intervention, potential program benefits for students were highlighted to principals by research staff and to school staff cofacilitators, group leaders, and mentors by trainers. As implementation occurred, school stakeholders were encouraged by group leaders to play an increasing role in implementation by identifying portions of lessons they could lead. School cofacilitator incentives tied to participation in program sessions in Y4. |
| Involve students, family members, and other staffa,c | Engage or include students, families, and other staff in the implementation effort who may not directly be involved in delivering the new practice but are associated with it. | Research team engaged school administrators in implementation planning and visited classrooms, assemblies, and “back to school nights” to encourage student participation. Some teachers helped with student recruitment and assessments, others completed ratings on student participants. Students participated in program sessions and helped lead some activities. |
| Prepare families and students to be active participantsa | Prepare families and/or students to create “pull” (i.e., motivation or pressure to implement) for the delivery of the new practice by asking relevant questions, advocating for the new practice, and inquiring about guidelines for implementation, the evidence and rationale behind decisions, or about other effective new practices that could be implemented. | During recruitment, research team members visited classrooms and assemblies to motivate student interest and enrollment; they also attended “back to school nights” and other school events, and phoned parents to build parent interest. |
| Use financial strategies (n = 3) | ||
| Access new fundinga | Access new or existing money to facilitate the implementation. | Initial grant proposal was funded by IES to cover costs of program implementation and research; subsequent grant proposal submitted to, and funded by, NICHD to extend scope of original trial. |
| Alter and provide individual- and system-level incentivesa,b | Work to provide individual- (e.g., recognition and acknowledge, gift card) and/or system-level incentives to districts or schools to participate (e.g., grant money, free training, and consultative support) and engage in an implementation effort involving a new practice. | Incentives for project participation included gift cards, refreshments, and small prizes for students and gift cards or checks for school staff. Schools received free capacity building resources (e.g., training and consultative support for school personnel) to support sustainment. |
| Alter student or school personnel obligations to enhance participation in or delivery of new practice, respectivelya,c | Create structures where students or school personnel are relieved of a particular obligation for participating in or delivering more preferred practices/supports (i.e., new practices) than less-preferred practices/supports. | School administrators approved changes to students’ schedules to allow them to attend program sessions and approved staff participation in sessions. When possible, also advocated for protection from conflicting responsibilities during program sessions. |
| Provide interactive assistance (n = 1) | ||
| Facilitation/problem-solvinga | A process of interactive problem-solving and support that occurs in a context of a recognized need for improvement in the implementation of a specific practice and a nonevaluative but informative and supportive interpersonal relationship. | Research team members and group leaders communicated with school staff cofacilitators and administrators to address scheduling or attendance conflicts. Supervision calls, team meetings, and/or one-on-one conversations with group leaders addressed fidelity challenges. |
Note. IES = Institute of Education Sciences; NICHD = National Institute of Child Health and Human Development; SISTER = School Implementation Strategies, Translating ERIC Resources.
a Implementation strategy was included in the Project POWER grant proposal.
b Implementation strategy changed during the Project POWER trial.
c Implementation strategy actor(s) included school partners.
Of these 37 implementation strategies, 27 (73%) were referenced in the trial’s funded grant proposal. Eleven (29.7%) strategies’ actions were reported to have changed during the trial (see Table 1). The nature of these changes was largely to improve implementation strategy delivery (e.g., moved from group-based to individualized consultation to accommodate school partner schedules). In describing why implementation strategies were changed during the trial, most team members noted reasons consistent with increasing stakeholder engagement in either the implementation strategy itself (e.g., ongoing consultation) or in the intervention delivery (e.g., provided smaller incentives more frequently, rather than a large incentive at the end of the intervention, to increase demand and expectations for implementation). Some modifications to implementation strategy use were related to preserving the fidelity of intervention delivery (e.g., obtained formal commitments from school staff in year 4 of the trial using a contract) or integrity of fidelity data needed to evaluate the intervention at the conclusion of the trial (e.g., changed fidelity logs to capture cofacilitator attendance and participation in intervention sessions).
When comparing the strategies that were used during the Project POWER efficacy trial to school implementation leaders’ ratings of SISTER strategies’ feasibility and importance, most (57%) strategies used during the present trial were within the important and feasible category in the Lyon et al. (2019) study. However, several strategies used during our trial were in the categories of important but not feasible (19%; e.g., access new funding, alter student or school personnel obligations, improve implementers’ buy-in, precorrection prior to implementation) or neither important nor feasible (19%; e.g., alter and provide individual- and system-level incentives, change record systems, obtain formal commitments, test-drive and select practices). Few strategies in the feasible but not important category (5%; i.e., remind school personnel, tailor strategies) were used during the trial.
Actors
Implementation strategy actors during the Project POWER trial included the research team, principal investigator, coinvestigators, community partners, intervention developers/trainers, project coordinator, data manager, intervention group leaders, community mentors, school principals, and school cofacilitators (see Table 2). Across employed implementation strategies, 90% of reported actors were members of the research team. Trial-employed intervention group leaders or a collective of research team members were the most frequent types of actors.
Table 2.
Actors from research team and partner schools.
| Actor | Frequency | Percentage |
| Research Team | 18 | 20.5% |
| Principal Investigator | 12 | 13.6% |
| Coinvestigator | 4 | 4.6% |
| Community Partner | 1 | 1.1% |
| Intervention Developer/Trainer | 8 | 9.1% |
| Project Coordinator | 11 | 12.5% |
| Data Manager | 1 | 1.1% |
| Group Leader | 18 | 20.5% |
| Mentor | 6 | 6.8% |
| School Principal | 4 | 4.6% |
| Cofacilitator | 5 | 5.7% |
| Total Research Team Actors | 79 | 89.8% |
| Total School Partner Actors | 9 | 10.2% |
| Both Research Team and School Partner Actors | 7 | 18.9% |
Of the 37 identified implementation strategies, nine included school partner actors—two strategies (i.e., mandate for change and alter student or school personnel obligations) were enacted only by school partners, and seven (19%) involved actors from both the research team and partner schools. These implementation strategies spanned six of the nine SISTER categories (i.e., Adapt and Tailor to Context, Change Infrastructure, Develop Stakeholder Interrelationships, Engage Consumers, Train and Educate Stakeholders, Use Financial Strategies; see Table 1). School partner actors included school administrators and school staff cofacilitators.
Actions
Actions comprising each implementation strategy used during the trial are thoroughly described in Supplementary File 1 and summarized in Table 1. Most strategies were composed of several actions (see Table 1). For example, increasing demand and expectations for implementation involved: (a) during recruitment and training, informing school principals, school staff cofacilitators, and research team-employed group leaders about the interventions’ structure and potential benefits for their students (based on data from pilot trials); (b) during intervention implementation, having group leaders review implementation plans for each lesson with school staff cofacilitators and identify concrete portions of the lesson that the school cofacilitator would lead; and (c) during the final year of the trial, developing written expectations for participation in intervention sessions and tying participation to financial incentives to increase school staff cofacilitator engagement in leading intervention groups.
Action target and dose
An average of 2.81 conceptual action targets was identified per implementation strategy. The most common action targets were in the CFIR determinant domains of characteristics of individuals (38.5%; e.g., knowledge and beliefs about the intervention), followed by implementation process (36.5%; e.g., engaging key stakeholders and formally appointed implementation leaders), and characteristics of the inner setting (21.2%; e.g., available resources). Only 3.8% of strategies targeted characteristics of the intervention (e.g., adaptability), and none targeted the outer setting (see Supplementary Tables 1 and 2).
To specify dose, we examined whether each implementation strategy was used once or multiple times during each year of intervention implementation during the trial. Most implementation strategies were used multiple times (n = 24), with 13 strategies used only once (see Supplementary File 1). Of those strategies used multiple times, four were used during each intervention group session (i.e., capture and share local knowledge, monitor the progress of the implementation effort, precorrection prior to implementation, and shadow other experts), and three were used weekly (i.e., audit and provide feedback, capture and share local knowledge, provide ongoing consultation/coaching). We were unable to more precisely quantify the frequency of strategies used more than once. Refer to Supplementary File 1 and Supplementary Tables 1 and 2 for more detailed results for these domains.
Implementation outcome
The research team reported that the implementation strategies used during the Project POWER trial were most likely to impact the implementation outcomes of fidelity (64.9%), acceptability (54.1%), feasibility (29.7%), and adoption (27.0%). Fewer implementation strategies were reported as likely to impact sustainability (16.2%), penetration (5.4%), or cost (2.7%), and none were indicated as likely to impact appropriateness. Overall, implementation strategies were reported as likely to affect an average of two implementation outcomes.
Temporality
Figure 1 summarizes implementation strategies used by stage of implementation (i.e., exploration, preparation, implementation, sustainment). Most implementation strategies were used during the implementation (83.7%) and preparation (54.1%) phases. Fourteen (37.8%) strategies were used during both the preparation and implementation phases, 6 (16.2%) only during the preparation phase, and 17 (45.9%) during only the implementation phase. Strategies were not identified for the exploration or sustainment phases.
Figure 1.
Implementation strategies used during each stage of implementation of the Project POWER trial. Strategy names are from the School Implementation Strategies, Translating ERIC Resources (SISTER) taxonomy.
Discussion
Implementation strategies are traditionally emphasized in the latter phases of translational research (Brown et al., 2017) and are rarely measured or reported by clinical (i.e., efficacy and effectiveness) researchers (Rudd et al., 2020). This study illustrated that implementation strategies are used and can be identified in efficacy research. We found that 37 implementation strategies were used during the Project POWER efficacy trial of a universal, trauma-informed prevention program for middle school youth. Implementation strategies were identified using the school-adapted, SISTER implementation strategy taxonomy (Cook et al., 2019) and operationalized using Proctor et al.’s (2013) implementation strategies reporting guidelines; we reported findings for eight of the nine categories. The retrospective method we used provides an exemplar for specifying and reporting implementation strategies used during school-based efficacy research.
A range of implementation strategies—spanning multiple implementation actors, outcomes, and phases—were used during the Project POWER trial. Half of the 75 SISTER implementation strategies were used during the trial; most strategies (27) were described, although not labeled, in the trial’s grant proposal. It is noteworthy that so many strategies were used, given that implementation strategies are rarely measured or reported in efficacy research. This number of strategies is consistent with research specifying and reporting implementation strategies used during implementation studies (i.e., 11–45; Boyd et al., 2018; Bunger et al., 2017; Huynh et al., 2018; Rogal et al., 2017) and with a recent illustration within an effectiveness study (20; Rudd et al., 2020). We extended this research by examining whether implementation strategies were adapted over the course of the four-year Project POWER trial. Eleven strategies were changed, suggesting that implementation in efficacy trials is dynamic and iterative in support of these studies’ goals to achieve internal validity via appropriate implementation.
Congruent with the efficacy context of this study, implementation strategies were reported as most likely to impact the implementation outcomes of fidelity, acceptability, feasibility, and adoption and were used only during the preparation and implementation phases. Although no strategies targeted appropriateness, it is important to assess the intervention’s fit for the intended population and context in which it will be delivered or sustained. Interestingly, approximately 16% of strategies targeted sustainability as an implementation outcome, which is typically not a priority of efficacy trials. Nevertheless, it remains important to plan and build the capacity for the sustainment of promising interventions beyond the research context to maximize public health impact and promote health equity (Arnold et al., 2021). These findings underscore the need for collaboration among clinical and implementation researchers to understand how implementation strategies are employed during the early phases of translational research (Rudd et al., 2020).
A unique aspect of the Project POWER trial was the involvement of school stakeholders in intervention delivery. Most strategies used during the trial were consistent with the types of implementation strategies that are well established in educational settings (i.e., SISTER categories of Train and Educate Stakeholders, Use Evaluative and Iterative Strategies, and Develop Stakeholder Interrelationships; e.g., Lyon & Bruns, 2019). As expected in the context of an efficacy trial, implementation strategies were primarily enacted by members of the research team and used the trial’s financial and personnel resources; only 10% of the implementation strategy actors were school partners. In the future real-world implementation of Project POWER’s interventions, responsibility for supporting implementation would fall upon school stakeholders, who are less likely to have equivalent resources. Whereas most strategies used in this trial may be considered important and feasible, others (43%) may have limited feasibility and/or importance for school stakeholders (Lyon et al., 2019). Future implementation research and practice that builds on this efficacy trial should investigate how implementation strategies affect implementation and child outcomes. Strategies that both improve outcomes and are feasible and important for school contexts should be prioritized.
The approach employed in this study enabled identification of implementation strategies and explication of their key features, including primary actors and their actions, when each strategy was used during implementation, and focal implementation outcomes. If applied to other efficacy research, this approach could facilitate a clearer understanding of how implementation strategies are used during early translational studies in school settings. However, we encountered challenges in retrospectively using Proctor et al.’s (2013) reporting guidelines. First, it was not always possible to identify discrete actions for each strategy, and some strategies had multiple discrete actions, making them difficult to operationalize. Adopting a more action-oriented approach, such as tracking implementation activities, may be preferable; however, matching discrete actions with strategies will likely continue to present challenges (Bunger et al., 2017). The retrospective approach also limited our ability to fully quantify the magnitude or frequency of strategy use (as team members were often engaged in similar implementation activities across multiple sites simultaneously) and to report the empirical, theoretical, or pragmatic justification for selection of each strategy. Our approach was also time-intensive. Instead of tracking and reporting strategies as part of the implementation process, we held several meetings with the research team to identify which strategies were used, operationalize the strategies using the reporting guidelines, analyze the results, and then verify the results with the team. In contrast, activity logs completed retrospectively or in real time during the study period may be more efficiently completed and yield more precise estimates of strategies’ actions, dose, and temporal sequence (Boyd et al., 2018; Bunger et al., 2017).
The independent identification of implementation strategies and their conceptual action targets by research team members—particularly those who were unfamiliar with implementation science, the SISTER taxonomy, or the CFIR—was also a challenge. This project benefitted from having trained implementation researchers facilitate strategy specification. It may be difficult for researchers and school stakeholders who are less steeped in implementation to articulate which implementation strategies are used and how; thus, intervention developers should consider including implementation experts on their research teams to support strategy specification (Rudd et al., 2020; Tabak et al., 2021).
Finally, it was difficult during data collection to capture the full extent of informal activities that occurred during the trial. For example, one research team member mentioned that informal communications occurred that were not a required part of routine job responsibilities and/or did not fall squarely within the SISTER strategy definitions. Current implementation strategy taxonomies and activity logs might not adequately capture informal communications and other informal activities (Boyd et al., 2018). Due to inadequate data and the retrospective nature of this study, we were unable to determine which informal activities should be elevated to the status of an implementation strategy; this is an area that warrants future research.
Limitations
This study demonstrated that the SISTER taxonomy and Proctor et al.'s (2013) reporting guidelines could be used to specify and report implementation strategies used during a school-based efficacy trial. We acknowledge several limitations of this research. Research team members were the sole informants in this study, given their primary role in the selection and use of implementation strategies; including school stakeholders as informants may have yielded important information relevant to the continued use of the identified strategies in schools. Because data were collected retrospectively via self-reported use of implementation strategies by research team members in a group setting, there is potential for reporting and recall biases. Although most research team members who explicated the strategies were involved in the trial from the beginning, their reports of strategies used during the 4-year trial may have been less accurate because strategies were not tracked in real time. Additionally, both first authors were part of the Project POWER research team before this study was conducted and served as the data collectors and data analysts for this study, potentially contributing to researcher bias. To reduce this bias, we used member checking with research team members in real time during the implementation strategy specification meetings and electronically during the data analysis phase. The retrospective nature of this study also limited our ability to measure the impact of the strategies used on implementation outcomes or to determine which strategies were most critical for this type of intervention trial.
Implications and future directions
This study illustrates the breadth and depth of information that can be gleaned when implementation strategies are retrospectively identified and operationalized during efficacy research. Our findings underscore the need for comprehensive specification and reporting of implementation strategies during the early phases of translational research. This recommendation aligns with calls for implementation to be considered from the start of intervention testing to maximize an intervention's potential feasibility, acceptability, and scalability in the real world and to reduce delays and roadblocks along the research-to-practice pipeline (Lane-Fall et al., 2019; Lyon & Bruns, 2019).
Whereas the retrospective approach used in this study may be most practical for research studies that have concluded or are currently underway, prospective implementation strategy tracking (e.g., via activity logs or other means of systematically documenting activities in real time over the study period; Boyd et al., 2018; Bunger et al., 2017) may yield a more thorough characterization of implementation strategy use. Prospective tracking can facilitate comparatively more accurate descriptions of an implementation strategy's dose and temporal sequence (Huynh et al., 2018) and can be less time-intensive when incorporated into study data collection protocols. Regardless of whether a prospective or retrospective approach is used, future studies should track adaptations made to implementation strategies and the outcomes associated with those adaptations. Explicating when and in what sequence implementation strategies are used, and when and how strategies are altered throughout the implementation process, is essential for speeding real-world implementation (Powell et al., 2019a).
When intervention efficacy or effectiveness studies have already been published, recent research has analyzed the published manuscripts to identify implementation strategies (Hooley et al., 2020; Premachandra & Lewis, 2021). Our data indicate it is also beneficial to review funded grant proposals for a more comprehensive understanding of strategy use. However, asking clinical researchers to complete a comprehensive implementation strategy specification and reporting process, using an established strategy taxonomy and Proctor and colleagues' (2013) reporting guidelines, may yield greater breadth and depth of implementation information (Rudd et al., 2020).
Identification and monitoring of implementation strategies throughout the intervention development process have implications for school practices and policies. Researchers who carefully monitor and evaluate implementation strategies during efficacy trials may be well positioned to provide school stakeholders with detailed information on strategies to facilitate adoption, implementation, sustainability, and scale-up of the intervention upon the conclusion of research support. In addition to tracking the school actors who are involved in implementing strategies during the trial, researchers should study which implementation strategies are related to successful implementation during the research study and recommend strategies that could be carried forward by school partners in future implementation. This information could be disseminated to schools in the form of user-friendly toolkits that accompany intervention materials and could also be specified in publications from efficacy and effectiveness studies. Making detailed information about implementation available in a variety of outlets enables school implementation leaders to plan more effectively for implementation and to provide guidance for the educators and school mental health professionals responsible for intervention delivery. To support planning for school stakeholder-led implementation, these materials should clearly define the roles and responsibilities of school actors in delivering the intervention and enacting implementation strategies.
Conclusion
This study provides evidence that implementation strategies are indeed used, and may change, during efficacy research and highlights the importance of examining implementation strategies during earlier phases of research. We further illustrated a retrospective approach to specifying and reporting implementation strategies that may be leveraged within efficacy research in educational settings. The unacknowledged and overlooked implementation supports that are built into school-based efficacy and effectiveness studies are often relevant to improving implementation (e.g., fidelity, acceptability), health (e.g., reduced symptoms of anxiety and depression), and academic (e.g., grades) outcomes. We urge that implementation strategies be strategically selected prior to implementation, clearly explained to research team members and school partners involved in implementation, tracked during the trial (including adaptations), and reported in the literature using an implementation strategy taxonomy and Proctor et al.'s (2013) reporting guidelines.
Supplemental Material
Supplemental material, sj-docx-1-irp-10.1177_26334895211047841 for Specifying and reporting implementation strategies used in a school-based prevention efficacy trial by Stephanie A. Moore, Kimberly T. Arnold, Rinad S. Beidas, and Tamar Mendelson in Implementation Research and Practice
Footnotes
Acknowledgements: The authors would like to thank the schools in the Baltimore City Public Schools District that participated in the Project POWER trial, and the following research team members who generously contributed their time to this project (names listed alphabetically): Laura Clary, Christine Crimmins, Rachel Dows, Karen Edwards, Jeffery Krick, Marcus Nole, Steven Sheridan, Violet Odom, and Alexander Welna. The authors would also like to thank our colleagues for their feedback on this project: Gazi Azad, Courtney Benjamin Wolk, Molly Davis, David Mandell, and Brittany Rudd.
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Rinad S. Beidas receives royalties from Oxford University Press, served as a consultant to Camden Coalition of Healthcare Providers, currently consults for United Behavioral Health, and receives an honorarium for serving on the Optum Behavioral Health Clinical Scientific Advisory Council. Dr. Beidas is an Associate Editor of Implementation Research and Practice; all decisions on this paper were made by another editor.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Institute of Mental Health (grant numbers T32MH109433-03, T32MH109436) and the Institute of Education Sciences (grant number R305A160082). The opinions expressed are those of the authors and do not represent the views of the National Institutes of Health or the US Department of Education.
ORCID iD: Stephanie A. Moore https://orcid.org/0000-0003-1550-8344
Supplemental material: Supplemental material for this article is available online.
References
- Alegria M., Vallas M., Pumariega A. (2010). Racial and ethnic disparities in pediatric mental health. Child and Adolescent Psychiatric Clinics of North America, 19(4), 759–774. 10.1016/j.chc.2010.07.001
- Arnold K. T., Pollack Porter K. M., Frattaroli S., Durham R. E., Clary L. K., Mendelson T. (2021). Multilevel barriers and facilitators to sustainability of a trauma-informed mindfulness intervention for adolescents in urban schools after participation in an efficacy trial: A qualitative study. School Mental Health, 13, 174–185. 10.1007/s12310-020-09402-w
- Arnold K. T., Pollack Porter K. M., Frattaroli S., Durham R. E., Mmari K., Clary L. K., Mendelson T. (2020). Factors that influenced adoption of a school-based trauma-informed universal mental health intervention. Prevention Science, 21, 1081–1092. 10.1007/s11121-020-01144-0
- Atkins M. S., Hoagwood K. E., Kutash K., Seidman E. (2010). Toward the integration of education and mental health in schools. Administration and Policy in Mental Health and Mental Health Services Research, 37, 40–47. 10.1007/s10488-010-0299-7
- Beidas R. S., Mychailyszyn M. P., Edmunds J. M., Khanna M. S., Downey M. M., Kendall P. C. (2012). Training school mental health providers to deliver cognitive-behavioral therapy. School Mental Health, 4, 197–206. 10.1007/s12310-012-9074-0
- Boyd M. R., Powell B. J., Endicott D., Lewis C. C. (2018). A method for tracking implementation strategies: An exemplar implementing measurement-based care in community behavioral health clinics. Behavior Therapy, 49(4), 525–537. 10.1016/j.beth.2017.11.012
- Brown C. H., Curran G., Palinkas L. A., Aarons G. A., Wells K. B., Jones L., Collins L. M., Duan N., Mittman B. S., Wallace A., Tabak R. G., Ducharme L., Chambers D. A., Neta G., Wiley T., Landsverk J., Cheung K., Cruden G. (2017). An overview of research and evaluation designs for dissemination and implementation. Annual Review of Public Health, 38, 1–22. 10.1146/annurev-publhealth-031816-044215
- Bunger A. C., Powell B. J., Robertson H. A., MacDowell H., Birken S. A., Shea C. (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15, 15. 10.1186/s12961-017-0175-y
- Cook C. R., Lyon A. R., Locke J., Waltz T., Powell B. J. (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prevention Science, 20, 914–935. 10.1007/s11121-019-01017-1
- Curran G. M. (2020). An overview on hybrid effectiveness-implementation designs [webinar]. U.S. Department of Veterans Affairs Health Services Research & Development.
- Curran G. M., Bauer M., Mittman B., Pyne J. M., Stetler C. (2012). Effectiveness-implementation hybrid designs. Medical Care, 50(3), 217–226. 10.1097/MLR.0b013e3182408812
- Damschroder L. J., Aron D. C., Keith R. E., Kirsh S. R., Alexander J. A., Lowery J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 50. 10.1186/1748-5908-4-50
- DeRosa R., Habib M., Pelcovitz D., Rathus J., Sonnenkler J., Ford J., Sunday S., Layne C., Saltzman W., Turnbull A., Labruna V., Kaplan S. (2006). Structured Psychotherapy for Adolescents Responding to Chronic Stress (SPARCS): A trauma-focused guide. North Shore University Hospital.
- DeRosa R., Pelcovitz D. (2009). Group treatment for traumatized adolescents. In Brom D., Pat-Horenczyk R., Ford J. D. (Eds.), Treating traumatized children: Risk, resilience, and recovery (pp. 225–239). Routledge.
- Domitrovich C. E., Bradshaw C. P., Greenberg M. T., Embry D., Poduska J. M., Ialongo N. S. (2010). Integrated models of school-based prevention: Logic and theory. Psychology in the Schools, 47(1), 71–88. 10.1002/pits.20452
- Duong M. T., Bruns E. J., Lee K., Cox S., Coifman J., Mayworm A., Lyon A. R. (2020). Rates of mental health service utilization by children and adolescents in schools and other common service settings: A systematic review and meta-analysis. Administration and Policy in Mental Health and Mental Health Services Research, 48, 420–439. 10.1007/s10488-020-01080-9
- Eiraldi R., Wolk C. B., Locke J., Beidas R. (2015). Clearing hurdles: The challenges of implementation of mental health evidence-based practices in under-resourced schools. Advances in School Mental Health Promotion, 8(3), 124–140. 10.1080/1754730X.2015.1037848
- Ennett S. T., Ringwalt C. L., Thorne J., Rohrbach L. A., Vincus A., Simons-Rudolph A., Jones S. (2003). A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science, 4, 1–14. 10.1023/A:1021777109369
- Franklin C. G. S., Kim J. S., Ryan T. N., Kelly M. S., Montgomery K. L. (2012). Teacher involvement in school mental health interventions: A systematic review. Children and Youth Services Review, 34(5), 973–982. 10.1016/j.childyouth.2012.01.027
- Gottfredson D. C., Gottfredson G. D. (2002). Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime and Delinquency, 39(1), 3–35. 10.1177/002242780203900101
- Gould L. F., Mendelson T., Dariotis J. K., Ancona M., Smith A. S. R., Gonzalez A. A., Smith A. A., Greenberg M. T. (2014). Assessing fidelity of core components in a mindfulness and yoga intervention for urban youth: Applying the CORE process. New Directions for Youth Development, 2014(142), 59–81. 10.1002/yd.20097
- Han S. S., Weiss B. (2005). Sustainability of teacher implementation of school-based mental health programs. Journal of Abnormal Child Psychology, 33, 665–679. 10.1007/s10802-005-7646-2
- Herlitz L., MacIntyre H., Osborn T., Bonell C. (2020). The sustainability of public health interventions in schools: A systematic review. Implementation Science, 15, 4. 10.1186/s13012-019-0961-8
- Hicks T. B., Shahidullah J. D., Carlson J. S., Palejwala M. H. (2014). Nationally certified school psychologists' use and reported barriers to using evidence-based interventions in schools: The influence of graduate program training and education. School Psychology Quarterly, 29(4), 469–487. 10.1037/spq0000059
- Hooley C., Amano T., Markovitz L., Yaeger L., Proctor E. (2020). Assessing implementation strategy reporting in the mental health literature: A narrative review. Administration and Policy in Mental Health and Mental Health Services Research, 47, 19–35. 10.1007/s10488-019-00965-8
- Huynh A. K., Hamilton A. B., Farmer M. M., Bean-Mayberry B., Stirman S. W., Moin T., Finley E. P. (2018). A pragmatic approach to guide implementation evaluation research: Strategy mapping for complex interventions. Frontiers in Public Health, 6, 134. 10.3389/fpubh.2018.00134
- Jacob S., Coustasse A. (2008). School-based mental health: A de facto mental health system for children. Journal of Hospital Marketing & Public Relations, 18(2), 197–211. 10.1080/15390940802232499
- Lane-Fall M. B., Curran G. M., Beidas R. S. (2019). Scoping implementation science for the beginner: Locating yourself on the “subway line” of translational research. BMC Medical Research Methodology, 19, 133. 10.1186/s12874-019-0783-z
- Locke J., Wolk C. B., Harker C., Olsen A., Shingledecker T., Barg F., Mandell D., Beidas R. (2017). Pebbles, rocks, and boulders: The implementation of a school-based social engagement intervention for children with autism. Autism, 21(8), 985–994. 10.1177/1362361316664474
- Lyon A. R., Cook C. R., Locke J., Davis C., Powell B. J., Waltz T. J. (2019). Importance and feasibility of an adapted set of implementation strategies in schools. Journal of School Psychology, 76, 66–77. 10.1016/j.jsp.2019.07.014
- Lyon A. R., Bruns E. J. (2019). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11, 106–114. 10.1007/s12310-018-09306-w
- Masten A. S. (2003). Commentary: Developmental psychopathology as a unifying context for mental health and education models, research, and practice in schools. School Psychology Review, 32(2), 169–173. 10.1080/02796015.2003.12086189
- Mendelson T., Clary L. K., Sibinga E., Tandon D., Musci R., Mmari K., Salkever D., Stuart E. A., Ialongo N. (2020). A randomized controlled trial of a trauma-informed school prevention program for urban youth: Rationale, design, and methods. Contemporary Clinical Trials, 90, 105895. 10.1016/j.cct.2019.105895
- Mendelson T., Tandon S. D., O'Brennan L., Leaf P. J., Ialongo N. S. (2015). Brief report: Moving prevention into schools: The impact of a trauma-informed school-based intervention. Journal of Adolescence, 43, 142–147.
- Novins D. K., Green A. E., Legha R. K., Aarons G. A. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52(10), 1009–1025.e18. 10.1016/j.jaac.2013.07.012
- Owens J. S., Lyon A. R., Brandt N. E., Warner C. M., Nadeem E., Spiel C., Wagner M. (2014). Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health, 6, 99–111. 10.1007/s12310-013-9115-3
- Perry C. K., Damschroder L. J., Hemler J. R., Woodson T. T., Ono S. S., Cohen D. J. (2019). Specifying and comparing implementation strategies across seven large implementation interventions: A practical application of theory. Implementation Science, 14, 32. 10.1186/s13012-019-0876-4
- Powell B. J., Beidas R. S., Lewis C. C., Aarons G. A., McMillen J. C., Proctor E. K., Mandell D. S. (2017). Methods to improve the selection and tailoring of implementation strategies. Journal of Behavioral Health Services & Research, 44(2), 177–194. 10.1007/s11414-015-9475-6
- Powell B. J., Fernandez M. E., Williams N. J., Aarons G. A., Beidas R. S., Lewis C. C., McHugh S. M., Weiner B. J. (2019a). Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7, 3. 10.3389/fpubh.2019.00003
- Powell B. J., Patel S. V., Haley A. D., Haines E. R., Knocke K. E., Chandler S., Katz C. C., Seifert H. P., Ake G., III, Amaya-Jackson L., Aarons G. A. (2019b). Determinants of implementing evidence-based trauma-focused interventions for children and youth: A systematic review. Administration and Policy in Mental Health and Mental Health Services Research, 47, 705–719. 10.1007/s10488-019-01003-3
- Powell B. J., Proctor E. K., Glass J. E. (2014). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24(2), 192–212. 10.1177/1049731513505778
- Powell B. J., Waltz T. J., Chinman M. J., Damschroder L. J., Smith J. L., Matthieu M. M., Proctor E. K., Kirchner J. E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, 21. 10.1186/s13012-015-0209-1
- Premachandra B., Lewis N. A. (2021). Do we report the information that is necessary to give psychology away? A scoping review of the psychological intervention literature 2000–2018. Perspectives on Psychological Science. 10.1177/1745691620974774
- Proctor E. K., Powell B. J., McMillen J. C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8, 139. 10.1186/1748-5908-8-139
- Rogal S. S., Yakovchenko V., Waltz T. J., Powell B. J., Kirchner J. E., Proctor E. K., Gonzalez R., Park A., Ross D., Morgan T. R., Chartier M., Chinman M. J. (2017). The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample. Implementation Science, 12, 60. 10.1186/s13012-017-0588-6
- Rojas-Andrade R., Bahamondes L. L. (2019). Is implementation fidelity important? A systematic review on school-based mental health programs. Contemporary School Psychology, 23, 339–350. 10.1007/s40688-018-0175-0
- Rudd B. N., Davis M., Beidas R. S. (2020). Integrating implementation science in clinical research to maximize public health impact: A call for the reporting of implementation strategy use with implementation outcomes in clinical research. Implementation Science, 15, 103. 10.1186/s13012-020-01060-5
- Shelton R. C., Cooper B. R., Stirman S. W. (2018). The sustainability of evidence-based interventions and practices in public health and health care. Annual Review of Public Health, 39, 55–76. 10.1146/annurev-publhealth-040617-014731
- Stevens E. R., Shelley D., Boden-Albala B. (2020). Unrecognized implementation science engagement among health researchers in the USA: A national survey. Implementation Science Communications, 1, 39. 10.1186/s43058-020-00027-3
- Tabak R. G., Bauman A. A., Holtrop J. S. (2021). Roles dissemination and implementation scientists can play in supporting research teams. Implementation Science Communications, 2, 9. 10.1186/s43058-020-00107-4
- Waltz T. J., Powell B. J., Matthieu M. M., Damschroder L. J., Chinman M. J., Smith J. L., Proctor E. K., Kirchner J. E. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10, 109. 10.1186/s13012-015-0295-0