Implementation Research and Practice. 2024 Jan 8;5:26334895231220279. doi: 10.1177/26334895231220279

The perspective of school leaders on the implementation of evidence-based practices: A mixed methods study

Stephanie A Moore 1, Aksheya Sridhar 2, Isabella Taormina 3, Manasi Rajadhyaksha 1, Gazi Azad 3,4
PMCID: PMC10775739  PMID: 38322802

Abstract

Background

School leaders play an integral role in the use of implementation strategies, which in turn support special education teachers in the implementation of evidence-based practices (EBPs). In this convergent mixed methods study, we explored school leaders’ perceptions of the facilitators and barriers to EBP implementation, particularly for students receiving special education, as well as the importance and feasibility of 15 implementation strategies.

Method

School leaders (N  =  22, principals, assistant principals, school psychologists, etc.) participated in a semistructured interview that included three parts—qualitative questions, quantitative ratings of strategies’ importance and feasibility, and discussion of the top three implementation strategies. Data strands were analyzed independently and then integrated to generate meta-inferences.

Results

The qualitative data identified facilitators such as access to resources about the intervention (e.g., professional development) and collaboration and teamwork, while barriers centered on lack of school supports, culture/climate, and organizational factors (e.g., lack of communication). The quantitative data indicated that the implementation strategy provide ongoing consultation/coaching was rated as important and feasible. Monitor the progress of the implementation effort was rated as important but less feasible, while conduct educational meetings and change the environment were rated as feasible but less important. Build partnerships to support implementation was rated as both less important and less feasible. There was both convergence and divergence in the mixed methods findings.

Conclusion

This study underscores the critical need to increase school leaders’ knowledge and skills related to implementation science to better leverage implementation strategies that address the confluence of relevant implementation determinants.

Keywords: implementation strategy, mixed methods, leaders, school

Plain Language Summary

School leaders, such as principals, assistant principals, and school psychologists, are responsible for supporting special education teachers in doing interventions (e.g., through teacher training). In this study, we interviewed 22 school leaders to understand what factors support or hinder teachers in doing interventions, especially for students receiving special education. We also provided school leaders with a list of 15 strategies that may be used to support teachers, to determine which strategies school leaders think are the most important and feasible in schools. Our results indicated that it is important to train school leaders on how to move research into practice settings, such as schools. It is equally important for researchers and policymakers to understand the priorities of school leaders.

Introduction

Schools are the primary service setting for children with mental and behavioral health needs, including those who receive special education (Duong et al., 2021). Students receiving special education are provided with an Individualized Education Program (IEP) that delineates their needs, goals, and services (Reich, 2010). School-based providers are encouraged to address IEP goals using evidence-based practices (EBPs; Individuals with Disabilities Education Act [IDEA], 2004); however, the uptake and scale-up of EBPs are onerous in schools (Lyon et al., 2020; Lyon & Bruns, 2019). School leaders are often equated with school administrators (e.g., principals; Melgarejo et al., 2020; Stadnick et al., 2019). However, common team-based leadership models recognize administrative and practice-level leaders (e.g., school psychologists, behavioral coaches; Bush & Glover, 2014). Each type of leader plays a key role in supporting the implementation of EBPs by using implementation strategies (e.g., professional development [PD]). However, there is limited research on school leaders’ perceptions of the barriers and facilitators that influence the implementation of EBPs and which implementation strategies they find the most important and feasible.

Barriers and Facilitators to School-Based Implementation of EBPs

Despite national laws mandating EBPs in educational settings (IDEA, 2004), EBP adoption, implementation, and sustainability in schools and special education settings are limited (Burns & Ysseldyke, 2009; Lyon et al., 2020). Determinant frameworks, such as the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009, 2022a), delineate multilevel contextual factors that influence implementation outcomes. Barriers and facilitators of EBP implementation in special education are noted at the individual (e.g., knowledge and attitudes), inner setting (e.g., competing priorities and school resources), and innovation (e.g., complexity of EBPs) levels (B. G. Cook & Odom, 2013; Edmunds et al., 2022; Forman et al., 2013). Studies on implementation determinants in special education have often focused on autism services and have primarily included teachers (Lawson et al., 2022), direct service providers (Suhrheinrich et al., 2021), or a combination of personnel (e.g., administrators, teachers, and other staff; Locke et al., 2019b). To our knowledge, few studies examine the perspective of school leaders—both administrative- and practice-level—on implementation barriers and facilitators.

Implementation Strategies in Schools

Implementation strategies were developed to address barriers and bolster facilitators to expedite the translation of EBPs into routine service delivery. The Expert Recommendations for Implementing Change (ERIC) project compiled and defined 73 implementation strategies, distributed across nine categories (e.g., engage consumers, develop stakeholder interrelationships; Powell et al., 2015; Waltz et al., 2015); however, this compilation was limited to health care. Through an iterative expert review process, the School Implementation Strategies, Translating ERIC Resources (SISTER) project delineated 75 school-based implementation strategies, with definitions, across the same nine categories. The ERIC and SISTER compilations were developed to increase consistency in terminology when describing implementation strategies in research and practice (C. R. Cook et al., 2019; Powell et al., 2015).

The literature on implementation strategies in schools has largely focused on individual-level strategies, such as PD including training, consultation, and coaching (Lyon & Bruns, 2019). Yet, research suggests that high-quality implementation is supported when a variety of tailored strategies are used at multiple levels, including the inner and outer settings (Lyon et al., 2019; Powell et al., 2017). In research, Fernandez and colleagues (2019) suggest implementation mapping as a systematic process for developing implementation strategies based on needs assessment data, as well as the identification of determinants and mechanisms of change.

In practice, it is unclear how implementation strategies are chosen. Lyon and colleagues (2019) attempted to understand this process by asking 200 school-based consultants (e.g., school psychologists and behavior specialists) to rate the importance and feasibility of the 75 SISTER strategies. The three most important implementation strategies were: (1) conduct ongoing training, (2) make training dynamic, and (3) provide ongoing consultation/coaching. The three most feasible strategies were: (1) make training dynamic, (2) distribute educational materials, and (3) remind school personnel. Although the participants in the Lyon et al. (2019) study had a variety of roles, administrative leaders were not included. Their study also targeted school-based implementation in general education; therefore, these findings may not generalize to special education settings.

Contribution of the Present Study

School leaders, including administrators and practice leaders, often have the executive decision-making power to utilize implementation strategies, which in turn promote the effective use of EBPs by teachers (Pauling et al., 2021). However, there is limited research focusing on their perceptions of the barriers and facilitators to implementation and which implementation strategies they find most important and feasible for special education settings. This knowledge would make an important contribution to the literature by providing insight into school leaders’ decision-making, including how funds may be allocated and which initiatives are prioritized. This study explored the perspective of school leaders with three research questions: (1) What factors facilitate the implementation of EBPs for students receiving special education? (2) What barriers impede the implementation of EBPs for students receiving special education? (3) What implementation strategies were identified as most important and feasible?

Method

Participants

Participants were 22 school leaders from nine schools in one Mid-Atlantic state. They primarily identified as female (81.8%) and as either White (86.4%) or African American/Black (13.6%). Leaders’ roles included administrators (59.1%) or related service providers (40.9%) with some experience or exposure supporting students with autism in either public (54.5%) or nonpublic (45.5%) schools. The nonpublic schools (part of one university-affiliated alternative school system) served public school students with disabilities (e.g., autistic, intellectual disability, and emotional disability) whose educational needs could not be met in public schools. School leaders were prompted to discuss implementation by special education teachers serving students with autism. Table 1 depicts leaders’ demographics.

Table 1.

Participant Demographics (N  =  22, Nschools  =  9)

Variable % or M (SD)
Female 81.8%
Age 41.1 (7.1)
Race/ethnicity
 White 86.4%
 African American 13.6%
 Asian 0%
 Pacific Islander 0%
 Middle Eastern 0%
 American Indian or Alaska Native 0%
 Hispanic/Latino 0%
 Other 0%
Position
 Administrator (e.g., principal and assistant principal) 59.1%
 Related service provider (e.g., school psychologist, social worker, behavior specialist, and IEP manager) 40.9%
Years in education
 Fewer than 15 years 40.9%
 15 years or more 59.1%
Years in role at any school
 <3 years 27.3%
 3–9 years 54.5%
 10  +  years 18.2%
Years in role at current school
 <3 years 31.8%
 3–9 years 59.1%
 10+  years 9.0%
School type
 Public 54.5%
 Nonpublic 45.5%

Note. IEP = Individualized Education Program.

Procedure

Research procedures were approved by the university's and school district's Institutional Review Boards. Participants were recruited from a larger study aimed at modifying a school-based implementation package to enhance the use of EBPs by parents and teachers of autistic children. The project did not involve EBP delivery or implementation strategy testing; thus, participants' exposure to EBPs was not affected by their affiliation with the project. As part of the larger study, the senior author had contact information for all school leaders. She emailed them to explain the research opportunity, which included surveys and an optional qualitative interview. For those who expressed interest (N  =  34), the senior author met with each school leader to provide further information about herself (e.g., qualifications, experience, and relationship with teachers) and the study, including risks and benefits, and to answer any questions. One school leader did not consent to participate. Three leaders (from public schools) consented but did not return their surveys. The subsample who expressed interest in the qualitative interviews is the focus of this study (N  =  22). Sampling saturation was achieved when all possible leaders whose school sites were participating in the larger study had been approached for recruitment.

Interviews were conducted by the senior author at participating schools and audio recorded. Data were collected between September 2019 and February 2020. The interview (analyzed between May 2022 and April 2023) was broken into three parts: (1) open-ended questions, (2) rating implementation strategies’ importance and feasibility, and (3) discussion of top three implementation strategies. Each part was completed consecutively during the 45–60-min interview.

Measures

Interview Part 1: Open-Ended Qualitative Questions

A semistructured interview protocol was developed to gather in-depth information about implementation in special education. We analyzed school leaders’ responses to two prompts about (a) the culture and climate at their school (i.e., “Tell me about the culture and climate at this school”) and (b) how special education teachers were supported in using EBPs (i.e., “How does your school support special education teachers in doing evidence-based practices?”).

Interview Part 2: Quantitative Ratings

School leaders rated the importance and feasibility of 15 implementation strategies identified from the SISTER compilation (C. R. Cook et al., 2019). The 15 implementation strategies were selected through a multistep process (see Supplementary File 1): (1) The senior author and colleagues with expertise in schools, implementation science, and special education reviewed the 75 SISTER strategies and eliminated strategies that were already embedded in the implementation package being modified in the larger project, resulting in a narrowed list of 58 strategies. (2) The remaining 58 implementation strategies were examined for conceptual relevance and applicability for serving special education students. An additional 16 strategies were removed from consideration. (3) The importance and feasibility of the 42 remaining implementation strategies were considered using average ratings provided by Lyon et al. (2019). The 42 strategies were first ordered by importance. Then, we examined the feasibility of strategies with importance ratings above the median value. Strategies for which the average feasibility rating was below 3 (range  =  1–5) were eliminated from consideration. (4) This final list included 15 implementation strategies spanning five categories: training and educating stakeholders, using evaluative and iterative strategies, developing stakeholder relationships, supporting educators, and changing infrastructure.
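
For readers who want a concrete picture of step 3, the following Python sketch illustrates the screening logic only (order by importance, retain strategies above the median importance, then drop those with average feasibility below 3). The strategy names and ratings are hypothetical placeholders, not the Lyon et al. (2019) values or the authors' actual procedure.

```python
# Illustrative only: the screening logic described in step 3, applied to
# hypothetical ratings (these are NOT the Lyon et al., 2019 averages).
from statistics import median

# strategy -> (average importance, average feasibility), each on a 1-5 scale
candidate_strategies = {
    "Conduct ongoing training": (4.6, 3.9),
    "Provide ongoing consultation/coaching": (4.7, 3.6),
    "Build partnerships to support implementation": (4.0, 2.8),
    "Change/alter environment": (3.9, 3.8),
    "Shadow other experts": (3.5, 3.1),
}

# Order the strategies by average importance (descending)
ordered = sorted(candidate_strategies.items(), key=lambda kv: kv[1][0], reverse=True)

# Keep only strategies whose importance is above the median importance value
importance_median = median(imp for imp, _ in candidate_strategies.values())
above_median = [(name, vals) for name, vals in ordered if vals[0] > importance_median]

# Eliminate strategies whose average feasibility rating is below 3
final_list = [name for name, (imp, feas) in above_median if feas >= 3]

print(final_list)  # strategies retained for the participant rating task
```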

Replicating processes used in the ERIC (Waltz et al., 2015) and SISTER (Lyon et al., 2019) studies, participants were given separate surveys to rate the importance and feasibility of the 15 selected strategies. Each survey provided: (1) a definition of importance (i.e., how critical the implementation strategy is to successful implementation efforts) or feasibility (i.e., the extent to which a strategy is practical and can be successfully used to support implementation) from Lyon et al. (2019), and (2) a definition of each implementation strategy from C. R. Cook et al. (2019). Participants rated the degree to which each strategy was important (1  =  relatively unimportant to 5  =  extremely important) and feasible (1  =  not at all feasible to 5  =  extremely feasible) for implementation efforts at their school (Lyon et al., 2019; Waltz et al., 2015).

Interview Part 3: Qualitative Responses to Top Three Implementation Strategies

Next, the interviewer and participant visually examined the participant's ratings to identify the top three implementation strategies to discuss. Strategies with the highest ratings (i.e., “5” for both importance and feasibility) were identified. If only three strategies were identified, then those were discussed. If more than three strategies were identified with equally high ratings (e.g., there was a tie between two strategies’ ratings), participants were asked to pick which of those they wanted to discuss. If fewer than three strategies were identified with the highest ratings, participants were also asked to pick an additional strategy or strategies to discuss.
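
As an illustration only, the selection rule described above can be summarized in a short Python sketch; the ratings below are hypothetical, and in the study any ties or gaps were resolved by the participant rather than by an algorithm.

```python
# Illustrative only: the rule used to pick the three strategies to discuss,
# with hypothetical ratings. In the study, ties and gaps were resolved by the
# participant, not by code.
ratings = {  # strategy -> (importance, feasibility) for one participant
    "Provide ongoing consultation/coaching": (5, 5),
    "Conduct ongoing training": (5, 4),
    "Model and simulate change": (5, 5),
    "Conduct educational meetings": (5, 5),
    "Change/alter environment": (4, 5),
    "Build partnerships to support implementation": (3, 3),
}

# Strategies rated "5" on both importance and feasibility
highest = [s for s, (imp, feas) in ratings.items() if imp == 5 and feas == 5]

if len(highest) == 3:
    top_three = highest
elif len(highest) > 3:
    top_three = highest[:3]  # stand-in for the participant choosing among ties
else:
    top_three = highest + ["<additional strategy chosen by the participant>"]

print(top_three)
```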

Data Analyses

Qualitative Analyses

Audio recordings of interviews were deidentified, transcribed verbatim, and imported into MAXQDA. Interviewer field notes supported clarity and interpretation of transcripts. Data were analyzed using constant comparison coding methods and grounded theory. Multiple steps were taken to enhance trustworthiness. Specifically, the coding team included three independent coders (a master's-level research coordinator and a bachelor's-level research assistant, led by a doctoral student with extensive experience in qualitative analysis) who were not involved in study development or data collection. The codebook was developed iteratively, such that additional codes were added and existing codes were refined throughout the process. Coders first reviewed a subset of interviews (n  =  4) to familiarize themselves with the data. A combination of inductive and deductive coding practices was used. Some codes were identified a priori (e.g., “importance of implementation strategies”), while other codes were developed inductively, based on ideas that emerged during the coding process (e.g., “access to resources”). All interviews were coded independently. Consensus meetings were held to discuss discrepancies, finalize codes, and update the codebook to reflect coder agreement; an audit trail was used to track the rationale for codebook changes. Interrater reliability following initial coding was 74%. Upon completion of the coding process, the coding team developed overarching categories and themes to organize the final codebook by topic area. Our reporting of qualitative results was guided by the Standards for Reporting Qualitative Research checklist (O’Brien et al., 2014).
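
For illustration, simple percent agreement between two coders can be computed as in the sketch below; the codes and segments are hypothetical, and the study's interrater reliability may have been calculated differently.

```python
# Illustrative only: simple percent agreement between two coders over the same
# transcript segments (hypothetical codes; the study's reliability procedure
# may have been computed differently).
def percent_agreement(coder_a, coder_b):
    """Percentage of segments assigned the same code by both coders."""
    assert len(coder_a) == len(coder_b), "coders must rate the same segments"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

coder_a = ["access_to_resources", "meetings", "teacher_training", "culture_climate",
           "meetings", "support_nonteaching_staff", "pd_opportunities", "barriers"]
coder_b = ["access_to_resources", "meetings", "teacher_training", "barriers",
           "meetings", "support_nonteaching_staff", "pd_opportunities", "pd_opportunities"]

print(f"{percent_agreement(coder_a, coder_b):.0f}% agreement")  # 75% in this toy example
```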

Quantitative Analyses

Descriptive analyses (mean, standard deviation, and range) provided information regarding school leaders’ perceptions of the importance and feasibility of implementation strategies. The frequency with which participants identified each strategy as one of their top three during the qualitative interview was also generated.
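
A minimal sketch of this kind of descriptive summary, using pandas and made-up ratings rather than the study data, might look like this:

```python
# Illustrative only: descriptive statistics per strategy (as in Table 3) and
# the count of "top three" mentions, computed on made-up data with pandas.
import pandas as pd

ratings = pd.DataFrame({
    "strategy": ["Conduct ongoing training"] * 4 + ["Change/alter environment"] * 4,
    "importance": [5, 5, 4, 5, 4, 3, 5, 4],
    "feasibility": [4, 3, 4, 5, 4, 4, 3, 4],
})

summary = ratings.groupby("strategy").agg(
    importance_M=("importance", "mean"),
    importance_SD=("importance", "std"),
    importance_min=("importance", "min"),
    importance_max=("importance", "max"),
    feasibility_M=("feasibility", "mean"),
    feasibility_SD=("feasibility", "std"),
).round(1)

# How often each strategy was named as a "top three" strategy in interviews
top_three_mentions = pd.Series(
    ["Conduct ongoing training", "Change/alter environment", "Conduct ongoing training"]
).value_counts()

print(summary)
print(top_three_mentions)
```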

Mixed Methods Analyses

In mixed methods research, quantitative and qualitative data are collected, analyzed, and then integrated (Creswell, 2022). We used a convergent mixed methods design such that the quantitative (QUAN) and qualitative (QUAL) data were collected concurrently and given equal weight (QUAL  +  QUAN; Palinkas et al., 2011). Integration involved comparing the results from the qualitative and quantitative data by merging, which provided a more complete understanding than either the qualitative or quantitative results alone (Creswell, 2022).

First, the quantitative and qualitative data were analyzed independently, and inferences were drawn from each of these results. Additionally, the qualitative data were quantified to present the frequency of each code (i.e., the number of times the code was assigned across all transcripts; Tables 2 and 4); this provided information regarding the saliency of each code (Landrum & Garza, 2015). Second, both data strands were merged to integrate findings. A joint display was developed that linked qualitative themes to related quantitative constructs to facilitate integration (Guetterman et al., 2015). Third, we generated meta-inferences by merging the data.
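
The quantification and merging steps can be illustrated with a short sketch; the codes, frequencies, and ratings below are invented for demonstration and are not the study's data or scripts.

```python
# Illustrative only: quantifying qualitative codes as frequencies and merging
# them with quantitative ratings into a simple joint display (invented data).
from collections import Counter

import pandas as pd

# Codes assigned across all transcripts (toy example)
coded_segments = [
    "provide_ongoing_consultation", "conduct_ongoing_training",
    "model_and_simulate_change", "provide_ongoing_consultation",
    "change_alter_environment", "model_and_simulate_change",
]
code_frequencies = Counter(coded_segments)
qual = pd.DataFrame(list(code_frequencies.items()), columns=["strategy", "times_coded"])

# Quantitative strand: mean ratings per strategy (toy values)
quan = pd.DataFrame({
    "strategy": ["provide_ongoing_consultation", "conduct_ongoing_training",
                 "model_and_simulate_change", "change_alter_environment"],
    "importance_M": [4.8, 4.7, 4.5, 4.0],
    "feasibility_M": [3.8, 3.8, 3.6, 3.8],
})

# Joint display: side-by-side view used to judge convergence/divergence
joint_display = quan.merge(qual, on="strategy", how="left").fillna({"times_coded": 0})
print(joint_display)
```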

Table 2.

Themes and Subthemes of Factors Facilitating and Hindering Implementation With Illustrative Quotes

Theme Code (frequency coded) Illustrative quote
Professional development and training Resources/access to resources (6) “I think access to the information is the first step so making sure that the teachers know where to find evidence-based information whether that’s research literature or verified websites that have actual legitimate information”
Professional development opportunities (9) “So if the teacher is not familiar with certain interventions, I feel like there's always more opportunities for them to be able to go gain more information about interventions. And that can be something as simple as making sure that the teacher gets covered so that they can attend a professional development”
Teacher training (14) “The mentor teachers facilitate a group with the classrooms that they are mentoring to … open up discussions. They try to use some exemplars of what’s happening on site to talk about the topics and train on those topics. And then at our community meetings within the next two weeks we revisit it, and see if they had homework; so for example, the video modeling was just the content for the month of December, and so the mentor teachers got together, they went through the step-by-step checklist from the module and they made video models.”
Collaboration and working in teams School supports for implementation strategies (18) “so for teachers, it's how do we implement this in the classroom and that's the focus really for that month – and it gets brought up again in our community meeting and we talked more to that practice reminders different ways that the different service providers can provide support for implementation of those things as well.”
Meetings (14) “so teachers have weekly […] meetings with support staff. So every other week, to meet… for more administrative sort of assistance in helping to manage parents. But then in the alternating weeks, they're also meeting with their curriculum, instruction, assessment… so in either of those meetings we’re going to be talking about what they’re doing with kids, we’re going to be figuring out what training they might need.”
Support from nonteaching staff (49) “So providers like the SLP, OT, social workers, the psychologist, they have kind of […] developed this communication between them and me being the facilitator, I am involved. Whenever [teacher] comes to me, like [for] a new student in kindergarten, and say ‘hey I have some concerns,’ then I can make recommendations. I say ‘oh well let me ask one of our providers…,’ so we have that sort of an open communication here.”
Culture/climate (24) “Something unique about the program is just how collaborative we really are, in terms of our stakeholders. We work really hard in a team … listen to each other and work together.”
Family/student-level factors (15) “because the bulk of our children are pre-kindergarten and very few get to stay for their kindergarten year…. We have to build that culture with parents all over again every single year. The bulk of our children are here one year and then they’re gone, so […] it's an uphill battle every year, but it's one that we already anticipate. So we try to get parents early whether it's back to school night or even how we connect with them before school year even starts. Phone calls or letters that we send out, trying to get them comfortable with where their children are going to come.”
Teachers/staffing-level factors (31) “so then I was able to build in staff that had the collaborative mindset to work with general education students … and with parents and to understand what each individual child needed so that was hard because you’re working with brand new staff who were brand new teachers to the county … you’re working with, not only knowing the curriculum but knowing how to work with students in autism when you’re brand-new as a teacher.”
Barriers to implementation Lack of school supports (2) “we have one classroom that was supposed to be a verbal behavior room yeah like specifically worth a lot about the know just working on readiness skills yeah and teacher got docked for not focusing on the curriculum, they’re all about test scores all about curriculum.”
Culture/climate (3) “A positive culture is just difficult to maintain with a lot of turnover and just sort of the nature of the job that we do here.”
Organization-level factors (15) “I think communication occasionally is an issue just globally across the school. I don’t think things are always communicated to every stakeholder person who needs to know that information…some things [are] a little disjointed but I know I’ve had times where I feel as though someone was left off that email […] so I see that disconnection a little bit in this building.”
Table 4.

Joint Display of Most Discussed Implementation Strategies by Importance and Feasibility

Top discussed implementation strategies (frequency coded) | QUAN strand: Feasibility M (SD), Importance M (SD) | QUAL strand: Illustrative quote | Meta-inferences: Convergence or divergence of data sources on importance and feasibility
Provide ongoing consultation and coaching (8) 3.8 (0.9) 4.8 (0.4) “Providing ongoing consultation and coaching is very important and very feasible, in that we have the structures in place.” All three data sources converge on importance and feasibility
Model and simulate change (7) 3.6 (1.0) 4.5 (0.6) “I haven’t had a chance to really do modeling this year, but I am a visual learner and so are a lot of people, so I often show them side by side. But for me, or based on what I hear is most effective, is when somebody goes in and actually models a lesson or models how to do it. So that’s obviously very feasible because it's just somebody in the building with them like instructional coaches or I know our math literacy representative run the same thing.” Data strands diverge on importance and feasibility. This strategy was not rated as one of the top 3 strategies based on QUAN data but was often discussed in QUAL
Conduct ongoing training (6) 3.8 (1.0) 4.7 (0.5) “I think the reason why it’s important is that we pride ourselves on being the best and always keeping up with best practices and research and wanting to be innovative; and in this day and age practices are changing every single day. So we need to be on the front end of that; we need to be proactive.” All three data sources converge on importance and feasibility
Change and alter environment (6) 3.8 (0.9) 4.0 (0.8) “Super feasible of those three, and something that we do, quickly changing and altering the environment.” Data sources converge on the feasibility of this strategy

QUAL = Qualitative; QUAN = Quantitative.

Results

What Factors Facilitate EBP Implementation in Special Education?

Theme 1: Professional Development and Training

This theme captured PD opportunities and training resources that supported implementation. Specific codes included resources/access to resources, PD opportunities, and teacher training. Table 2 presents illustrative quotes and code frequencies.

Resources/Access to Resources

School leaders identified a variety of resources/access to resources including receiving support from web-based training modules (e.g., about autism interventions) and video models. Others described specific types of resources such as a book study, during which teachers worked in small groups and focused on several topics (e.g., behavior management). Another participant described an “avatar lab” which allowed teachers to practice conversations with families. Finally, participants highlighted the importance of understanding how to distinguish accurate versus inaccurate information.

Professional Development Opportunities

School leaders discussed the importance of PD activities including “professional learning” or providing teachers with the necessary knowledge to teach a variety of topics. Participants also mentioned engaging in PD outside of the school to gain information about specific interventions and have greater exposure to different EBPs. They described attending weekly PD meetings as well as collaborative planning meetings, which included training and support for professional practices. One participant mentioned engaging with lead teachers; lead teachers were provided with a stipend and opportunities to gain leadership experience by providing PD support to newer teachers/staff.

Teacher Training

PD support also included teacher trainings that covered a range of topics, including “on-the-job” training with Board Certified Behavior Analysts, developing goals/objectives, addressing student behavior with parents, building classroom structure, behavioral consultation and support, live practice and support, access to conferences on specific topics (e.g., identifying EBPs and specific childhood disorders), and training on the use of specific interventions (e.g., social stories). Additionally, teachers received mentorship from “mentor teachers” and/or were given a half day to complete training from experts outside of the school. Leaders believed that teacher training was a very important component of providing EBP support.

Theme 2: Collaboration and Working in Teams

This theme captured facilitating factors related to collaboration and working in teams across staff and leadership at the schools.

School Supports

School supports were described as “robust” and included several built-in social support systems such as “team-led environments” and supportive teams, support from the district, use of walk-throughs and training, meetings to review EBPs, receiving mentorship, quarterly feedback from staff, and open communication across teachers and the team. Additionally, monthly community meetings among the teachers allowed school staff to focus on implementation each month.

Meetings

School leaders mentioned the use of weekly team meetings with teachers and paraprofessionals, special team meetings for focused discussions on a student or a student's programming, and meetings with administrators. Staff meetings covered a range of topics, including programming/interventions, developing goals/objectives, discussions around training/support needed to provide relevant programming, working with parents, identifying priorities, weekly planning, and preparing for behavioral concerns. Weekly meetings were instrumental in providing new teachers with opportunities to learn about EBPs and how to “apply these interventions…or procedures.” Other meetings included monthly meetings for different staff, including all-teacher meetings, meetings for assistant teachers, and meetings for program aides. Supervision meetings were held with leadership and curriculum/instructional/assessment specialists to address teaching practices, use of assessment data, IEPs, and classroom interventions.

Support From Nonteaching Staff

Nonteaching staff reportedly contributed greatly to the implementation of EBPs. School leaders mentioned teachers working in collaboration with (1) curriculum experts who helped set up programming and addressed academic issues, (2) behavioral teams who supported behavior interventions, (3) speech-language pathologists, and (4) committees that helped evaluate best practices and program-wide implementation. The important role of IEP chairs and autism coordinators was also described. Curriculum, instruction, and assessment teams supported teachers when developing IEPs and conducting assessments, often meeting weekly to discuss academic programming. Communication and collaboration between nonteaching staff and classroom teachers were noted as primary facilitators.

Culture/Climate

School leaders described the culture/climate of their organization as “unique,” “accepting,” “welcoming,” “positive,” and “safe.” There was great emphasis on the need for all stakeholders to collaborate, including embracing/celebrating individual differences while trying to be as adaptive and functional as possible. Some participants discussed the need for an “open door policy” to build better parent–teacher relationships. Leaders indicated that conveying appropriate expectations and encouraging continued education and PD among staff members immensely contributed to a successful climate/culture. Lastly, culture/climate was linked to the school's size and the “population served”; one participant explained “I think for our specific school that the culture climate is much different than in other schools just given the student population that we serve. I think that is very difficult to maintain a positive culture all the time.”

Family/Student-Level Factors

Participants discussed the need to collaborate with parents/caregivers when engaging in service delivery. Leaders attempted to connect with parents at the start of the academic year to ensure that parents felt comfortable about where their children were spending their day. Given that students come from different cultural backgrounds and have different learning styles, participants highlighted the need to collaborate with parents to meet unique student needs and ensure consistency in activities between home and school. Recommended steps for building positive relationships with families included: (1) introducing yourself to parents/caregivers during IEP meetings, (2) showing parents around to make them feel comfortable in an unfamiliar environment, and (3) conducting parent training to help address concerns at home.

Teacher/Staffing-Level Factors

Participants who viewed their school's culture/climate as positive also described their staff as “caring,” “tenacious,” “friendly,” and “problem-solvers.” School leaders highlighted the need to positively acknowledge their staff's efforts, rather than focusing on the negatives, to empower teachers and encourage them to attempt new practices. Participants emphasized the need to have staff with a collaborative mindset to work with parents and general educators, and to cater to individual student needs. Leaders discussed the importance of staff actively communicating with each other and working as a team, and of schools providing PD and training to socialize staff into the organizational culture and keep them updated.

What Barriers Impede EBP Implementation in Special Education?

Theme 3: Barriers to Implementation

This theme captured several factors, including lack of school support, culture/climate, organization-level factors, and teachers/staff-level factors.

Lack of School Supports

Some school leaders noted that there was a lack of support for the implementation of EBPs. While participants did not indicate from whom they perceived this lack of support, common issues included schools being overly focused on test scores and curriculum, as well as a lack of communication throughout the school that led to confusion among teachers.

Culture/Climate

One participant mentioned the difficulty in maintaining a consistently good culture/climate when there is a great deal of staff turnover. Staff turnover was mentioned often as being a consequence of the nature of special education. Another participant mentioned the culture/climate as being a “work in progress” and something they were “always striving to do better.”

Organization-Level Factors

A subcode of culture/climate captured several organizational challenges: a lack of consistent and transparent communication across the school, inadequate financial support, and limited resources. Participants highlighted the need for frequent and smooth communication between school-based teams to ensure that good ideas did not get overlooked and could be successfully implemented. One participant noted that “administrators can do a lot of things to maintain a positive work culture and collaborative environment.” One leader noted that being a Positive Behavioral Interventions and Supports school helped them shift the focus to school-wide positive approaches, which in turn helped their culture.

Teachers/Staffing-Level Factors

One major barrier to the implementation of EBPs was staff burnout, often attributed to staff turnover, especially among classroom staff (i.e., lead teachers and paraprofessionals). Leaders noted that teachers often felt stressed/overwhelmed due to understaffing and being overworked. Managing difficult behaviors was also noted as a reason that teachers and staff felt burnt out.

What Are Important and Feasible Implementation Strategies?

Table 3 presents descriptive analyses of school leaders’ perceptions of the importance and feasibility of implementation strategies. Participants’ ratings of strategies were restricted in range; no strategy was rated as “relatively unimportant” or “not at all feasible.” The response range was more restricted for importance ratings (3.8 ≤ M ≤ 4.8) than for feasibility ratings (3.2 ≤ M ≤ 3.9); 11 of the 15 strategies were endorsed as at least “moderately important.”

Table 3.

Descriptive Statistics for Implementation Strategy Importance and Feasibility

Implementation strategy | Importance: M, SD, Range | Feasibility: M, SD, Range | Interview ranking (n): 1, 2, 3
Use evaluative and iterative strategies 4.4 0.6 3.5 0.8
 Monitor the progress of the implementation effort 4.6 0.6 3–5 3.3 0.8 2–5 1
 Audit and provide feedback 4.4 0.5 4–5 3.6 0.8 2–5 3 1
 Assess for readiness and identify barriers and facilitators 4.2 0.7 3–5 3.6 0.7 2–5 1 1
 Conduct local needs assessment 4.2 0.7 3–5 3.5 0.8 2–5 1 2
Develop stakeholder relationships 4.1 0.8 3.5 0.9
 Build partnerships (i.e., coalitions) to support implementation 3.8 0.9 2–5 3.2 0.9 2–5 2 1 1
 Model and simulate change 4.5 0.6 3–5 3.6 1.0 2–5 4 2 1
 Identify and prepare champions 4.1 0.9 2–5 3.5 0.8 2–5 1 3
Train and educate stakeholders 4.4 0.7 3.7 0.9
 Conduct ongoing training 4.7 0.5 4–5 3.8 1.0 2–5 5 1
 Make training dynamic 4.3 0.9 2–5 3.7 0.8 2–5 4 1
 Provide ongoing consultation/coaching 4.8 0.4 4–5 3.8 0.9 2–5 2 3 3
 Create a professional learning collaborative 4.4 0.7 3–5 3.5 0.8 2–5 1 2 2
 Conduct educational meetings 4.0 0.8 2–5 3.9 1.1 2–5 2 2 1
 Use train-the-trainer strategies 4.2 0.7 3–5 3.4 0.9 2–5 1 1
Support educators 4.3 0.7 3.2 0.9
 Facilitate relay of intervention fidelity and student data to school personnel 4.3 0.7 3–5 3.2 0.9 2–5 1 1
Change infrastructure 4.0 0.8 3.8 0.9
 Change/alter environment 4.0 0.8 3–5 3.8 0.9 2–5 1 2 5

Note. Implementation strategies are organized by category (italicized text), as identified in the SISTER project (C. R. Cook et al., 2019). The possible response range for strategy importance was 1  =  relatively unimportant to 5  =  extremely important and for strategy feasibility was 1  =  not at all feasible to 5  =  extremely feasible. Interview ranking describes the frequency with which school leaders identified each strategy as one of the top three strategies during the qualitative interview. SISTER = School Implementation Strategies, Translating ERIC Resources.

Average importance ratings were highest for provide ongoing consultation/coaching (M  =  4.8), conduct ongoing training (M  =  4.7), and monitor the progress of the implementation effort (M  =  4.6). The least variability was observed in the ratings of provide ongoing consultation and coaching (SD  =  0.4). Build partnerships to support implementation, conduct educational meetings, and change/alter environment were rated as relatively less important (M  =  3.8, 4.0, 4.0, respectively), with slightly more variability (SD  =  0.9) in participants’ ratings.

Conduct educational meetings, change/alter environment, conduct ongoing training, and provide ongoing consultation/coaching were rated as relatively more feasible (M  =  3.9, 3.8, 3.8, 3.8, respectively). Ratings were the most variable (SD  =  1.1) for conduct educational meetings. Build partnerships to support implementation, facilitate relay of intervention fidelity and student data to school personnel, and monitor the progress of the implementation effort (M  =  3.2, 3.2, 3.3, respectively) were rated as relatively less feasible.

Overall, participant ratings indicated that provide ongoing consultation/coaching was both important and feasible. Monitor the progress of the implementation effort, in contrast, was rated as one of the more important strategies but was considered less feasible. Similarly, conduct educational meetings and change/alter the environment were rated, on average, as relatively less important but more feasible. Build partnerships to support implementation was rated as both less important and less feasible.

Discussion

School leaders, including administrative and practice leaders, play an integral role in the use of implementation strategies, which in turn support special education teachers in the implementation of EBPs. This convergent QUAL  +  QUAN mixed methods study is a first attempt to investigate school leaders’ perceptions of the facilitators and barriers to EBP implementation, particularly for students receiving special education, as well as the importance and feasibility of 15 implementation strategies.

Facilitators and Barriers Identified Through Qualitative Data

Facilitators primarily included access to resources related to interventions/EBPs (e.g., training and PD), as well as collaboration and teamwork (e.g., consistent communication between teachers, nonteaching staff, and family members; team meetings). Our findings about special education are consistent with the CFIR determinant framework (Damschroder et al., 2022a) that highlights the relevance of inner-setting factors. Specifically, facilitators related to Professional Development and Training (Theme 1) are consistent with multiple CFIR inner-setting domains. School leaders discussed the importance of available resources (money, training, and education) for EBP implementation, as well as access to knowledge and information (e.g., about the intervention/EBP and its contextual fit). Similarly, the second theme, Collaboration and Working in Teams, includes determinants in the domains of relational connections and communications, which refer to formal or informal relationships, networks, and teams as well as information-sharing practices within organizations.

School leaders’ perceptions of barriers also included inner-setting determinants. For example, participants reported that implementation of EBPs was not prioritized, consistent with the relative priority domain (i.e., individuals responsible for implementation may not view a novel innovation as a priority) as well as with implementation climate, which was classified as a critical antecedent for implementation in the revised CFIR and CFIR outcomes addendum (Damschroder et al., 2022a, 2022b). Lastly, structural characteristics were noted (e.g., high staff turnover and burnout) as factors hindering implementation.

Important and Feasible Implementation Strategies Identified Through Quantitative Data

Our participants’ ratings of the importance and feasibility of 15 selected implementation strategies were relatively restricted in range, paralleling prior research (Waltz et al., 2015). On average, leaders rated provide ongoing consultation/coaching, conduct ongoing training, and monitor the progress of the implementation effort as relatively most important, and provide ongoing consultation/coaching, conduct ongoing training, conduct educational meetings, and change/alter environment as relatively most feasible. Our findings are consistent with prior research investigating implementation strategy importance and feasibility in general education settings (Lyon et al., 2019) and with broader trends in school-based implementation research (Lyon & Bruns, 2019). Individual, education-focused implementation strategies, such as direct training (Fallon et al., 2017), consultation (Noell et al., 2005), and coaching (Pas et al., 2022), are commonly investigated in school-based implementation research and used during intervention trials (Moore et al., 2021). In research with school-based implementation consultants, education-focused strategies were also rated to be the most important and feasible (Lyon et al., 2019). However, school leaders’ mean importance and feasibility ratings were, on average, slightly higher in this study.

One strategy—build partnerships to support implementation—was rated to be relatively less important and feasible. This finding suggests that although researchers continuously seek to establish partnerships with schools, school leaders may view partnerships as less pertinent than other strategies. Misalignment in the priorities and timelines of researchers with those of community partners can impede partnership effectiveness. As such, researchers are encouraged to move toward embedded research approaches that prioritize partners’ needs (Beidas et al., 2022). School leaders considered two additional strategies—facilitate relay of intervention fidelity and student data to school personnel and monitor the progress of the implementation effort—to be less feasible. Despite the priority placed on data-based decision-making processes in schools (Lyon & Bruns, 2019), this finding indicates a need to improve school data systems and build capacity for school leaders and educators to effectively use data to guide implementation.

Integration of Qualitative and Quantitative Findings Through Mixed Methods

After completing the importance and feasibility surveys, leaders identified three strategies that they considered most important and/or feasible and discussed how these were implemented. All 15 strategies were described by at least one participant. The top strategies participants discussed varied somewhat from the quantitative ratings. Provide ongoing consultation/coaching and conduct ongoing training continued to be identified among the top three strategies by several leaders (n  =  8 and 6, respectively), whereas model and simulate change also emerged as one of the most frequently discussed strategies (n  =  7). Additionally, change/alter environment was discussed by eight leaders despite a relatively lower importance rating.

Factors Influencing Implementation

Consistent with qualitative findings about the importance of PD, training, and related resources, school leaders quantitatively rated and discussed strategies supporting EBP implementation through training/education to be some of the most important and feasible. Yet, leaders’ awareness of inner-setting factors that impede EBP implementation in special education was not prominently reflected in the strategies they rated as important for supporting EBP implementation. Research suggests that the organizational context is important for implementation (Forman & Selman, 2011; Locke et al., 2019a), and indicates that implementation strategies are most effective when tailored to relevant implementation determinants (Baker et al., 2015). Together, prior literature and our findings indicate that there is a need to consider a broader range of implementation strategies, including organizational-level strategies (e.g., bolstering school-wide communication).

Importance and Feasibility of Implementation Strategies

There were areas of convergence and divergence in school leaders’ perceptions of implementation strategies’ importance and feasibility (see Table 4). Qualitative and quantitative findings converged to support the importance and feasibility of provide ongoing consultation and coaching, as well as conduct ongoing training. School leaders described having “structures in place” to support consultation/coaching and that ongoing training is important in “[being] proactive” to “[keep] up with best practices and research.” While findings did not converge on the importance of change/alter the environment, qualitative and quantitative strands converged on its feasibility. Participants noted the ease of changing the physical environment quickly, including the layout of the classroom or materials available to students (e.g., fidgets). Leaders’ discussions of this strategy most often centered on supporting special education students’ safety and behavior rather than practice implementation (e.g., behavior-specific praise). Our findings diverged for monitor the progress of the implementation effort. Although this strategy was identified as one of the more important strategies in the quantitative data, only one leader discussed this as a top strategy. Our quantitative data indicate that school leaders perceived this strategy to be relatively less feasible.

Limitations and Future Directions

There are important limitations to note. First, participants in this study only rated 15 implementation strategies across five categories, potentially leading to the oversight of other relevant strategies. Future studies that examine school leaders’ perceptions of a comprehensive range of strategies using qualitative and quantitative methods can build upon our work. Second, this study's small sample of school leaders was predominantly female, White, and recruited from elementary schools in one Mid-Atlantic state. These leaders were recruited from a larger project focused on modifying an implementation strategy package to support EBPs with autistic students in special education. Thus, results may have limited transferability to other school settings or to racially, ethnically, and linguistically diverse leader and student populations, and may reflect the perceptions of leaders who value EBPs. Additional research with representative samples of school leaders is needed.

Third, in contrast to prior research with school principals, our sample included school leaders with varied roles (administrators and service providers) in either public or nonpublic school settings and who likely have varied experiences with implementing EBPs. This sample heterogeneity may have contributed to the variability and restricted ranges observed in leaders’ strategy ratings. Our findings should be interpreted with these characteristics in mind. Individuals’ perceptions of the implementation context may differ depending on their role (Locke et al., 2019b). Thus, future research with larger samples should investigate whether leaders’ perceptions of the implementation context and strategies vary by their role, school setting, or prior experiences, as well as whether leaders’ perceptions differ from perceptions of educators delivering classroom interventions. Finally, although the time frame for the study did not enable participants to review the accuracy of their transcripts and findings, members of an advisory board for the overarching project were able to provide feedback on the findings.

Implications

Despite their recognition of the importance of inner-setting factors for EBP implementation in special education, few of the school leaders in this study prioritized implementation strategies that may be capable of addressing these implementation determinants. Inner-setting factors and related strategies have received insufficient attention in school contexts (Lyon & Bruns, 2019), suggesting that school leaders may be less familiar with strategies addressing these determinants. Future researchers may collect data on leaders’ experiences with various strategies to investigate whether prior exposure affects how strategies are evaluated and prioritized. To fill potential knowledge or experience gaps, implementation science concepts can be integrated into training and certification programs for school leaders and in-service training opportunities (Forman et al., 2013; Pauling et al., 2021). Building school leaders’ knowledge and skills for identifying and addressing implementation determinants at multiple levels as part of these trainings is important.

While implementation scientists suggest that feasibility and importance ratings may guide the selection of implementation strategies (Lyon et al., 2019; Waltz et al., 2015), our mixed methods findings suggest that school leaders’ ratings of importance and feasibility may not directly align with their beliefs about the most important strategies for EBP implementation in special education. Prior work suggests that a range of factors influence the selection of implementation strategies to address target determinants (Waltz et al., 2019). Future research is needed that investigates school leaders’ decision-making processes surrounding implementation strategy selection (beyond feasibility and importance) and how implementation strategies address targeted determinants (Powell et al., 2019). Investigations that leverage explanatory sequential mixed methods designs (e.g., Bustos et al., 2021) may illuminate how importance and feasibility considerations, as well as leaders’ roles and prior experience with EBPs and implementation strategies, contribute to school leaders’ implementation planning and strategy selection. In practice, integrating a systematic process for identifying determinants and tailoring strategies (e.g., implementation mapping) may benefit school-based implementation of EBPs in special education (Fernandez et al., 2019). Importantly, the observed discrepancies in this study also signal potential incompatibility between the complex concepts, frameworks, theories, and terminology applied in implementation science and those used by school-based implementation partners (Beidas et al., 2022). Both research- and practice-based evidence, generated through stakeholder-driven partnerships, will be needed to inform the use of tailored implementation strategies and, ultimately, to support high-quality implementation of EBPs in schools.

Supplemental Material

sj-docx-1-irp-10.1177_26334895231220279: Supplemental material for “The perspective of school leaders on the implementation of evidence-based practices: A mixed methods study” by Stephanie A. Moore, Aksheya Sridhar, Isabella Taormina, Manasi Rajadhyaksha, and Gazi Azad, Implementation Research and Practice.

Footnotes

Author Contributions: Stephanie A. Moore, PhD (she/her), is an assistant professor of school psychology whose research is focused on school mental health systems and implementation science. She has conducted primarily quantitative and survey-based research and has experience with direct service provision in schools. She coconceptualized this study's aims, led the quantitative analyses, wrote the quantitative portions of the method and results, wrote the discussion, and reviewed all sections of the manuscript. Aksheya Sridhar (she/her) is a doctoral candidate in clinical science and focuses her research at the intersection of autism services and implementation science. She has conducted primarily qualitative and mixed methods research. She led the qualitative analysis, wrote the qualitative portions of the method and results, and wrote the discussion for this study. Isabella Taormina (she/her) is a research coordinator within the division of child and adolescent psychiatry and works primarily on school-based implementation research. She conducted qualitative coding, wrote the qualitative portion of the results, and wrote the introduction for this study. Manasi Rajadhyaksha (she/her) is a doctoral student in school psychology, with a research focus on school-based mental health and implementation science. She has conducted primarily quantitative and mixed methods research. She conducted qualitative coding and wrote the introduction for this study. Gazi Azad is an assistant professor in the division of child and adolescent psychiatry. She is the PI on the grant that funded this study. She conceptualized the aims, conducted the interviews, and assisted with all components of the manuscript (i.e., helped write the introduction, methods, and discussion, as well as supervised the quantitative and qualitative results).

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This study was supported in part by a Faculty-Community Award from the Urban Health Institute at Johns Hopkins University (74210). Preparation of this article was supported in part by grants from the National Institute of Mental Health (5K23 MH1119331). Open access publishing of this article was supported in part by the University of California libraries.

Supplemental Material: Supplemental material for this article is available online.

References

1. Baker R., Camosso-Stefinovic J., Gillies C., Shaw E. J., Cheater F., Flottorp S., Robertson N., Wensing M., Fiander M., Eccles M. P., Godycki-Cwirko M., van Lieshout J., Jäger C. (2015). Tailored interventions to address determinants of practice. Cochrane Database of Systematic Reviews, 4, Article CD005470. 10.1002/14651858.CD005470.pub3
2. Beidas R. S., Dorsey S., Lewis C. C., Lyon A. R., Powell B. J., Purtle J., Saldana L., Shelton R. C., Stirman S. W., Lane-Fall M. B. (2022). Promises and pitfalls in implementation science from the perspective of US-based researchers: Learning from a pre-mortem. Implementation Science, 17, Article 55. 10.1186/s13012-022-01226-3
3. Burns M. K., Ysseldyke J. E. (2009). Reported prevalence of evidence-based instructional practices in special education. The Journal of Special Education, 43(1), 3–11. 10.1177/0022466908315563
4. Bush T., Glover D. (2014). School leadership models: What do we know? School Leadership & Management, 35(5), 553–571. 10.1080/13632434.2014.928680
5. Bustos T. E., Sridhar A., Drahota A. (2021). Community-based implementation strategy use and satisfaction: A mixed-methods approach to using the ERIC compilation for organizations serving children on the autism spectrum. Implementation Research and Practice, 2. 10.1177/26334895211058086
6. Cook B. G., Odom S. L. (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79(3), 135–144. 10.1177/001440291307900201
7. Cook C. R., Lyon A. R., Locke J., Waltz T., Powell B. J. (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prevention Science, 20, 914–935. 10.1007/s11121-019-01017-1
8. Creswell J. W. (2022). A concise introduction to mixed methods research (2nd ed.). Sage Publications Inc.
9. Damschroder L. J., Aron D. C., Keith R. E., Kirsh S. R., Alexander J. A., Lowery J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, Article 50. 10.1186/1748-5908-4-50
10. Damschroder L. J., Reardon C. M., Opra Widerquist M. A., Lowery J. (2022b). Conceptualizing outcomes for use with the Consolidated Framework for Implementation Research (CFIR): The CFIR outcomes addendum. Implementation Science, 17, Article 7. 10.1186/s13012-021-01181-5
11. Damschroder L. J., Reardon C. M., Opra Widerquist M. A., Lowery J. (2022a). The updated Consolidated Framework for Implementation Research based on user feedback. Implementation Science, 17, Article 75. 10.1186/s13012-022-01245-0
12. Duong M. T., Bruns E. J., Lee K., Cox S., Coifman J., Mayworm A., Lyon A. R. (2021). Rates of mental health service utilization by children and adolescents in schools and other common service settings: A systematic review and meta-analysis. Administration and Policy in Mental Health and Mental Health Services Research, 48(3), 420–439. 10.1007/s10488-020-01080-9
13. Edmunds S. R., Frost K. M., Sheldrick R. C., Bravo A., Straiton D., Pickard K., Broder-Fingert S. (2022). A method for defining the CORE of a psychosocial intervention to guide adaptation in practice: Reciprocal imitation teaching as a case example. Autism, 26(3), 601–614. 10.1177/13623613211064431
14. Fallon L. M., Kurtz K. D., Mueller M. R. (2017). Direct training to improve educators’ treatment integrity: A systematic review of single-case design studies. School Psychology Quarterly, 33(2), 169–181. 10.1037/spq0000210
15. Fernandez M. E., Hoor G. A. T., Van Lieshout S., Rodriguez S. A., Beidas R. S., Parcel G. S., Ruiter R. A. C., Markham C., Kok G. (2019). Implementation mapping: Using intervention mapping to develop implementation strategies. Frontiers in Public Health, 7, Article 158. 10.3389/fpubh.2019.00158
16. Forman S. G., Selman J. S. (2011). Systems-based service delivery in school psychology. In Bray M. A., Kehle T. J. (Eds.), Oxford handbook of school psychology (pp. 628–646). Oxford University Press.
17. Forman S. G., Shapiro E. S., Codding R. S., Gonzales J. E., Reddy L. A., Rosenfield S. A., Sanetti L. M. H., Stoiber K. C. (2013). Implementation science and school psychology. School Psychology Quarterly, 28(2), 77–100. 10.1037/spq0000019
18. Guetterman T. C., Fetters M. D., Creswell J. W. (2015). Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Annals of Family Medicine, 13(6), 554–561. 10.1370/afm.1865
19. Individuals with Disabilities Education Act, 20 U.S.C. § 1400 (2004).
20. Landrum B., Garza G. (2015). Mending fences: Defining the domains and approaches of quantitative and qualitative research. Qualitative Psychology, 2(2), 199–209. 10.1037/qup0000030
21. Lawson G. M., Owens J. S., Mandell D. S., Tavlin S., Rufe S., So A., Power T. J. (2022). Barriers and facilitators to teachers’ use of behavioral classroom interventions. School Mental Health, 14(4), 844–862. 10.1007/s12310-022-09524-3
22. Locke J., Kang-Yi C., Frederick L., Mandell D. S. (2019a). Individual and organizational characteristics predicting intervention use for children with autism in schools. Autism, 24(5), 1152–1163. 10.1177/1362361319895923
23. Locke J., Lee K., Cook C. R., Frederick L., Vázquez-Colón C., Ehrhart M. G., Aarons G. A., Davis C., Lyon A. R. (2019b). Understanding the organizational implementation context of schools: A qualitative study of school district administrators, principals, and teachers. School Mental Health, 11(3), 379–399. 10.1007/s12310-018-9292-1
24. Lyon A. R., Bruns E. J. (2019). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11, 106–114. 10.1007/s12310-018-09306-w
25. Lyon A. R., Comtois K., Kerns S., Landes S., Lewis C. (2020). Closing the science-practice gap in implementation before it widens. In Albers B., Shlonsky A., Mildon R. (Eds.), Implementation Science 3.0 (pp. 295–313). Springer. 10.1007/978-3-030-03874-8_12
26. Lyon A. R., Cook C. R., Locke J., Davis C., Powell B. J., Waltz T. J. (2019). Importance and feasibility of an adapted set of implementation strategies in schools. Journal of School Psychology, 76, 66–77. 10.1016/j.jsp.2019.07.014
27. Melgarejo M., Lind T., Stadnick N. A., Helm J. L., Locke J. (2020). Strengthening capacity for implementation of evidence-based practices for autism in schools: The roles of implementation climate, school leadership, and fidelity. American Psychologist, 75(8), 1105–1115. 10.1037/amp0000649
28. Moore S. A., Arnold K. T., Beidas R. S., Mendelson T. (2021). Specifying and reporting implementation strategies in a school-based prevention efficacy trial. Implementation Research and Practice, 2. 10.1177/26334895211047841
29. Noell G. H., Witt J. C., Slider N. J., Connell J. E., Gatti S. L., Williams K. L., Koenig J. L., Resetar J. L., Duhon G. J. (2005). Treatment implementation following behavioral consultation in schools: A comparison of three follow-up strategies. School Psychology Review, 34(1), 87–106. 10.1080/02796015.2005.12086277
30. O’Brien B. C., Harris I. B., Beckman T. J., Reed D. A., Cook D. A. (2014). Standards for reporting qualitative research: A synthesis of recommendations. Academic Medicine, 89(9), 1245–1251. 10.1097/ACM.0000000000000388
31. Palinkas L. A., Aarons G. A., Horwitz S. M., Chamberlain P., Hurlburt M. S., Landsverk J. (2011). Mixed method designs in implementation research. Administration and Policy in Mental Health, 38(1), 44–53. 10.1007/s10488-010-0314-z
32. Pas E. T., Kaihoi C. A., Debnam K. J., Bradshaw C. P. (2022). Is it more effective or efficient to coach teachers in pairs or individually? A comparison of teacher and student outcomes and coaching costs. Journal of School Psychology, 92, 346–359. 10.1016/j.jsp.2022.03.004
33. Pauling S., Cook C., Pekel K., Larson M., Zhang Y. (2021). A cross-sectional survey of school administrators’ implementation of evidence-based practices and programs: Training, knowledge, and perceived barriers. Leadership and Policy in Schools, 22(3), 676–694. 10.1080/15700763.2021.1998545
34. Powell B. J., Beidas R. S., Lewis C. C., Aarons G. A., McMillen J. C., Proctor E. K., Mandell D. S. (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194. 10.1007/s11414-015-9475-6
35. Powell B. J., Fernandez M. E., Williams N. J., Aarons G. A., Beidas R. S., Lewis C. C., McHugh S. M., Weiner B. J. (2019). Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7, Article 3. 10.3389/fpubh.2019.00003
36. Powell B. J., Waltz T. J., Chinman M. J., Damschroder L. J., Smith J. L., Matthieu M. M., Proctor E. K., Kirchner J. E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, Article 21. 10.1186/s13012-015-0209-1
37. Reich S. M. (2010). Individualized education plan (IEP). In Clauss-Ehlers C. S. (Ed.), Encyclopedia of cross-cultural school psychology (pp. 540–542). Springer. 10.1007/978-0-387-71799-9_211
38. Stadnick N. A., Meza R. D., Suhrheinrich J., Aarons G. A., Brookman-Frazee L., Lyon A. R., Mandell D. S., Locke J. (2019). Leadership profiles associated with the implementation of behavioral health evidence-based practices for autism spectrum disorder in schools. Autism, 23(8), 1957–1968. 10.1177/1362361319834398
39. Suhrheinrich J., Melgarejo M., Root B. V., Aarons G. A., Brookman-Frazee L. (2021). Implementation of school-based services for students with autism: Barriers and facilitators across urban and rural districts and phases of implementation. Autism, 25(8), 2291–2304. 10.1177/13623613211016729
40. Waltz T. J., Powell B. J., Fernández M. E., Abadie B., Damschroder L. J. (2019). Choosing implementation strategies to address contextual barriers: Diversity in recommendations and future directions. Implementation Science, 14, Article 42. 10.1186/s13012-019-0892-4
41. Waltz T. J., Powell B. J., Matthieu M. M., Damschroder L. J., Chinman M. J., Smith J. L., Proctor E. K., Kirchner J. E. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10, Article 109. 10.1186/s13012-015-0295-0
