Abstract
Background
Few “intervention agnostic” strategies have been developed that can be applied to the broad array of evidence-based practices (EBPs) in schools. This paper describes two studies that reflect the initial iterative redesign phases of an effective leadership-focused implementation strategy—Leadership and Organizational Change for Implementation (LOCI)—to ensure its acceptability, feasibility, contextual appropriateness, and usability when used in elementary schools. Our redesigned strategy—Helping Educational Leaders Mobilize Evidence (HELM)—is designed to improve principals’ use of strategic implementation leadership to support the adoption and high-fidelity delivery of a universal EBP to improve student outcomes.
Method
In Study 1, focus groups were conducted (n = 6) with 54 district administrators, principals, and teachers. Stakeholders provided input on the appropriateness of original LOCI components to maximize relevance and utility in schools. Transcripts were coded using conventional content analysis. Key themes referencing low appropriateness were summarized to inform LOCI adaptations. We then held a National Expert Summit (Study 2) with 15 research and practice experts. Participants provided feedback via a nominal group process (NGP; n = 6 groups) and hackathon (n = 4 groups). The research team rated each NGP suggestion for how actionable, impactful/effective, and feasible it was. We also coded hackathon notes for novel ideas or alignment with LOCI components.
Results
Study 1 suggestions included modifications to LOCI content and delivery. Study 2's NGP results revealed most recommendations to be actionable, impactful/effective, and feasible. Hackathon results surfaced two novel ideas (distributed leadership teams and leaders' knowledge to support educators' EBP use) and several areas of alignment with LOCI components.
Conclusion
Use of these iterative methods informed the redesign of LOCI and the development of HELM. Because it was collaboratively constructed, HELM has the potential to be an effective implementation strategy to support the use of universal EBPs in schools.
Keywords: leadership, implementation climate, schools, organization
Plain Language Summary
Our research team designed a strategy (HELM) for school principals to improve the support they provide to staff to implement practices proven in research to improve student outcomes. We designed HELM by conducting focus groups with school district administrators, principals, and teachers. Participants were asked for their feedback on how to adapt an existing leadership strategy (LOCI) to the school context. After collecting this feedback, we held a meeting with 15 research and practice experts. During this meeting, the group of experts reviewed the focus group feedback and decided how to incorporate it into the design of the HELM strategy. We believe that collecting this feedback, and involving research and practice experts in interpreting and integrating it into the HELM strategy, will make HELM more effective in supporting school principals as they implement supports in their schools.
Introduction
A growing body of research suggests that organizational factors are particularly critical in the successful implementation of evidence-based practices (EBPs), defined as practices and programs shown by high-quality research to have meaningful effects on student outcomes (Aarons et al., 2012; Beidas et al., 2013, 2014, 2015; Bonham et al., 2014; Cook & Odom, 2013; Locke et al., 2019; Williams et al., 2019). Implementation leadership (i.e., specific leader behaviors that support EBP use; Aarons, Ehrhart, et al., 2014) and implementation climate (i.e., shared perceptions among implementers of the extent to which EBP use is expected, supported, and rewarded within the organization; Ehrhart et al., 2014) are malleable organizational factors that predict EBP use (Aarons, Ehrhart, et al., 2014, 2015, 2017; Beidas & Kendall, 2010; Brookman-Frazee & Stahmer, 2018; Damschroder et al., 2009; Locke et al., 2019; Williams et al., 2018, 2020). Implementation leadership behaviors are theorized to affect implementation outcomes by creating a positive implementation climate that subsequently affects teachers’ implementation behaviors (Aarons, Farahnak, et al., 2014; Williams et al., 2020), including intervention fidelity, the degree to which an EBP is implemented as designed (Proctor et al., 2011). Considering the resources invested in school organizational change (Darling-Hammond et al., 2009), new implementation strategies, defined as methods used to enhance the adoption, implementation, and sustainment of an EBP (Powell et al., 2015; Proctor et al., 2013), need to target school leaders, such as principals, who have a large role in decision-making and are most proximal to implementation efforts.
One organizational implementation strategy, Leadership and Organizational Change for Implementation (LOCI), targets general and implementation-specific leadership to improve implementation climate (Aarons et al., 2015, 2017). LOCI is a data-driven intervention with eight core components: (1) assessment and feedback (360° surveys on leadership and climate are administered to the leader and their staff, with feedback to the leader); (2) leadership development plan (the leader's goals for improving their and others' leadership and the implementation climate of their organization); (3) initial training (a two-day didactic and interactive training that covers effective implementation leadership); (4) individual coaching (weekly coaching calls to review progress and update the development plan); (5) group coaching (monthly calls with trainees to review progress and share strategies); (6) organizational strategy development (monthly meetings with first-level and upper-level leaders as well as other employees to develop an organizational implementation strategy and work on alignment between levels); (7) follow-up training (two one-day sessions to review content and feedback and to update leadership development plans); and (8) graduation (review of leaders' final feedback and celebration of progress). LOCI has been successfully tested in mental health and substance use settings, with leaders reporting higher perceived utility of LOCI for managing organizational change and for implementing EBPs, and clinicians reporting improved implementation leadership and climate for targeted EBPs (Aarons et al., 2015, 2017; Skar et al., 2022). LOCI also has recently been used in mental health and school programs to support therapists and teachers to deliver EBPs for autistic youth (Brookman-Frazee & Stahmer, 2018); however, LOCI was not systematically adapted or iteratively redesigned for use in schools, which is a unique implementation context. The perpetual shifts in staff, students, and operating policies and procedures, along with the emergent crises that beset public schools, suggest that LOCI must be redesigned if it is to support high-fidelity EBP use and achieve positive outcomes for students.
Despite considerable research documenting the criticality of building-level leadership—and principal leadership in particular—to implementation efforts designed to improve student social, emotional, behavioral, and academic outcomes (Hallinger & Heck, 1996; McIntosh et al., 2016; Pinkelman et al., 2015), little research has developed and evaluated strategies to support leaders' implementation of effective programs and practices in schools. Although implementation leadership has high relevance to universal social, emotional, and behavioral programming in schools, redesign of LOCI is needed to attend to the specific organizational structures, priorities, and personnel present in schools, considering LOCI was developed and tested outside of educational settings. Factors that affect the implementation process (e.g., staff turnover, emergent crises in schools, policies) differ in schools in comparison to community agencies (Brookman-Frazee et al., 2020), and, as noted above, LOCI was not systematically adapted to address the implementation determinants most proximal to EBP implementation in educational settings. The activities in the current study (e.g., usability testing) are driven by our theory of change (see Figure 1), which details how the redesigned Helping Educational Leaders Mobilize Evidence (HELM) implementation strategy will impact proximal mechanisms (e.g., leader behaviors) and distal outcomes (e.g., student behavior).
Figure 1.
HELM Theory of Change
Note. HELM = Helping Educational Leaders Mobilize Evidence; Lead. Dev. Plan = Leadership Development Plan; Ind. = Individual; Grp. = Group; Org. = Organizational
We used the Discover, Design, Build, and Test Framework (DDBT) to guide the iterative redesign of the LOCI implementation strategy. The DDBT framework combines user-centered design and implementation science to enhance the usability, contextual fit, and uptake of complex interventions, including implementation strategies (Lyon et al., 2019). The Discover Phase engages stakeholders to determine: (1) context requirements, so the redesign of the implementation strategy is useful and usable in elementary schools; and (2) required changes to ensure usability and contextual appropriateness. The Design/Build Phase uses the information gleaned from the Discover phase to iteratively build, evaluate, and refine a redesigned implementation strategy, and the Test Phase involves feasibility testing with a larger number of end-users and in their actual milieu.
To maximize the relevance and utility of HELM in schools, two studies were designed to gather expert input on the appropriateness (i.e., suitability/fit) of the original LOCI components to inform systematic adaptation. In both studies, the original components of LOCI (described above) were presented to stakeholders to elicit their feedback. In Study 1, focus groups were conducted with principals, district administrators, and teachers. Inclusion of these three stakeholder groups ensured input from the perspectives of individuals who (a) engage in strategic leadership (principals), (b) oversee strategic leadership (district administrators), and (c) are the targets of strategic leadership (teachers). Study 1 sought to answer the following research question:
What modifications are necessary to render LOCI components appropriate for use with elementary school principals and teachers?
Informed by the results of the first study (e.g., key themes referencing low appropriateness), Study 2 involved an “Expert Summit” in which research and practice experts took part in a structured information-generation and prioritization process (the Nominal Group Process [NGP]) and hackathon (both methods described below) to gather input about LOCI and the proposed adaptations, evaluate the assumptions of the theory of change (e.g., the links between implementation leadership and climate), and finalize adaptations to enhance their effectiveness, while simultaneously ensuring appropriateness and feasibility in elementary school settings. Summit participants represented a breadth of experience in: (1) principal leadership; (2) organizational leadership training and professional development; (3) implementation science; (4) evidence-based social, emotional, and behavioral practices; and (5) constraints and opportunities in educational settings. Study 2 sought to answer the following research questions:
What modifications are necessary to ensure that HELM's theory of change is clear, comprehensive, and accurate?
What additional refinements or additions to LOCI intervention components are needed to enhance impact on its proximal mechanisms of change (e.g., implementation climate and citizenship) and distal outcomes (e.g., fidelity, penetration)?
Study 1
Method
Participants
Fifty-nine district administrators, principals, and teachers were recruited from two school districts in one Pacific Northwestern and one Midwestern state in the USA (see Table 1). Fifty-four (91.5%) participants provided consent and participated in the focus groups. Of the five participants who were recruited and did not participate, three district administrators were unavailable during the scheduled focus groups and one district administrator and one principal did not provide consent. The final sample included 14 district administrators, 21 principals, and 19 teachers (see Table 2).
Table 1.
Study 1 Sample Counts by State, Role, and Recruitment Status
| | Recruited | Participated |
|---|---|---|
| District administrators | | |
| State 1 | 10 | 7 |
| State 2 | 8 | 7 |
| Principals | | |
| State 1 | 12 | 11 |
| State 2 | 10 | 10 |
| Teachers | | |
| State 1 | 7 | 7 |
| State 2 | 12 | 12 |
| Total | 59 | 54 |
Table 2.
Study 1 Participant Demographics
| Participants (n = 54) | n | % |
|---|---|---|
| Gender | | |
| Male | 11 | 20% |
| Female | 43 | 80% |
| Latinx | | |
| Yes | 8 | 15% |
| No | 46 | 85% |
| Race | | |
| Asian | 1 | 1.9% |
| Black or African American | 5 | 9.2% |
| White | 44 | 81.5% |
| Other | 4 | 7.4% |
| Age (average) | 42.7 | |
| Years of experience (average) | 7.9 | |
Procedures
Following Institutional Review Board approval for the protection of human subjects, we worked with our district partners to identify potential participants, met with them to describe the study, and obtained informed consent. As part of the DDBT Discover Phase, we scheduled convenient times to conduct separate virtual focus groups with each set of participants to identify redesign targets that would yield an efficient and contextually appropriate version of LOCI for schools. Two weeks prior, participants received written materials describing LOCI and its components to review before attending the focus groups. Focus groups were facilitated by the first (female), eighth, and last (both male) authors, who were academic researchers at the time of the study and all of whom were formally trained in qualitative methods, including focus group facilitation (the same was true for Study 2). Facilitators presented a review of the LOCI components and led a discussion of the questions detailed in the semi-structured interview protocol (see below). Consistent with recommendations from the literature, focus groups lasted approximately 90 min (Morgan, 1997; Vaughn et al., 1996) and included 7–12 participants each. The groups were audio recorded for transcription. Participants were offered $50 as an incentive for their time.
Two systematic and comprehensive semi-structured focus group protocols were developed: one for district administrators/principals and another for teachers. Each protocol emphasized: (1) the relevance and utility of the original LOCI intervention and its overarching objectives in schools; (2) the feasibility and appropriateness of each LOCI component; (3) strategies or modifications for ensuring the appropriateness, impact, and scalability of the HELM adaptation in schools (e.g., identifying formal and informal school implementation leaders); and (4) alignment and misalignment between the LOCI participation expectations and standard professional development processes for principals and teachers. District administrators and principals provided feedback on all original LOCI components; teachers provided feedback on components most relevant to their role (360° assessment, leadership development plan, and distributed leadership teams).
Data Analysis
Because the purpose of Study 1 was to explore the full range of possible adaptations to LOCI and to improve the feasibility and contextual appropriateness of HELM, a conventional content analysis approach (Hsieh & Shannon, 2005) was used, in which meaning is derived from the content of verbal communications and no a priori codes are identified before reviewing transcripts. Focus group recordings were transcribed, anonymized, and saved as electronic documents for the coding team to review. The first author, who has extensive training in and experience leading qualitative research, led the development of the coding scheme using a rigorous, systematic, transparent, and iterative approach. A codebook was iteratively developed via close reading of the initial transcripts (i.e., an inductive approach; Bradley et al., 2008), code generation, and group meetings. The codebook included operational definitions of each code, examples of the code from the data, and guidance on when to use and not use the code (see Tables 3 and 4). After a stable set of codes was reached, four raters independently coded each transcript and resolved disagreements through consensus dialogue (DeSantis & Ugarriza, 2000; Hill et al., 2005). No member checking was done. Inter-rater reliability was calculated using percent agreement (98% across all transcripts).
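To make the reliability index concrete, percent agreement is simply the share of coded segments on which raters assigned the same code. The following is a minimal sketch with hypothetical codes, assuming one code per segment for illustration; it is not the study's actual analysis script.

```python
# Minimal sketch of percent agreement between two coders.
# Assumes each transcript segment receives exactly one code per coder;
# the code labels and data below are hypothetical.

def percent_agreement(codes_a, codes_b):
    """Proportion of segments on which two coders assigned the same code."""
    if len(codes_a) != len(codes_b):
        raise ValueError("Coders must rate the same set of segments")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

coder_1 = ["coaching", "assessment", "end_users", "assessment", "coaching"]
coder_2 = ["coaching", "assessment", "end_users", "alignment", "coaching"]
print(f"{percent_agreement(coder_1, coder_2):.0%}")  # -> 80%
```

With more than two raters, the same idea extends by averaging pairwise agreement across all rater pairs, which is one common convention.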
Table 3.
Study 1 District Administrator and Principal Codebook
| Code | Definition and examples |
|---|---|
| 1. Coaching | DEFINITION: HELM coaching components, functions of coaching, method of coaching, how coaching should be delivered. Example Codes: private coaching, coach needs to create trust/non-judgmental, coach training (i.e., principal to principal), tailor coaching, etc. |
| 2. End users | DEFINITION: Who should be receiving HELM, who should be involved with HELM, and how/what will their involvement look like. Example Codes: teams at district and building need to work together, connection with districts to focus on importance of leadership, etc. |
| 3. New versus seasoned leaders | DEFINITION: Differentiation between new and seasoned leaders needs to be clearly articulated in HELM. Example Codes: differentiation between new and seasoned leaders (adapting to leader based on experience), a new leader will be more likely to engage than a seasoned leader, how to differentiate goals for new vs seasoned principals, concerns around use of HELM for first year principals |
| 4. Assessment | DEFINITION: All things related to HELM assessment including survey type, frequency, duration, how to administer, when to administer, language around assessment, why we need to do an assessment, who to include in the assessment. Example Codes: simplify assessment/data collection tools/measures, growth mindset/frame as growth potential, tool for self-reflection rather than test/evaluation, low-burden, adaptive based on experience, gather data from other informants (non-traditional stakeholders), etc. |
| 5. HELM intervention components | DEFINITION: Recommendations for HELM components including duration, time frame, focus/goals, data-driven decisions, feedback, content. Example Codes: make sure HELM is used in context of prioritized EBP, prioritization must come from collective voices (communal ownership), data-driven decisions about EBP selection, shared decision-making, framing of HELM, etc. |
| 6. Alignment with extant principal training | DEFINITION: How does/will HELM align with other principal leadership training/professional development available to improve feasibility. Example Codes: consistency with other principal training, align HELM constructs with existing language being used, align with existing principal leadership development and training, explain the why for each HELM construct/component (statement of clarity), etc. |
Table 4.
Study 1 Teacher Codebook
| Code | Definition and examples |
|---|---|
| 1. Assessment & feasibility | DEFINITION: How to make the HELM assessment more feasible to ensure participant completion without excess burden. Example Codes: frequency of assessments (3–4 times per year), over what period of time, spacing of assessments to allow enough time to see change, reflection that is meaningful that can be captured on a survey, who should get survey, specify the amount of time for completion |
| 2. Survey administration | DEFINITION: Logistics of when/how to administer survey. Example Codes: utilizing existing time or structures to increase buy-in, protected time to do assessments (e.g., staff meetings), use online format to increase accessibility |
| 3. Purpose | DEFINITION: What we say to participants to enhance their likelihood of assessment completion. Example Codes: addressing buy-in of teachers/survey respondents, explanation of why we are doing these surveys to increase buy-in |
| 4. Feedback report | DEFINITION: What goes into the feedback report. Example Codes: explanation of why principals get this feedback, recommendations (concrete, actionable steps for leaders to develop their plans), make the feedback factual and evidence-based, report should be strengths-based |
| 5. Survey contents | DEFINITION: Items, questions, surveys (ILS/ICS) that go into the assessment. Example Codes: clearly articulate the referent (principal) for whom the survey is about, allow for open-ended responses, etc. |
| 6. Provision of feedback to leaders | DEFINITION: How we present the data to principals. Example Codes: anonymously/confidential (allow for more honest feedback), ensuring the report maintains participants' relationship with the leader (to prevent negative repercussions), aggregate all responses for additional anonymity, etc. |
| 7. Strategies for leaders to promote staff engagement, value & respect | DEFINITION: Things that principals/leaders can do to support staff, foster engagement, cultivate value, and show respect. Example Codes: building personal relationships early in the year, fostering a community that cares, specific praise/affirmation, recognition/acknowledgment, gratitude, appreciation, being present in the classroom (observe and provide feedback to teachers), etc. |
| 8. Strategies for EBP implementation | DEFINITION: Things that principals/leaders can do to support EBP adoption and use. Example Codes: protected time to plan and prep for EBP, offer professional development in EBP, use data to enhance buy-in for implementation, active administrator involvement in EBP trainings (leader goes to the training too for collaboration not delegation), etc. |
Results
HELM Content Modifications
Focus group participants across all roles concentrated primarily on content modifications related to the original LOCI 360° assessment and coaching. Participants strongly encouraged the research team to clearly communicate the "why" of the 360° assessment to ensure respondents provided an accurate representation of the school. One district administrator shared,
I think … it's really important for the people who are completing the survey to know … what is the purpose of the goal for this survey, how will it be used, because the responses that people give will depend on if they really understand the goal or the intended outcome.
Teachers felt similarly, with one teacher expressing,
I think, too, maybe just starting with the why with your staff, too, and I think as a staff member, when you feel really supported, you want to reciprocate that. If our principal [is] looking for feedback, I would definitely want to put in the time and thoughtfulness to respond to that. But if you start with why we’re doing this and what we’re hoping to gain … that's probably helpful for staff.
Administrators and principals also suggested the non-evaluative nature of the 360° assessment be clearly communicated to garner buy-in from principals who are expected to review and act upon their assessment data. One district administrator reflected,
I would not want this to get confused with principal evaluation … because I think most of us, when we think of 360° assessments, it's very much in an evaluative framework. Is it going to affect my compensation? Is it going to affect my career?
Principals also discussed ways to increase the effectiveness of HELM coaching. School leaders noted the importance of HELM coaches having experience as a principal to ensure they understand principals’ roles, responsibilities, and day-to-day activities, which would build buy-in and trust. One principal explained,
At least for me, [it] is someone [who has] done the job. I approach coaching differently when I know they have walked in my shoes … because I approach feedback from someone who's done the job way differently than someone who hasn’t.
Several principals also suggested prioritizing coach-principal relationship-building to lay the foundation for effective coaching. One principal explained the importance of relationships to giving and receiving feedback, noting, “I think starting with the challenge of building relationships with them … because all of us are different … how we take feedback or how we want to receive praise … are pieces that I would want to build on.”
Finally, principals and district administrators suggested aligning HELM content with existing systems, structures, practices, and policies. One principal noted that alignment with existing evaluation systems would be, “…really nice because if you can make it as much about what you’re already doing and tying it to things that you’re trying to implement so that you can move forward with whatever feedback you receive would be helpful.” A district administrator shared a similar sentiment,
… if this was built into a principal meeting or a structure we already have versus … in isolation, it makes sense and I understand it, and when I think of this in the context of … all of the moving parts, then it feels like it's overwhelming or it's big. And so I would think we could layer it into principal professional development or say improvement planning, as an element of that. There's existing structures that this could be woven into versus an add on.
HELM Delivery Modifications
Several actionable suggestions emerged related to HELM delivery. The most prominent theme was to continuously emphasize the “why” behind HELM. Principals felt the “why” would paint the “big picture” and make clear “… how does this relate to my current state of being an administrator.” Another principal pointed out that re-orienting school leader teams to the “why” could help them know, “…it's going to be additional work, but can it be work that helps me now …” Taking a broader view, one district administrator shared, “I guess I’m trying to emphasize that it really needs to be systemic, that people understand the why behind the importance of this and what better outcomes we can get if we do implement things in this fashion.”
Principals also proposed several suggestions to increase the flexibility of coaching to meet the ebbs and flows of the principalship across time. Several principals suggested flexibility in the frequency of coaching sessions, with one explaining, “So, the frequency changes if you’re going through an implementation cycle, there might be … at one point in the year you want it to be weekly or monthly and then later on, [coaching] might be more sporad[ic].” Flexibility in the duration of coaching sessions also was discussed. One principal reflected,
…weekly 15-minute coaching … That seems a lot when I just think about the year and how it flows, like I’ve done coaching before where we’ve done it every other week. And at times it feels really helpful around certain projects and at other times it's kind of like we don’t have much to check in about.
Relatedly, participants shared feedback indicating a preference for in-person coaching whenever possible: "Because that's the other thing I find very helpful is that coaches actually come on site and see what's going on."
Discussion of Modifications Informed by Study 1
The research team made the first round of iterative changes to LOCI to develop HELM for use in schools based on the recommendations from the focus groups. Specifically, the purpose and non-evaluative nature of the 360° assessment were communicated to all participants in various formats and settings (e.g., recruitment materials, participant disclosure form, HELM training). HELM coaching also was modified to ensure that (a) HELM coaches had experience as a principal, (b) one of the first two coaching sessions happened in person, and (c) the first two coaching sessions were partially focused on building strong relationships with principals. In terms of delivery, each HELM component and learning target was cross-walked with the principal evaluation framework of the National Educational Leadership Standards (NELS), the professional standards that define the nature and quality of the principal profession and guide how principals are prepared, hired, developed, supervised, and evaluated in the United States (US). The NELS was used in the development of HELM to ensure alignment between HELM and the duties and responsibilities of the principalship in US public schools.
Additionally, HELM recruitment, training, and coaching materials were all modified to underscore the "why" behind HELM as an implementation strategy. Altogether, Study 1 enabled modification of integral HELM components (i.e., 360° assessments, coaching) to ensure the strategy's feasibility, flexibility, and usability with principals and distributed leadership teams in elementary schools.
Study 2
Method
Participants
Eighteen implementation research and practice experts, who were known contacts of study team members or snowball sampled from those contacts, were invited via email to participate in Study 2. Of these, 15 (83%) provided consent and participated in the Expert Summit. Two participants did not respond to recruitment emails, and one was unavailable during the scheduled time. Participants identified themselves as experts in implementation research (n = 5), practice (n = 8), and research and practice (n = 2). Five attendees had previous experience with LOCI as a coach or trainer. See Table 5 for sample demographic information.
Table 5.
Study 2 Participant Demographics
| Participants (n = 15) | n | % |
|---|---|---|
| Gender | | |
| Cisgender female | 10 | 66.7% |
| Cisgender male | 5 | 33.3% |
| Latinx | | |
| Yes | 1 | 6.7% |
| No | 14 | 93.3% |
| Race | | |
| Asian | 2 | 13.3% |
| Black or African-American | 3 | 20.0% |
| White or Caucasian | 8 | 53.3% |
| Multi-racial | 2 | 13.3% |
| Age (average) | 45.2 | |
Procedures
We convened a one-day Expert Summit over Zoom. The summit provided an opportunity for focused expert feedback and discussion about the overall project and proposed adaptations to LOCI to maximize its effectiveness. After reviewing the overall purpose and objectives of the project (documents were sent to participants three weeks prior to the summit; see pre-summit procedures below), we presented the following topics: (1) implementation leadership and climate; (2) EBPs in the school setting; and (3) principal leadership to support social, emotional, and behavioral-focused EBP implementation. Two breakout sessions with three groups each were held to gather specific qualitative input from participants regarding: (1) HELM's theory of change; (2) adaptations to enhance LOCI content/activities for each of the proposed mechanisms of change (e.g., implementation leadership and climate), including strategies through which principals can activate and leverage distributed leadership structures; and (3) recommendations to maximize the impact, feasibility, and contextual fit of LOCI within standard professional development structures/sequences (e.g., length, number of consultations, combination of general and implementation-specific leadership content) in authentic elementary settings. An additional breakout session with four groups was used to generate novel ideas that might be integrated into the HELM strategy. Participants with varying expertise were evenly distributed across the groups in each breakout session. Summit participants received $500.
Pre-Summit Procedures. All Summit participants provided their demographic information via a secure pre-Summit online survey administered in Qualtrics. Prior to completing the survey, they were provided a digital set of written and visual materials for their review, including a description of the theoretical foundations, objectives, and key components of LOCI, its leader training and consultation procedures, and themes/adaptations identified from Study 1 coding. Participants had three weeks to review the materials and complete the survey.
Summit Procedures. The first two breakout sessions used a Nominal Group Decision Making Process (Delbecq et al., 1975), which takes participants through a five-step process (introduction and explanation, silent idea generation, sharing ideas, group discussion, and voting and ranking) to generate a set of endorsed recommendations. The purpose of NGP is to introduce an issue to the study group, generate information in response to the issue, and prioritize that information through a structured process of group discussion and voting (Potter et al., 2004).
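To illustrate the final voting-and-ranking step, endorsed suggestions can be prioritized by summing rank-based points across participants' ballots. The sketch below assumes a top-three ballot with 3/2/1 points per rank; the summit's exact ballot format and point scheme are not specified here, so treat this purely as an illustration.

```python
from collections import defaultdict

# Minimal sketch of NGP vote tallying: each participant ranks their top
# suggestions, higher ranks earn more points, and suggestions are ordered
# by total points. Ballot contents and scoring are hypothetical.

ballots = [
    ["align levels", "45-min coaching", "team LDP goals"],
    ["45-min coaching", "align levels", "we-language survey"],
    ["team LDP goals", "45-min coaching", "align levels"],
]

points = defaultdict(int)
for ballot in ballots:
    for rank, suggestion in enumerate(ballot):  # rank 0 = top choice
        points[suggestion] += len(ballot) - rank  # 3, 2, 1 points

# Rank order the endorsed suggestions for consideration.
for suggestion, score in sorted(points.items(), key=lambda kv: -kv[1]):
    print(score, suggestion)
```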
The hackathon methodology was created to support the rapid development of information technology advances, though it has been widely adopted to promote innovation across various fields (Rys, 2023). Standard hackathons are time-limited (e.g., 24 h), focused on a specific challenge or theme, and result in a demonstration of the proposed innovation (Komssi et al., 2015). The third breakout session used a modified hackathon in which four groups of 4–5 participants each were prompted to develop an innovation that focused on elementary school leaders, supported school leaders to create the conditions for successful implementation of any EBP that enhances student outcomes, and was feasible for use in authentic educational contexts. All groups were given a one-page document that described a hackathon, the success criteria (i.e., impact, innovation, pragmatism), and what needed to be accomplished. Each group had 45 min to brainstorm and develop a pitch for their proposed innovation, which was then shared with the larger group.
Post-Summit Analysis. All recommendations that received votes during the NGP were rank ordered for consideration when revising LOCI. Transcriptions from all three breakout sessions were coded using qualitative content analysis procedures similar to those described in Study 1, focusing on ways to enhance the impact of LOCI (and the HELM adaptation) on its theorized mechanisms of action (e.g., implementation leadership). Transcripts for each NGP and the hackathon were reviewed by the second and third authors to generate a set of codes for each, resulting in three separate code sets (see Tables 6 to 8). The same two authors independently coded all transcripts for each event, meeting to resolve disagreements via consensus dialogue. No member checking was done.
Table 6.
NGP 1 Codes
| Code | Definition and examples |
|---|---|
| Leadership | Leadership structures and processes to enhance the Theory of Change and/or LOCI. I. Account for and/or include guidance around how (in)formal leaders (i.e., distributed leadership teams) impact EBP implementation. |
| Contextual factors/Constraints | Contextual factors that are relevant to HELM (i.e., school context) and were unrepresented in the Theory of Change. I. General and implementation culture in schools. II. EBP implementation history of schools. |
| Design & language | Aspects relating to HELM and/or the Theory of Change design and language. I. Prioritize alignment across multiple levels (e.g., district, school). II. Specify responsibilities and necessary qualities of HELM coaches or that facilitate effective coaching. |
Table 7.
NGP 2 Codes
| Code | Definition and examples |
|---|---|
| Assessment | Description: Aspects having to do with surveys that are completed about the LOCI trainee's leadership and the EBP implementation climate of their school by: (1) the leader (i.e., administrator) as a self-assessment; (2) staff supervised by the leader (i.e., teachers); and (3) the leader's supervisor (i.e., district administrators). I. Phrase instrument questions/statements around participants' context (e.g., "our school," "we"). II. Ensure assessments are tied to improvement cycles. III. Change timing to better align with the school calendar. |
| Leadership Development Plan (LDP) | Description: Aspects having to do with trainers and coaches who work with leaders to review their personalized assessment data, identify strengths and areas for development, and set a timeline for issues to be addressed immediately and those to be addressed in future coaching sessions. I. Build in ways for coaches to crosswalk the goals being established for the leaders and the evaluation framework used to evaluate principals. II. If the intervention is a team support, have each distributed leadership team member hold individual goals that contribute to each team goal. III. Provide coaches with prompts to get leaders continuously thinking about organizational change, staff empowerment, etc. |
| Coaching | Description: Edits to LOCI's weekly 15-minute coaching calls, which are intended to provide support, keep LOCI participants on track with their goals and development plans, and iteratively update plans with new goals. Coaches should have principalship experience. I. Ensure sessions are long enough (∼45 min) to provide enough time to process and focus on interactions. II. Coaching sessions should incorporate, at least once a month, other individuals with leadership roles (i.e., distributed leadership teams). |
Table 8.
Hackathon Codes
| Code | Definition and examples |
|---|---|
| Intervention design, training, and materials | Aspects of intervention design, training, and materials that are necessary for the intervention to be effective among school leaders. I. Incorporate different learning theories, approaches, and/or multimedia materials that coaches/leaders use and/or that promote coach/leader learning related to EBP implementation. II. Training coaches and/or leaders to effectively carry out their roles related to EBP implementation. |
| Leadership | Elements, qualities, and characteristics that could be developed, leveraged, and/or inform (positively, negatively, neutrally) leaders' EBP implementation. I. Leaders' knowledge of a prioritized EBP so that they can better support EBP implementation among teachers and staff. II. How to build, manage, and/or leverage a distributed leadership team to support EBP implementation. |
| Systems and structures | Systems and structures across levels of influence (e.g., school, district) intended to support EBP implementation. I. Developing and/or widely implementing technology in new and/or innovative ways to support EBP implementation. II. Use of measurement and data to support EBP implementation (e.g., leaders complete self-assessments of their learning to evaluate and improve upon their implementation leadership; create an implementation database). |
Results
NGP 1: HELM Theory of Change
Participants in the first NGP were asked to provide feedback on HELM's theory of change. Feedback centered on two types of modifications to HELM content and/or delivery that were not surfaced in Study 1. First, participants suggested that HELM content better reflect how distributed leadership teams could be organized and leveraged to support EBP implementation. There was a strong emphasis on learning more about implementation-specific leadership behaviors. One participant prompted the research team to, "Take into account how formal and informal leaders can impact the process. Everyone needs to be on the same page." This feedback aligned with another participant who wondered, "How do we capture the whole team (each member has a different role – so how do you integrate that and show it works together/not siloed)?" Second, several participants felt that aligning personnel across different levels of influence (e.g., district, school) would result in a unified team approach that would maximize the effectiveness of HELM components. One participant called for, "more alignment processes to create a team throughout this process. Recognize and break down the levels as much as you can. Intentional alignment process to allow school community to function as a team." Similarly, another participant suggested that HELM, "foster authentic collab[oration] between administrators and teachers, first level leader [and] upper management."
NGP 2: HELM Components
Participants shared several pieces of feedback to improve the feasibility and contextual appropriateness of the following HELM components.
360° Assessment. Specific modifications to the timeline for 360° assessment data surfaced. The original timeline for HELM included four time points of 360° assessment data collection, with the fourth scheduled for September of the academic year following the first year of initial EBP and HELM implementation. Participants felt that HELM delivery would be most feasible if it spanned one academic year (August to June) and included only three time points of data collection (beginning, middle, and end of year). Participants also suggested that the 360° assessment include language like “we” and “our school” to better capture organizational processes as opposed to how individual educators perceive those processes related to their own EBP implementation.
Coaching. Feedback suggested that 15 min per coaching session was not enough time to review progress, process feedback, and set new goals based on 360° assessment data review. Participants felt that 45-min coaching sessions would be more conducive to the reflective processes and coach-leader interactions needed to identify and troubleshoot barriers to goal attainment and/or to identify new goals and action steps.
Leadership Development Plan (LDP). Feedback also suggested that the LDP should be modified to better reflect team processes for schools with a distributed leadership structure. Participants wanted the LDP to include space to log and track individual team members’ goals that would ultimately contribute to the achievement of an overarching team goal.
Hackathon
Most of the brainstorming and ideas generated by hackathon groups aligned with existing LOCI components. For example, most groups (n = 3) invoked diverse learning approaches and materials. One group prioritized, “…using a coherent framework and human-centered design principles through which the learning framework is designed that allows people to acquire specific competencies linked to … improving … leadership.” Another group focused on, “Self-assessment: where are [leaders] in their learning journey to acquire competencies consistent with the coherent framework?” Relatedly, one group's innovation to “Create a database for implementation data akin to student information systems that are ubiquitous. What is happening? Where is it happening? Who is it happening to/with?” was aligned with LOCI's extensive collection and use of implementation-related data.
Hackathon results also surfaced two innovative ideas that led to HELM modifications. First, one group reflected on distributed leadership teams, pondering, “How can I intentionally build a distributed leadership model so that [leader] turnover is less disruptive to the school as a whole?” Prior feedback led to suggestions about modifying HELM content to include distributed leadership teams, not about how to develop one. The second piece of feedback emphasized the knowledge leaders need to support educators to effectively implement EBPs. Specifically, one group wanted to, “Ensur[e] that leaders have the on the ground knowledge for what it takes in order to support those that are implementing the EBPs.”
Discussion of Modifications Informed by Study 2
We made a second round of iterative changes to LOCI to further develop HELM for use in schools based on the feedback from Study 2. First, it was necessary to streamline the HELM training to fit within the school calendar year and the constraints of competing responsibilities (e.g., availability of teachers and administrators to attend training). The Theory of Change discussion suggested that implementation-specific leadership behaviors, with an emphasis on distributed leadership, were critical and more proximal to our target mechanisms; therefore, we removed the transformational and transactional leadership model from the training. Second, a worksheet was developed to help school leaders determine how to develop or improve their existing distributed leadership team. Further, the LDP was adapted to log, track, and link individual team members' goals to the shared team goal. The coaching manual also was edited to reference distributed leadership teams throughout. In response to calls for multilevel alignment, the decision was made to invite district administrators to the HELM training to provide opportunities for shared learning that might promote teamwork across levels of influence. Relatedly, school leaders were required to attend the EBP training to ensure they had the knowledge necessary to support their staff's EBP implementation. Changes also were made to the 360° assessment such that the proposed fourth time point of data collection was eliminated, and the language in selected instruments (i.e., School Implementation Leadership Scale and School Implementation Climate Scale) was edited to "our principal" or "this school." Finally, more flexibility was built into HELM coaching to meet school leaders' constantly changing needs and availability. For example, a suggested dosage of coaching (i.e., time) was recommended, but the scheduling of that time was determined by the coach and school leader. The culmination of Studies 1 and 2 led to a modification blueprint; differences between LOCI and HELM are highlighted in Table 9.
Table 9.
Differences Between LOCI and HELM
| LOCI | HELM modification |
|---|---|
| 360° assessment and feedback | Focused exclusively on (1) School Implementation Leadership Scale, and (2) School Implementation Climate Scale |
| Initial training (2 full days) | Shortened to two 2-h online sessions held one week apart. Day 1: overview of HELM, challenges to implementation, leveraging leadership to overcome challenges, implementation leadership, distributed leadership, implementation climate, reflections; Day 2: review of 360° feedback, debrief review of feedback reports, leadership development planning, goal setting, reflection |
| Leadership Development Plan | Template developed in partnership with school stakeholders |
| Individual coaching (15–20 min weekly) | Provided monthly for 1 h by former principals with experience implementing EBPs in schools |
| Group coaching (monthly) | Optional (∼1 h) |
| Organizational strategy meetings (monthly) | Shortened to twice during the school year (Fall & Spring) and includes district administrators and principals |
| Booster sessions (n = 2) | Repurposed as Professional Learning Collaboratives focused on aligning HELM strategies with principles of the National Educational Leadership Standards and EBP sustainment |
| Graduation | No change |
Discussion
This study applied the DDBT Framework to systematically and iteratively redesign an organization-focused implementation strategy in partnership with community stakeholders to enhance its acceptability, appropriateness, feasibility, and usability in elementary schools. The collaborative redesign process included different methods (i.e., focus groups, NGP, and hackathon), and the culmination of Studies 1 and 2 generated a modification blueprint that ultimately informed the redesign of LOCI to develop HELM. The HELM prototype is streamlined in comparison to LOCI to fit within the school calendar year and includes features specific to schools (e.g., former principals as coaches, distributed leadership teams, alignment with NELS). Because it was collaboratively constructed across iterations, HELM has the potential to be an effective organizational-level implementation strategy to support the use of universal EBPs in schools.
The results of this study point to the importance of incorporating stakeholder feedback in all phases of redesign as a methodology to ensure the end product (i.e., HELM) can support EBP implementation in a setting (i.e., schools) different from the one for which it was originally designed. Stakeholder feedback in Study 1 surfaced setting-specific considerations that led to initial LOCI modifications (shortening and streamlining LOCI components such as assessment and coaching frequency to fit within the school timeline; enlisting former principals with experience with EBP implementation in schools as coaches). The principalship is complex and, in the US, requires adherence to NELS. Without contextualization to what the principalship requires, HELM would appear to be just another "task" for principals to complete. Stakeholder feedback highlighted that one of the core components of HELM—coaching—could benefit from coaches who are former principals and deeply understand the role, responsibilities, and regulations that guide the profession. This ensures HELM activities align with what principals already do as part of the job and may enhance usability, buy-in, and feasibility. Subsequently, the NGP and hackathon yielded additional LOCI modifications (forming and activating distributed leadership, creation of user-friendly tools and worksheets, surface-level changes to the assessment battery) that enhanced contextual appropriateness and supported the development of HELM. Co-creation with local stakeholders resulted in school-facing tools (i.e., the LDP) that were relevant to the implementation context and usable among end-users. Ultimately, the collaborative redesign process allowed greater alignment between HELM components and school needs and priorities and should, in theory, increase adoption and implementation. These findings are consistent with recent calls for more tailorable or customizable implementation in complex settings (Munson et al., 2022) and with other efforts to collaboratively redesign implementation strategies outside the education sector (Deatrick et al., 2021; Harkness et al., 2022). Given the number of usability challenges implementation strategies face when applied to novel settings, it is critical to ensure redesign solutions address barriers in the implementation context (Lyon et al., 2020).
Study 2 employed an Expert Summit that comprised an NGP and a hackathon. The hackathon was a novel methodology for participants to come together and generate ideas and potential solutions to a problem (e.g., how to feasibly use HELM in schools). The use of a hackathon allowed Summit participants to engage in structured creative and innovative thinking for increased and distributed participation (Yarmohammadian et al., 2021). Researchers ought to consider employing hackathons in future redesign work to promote collaborative teaming and idea generation that can inform redesign solutions to usability challenges and/or foreground aspects of the existing intervention that align with innovations emerging from the hackathon (e.g., the implementation database suggested by one hackathon group is a technological extension of implementation leadership and climate survey data).
Limitations and Future Directions
Several limitations are noted. First, recruitment was geographically limited to two US regions; the inclusion of other geographically diverse school districts may provide additional insights that would lead to further adaptations. Future studies should expand the recruitment catchment area to ensure HELM is broadly applicable and usable. Second, special educators were not recruited. Because special educators also implement Tier 1 programs and often are siloed within schools, their feedback may be instrumental to further refining HELM. Third, data collection occurred during the COVID-19 pandemic, and all data collection activities were conducted via Zoom. While we had a set of agreements for the conduct of each activity (e.g., one person speaking at a time), Zoom limits the natural flow of conversation and thereby may have constrained the collaboration or idea generation that would otherwise have occurred.
Conclusion
An approach combining human-centered design with implementation science has strong utility for ensuring implementation strategies are intentionally redesigned for use in settings different from those for which they were developed. Using different methodologies to engage local stakeholders with lived experience and expertise about the implementation context should yield an end product that is usable. Future research should conduct usability testing on HELM to continue to iterate on practical and logistical considerations to ensure its usefulness in supporting school-based EBP implementation, and pilot testing to better understand whether HELM activates its theorized mechanisms of change.
Footnotes
Dr. Lyon is an Associate Editor for Implementation Research and Practice. As such, he had no part in the peer review process. All other authors declare that there are no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethics Approval: All procedures were approved by the University of Washington IRB (Study No. STUDY00010282).
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Institute of Education Sciences (R305A200023) and National Institute of Mental Health (P50MH115837). The content is solely the responsibility of the authors and does not necessarily represent the view of the funder.
Informed Consent: This study was determined to be IRB exempt.
ORCID iDs: Jill Locke https://orcid.org/0000-0003-1445-8509
Cathy M. Corbin https://orcid.org/0000-0002-2674-4425
Vaughan K. Collins https://orcid.org/0000-0002-5303-7291
Aaron R. Lyon https://orcid.org/0000-0003-3657-5060
References
- Aarons G. A., Ehrhart M. G., Farahnak L. R. (2014). The Implementation Leadership Scale (ILS): Development of a brief measure of unit level implementation leadership. Implementation Science, 9(45), 1–10. 10.1186/1748-5908-9-45 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons G. A., Ehrhart M. G., Farahnak L. R., Hurlburt M. S. (2015). Leadership and organizational change for implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10(11), 1–12. 10.1186/s13012-014-0192-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons G. A., Ehrhart M. G., Moullin J. C., Torres E. M., Green A. E. (2017). Testing the leadership and organizational change for implementation (LOCI) intervention in substance abuse treatment: A cluster randomized trial study protocol. Implementation Science, 12(29), 1–11. 10.1186/s13012-017-0562-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons G. A., Farahnak L. R., Ehrhart M. G. (2014). Leadership and strategic organizational climate to support evidence-based practice implementation. In Dissemination and implementation of evidence-based practices in child and adolescent mental health (pp. 82–97). Oxford University Press. [Google Scholar]
- Aarons G. A., Green A. E., Palinkas L. A., Self-Brown S., Whitaker D. J., Lutzker J. R., Silovsky J. F., Hecht D. B., Chaffin M. J. (2012). Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implementation Science, 7(32), 1–9. 10.1186/1748-5908-7-32 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas R. S., Aarons G., Barg F., Evans A., Hadley T., Hoagwood K., Marcus S., Schoenwald S., Walsh L., Mandell D. S. (2013). Policy to implementation: Evidence-based practice in community mental health – study protocol. Implementation Science, 8(38), 1–9. 10.1186/1748-5908-8-38 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas R. S., Edmunds J., Ditty M., Watkins J., Walsh L., Marcus S., Kendall P. (2014). Are inner context factors related to implementation outcomes in cognitive-behavioral therapy for youth anxiety? Administration and Policy in Mental Health and Mental Health Services Research, 41(6), 788–799. 10.1007/s10488-013-0529-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas R. S., Kendall P. C. (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17(1), 1–30. 10.1111/j.1468-2850.2009.01187.x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas R. S., Marcus S., Aarons G. A., Hoagwood K. E., Schoenwald S., Evans A. C., Hurford M. O., Hadley T., Barg F. K., Walsh L., Adams M., Mandell D. R., S D. (2015). Predictors of community therapists’ use of therapy techniques in a large public mental health system. JAMA Pediatrics, 169(4), 374–382. 10.1001/jamapediatrics.2014.3736 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bonham C. A., Sommerfeld D., Willging C., Aarons G. A. (2014). Organizational factors influencing implementation of evidence-based practices for integrated treatment in behavioral health agencies. Psychiatry Journal, 2014, 1–9. 10.1155/2014/802983
- Bradley R., Doolittle J., Bartolotta R. (2008). Building on the data and adding to the discussion: The experiences and outcomes of students with emotional disturbance. Journal of Behavioral Education, 17(1), 4–23. 10.1007/s10864-007-9058-6
- Brookman-Frazee L., Chlebowski C., Suhrheinrich J., Finn N., Dickson K. S., Aarons G. A., Stahmer A. (2020). Characterizing shared and unique implementation influences in two community services systems for autism: Applying the EPIS framework to two large-scale autism intervention community effectiveness trials. Administration and Policy in Mental Health and Mental Health Services Research, 47(2), 176–187. 10.1007/s10488-019-00931-4
- Brookman-Frazee L., Stahmer A. C. (2018). Effectiveness of a multi-level implementation strategy for ASD interventions: Study protocol for two linked cluster randomized trials. Implementation Science, 13(1), 66. 10.1186/s13012-018-0757-2
- Cook B. G., Odom S. L. (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79(3), 135–144. 10.1177/001440291307900201
- Damschroder L. J., Aron D. C., Keith R. E., Kirsh S. R., Alexander J. A., Lowery J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(50), 1–15. 10.1186/1748-5908-4-50
- Darling-Hammond L., Wei R. C., Andree A., Richardson N., Orphanos S. (2009). State of the profession: Study measures status of professional development. Journal of Staff Development, 30(2), 42. https://www.proquest.com/scholarly-journals/state-profession-study-measures-status/docview/61901632/se-2
- Deatrick J. A., Kazak A. E., Madden R. E., McDonnell G. A., Okonak K., Scialla M. A., Barakat L. P. (2021). Using qualitative and participatory methods to refine implementation strategies: Universal family psychosocial screening in pediatric cancer. Implementation Science Communications, 2(62), 1–10. 10.1186/s43058-021-00163-4
- Delbecq A. L., Van de Ven A. H., Gustafson D. H. (1975). Group techniques for program planning: A guide to nominal group and Delphi processes. Scott, Foresman.
- DeSantis L., Ugarriza D. N. (2000). The concept of theme as used in qualitative nursing research. Western Journal of Nursing Research, 22(3), 351–372. 10.1177/019394590002200308
- Ehrhart M. G., Aarons G. A., Farahnak L. R. (2014). Assessing the organizational context for EBP implementation: The development and validity testing of the Implementation Climate Scale (ICS). Implementation Science, 9(157), 1–11. 10.1186/s13012-014-0157-1
- Hallinger P., Heck R. H. (1996). Reassessing the principal’s role in school effectiveness: A review of empirical research, 1980–1995. Educational Administration Quarterly, 32(1), 5–44. 10.1177/0013161X96032001002
- Harkness A., Weinstein E. R., Lozano A., Mayo D., Doblecki-Lewis S., Rodríguez-Díaz C. E., Brown C. H., Prado G., Safren S. A. (2022). Refining an implementation strategy to enhance the reach of HIV-prevention and behavioral health treatments to Latino men who have sex with men. Implementation Research and Practice, 3, 1–20. 10.1177/26334895221096293
- Hill C. E., Knox S., Thompson B. J., Williams E. N., Hess S. A., Ladany N. (2005). Consensual qualitative research: An update. Journal of Counseling Psychology, 52(2), 196–205. 10.1037/0022-0167.52.2.196
- Hsieh H.-F., Shannon S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288. 10.1177/1049732305276687
- Komssi M., Pichlis D., Raatikainen M., Kindström K., Järvinen J. (2015). What are hackathons for? IEEE Software, 32(5), 60–67. 10.1109/MS.2014.78
- Locke J., Lawson G. M., Beidas R. S., Aarons G. A., Xie M., Lyon A. R., Stahmer A., Seidman M., Frederick L. K., Oh C., Spaulding C., Dorsey S., Mandell D. S. (2019). Individual and organizational factors that affect implementation of evidence-based practices for children with autism in public schools: A cross-sectional observational study. Implementation Science, 14(39), 1–9. 10.1186/s13012-019-0877-3
- Lyon A. R., Koerner K., Chung J. (2020). Usability evaluation for evidence-based psychosocial interventions (USE-EBPI): A methodology for assessing complex intervention implementability. Implementation Research and Practice, 1, 2633489520932924. 10.1177/2633489520932924
- Lyon A. R., Munson S. A., Renn B. N., Atkins D. C., Pullmann M. D., Friedman E., Areán P. A. (2019). Use of human-centered design to improve implementation of evidence-based psychotherapies in low-resource communities: Protocol for studies applying a framework to assess usability. JMIR Research Protocols, 8(10), e14990. 10.2196/14990
- McIntosh K., Kelm J. L., Delabra A. C. (2016). In search of how principals change: A qualitative study of events that help and hinder administrator support for school-wide PBIS. Journal of Positive Behavior Interventions, 18(2), 100–110. 10.1177/1098300715599960
- Morgan D. L. (1997). The focus group guidebook. Sage Publications.
- Munson S. A., Friedman E., Osterhage K., Allred R., Pullmann M. D., Areán P. A., UW ALACRITY Center Researchers. (2022). Usability issues in evidence-based psychosocial interventions and implementation strategies: A cross-project analysis. Journal of Medical Internet Research, 24(6), e37585.
- Pinkelman S. E., McIntosh K., Rasplica C. K., Berg T., Strickland-Cohen M. K. (2015). Perceived enablers and barriers to sustainability of school-wide positive behavioral interventions and supports. Behavioral Disorders, 40(3), 171–183. 10.17988/0198-7429-40.3.171
- Potter M., Gordon S., Hamer P. (2004). The nominal group technique: A useful consensus methodology in physiotherapy research. New Zealand Journal of Physiotherapy, 32(2), 70–75.
- Powell B. J., Waltz T. J., Chinman M. J., Damschroder L. J., Smith J. L., Matthieu M. M., Proctor E. K., Kirchner J. E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(21), 1–15. 10.1186/s13012-015-0209-1
- Proctor E., Silmere H., Raghavan R., Hovmand P., Aarons G., Bunger A., Griffey R., Hensley M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76. 10.1007/s10488-010-0319-7
- Proctor E. K., Powell B. J., McMillen J. C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8(139), 1–11. 10.1186/1748-5908-8-139
- Rys M. (2023). Invention development. The hackathon method. Knowledge Management Research & Practice, 21(3), 499–511. 10.1080/14778238.2021.1911607
- Skar A.-M. S., Braathu N., Peters N., Bækkelund H., Endsjø M., Babaii A., Borge R. H., Wentzel-Larsen T., Ehrhart M. G., Sklar M., Brown C. H., Aarons G. A., Egeland K. M. (2022). A stepped-wedge randomized trial investigating the effect of the leadership and organizational change for implementation (LOCI) intervention on implementation and transformational leadership, and implementation climate. BMC Health Services Research, 22(298), 1–15. 10.1186/s12913-022-07539-9
- Vaughn S., Schumm J. S., Sinagub J. (1996). Focus group interviews in education and psychology. Sage.
- Williams N. J., Ehrhart M. G., Aarons G. A., Marcus S. C., Beidas R. S. (2018). Linking molar organizational climate and strategic implementation climate to clinicians’ use of evidence-based psychotherapy techniques: Cross-sectional and lagged analyses from a 2-year observational study. Implementation Science, 13(85), 1–13. 10.1186/s13012-018-0781-2
- Williams N. J., Frank H., Frederick L., Beidas R., Mandell D. S., Aarons G. A., Green P., Locke J. (2019). Organizational culture and climate profiles: Relationships with fidelity to three evidence-based practices for autism in elementary schools. Implementation Science, 14(15), 1–14. 10.1186/s13012-019-0863-9
- Williams N. J., Wolk C. B., Becker-Haimes E. M., Beidas R. S. (2020). Testing a theory of strategic implementation leadership, implementation climate, and clinicians’ use of evidence-based practice: A 5-year panel analysis. Implementation Science, 15(1), 1–15. 10.1186/s13012-020-0970-7
- Yarmohammadian M. H., Monsef S., Javanmard S. H., Yazdi Y., Amini-Rarani M. (2021). The role of hackathon in education: Can hackathon improve health and medical education? Journal of Education and Health Promotion, 10, 334. 10.4103/jehp.jehp_1183_20

