Author manuscript; available in PMC: 2023 Jun 12.
Published in final edited form as: Prev Sci. 2022 Nov 11;24(3):552–566. doi: 10.1007/s11121-022-01464-3

A Pre-Implementation Enhancement Strategy to Increase the Yield of Training and Consultation for School-Based Behavioral Preventive Practices: a Triple-Blind Randomized Controlled Trial

Yanchen Zhang 1, Clayton R Cook 2, Gazi F Azad 3, Madeline Larson 4, James L Merle 5, Jordan Thayer 2, Alex Pauls 1, Aaron R Lyon 6
PMCID: PMC10258873  NIHMSID: NIHMS1896104  PMID: 36367633

Abstract

As the most common setting where youth access behavioral healthcare, the education sector frequently employs training and follow-up consultation as cornerstone implementation strategies to promote the implementation of evidence-based practices (EBPs). However, these strategies alone are not sufficient to promote desirable implementation (e.g., intervention fidelity) and youth behavioral outcomes (e.g., mitigated externalizing behaviors). Theory-informed pragmatic pre-implementation enhancement strategies (PIES) are needed to prevent the lackluster outcomes of training and consultation. Specifically, social cognitive theory explicates principles that inform the design of PIES content and specify mechanisms of behavior change (e.g., “intentions to implement” (ITI)) that can be targeted to increase providers’ responsiveness to training and consultation. This triple-blind parallel randomized controlled trial preliminarily examined the efficacy of a pragmatic PIES based on social cognitive theories (SC-PIES) to improve implementation and youth behavioral outcomes from universal preventive EBPs in the education sector. Teachers from a diverse urban district were recruited and randomly assigned to the treatment (SC-PIES; ntreatment = 22) or active control condition (administrative meeting; ncontrol = 21). Based on the assigned condition, teachers received the SC-PIES or met with administrators before their EBP training. We assessed teachers’ ITI, intervention fidelity, and a youth behavioral outcome (academic engagement, a behavior incompatible with externalizing problems) at baseline, immediately after training, and 6 weeks afterward. A series of ANCOVAs detected sizeable effects of SC-PIES: teachers who received the SC-PIES demonstrated significantly larger improvements in their ITI, intervention fidelity, and youth behaviors as compared to the control.
Conditional analyses indicated that teachers’ ITI partially mediated the effect of SC-PIES on intervention fidelity, which in turn led to improved youth behaviors. Findings suggest that theory-informed pragmatic PIES targeting providers’ ITI can boost their responsiveness to implementation strategies, as reflected in improved implementation behaviors and youth behavioral outcomes. The results have implications for targeting motivational mechanisms of behavior change and situating preventive implementation strategies at the intersection between the preparation and active implementation stages of an implementation process. Limitations and implications for research and practice are discussed. Clinicaltrials.gov: NCT05240222. Registered on: 2/14/2022. Retrospectively registered. https://clinicaltrials.gov/show/NCT05240222

Keywords: Implementation strategy, Theory of planned behavior, Intentions to implement, Intervention fidelity, School-based behavioral preventive practices

Background

Across virtually every service sector, training and follow-up consultation are cornerstone implementation strategies designed to promote provider uptake and use of evidence-based practices (EBPs; Lyon et al., 2017). While training and consultation are essential implementation strategies, there is mounting evidence indicating that these strategies alone are insufficient to translate what is known to work in research into routine practices that ultimately improve service recipient outcomes (Edmunds et al., 2013; Powell et al., 2017). Pre-implementation enhancement strategies (PIES) that complement and increase the yield of training and follow-up consultation are needed, particularly ones that are pragmatic and theoretically informed to target precise mechanisms of behavior change (McLeod et al., 2018; Lyon et al., 2019). PIES that bolster the yield of implementation strategies are especially needed in the context of youth behavioral health, where many youths in need of care do not access quality prevention or treatment services even though numerous EBPs exist (Phillippi et al., 2020; Williams & Beidas, 2019). The education sector is one of the best settings to develop and test PIES, as it is the most common setting where youth access behavioral health services (Duong et al., 2021; Fazel et al., 2014; Weist et al., 2017). With a triple-blind parallel randomized controlled trial, this study aimed to test a pragmatic PIES designed with principles rooted in social cognitive theories (SC-PIES) as a preventive complement to EBP training and follow-up consultation. The goal was to enhance teachers’ intentions to implement and their subsequent implementation in the service of improving youth behavioral health outcomes in the education sector.

Youth Behavioral Health and EBPs in the Education Sector

As a prevalent and impactful youth behavioral health problem, externalizing behavioral disorders and their prevention and treatment remain top priorities for policymakers, researchers, and healthcare professionals in the education sector (OSG, 2022). Externalizing behavioral health problems are outer-directed behaviors that harm relationships with others and violate social norms and rules (Splett et al., 2019). Youth who exhibit externalizing behavioral problems are likely to experience negative outcomes, including academic difficulties (Lee & Bierman, 2015), strained relationships with others (Leflot et al., 2011), elevated risk for dropout (Bevilacqua et al., 2018), substance abuse (Regan et al., 2020), repeated exposure to exclusionary discipline practices, and contact with the juvenile justice system (Mitchell & Bradshaw, 2013). To prevent these untoward outcomes, researchers have established several EBPs for use in the education sector. For example, proactive classroom behavior management (PCBM) includes a suite of EBPs that are prevention-oriented and aim to promote high levels of behavioral academic engagement in class, which is incompatible with externalizing behavior problems (Rathvon, 2008). Extant research has established the effectiveness of various PCBM practices for consistent use by teachers in the context of classrooms, where youth spend significant amounts of time (Simonsen et al., 2008). PCBM practices promote youth’s behavioral and academic success in classrooms, which leads to short-term success in school (e.g., reduced externalizing behaviors, increased academic achievement, and school engagement; Cook et al., 2018; Nagro et al., 2019). These short-term outcomes are determinants of longer-term beneficial outcomes that reach into adulthood (Rodwell et al., 2018).
Although PCBM has been disseminated through multiple channels (e.g., refereed journals, books, and media outlets), passive implementation efforts have proven insufficient to produce scalable changes in teachers’ delivery of PCBM (Brownson et al., 2018).

Teachers as Primary Implementers with Varied Response to Training and Consultation

In the education sector, the implementation of universal prevention-oriented EBPs targeting youth behavioral health outcomes is primarily the responsibility of classroom teachers, who spend the most time during the school day interfacing with youth (Lyon et al., 2018). For this reason, teachers are the recipients of training and consultation, which are the most common implementation strategies used to promote teachers’ uptake and delivery of universal EBPs (Edmunds et al., 2013). While training and follow-up consultation are considered core implementation strategies, these strategies alone are insufficient to produce quality and consistent implementation across practitioners. For instance, didactic training is necessary to increase knowledge of practices, but it is ineffective on its own in producing changes in teachers’ classroom practices (Robertson et al., 2021). Moreover, teachers may demonstrate resistance or ambivalence to change during follow-up support (e.g., consultation; Lyon et al., 2019).

As primary implementers, teachers’ motivation plays an instrumental role in determining whether they will respond to training and follow-up consultation to initially adopt and then persist in using an EBP with fidelity (Dusenbury et al., 2005). Indeed, teachers are a heterogeneous group of implementers who vary significantly according to motivational factors that influence whether they are likely to adopt new practices as part of their regular classroom routines (Abry et al., 2013; Dickie & Shuker, 2014). Variability in motivational determinants of change exists regardless of whether teachers work in school settings with optimal organizational characteristics in place (e.g., supportive leadership, quality training, and coaching; Kincaid et al., 2007). Therefore, there is a need for complementary pre-implementation strategies that occur before training and consultation to target motivational determinants of behavior change to enhance teachers’ responsiveness to training and consultation (Low et al., 2016).

Preventive Pre-Implementation Enhancement Strategies for Training and Consultation

Theory-informed and pragmatic PIES offer promise as preventive complementary strategies that can increase the yield of training and follow-up consultation on both implementation and youth behavioral health outcomes. PIES occur before more active implementation supports such as training and consultation. EBP training and consultation focus more directly on increasing providers’ knowledge of, and follow-through with, specific practices. A common implicit assumption of training and consultation is that providers are already motivated to change their behaviors. However, this is not always the case. PIES are designed to motivate and prime providers to engage with and respond more fully to active implementation strategies before they participate in training and consultation activities. Essentially, PIES sit at the immediate intersection of the preparation and active implementation phases of an implementation process (Lyon & Bruns, 2019).

Consistent with recommendations for developing implementation strategies (Cook et al., 2019), the development of PIES should be grounded in theory to ensure the content targets precise motivational mechanisms of behavior changes before receipt of training and consultation. Theoretically, effective PIES can enhance providers’ motivational mechanisms (e.g., intention to implement), which in turn improve their responsiveness to EBP-specific implementation strategies (e.g., training and consultation) that lead to enhanced implementation outcomes (e.g., adoption, fidelity) that ultimately optimize youth behavioral outcomes.

Use of Social Cognitive Theory to Develop and Test PIES

Applied social cognitive theory and research offer principles and evidence that could inform the development of pragmatic and effective PIES (Steinmetz et al., 2016). In the context of implementation, strategies informed by social cognitive theory could serve to alter teachers’ perceptions and motivation regarding the uptake and use of EBPs, in the context of actively acquiring knowledge about them (training) and participating in ongoing support and feedback to deliver them with fidelity (consultation). Social cognitive research suggests that implementers’ behavioral intentions are malleable mechanisms of behavior change that can be targeted via intervention (Cialdini & Goldstein, 2004). The theory of planned behavior (TPB; Birken et al., 2020) is a widely established social cognitive theory that has been used to predict and target behavior change (Godin et al., 2008). The central tenet of the TPB is that one of the best predictors of behavior is a person’s behavioral intentions. Behavioral intentions “capture the motivational factors that influence a behavior; they are indicators of how hard people are willing to try, of how much effort they are planning to exert, to perform the behavior” (Godin et al., 2008). Previous research using the TPB has shown that implementation strategies targeting implementers’ behavioral intentions are linked to improved implementation outcomes during the active implementation phase when EBP uptake and use are critical (Fishman et al., 2021; Kortteisto et al., 2010; Mangurian et al., 2017). Moreover, a study conducted with teachers serving youth with autism indicated that, following in-service training, teachers endorsing high intentions to implement were five times more likely to adopt and deliver EBPs than teachers endorsing low intentions to implement (Fishman et al., 2021). 
As such, implementation strategies that target behavioral intentions before initiating active implementation may serve as an effective approach to increasing teachers’ responsiveness to training and consultation.

Several empirically established social cognitive principles can be used to guide the content development of PIES (Birken et al., 2020; Fishman et al., 2021). We strategically selected three social cognitive principles based on two criteria: (1) empirical evidence related to behavior change and (2) ease of designing content and activities that are perceived as acceptable and pragmatic by teachers. First, the principle of growth mindset was selected as it is a popularized construct that has gained widespread attention in education and can be leveraged for purposes of face validity to engage teachers (Yeager & Dweck, 2012). Research has shown that people’s mindsets or implicit theories about the malleability of their cognitive and physical abilities have significant impacts on their goals, effort, and performance (Blackwell et al., 2007).

Second, the saying-is-believing principle was selected, because it involves individuals advocating for an idea or action to others to increase their commitment to the idea or action regardless of their initial beliefs about it (Aronson et al., 2002). For instance, college youth randomly assigned to write letters to incoming youth endorsing the importance of overcoming social and academic adversity and using problem-solving strategies to do so showed significant improvements in their school-related behaviors and achievement (Walton & Cohen, 2007). Teachers who are provided with opportunities to advocate ideas relevant to adopting and implementing new practices may in turn exhibit greater intentions to implement and are more likely to implement after receiving training and follow-up consultation (Cook et al., 2015).

Last, we selected the principle of commitment and consistency (Wood, 2000), which involves evoking a state of psychological and emotional tension that gets activated when individuals recognize a discrepancy between a belief and their behavior. The tension and discrepancy lead to an increased likelihood of individuals striving to maintain consistency between their beliefs and actions (Wood, 2000). Specific desired behaviors (i.e., intervention fidelity) can be increased by evoking commitments that are active, public, and voluntary (Cialdini, 2001). Once individuals make a commitment, they are more motivated to maintain consistency between their beliefs and actions (i.e., commitment or follow through with the stated action). Research has leveraged this principle to induce cognitive dissonance to promote behavior change (Pratkanis, 2011). This technique has been applied to a wide range of behaviors, including voting behavior, fund-raising, and recycling (Spangenberg & Greenwald, 2001; Filter & Brown, 2019).

Gaps in Current Literature and Study Aims

There is limited research on the development and evaluation of PIES as complementary strategies to prevent providers’ non-responsiveness to training and consultation, as well as to increase their yield on both implementation and youth behavioral health outcomes. Moreover, the literature on implementation strategies from other service sectors has largely neglected the specification of change mechanisms by which strategies impact implementation and client outcomes (Lewis et al., 2018). A focus on theory-informed mechanisms as mediators of behavior change is vital to identifying why and how implementation strategies (e.g., PIES) enhance the yield of prevention programs, which can subsequently inform the development, testing, and refinement of high-quality implementation strategies and prevention programming (Birken et al., 2020; Kazdin, 2007). Such inquiries must be informed by established theories to explain how strategies at different implementation stages (e.g., pre-implementation vs active implementation) influence implementation and youth behavioral health outcomes (Proctor et al., 2013).

This study aimed to experimentally examine the efficacy of a pragmatic PIES based on social cognitive theories (SC-PIES). The SC-PIES was hypothesized to influence teachers’ intentions to implement EBPs, which would in turn improve their intervention fidelity and youth academic engagement. Specifically, we conducted a preliminary triple-blind parallel randomized controlled trial (RCT) to address four research questions: (1) Compared to control, does SC-PIES improve teachers’ intentions to implement (ITI) PCBM practices at posttest after adjusting for baseline scores and covariates? (2) Compared to control, does SC-PIES improve teachers’ intervention fidelity and youth academic engagement at the 6-week follow-up after adjusting for baseline scores and covariates? (3) Does teachers’ ITI mediate the association between study condition and intervention fidelity? (4) Does teachers’ intervention fidelity mediate the association between their ITI and youth academic engagement?

Method

Setting and Participants

This study took place at two elementary schools in a large and diverse urban district in the Pacific Northwest. The district was in the first year of a district-wide implementation initiative for a universal prevention program for youth behavioral health. The two participating schools had no experience implementing universal prevention programs targeting youth behaviors before the study. General education teachers were recruited if they had limited training and implementation experience with proactive classroom behavior management (PCBM) strategies before this study (neligible = 56). Among the 43 teachers who consented to participate, the majority were White (n = 29; 67.44%) and female (n = 35; 81.40%). The average teaching experience was 8.7 years (SD = 6.4), and about a third had a master’s degree (32.55%; Table 1). Twenty-two teachers reported taking one pre-service training on PCBM, while the remaining teachers had no prior training or experience in PCBM. The participant demographics were consistent with the education literature (Christofferson & Sullivan, 2015).

Table 1.

Descriptive statistics of participating teacher demographics (N = 43)

Variable                     Category         N    %
Gender                       Female           35   81.40
                             Male              8   18.60
Grade level                  Unknown           8   18.60
                             1                 7   16.28
                             2                 7   16.28
                             3                 7   16.28
                             4                 7   16.28
                             5                 7   16.28
Ethnicity                    White            29   67.44
                             Black             6   13.95
                             Latino            5   11.63
                             Asian             3    6.98
Teaching experience (years)  ≤ 5              17   39.53
                             6 to 9           12   27.91
                             > 10             14   32.56
Condition                    Active control   21   48.84
                             Treatment        22   51.16

Procedures

Recruitment, Randomization, and Data Collection

This study was approved by the university IRB and the evaluation department of the partnering district. A single-level two-arm RCT was selected because the theoretical change mechanism of SC-PIES, the treatment, and the outcomes of interest were all at the teacher/classroom level (Taljaard et al., 2020). First, school administrators introduced this study to teachers and connected those interested to the authors. Forty-three teachers consented to participate, while 13 did not for incidental reasons (e.g., job changes, workload, and turnover) that are common in the education sector. With an online randomization tool (Urbaniak & Plous, 2013), a graduate assistant randomly assigned teachers to the treatment (ntreatment = 22) or control condition (ncontrol = 21). The assistant did not participate in data collection, analysis, or reporting. The triple-blind design ensured that none of the participants, researchers, or observers who assessed outcomes were aware of the treatment allocation (i.e., allocation concealment). This study rolled out in multiple steps in the fall semester, as depicted in the CONSORT diagram (Fig. 1; Moher et al., 2012). We followed the StaRI and CONSORT checklists in reporting (Online Resources 2 and 3). To enable proper causal inferences, we separated and sequenced the study components into (1) baseline pretest, (2) SC-PIES, (3) PCBM training, (4) immediate posttest of the change mechanism (intention), (5) follow-up consultation, and (6) 6-week follow-up test (Fig. 1).

Fig. 1.


The CONSORT diagram delineates the study timeline of treatment (SC-PIES), assessment, training, and consultation, as well as the pre-, post-, and follow-up tests and the variables assessed at each time point. ITI, teachers’ intention to implement; FI, intervention fidelity; AET, class-wide youth academic engaged time. The brackets on the right indicate the implementation stages that correspond to each step in this study.

Pretest data were collected from all participants on their intentions to implement EBPs (ITI), intervention fidelity, and class-wide youth academic engaged time (AET). Then, the treatment group received the SC-PIES, and the control group met with their school administrators. Two days after the SC-PIES or the meeting with administrators, participants from both conditions received a 2-day training (2 h per day) on proactive classroom management strategies (PCBMs; Online Resource 1). The 2-day delay between the SC-PIES/meeting with administrators and the PCBM training was an intentional part of the experimental design to ensure adequate time for the public commitment postings to be displayed and internalized by teachers receiving the SC-PIES (see SC-PIES procedure). The delay isolated and prevented potential confounding between the public commitment posting activity and the PCBM training. The PCBM training sessions followed a tell-show-do approach, where the trainer detailed the core components of PCBM to participating teachers, modeled implementation of these core components, and then led teachers to plan how to implement PCBM in their classrooms and to rehearse. Finally, teachers received one-on-one performance-based feedback from the trainer. After the PCBM training, the posttest (ITI only) was conducted immediately because social cognitive theories hold that implementers’ intention, as a predictor of subsequent behavior change, is most effectively assessed before the actual enactment of implementation behaviors (Fishman et al., 2021). Following the PCBM training, the authors provided two follow-up consultations to teachers (at 1 and 4 weeks). Follow-up data (fidelity and AET) were collected after teachers completed the training and had implemented PCBM for 6 weeks.

Control Condition: Meeting with Administrators

Teachers in the control condition met with their school administrators to discuss the math/reading curriculum, teaching schedule, and logistics. The meeting was of the same duration as the SC-PIES (1 h) to control for dosage. The meeting served as attention control and intentionally precluded any active components of the SC-PIES.

Treatment Condition: SC-PIES

The SC-PIES is a 1-h professional development for teachers before attending the training about PCBM. To maximize the pragmatism of SC-PIES, all content and activities were streamlined to be deliverable in 1 h. The SC-PIES was grounded in three social cognitive principles: (a) growth mindset, (b) saying-is-believing, and (c) commitment and consistency.

Growth Mindset

Consistent with prior research, a growth mindset was not framed as something teachers needed to acquire and exhibit. Instead, the growth mindset component of the SC-PIES was modeled after prior research (Larson et al., 2021), in which teachers learned about neuroplasticity (i.e., the brain can change with effort and persistence to engage in growth-oriented experiences) as it relates to youth development. Specifically, teachers first watched two videos and then were prompted to engage in guided reflective discussion with colleagues. During the discussion, teachers were encouraged to (a) explore the implications of neuroplasticity for promoting youth’s growth, enabling them to persevere in the face of adversity when learning new knowledge and skills, and (b) discuss ways they could encourage youth to internalize the notion of neuroplasticity. Next, the facilitator led a whole-group discussion in which teachers shared their responses. Overall, the growth mindset component of the SC-PIES attempted to instill in teachers a practice-oriented notion of growth mindset toward PCBM, encouraging teachers to act as role models for youth through their own efforts to learn new things and persevere through adversity.

Saying-Is-Believing

Building on the growth mindset component and prior research (Aronson et al., 2002), the saying-is-believing component of the SC-PIES involved teachers writing letters to new teachers in which they explained the concept of neuroplasticity and illustrated it by describing how they were able to overcome challenges as new teachers to learn new practices and better manage the conditions of leading a classroom. Specifically, they described a real-life scenario in which they experienced adversity as new teachers but were able to overcome it by stretching themselves and putting in the effort to learn and apply a new practice. This activity put teachers in the role of mentors for new teachers and encouraged them to advocate a message about learning new things and to express the positive outcomes associated with persistence. Teachers were informed that these letters would be shared with new teachers to normalize common difficulties and emphasize the importance of effort and perseverance in learning new skills over time as a teacher.

Commitment and Consistency

This component involved encouraging teachers to make public commitments regarding supporting youth’s growth mindsets. Based on the commitment and consistency principles (Cialdini & Goldstein, 2004), this strategy hypothesizes that once individuals have made a public and explicit commitment, their beliefs tend to shift to maintain consistency between their commitment and actions. Teachers were asked to publicly sign a billboard that would be posted for youth and parents to see that read, “At our school, we believe in our youth’s ability to grow their brains and academic skills through hard work, putting in extra effort to learn new skills, seeking assistance when confronted with challenges and persevering in the face of failure through the feedback we provide them and modeling effort and perseverance ourselves.” In addition to posting the billboard in a highly visible area near the school office, a photocopy of the signed billboard was sent out to every teacher in the treatment condition to post it as a reminder in their immediate environments (e.g., office or classrooms).

Example Preventive EBP: The PCBM

Based on a consensus-driven approach facilitated by school administrators, school staff selected four evidence-based preventive PCBM practices as non-negotiables to implement across every classroom (Online Resource 1): (1) greeting youth at the door, (2) behavior-specific praise, (3) providing numerous opportunities to respond, and (4) posting, teaching, reviewing, and providing feedback about positively stated behavioral expectations (PTRP). Teachers implemented the PCBM in their classrooms for 6 weeks before fidelity and youth outcomes were assessed.

Measures

Intentions to Implement

The Modified Intentions to Use Scale was adopted to assess teachers’ intentions to implement new EBPs (Kortteisto et al., 2010). The scale was originally developed from research on practitioners’ intention to adhere to measurement-based care and is consistent with recommendations for assessing behavioral intentions based on the theory of planned behavior (Godin & Kok, 1996). The scale consists of five items on a 7-point Likert scale ranging from “greatly disagree” to “greatly agree.” The items were modified to specifically assess teachers’ intentions to implement EBPs. The scale showed acceptable reliability and validity in prior research (Williams, 2015) and in the current study (α = 0.75; Larson et al., 2021).
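The internal consistency coefficient reported above (α = 0.75) is Cronbach’s alpha, α = k/(k − 1) × (1 − Σσ²_item / σ²_total) for a k-item scale. A minimal pure-Python sketch with hypothetical item responses (not the study’s data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of k item-score columns.

    items: list of k lists, each holding one item's scores across the
    same n respondents. Uses sample variance (ddof = 1).
    """
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    n = len(items[0])
    item_var_sum = sum(var(col) for col in items)
    # Per-respondent total scores across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical responses from four teachers on a 3-item Likert scale
items = [[4, 5, 3, 4],   # item 1
         [4, 4, 3, 5],   # item 2
         [5, 5, 2, 4]]   # item 3
alpha = cronbach_alpha(items)
```

Alpha rises as the items covary more strongly relative to their individual variances; two perfectly redundant items yield α = 1.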

Intervention Fidelity

A team of trained professionals, who were blinded to the random assignment, conducted structured observations to assess intervention fidelity, which was operationalized in this study as “adherence to core components of the classroom management practices.” Data collection took place at pretest and the 6-week follow-up to allow enough time to detect and compare changes in fidelity. During core instructional times, trained observers observed each teacher on two occasions at pretest and again at the 6-week follow-up to code the presence or absence of the core components of the PCBM. Immediately following the observations, the observers completed a fidelity rating rubric, which was created by operationalizing three core components of each PCBM such that observers could reliably observe and rate them (Sanetti & Kratochwill, 2009). The average of the observation coding and the rubric scores represented each teacher’s fidelity. Adequate inter-rater reliability (Spearman’s r; mean r = 0.79) was found for all PCBMs, based on ratings from two observers of the same teacher on 20% of occasions.
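Spearman’s r, as used above for inter-rater reliability, is the Pearson correlation computed on rank-transformed scores, with ties assigned average ranks. A minimal sketch with hypothetical rater data (the helper names are illustrative, not from the study):

```python
def rankdata(xs):
    """Average ranks (1-based); tied values share the mean of their rank positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Extend j over the run of tied values
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(a, b):
    """Spearman's r = Pearson correlation of the rank-transformed scores."""
    ra, rb = rankdata(a), rankdata(b)
    ma, mb = sum(ra) / len(ra), sum(rb) / len(rb)
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)
```

Because ordinal fidelity ratings routinely produce ties, the Pearson-on-ranks form is preferable to the textbook 1 − 6Σd²/(n(n² − 1)) shortcut, which assumes no ties.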

Class-Wide AET

AET was defined as “any instance where youth are attending to instruction, watching the teacher or speaker, or concentrating on their classwork.” This study used direct behavior ratings (DBRs) for AET, with which a teacher observes a youth’s AET throughout a predetermined interval. Teacher-completed DBRs correlate strongly with structured direct observations in assessing AET (r = 0.81; Chafouleas, 2011). The DBRs were completed at pretest and the 6-week follow-up. For comparability across classrooms, each teacher completed DBRs during the language arts block at the same time each weekday for five randomly selected youths, whose results were averaged for that class. This approach has demonstrated adequate reliability and validity for capturing class-wide AET (Chafouleas, 2011). To preserve the temporal sequence required by the mediation analyses, AET was measured after intervention fidelity at the 6-week follow-up.

Data Analysis

First, assumptions for all planned analyses were tested and generally met (e.g., normality of residuals and homoscedasticity). Then, Chi-square tests and independent samples t-tests were performed to assess baseline equivalences between the study conditions in the pretest values of teachers’ demographics (experience, grade, ethnicity, and gender) and three dependent variables (DVs; teachers’ ITI, intervention fidelity, and class-wide AET). For RQs 1 and 2, we used ANCOVA to estimate the treatment effect of SC-PIES on each of the three DVs. The ANCOVA models were configured the same way where the post-treatment score of a DV was regressed on the binary variable of study condition (reference category = control) as the focal predictor while controlling for the pretest score of the DV and demographics as covariates. The literature generally supported ANCOVA as the most appropriate and informative treatment effect analytic approach for pre/post control group designs because it can enable (a) adjustment of the pretest score as a covariate because it is not an outcome by definition, (b) convenient and interpretable adjustment of other covariates at baseline, and (c) handling of covariance matrix heterogeneity of parameters (Wan, 2021). To adjust for the bias caused by the covariance matrix heterogeneity in relatively small samples, we used the heteroskedasticity-consistent standard error estimator (HC4; Hayes & Cai, 2007). Given the relatively small sample and multiple tests performed, traditional null hypothesis tests with p-values are suboptimal (i.e., inflated type I error). The FDR-corrected p-values (false discovery rate; q-values) were estimated to correct for potential false positives with an intended level of significance corresponding to q = 0.05 (Benjamini & Hochberg, 1995; Wason & Robertson, 2021). Power analysis was performed with the G*Power version 3.19 (Faul et al., 2007). 
Given n = 43, six predictors, and an alpha level of 0.05, the sample provided sufficient power (≥ 0.80) to detect single-predictor effects as small as f² = 0.19 (Cohen's f²).
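This power figure can be reproduced approximately with SciPy's noncentral F distribution, assuming the reported 0.19 refers to Cohen's f² for a single-coefficient test in a six-predictor linear model (a standardized mean difference of 0.19 would require a far larger sample), and using G*Power's convention of noncentrality λ = f²·n:

```python
from scipy import stats

def power_single_coef(f2, n, n_pred, alpha=0.05):
    """Power for the F test of one coefficient in a linear model with
    n_pred predictors: noncentral F with df1 = 1, df2 = n - n_pred - 1,
    and noncentrality lambda = f2 * n (G*Power's convention)."""
    df1, df2 = 1, n - n_pred - 1
    ncp = f2 * n
    f_crit = stats.f.ppf(1 - alpha, df1, df2)   # critical value under H0
    return 1 - stats.ncf.cdf(f_crit, df1, df2, ncp)

power = power_single_coef(f2=0.19, n=43, n_pred=6)
print(round(power, 2))
```

Under these assumptions the computed power lands at roughly the 0.80 threshold the authors report.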

To examine the hypothesized change mechanisms of SC-PIES (RQs 3 and 4), we sequentially explored the mediation effect of ITI on the relationship between study condition and intervention fidelity, followed by a secondary model in which teachers' fidelity was hypothesized to mediate the relationship between their ITI and class-wide AET. To control for baseline status, the mediation models were fitted with change scores calculated by subtracting pretest values from posttest values (Vandenberghe et al., 2017). Nonparametric bootstrapping mediation analysis was used rather than the causal-steps approach (Baron & Kenny, 1986) because of its superior capacity to handle small samples, estimate robust standard errors, and detect mediation effects in the absence of significant total effects (Hayes, 2017). We interpreted results based on statistical significance and standardized mean differences with pooled pretest standard deviations (SMDES; Morris, 2008). The SMDES expresses, in units of the pooled pretest standard deviation, the difference between the pre-post change of teachers' outcomes in the treatment condition and that in the control. All analyses were performed with SPSS version 26.
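The bootstrapped indirect-effect logic described above can be sketched with NumPy: resample cases with replacement, re-estimate the a-path (mediator on predictor) and b-path (outcome on mediator, controlling for the predictor), and take percentiles of the product a·b. The data below are simulated change scores for illustration only, not the study's data, and the percentile CI is a simplification of the bias-corrected CI the authors report:

```python
import numpy as np

rng = np.random.default_rng(2022)

# Simulated change scores (hypothetical): x = condition (0 = control,
# 1 = treatment), m = change in ITI, y = change in fidelity.
n = 43
x = (np.arange(n) % 2).astype(float)
m = 0.8 * x + rng.normal(0, 1, n)
y = 0.6 * m + 0.3 * x + rng.normal(0, 1, n)

def indirect_effect(x, m, y):
    # a-path: m regressed on x; b-path: y regressed on m controlling for x.
    a = np.linalg.lstsq(np.column_stack([np.ones(len(x)), x]), m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones(len(x)), x, m]), y, rcond=None)[0][2]
    return a * b

boot = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, n, n)                 # resample cases with replacement
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])       # percentile bootstrap 95% CI
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")
```

As in the paper's decision rule, the mediation effect is deemed significant when the bootstrap CI excludes 0.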

Results

Treatment Effects of SC-PIES (RQs 1 and 2)

The t-tests and chi-square tests confirmed that random assignment created probabilistically equivalent groups in terms of the baseline status of key variables (Table 2). For the ANCOVA on teachers' ITI (RQ 1; Table 3), results showed a significant positive effect of SC-PIES: the post-treatment ITI scores of teachers in the treatment condition were significantly higher than those in the control (b = 2.18, p < 0.01). The standardized mean difference effect size (SMDES) was 0.40. Of note, teachers' work experience (b = −0.13, p < 0.05) was a significant predictor of post-treatment ITI scores, suggesting that a 1-year increase in experience was associated with a 0.13-unit decrease in teachers' ITI scores, controlling for the other covariates in the model. For the ANCOVA on intervention fidelity (RQ 2; Table 3), results indicated that the intervention fidelity of teachers in the treatment condition was significantly higher than that of teachers in the control at the 6-week follow-up (b = 2.39, p < 0.001); the SMDES for fidelity was 0.91. For class-wide AET (RQ 2; Table 3), results indicated that youths' AET in the classrooms of treatment-condition teachers was significantly higher than that in control-condition classrooms (b = 0.06, p < 0.01); the SMDES for AET was 0.47. No demographic covariates were significant predictors of either fidelity or AET.
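The SMDES values reported here follow Morris's (2008) pretest-pooled formulation: the difference between the treatment and control pre-post gains, divided by the pooled pretest standard deviation. A minimal sketch with hypothetical means and SDs (only the group sizes match the study; the small-sample bias correction in Morris, 2008 is omitted for brevity):

```python
import math

def smdes(pre_t, post_t, pre_c, post_c, sd_pre_t, sd_pre_c, n_t, n_c):
    """Morris (2008) pretest-pooled effect size: difference between the
    treatment and control pre-post gains over the pooled pretest SD."""
    sd_pool = math.sqrt(((n_t - 1) * sd_pre_t**2 + (n_c - 1) * sd_pre_c**2)
                        / (n_t + n_c - 2))
    return ((post_t - pre_t) - (post_c - pre_c)) / sd_pool

# Hypothetical values: treatment gains 2 points, control gains 1, pooled
# pretest SD = 2.0, so SMDES = (2 - 1) / 2 = 0.5.
es = smdes(pre_t=10, post_t=12, pre_c=10, post_c=11,
           sd_pre_t=2.0, sd_pre_c=2.0, n_t=22, n_c=21)
print(es)  # 0.5
```

Pooling the pretest (rather than posttest) SDs keeps the denominator uncontaminated by any treatment effect on outcome variability.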

Table 2.

Independent samples t-tests of pretests of variables between treatment and control conditions

| Variables | Mean (Ctrl) | Mean (Trt) | SD (Ctrl) | SD (Trt) | t | df | p | Levene's F | Levene's p |
|---|---|---|---|---|---|---|---|---|---|
| Intentions to implement | 20.57 | 20.23 | 5.69 | 5.50 | −0.20 | 41 | .84 | 0.01 | .93 |
| Intervention fidelity | 7.90 | 7.59 | 2.84 | 2.63 | −0.38 | 41 | .71 | 0.15 | .70 |
| Academic engaged time | 0.68 | 0.72 | 0.12 | 0.10 | 1.12 | 41 | .27 | 0.21 | .65 |
| Years of teaching experience | 8.71 | 8.68 | 6.73 | 7.17 | −0.02 | 41 | .99 | 0.08 | .78 |

| Teacher demographics | Pearson Chi-square | df | Asymptotic significance (2-sided) |
|---|---|---|---|
| Grade | 2.98 | 5 | .70 |
| Ethnicity | 0.55 | 3 | .91 |
| Gender | 0.51 | 1 | .48 |

Trt = treatment condition (SC-PIES); Ctrl = active control condition; SD = standard deviation

Table 3.

ANCOVAs controlling for pretest values and demographics as covariates

| Outcome | Predictor | b | Beta | HC-SE | t | p | q | ES | Adj. R² | F (p) |
|---|---|---|---|---|---|---|---|---|---|---|
| Intentions to implement (ITI; posttest) | Intercept | 7.52 | – | 1.95 | 3.85 | < .001 | .01* | – | .81 | 35.07 (< .001) |
| | Treatment | 2.18 | .23 | 0.65 | 3.34 | < .01 | .02* | .40 | | |
| | ITI (pretest) | 0.71 | .83 | 0.07 | 9.84 | < .001 | .01* | – | | |
| | Experience | −0.13 | −.19 | 0.05 | −2.54 | < .05 | .03* | – | | |
| | Grade | 0.04 | .01 | 0.18 | 0.21 | .84 | .04 | – | | |
| | Ethnicity | −0.05 | −.01 | 0.33 | −0.14 | .89 | .05 | – | | |
| | Gender | −0.95 | −.08 | 0.79 | −1.2 | .24 | .04 | – | | |
| PCBM fidelity (posttest) | Intercept | 3.78 | – | 0.97 | 3.9 | < .001 | .02* | – | .67 | 24.01 (< .001) |
| | Treatment | 2.39 | .43 | 0.53 | 4.51 | < .001 | .01* | .91 | | |
| | PCBM fidelity (pretest) | 0.71 | .69 | 0.09 | 7.77 | < .001 | .01* | – | | |
| | Experience | −0.07 | −.17 | 0.05 | −1.44 | .16 | .03 | – | | |
| | Grade | −0.01 | −.01 | 0.14 | −0.1 | .92 | .05 | – | | |
| | Ethnicity | −0.11 | −.04 | 0.32 | −0.35 | .73 | .04 | – | | |
| | Gender | −0.18 | −.03 | 0.8 | −0.22 | .82 | .04 | – | | |
| Academic engaged time (AET; posttest) | Intercept | 0.24 | – | 0.07 | 3.48 | < .01 | .01* | – | .67 | 16.02 (< .001) |
| | Treatment | 0.06 | .28 | 0.02 | 3.03 | < .01 | .02* | .47 | | |
| | AET (pretest) | 0.69 | .66 | 0.1 | 6.69 | < .001 | .01* | – | | |
| | Experience | −0.002 | −.15 | 0 | −1.11 | .27 | .03 | – | | |
| | Grade | 0.004 | .06 | 0.01 | 0.50 | .62 | .04 | – | | |
| | Ethnicity | 0.01 | .08 | 0.01 | 0.77 | .45 | .04 | – | | |
| | Gender | −0.01 | −.04 | 0.05 | −0.25 | .81 | .05 | – | | |

q = the FDR-corrected p-value for the corresponding hypothesis test for a variable.

The asterisk (*) following a q value indicates that, for this test, the p-value is less than the q, which suggests a statistically significant result after FDR adjustment.

Treatment = study condition (reference = the attention control condition); b = unstandardized coefficient; Beta = standardized coefficient; HC-SE = heteroscedasticity-consistent standard error; "–" = not applicable; Adj. R² = adjusted R-square

Mediational Mechanisms of Change (RQs 3 and 4)

For teachers’ ITI, we first fitted Model 1 by entering the binary variable of study condition as the predictor (reference = control), the change score of ITI as the mediator, and the change score of intervention fidelity as the outcome (Table 4; Fig. 2A). Results indicated that SC-PIES exerted a significant direct effect (DE) on changes in teachers’ fidelity (DE = 1.21, p < 0.05). There was also a positive indirect/mediation effect (IE) in which changes in teachers’ ITI partially mediated the relationship between study condition and fidelity (IE = 1.04, 95% CI = [0.38, 1.79]). This finding suggests that teachers who received the SC-PIES demonstrated greater improvement in their intentions to implement PCBM, which in turn enhanced their fidelity of PCBM, as compared to those in the control condition. For Model 2 (Table 4; Fig. 2B), results indicated that teachers’ fidelity was a significant positive mediator between their ITI and youths’ AET (IE = 0.02, 95% CI = [0.01, 0.03]). However, the total effect of teachers’ ITI on their youths’ AET was non-significant (TE = 0.01, p > 0.05), whereas the direct effect was significant and negative (DE = −0.01, p < 0.001). This constitutes the phenomenon called “inconsistent mediation,” in which a significant indirect effect coexists with a non-significant total effect because the direct and indirect effects have opposite directions and cancel each other out (MacKinnon et al., 2000; Altikriti, 2022).
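The inconsistent-mediation pattern can be verified arithmetically from the Model 2 estimates in Table 4: the total effect decomposes into the sum of the direct and indirect effects, and the opposite signs nearly cancel:

```python
# Model 2 estimates from Table 4 (three-decimal values as reported).
direct = -0.014    # significant negative direct effect of ITI on AET
indirect = 0.019   # significant positive indirect effect through fidelity
total = direct + indirect
print(total)  # 0.005, matching the reported (non-significant) total effect
```

Because the ±0.014 and ±0.019 paths offset each other, the total effect shrinks to 0.005 and loses significance even though both component paths are individually significant.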

Table 4.

Configurations and results of the mediation models

| Model # | IV | M | DV | Direct effect | Indirect/mediation effect | 95% CI of mediation effect | Total effect | Mediation type |
|---|---|---|---|---|---|---|---|---|
| 1 | Trt | ITI | Fid | 1.207* (0.497) | 1.036* | [0.377, 1.791] | 2.242*** (0.61) | Partial mediation |
| 2 | ITI | Fid | AET | −0.014*** (0.004) | 0.019* | [0.012, 0.027] | 0.005 (0.003) | Inconsistent mediation |

n = 43. *p < .05, **p < .01, ***p < .001.

IV = predictor; M = mediator; DV = outcome; ITI = change score of teachers’ intentions to implement EBPs; Fid = change score of intervention fidelity of PCBM; Trt = treatment vs control conditions. Statistics in parentheses are standard errors. A mediation effect is considered significant if its 95% confidence interval (two-tailed), based on 5,000 bias-corrected bootstraps, does not include 0. Three decimals are reported to enhance precision.

Fig. 2.

*p < .05, **p < .01, ***p < .001. N = 43. Intentions to implement, intervention fidelity, and AET are change scores calculated by subtracting pretest scores from posttest scores to account for baseline status. AET = academic engaged time. All beta weights are based on 5,000 bias-corrected bootstrap estimations: βa is the beta weight of the mediator regressed on the predictor, βb is the beta weight of the outcome regressed on the mediator, βc is the beta weight for the predictor when the mediator is controlled (direct effect), and βd is the beta weight for the predictor when the mediator is not controlled (total effect)

Discussion

There is a need for theory-informed PIES to prevent the lackluster implementation and youth outcomes that can result from training and follow-up consultation on universal preventive practices in school-based youth behavioral healthcare. This study offers preliminary evidence for a PIES that is based on social cognitive theories (SC-PIES) and designed to be pragmatic (e.g., 60-min group delivery). PIES preventively target a motivational mechanism of behavior change (ITI) before school-based service providers receive EBP-specific training and follow-up consultation. Using a triple-blind parallel randomized controlled trial, this study found that the SC-PIES was effective at boosting teachers’ responsiveness to EBP-specific training and consultation, as evidenced by improved ITI, intervention fidelity, and youth behavioral health outcomes (i.e., behavioral academic engagement as an incompatible response to externalizing problems).

Our findings provide preliminary support for implementers’ intention to implement as an effective and modifiable mechanism of change. The SC-PIES appears to function by first enhancing teachers’ intentions to implement EBPs, which in turn leads to higher levels of intervention fidelity. Higher levels of intervention fidelity were then associated with improvements in a youth behavioral health outcome (i.e., behavioral academic engagement, AET). These findings are consistent with previous research on the theory of planned behavior indicating that implementers’ intentions are a potential change mechanism leading to increased intervention fidelity (Fishman et al., 2021). The significant mediation effect implies that teachers’ intentions to implement new EBPs are an important target before they receive training and consultation on those EBPs.

The inconsistent mediation effect of fidelity (MacKinnon et al., 2000) echoes the mixed findings of extant research on the hypothesized causal chain linking implementers’ cognition, implementation behavior, and client outcomes (e.g., Godin et al., 2008; Lewis et al., 2018; Williams, 2015). Consistent with the behavior change literature, the positive indirect effect (Table 4) supports the pathway in which improved ITI led to better implementation fidelity, which in turn led to improved student AET (Larson et al., 2021; Williams, 2015). Though seemingly counterintuitive, the negative direct effect of ITI on student AET can be explained in multiple ways. First, mediation effect estimates can be unstable in small samples; the small yet negative estimate may therefore be a statistical artifact of fluctuating standard errors rather than a meaningful effect (Hayes, 2017). Second, unobserved moderators may confound the direction of the association between ITI and AET, such as student–teacher race match and relationship quality (Diaz et al., 2017; Wittrup et al., 2019). Similarly, the pathway between ITI and AET may contain unobserved mediators with opposite effects on the two constructs, which could produce the negative association observed in this study (Wu & Huang, 2007). Future research should test these hypotheses with large samples by exploring moderators and mediators to elucidate the causal chain among ITI, implementation fidelity, and student behaviors (Birken et al., 2020).

Implications for Prevention Research and Practice

This study contributes to emerging research investigating PIES as a generic, EBP-agnostic pre-implementation strategy situated between the preparation and active implementation stages of implementing universal prevention/intervention practices in school settings (e.g., Larson et al., 2021; Lyon et al., 2019). In field application, practitioners may integrate the SC-PIES seamlessly into the beginning of any EBP training. Notably, the 2-day delay between SC-PIES and EBP training in this tightly controlled study is not necessary in practice, because field practice does not need to isolate study components or control for confounding. This study also demonstrated the importance and utility of social cognitive theories (e.g., TPB; Birken et al., 2020; Godin & Kok, 1996) for informing the design of PIES in school-based mental health settings. Consistent with prior research on TPB, our findings support the promise of a specific PIES (SC-PIES) targeting teachers’ intentions to implement (ITI), which served as a mechanism of behavior change that enhanced the yield of EBP training and follow-up consultation, as evidenced by improved implementation and youth outcomes.

Moreover, our findings suggest that ITI may serve as a tailoring variable to differentiate the intensity and frequency of implementation supports. For teachers with weak pretraining intentions, PIES can be used to boost their intentions to implement EBPs, which has the potential to increase practitioners’ engagement with and responsiveness to training and consultation supports. For teachers with strong pretraining intentions, active implementation support alone (e.g., training) may serve as an effective strategy to produce successful implementation (Collier-Meek et al., 2017). Future research should explore a precision-based approach, especially in low-resource settings (e.g., public schools), to develop implementation strategies that are adaptive to individual characteristics as a cost-effective way to improve implementation and client outcomes using methods such as SMART designs (Eisman et al., 2020). Research in several fields (e.g., public health, schools) indicates that adaptive strategies are more likely to produce superior implementation and client outcomes than a “one-size-fits-all” approach (Sohl & Moyer, 2007).

The field of implementation science is at risk of replicating the very problem it seeks to solve (i.e., the science-to-practice gap) by creating a divide between implementation research and practice (Westerlund et al., 2019). This gap is especially prevalent in school-based implementation science and practice. The complexity, cost, and expertise required to deliver implementation strategies affect whether those strategies can feasibly be transferred into routine practice (Brownson et al., 2017). Too often, researchers develop strategies that are unlikely to be adopted or used under real-world conditions to support the delivery of prevention services, which exacerbates the science-to-practice gap. The brevity and intuitiveness of the SC-PIES were intentional, ensuring that it was pragmatic and fit the context of the education sector, where staff are often burned out and have limited time available for training or delivery of new EBPs (Fiorilli et al., 2020). A focus on the pragmatics of implementation strategies increases their potential to be translated into real-world practice if effectively disseminated (Glasgow, 2013).

Limitations and Future Directions

Several limitations should be considered when interpreting the findings of this study. First, our findings may be moderated by the characteristics and complexity of EBPs. In this pilot trial, we selected one type of EBP that is commonly used and well established in schools (i.e., PCBM). It is unclear how our findings would hold up with EBPs that are more complex and demanding in terms of learning and time (e.g., multi-component complex therapies). Future research should extend our study by testing the effect of SC-PIES on a spectrum of EBPs with varying levels of complexity and resource demands. Second, this preliminary study utilized a single-level sample of teachers from two schools. Teachers from the same building may have communicated with each other, which could lead to treatment contamination (Danga & Korb, 2014). However, contamination was unlikely in this study because teachers had to access the training materials and participate in the SC-PIES activities to obtain an effective dosage of treatment exposure (Teerenstra et al., 2006). This contamination effect may even be leveraged in future studies by implementing the SC-PIES as a universal preventive implementation strategy with a school-level effect and broad reach. Future research should replicate this study with a large sample using cluster or pseudo-cluster randomized trials to estimate the school-level effect of SC-PIES while controlling for contamination (Pence et al., 2015). Third, self-selection bias may limit the representativeness of our sample and the generalizability of our findings. Future research should address this bias with large and diverse samples, novel participant recruitment and retention strategies, and intent-to-treat analyses (Elston, 2021). Although our findings supported the efficacy of the SC-PIES, not all teachers demonstrated comparable changes in their intentions to implement, implementation behavior, and youth outcomes in response to the SC-PIES.
Future research should explore school contextual moderators (e.g., student demographics, teacher stress, or burnout) to elucidate for whom the SC-PIES works best and to guide a tailored approach to implementation support. Lastly, the SC-PIES should be replicated in other youth behavioral healthcare settings to test whether its effects transfer and hold across different provider and client populations (e.g., therapists in community clinics; Anderson & Maxwell, 2017).

Conclusion

This study offers insights into how to design theory-informed, pragmatic PIES that preventively boost the yield of training and consultation. Our findings suggest that promoting teachers’ intentions to implement, before training and consultation on preventive EBPs, can be a viable way to enhance subsequent intervention fidelity and student outcomes in school-based mental health settings. The findings highlight the importance of mechanism-oriented approaches to developing and testing implementation strategies, especially those grounded in social cognitive theories. We call for further investigation of cost-effective ways to elevate practitioners’ motivation as a mechanism of behavior change linked to their engagement with and responsiveness to training and follow-up consultation. Lastly, we hope that this study stimulates researchers to develop and test motivationally focused PIES situated between the preparation and active implementation phases of implementing prevention or intervention services in different youth-serving settings.

Supplementary Material

Supplement 1
Supplement 2
Supplement 3

Abbreviations

PIES: Pre-implementation enhancement strategies

SC-PIES: Social cognitive theory-informed pre-implementation enhancement strategies

EBP: Evidence-based practice

TPB: Theory of planned behavior

ITI: Intentions to implement

AET: Academic engaged time

Footnotes

Supplementary Information The online version contains supplementary material available at https://doi.org/10.1007/s11121-022-01464-3.

Ethics Approval and Consent to Participate Informed consent was obtained from all individual participants in the study. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee (University of Washington Human Subject Division, HSD52139EG) and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Conflict of Interest The authors declare no competing interests.

Availability of Data and Material

The de-identified datasets are available in the Open Science Framework repository (osf.io/d5t4m/).

References

  1. Abry T, Rimm-Kaufman SE, Larsen RA, & Brewer AJ (2013). The influence of fidelity of implementation on teacher-student interaction quality in the context of a randomized controlled trial of the Responsive Classroom approach. Journal of School Psychology, 51(4), 437–453. [DOI] [PubMed] [Google Scholar]
  2. Altikriti S (2022). Examining the relationship between cognitive ability and arrest using a differential offenses hypothesis: Evidence of inconsistent mediation. Crime & Delinquency, 00111287211057862. [Google Scholar]
  3. Aronson J, Fried CB, & Good C (2002). Reducing the effects of stereotype threat on African American college students by shaping theories of intelligence. Journal of Experimental Social Psychology, 38(2), 113–125. [Google Scholar]
  4. Anderson SF, & Maxwell SE (2017). Addressing the “replication crisis”: Using original studies to design replication studies with appropriate statistical power. Multivariate Behavioral Research, 52(3), 305–324. [DOI] [PubMed] [Google Scholar]
  5. Baron RM, & Kenny DA (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173. [DOI] [PubMed] [Google Scholar]
  6. Benjamini Y, & Hochberg Y (1995). Controlling the false discovery rate: A practical and powerful approach to multiple testing. Journal of the Royal Statistical Society: Series B (methodological), 57(1), 289–300. [Google Scholar]
  7. Bevilacqua L, Hale D, Barker ED, & Viner R (2018). Conduct problems trajectories and psychosocial outcomes: A systematic review and meta-analysis. European Child & Adolescent Psychiatry, 27(10), 1239–1260. [DOI] [PubMed] [Google Scholar]
  8. Birken SA, Haines ER, Hwang S, Chambers DA, Bunger AC, & Nilsen P (2020). Advancing understanding and identifying strategies for sustaining evidence-based practices: A review of reviews. Implementation Science, 15(1), 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Blackwell LS, Trzesniewski KH, & Dweck CS (2007). Implicit theories of intelligence predict achievement across an adolescent transition: A longitudinal study and an intervention. Child Development, 78(1), 246–263. [DOI] [PubMed] [Google Scholar]
  10. Brownson RC, Eyler AA, Harris JK, Moore JB, & Tabak RG (2018). Research full report: Getting the word out: New approaches for disseminating public health science. Journal of Public Health Management and Practice, 24(2), 102. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Brownson RC, Colditz GA, & Proctor EK (2017). Dissemination and implementation research in health: Translating science to practice. Oxford University Press. [Google Scholar]
  12. Chafouleas SM (2011). Direct behavior rating: A review of the issues and research in its development. Education and Treatment of Children, 34(A), 575–591. [Google Scholar]
  13. Christofferson M, & Sullivan AL (2015). Preservice teachers’ classroom management training: A survey of self-reported training experiences, content coverage, and preparedness. Psychology in the Schools, 52(3), 248–264. [Google Scholar]
  14. Cialdini RB, & Goldstein NJ (2004). Social influence: Compliance and conformity. Annual Review of Psychology, 55, 591–621. [DOI] [PubMed] [Google Scholar]
  15. Cialdini RB (2001). The science of persuasion. Scientific American, 284(2), 76–81.11285825 [Google Scholar]
  16. Collier-Meek M, Fallon LM, & DeFouw ER (2017). Toward feasible implementation support: E-mailed prompts to promote teachers’ treatment integrity. School Psychology Review, 46(4), 379–394. [Google Scholar]
  17. Cook CR, Fiat A, Larson M, Daikos C, Slemrod T, Holland EA, & Renshaw T (2018). Positive greetings at the door: Evaluation of a low-cost, high-yield proactive classroom management strategy. Journal of Positive Behavior Interventions, 20(3), 149-. [Google Scholar]
  18. Cook CR, Frye M, Slemrod T, Lyon AR, Renshaw TL, & Zhang Y (2015). An integrated approach to universal prevention: Independent and combined effects of PBIS and SEL on youths’ mental health. School Psychology Quarterly, 30(2), 166. [DOI] [PMC free article] [PubMed] [Google Scholar]
  19. Cook CR, Lyon AR, Locke J, Waltz T, & Powell BJ (2019). Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prevention Science, 20(6), 914–935. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Diaz A, Eisenberg N, Valiente C, VanSchyndel S, Spinrad TL, Berger R, & Southworth J (2017). Relations of positive and negative expressivity and effortful control to kindergarteners’ student–teacher relationship, academic engagement, and externalizing problems at school. Journal of Research in Personality, 67, 3–14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Danga L, & Korb KA (2014). The effect of treatment diffusion on educational experimental designs. Benin J Educ Stud, 23, 29–37. [Google Scholar]
  22. Dickie J, & Shuker MJ (2014). Ben 10, superheroes and princesses: Primary teachers’ views of popular culture and school literacy. Literacy, 48(1), 32–38. [Google Scholar]
  23. Duong MT, Bruns EJ, Lee K, Cox S, Coifman J, Mayworm A, & Lyon AR (2021). Rates of mental health service utilization by children and adolescents in schools and other common service settings: A systematic review and meta-analysis. Administration and Policy in Mental Health and Mental Health Services Research, 48(3), 420–439. [DOI] [PubMed] [Google Scholar]
  24. Dusenbury L, Brannigan R, Hansen WB, Walsh J, & Falco M (2005). Quality of implementation: Developing measures crucial to understanding the diffusion of preventive interventions. Health Education Research, 20(3), 308–313. [DOI] [PubMed] [Google Scholar]
  25. Edmunds JM, Beidas RS, & Kendall PC (2013). Dissemination and implementation of evidence–based practices: Training and consultation as implementation strategies. Clinical Psychology: Science and Practice, 20(2), 152. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Elston DM (2021). Participation bias, self-selection bias, and response bias. Journal of the American Academy of Dermatology. [DOI] [PubMed] [Google Scholar]
  27. sFaul F, Erdfelder E, Lang AG, & Buchner A (2007). G* Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191. [DOI] [PubMed] [Google Scholar]
  28. Fazel M, Hoagwood K, Stephan S, & Ford T (2014). Mental health interventions in schools in high-income countries. The Lancet Psychiatry, 1(5), 377–387. [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Filter KJ, & Brown J (2019). Validation of the PBIS-ACT full: An updated measure of staff commitment to implement SWPBIS. Remedial and Special Education, 40(1), 40–50. [Google Scholar]
  30. Fiorilli C, Buonomo I, Romano L, & Pepe A (2020). Teacher confidence in professional training: The predictive roles of engagement and burnout. Sustainability, 12(16), 6345. [Google Scholar]
  31. Fishman J, Yang C, & Mandell D (2021). Attitude theory and measurement in implementation science: A secondary review of empirical studies and opportunities for advancement. Implementation Science, 16(1), 1–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. Glasgow RE (2013). What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Education & Behavior, 40(3), 257–265. [DOI] [PubMed] [Google Scholar]
  33. Godin G, Bélanger-Gravel A, Eccles M, & Grimshaw J (2008). Healthcare professionals’ intentions and behaviours: A systematic review of studies based on social cognitive theories. Implementation Science, 3(1), 1–12. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Godin G, & Kok G (1996). The theory of planned behavior: a review of its applications to health-related behaviors. American journal of health promotion, 11(2), 87–98. [DOI] [PubMed] [Google Scholar]
  35. Hayes AF, & Cai L (2007). Using heteroskedasticity-consistent standard error estimators in OLS regression: An introduction and software implementation. Behavior Research Methods, 39(4), 709–722. [DOI] [PubMed] [Google Scholar]
  36. Hayes AF (2017). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach. Guilford publications. [Google Scholar]
  37. Kazdin AE (2007). Mediators and mechanisms of change in psychotherapy research. Annual Review of Clinical Psychology, 3, 1–27. [DOI] [PubMed] [Google Scholar]
  38. Kincaid D, Childs K, Blase KA, & Wallace F (2007). Identifying barriers and facilitators in implementing schoolwide positive behavior support. Journal of Positive Behavior Interventions, 9(3), 174–184. [Google Scholar]
  39. Kortteisto T, Kaila M, Komulainen J, Mäntyranta T, & Rissanen P (2010). Healthcare professionals’ intentions to use clinical guidelines: A survey using the theory of planned behaviour. Implementation Science, 5(1), 1–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  40. Larson M, Cook CR, Brewer SK, Pullmann MD, Hamlin C , Merle JL, & Lyon AR (2021). Examining the effects of a brief, group-based motivational implementation strategy on mechanisms of teacher behavior change. Prevention Science, 22(6), 722–736. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Lee P, & Bierman KL (2015). Classroom and teacher support in kindergarten: Associations with the behavioral and academic adjustment of low-income students. Merrill-Palmer quarterly (Wayne State University. Press; ), 61(3), 383. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Leflot G, Van Lier PA, Verschueren K, Onghena P, & Colpin H (2011). Transactional associations among teacher support, peer social preference, and child externalizing behavior: A four-wave longitudinal study. Journal of Clinical Child & Adolescent Psychology, 40(1), 87–99. [DOI] [PubMed] [Google Scholar]
  43. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, & Weiner B (2018). From classification to causality: Advancing understanding of mechanisms of change in implementation science. Frontiers in Public Health, 6, 136. [DOI] [PMC free article] [PubMed] [Google Scholar]
  44. Low S, Smolkowski K, & Cook C (2016). What constitutes high-quality implementation of SEL programs? A latent class analysis of second step implementation. Prevention Science, 17(8), 981–991. [DOI] [PubMed] [Google Scholar]
  45. Lyon AR, & Bruns EJ (2019). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11(1). [DOI] [PMC free article] [PubMed] [Google Scholar]
  46. Lyon AR, Cook CR, Brown EC, Locke J, Davis C, Ehrhart M, & Aarons GA (2018). Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science, 13(1), 1–14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  47. Lyon AR, Cook CR, Duong MT, Nicodimos S, Pullmann MD, Brewer SK, & Cox S (2019). The influence of a blended, theoretically-informed pre-implementation strategy on school-based clinician implementation of an evidence-based trauma intervention. Implementation Science, 14(1), 1–16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  48. Lyon AR, Pullmann MD, Walker SC, & D’Angelo G (2017). Community-sourced intervention programs: Review of submissions in response to a statewide call for “promising practices.” APMH, 44(1), 16–28. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. MacKinnon DP, Krull JL, & Lockwood CM (2000). Equivalence of the mediation, confounding and suppression effect. Prevention Science, 1(4), 173–181. [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Mangurian C, Niu G, Schillinger D, Newcomer J, Dilley J, & Handley M (2017). Utilization of the behavior change wheel framework to develop a model to improve cardiometabolic screening for people with severe mental illness. Imp Sci, 12(1), 1–16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  51. McLeod BD, Cox JR, Jensen-Doss A, Herschell A, Ehrenreich-May J, & Wood JJ (2018). Proposing a mechanistic model of clinician training and consultation. Clinical Psychology: Science and Practice, 25(3) [DOI] [PMC free article] [PubMed] [Google Scholar]
  52. Mitchell MM, & Bradshaw CP (2013). Examining classroom influences on student perceptions of school climate: The role of classroom management and exclusionary discipline strategies. Journal of School Psychology, 51(5), 599–610. [DOI] [PubMed] [Google Scholar]
  53. Moher D, Hopewell S, Schulz KF, Montori V, Gøtzsche PC, Devereaux PJ, & Altman DG (2012). CONSORT 2010 explanation and elaboration: Updated guidelines for reporting parallel group randomised trials. International Journal of Surgery, 10(1), 28. [Google Scholar]
  54. Morris SB (2008). Estimating effect sizes from pretest-posttest-control group designs. Organizational Research Methods, 11(2), 364–386. [Google Scholar]
  55. Nagro SA, Fraser DW, & Hooks SD (2019). Lesson planning with engagement in mind: Proactive classroom management strategies for curriculum instruction. Intervention in School and Clinic, 54(3), 131–140. [Google Scholar]
  56. Office of the Surgeon General. (2022). Protecting Youth Mental Health: US Surgeon General’s Advisory. Retrieved October 25, 2022, from https://www.hhs.gov/surgeongeneral/reports-and-publications/youth-mental-health/index.html
  57. Phillippi S, Beiter K, Thomas C, & Vos S (2020). Identifying gaps and using evidence-based practices to serve the behavioral health treatment needs of medicaid-insured children. Children and Youth Services Review, 115, 105089 [Google Scholar]
  58. Pence B, Gaynes B, Thielman N, Heine A, Mugavero M, Turner E, & Quinlivan E (2015). Balancing contamination and referral bias in a randomized clinical trial: An application of pseudo-cluster randomization. American Journal of Epidemiology. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, & Mandell DS (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194. [DOI] [PMC free article] [PubMed] [Google Scholar]
  60. Pratkanis AR (2011). Social influence analysis: An index of tactics. In The science of social influence (pp. 17–82). Psychology Press. [Google Scholar]
  61. Proctor EK, Powell BJ, & McMillen JC (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8(1), 1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  62. Rathvon N (2008). Effective school interventions: Evidence-based strategies for improving student outcomes. Guilford Press. [Google Scholar]
  63. Regan T, Tubman JG, & Schwartz SJ (2020). Relations among externalizing behaviors, alcohol expectancies and alcohol use problems in a multi-ethnic sample of middle and high school students. Substance Abuse: Research and Treatment, 14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  64. Robertson RE, Buonomo K, Abdellatif H, & DeMaria S (2021). Results of a “Psychologically Wise” professional development to increase teacher use of proactive behavior management strategies. Psychology in the Schools, 58(9), 1724–1740. [Google Scholar]
  65. Rodwell L, Romaniuk H, Nilsen W, Carlin JB, Lee KJ, & Patton GC (2018). Adolescent mental health and behavioral predictors of being NEET: A prospective study of young adults not in employment, education, or training. Psychological Medicine, 45(5), 861–871. [DOI] [PubMed] [Google Scholar]
  66. Sanetti LMH, & Kratochwill TR (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38(4), 445. [Google Scholar]
  67. Simonsen B, Fairbanks S, Briesch A, Myers D, & Sugai G (2008). Evidence-based practices in classroom management: Considerations for research to practice. Education and Treatment of Children, 351–380. [Google Scholar]
  68. Sohl SJ, & Moyer A (2007). Tailored interventions to promote mammography screening: A meta-analytic review. Preventive Medicine, 45(4), 252–261. [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. Spangenberg ER, & Greenwald AG (2001). Self-prophecy as a behavior modification technique in the United States. [Google Scholar]
  70. Splett JW, Garzona M, Gibson N, Wojtalewicz D, Raborn A, & Reinke WM (2019). Teacher recognition, concern, and referral of children’s internalizing and externalizing behavior problems. School Mental Health, 11(2), 228–239. [Google Scholar]
  71. Steinmetz H, Knappstein M, Ajzen I, Schmidt P, & Kabst R (2016). How effective are behavior change interventions based on the theory of planned behavior? Zeitschrift für Psychologie. [Google Scholar]
  72. Taljaard M, Goldstein CE, Giraudeau B, Nicholls SG, Carroll K, Hey SP, & Weijer C (2020). Cluster over individual randomization: Are study design choices appropriately justified? Review of a random sample of trials. Clinical Trials, 17(3), 253. [DOI] [PubMed] [Google Scholar]
  73. Teerenstra S, Melis RJF, Peer PGM, & Borm GF (2006). Pseudo cluster randomization dealt with selection bias and contamination in clinical trials. Journal of Clinical Epidemiology, 59(4), 381–386. [DOI] [PubMed] [Google Scholar]
  74. Urbaniak GC, & Plous S (2013). Research Randomizer (Version 4.0) [Computer software]. Retrieved June 22, 2013, from https://www.randomizer.org/
  75. Vandenberghe S, Vansteelandt S, & Loeys T (2017). Boosting the precision of mediation analyses of randomised experiments through covariate adjustment. Statistics in Medicine, 36(6), 939–957. [DOI] [PubMed] [Google Scholar]
  76. Walton GM, & Cohen GL (2007). A question of belonging: Race, social fit, and achievement. Journal of Personality and Social Psychology, 92(1), 82. [DOI] [PubMed] [Google Scholar]
  77. Wan F (2021). Statistical analysis of two arm randomized pre-post designs with one post-treatment measurement. BMC Medical Research Methodology, 21(1), 1–16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  78. Wason JM, & Robertson DS (2021). Controlling type I error rates in multi-arm clinical trials: A case for the false discovery rate. Pharmaceutical Statistics, 20(1), 109–116. [DOI] [PMC free article] [PubMed] [Google Scholar]
  79. Weist MD, Bruns EJ, Whitaker K, Wei Y, Kutcher S, Larsen T, & Short KH (2017). School mental health promotion and intervention: Experiences from four nations. School Psychology International, 55(4), 343–362. [Google Scholar]
  80. Westerlund A, Nilsen P, & Sundberg L (2019). Implementation of implementation science knowledge: The research-practice gap paradox. Worldviews on Evidence-Based Nursing, 16(5), 332. [DOI] [PMC free article] [PubMed] [Google Scholar]
  81. Williams NJ, & Beidas RS (2019). Annual research review: The state of implementation science in child psychology and psychiatry: A review and suggestions to advance the field. Journal of Child Psychology and Psychiatry, 60(4), 430–450. [DOI] [PMC free article] [PubMed] [Google Scholar]
  82. Williams NJ (2015). Assessing mental health clinicians’ intentions to adopt evidence-based treatments: Reliability and validity testing of the evidence-based treatment intentions scale. Implementation Science, 11(1), 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
  83. Wittrup AR, Hussain SB, Albright JN, Hurd NM, Varner FA, & Mattis JS (2019). Natural mentors, racial pride, and academic engagement among black adolescents: Resilience in the context of perceived discrimination. Youth & Society, 51(4), 463–483. [DOI] [PMC free article] [PubMed] [Google Scholar]
  84. Wood W (2000). Attitude change: Persuasion and social influence. Annual Review of Psychology, 51(1). [DOI] [PubMed] [Google Scholar]
  85. Wu HK, & Huang YF (2007). Ninth-grade student engagement in teacher-centered and student-centered technology-enhanced learning environments. Science Education, 91(5). [Google Scholar]
  86. Yeager DS, & Dweck CS (2012). Mindsets that promote resilience: When students believe that personal characteristics can be developed. Educational Psychologist, 47(4), 302–314. [Google Scholar]

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

Supplement 1
Supplement 2
Supplement 3

Data Availability Statement

The de-identified datasets are available in the Open Science Framework repository (osf.io/d5t4m/).