Author manuscript; available in PMC: 2014 Nov 3.
Published in final edited form as: Health Educ (Lond). 2013 Jun;113(4):281–296. doi: 10.1108/09654281311329231

The Effects of Implementation Fidelity in the Towards No Drug Abuse Dissemination Trial

Melissa A Little 1,*, Steve Sussman 2, Ping Sun 2, Louise Ann Rohrbach 2
PMCID: PMC4217135  NIHMSID: NIHMS527965  PMID: 24386646

Abstract

Purpose

The current study examines the influence of contextual and provider-level factors on the implementation fidelity of a research-based substance abuse prevention program. Also, it investigates whether two provider-level factors, self-efficacy and beliefs about the value of the program, statistically moderate and mediate the effects of a provider training intervention on implementation fidelity.

Design/methodology/approach

Using generalized mixed-linear modeling, we examine relationships between program provider-, organizational-, and community-level factors and implementation fidelity in a sample of 50 high school teachers from 43 high schools in 8 states across the U.S. Fidelity of implementation was assessed using an observation procedure.

Findings

Implementation fidelity was negatively associated with the urbanicity of the community and the level of teachers’ beliefs about the value of the program, and positively predicted by the organizational capacity of the school. Comprehensive training significantly increased teachers’ self-efficacy, which resulted in an increase in implementation fidelity.

Research implications

School-based prevention program implementation is influenced by a variety of contextual factors occurring at multiple ecological levels. Future effectiveness and dissemination studies need to account for the complex nature of schools in analyses of implementation fidelity and outcomes.

Practical implications

Our findings suggest that both provider- and organizational-level factors are influential in promoting implementation fidelity. Before implementation begins, as well as throughout the implementation process, training and ongoing technical assistance should be conducted to increase teachers’ skills, self-efficacy, and comfort with prevention curricula.

Originality/value

The present study is one of the few to examine contextual and provider-level correlates of implementation fidelity and use mediation analyses to explore whether provider-level factors mediate the effects of a provider training intervention on implementation fidelity.

INTRODUCTION

Considerable advances have been made in the field of school-based substance abuse prevention in the last quarter century. There are now a number of research-based prevention programs that are ready for broad dissemination and implementation (Dusenbury et al. 2010). Despite this potential for wide-scale use, recent studies have shown that only a small percentage of prevention programs implemented in schools are research-based (Hallfors et al., 2000; Ringwalt et al. 2011). Moreover, when research-based programs are implemented, a host of real-world conditions often compromise fidelity of implementation (Ennett et al., 2011). Further, poor implementation fidelity has been shown to dilute prevention program effects (Durlak & DuPre, 2008; Fixsen et al., 2005). Thus, in order to maximize the potential of research-based prevention programs to produce lasting public health impacts, it is important to understand the factors that influence fidelity and identify the conditions necessary for effective program utilization (Kam et al. 2003).

To date, there has been relatively little research investigating the influence of contextual factors on the development and implementation of prevention programming in schools. While schools offer enormous opportunity to reach large numbers of adolescents in well controlled environments, unique contextual factors embedded within the school structure are likely to impede program implementation (Chen 1998; Domitrovich et al. 2008). Schools tend to be overburdened meeting academic and policy-related priorities. Many school systems are facing significant reductions in public funds, and some are experimenting with school reform measures and structural changes in an effort to cope with these crises. The complexity of school organizational structures can be a barrier to prevention program implementation, as implementation often requires approval and buy-in from multiple levels of decision-makers including superintendents, principals, teachers, school boards, and community partners (Greenberg 2010).

Several recent conceptual frameworks propose that implementation of research-based programs takes place within many hierarchical levels (Chen, 1998; Durlak & DuPre, 2008; Wandersman et al., 2008). In Chen’s (1998) model, program implementation is influenced by three levels of factors: (1) characteristics of the program provider (e.g., experience and beliefs), (2) the organizational context (e.g., administrator support and school climate), and (3) the implementation system (e.g., staff training and infrastructure to coordinate implementation). At the core of program implementation are the teachers who deliver the programs. Fidelity of implementation has been found to increase when teachers are comfortable with the program and delivery method and have strong teaching skills, self-efficacy, enthusiasm, preparedness, and beliefs about the value and effectiveness of the program (Ringwalt et al. 2003; Rohrbach et al. 1993; Mihalic et al. 2008). Organizational characteristics are also central to understanding implementation fidelity, because administrators, teachers, and students are embedded within the shared environment of the school and school district (Domitrovich et al. 2008). Implementation fidelity of prevention programs has been associated with various aspects of the school climate, such as capacity for change, openness to change, and positive communication between teachers and administrators (Beets et al. 2008; Ennett et al. 2003; Gittelsohn et al. 2003; Gottfredson and Gottfredson 2002; Kallestad and Olweus 2003; Klimes-Dougan et al. 2009; Rohrbach et al. 2005). Additionally, when a program aligns with a school’s policies, it is more likely to be implemented with quality (Kallestad and Olweus 2003; Payne et al. 2006).
Factors related to the structure of the school and surrounding community, such as school size and urbanicity, also have been associated with prevention program implementation, such that larger schools in more urban areas demonstrate higher levels of program use (Payne 2009; Payne et al. 2006).

In the conceptual models of Durlak and DuPre (2008) and Wandersman and colleagues (2008), prevention delivery systems (e.g., organizational capacity) and prevention support systems (e.g., training and technical assistance) need to be developed within and across organizations to ensure successful prevention program implementation. Research in school settings has shown that school principal support, encouragement, and monitoring of implementation increase fidelity of implementation (Gingiss et al., 2006; Kam et al., 2003; Ringwalt et al., 2003; Rohrbach et al., 1993). In regard to prevention support systems, there is considerable evidence that the provision of training increases implementation fidelity of research-based prevention programs (Blake et al., 2005; Ennett et al., 2003; Fagan et al., 2008; Gingiss et al., 2006; Hallfors & Godette, 2002; Ringwalt et al., 2003; Roberts-Gray et al., 2007; Rohrbach, Gunning, Sun, & Sussman, 2010a). Also, there is some evidence that technical assistance (e.g., ongoing training or retraining, emotional support, consultation) leads to improved implementation of research-based interventions (Fagan et al., 2008; Gingiss et al., 2006; Roberts-Gray et al., 2007; Rohrbach, Gunning, Sun, et al., 2010a). Given the challenges associated with implementing the interactive teaching methods typical of prevention curricula, training is an essential tool for demonstrating and reinforcing the use of these methods in the context of prevention programs (Ennett et al., 2003).

There is an urgent need for more studies on the contextual factors that impede and enhance implementation of research-based prevention programs under conditions common in schools. At present, we have a limited understanding of the range of factors that may potentially influence a school to adopt a research-based program and a teacher to implement it with fidelity, and which of these factors may be modifiable through training and other interventions. Although training appears to be essential in enhancing fidelity of program implementation (Durlak & DuPre, 2008), research on the relative effectiveness of various training models, as well as the processes by which training works, is largely lacking (Rohrbach et al., 2006). In the current study, we address some of these gaps in the literature by examining factors influencing the implementation fidelity of a research-based prevention program and exploring the effect of a training intervention on two modifiable factors that appear to be important predictors of implementation fidelity: teacher self-efficacy and beliefs about the value of the program.

The Current Study

The present study examines the implementation fidelity of Project Towards No Drug Abuse (TND), a research-based substance abuse prevention program for high school students (Sussman et al. 2003). The Project TND curriculum uses a motivation, skills, and decision-making approach to change substance use and violence-related behaviors (Sussman et al. 2004b). The curriculum has 12 classroom sessions, lasting approximately 45 minutes each. Through interactive teaching techniques, the curriculum provides students with cognitive motivation enhancement activities, information about the consequences of drug use, correction of cognitive misperceptions, communication and coping skills enhancement, and tobacco cessation techniques (Skara et al. 2005; Sussman et al. 2004a).

Project TND has been evaluated in seven randomized trials that have demonstrated an impact of the program on 30-day substance use at a one-year follow-up or longer (Dent et al. 2001; Rohrbach et al. 2010b; Sun et al. 2008; Sun et al. 2006; Sussman et al. 2002; Sussman et al. 2003; Sussman et al. 2011; Sussman et al. 1998; Valente et al. 2007). The current paper reports on findings from the Project Towards No Drug Abuse (TND) Dissemination Trial, designed to test the relative effectiveness of two approaches to training high school teachers to implement Project TND: a standard training workshop versus a comprehensive training and implementation support model (Rohrbach et al. 2010a; Rohrbach et al. 2010b). Previously, we reported that implementation fidelity was higher in the comprehensive training and implementation support condition relative to the standard training workshop alone (Rohrbach et al. 2010a).

The current study addressed four research questions. First, what contextual- and provider-level factors influence some teachers to implement Project TND with greater fidelity than others? Second, what was the differential effect of the training interventions on teachers’ self-efficacy to implement TND and positive beliefs about the value of the program? Third, did teachers’ beliefs about the value of the program prior to implementation moderate the effect of the comprehensive training on implementation fidelity? Fourth, did changes in self-efficacy and beliefs about the value of the program mediate the effect of the comprehensive training on implementation fidelity?

METHODS

School Selection and Experimental Design

A total of 65 high schools from 10 school districts across the United States were recruited for the study (Rohrbach et al. 2010a). The sample was derived from a pool of school districts that had requested information about purchasing, but had not yet adopted, Project TND. Districts were approached if they had at least three high schools that could be randomized to the experimental conditions. Within each recruited school district, participating schools were randomly assigned to one of the following conditions: (1) comprehensive training and implementation support for Project TND teachers (standard training workshop plus on-site coaching, technical assistance, and web-based support); (2) standard workshop training only; or (3) standard care control. The current study focuses on the 43 schools in the two training (program) conditions.

Within each program school, at least one teacher was recruited to participate in the training intervention and deliver Project TND to his/her students. The school designated the subject area for program implementation (health or physical education). Delivery of the program took place in existing class groupings of students.

Teachers completed two surveys: prior to implementation (baseline) and immediately following implementation (immediate posttest). School administrators completed one survey immediately following implementation. Students completed surveys at baseline, 1–2 weeks following implementation, and one year following implementation. All student surveys were administered by project staff at single classroom sessions during regular school hours. A more detailed account of the school selection, experimental design, and research procedures can be found in Rohrbach et al. (2010a).

Subjects

Study subjects included teachers and school administrators from the 43 schools assigned to the program conditions. Forty-one school administrators from the 43 program schools completed a survey (21 and 20 administrators in the comprehensive training and standard training condition, respectively). Of the 59 teachers in the program conditions, 50 teachers had complete data on the baseline and post-implementation surveys, had their program delivery observed at least once (described below), and their school administrator had completed a survey. Of these teachers, 28 were in the comprehensive training and 22 were in the standard training. Characteristics of the teachers and school administrators are presented in Table 1. There were no statistically significant differences between the two program conditions in the demographic characteristics of teachers and administrators (not shown).

Table 1.

Characteristics of teachers and school administrators

μ (std) or %
Teacher Characteristics (N = 50)
  Teaching experience (in years) (μ) 15.5(11.5)
  Age (μ) 36.9(17.3)
  Female (%) 52.0
  Ethnicity (%)
    White, non-Hispanic 77.6
    Latino 6.1
    African American 14.3
    Asian 2.0
  Health teacher (%) 83.7
  Certified to teach health (%) 79.6
  Master's degree (%) 53.1
School Administrator Characteristics (N = 41)
  Experience as principal (in years) (μ) 9.2(8.6)
  Age (μ) 49.2(8.7)
  Female (%) 20.0
  Ethnicity (%)
    White, non-Hispanic 65.9
    Latino 7.3
    African American 17.1
    Asian or Pacific Islander 2.4
    American Indian or Alaskan Native 2.4
    Other 2.4

Data Collection and Measurement

Many of the items included in the teacher and administrator surveys were adapted from previous studies (e.g., Battistich et al. 1995; Ennett et al. 2003; Rohrbach et al. 1993; Rohrbach et al. 1998; Steckler et al. 1992) and some were developed specifically for the study.

Independent Variables

Teacher Self-Report Measures

Immediately following the training workshop, teachers completed a baseline survey that assessed their background characteristics, beliefs, and perceptions of the school climate. Background characteristics included gender, age, ethnicity, overall teaching experience and experience with drug education (in years), degrees obtained (master’s degree or less), certification to teach health (yes or no), and subject taught (physical education or health). Disciplinary teaching style comprised two items that were averaged, measuring the extent to which teachers used strict discipline in their classroom (4-point scales, 1=definitely not me to 4=definitely me; r=.63). Perceived positive school climate was a composite index of six items that assessed positive aspects of the work climate at the teacher’s school (e.g., “Teachers in this district feel free to communicate with district administrators”; each on 5-point response scales, 1=strongly disagree to 5=strongly agree; α=.83).

On both the baseline and post-implementation surveys, teachers completed items that assessed self-efficacy and beliefs regarding TND. Self-efficacy to implement Project TND was a composite index of three items (e.g., “How confident are you that you will do (did) a good job teaching Project TND?”), each with 10-point response scales (1=not at all confident to 10=very confident; α=.74), that were standardized and averaged. Beliefs about the value of Project TND was a composite of four items (e.g., “How successful do you think Project TND will be (was) in preventing or reducing substance use by your students?”), each with 10-point response scales (1=not at all successful to 10=very successful; α=.88), that were standardized and averaged.
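The composite indexes above are built by standardizing each item, averaging across items, and checking internal consistency with Cronbach's alpha. A minimal pure-Python sketch of that procedure follows; the item scores are hypothetical, not study data.

```python
from statistics import mean, stdev, variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item)."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]      # per-respondent total score
    item_var = sum(variance(vals) for vals in items)  # sum of item variances
    return k / (k - 1) * (1 - item_var / variance(totals))

def standardized_composite(items):
    """z-score each item, then average across items for each respondent."""
    z_items = []
    for vals in items:
        m, s = mean(vals), stdev(vals)
        z_items.append([(v - m) / s for v in vals])
    return [mean(resp) for resp in zip(*z_items)]

# Hypothetical 10-point self-efficacy ratings: three items, five teachers
items = [
    [8, 6, 9, 5, 7],
    [7, 5, 9, 4, 8],
    [9, 6, 8, 5, 7],
]
alpha = cronbach_alpha(items)              # internal consistency of the scale
composite = standardized_composite(items)  # one composite score per teacher
```

Because each item is z-scored before averaging, the composite has mean zero by construction, mirroring the standardized indexes reported in the text.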

School Administrator Self-Report Measures

School Administrators (principals) completed a self-report survey following initial program implementation that assessed their perceptions of the school and school district context. Measures of the school district context included their perceptions of a clear mandate from the district to implement substance use prevention education (1 item; 5-point scale; 1=strongly disagree to 5=strongly agree) and the extent to which the district encouraged the use of a research-based substance abuse program (1 item; 5-point scale; 1=strongly disagree to 5=strongly agree). Measures of the school context included the school’s openness to change (1 item; 5-point scale; 1=strongly disagree to 5=strongly agree) and the extent of a collaborative environment at the school (3 items, averaged; 5-point scales, 1=strongly disagree to 5=strongly agree; α=.81). The organizational capacity index was the mean of four items, including staff turnover (1= high to 3=low), experience as a principal (in years), the presence of specific teachers within the school who were responsible for substance use prevention education (0=no, 1=yes) and whether teachers received the resources they need to implement substance use prevention curricula (0=no, 1=yes). Cronbach’s alpha for this index was 0.56.

Additional Measures of Program Context

School demographic characteristics, including percentage of white students, population density (recoded into three categories: urban, suburban, and rural), and student poverty (percentage of students falling below the federal government poverty level), were taken from the national Common Core Data file (Thomas et al. 2007). An index of urbanicity was created by averaging these school demographic characteristics (α = .85).

Student drug use items at baseline included 30-day use of cigarettes, alcohol, marijuana, and “hard drugs.” The hard drug use index summed responses to six items regarding use of cocaine, hallucinogens, inhalants, stimulants, ecstasy, and “other” drugs (i.e., depressants, PCP, steroids, heroin, or “other” drugs) in the last 30 days. For each substance use item, there were eight response options, including “0”, “1–10”, “11–30,” “31–50”, “51–70”,“71–90”, “91–100,” and “more than 100 times.” For data analyses, a school-level mean was created for each drug use variable. The student baseline drug use index was a composite that averaged the school-level hard drug use index and the 30-day use of cigarettes, alcohol, and marijuana variables (α=0.87).

Outcomes

Fidelity of Implementation

Fidelity of program implementation was assessed with a classroom observation procedure that was used in previous Project TND trials (Rohrbach, et al., 2007). Initially, the goal was to observe each teacher twice, while he/she delivered the same TND lesson (#5) to two separate class groupings. However, for one of the 50 teachers, observation was possible during only one classroom period; thus, analyses of implementation fidelity data are based on a total of 99 classroom observations. The lesson that was observed (#5) utilized psychodrama techniques to simulate a talk show in which various negative consequences of drug abuse were presented. This lesson was selected because it is highly interactive and the classroom observation procedure emphasizes assessment of program process. Observations were conducted by trained members of the Project TND staff, and observers were not blind to the experimental condition of the school.

Fidelity of implementation commonly refers to “the degree to which teachers and other program providers implement programs as intended by the program developers” (Dusenbury et al. 2003). Previous reviews of the literature have shown that fidelity of program implementation is generally assessed through five domains (Dane & Schneider, 1998; Dusenbury et al., 2003): (1) adherence to the program, (2) dose, (3) quality of program delivery, (4) participant responsiveness, and (5) program differentiation (the degree to which components that would distinguish the program from another program were presented). In the current study, we measured three of these five domains of implementation fidelity: adherence to the program as written, quality of program delivery, and participant responsiveness. All of the items utilized seven-point rating scales that specified behaviorally anchored criteria for the end- and mid-points of the scales (e.g., Mowbray, Holter, Teague, & Bybee, 2003). Adherence to the program (3 items) assessed whether the objectives of the lesson were met, the extent to which the teacher elicited student participation and responses, and how well the lesson went overall (7-point scales; 1=not at all to 7=a great deal). These items were averaged to create an index (α=0.88). The quality of program delivery index was the mean of four items that assessed teacher enthusiasm, confidence, and the extent to which the teacher treated students respectfully (7-point scales; 1=not at all to 7=a great deal; α=0.92). The participant responsiveness index comprised three items (averaged) that measured how interested the students appeared to be, how much they seemed to like the teacher, and class control (7-point scales; 1=not at all to 7=a great deal; α=0.93). Because these three indexes were highly inter-correlated (r=0.95), for data analyses the individual items were averaged and standardized (mean=0; std=1) to create a composite implementation fidelity score (α=0.96).

DATA ANALYSIS

All analyses were conducted using generalized mixed-linear modeling (Murray and Hannan 1990). Prior to conducting regression models, variables were standardized (mean = 0, std = 1). Random effects adjusted for in the models included school district, school, and teacher. Betas and standard errors are reported. All analyses were conducted using the SAS statistical package (SAS Institute Inc. SAS/C Online Doc TM 2000). We used two-tailed tests with significance set at p < .10.

To explore our first question of what contextual and provider-level factors were related to fidelity of implementation, the composite implementation fidelity score was regressed on each of the correlates independently as a fixed effect variable, controlling for the training condition of the school. Those variables that showed a statistically significant effect on implementation fidelity in the bivariate analyses at p < .10 were included in the final multivariate model.
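The screening step described above (regressing the fidelity score on each standardized correlate while controlling for training condition) can be sketched with simulated data. This is an illustrative single-level ordinary least squares version only; the study's models also adjusted for random effects of district, school, and teacher, which this sketch omits, and all data and effect sizes here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical classroom-level data (n = 99, matching the paper's observation
# count): training condition (1 = comprehensive, 0 = standard), one candidate
# correlate (urbanicity), and a fidelity score built to depend on both.
n = 99
condition = rng.integers(0, 2, n).astype(float)
urbanicity = rng.normal(size=n)
fidelity = 0.4 * condition - 0.6 * urbanicity + rng.normal(size=n)

def zscore(x):
    """Standardize to mean 0, std 1, as in the paper's analysis setup."""
    return (x - x.mean()) / x.std(ddof=1)

def ols(y, predictors):
    """OLS via least squares; returns [intercept, b1, b2, ...] on z-scored data."""
    X = np.column_stack([np.ones(len(y))] + [zscore(p) for p in predictors])
    beta, *_ = np.linalg.lstsq(X, zscore(y), rcond=None)
    return beta

# Bivariate screen: regress fidelity on the correlate, controlling for condition.
beta = ols(fidelity, [condition, urbanicity])
urbanicity_beta = beta[2]  # expected to be negative, given the simulated effect
```

In the study, each correlate passing this screen at p < .10 was carried into the final multivariate model.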

To examine our second question of whether the training intervention had an effect on changes in teacher beliefs about the value of the program and self-efficacy, the data were averaged to the teacher level. Next, post-test teacher beliefs about the value of the program and self-efficacy were regressed on training condition as a fixed effect variable, controlling for the pretest value of the respective index.

Our third analysis examined whether holding positive beliefs about the value of the program prior to implementation moderated the effect of the comprehensive training on implementation fidelity (Aiken & West, 1991). First, we centered the baseline index of beliefs about the value of the program. Next, implementation fidelity was regressed on a 2-way interaction: training condition × beliefs about the value of the program. Then, we took a median split of beliefs about the value of the program using PROC RANK (SAS Institute Inc. SAS/C Online Doc TM 2000). We ran two regression models where implementation fidelity was regressed on training condition: (1) among teachers with fewer positive beliefs about the value of the program and (2) among teachers with more positive beliefs about the value of the program. Finally, we re-ran these regression models controlling for self-efficacy, perceived positive school climate, urbanicity and organizational capacity.
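The moderation steps above (center the moderator, test the condition-by-beliefs interaction, then run the training-effect regression within median-split subgroups) can be sketched as follows. All data are simulated for illustration, with a moderation effect built in; the study's models additionally adjusted for random effects and the listed covariates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50  # hypothetical teacher-level sample, matching the study's 50 teachers

beliefs = rng.normal(size=n)                      # baseline beliefs about value
condition = rng.integers(0, 2, n).astype(float)   # 1 = comprehensive, 0 = standard
# Simulated moderation: training helps mainly when baseline beliefs are high
fidelity = 0.3 * condition + 1.0 * condition * beliefs + 0.3 * rng.normal(size=n)

def ols(y, predictors):
    """OLS coefficients [intercept, b1, ...] via least squares."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Step 1: center the moderator, then test the condition x beliefs interaction.
beliefs_c = beliefs - beliefs.mean()
interaction_beta = ols(fidelity, [condition, beliefs_c, condition * beliefs_c])[3]

# Step 2: median split on the moderator, then estimate the training effect
# separately within each subgroup.
high = beliefs > np.median(beliefs)
effect_high = ols(fidelity[high], [condition[high]])[1]
effect_low = ols(fidelity[~high], [condition[~high]])[1]
```

A positive interaction coefficient together with a larger subgroup training effect among high-beliefs teachers is the pattern the study reports for its data.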

Finally, we explored whether changes from baseline to posttest in teacher self-efficacy and beliefs about the value of the program mediated the effects of the training intervention on implementation fidelity, using mediation analysis with two steps of regressions (MacKinnon, 2007a). In the first step, each of the potential mediators (change in self-efficacy and change in beliefs) was regressed on the training condition and other covariates. In the second step, the outcome (implementation fidelity) was regressed on training condition, the potential mediators, and other covariates. The point estimates of the mediated effects were calculated as the multiplication of the two coefficients linking the training condition (X) to potential mediators (M), and linking the potential mediators to implementation fidelity (Y). The asymmetrical confidence interval for the mediated effect was then estimated in PRODCLIN (MacKinnon, 2007b), based on the distribution of the product method.
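The two-step product-of-coefficients procedure above can be sketched on simulated data: path a (training condition to mediator), path b (mediator to fidelity, controlling for condition), and the indirect effect a × b. For the interval, this sketch substitutes the simpler Sobel normal-approximation standard error for the distribution-of-the-product interval that PRODCLIN computes; the data and coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50

condition = rng.integers(0, 2, n).astype(float)              # X: training condition
efficacy_change = 0.8 * condition + 0.4 * rng.normal(size=n)  # M: mediator
fidelity = 0.6 * efficacy_change + 0.2 * condition + 0.4 * rng.normal(size=n)  # Y

def ols_with_se(y, predictors):
    """OLS coefficients and standard errors via the normal equations."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

# Step 1: regress the mediator on the training condition (path a).
ba, sa = ols_with_se(efficacy_change, [condition])
a, se_a = ba[1], sa[1]

# Step 2: regress the outcome on condition and the mediator (path b).
bb, sb = ols_with_se(fidelity, [condition, efficacy_change])
b, se_b = bb[2], sb[2]

indirect = a * b  # product-of-coefficients estimate of the mediated effect
# Sobel standard error: a normal-approximation stand-in for PRODCLIN's
# distribution-of-the-product confidence interval.
se_indirect = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
ci = (indirect - 1.96 * se_indirect, indirect + 1.96 * se_indirect)
```

A confidence interval excluding zero supports mediation; the study draws that conclusion for self-efficacy using the PRODCLIN interval.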

RESULTS

Relationships between Contextual and Provider-level Factors and Implementation Fidelity

The results of the mixed linear regression models exploring relationships between implementation fidelity and contextual and provider-level factors are shown in Table 2. In the bivariate models, urbanicity and beliefs about the value of Project TND at baseline were negatively associated with fidelity of implementation (p’s < .05). The school’s organizational capacity to implement Project TND was positively associated with fidelity (p < .05). In the multivariate model, all three indices were retained as predictors of fidelity (p’s < .05). After adjusting for urbanicity, organizational capacity and pretest beliefs about the value of Project TND, training condition significantly predicted implementation fidelity (p = 0.05), with higher fidelity in the comprehensive relative to the standard training.

Table 2.

Correlates of fidelity of TND program implementation, Beta (SE)

Correlates	Fidelity: Bivariate	Fidelity: Multivariate
School District Context
  Clear mandate to implement substance-abuse prevention1 0.10 (0.15) --
  Encouraged use of research-based program1 0.15 (0.13) --
  Urbanicity −0.34 (0.11)* −0.29 (0.11)*
  Student baseline drug use2 −0.12 (0.12) --
School Context
  School’s openness to change1 0.15 (0.13) --
  Collaborative school environment1 0.15 (0.13) --
  Organizational capacity1 0.39 (0.12)* 0.31 (0.11)*
Teacher-Level Factors
  Female3 0.05 (0.12) --
  Prior experience with drug education3 −0.09 (0.13) --
  White ethnicity3 0.36 (0.29) --
  Health teacher3 −0.02 (0.13) --
  Disciplinary teaching style3,5 0.01 (0.12) --
  Perceived positive school climate3,5 −0.17 (0.12) --
  Self-efficacy to implement Project TND4,5 −0.18 (0.13) --
  Beliefs about value of Project TND4,5 −0.32 (0.13)* −0.43 (0.13)*

Notes:

n = 99. Variables are standardized (mean = 0, std = 1). All models were at the classroom level (N = 99). Random effects included school district, school, and teacher; fixed effects included program condition. p-values reported using 2-tailed tests.

* p < .05; + p < .10

1 Assessed on school administrator survey
2 Assessed on student baseline survey
3 Assessed on teacher baseline survey
4 Assessed on teacher baseline and post-implementation survey
5 Scaled low to high

The Effects of Training on Provider-level Factors

Mixed linear regression models that examined the effects of the comprehensive training (versus the standard training) on changes in teachers’ beliefs about the value of the program and self-efficacy showed a marginally statistically significant increase in self-efficacy for teachers receiving the comprehensive training (p < .10), and a trend toward more favorable beliefs regarding the value of Project TND (p = 0.13).

Moderation of the Effects of Training

We tested whether baseline beliefs about the value of the program moderated the effect of the training condition on implementation fidelity (Aiken & West, 1991). Holding more positive beliefs about the value of the program prior to implementation was found to significantly moderate the effect of training (comprehensive vs. standard) on implementation fidelity (p = 0.0005 for the moderation). The detailed sub-group analysis showed that among teachers with less favorable beliefs about the value of the program prior to implementation, the comprehensive training did not generate a statistically significant effect over the standard training on implementation fidelity (β = 0.28, p = 0.43); however, among teachers with more favorable beliefs at baseline, the comprehensive training generated a statistically significant effect over the standard training on fidelity (β = 1.35, p = 0.0009). The conclusion held even after controlling for self-efficacy, perceived positive school climate, urbanicity, and organizational capacity (see Figure 1).

Figure 1. Moderation Effect of Beliefs about the Value of Project TND on Implementation Fidelity.

Figure 1

Variables were centered and unstandardized. Moderation analyses controlled for self-efficacy, perceived positive school climate, urbanicity and organizational capacity.

Mediation Pathway from Comprehensive Training to Fidelity

Self-efficacy and teachers’ beliefs about the value of the program were tested as potential mediators of the effect of the comprehensive training on implementation fidelity, using the mediation analysis procedure described by MacKinnon (2007a). While teacher beliefs about the value of TND were not a statistically significant mediator, self-efficacy was found to mediate 34.0% of the total effect of the comprehensive training on implementation fidelity. As shown in Figure 2, the comprehensive training significantly increased teachers’ self-efficacy by 0.36 standard deviations (βa = 0.14; p = 0.01), and each standard deviation of change in teachers’ self-efficacy was significantly related to a 0.54 standard deviation change in implementation fidelity (βb = 0.25; p = 0.03). The indirect (mediated) effect, the product of these two coefficients, was statistically significant (βindirect = 0.20, p < .05). The asymmetrical confidence interval for the mediated effect, estimated in PRODCLIN (MacKinnon, 2007b) using the distribution-of-the-product method, was 0.01–0.48 (95% CI).

Figure 2. Potential Mediation Pathways for the Effects of the Training Intervention on Implementation Fidelity.

Figure 2

Standardized regression coefficients for the relationship between training condition (Comprehensive vs. Standard) and implementation fidelity as mediated by change in self-efficacy. The standardized regression coefficients between training condition and implementation fidelity controlling for change in self-efficacy are in parentheses.

* p< .05.

DISCUSSION

Guided by several conceptual models (e.g., Chen, 1998; Domitrovich, et al., 2008; Durlak & DuPre, 2008), the current study examined associations between multiple contextual and program provider-level factors and the implementation fidelity of a research-based substance abuse prevention program in a sample of high schools from around the U.S. We also explored whether a comprehensive training intervention, relative to a standard training workshop, had an effect on teachers’ beliefs about the value of the program and self-efficacy to implement it, and if these changes in teacher self-efficacy and beliefs mediated the effect of the comprehensive training on implementation fidelity. We found that several contextual and provider-level factors were associated with implementation fidelity, and the comprehensive training significantly increased teachers’ self-efficacy, which resulted in an increase in teachers’ implementation fidelity.

Consistent with previous research (Fagan et al. 2008; Mihalic et al. 2008; Thaker et al. 2008; Wiecha et al. 2004) and Chen’s conceptual model (1998), we found that greater school organizational capacity was associated with higher levels of implementation fidelity. These findings suggest that implementation fidelity will be higher in schools where leadership and faculty are stable and teachers are given the necessary resources and responsibility to teach substance abuse prevention programs. Another contextual variable that we assessed, urbanicity, was negatively associated with implementation fidelity. Future studies should investigate why implementation fidelity of research-based programs might be lower in urban, relative to suburban and rural, environments. For example, teachers in urban schools may be more likely to adapt programs to better fit the perceived needs of their students (Ringwalt et al., 2003).

Despite previous research showing a relationship between teacher characteristics, such as teaching experience and style, and implementation fidelity (Ringwalt et al. 2002; Rohrbach et al. 1993), we did not find support for these relationships. Surprisingly, teachers’ favorable beliefs about the value of Project TND following the standard training workshop were negatively associated with implementation fidelity. This finding is inconsistent with previous studies that have shown when providers’ beliefs are favorable towards the prevention program, implementation fidelity is higher (Beets, et al., 2008; Kallestad & Olweus, 2003; Klimes-Dougan, et al., 2009; Ringwalt, et al., 2003). However, our findings are consistent with one published study that found that teachers’ support for the research-based program was inversely associated with their use of the program’s interactive teaching techniques during program implementation (Mihalic, et al., 2008). Mihalic and colleagues (2008) hypothesized that teachers may have been motivated to teach the program, but lacked the skills necessary to implement the interactive teaching techniques required by the curriculum.

We considered an alternative explanation for the negative relationship between fidelity and beliefs about the value of TND: less positive beliefs could reflect a more realistic perspective on the many competing influences on drug abuse and the challenges involved in implementing a multifaceted program like TND. Such realism may lead teachers to focus on implementing the program as written, rather than envisioning some type of adaptation or change (e.g., self-disclosing personal experiences during discussion or introducing new media). Conversely, teachers who are more favorable toward the program may be less attuned to the exacting nature of its delivery and may need the comprehensive training to avoid reinventing material.

To explore this hypothesis, we tested whether baseline beliefs about the value of the program moderated the effect of the training condition on implementation fidelity. We found evidence for moderation: the comprehensive training produced greater fidelity among teachers with more favorable beliefs about the value of the program at baseline, but had no effect on fidelity among teachers with less positive beliefs. Unfortunately, we do not know what shaped teachers’ beliefs about the value of TND prior to implementation, other than that those beliefs were significantly correlated with teachers’ self-efficacy to implement the program (r = 0.58, p < .0001). The results of the moderation analysis are consistent with the speculation of Mihalic and colleagues (2008), as well as with the idea of differential realism. Other explanations are possible, and additional research is needed.
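A moderation test of this kind amounts to an interaction between training condition and baseline beliefs (Aiken & West, 1991). The sketch below illustrates the underlying difference-in-differences logic with synthetic data; all numbers and the data-generating model are illustrative assumptions, not the study’s data or analysis.

```python
import random

random.seed(1)

# Synthetic teachers: training condition (True/False), baseline beliefs
# (z-scored), and observed fidelity. The data-generating model builds in an
# interaction: training improves fidelity only when baseline beliefs are high.
teachers = []
for _ in range(4000):
    trained = random.random() < 0.5
    beliefs = random.gauss(0, 1)
    fidelity = (0.1 * beliefs
                + (0.4 * beliefs if trained and beliefs > 0 else 0.0)
                + random.gauss(0, 1))
    teachers.append((trained, beliefs, fidelity))

def mean_fidelity(trained, favorable):
    """Mean fidelity in one cell of the 2 x 2 (condition x belief) table."""
    vals = [f for t, b, f in teachers if t == trained and (b > 0) == favorable]
    return sum(vals) / len(vals)

# Training effect within each belief subgroup; moderation shows up as a
# difference between the two subgroup effects (a difference-in-differences)
effect_favorable = mean_fidelity(True, True) - mean_fidelity(False, True)
effect_unfavorable = mean_fidelity(True, False) - mean_fidelity(False, False)

print(f"training effect, favorable beliefs:   {effect_favorable:.2f}")
print(f"training effect, unfavorable beliefs: {effect_unfavorable:.2f}")
```

In a regression framework the same comparison is carried by the coefficient on the training-by-beliefs product term, with beliefs kept continuous rather than dichotomized.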

Our findings highlight the importance of offering teachers comprehensive training prior to program implementation in order to provide them with the skills necessary to channel their enthusiasm about the program into quality implementation. The results of the mediation model support these speculations: when we examined whether increases in teacher self-efficacy and beliefs about the value of the program mediated the effect of the training intervention (comprehensive vs. standard) on implementation fidelity, we found that the effect of training on fidelity was mediated, in part, by an increase in self-efficacy. These findings suggest that to improve implementation fidelity of research-based prevention programs in real-world school settings, teachers need ongoing training, coaching, technical assistance, and support throughout the implementation phase. It is also important that these training methods focus on strengthening teachers’ self-efficacy to implement the research-based program with high quality and fidelity.

Limitations and Strengths

There are at least four limitations of this study. One limitation is that classroom observers were not blind to the experimental condition of the school. As a result, we cannot rule out the possibility of bias in their ratings of implementation fidelity. Second, our study is limited by the weakly validated assessment tools for teacher beliefs, self-efficacy and climate. In particular, because our measure of school climate did not involve assessment of all members of the school organization, it should be conceived as a measure of psychological climate, or an individual’s perceptions of the school climate, rather than a true measure of school organizational climate. Third, the Cronbach’s alpha for the organizational capacity index was low. The items constituting this index could represent several dimensions of organizational capacity rather than one. A fourth limitation of the study is that only one session was observed for fidelity, which provides a limited assessment considering the breadth of the program. Still, it is a strength of the current study that we used classroom observations to assess implementation fidelity, as such assessments are a rarity in effectiveness and dissemination research. Observation of implementation fidelity by trained observers is considered the most objective way to measure program implementation (Dane and Schneider 1998; Dusenbury et al. 2003). An additional strength of the study is the use of multiple data sources for assessing school contextual variables, instead of relying exclusively on teachers or administrators for assessments. Future studies should explore the association between fidelity of implementation and school climate as assessed from all employees.
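For reference, Cronbach’s alpha, the internal-consistency statistic cited in the third limitation, is computed from the item variances and the variance of the summed scale: α = k/(k−1) · (1 − Σσ²ᵢ/σ²_total). A minimal sketch (with made-up scores, not the study’s organizational capacity index):

```python
def cronbach_alpha(items):
    """items: one list of scores per item, aligned across respondents."""
    k = len(items)       # number of items in the index
    n = len(items[0])    # number of respondents

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent (the summed scale)
    totals = [sum(item[i] for item in items) for i in range(n)]
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Perfectly parallel items yield alpha = 1; weakly related items drive it down
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]), 10))  # 1.0
```

A low alpha, as noted above, can mean either noisy items or, as the authors suggest, that the items tap several distinct dimensions rather than one.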

Implications for Future Research

In order to reduce substance abuse and other social and emotional problems in adolescents, we must continue to conduct research that increases understanding of how to get prevention programs with proven effectiveness successfully implemented in schools. School-based prevention is influenced by a variety of contextual factors occurring at multiple ecological levels. In order to move the field forward, future effectiveness and dissemination studies need to account for the complex nature of schools in analyses of implementation fidelity and outcomes. Future research should address how provider- and contextual-level factors interact to affect implementation fidelity, which provider- and context-level variables are modifiable, and with what types of interventions.

Implications for Health Education

Given the increasing demands on teachers to focus their efforts on core subjects (e.g., English, math and science), relative to health education and prevention (Kaftarian et al. 2004), it is important to explore innovative ways of improving implementation fidelity of research-based prevention programs in schools. If effective programs are to achieve significant population impact, they must be disseminated widely and implemented with fidelity (Glasgow, Vogt, & Boles, 1999). Consistent with previous research (Domitrovich et al. 2008, Durlak & DuPre 2008), our findings suggest that strong organizational capacity may be just as influential in promoting implementation fidelity as teacher-level factors, such as self-efficacy and positive beliefs about the specific prevention approach. Before implementation begins, as well as throughout the implementation process, training and ongoing technical assistance should be conducted to increase teachers’ skills, self-efficacy, and comfort with research-based prevention curricula.

Acknowledgements

This study was funded by a grant from the National Institute on Drug Abuse (no. 1 R01 DA16090). Melissa Little was supported during the work on this project by a postdoctoral fellowship on grant R25 CA90956. The authors wish to thank Gaylene Gunning for her assistance in program management and data collection.

References

1. Aiken LS, West SG. Multiple Regression: Testing and Interpreting Interactions. Newbury Park, CA: Sage; 1991.
2. Battistich V, Solomon D, Kim D, Watson M, Schaps E. Schools as communities, poverty levels of student populations and students’ attitudes, motives and performance: a multilevel analysis. American Educational Research Journal. 1995;32:627–658.
3. Beets M, Flay B, Vuchinich S, Acock A, Li K-K, Allred C. School Climate and Teachers’ Beliefs and Attitudes Associated with Implementation of the Positive Action Program: A Diffusion of Innovations Model. Prevention Science. 2008;9:264–275. doi: 10.1007/s11121-008-0100-2.
4. Blake SM, Ledsky RA, Sawyer RJ, Goodenow C, Banspach S, Lohrmann DK, Hack T. Local school district adoption of state-recommended policies on HIV prevention education. Preventive Medicine. 2005;40(2):239–248. doi: 10.1016/j.ypmed.2004.05.028.
5. Chen H-T. Theory-driven evaluations. Advances in Educational Productivity. 1998;7:15–34.
6. Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clinical Psychology Review. 1998;18(1):23–45. doi: 10.1016/s0272-7358(97)00043-3.
7. Dent CW, Sussman S, Stacy AW. Project Towards No Drug Abuse: Generalizability to a General High School Sample. Preventive Medicine. 2001;32(6):514–520. doi: 10.1006/pmed.2001.0834.
8. Domitrovich CE, Bradshaw CP, Poduska JM, Hoagwood K, Buckley JA, Olin S, Romanelli LH, Leaf PJ, Greenberg MT, Ialongo NS. Maximizing the Implementation Quality of Evidence-Based Preventive Interventions in Schools: A Conceptual Framework. Advances in School Mental Health Promotion. 2008;1(3):6–28. doi: 10.1080/1754730x.2008.9715730.
9. Durlak J, DuPre E. Implementation Matters: A Review of Research on the Influence of Implementation on Program Outcomes and the Factors Affecting Implementation. American Journal of Community Psychology. 2008;41(3):327–350. doi: 10.1007/s10464-008-9165-0.
10. Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Education Research. 2003;18(2):237–256. doi: 10.1093/her/18.2.237.
11. Dusenbury L, Hansen WB, et al. Coaching to enhance quality of implementation in prevention. Health Education. 2010;110(1):43–60. doi: 10.1108/09654281011008744.
12. Ennett ST, Ringwalt CL, Thorne J, Rohrbach LA, Vincus A, Simons-Rudolph A, Jones S. A Comparison of Current Practice in School-Based Substance Use Prevention Programs with Meta-Analysis Findings. Prevention Science. 2003;4(1):1–14. doi: 10.1023/a:1021777109369.
13. Ennett ST, Haws S, Ringwalt CR, Vincus AA, Hanley S, Bowling JM, Rohrbach LA. Evidence-based practice in school substance use prevention: Fidelity of implementation under real-world conditions. Health Education Research. 2011;26(2):361–371. doi: 10.1093/her/cyr013.
14. Fagan AA, Hanson K, Hawkins JD, Arthur MW. Bridging science to practice: achieving prevention program implementation fidelity in the community youth development study. American Journal of Community Psychology. 2008;41(3–4):235–249. doi: 10.1007/s10464-008-9176-x.
15. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation Research: A Synthesis of the Literature (FMHI Publication #231). Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005.
16. Gingiss PM, Roberts-Gray C, Boerm M. Bridge-It: A System for Predicting Implementation Fidelity for School-Based Tobacco Prevention Programs. Prevention Science. 2006;7(2):197–207. doi: 10.1007/s11121-006-0038-1.
17. Gittelsohn J, Merkle S, Story M, Stone EJ, Steckler A, Noel J, Davis S, Martin CJ, Ethelbah B. School climate and implementation of the Pathways study. Preventive Medicine. 2003;37(Suppl. 1):S97–S106. doi: 10.1016/j.ypmed.2003.08.010.
18. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. American Journal of Public Health. 1999;89(9):1322–1327. doi: 10.2105/ajph.89.9.1322.
19. Gottfredson DC, Gottfredson GD. Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime and Delinquency. 2002;39(1):3–35.
20. Greenberg MT. School-based prevention: current status and future challenges. Effective Education. 2010;2(1):27–52.
21. Hallfors D, Godette D. Will the ‘Principles of Effectiveness’ improve prevention practice? Early findings from a diffusion study. Health Education Research. 2002;17(4):461–470. doi: 10.1093/her/17.4.461.
22. Hallfors D, Sporer A, Pankratz M, Godette D. Drug-Free Schools Survey: Report of Results. Chapel Hill, NC: University of North Carolina; 2000.
23. Kaftarian S, Robertson E, Compton W, Davis BW, Volkow N. Blending Prevention Research and Practice in Schools: Critical Issues and Suggestions. Prevention Science. 2004;5(1):1–3. doi: 10.1023/b:prev.0000013975.74774.bc.
24. Kallestad JH, Olweus D. Predicting Teachers’ and Schools’ Implementation of the Olweus Bullying Prevention Program: A Multilevel Study. Prevention & Treatment. 2003;6(1).
25. Kam C-M, Greenberg MT, Walls CT. Examining the role of implementation quality in school-based prevention using the PATHS curriculum. Prevention Science. 2003;4(1):55–63. doi: 10.1023/a:1021786811186.
26. Klimes-Dougan B, August GJ, Lee C-YS, Realmuto GM, Bloomquist ML, Horowitz JL, Eisenberg TL. Practitioner and site characteristics that relate to fidelity of implementation: The Early Risers prevention program in a going-to-scale intervention trial. Professional Psychology: Research and Practice. 2009;40(5):467–475.
27. MacKinnon DP, Fairchild AJ, Fritz MS. Mediation analysis. Annual Review of Psychology. 2007a;58:593–614. doi: 10.1146/annurev.psych.58.110405.085542.
28. MacKinnon DP, Fritz MS, Williams J, Lockwood CM. Distribution of the product confidence limits for the indirect effect: Program PRODCLIN. Behavior Research Methods. 2007b;39:384–389. doi: 10.3758/bf03193007.
29. Mihalic S, Fagan A, Argamaso S. Implementing the LifeSkills Training drug prevention program: factors related to implementation fidelity. Implementation Science. 2008;3(1):5. doi: 10.1186/1748-5908-3-5.
30. Mowbray CT, Holter MC, Teague GB, Bybee D. Fidelity Criteria: Development, Measurement, and Validation. American Journal of Evaluation. 2003;24(3):315–340.
31. Murray DM, Hannan PJ. Planning for the appropriate analysis in school-based drug-use prevention studies. Journal of Consulting and Clinical Psychology. 1990;58(4):458–468. doi: 10.1037//0022-006x.58.4.458.
32. Payne AA. Do Predictors of the Implementation Quality of School-Based Prevention Programs Differ by Program Type? Prevention Science. 2009;10(2):151–167. doi: 10.1007/s11121-008-0117-6.
33. Payne AA, Gottfredson DC, Gottfredson GD. School Predictors of the Intensity of Implementation of School-Based Prevention Programs: Results from a National Study. Prevention Science. 2006;7(2):225–237. doi: 10.1007/s11121-006-0029-2.
34. Ringwalt CL, Ennett S, et al. The Prevalence of Effective Substance Use Prevention Curricula in US Middle Schools. Prevention Science. 2002;3(4):257–265. doi: 10.1023/a:1020872424136.
35. Ringwalt CL, Ennett S, Johnson R, Rohrbach LA, Simons-Rudolph A, Vincus A, Thorne J. Factors Associated with Fidelity to Substance Use Prevention Curriculum Guides in the Nation’s Middle Schools. Health Education & Behavior. 2003;30(3):375–391. doi: 10.1177/1090198103030003010.
36. Ringwalt C, Vincus A, Hanley S, Ennett S, Bowling J, Haws S. The Prevalence of Evidence-based Drug Use Prevention Curricula in U.S. Middle Schools in 2008. Prevention Science. 2011;12(1):63–69. doi: 10.1007/s11121-010-0184-3.
37. Roberts-Gray C, Gingiss PM, Boerm M. Evaluating school capacity to implement new programs. Evaluation and Program Planning. 2007;30(3):247–257. doi: 10.1016/j.evalprogplan.2007.04.002.
38. Rohrbach LA, Graham JW, Hansen WB. Diffusion of a School-Based Substance Abuse Prevention Program: Predictors of Program Implementation. Preventive Medicine. 1993;22(2):237–260. doi: 10.1006/pmed.1993.1020.
39. Rohrbach LA, Dent CW, Johnson CA, Unger J, Gunning G. Evaluation of the School Tobacco Use Prevention Education program. In: Independent Evaluation Consortium, Final Report: Independent Evaluation of the California Tobacco Control Program, Wave 1 Data, 1996–1997. Sacramento, CA: California Department of Health Services, Tobacco Control Section; 1998.
40. Rohrbach LA, Ringwalt CL, Ennett ST, Vincus AA. Factors associated with adoption of evidence-based substance use prevention curricula in US school districts. Health Education Research. 2005;20(5):514–526. doi: 10.1093/her/cyh008.
41. Rohrbach LA, Gunning M, Grana R, Sussman S. Taking Project Towards No Drug Abuse (TND) to Scale: Program Adoption and Implementation. Poster presented at the 15th Annual Society for Prevention Research Conference; Washington, DC; 2007.
42. Rohrbach LA, Grana R, Sussman S, Valente TW. Type II Translation: Transporting Prevention Interventions from Research to Real-World Settings. Evaluation & the Health Professions. 2006;29(3):302–333. doi: 10.1177/0163278706290408.
43. Rohrbach LA, Gunning M, Sun P, Sussman S. The Project Towards No Drug Abuse (TND) Dissemination Trial: Implementation Fidelity and Immediate Outcomes. Prevention Science. 2010a;11:77–88. doi: 10.1007/s11121-009-0151-z. (Erratum: Prevention Science, Vol. 11, p. 113.)
44. Rohrbach LA, Sun P, Sussman S. One-year follow-up evaluation of the Project Towards No Drug Abuse (TND) dissemination trial. Preventive Medicine. 2010b;51(3–4):313–319. doi: 10.1016/j.ypmed.2010.07.016.
45. SAS Institute Inc. SAS/C OnlineDoc, Release 9.0. Cary, NC: SAS Institute; 2000.
46. Skara S, Rohrbach LA, Sun P, Sussman S. An evaluation of the fidelity of implementation of a school-based drug abuse prevention program: Project Towards No Drug Abuse (TND). Journal of Drug Education. 2005;35(4):305–329. doi: 10.2190/4LKJ-NQ7Y-PU2A-X1BK.
47. Steckler A, Goodman RM, McLeroy KR, Davis S, Koch G. Measuring the diffusion of innovative health promotion programs. American Journal of Health Promotion. 1992;6:214–224. doi: 10.4278/0890-1171-6.3.214.
48. Sun P, Sussman S, Dent CW, Rohrbach LA. One-year follow-up evaluation of Project Towards No Drug Abuse (TND-4). Preventive Medicine. 2008;47(4):438–442. doi: 10.1016/j.ypmed.2008.07.003.
49. Sun W, Skara S, Sun P, Dent CW, Sussman S. Project Towards No Drug Abuse: Long-term substance use outcomes evaluation. Preventive Medicine. 2006;42(3):188–192. doi: 10.1016/j.ypmed.2005.11.011.
50. Sussman S, Dent CW, Stacy AW, Craig S. One-Year Outcomes of Project Towards No Drug Abuse. Preventive Medicine. 1998;27(4):632–642. doi: 10.1006/pmed.1998.0338.
51. Sussman S, Dent CW, Stacy AW. Project Towards No Drug Abuse: A review of the findings and future directions. American Journal of Health Behavior. 2002;26(5):354–365. doi: 10.5993/ajhb.26.5.4.
52. Sussman S, McCuller WJ, Dent CW. The associations of social self-control, personality disorders, and demographics with drug use among high-risk youth. Addictive Behaviors. 2003;28:1159–1166. doi: 10.1016/s0306-4603(02)00222-8.
53. Sussman S, Rohrbach L, Mihalic S. Blueprints for Violence Prevention, Book Twelve: Project Towards No Drug Abuse. Boulder, CO: Center for the Study and Prevention of Violence; 2004a.
54. Sussman S, Earleywine M, Wills T, Cody C, Biglan T, Dent CW, Newcomb MD. The Motivation, Skills, and Decision-Making Model of Drug Abuse Prevention. Substance Use & Misuse. 2004b;39(10):1971–2016. doi: 10.1081/ja-200034769.
55. Sussman S, Sun P, Rohrbach LA, Spruijt-Metz D. One-year outcomes of a drug abuse prevention program for older teens and emerging adults: Evaluating a motivational interviewing booster component. Health Psychology. 2011; Epub ahead of print. doi: 10.1037/a0025756.
56. Thaker S, Steckler A, Sanchez V, Khatapoush S, Rose J, Hallfors DD. Program characteristics and organizational factors affecting the implementation of a school-based indicated prevention program. Health Education Research. 2008;23(2):238–248. doi: 10.1093/her/cym025.
57. Thomas JM, Sable J, Dalton B, Sietsema J. Documentation to the NCES Common Core of Data Local Education Agency Universe Survey: School Year 2004–05 (NCES 2006-440rev). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education; 2007.
58. Valente TW, Okamoto J, Pumpuang P, Okamoto P, Sussman S. Differences in perceived implementation of a standard versus peer-led interactive substance abuse prevention program. American Journal of Health Behavior. 2007;31(3):297–311. doi: 10.5555/ajhb.2007.31.3.297.
59. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, Blachman M, Dunville R, Saul J. Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology. 2008;41:171–181. doi: 10.1007/s10464-008-9174-z.
60. Wiecha JL, El Ayadi AM, Fuemmeler BF, Carter JE, Handler S, Johnson S, Strunk N, Korzec-Ramirez D, Gortmaker SL. Diffusion of an Integrated Health Education Program in an Urban School System: Planet Health. Journal of Pediatric Psychology. 2004;29(6):467–474. doi: 10.1093/jpepsy/jsh050.
