Author manuscript; available in PMC: 2010 Mar 2.
Published in final edited form as: J Drug Educ. 2009;39(1):39–58. doi: 10.2190/DE.39.1.c

Implementing Evidence-based Substance Use Prevention Curricula with Fidelity: The Role of Teacher Training*

Sean Hanley 1, Chris Ringwalt 2, Amy A Vincus 3, Susan T Ennett 4, J Michael Bowling 5, Susan W Haws 6, Louise A Rohrbach 7
PMCID: PMC2830624  NIHMSID: NIHMS174002  PMID: 19886161

Abstract

It is widely recognized that teacher training affects the fidelity with which evidence-based substance use prevention curricula are implemented. We present the results of a 2005 survey of teachers from a nationally representative sample of 1721 public middle schools in the US (78.1% response rate). We measured fidelity along two dimensions (adherence and dose) and also assessed the number of hours, recency, and perceived effectiveness of teachers’ training, as well as the degree to which adherence was emphasized during training. Among teachers using evidence-based curricula, 35.3% reported following their curriculum guide very closely. The average proportion of lessons taught was 64.9%, and only 30.2% of teachers taught all the lessons in their curriculum. Analyses revealed that teachers whose training emphasized adherence were five times as likely as other teachers to be more rather than less adherent. We present recommendations for training-related factors that may increase fidelity of implementation.

Introduction

Although recent trends suggest a decline in substance use among middle school students in the United States, their level of use remains a concern. In 2007, over one-third of 8th grade students reported having ever used alcohol, and approximately one-fifth reported having ever used cigarettes or any illicit drug (Monitoring the Future, 2007a, 2007b). Given these figures, the role of school-based substance use prevention education remains salient. In this paper, we investigate the role of teacher training in fidelity of implementation of evidence-based substance use prevention curricula.

The Principles of Effectiveness, promulgated by the Office of Safe and Drug-Free Schools (OSDFS) in 2002, require the use of evidence-based prevention curricula in schools that receive support from the program. A number of compendia of such curricula are now available (e.g., the National Registry of Evidence-based Programs and Practices [NREPP], Blueprints for Violence Prevention, and the US Department of Education), and there is widespread agreement that these curricula are a critical component of the nation’s efforts to prevent substance use (National Institute on Drug Abuse, 2003; Drug Strategies, 1999). As of the beginning of the decade, however, a large majority of the nation’s middle schools had not yet implemented any evidence-based curricula (Ringwalt et al., 2002).

Fidelity of implementation of school-based curricula has been repeatedly associated with achieving expected outcomes (Abbott et al., 1998; Botvin, Baker, Dusenbury, & Tortu, 1990; Connell, Turner, & Mason, 1985; Dane & Schneider, 1998; Dusenbury, Brannigan, Falco, & Hansen, 2003; Elliott & Mihalic, 2004; Rohrbach, Graham, & Hansen, 1993). In a comprehensive review of health education in the United States, Connell and colleagues (1985) reported that fidelity was related to increases in student knowledge, attitudes, and self-reported behavior. Botvin and colleagues (1990) found a positive relationship between the proportion of curriculum activities completed and effects achieved, while Rohrbach and colleagues (1993) reported effects on key mediators for teachers who implemented a curriculum with a high degree of fidelity. Although a certain amount of adaptation may be unavoidable (Pankratz et al., 2006; Stead, Stradling, MacNeil, MacKintosh, & Minty, 2007), or may even enhance the fit between a program and its intended audience (Dane & Schneider, 1998; Rogers, 2003), and despite multiple challenges to the successful translation of research to practice in school settings (Renes, Ringwalt, Clark, & Hanley, 2007), program providers are generally advised to implement their curricula with high fidelity.

Fidelity has been defined as the degree to which a program is implemented as intended by the developer (Dusenbury et al., 2003) and helps to ensure the internal validity of the program when taken to scale (Wagner, Tubman, & Gil, 2004). Because of the heterogeneity with which the concept has been defined and labeled (alternatives include “adherence,” “integrity,” and “compliance”), the prevention field has lacked agreement as to how it should be conceptualized. However, in a recent review of the literature on fidelity, Dusenbury and colleagues (2003) identified the following five dimensions: (1) adherence, or the extent to which activities and methods are implemented as written; (2) dose, or the amount of the content received by the intended audience; (3) quality of program delivery, or how close to a theoretical ideal the provider delivers the content; (4) participant responsiveness, or participants’ engagement and involvement in the program’s content and activities; and (5) program differentiation, or the implementation of essential components that uniquely characterize the program.

A variety of factors affecting fidelity have been identified, including school and district contextual factors (McCormick, Steckler, & McLeroy, 1995; Ringwalt, Vincus, Ennett, Johnson, & Rohrbach, 2004), as well as characteristics associated with both program providers (Rohrbach, Dent, Skara, Sun, & Sussman, 2007; Rohrbach et al., 1993; St. Pierre, Osgood, Siennick, Kauh, & Burden, 2007) and their students (Ringwalt et al., 2003; Ringwalt et al., 2004). One factor that is consistently cited as essential to curriculum fidelity is teacher training (Connell et al., 1985; Dusenbury, Brannigan, Falco, & Lake, 2004; Dusenbury & Falco, 1995; Dusenbury, Hansen, & Giles, 2003; McCormick et al., 1995; Parcel et al., 1991; Perry, Murray, & Griffin, 1990).

Training provides an important opportunity for teachers to develop and practice interactive teaching skills such as group discussions, brainstorming, and student role plays, thereby enhancing the quality of implementation (Basen-Engquist et al., 1994; Dusenbury et al., 2003; Levenson-Gingiss & Hamilton, 1989; Rohrbach et al., 1993). Many of these skills constitute the foundation of evidence-based substance use prevention curricula (Tobler et al., 2000), but may not typically be utilized by teachers during their normal instruction (Tortu & Botvin, 1989).

Although teachers’ level of implementation may vary widely (Botvin et al., 1990) and may attenuate in subsequent years (Rohrbach et al., 1993), it appears that teachers who are trained in a given curriculum are more likely than those not trained to adhere to its guide (Jones & Bowler, 1997; Ringwalt et al., 2003) and implement a higher proportion of lesson activities (Connell et al., 1985; McCormick et al., 1995). The importance of training has been underscored by Ringwalt and colleagues (2003), who found significant relationships between fidelity and four measures of training, of which two, perceived effectiveness and recency, were most strongly correlated. McCormick, Steckler, and McLeroy (1995) also found that training was a significant predictor of whether several curricula (Growing Healthy, Teenage Health Teaching Modules, or Project SMART) were implemented, both initially and in later years.

A number of studies have considered different aspects of training and examined their relationships to fidelity. Ringwalt and colleagues (2003) found that teachers who were recently trained were more likely to adhere to curriculum guides, and Ennett and colleagues (2003) found that they were more likely to use effective content and delivery methods. Teachers’ post-training self-efficacy and perceptions of the effectiveness of their training are also related to increased levels of fidelity (Ringwalt et al., 2003; Rohrbach et al., 1993). The modality used to provide teacher training also appears to be important, although little research has compared the effectiveness of different approaches. In-person training workshops have been shown to produce greater fidelity than self-instruction via video (Basen-Engquist et al., 1994; Perry et al., 1990). To date, the most common training modality offered by developers of evidence-based curricula has been in-person training workshops (cf. NREPP; Blueprints), although some curricula (e.g., All Stars, Project ALERT) now provide online training as an option. Regardless of mode, teacher training is provided for the majority of evidence-based curricula, and many developers require it as a condition of receiving curriculum materials.

The purpose of this paper is to estimate the prevalence of training among teachers using evidence-based substance use prevention curricula in US public middle schools. We also examine the association of key training-related characteristics with two dimensions of fidelity identified by Dusenbury and colleagues (2003), adherence and dose. In particular, we address the following research questions in regard to teachers identified by their schools as having the greatest amount of responsibility for teaching substance use prevention:

  1. What proportion of lead substance use prevention teachers in US public middle schools who are implementing an evidence-based curriculum have been trained in the curriculum?

  2. How many total hours of training in the curriculum did they receive, how recently were they trained, and how many hours did their most recent training last?

  3. How effective did they perceive their training to be?

  4. To what extent was adherence to the curriculum guide emphasized during training?

  5. To what extent are these training factors, both individually and collectively, related to the fidelity with which the teachers reported that they implemented their curricula, both in terms of adherence and dose?

We expected to find that teachers who were trained recently, who believed that their training was effective, and whose trainers emphasized fidelity would be more likely to adhere to their curriculum guide (Jones & Bowler, 1997; Ringwalt et al., 2003). We also expected that the amount of training teachers received would be significantly related to the proportion of lessons they implemented (Connell et al., 1985; McCormick et al., 1995).

Methods

Sample

Data for these analyses come from the second wave of the School-based Substance Use Prevention Programs Study, a longitudinal nationwide survey of substance use prevention practices in US middle schools. We selected schools in two phases, the first of which came from a 1997–1998 sampling frame drawn from the Quality Education Data national education database (Quality Education Data Inc., 1998). We defined schools with middle school grades as those that included either 7th or 8th grade, those that served 5th and 6th grades only, and those with a stand-alone 6th grade. Schools with grade configurations other than these were not included in the frame, nor were those designated as alternative, charter, vocational or technical, or special education schools; those administered by the US Department of Defense or Bureau of Indian Affairs; or those with fewer than 20 students. The sampling frame yielded 2,273 eligible public schools.

A second sample of 210 schools was drawn from a 2002–2003 sampling frame maintained by the Common Core of Data (National Center for Education Statistics, 2004) using the inclusion criteria specified above. The purpose of this “refreshment” sampling strategy was to maintain the sample’s representativeness by accounting for the net gain in the number of schools nationwide in the intervening five-year period. The original and refreshment samples were both stratified by population density, school size, and poverty level, with equal probabilities of selection within each stratum.

Because of the possibility of error on the sampling frames, we contacted sampled schools to confirm their eligibility status. This process yielded 2,204 verified eligible schools with middle school grades.

Participants and data collection

We began contacting schools in October 2004 to verify their eligibility status and to collect the name and contact information of the person who was the most appropriate respondent for our study; that is, the person at the school responsible for teaching substance use prevention to students in middle school grades. If the school identified more than one such teacher, we asked for the name of the person who taught substance use prevention and knew more about it than anyone else. We then labeled this person as the school’s “lead” substance use prevention teacher. In recognition of the fact that prevention may have been taught by district personnel, prevention specialists, law enforcement personnel, or other outside providers, we did not stipulate that our respondent be a classroom teacher or be employed at the school.

Full data collection began in January 2005 and concluded in July of that year. We began by mailing an invitation to the respondent identified during our initial sample verification process. Each respondent received a unique PIN and instructions for logging onto and completing our secure, 45-minute Web questionnaire. We also provided a $10 prepaid cash incentive, as well as information about the study in the form of a “Frequently Asked Questions” sheet. We contacted non-responders via a series of email and mailed reminders. We then mailed paper copies of the questionnaire, which included a self-addressed stamped return envelope and a letter of support from OSDFS, to those who had not yet responded. Finally, we attempted to conduct an abbreviated telephone interview with those who did not respond to the paper questionnaire. This sequential mode of data collection yielded an overall response rate of 78.1% (n=1721), with 65.2% (n=1122) responding by Web, 18.9% (n=325) by paper, and 15.9% (n=274) by phone. More detailed information about our data collection approach is available elsewhere (Pope, Vincus, & Hanley, 2007).

Measures

Our questionnaire included a set of items related to use of substance use prevention curricula in the respondent’s school. The first item assessed whether the respondent taught a substance use prevention curriculum to students in middle school grades in their school. The next two questions asked them to identify all such curricula they taught during the current school year, and which one of those curricula they taught the most that school year. Web and paper respondents were provided a list of 28 curricula from which to choose for these two questions, including an option for locally-developed curricula and an open-ended response option to identify any other curricula we did not specify. Telephone respondents were asked these two questions in an open-ended format and responses were coded into our list.

The 28 curricula on our list were derived from registries maintained by NREPP (NREPP, 2004), Blueprints for Violence Prevention (Center for the Study and Prevention of Violence, 2004), and the US Department of Education (US Department of Education, 2002), as well as additional curricula that are not evidence-based but were shown in an earlier study to have been used by at least 5% of the nation’s middle schools in 1999 (Ringwalt et al., 2002). For the purposes of this paper, we consider evidence-based substance use prevention curricula to be those that, as of 2005, targeted universal student populations, were intended for middle school students, were commercially available, and were designated as “effective” or “model” by NREPP, “promising” or “model” by Blueprints, or “exemplary” by the US Department of Education. Using these criteria, we identified the following ten evidence-based curricula: All Stars, keepin’ it REAL, Life Skills Training, Lion’s Quest Skills for Adolescence, Positive Action, Project ALERT, Project Northland, Project TNT, Social Competence Promotion Program for Young Adolescents, and Too Good for Drugs.

We included a set of items that measured various aspects of training. Respondents were asked to answer the questions in reference to the curriculum they used the most. The first of these questions asked respondents to specify the total number of hours of training they had received in the curriculum. They next reported how recently they had received training in the curriculum. Respondents who reported that they had received no training whatsoever skipped the remaining items on this topic. Our next question asked how many hours of training they received during their most recent training. To obtain a measure of perceived effectiveness of training, we asked to what extent respondents felt their most recent training prepared them to teach the curriculum. Respondents also reported how much emphasis had been given during their most recent training to following the curriculum guide closely. They were then asked to indicate whether any of a number of potential barriers had been a problem in their teaching of the curriculum, including “the curriculum guide does not meet my needs.”

After completing our training items, respondents were routed to one of four templates that assessed various factors related to fidelity of implementation. Based on their response to the question which identified the curriculum they used the most, they completed a fidelity template specific to All Stars, Life Skills Training, Project ALERT, or “Other.” All pertinent questions in all four templates were the same, and were framed in reference to the most recent time the respondent had taught the curriculum.

Each template included a question that asked respondents to indicate the number of lessons they taught, and also asked them to indicate their level of adherence to the curriculum guide. The wording of the latter question is the same as that used by Ringwalt and colleagues (2003) in analyses of data from an earlier wave of the current study. Table 1 lists our measures and response options. A copy of our instrument is available from the first author.

Table 1.

Measures Assessed in the School-based Substance Use Prevention Programs Study Questionnaire

Curricula use

Do you use a substance use prevention curriculum with students in middle or junior high grades in your school?
 Response options: Yes; No

During the current school year, which of the following substance use prevention curricula are you using with students in middle or junior high grades in your school?
 Response options: Respondents were provided with a list of 28 curricula from which to choose.a

During the current school year, which one curriculum are you using the most with students in middle or junior high grades in your school?
 Response options: Respondents were provided with a list of 28 curricula from which to choose.a

Training

How many hours of training have you received, in total, to prepare you to teach the substance use prevention curriculum you are using the most with students in middle or junior high grades?
 Response options: Respondents were provided with an open-ended field to record the total number of hours of training received.

When was the most recent time you received training for the substance use prevention curriculum you are using the most with students in middle or junior high grades?
 Response options: Less than 1 year; At least 1 year, but less than 3 years; At least 3 years, but less than 5 years; At least 5 years; I have never received training for this curriculum

How many hours of training did you receive this most recent time?
 Response options: Less than 2 hours; 2 up to 4 hours; 4 up to 6 hours; 6 up to 8 hours; 8 hours or more

To what extent did this training prepare you to teach the substance use prevention curriculum you are using the most with students in middle or junior high grades?
 Response options: Not at all; Very little extent; Some extent; A great extent

During your most recent training for the substance use prevention curriculum you are using the most with students in middle or junior high grades, how much emphasis was given to closely following the curriculum guide?
 Response options: None at all; Very little; Some; A great deal

Adequacy of curriculum guide

Have any of the following been a problem in your teaching of the substance use prevention curriculum you are using the most with students in middle or junior high grades? The curriculum guide does not meet my needs.
 Response options: Yes; No

Dose

During the most recent time you taught [curriculum name], how many [curriculum name] lessons did you teach to this class, regardless of how many class periods were used?
 Response options: Respondents were asked to select between 0 and 15 lessons, or “16 or more”.b

Adherence

How closely did you follow the curriculum guide for [curriculum name] in teaching your lessons?
 Response options: I did not use a curriculum guide; Very closely – I taught the material as specified; Somewhat closely – I sometimes adapted the material as appropriate; Not very closely – I frequently adapted the material as appropriate

a Contact first author for complete list.

b The upper bound of the response options varied to reflect the number of lessons in the specific curriculum.

Based on the distributions of our variables, we collapsed the number of hours respondents reported that they were most recently trained into a three-level variable (0–6 hours, 6–8 hours, 8+ hours). We also dichotomized “perceived effectiveness of training” and “emphasis placed on following the curriculum guide” into “a great extent/not a great extent” and “a great deal/not a great deal,” respectively. Respondents who indicated that they taught “16 or more lessons” of the curriculum were credited as having taught all the lessons in the curriculum.
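
To make these recodes concrete, the following SAS data step is a minimal sketch of the transformations described above. The dataset and variable names (teachers, hrs_recent, prep_extent, guide_emphasis, lessons_taught, total_lessons) and the numeric response codes are hypothetical illustrations, not the study’s actual codebook.

```sas
/* Sketch only: variable names and response codes are assumed. */
data analysis;
  set teachers;

  /* hrs_recent assumed coded 1=<2h, 2=2-4h, 3=4-6h, 4=6-8h, 5=8h+ */
  if hrs_recent in (1, 2, 3) then hrs3 = 1;  /* 0-6 hours */
  else if hrs_recent = 4     then hrs3 = 2;  /* 6-8 hours */
  else if hrs_recent = 5     then hrs3 = 3;  /* 8+ hours  */

  /* Dichotomize; 4 assumed to code 'A great extent'/'A great deal' */
  effective_hi = (prep_extent = 4);
  emphasis_hi  = (guide_emphasis = 4);

  /* Credit "16 or more" responses with all lessons in the curriculum */
  if lessons_taught >= 16 then lessons_taught = total_lessons;
run;
```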

Analysis plan

All analyses were conducted in SAS version 9.1 using weighted data and procedures that accounted for the stratified sampling design. Sample weights were constructed from the original probabilities of selection computed for the 1998 sample (Jones, Sutton, & Boyle, 2002), in conjunction with the probabilities of selecting new schools from the 2002 sample.
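
As a simplified illustration of the weighting logic under stratified sampling with equal within-stratum selection probabilities (the actual weights, described by Jones, Sutton, and Boyle, 2002, also account for the refreshment sample), a school’s base weight is the inverse of its probability of selection:

```latex
w_h = \frac{1}{\pi_h}, \qquad \pi_h = \frac{n_h}{N_h},
```

where \(n_h\) is the number of schools sampled from stratum \(h\) and \(N_h\) is the number of eligible schools in that stratum.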

Prior to analyses, we confirmed that each of the ten evidence-based curricula offered training and provided curriculum guides to teachers. Because we were interested in examining the relationship of training to fidelity of implementation of evidence-based curricula, we began our analyses by restricting our sample to those who indicated that they taught one of these curricula “the most.” This criterion reduced our analysis sample from 1721 to 400 cases. Our analyses were further restricted to Web and paper respondents (n=343) because questions related to training were not asked in the abbreviated telephone interview. We eliminated an additional 16 cases due to inconsistent responses to either our training questions or our fidelity template items. The demographic characteristics of our final analysis sample of 327 respondents are provided in Table 2.

Table 2.

Respondent Demographic Characteristics (unweighted n=327)

Characteristic Weighted mean or percent 95% Confidence interval
Gender (% female) 76.8 72.2, 81.4
Age (mean) 44.4 43.3, 45.6
Race/ethnicity (%)
 White 85.5 82.2, 88.7
 African-American 9.5 7.1, 11.9
 Other 5.0 2.6, 7.5
Positiona (%)
 Classroom or PE teacher 43.2 37.6, 48.9
 Counselor/Social worker 35.1 29.7, 40.4
 Prevention specialist 26.1 21.4, 30.9
 Other 15.1 11.0, 19.2
Years teaching substance use prevention (mean) 11.6 10.6, 12.5

Note. Estimates are weighted, n is unweighted.

a Categories are not mutually exclusive.

Using this analysis sample, we computed descriptive statistics for our two dependent variables of interest (adherence to the curriculum guide and dose). Dose was computed by dividing the number of lessons respondents reported teaching by the total number of lessons in the curriculum.
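
Formally, for teacher \(i\) using curriculum \(c\),

```latex
\mathrm{dose}_i = \frac{\text{lessons taught}_i}{L_c},
```

where \(L_c\) is the total number of lessons in curriculum \(c\). For example, a teacher who taught 8 lessons of a hypothetical 12-lesson curriculum would receive a dose score of \(8/12 \approx 66.7\%\).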

We derived estimates of the proportion of teachers implementing an evidence-based curriculum who had been trained in the curriculum via our question regarding recency of training. We limited all subsequent analyses to those teachers who had been trained (n=276). We then computed means and proportions for all of our training variables.

We followed these analyses by performing cumulative logit analyses (Allison, 1999) on each of our two dependent variables, adherence to the curriculum guide and proportion of lessons taught, using PROC SURVEYLOGISTIC (SAS Institute Inc., 2004). Adherence was recoded as a three-level variable in which those who did not use a curriculum guide at all and those who followed one but not very closely were collapsed into one category. Given that our dose measure was calculated as a proportion score, linear regression analyses were not appropriate. We, therefore, categorized it as a three-level variable based on a tertile split of the distribution and conducted cumulative logit analyses on the resulting ordinal variable.
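
For reference, the cumulative logit (proportional-odds) model takes the standard form (Allison, 1999): for an ordinal outcome \(Y\) with levels \(j = 1, 2, 3\) and covariate vector \(\mathbf{x}\),

```latex
\operatorname{logit}\!\left[P(Y \ge j \mid \mathbf{x})\right] = \alpha_j + \boldsymbol{\beta}^{\top}\mathbf{x}, \qquad j = 2, 3,
```

where a single coefficient vector \(\boldsymbol{\beta}\) is assumed to hold across both cut points, so that each odds ratio reported below is \(e^{\beta_k}\) for the corresponding predictor.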

We included respondents’ perceptions of the adequacy of the curriculum guide as a control variable in the adherence model. We excluded the total number of hours of training from both models due to its similarity and potential collinearity with our measure of the number of hours of most recent training.
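
A minimal sketch of how the adherence model might be specified in PROC SURVEYLOGISTIC follows, assuming the hypothetical dataset and variable names introduced above (the actual program is not reproduced in the paper):

```sas
/* Sketch only: dataset and variable names are assumed, not the
   study's actual code. */
proc surveylogistic data=analysis;
  strata stratum;                    /* stratified sampling design   */
  weight samplewt;                   /* sample weights               */
  class hrs3 (ref='1') / param=ref;  /* 0-6 hours as referent        */
  model adherence (descending) =     /* higher value = more adherent */
        recency hrs3 effective_hi emphasis_hi guide_adequate
        / link=clogit;               /* cumulative logit link        */
run;
```

The dose model would be specified analogously, with the tertile-based three-level dose variable as the outcome and without the curriculum-guide adequacy control.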

Results

Our analyses revealed that 35.3% (CI=29.8%, 40.8%) of all lead teachers in US public middle schools using an evidence-based curriculum followed their curriculum guide very closely, while 56.3% (CI=50.8%, 61.7%) followed it somewhat closely, 6.0% (CI=3.3%, 8.6%) followed it not very closely, and 2.5% (CI=0.7%, 4.2%) did not use a curriculum guide at all. The average proportion of lessons taught was 64.9% (CI=61.2%, 68.6%), and 30.2% (CI=24.8%, 35.6%) of teachers taught all the lessons in the curriculum.

Overall, 82.3% (CI=77.9%, 86.6%) of the lead teachers who were implementing one of the ten evidence-based curricula we identified had been trained at some point in that curriculum. In Table 3, we present the means and proportions and associated 95% confidence intervals for our independent variables. Approximately one-third of teachers’ most recent training lasted at least eight hours, and over 90% believed that their training prepared them to teach the curriculum to either “some” or “a great” extent. Nearly two-thirds indicated that during their most recent training, “a great deal” of emphasis was placed on following the curriculum guide, and less than 15% felt that the curriculum guide did not meet their needs.

Table 3.

Means and Proportions of Training Variables (unweighted n=327)

Variable Weighted mean or percent 95% Confidence interval
Recency of training (%)
 Less than 1 year ago 16.3 12.2, 20.3
 1–3 years ago 29.7 24.2, 35.1
 3–5 years ago 19.7 15.5, 24.0
 At least 5 years ago 16.6 12.3, 20.9
 Never been trained 17.7 13.4, 22.1
Total number of hours of training (mean)a 19.6 13.7, 25.5
Number of hours of most recent training (%)a
 Less than 2 hours 10.4 6.9, 13.8
 2 up to 4 hours 9.0 5.7, 12.3
 4 up to 6 hours 16.4 12.2, 20.5
 6 up to 8 hours 32.5 26.6, 38.4
 8 hours or more 31.8 26.3, 37.2
Perceived effectiveness of training (%)a
 Not at all 0.7 0.0, 1.6
 Very little extent 8.6 5.0, 12.3
 Some extent 44.7 38.6, 50.8
 Great extent 46.0 39.9, 52.1
Emphasis on following curriculum guide during training (%)a
 None 1.2 0.0, 3.0
 Very little 3.3 1.1, 5.4
 Some 32.3 26.7, 38.0
 A great deal 63.2 57.4, 69.1
Curriculum guide does not meet needs (% agreed)a 14.4 9.4, 19.5

Note. Estimates are weighted, n is unweighted.

a Analyses include only those teachers who reported having been trained in the curriculum they used the most (unweighted n=276).

The results of our cumulative logit analyses are presented in Table 4. With respect to adherence to the curriculum guide, perceived effectiveness of training (OR=2.70, p <.001) and the emphasis trainers placed on following the curriculum guide (OR=5.59, p <.0001) emerged as significant correlates. After controlling for all other variables, including adequacy of the guide, those teachers who perceived their training to be effective were twice as likely to be more rather than less adherent compared to those who did not perceive their training to be effective (AOR=2.15, p <.05). Teachers whose training emphasized the importance of adherence were five times as likely to be more rather than less adherent than teachers whose training did not emphasize adherence (AOR=5.45, p<.0001).

Table 4.

Unadjusted (OR) and Adjusted (AOR) Odds Ratios of Relationship between Training and Adherence and Dose (unweighted n=266)

Adherencea Doseb

Training measure OR AOR OR AOR
Recency of training (1=At least 5 years ago; 4=Less than 1 year ago) 1.16 1.15 1.24 1.23
Hours of most recent training (6–8 hours vs. 0–6 hours) 1.34 0.83 1.26 1.15
Hours of most recent training (8+ hours vs. 0–6 hours) 1.48 1.30 1.56 1.53
Perceived effectiveness of training (A great extent vs. not a great extent) 2.70** 2.15* 1.39 1.13
Emphasis placed on following curriculum guide (A great deal vs. not a great deal) 5.59*** 5.45*** 1.70* 1.52
Adequacy of curriculum guide (Adequate vs. not adequate) 2.06 1.83

Note. Estimates are weighted, n is unweighted.

a 1=did not use guide/followed guide not very closely; 2=followed guide somewhat closely; 3=followed guide very closely.

b Three-level variable based on tertile split of distribution of proportion of lessons taught. A higher value indicates a higher proportion of lessons taught.

p<.10. *p<.05. **p<.001. ***p<.0001.

Emphasis placed on following the curriculum guide also emerged as a significant correlate in our dose model (OR=1.70, p<.05), while recency of training approached significance (OR=1.24, p<.10). After adjusting for all other variables, these two variables approached significance (AOR=1.52, p=.10 and AOR=1.23, p<.10, respectively).

Discussion

In this study, we found that of the US public middle school teachers who were using an evidence-based curriculum in 2005, 82% had received training in the curriculum. This is encouraging, given the substantial evidence that training is vital to ensuring fidelity. Teachers who receive training are more likely to develop the essential interactive teaching skills on which most evidence-based curricula depend. The high proportion of teachers trained in their curriculum is likely an indicator of financial and other support for prevention offered by the school or school district (Connell et al., 1985; King, Wagner, & Hedrick, 2001). There remains room for improvement, however, given that about one-fifth of teachers had not received training in the curriculum they were using. Recent reports suggest that among teachers who have not been trained, the majority recognize the value of training and would prefer to be trained (Anderson, Aromaa, & Rosenbloom, 2007).

We were discouraged by the fact that only 35% of teachers reported following their curriculum guide very closely. This, however, represents a considerable improvement over what we found in our previous study (Ringwalt et al., 2003), in which only 15% of teachers reported following their curriculum guide very closely in 1999. The previous results, however, included users of any of 48 commercially-available curricula, many of which were not evidence-based, and some of which may not have included a curriculum guide. Developers of evidence-based curricula may be more aware of the importance of fidelity and, thus, more likely to emphasize this concept in their training protocols. Furthermore, these curricula have been recognized as evidence-based by the national registries specified above and, therefore, have demonstrated that they can be implemented with fidelity. We note, however, that only about two-thirds of respondents in our study, all of whom implemented an evidence-based curriculum, indicated that a great deal of emphasis was placed on following the guide during their most recent training. Although such a proportion may be promising and indicative of recent efforts by state and federal agencies to promote fidelity, we are cautious about it given our finding that those teachers whose training emphasized adherence were five times as likely to be more adherent compared to those whose training did not emphasize adherence. Curriculum developers, therefore, should continue to highlight the importance of this critical concept in their training procedures.

We also found that the average proportion of lessons taught by teachers was only 65%, and only 30% of teachers taught all the lessons in their curriculum. We note that these figures may be slightly inflated given that those who indicated they taught 16 or more lessons were assumed to have taught all the lessons in the curriculum. Although we recognize the apparent inevitability of program adaptation and abbreviation (Pankratz et al., 2006), the fragmented fashion in which evidence-based curricula are being implemented jeopardizes their ability to prevent substance use among students. With fewer than two-thirds of lessons being taught, critical content is undoubtedly being omitted. These results, taken together with those related to adherence, indicate low levels of fidelity along the two dimensions we measured.

Although we did not find support for our hypothesis that recency of training was related to adherence, we did find that teachers who believed their training was effective were twice as likely to be more adherent to the curriculum guide. As suggested by Rohrbach, Graham, and Hansen (1993) and Romano (1996), this may be due, in part, to increases in teachers’ self-efficacy. Unfortunately, our measures did not allow us to explore this relationship.

No significant correlates emerged in our adjusted dose analyses; notably, and contrary to our hypothesis, the number of hours of training was unrelated to dose. Recency of training and the emphasis placed on following the guide nonetheless showed promise and warrant further consideration in future research.

In post hoc analyses, we found that only 7.5% (CI=3.9%, 11.0%) of teachers indicated that their most recent training was conducted either wholly or partially online. As developers consider ways to disseminate their curricula effectively, and as schools and districts struggle to allocate diminishing substance use prevention resources appropriately, online training represents a promising avenue for addressing both concerns. However, although online training may be effective in increasing trainees’ self-efficacy and application of learned skills in a public health context (Farel, Pfau, Paliulis, & Umble, 2003), it may not provide them with sufficient opportunities to develop, practice, and receive feedback on the interactive skills needed to implement evidence-based programs with fidelity (Tortu & Botvin, 1989). There is some evidence to suggest that those trained in person are more likely to use interactive methods than those trained by other means such as video (Basen-Engquist et al., 1994). As opportunities for online training proliferate, future research should evaluate whether this approach effectively equips teachers with the skills required to implement substance use prevention curricula as intended. We are aware of one study currently underway that is addressing this issue (ETR Associates, 2008).

This study has several limitations. First, our bi-dimensional conceptualization of fidelity (adherence and dose) does not address other aspects of the construct identified by Dusenbury and colleagues (2003) and others. Our approach did not take into account quality of delivery, for example, which has been widely shown to affect student outcomes. However, by including a measure of dose in our analyses, we advance the earlier work by Ringwalt and colleagues (2003) who considered only adherence per se. Furthermore, we considered additional factors such as the number of hours of training respondents received and the emphasis placed on following the curriculum guide during training.

We did not, however, take into account any of several curriculum-specific characteristics that may influence adherence and dose, such as time requirements or the number of lesson activities. However, our goal was to aggregate across all evidence-based curricula and identify training-related variables that affect fidelity. The fact that we found significant correlates suggests that training is, in fact, related to fidelity irrespective of curriculum type.

The current study is further limited by the fact that we only surveyed the lead substance use prevention teacher in each school. As suggested by Ringwalt and colleagues (2003), these teachers, by virtue of their position, may be more invested in the curriculum they teach – and thus more likely to maintain fidelity to it – than other substance use prevention teachers at the school. Our approach of identifying these respondents in sampled schools may thus have led to positively biased estimates of training and fidelity.

Finally, because our training-related items were not asked as part of our abbreviated telephone interview, our analysis sample included only those who responded via the Web or paper, thereby raising questions of mode effects. Mode analyses revealed that telephone respondents did not differ significantly from those responding via Web or paper on key variables such as evidence-based curricula use and adherence to the curriculum guide. Although we were unable to assess mode effects for the training variables across all three modes, a comparison of Web and paper respondents revealed no significant differences on these variables. We thus believe that these findings, in tandem with the non-sensitive nature of the training items, indicate that mode effects were not a concern for these analyses.

In summary, our findings provide insight into the current state of teacher training in the United States as it relates to the implementation of evidence-based substance use prevention curricula in public middle schools. Given the low levels of fidelity reflected in our study, more work is needed to ensure that these curricula are implemented as planned and, therefore, have the greatest opportunity to prevent adolescent substance use. The effective preparation of teachers, which should clearly include an emphasis on adherence, plays a critical role in achieving that goal.

Acknowledgments

We are deeply appreciative of the contributions made by Duston Pope (now of Gongos Research), and the Market Strategies, Inc. team, to ensuring a successful data collection effort. We gratefully acknowledge the letter we received from the Office of Safe and Drug-Free Schools of the US Department of Education in support of our data collection. This study was supported by NIDA grant #R01 DA016669.

Footnotes

* This study was supported by NIDA grant #R01 DA016669.

Contributor Information

Sean Hanley, Pacific Institute for Research and Evaluation, Chapel Hill, NC.

Chris Ringwalt, Pacific Institute for Research and Evaluation, Chapel Hill, NC.

Amy A. Vincus, Pacific Institute for Research and Evaluation, Chapel Hill, NC.

Susan T. Ennett, Department of Health Behavior and Health Education, The University of North Carolina at Chapel Hill, Chapel Hill, NC.

J. Michael Bowling, Department of Health Behavior and Health Education, The University of North Carolina at Chapel Hill, Chapel Hill, NC.

Susan W. Haws, Department of Health Behavior and Health Education, The University of North Carolina at Chapel Hill, Chapel Hill, NC.

Louise A. Rohrbach, Department of Preventive Medicine, Institute for Prevention Research, University of Southern California, Los Angeles, CA.

References

  1. Abbott RD, O’Donnell J, Hawkins JD, Hill KG, Kosterman R, Catalano RF. Changing teaching practices to promote achievement and bonding to school. American Journal of Orthopsychiatry. 1998;68(4):542–552. doi: 10.1037/h0080363.
  2. Allison PD. Logistic regression using the SAS system: Theory and application. Cary, NC: SAS Institute Inc; 1999.
  3. Anderson P, Aromaa S, Rosenbloom D. Prevention education in America’s schools: Findings and recommendations from a survey of educators. 2007. Retrieved November 21, 2007, from http://www.jointogether.org/resources/2007/prevention-education-in.html.
  4. Basen-Engquist K, O’Hara-Tompkins N, Lovato CY, Lewis MJ, Parcel GS, Gingiss P. The effect of two types of teacher training on implementation of Smart Choices: A tobacco prevention curriculum. Journal of School Health. 1994;64(8):334–339. doi: 10.1111/j.1746-1561.1994.tb03323.x.
  5. Botvin GJ, Baker E, Dusenbury L, Tortu S. Preventing adolescent drug abuse through a multimodal cognitive-behavioral approach: Results of a 3-year study. Journal of Consulting and Clinical Psychology. 1990;58(4):437–446. doi: 10.1037//0022-006x.58.4.437.
  6. Center for the Study and Prevention of Violence. Blueprints for Violence Prevention overview. 2004. Retrieved June 4, 2004, from http://www.colorado.edu/cspv/blueprints/model/criteria.html.
  7. Connell DB, Turner RR, Mason EF. Summary of findings of the School Health Education Evaluation: Health promotion effectiveness, implementation, and costs. Journal of School Health. 1985;55(8):316–321. doi: 10.1111/j.1746-1561.1985.tb05656.x.
  8. Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review. 1998;18(1):23–45. doi: 10.1016/s0272-7358(97)00043-3.
  9. Drug Strategies Inc. Making the grade: A guide to school drug prevention programs. Washington, DC: Drug Strategies, Inc; 1999.
  10. Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research. 2003;18(2):237–256. doi: 10.1093/her/18.2.237.
  11. Dusenbury L, Brannigan R, Falco M, Lake A. An exploration of fidelity of implementation in drug abuse prevention among five professional groups. Journal of Alcohol & Drug Education. 2004;47(3):4–19.
  12. Dusenbury L, Falco M. Eleven components of effective drug abuse prevention curricula. Journal of School Health. 1995;65(10):420. doi: 10.1111/j.1746-1561.1995.tb08205.x.
  13. Dusenbury LA, Hansen WB, Giles SM. Teacher training in norm setting approaches to drug education: A pilot study comparing standard and video-enhanced methods. Journal of Drug Education. 2003;33(3):325–336. doi: 10.2190/LH8K-G404-CJAW-PER0.
  14. Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prevention Science. 2004;5(1):47–53. doi: 10.1023/b:prev.0000013981.28071.52.
  15. Ennett ST, Ringwalt CL, Thorne J, Rohrbach LA, Vincus A, Simons-Rudolph A. A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science. 2003;4(1):1–14. doi: 10.1023/a:1021777109369.
  16. ETR Associates. California keepin’ it REAL project. 2008. Retrieved May 15, 2008, from http://www.etr.org/kirealeducators/aboutProject.htm.
  17. Farel AM, Pfau SE, Paliulis SC, Umble KE. Online analytic and technical training. Journal of Public Health Management & Practice. 2003;9(6):513. doi: 10.1097/00124784-200311000-00012.
  18. Jones DJ, Bowler S. The link between a participant’s perceptions of training and subsequent levels of application. Journal of Alcohol and Drug Education. 1997;42(2):53–68.
  19. Jones SM, Sutton BC, Boyle KE. Survey methodology for studying substance use prevention programs in schools. In: Chaubey YP, editor. Recent advances in statistical methods: Proceedings of Statistics 2001 Canada, the 4th conference in applied statistics. London: Imperial College Press; 2002. pp. 157–168.
  20. King KA, Wagner DI, Hedrick B. Safe and drug-free school coordinators’ perceived needs to improve violence and drug prevention programs. Journal of School Health. 2001;71(6):236–241. doi: 10.1111/j.1746-1561.2001.tb01324.x.
  21. Levenson-Gingiss P, Hamilton R. Determinants of teachers’ plans to continue teaching a sexuality education course. Family & Community Health. 1989;12(3):40–53.
  22. McCormick LK, Steckler AB, McLeroy KR. Diffusion of innovations in schools: A study of adoption and implementation of school-based tobacco prevention curricula. American Journal of Health Promotion. 1995;9(3):210–219. doi: 10.4278/0890-1171-9.3.210.
  23. Monitoring the Future. Trends in annual prevalence of use of various drugs for eighth, tenth, and twelfth graders. 2007a. Retrieved February 12, 2008, from http://www.monitoringthefuture.org/data/07data/pr07t2.pdf.
  24. Monitoring the Future. Trends in lifetime prevalence of use of various drugs for eighth, tenth, and twelfth graders. 2007b. Retrieved February 12, 2008, from http://www.monitoringthefuture.org/data/07data/pr07t1.pdf.
  25. National Center for Education Statistics. Common Core of Data: Public elementary/secondary school universe survey data, 2002–03. 2004. Retrieved August 15, 2004, from http://nces.ed.gov/ccd/pubschuniv.asp.
  26. National Institute on Drug Abuse. Preventing drug use among children and adolescents: A research-based guide. 2nd ed. Rockville, MD: National Institute on Drug Abuse; 2003. (No. 04-4212(B)).
  27. National Registry of Evidence-based Programs and Practices. SAMHSA model programs. 2004. Retrieved June 4, 2004, from http://modelprograms.samhsa.gov.
  28. Pankratz M, Jackson-Newsom J, Giles S, Ringwalt C, Bliss K, Bell M. Implementation fidelity in a teacher-led alcohol use prevention curriculum. Journal of Drug Education. 2006;36(4):317–333. doi: 10.2190/H210-2N47-5X5T-21U4.
  29. Parcel GS, Ross JG, Lavin AT, Portnoy B, Nelson GD, Winters F. Enhancing implementation of the Teenage Health Teaching Modules. Journal of School Health. 1991;61(1):35–38. doi: 10.1111/j.1746-1561.1991.tb07857.x.
  30. Perry CL, Murray DM, Griffin G. Evaluating the statewide dissemination of smoking prevention curricula: Factors in teacher compliance. Journal of School Health. 1990;60(10):501–504. doi: 10.1111/j.1746-1561.1990.tb05890.x.
  31. Pope D, Vincus A, Hanley S. Using a multi-mode design to maintain response rates. Paper presented at the meeting of the American Association for Public Opinion Research; Anaheim, CA. 2007.
  32. Quality Education Data Inc. QED national education database [database]. 1998.
  33. Renes S, Ringwalt C, Clark H, Hanley S. Great minds don’t always think alike: The challenges of conducting substance abuse prevention research in public schools. Journal of Drug Education. 2007;37(2):97–105. doi: 10.2190/T467-T0K6-8140-8635.
  34. Ringwalt CL, Ennett S, Johnson R, Rohrbach LA, Simons-Rudolph A, Vincus A, et al. Factors associated with fidelity to substance use prevention curriculum guides in the nation’s middle schools. Health Education & Behavior. 2003;30(3):375–391. doi: 10.1177/1090198103030003010.
  35. Ringwalt CL, Ennett S, Vincus A, Thorne J, Rohrbach LA, Simons-Rudolph A. The prevalence of effective substance use prevention curricula in U.S. middle schools. Prevention Science. 2002;3(4):257–265. doi: 10.1023/a:1020872424136.
  36. Ringwalt CL, Vincus A, Ennett S, Johnson R, Rohrbach LA. Reasons for teachers’ adaptation of substance use prevention curricula in schools with non-White student populations. Prevention Science. 2004;5(1):61–67. doi: 10.1023/b:prev.0000013983.87069.a0.
  37. Rogers EM. Diffusion of innovations. 5th ed. New York, NY: The Free Press; 2003.
  38. Rohrbach L, Dent C, Skara S, Sun P, Sussman S. Fidelity of implementation in Project Towards No Drug Abuse (TND): A comparison of classroom teachers and program specialists. Prevention Science. 2007;8(2):125–132. doi: 10.1007/s11121-006-0056-z.
  39. Rohrbach LA, Graham JW, Hansen WB. Diffusion of a school-based substance abuse prevention program: Predictors of program implementation. Preventive Medicine. 1993;22(2):237–260. doi: 10.1006/pmed.1993.1020.
  40. SAS Institute Inc. SAS/STAT 9.1 user’s guide. 2004. Retrieved June 27, 2008, from http://support.sas.com/documentation/onlinedoc/91pdf/sasdoc_91/stat_ug_7313.pdf.
  41. St. Pierre TL, Osgood DW, Siennick SE, Kauh TJ, Burden FF. Project ALERT with outside leaders: What leader characteristics are important for success? Prevention Science. 2007;8(1):51–64. doi: 10.1007/s11121-006-0055-0.
  42. Stead M, Stradling R, MacNeil M, MacKintosh AM, Minty S. Implementation evaluation of the Blueprint multi-component drug prevention programme: Fidelity of school component delivery. Drug & Alcohol Review. 2007;26(6):653–664. doi: 10.1080/09595230701613809.
  43. Tobler NS, Roona MR, Ochshorn P, Marshall DG, Streke AV, Stackpole KM. School-based adolescent drug prevention programs: 1998 meta-analysis. Journal of Primary Prevention. 2000;20(4):275–336.
  44. Tortu S, Botvin GJ. School-based smoking prevention: The teacher training process. Preventive Medicine. 1989;18(2):280–289. doi: 10.1016/0091-7435(89)90075-3.
  45. US Department of Education. Exemplary and promising safe, disciplined, and drug-free schools programs 2001. 2002. Retrieved June 4, 2004, from http://www.ed.gov/admins/lead/safety/exemplary01/exemplary01.pdf.
  46. Wagner EF, Tubman JG, Gil AG. Implementing school-based substance abuse interventions: Methodological dilemmas and recommended solutions. Addiction. 2004;99:106–119. doi: 10.1111/j.1360-0443.2004.00858.x.
