Abstract
Purpose
As interventions are disseminated widely, issues of fidelity and adaptation become increasingly critical to understand. This study aims to describe the types of adaptations made by teachers delivering a school-based substance use prevention curriculum and their reasons for adapting program content.
Design/methodology/approach
To determine the degree to which implementers adhere to a prevention curriculum, how they naturally adapt the curriculum, and the reasons they give for making adaptations, the study examined lesson adaptations made by the 31 teachers who implemented the keepin' it REAL drug prevention curriculum in 7th grade classrooms (n = 25 schools). Data were collected from teacher self-reports after each lesson and from observer coding of videotaped lessons. From the total sample, 276 lesson videos were randomly selected for observational analysis.
Findings
Teachers self-reported adapting more than 68 percent of prevention lessons, while independent observers reported more than 97 percent of the observed lessons were adapted in some way. Types of adaptations included: altering the delivery of the lesson by revising the delivery timetable or delivery context; changing content of the lesson by removing, partially covering, revising, or adding content; and altering the designated format of the lesson (such as assigning small group activities to students as individual work). Reasons for adaptation included responding to constraints (time, institutional, personal, and technical), and responding to student needs (students' abilities to process curriculum content, to enhance student engagement with material).
Research limitations/implications
The study sample was limited to rural schools in the US mid-Atlantic; however, the results suggest that if programs are to be effectively implemented, program developers need a better understanding of the types of adaptations and reasons implementers provide for adapting curricula.
Practical implications
These descriptive data suggest that prevention curricula be developed in shorter teaching modules, developers reconsider the usefulness of homework, and implementer training and ongoing support might benefit from more attention to different implementation styles.
Originality/value
With nearly half of US public schools implementing some form of evidence-based substance use prevention program, issues of implementation fidelity and adaptation have become paramount in the field of prevention. The findings from this study reveal the complexity of the types of adaptations teachers make naturally in the classroom to evidence-based curricula and provide reasons for these adaptations. This information should prove useful for prevention researchers, program developers, and health educators alike.
Keywords: Implementation, Schools, Programmes, Adaptation, Prevention, Curricula
Introduction
The 7th grade students sit in class, some leaning in toward their friends and talking, others drawing, writing, or just sitting alone at their desks. A few students begin to laugh at something, and Ms. Smith, approximately age 35, issues a warning that if the class doesn't behave they will not be permitted to watch the drug prevention video in today's “keepin' it REAL” lesson. The students quiet down for about five minutes as they work independently on an activity, and then laughter is heard elsewhere in the room. As promised, Ms. Smith says, “Okay. No video.” She later reports to the researchers, “My class decided that they could not behave for the movie, so I did not show it.”
This scene is somewhat typical of classroom settings where school-based prevention curricula are being taught. Sometimes all curriculum material is covered as planned and sometimes it is not. Although prevention programs are developed carefully and implementers (such as teachers) are trained to implement the programs with fidelity, real-world events sometimes conspire to influence program delivery in ways program developers do not imagine. In these implementation efforts, delivery of prevention programs is a negotiation among the curriculum, teachers' classroom management and interests, students' behavior and needs, and administrative influence. Thus, evidence-based programs developed and evaluated in a research context are rarely, if ever, delivered in the same way they were originally designed, and adaptations to program models are the norm rather than the exception (Breitenstein et al., 2010a, b; Dariotis et al., 2008; Dusenbury et al., 2003, 2005; Gottfredson, 2001; Greenberg et al., 2001; Ozer et al., 2010; Ringwalt et al., 2004a; Rohrbach et al., 2010). Teachers often delete and/or change materials due to time constraints (Hill et al., 2007), with some reviews claiming adaptations occur to more than 50 percent of program content (Knoche et al., 2010; Odom et al., 2010). Durlak (1998) estimated that as much as 80 percent of program activities may be omitted during implementation.
Prevention researchers' interest in fidelity and adaptation of evidence-based curricula has been keen since the mid-1970's when there was increased attention to educational innovations (Dane and Schneider, 1998). In the USA, school adoption of evidence-based prevention programs has been aided by several federal government agencies, such as the National Institute on Drug Abuse (NIDA), the Center for Substance Abuse Prevention (CSAP), and the Substance Abuse and Mental Health Services Administration (SAMHSA), which have created registries such as the National Registry of Evidence-based Programs and Practices (NREPP) to identify evidence-based prevention programs. This information is often available online. As the number of evidence-based programs increased and, especially, as they became widely known and disseminated, prevention researchers became concerned with fidelity and adaptation, asking how adequately evidence-based models were being implemented (Bond et al., 2000; Durlak et al., 2010; Rohrbach et al., 2006). With nearly half of US public schools implementing some form of evidence-based substance use prevention program (Ringwalt et al., 2011), issues of implementation fidelity and adaptation have become paramount.
How to conceptualize adaptation?
In most models of prevention science an essential definitional element of fidelity is loyalty to the planned content of a curriculum (Breitenstein et al., 2010b), reasoning that without fidelity to the intended delivery model and method, there is no way to determine whether unsuccessful outcomes reflect a failure of the model itself or failure to implement the model as intended (Chen, 1990; Dusenbury et al., 2003). Based on this view, past fidelity research predominately focussed on adherence and dosage as key determinants for the effectiveness of prevention curricula (Fagan et al., 2008; Sloboda et al., 2009; Szulanski and Winter, 2002). Adherence refers to both the delivery of specified program content and the use of specified delivery strategies (Ennett et al., 2003; Tobler and Stratton, 1997), and research indicates that both types are required for program effectiveness (Ennett et al., 2011). The second component, dosage, refers to sufficient exposure to the program, such as the number and length of sessions (Ennett et al., 2011).
Yet, recent studies have revealed that some degree of curriculum adaptation is inevitable in the process of implementation (Kelly et al., 2000; Ringwalt et al., 2003), which has led many prevention researchers to support balancing the need for program fidelity with a desire for local or cultural adaptation (Backer, 2001; Griner and Smith, 2006; Hecht and Miller-Day, 2010; Hohmann and Shear, 2002; Ringwalt et al., 2004a).
Beets et al. (2008) found that teachers do not implement intervention programs with the same degree of fidelity even when similarly trained to use a structured curriculum. Other evidence shows that teachers who modified program content were often more motivated, creative, and better teachers overall than those who exhibited greater fidelity (Dusenbury et al., 2003). Based on theoretical and empirical evidence, it is unreasonable to assume that adaptation at the level of implementation can be eliminated or that it is even desirable to do so (Hecht and Miller-Day, 2010). Instead, we submit that research should first describe adaptation processes so that we might then be able to determine under what circumstances they are beneficial and when they are detrimental.
It stands to reason that an understanding of adaptation processes should consider why adaptations occur. If they are a normal part of implementation and are more likely to be made by the best teachers, we need to understand whether changes are made on account of poor skills or lack of motivation or for more proactive reasons. Teachers, in fact, may have a variety of reasons for adapting prevention curricula in natural settings. However, despite its importance, we are aware of very few studies that evaluate the reasons implementers cite for adapting prevention curricula. One exception is Moore et al. (in press), who describe two categories of reasons for adaptations based on fit: logistical and philosophical.
Moore et al. (in press) defined logistical fit as adaptations made to better accommodate logistical constraints, such as omitting lessons or lesson material due to lack of time or changes in setting, whereas philosophical fit referred to making adaptations to better accommodate differing philosophical approaches, such as omitting material because it was not deemed developmentally or culturally appropriate for the target audience (e.g. discussion of sexuality with pre-adolescents). Indeed, many prevention programs are not developed with cultural diversity in mind (Hecht et al., 2003) and, so, may require adaptation for different cultural audiences. For instance, Ringwalt et al. (2004b) found that teachers with ethnic minority students were inclined to adapt prevention curricula to make lessons more culturally appropriate for their students. Thus, Moore et al. begin to elucidate reasons for program adaptation as well as the potential importance of these adaptations.
While the focus of this kind of adaptation research is often on the adaptations themselves, little attention has been paid to who is implementing the adaptations. In the case of school-based programs, it is frequently teachers who are the implementers. Given this, education theory may inform the adaptations that teachers make in school-based programs.
Adapting curriculum to teaching style
Education researchers have proposed that teachers tend to exhibit different teaching styles; that is, ways in which teachers generally interact with students across a variety of classroom settings. Paulson et al. (1998) conceptualized teaching styles along two dimensions: teacher control and responsiveness toward students. Drawing on research describing parenting styles (Baumrind, 1973), the Paulson et al. taxonomy identifies authoritarian and authoritative teachers. Authoritarians are those with moderate to high levels of control but low levels of responsiveness, while authoritatives are those with moderate levels of control but high levels of responsiveness.
Given this research, it seems logical to assume that teacher style may affect the degree and types of adaptations teachers make when delivering a prevention curriculum, with more responsive teachers possibly adapting curricula in response to student and situational needs. Pettigrew et al. (2013) recently added significant insight into this issue by providing a teacher-driven delivery model of school-based prevention. In a separate study based on the current study's data set, this model describes two salient integrative dimensions of delivery, teacher control (passive, coordinated, strict) and student participation (disconnected, attentive, participatory), that, in combination, reveal six distinct patterns of teacher-student interaction in the delivery of a school-based prevention curriculum. While the research linking education theory to prevention interventions is sparse, the patterns described in Pettigrew et al. (2013) show promise for guiding future research on adaptations. Because the current literature on types of and reasons for curriculum adaptation is also slight, the present study sought to expand this line of inquiry to more fully understand the degree to which implementers adapt curricula, the types of adaptations they make, and the reasons implementers give for these adaptations. In the current study we examine the nature of adaptations made in the implementation of the classroom-based keepin' it REAL drug prevention curriculum in 25 rural school districts across Pennsylvania and Ohio and address the following research questions:
RQ1. To what degree do implementers adhere to a prevention curriculum?
RQ2. How do implementers naturally adapt a prevention curriculum?
RQ3. What reasons do implementers give for adaptations?
Building a strong empirical foundation in understanding adaptations may lead to a better understanding of how to prevent curriculum drift and promote effective innovations. It also may yield new insights about the needs, wishes, limitations, and constraints implementers face in real-world settings.
Methods
The data from this study are part of a larger investigation of adaptation processes. The larger study examines both designer adaptation and implementer adaptation. Designer adaptation is operationalized as adaptations implemented by the researchers to customize or “tailor” curricula for a specific cultural audience. In the larger investigation, researchers culturally re-grounded the urban keepin' it REAL, seventh grade intervention curriculum – a multicultural program designed in Phoenix, Arizona (see Hecht et al., 2003) – for rural adolescents in Pennsylvania and Ohio. This re-grounding retained the same lesson structure and lesson content, but adapted a variety of surface and deep structure elements to appeal to a rural audience (see Colby et al., 2013). To study effects of this designer adaptation, rural schools were randomly assigned to a control condition that continued existing prevention practices, a rural condition that received the re-grounded version of the curriculum, or a “classic” condition that received the original multicultural version of the curriculum. For schools in both treatment conditions (rural, classic), we also collected data allowing us to study ways implementers adapt the curricula. For this report, we focus on the examination of these data, studying implementer adaptations in the two treatment conditions. Because of the breadth of information needed to report on processes involved in both designer and implementer adaptation and because the processes – although related to one another – largely operate separately, this study focusses on types of and reasons for implementer adaptation, examining adaptations collectively across the curricula.
Sample
For the purposes of the current implementation study, we examined adaptations made by the 31 teachers who implemented the classic and rural keepin' it REAL conditions (n = 25 schools). Teachers ranged in age from 20 to 30 (28.6 percent), 31 to 40 (23.8 percent), 41 to 50 (33.3 percent), and 51 to 60 (14.3 percent); 86 percent were female and 14 percent male, and 97 percent were Caucasian. One teacher's ethnicity was American Indian/Alaska Native. This reflects the predominantly Caucasian rural population of Pennsylvania and Ohio. Also representative of the school districts in the area, teachers delivered the curriculum to 7th graders in a variety of school types, including elementary, middle, and high schools. The mean years of teaching experience for these teachers was 12.81 (SD = 9.04). Many participating teachers taught the curriculum to multiple classes of students, for a total of 31 teachers teaching the curricula to 73 classes of students in the 25 schools. Seventh grade youth were chosen to minimize attrition due to dropouts (with school attendance mandated until the age of 16 and most districts reporting dropout rates of 10 percent or less) and because data from our previous studies and other research show that some students report experimenting with drugs in the sixth grade, with experimentation intensifying in the seventh grade (Hecht et al., 2006), making this an effective time to intervene (Dryfoos, 1990).
Teacher training
Research in school settings has shown that teachers who participate in training adhere more closely to program manuals than do untrained teachers (Basen-Enquist et al., 1994; Dusenbury et al., 2003). Therefore, teachers implementing both versions of the curriculum were provided with comparable one-day, eight-hour training workshops. Funds were provided to schools for substitute teachers so that all teachers could attend the training. All teacher implementers joined together for the first, generic part of the training and then were offered opportunities for practice in separate breakout groups for each version of the curricula. The generic part of the training provided a history of the keepin' it REAL program, a review of program effectiveness, benefits of the program, and a clear explanation of the program philosophy and goals (30 min). The training also provided an overview of the study (30 min) and then detailed training in each lesson of the curriculum (4 hrs). Lunch was provided as the participants divided into their breakout sessions. The breakout sessions included training in how to complete the research tasks (e.g. online survey, videotaping lessons) (1.5 hrs), and each teacher was required to teach a mini-lesson from their curriculum and get hands-on practice with the video camera provided to them for the research (1.5 hrs).
Data collection
Data for this adaptation study came from two sources: teacher reports and observer coding and rating of videotaped lessons. Self-reports are needed to identify the reasons teachers cite for making adaptations and what they identify as adaptations. Self-reported fidelity, however, is typically higher than that resulting from observations by outsiders, and observational data are assumed to be more valid than self-reports because the latter are more subject to social desirability bias (Ennett et al., 2011; Lillehoj et al., 2004). The following describes our procedures.
Teacher reports
After completing each of the ten lessons, teachers completed an online survey, which included questions about the delivery and adaptation of the lesson for that day. The survey contained five sections assessing: demographic information; student interest in the lesson; how much of the lesson was completed; adaptations and reasons for adaptations; and teacher satisfaction with the lesson content and length. For this particular study, we focussed on analyzing the data reported on how much of the lesson was completed, adaptations, and reasons for adaptations.
Teachers were specifically asked “How much of the lesson did you complete today?” and provided with five response options (0 = none to 5 = all of it). Additionally, lessons were divided into components (e.g. Review of Homework; Activity 2; Video Discussion) and teachers were asked to indicate if they “omitted” or “changed, added, or improved” any of these components, then asked to describe any adaptations, and report reasons for making any changes or omissions. Finally, teachers were provided with a list of eight adaptations that were identified in the literature and instructed to check “all that apply to any changes made in this lesson” (e.g. I wanted to make the materials and activities more appropriate to my students' ethnic group; I added more about the effects of drugs on the body) and asked “Did any factors outside of your control affect your delivery of this lesson?” (yes/no) and, “If so, please check all that apply” from a list of possible factors (e.g. assembly, field trip). Data pertaining to 700 of the 730 lessons were collected from 31 teachers through this online survey.
Observer coding and rating of video data
In addition to teacher self-report, teachers videotaped every lesson using digital video recording equipment provided by the project. Teachers mailed digital video cards to project staff after recording each class. A total of 730 digital videos of lessons were uploaded into NVivo 8, a qualitative data management and analysis software program. Of those 730 videos, each 10-45 minutes long, 624 had complete audio and video data. Given the massive amount of data, a total of 276 of these videos were randomly selected for analysis. We excluded the first and last lessons because their introductory and summary content made them slightly different in structure from the other lessons. The selection procedures resulted in a balanced and random sampling of lessons across teachers and curricula. Trained observers then viewed the videotapes, coded adaptations as they occurred, and provided dimensional ratings of implementer adaptations along with other study variables. Additional details regarding the video data and selection procedure are reported in Pettigrew et al. (2013).
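For illustration only, the balanced random selection described above could be approximated with a stratified draw like the sketch below; the catalog structure, identifiers, and per-stratum quota are assumptions made for this example and do not reproduce the project's actual selection procedure (see Pettigrew et al., 2013).

```python
import random
from collections import defaultdict

def select_videos(catalog, per_stratum=9, seed=42, excluded_lessons=(1, 10)):
    """Randomly sample lesson videos, balanced across teacher and curriculum version.

    `catalog` is a hypothetical list of dicts such as
    {"teacher": "KG-S8", "curriculum": "rural", "lesson": 3, "file": "..."}.
    The first and last lessons are excluded, mirroring the exclusion of
    introductory and summary lessons described above.
    """
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for video in catalog:
        if video["lesson"] in excluded_lessons:
            continue
        by_stratum[(video["teacher"], video["curriculum"])].append(video)

    selected = []
    for _stratum, videos in sorted(by_stratum.items()):
        rng.shuffle(videos)
        selected.extend(videos[:per_stratum])  # take up to the per-stratum quota
    return selected
```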
Data analysis
The video data were analyzed by trained coders who quantitatively coded adherence to the curricula, types of adaptations, student and teacher engagement, and overall quality of the lesson. Six coders participated in a 16- to 20-hour training period and did not contribute to data analysis until sample coding on three randomly selected videos reached a minimum of 0.80 intercoder reliability using Krippendorff's α coefficient (Hayes and Krippendorff, 2007). Example quantitative codes included time stamps at the start and stop points in the video for each activity in a lesson (e.g. Review of Homework; Activity 2), adherence (whether an activity was covered, yes/no), and adaptations (whether an activity was changed, yes/no). In addition to the quantitative ratings, coders qualitatively described teacher adaptations and how these adaptations were implemented. For example, coders were trained to describe how each lesson component was altered in content (e.g. added a discussion about the legal consequences of drug use) or strategy (e.g. created a new activity for the students; group work was changed to be an individual activity). To prevent coder drift, intercoder agreement was recalculated every two to three months during the data analysis period.
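For readers unfamiliar with the reliability statistic referenced above, the following is a minimal sketch of how Krippendorff's α can be computed for nominal codes such as the yes/no adherence and adaptation judgments described here. The data structure and toy values are illustrative assumptions and do not reproduce the project's analysis scripts.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal-level data.

    `units` is a list of coding units (e.g. one lesson activity each); each unit
    is the list of codes assigned by the coders who rated it. Missing ratings
    are simply left out of a unit's list.
    """
    # Build the coincidence matrix from all ordered pairs of codes within a unit.
    coincidences = Counter()
    for unit in units:
        m = len(unit)
        if m < 2:
            continue  # a unit rated by fewer than two coders carries no agreement information
        for a, b in permutations(unit, 2):
            coincidences[(a, b)] += 1.0 / (m - 1)

    n = sum(coincidences.values())  # total number of pairable values
    values = {v for pair in coincidences for v in pair}
    totals = {c: sum(coincidences[(c, k)] for k in values) for c in values}

    # Observed vs. expected disagreement (nominal distance: 0 if codes match, 1 otherwise).
    d_observed = sum(coincidences[(c, k)] for c in values for k in values if c != k)
    d_expected = sum(totals[c] * totals[k] for c in values for k in values if c != k) / (n - 1)
    return 1.0 - d_observed / d_expected

# Illustrative check on a hypothetical set of double-coded activities.
sample = [["covered", "covered"], ["covered", "covered"],
          ["omitted", "omitted"], ["covered", "omitted"]]
print(round(krippendorff_alpha_nominal(sample), 2))  # approximately 0.53 for this toy sample
```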
In order to answer the first research question (To what degree do implementers adhere to prevention curricula?), descriptive statistics were calculated separately for teacher reports of how much of the lesson they completed and observer and teacher ratings of whether lesson components (content or strategies) were covered/omitted or changed.
In order to address the second question (How do implementers naturally adapt drug prevention curricula?), a qualitative content analysis was conducted on observers' qualitative descriptions and teacher responses to identify the types of adaptations that occurred in the lessons, and frequencies were calculated on responses to items regarding other adaptations and factors that affected delivery of the lesson. Additionally, to assess the amount of time spent on certain tasks and identify time reductions/additions, time stamps for each lesson component were combined into four lesson segments: introduction and homework review, providing information/content, activities (including discussion, role play, dyadic, and individual activities), and lesson review. Average planned times (as determined from the curriculum manual) for each segment across lessons were calculated and compared to average times spent on each segment during real-world implementation.
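As an illustration of the time-stamp aggregation described above, a simple sketch follows; the segment labels, planned minutes, and observed time stamps are hypothetical placeholders rather than the study's data.

```python
from statistics import mean, stdev

# Hypothetical planned minutes per segment, as they might appear in a curriculum manual.
PLANNED = {"intro/homework review": 5, "information/content": 10,
           "activities": 20, "lesson review": 10}

def segment_minutes(start_stop):
    """Convert coder time stamps (start, stop), in minutes, into a duration."""
    start, stop = start_stop
    return stop - start

def compare_to_plan(lessons):
    """Average observed minutes per segment across lessons vs. the planned times.

    `lessons` is a list of dicts mapping segment name -> (start, stop) time stamps
    for one observed lesson; segments a teacher skipped are simply absent.
    """
    for segment, planned in PLANNED.items():
        observed = [segment_minutes(lesson[segment]) for lesson in lessons if segment in lesson]
        if len(observed) < 2:
            continue  # need at least two observations to report a standard deviation
        print(f"{segment}: planned {planned} min, "
              f"observed M = {mean(observed):.2f} (SD = {stdev(observed):.2f}), n = {len(observed)}")

# Toy example with two observed lessons.
compare_to_plan([
    {"intro/homework review": (0, 4), "activities": (4, 27), "lesson review": (27, 30)},
    {"intro/homework review": (0, 6), "information/content": (6, 14), "activities": (14, 33)},
])
```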
Finally, to answer our third research question, a qualitative content analysis was conducted to identify teachers' self-reported reasons for making adaptations. Additionally, frequencies were calculated for the a priori reasons teachers selected as representing their adaptations.
Findings
Teachers adapted the keepin' it REAL (kiR) drug prevention curriculum to a considerable degree, in numerous ways, and for several reasons. Adding to the fidelity and implementation literature, this study illustrates the complexity of implementing prevention material under real-world conditions.
Degree of adaptation
According to teachers' self-reports on the online survey, 32 percent of the 700 lessons were delivered in their entirety, 60 percent “mostly delivered,” and in 8 percent only half or less of the lesson was delivered. In contrast, independent observer ratings of 276 videotaped lessons revealed that across the two programs 2.5 percent (n = 7) of the coded lessons were delivered with total fidelity (no omissions or adaptations). Thus, consistent with previous research, a considerable amount of adaptation was present, with a greater amount described by observers than was reported by teachers.
Types of adaptations
At the end of each lesson teachers were asked to identify adaptations they made from a list of possible adaptations. The most commonly reported adaptation from the teacher selections was “sharing my own personal experiences/stories with my students” (35 percent, n = 245 lessons), followed by “I adapted the lesson to be more relevant for students living in rural areas” (15.4 percent, n = 108), “I adapted the lesson to be more relevant for students in the 7th grade” (11.3 percent, n = 79), “I added more about how drugs are bad for you” (8.7 percent, n = 61), and “I added more information about the effects of drugs on the body” (4.7 percent, n = 33). From the researcher-generated list of possible adaptations, teachers reported that <3 percent of the lessons were adapted to add information about how drugs are immoral or wrong (n = 16) or to adapt to students' ethnicity (n = 19) or special needs (n = 12).
To move beyond a priori categories of possible adaptations, teachers were also provided an open-ended text box and asked to describe the actual adaptations they implemented. A total of 426 responses were obtained. Independent observers then coded these 426 teacher descriptions along with their own independent observations of teacher adaptations in the 276 videotaped lessons. This coding revealed four broad categories of adaptations. The first involves the delivery timetable: adapting when and how lessons are taught. The second category, delivery context, involves where lessons were implemented and by whom. Third, adaptations occurred when teachers altered the content of a lesson by partially or completely omitting a lesson component, revising or reframing lesson content, or adding new content to a lesson. The fourth broad category of teacher adaptations involved altering the format of a lesson or lesson component. This category of adaptations occurred when teachers covered the material included in the keepin' it REAL curriculum in a manner different from what the curriculum stipulated. Each of these four categories of adaptations is discussed in this section (see Table I).
Table I. Lesson component adaptations

| Type of adaptation | Number of adaptations coded to this category |
|---|---|
| Content | |
| Partially omitting a lesson component | 3,436 |
| Completely omitting lesson component | 2,477 |
| Adding content to a lesson component | 1,590 |
| Revising content in a lesson component | 484 |
| Format | |
| Changed a non-classroom activity into an entire class activity | 1,919 |
| Changed a non-individual activity into an individual activity | 614 |
| Complete lesson component at another time/subsequent lesson | 584 |
| Added extra time to a lesson component | 371 |
| Changed materials | 47 |

Notes: Mean lesson components per lesson = 15. An adaptation could be coded into more than one category if necessary
Delivery timetable
Both curricula are designed to have ten separate classroom-based lessons taught once a week for ten consecutive weeks. In the curriculum guides, the average prescribed lesson lengths were 45 minutes (rural program) and 40 minutes (classic program). Some teachers in our sample followed the prescribed timeframe, but many did not. Some teachers reported teaching lessons every day for two consecutive weeks. Others taught some lessons before their winter holiday break and completed the curriculum after the break. Still others skipped or combined lessons due to unanticipated changes in their schools' schedules. For example, snow days forced some schools to alter their planned delivery schedule: of the 20 scheduled school days for February, 16 were delayed or canceled due to inclement weather. Beyond snow days, teachers specifically identified a variety of factors affecting the delivery of the lessons, including student disruptions (12.6 percent, n = 88), problems with technology (5.6 percent, n = 39), shortened days due to statewide testing (2.1 percent, n = 15), and student field trips (1.9 percent, n = 13). Teachers also shortened, canceled, and interrupted lessons due to their own lack of preparation, illness, or perceived lack of student interest. In actual implementation, the average duration of the kiR lessons across versions was 35.5 minutes (SD = 8 min). These data suggest that teachers tended to adapt the delivery timetable in order to complete the lessons within the constraints of the school day. This first category of adaptations reveals some of the difficulties faced in delivering prevention curricula in natural settings.
Delivery context
The second category of adaptations concerned the settings in which implementers presented curricula and by whom. First, it was noted that the settings varied widely. While the majority of the lessons were taught in a single classroom setting to 15-25 students, several schools combined classes in auditorium settings and taught upwards of 65 students. If seating was not moveable, this created significant barriers to students getting into groups and developing and performing role-play scenarios. Observers reported that behavioral issues were exacerbated in these larger groups.
Next we examined the type of class/teacher. Again, a great deal of diversity was noted. The lessons were presented by Literature, Science, Health, and Physical Education teachers/coaches. In some cases the curricula were presented during homeroom (i.e. non-academic) period rather than during a subject period such as health class. These contexts influenced implementation. Teachers often interjected context-related materials from their own areas of expertise and training into the lesson delivery. For example, when a student asked about the effects of inhalant use, the science teacher inserted additional information about the physiological mechanisms affected by inhalants. A health teacher incorporated activities that were based on fear appeals and that the students would “normally do in 8th grade health.” We concluded that context had important implications for implementation fidelity.
Altering content
Any adaptations to content in individual lesson components were coded as this type of adaptation. They included modifying the content of a lesson by fully omitting or partially covering a lesson component, revising content, or by adding content to a lesson component (see Table I for frequencies).
Fully omitting or partially covering lesson component. On the whole, observers noted a high degree of omission during implementation. Across curricula, 37 percent of the total adaptations to lessons were omissions. With the exception of those lessons abbreviated due to a shortened school schedule, the omission of lesson components typically occurred toward the latter part of lessons. That is, teachers omitted components such as the summary review of the lesson or the assignment and explanation of homework more frequently than introductory materials such as previewing the objectives of the lesson or introductory lectures/discussions.
Teachers not only omitted entire lesson components but also chose to only partially cover lesson content by shortening or hurrying through components. For example one teacher (KG-S8_L2) stated that she assigned homework for the lesson, “but the bell rang for dismissal before I could go over instructions clearly […] [I] will address [this] tomorrow in class.” Thus, she partially covered the homework component by making the homework assignment but omitting the explanation of the homework. Several teachers (WVD-R7AC_L2; OH-R3_L2; KG-R14_L; WVD-R13_L2; OH-R4_L2; KG-R6_L2) admitted that they shortened the Guessing Game, an activity in which students “risk” (i.e. wager) their team's “points” while guessing the correct answers on questions dealing with statistics about substances. Rather than play the entire game, teachers reported covering a portion of the 12 questions. Teachers also skipped elements of the discussion that was to follow the Guessing Game. For example, WVD-R7AC_L2 did not debrief the purpose of the game, perhaps contributing to students missing the point of the activity. WVD-R10_L2 “only briefly” discussed one lesson component because she/he had “taught this subject already” to the students (in a non-prevention oriented class). Still others sped through lesson components due to time constraints. The lesson-end review also was frequently shortened. For example, when examining the time stamp information for one of the lessons, we discovered that while the curriculum prescribed 11 minutes for review, in the rural curriculum, teachers spent on average 2.19 minutes reviewing (SD = 4.34) and in the original curriculum, teachers spent on average 4.15 minutes reviewing lesson content at the end of class (SD = 8.18).
Revising content. In a few classes, material was replaced or substantially revised rather than omitted. Teachers like OH-R2_L2 replaced scenarios presented in the curriculum for discussion with scenarios of her own choosing. Sometimes these adaptations included locally relevant examples and situations. For example, when discussing students' homework before the Thanksgiving break and the opening day for buck/deer rifle hunting season, KG-S8_1_L6 suggested that students might consider avoiding hunting camps, which are known for beer drinking and use of chewing tobacco.
Adding content to lesson component. A final way teachers altered the content of lessons was by adding their own material. For instance, one teacher added a Michael Jackson music video to conclude her lesson and consistently added discussion questions to powerpoint slides (KG-S11). Another teacher added a visual demonstration of curriculum content. He shared an illustration of stress using a balloon (WVD-R11_1_L9). By inflating a balloon until it popped, this teacher audibly and visually demonstrated the potentially negative consequences of accumulating stress without appropriate coping skills. His use of a balloon as a metaphor grabbed students' attention to the lesson content and led to an active class discussion. We also observed one teacher who asked students to write down if they smoked and, if so, why they smoke. The teacher spent several minutes assuring students that he did not “care” if they smoked and assured the students that he would not report anyone to the principal. When 100 percent of the students in the class reported they did not smoke, the teacher turned the situation into an opportunity to highlight norms, pointing out “that sometimes people think others smoke, but they really do not.”
Still other teachers added narratives to lesson content. These teachers shared stories from their own experience or other students as “real-life” examples of curriculum concepts. One teacher commented: “I told the students a story of a student from another school district that took a very unhealthy risk and the consequences that she had to go through. The students took the story very seriously. It seems that they really listen when you give them real-life examples and not examples that are very far out of their element” (KG-R14_L2). Another teacher (OH_R3_L2) shared the following story:
Just last week, there was a – I'm from [a town nearby] – but I think I've told you that before. But on Wednesday, [my town] didn't have school at all last week. They had parent-teacher conferences Monday and Tuesday, but like how we didn't have school on Wednesday, they didn't either. And there is a boy who plays on my brother-in-law's basketball team. He's a senior. His mom and his sister were on their way to Fort Wayne to go shopping and this happened like at four o'clock in the afternoon. They were on [highway], the same highway out here, US [##] that goes all the way from [town 2] to [town 3]. They were leaving from [my town] and were going to [town 3] and this drunk driver decided to run a stop sign and ran right into their car. And it killed the mom and the daughter blacked out doesn't really remember. She was okay. She wasn't hurt, but her mom was killed. She's your age. She's in the 7th grade and she lost her Mom because of a drunk driver. Because he made the decision to drink excessively and get behind the wheel. And so, these kids lost their mom. They have a son that already graduated and their other boy is a senior and then their daughter is your age. So could you imagine, you know, the day before Thanksgiving just going out to run errands to go shopping with your parents and get hit by a drunk driver in the middle of the day? At 4 o'clock. That's […] you know things like that happen, so there are choices, risks and consequences that go along with things. So imagine […] consequences as they affect your dreams (OH_R3_L2).
This teacher's narrative was followed by students sharing related stories about consequences they have witnessed due to drinking and driving.
As a research group we were especially interested in how narratives were employed in the classroom. Thus, observers were keen to note times teachers used narratives and when narratives were told by students. The example cited above was followed by at least three student narratives about the curriculum concepts of risks and consequences. The story-swapping session lasted about seven minutes in all. Other teachers also added narratives to kiR curricula as illustrations and examples of lesson content. These narratives sometimes spawned brief periods of time where students shared different but related stories from their own experiences and knowledge. After a teacher stressed the importance of how one choice to use drugs can have long-lasting consequences, for example, one student shared a story illustrating the teacher's point (KG-S8_2_L6). The student had seen a program on the History channel about a monkey who was given the choice between crack cocaine and social interaction with another monkey. The monkey chose crack. Another student indicated that she/he had seen the same program. In this case, students shared a story from their own experience (i.e. watching television) that helped illustrate curriculum content and the teacher's point. This case also shows the communal nature of some narratives. Students in the class sometimes indicated that a narrative being shared by one was known by others.
While many teachers employed narratives as teaching tools, others “went off on rabbit trails” with their narratives. In these cases, stories were either unrelated to curriculum content or were antithetical to the underlying theory or philosophy of the program. For example, one teacher (WVD-S9_4_L7) shared a personal story followed by a superb example of an open class discussion – open exchange and debate of ideas, many students participating, teacher facilitating student interaction. However, the topic was deer hunting. In situations like this, narratives were integrated into the lesson but were off-topic and did not present or reinforce curricula concepts.
Altering the format
The fourth category of teacher adaptations included altering the format of a lesson or lesson component. This category of adaptations occurred when teachers covered lessons in a manner different than what was prescribed (see Table I for frequencies).
Complete material as a class. The most common change in format was transposing a small group activity into an entire class activity. Instead of dividing students into groups and allowing them to practice curriculum content as instructed, teachers completed the activity as an entire class or had a few students volunteer to complete the group work with the rest of the class watching. WVD-R9_L3 reported that “s/he used three students to demo passive, aggressive, and assertive response [styles] instead of groups.” KG-R14_L3 also translated small group work to a class demonstration: “I introduced the different styles of communicating by giving examples and having the students demonstrate their ideas in class.” OH-R3_L6 altered the format of a group activity by reading the scenario aloud to the class and then calling on students to share examples of how to avoid the situation.
Not only was small group work translated into entire class work, the format of other types of activities was adapted as well. Teachers translated partnered activities and homework into entire class work. WVD-S7_L1 translated an activity designed to have students pair with one another and share their responses into an entire class activity. She reasoned that her students were “more receptive to respond as whole group than in partners.” Regarding homework, WVD-R10_L5&L6 said that she revised the homework assignment and, instead, “made this the main activity of my lesson.” OH-R3_L2 said she completed the previous lesson's homework at the beginning of class. She added, “I think it prepared the students for the content that was being taught today, but I could have skipped it because I wasn't able to complete the entire lesson today.” In sum, teachers adapted the format of a variety of curriculum activities, converting small group work, partnered activities, and homework into entire class work. Less often, teachers had students complete small group work in pairs or as individuals. These kinds of format changes often altered the dynamics of an activity and at times affected its purpose.
Complete material at another time. Another format change was to cover curriculum content in a future lesson or through a new homework assignment. Representative of a number of teachers (e.g. WVD-R7_L2; OH-R3_L1&L2; WVD-R13_L5; OH-R1_L10; WVD-R11_L10), OH-R2_L3 stated, “We ran out of time and will begin the next class with this activity.” Similarly, when teachers were unable to cover all the content in a given lesson, some created a new homework assignment to cover that material. OH-S6_L8 explained that an activity sheet “was distributed right before the bell so I gave it as homework, and we finished it the next day.” OH-R4_L1 and KG-R14_L8 also translated material into homework when they were unable to complete it during class time.
Other format changes. Other types of adaptations to the format of curricula content were less frequent and included adding extra time to a lesson component, repeating content, and changing materials (e.g. editing materials to include new cartoons). In all, teachers seemed to be comfortable changing the format of curricula activities to fit their needs.
Reasons for adaptations
Throughout the presentation of what was changed, teachers offered explanations for why they changed material. A formal analysis of the 220 reasons teachers gave for making adaptations revealed two broad themes: constraints the teachers faced and the need to be responsive to students.
Constraints
Among the constraints identified by teachers, the most common were time, institutional, personal, and technical constraints.
Time constraints. By far, the most common reason provided for adaptation was the limited amount of time, with “time” mentioned in 35 percent of the reasons for adaptations (n = 107). Two categories of time issues emerged: too much material to cover (n = 40) and a shortened school schedule (n = 37). Teachers reported “running out of time” and not having “enough time to thoroughly teach” lesson components. These phrases imply that teachers desired to cover material but were unable to due to time constraints.
Many of the teachers who omitted what came toward the end of lessons reported that they did so because they had spent too much time on components toward the beginning of the lesson or did not have enough time allotted to them to cover the entire lesson. As one teacher explained, “I didn't get through the entire review because we ran out of time. We had a great discussion on the risks and consequences part, but we didn't have time for all of the other” (OH-S6_L2).
Other teachers, however, made more conscious decisions about what components of the lesson to include or exclude in the face of time constraints. For example, one teacher omitted the lesson review (which always occurs at the end of each lesson) because she thought it was the least important component. She said, “I thought [reviewing the lesson] was the least important of the lesson [components]. So, if I had to omit something due to time running out, it was going to be this” (OH-R4_L1). Another teacher decided to omit homework throughout the curriculum because she wanted to spend more time in discussion (KG-S11). Some teachers considered students' discussion of ideas to be more important than covering all lesson components. For example, KG-S11_L10 stated that she originally planned to cover the concept of asking for help, but “students really enjoyed the eco-map [lesson component] and I felt it was important to allow them time to do this because I felt it was valuable information.” Similarly, OH-R4_L2 stated, “I wanted to leave time for the game so I didn't go into detail and did not ask all of the questions that were going to lead to a discussion.” Evidence from our data indicates that some teachers were conscientious about which lesson components they would cover – proactively determining how to adapt in the face of time constraints. Other teachers, however, seemed to respond to the dictates of the situation, allowing students' participation and time constraints to determine what adaptations would be made.
Institutional constraints. Teachers reported adapting 19 percent of lessons due to various institutional constraints. As part of schools, teachers must comply with school-wide activities and policies, which can force teachers to adapt curricula. For example, school assemblies were cited as a reason some teachers cut lessons short. One class was cut short due to a fire drill: sirens sounded over the loudspeaker and everyone evacuated the room (KG-R14). Grading restrictions were also mentioned by teachers as a reason for adapting lessons. Homework, in particular, was mentioned by several teachers. Some teachers (e.g. KG-S11) thought the homework was an ineffective teaching tool. Another teacher (WVD-S9) had promised his students that he would never have homework in his class, so he made turning in homework optional. Still other teachers (e.g. KG-R6) explained that their classes were not for credit so they did not assign homework. In our sample, there appeared to be some resistance to using homework in prevention curricula, and perhaps in schooling more generally.
Personal constraints. Personal reasons for adaptation were provided for 8 percent of the lessons and included teacher comfort or agreement with curriculum material, forgetfulness, and teacher-error. Although few, teacher errors included forgetting to pass out handouts, overlooking portions of the lesson, misreading lesson plans, using wrong slides (lesson 3 instead of lesson 4), and simply forgetting to cover or assign a lesson component. A teacher's personal disagreement with the lesson content also factored into observed adaptations. One teacher stated, “I don't agree with the AVOID section of the video, so I explained to the class the better choice, than to hang out at an underage drinking party and fake drinking/smoking, is probably just to not go” (WVD-R10_L1). Another teacher critiqued the video's fit with the lesson. “We also discussed what made the video bad and how it could be improved. The video was done poorly and didn't follow the lessons that were taught about passive, aggressive, and assertive styles and having body language match what you are saying” (KG-R6_L5). Other teachers thought that some of the class activities could be better accomplished without handouts. They believed that printing slips of paper for an activity might be a waste of paper, so they eliminated that portion, had students raise hands, or put material in a powerpoint slide rather than on handout. Teachers sometimes reported omitting the video as a form of class discipline; one teacher reasoned that “the video is the part that the students enjoy the most and I felt that they did not earn the honor of watching it” (KG-S11_L7).
Technical constraints. In 10 percent of the lessons, adaptations occurred due to technical issues, such as problems with the DVD, power outages, “copies didn't arrive from central copy,” “had to move due to a band concert being set up in the auditorium,” and incomplete curriculum guides/lesson plans. These technical constraints delayed the progress of the lesson and required teachers to adapt.
Responsiveness to students
A second general reason teachers gave for adapting lessons was responsiveness to their students' general disposition and engagement with curriculum material. These reasons accounted for 15 percent of teachers' stated reasons for making adaptations.
Responsiveness to students' disposition. Teachers reported adapting lessons to increase students' comprehension and to adjust to students' attention spans. For example, one teacher (KG-S11_L3&L8) adapted the curriculum due to student reading levels. She read some material aloud instead of requiring students to read directions and scenarios themselves. WVD-S7_L3 also “read stories aloud and gave students time to rank them” in order to increase “comprehension of the stories.” Another teacher taught students about the “Explain” strategy and decided to omit the subsequent activity because “students were just grasping the idea of ‘Explain’ so I thought throwing the additional topic of ‘I statements’ would confuse them” (WVD-R7AC_L5). One teacher said he shortened an introductory discussion of risks and consequences “because my kids cannot sit this long in a discussion and listen” (KG-R6_L2). OH-R4_L3 concurred, saying that “Sometimes [my] students can […] easily get off subject so I just asked a few of the [questions].” Other teachers thought the students “goof around too much” (WVD-R10_L6) or were an “immature group” (KG-S11_L1) and not able to finish an activity. These examples demonstrate that teachers sometimes adapted lesson content and format in response to their students' disposition or based on their evaluation of the age-appropriateness of curricula content.
Responsiveness to students' engagement. Teachers also adjusted lessons due to students' responses to the material. At times, teachers tried to facilitate greater student participation. One teacher adapted lessons by changing an activity from a volunteer-based role play to having all students work in groups on the role play and then randomly selecting one group to present their role play to the class (KG-S8_L7). Another teacher adapted the powerpoint to include extra discussion questions (KG-S11_L1-10). Relatedly, teachers sometimes stated that because their students participated actively (e.g. had a good discussion of one lesson component), they did not have time to complete another lesson component. When the students wanted to share a lot or when there was good discussion, some teachers let it continue rather than rushing through the lesson components. A number of teachers reported that students were excited about the Guessing Game (KG-R14_L2; OH-R3_L2; WVD-R13_L2) and the Norms Questionnaire (KG-S8_L8). Student excitement and participation in these activities prevented teachers from fully covering these lesson components. Conversely, at times teachers reported that their students did not participate or complete their assignments and therefore teachers could not cover the material in the way the curricula stipulated. For example, if the majority of students did not complete their homework assignment, reviewing it in pairs was impossible. Instead, one teacher had a few students share their homework with the class (KG-R14_L4). Teachers exhibited flexibility in what they covered during any given lesson based on their students' response to that particular lesson content.
The current analyses expand the existing literature on the types of and reasons for prevention implementation adaptation. Not only do teachers adapt curricula in many ways, they do so for a variety of reasons. The implications for health education curriculum developers and practitioners are discussed in the next section.
Conclusions and implications
Adaptations can be innovative. As Botvin (2004) discussed, modifying curricula to fit the target students can increase engagement and enhance the effects of the intervention. On the other hand, outcome-based research suggests that omitting essential elements of a program reduces effectiveness. Regardless of where one stands in this debate, assessing fidelity of implementation provides researchers with insight into how drug prevention curricula are taught under real-world conditions (Ennett et al., 2011). Moreover, understanding implementation processes can be crucial to guiding refinements in interventions (Dusenbury et al., 2003), such as structuring intervention curricula, training and supporting implementers, and evaluating implementation quality.
Structuring intervention curricula
The majority of the adaptations in this study were logistical rather than philosophical and reactive rather than proactive. In keeping with other studies of adaptation and fidelity (e.g. Moore et al., in press), this research shows that time constraints are one of the most commonly reported reasons for adapting curriculum lessons. On average, the lessons in this study were prescribed to run between 40 and 45 minutes but in practice were delivered in approximately 35 minutes. Based on this finding, we recommend that developers of school-based prevention curricula plan lessons accordingly.
One way to accommodate teachers' time constraints would be to develop lessons for no more than 30 minutes. Simply shortening lessons, however, still might subject them to interruption. Another, more flexible, curriculum design would be to format lessons as two 20-minute, three 15-minute, or even four 10-minute modules. Such a modularized curriculum design might give teachers options for delivery that would fit with the varied delivery timetables (course rotations, block scheduling, etc.) and delivery contexts they experience. Modules might also account for the numerous, often uncontrollable, interruptions that impinge on class time (e.g. snow days, school assemblies, etc.). The curriculum could recommend completing modules in sequence, or alternatively, prioritizing one module over another. That is, health educators could be instructed to emphasize which core elements to retain. Core elements are regarded as features in the intervention program that are presumed responsible for the effectiveness (Kelly et al., 2000) and fundamentally define its nature (Backer, 2001). Thus, the goal would be to maintain core elements while allowing implementer flexibility in the delivery of less essential elements.
Findings reported here reveal teachers' preferences for particular components and suggest which ones may not be essential. For example, homework and lesson reviews were commonly omitted or avoided and were perceived to be of little value. However, because the outcome data from this research are currently being analyzed, the “core” elements of keepin' it REAL cannot yet be determined. Given the non-experimental, correlational design, this type of evaluation can only be speculative. Experimental designs such as those proposed by developers of the MOST system (Collins et al., 2007, 2011) are still needed to more completely identify and understand what would be considered core lesson components of kiR.
Training and support
A perennial recommendation made by curriculum developers is to encourage high fidelity in curricula delivery through training. While we agree that this is sound advice, evidence consistently shows it is not heeded or is impossible to achieve; therefore, we also recommend developing more comprehensive as well as novel approaches to training and support. The typology of adaptations identified in this study (altering the delivery of the lesson by revising the delivery timetable or delivery context; changing content of the lesson by removing, partially covering, revising, or adding content; and altering the designated format of the lesson) is much broader and more inclusive than the a priori adaptations seen in previous literature. These types of changes can help guide future implementer training by addressing each of them.
Our findings also illuminated issues related to competence, including delivery skill, comfort with material, and technical problems. One area that may need particular emphasis is the difference between demonstrating skills and providing actual skills practice, given our finding that many teachers altered the delivery format in ways that eliminated practice components. Training should emphasize the need for students to actually practice skills when a program aims for skills mastery, especially since information-only approaches to prevention have been shown to be ineffective (for a review, see Tobler et al., 2000). A logical flow of lesson material from concept presentation, to skill demonstration, to skill practice should likely be maintained, as it corresponds with research suggesting how learners best process new information (e.g. Kirschner et al., 2006; Sweller et al., 2007).
Since initial training is unlikely to eliminate all of the observed problems related to adaptations, more extensive follow-up training and/or technical support modules addressing delivery as well as content issues may reduce the frequency of such problems. Recent investigations that vary training components – enhanced training (e.g. Caldwell et al., 2012; Rohrbach et al., 1993), training coupled with personalized implementation coaching (Dusenbury et al., 2010; Ringwalt et al., 2009a), and training with ongoing technical support (Downer et al., 2009) – show promise for increasing fidelity and, in turn, improving program outcomes.
Other types of training and support might be geared toward equipping program implementers to develop or modify program activities that both align with their own teaching style and remain true to evidence-based program objectives (see Pettigrew et al., 2013). Teachers in our sample appeared to pattern their adaptations on their teaching and classroom management styles. It seems reasonable, then, to recommend different sets of adaptations for teachers with different teaching styles. Although theory and research suggest that an authoritative teaching style coupled with student-centered management techniques may create an optimal learning environment (see Fraser, 1998; Freiberg and Lamb, 2009; Walker, 2008; Wentzel, 2002), not all teachers exhibit this style. Therefore, sets of planned adaptations for a variety of styles, presented during training and supported throughout implementation, may facilitate positive program outcomes for teachers with diverse preferences, experiences, and skills. A controlled study that measures teacher efficacy is needed to determine the value of customizing materials for different teaching styles.
Finally, some adaptations were made to better meet the needs of students; teachers responded to their students' cognitive abilities as well as their engagement with curriculum activities. One of the most commonly reported adaptations was the insertion of implementers' personal experiences and stories. Rather than discourage all adaptation, we believe that empowering implementers to make philosophy-consistent adaptations may be warranted. However, training in what counts as “philosophy-consistent” would be needed. In the case of added personal narratives, for example, implementers may need training to identify appropriate experiences to share and guidelines for managing students' reciprocal sharing of their own personal experiences. While beyond the scope of the current paper, these issues should be addressed in future research.
Evaluating implementation quality
Another area illuminated by findings from this study is the evaluation of implementation quality. This study outlined various types of adaptation to program delivery, content, and format, providing a descriptive or typological basis for future observational coding research. In addition, the use of time stamps to evaluate time spent on various activities can be tested in future research to establish implementation quality norms. Finally, it is clear that teacher reports underestimated the number of changes made when implementing curriculum lessons, and future research is needed to clarify these discrepancies.
So, what aspects of implementation are important? Are all of these areas equally important? Heretofore, research on fidelity has tended to focus on issues of curriculum content, asking whether the spirit of the program was delivered by a competent and engaged teacher to a participatory class (for a review, see Dusenbury et al., 2003). We propose that implementation research in general, and studies of preventive interventions in particular, need to move beyond this question and explore instead which types of adaptations are made, in what contexts, and for what reasons. As interventions are taken to scale and delivered by a variety of teachers in numerous settings, answers to these questions can help guide program developers and practitioners.
Acknowledgments
The authors would like to thank the students and schools who participated in this study. They are also grateful to the other members of the research team for their involvement with this project [M. Colby, K. Glunt, S. Mizenko, T. Tanner, and C. Terwilliger]. This publication was supported by Grant Number R01DA021670 from the National Institute on Drug Abuse to The Pennsylvania State University (Michael Hecht, Principal Investigator). Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the National Institutes of Health.
Contributor Information
Michelle Miller-Day, Email: millerda@chapman.edu, Chapman University, Orange, California, USA.
Jonathan Pettigrew, University of Tennessee, Knoxville, Tennessee, USA.
Michael L. Hecht, Pennsylvania State University, University Park, Pennsylvania, USA.
YoungJu Shin, Indiana University-Purdue University Indianapolis, Indianapolis, Indiana, USA.
John Graham, Pennsylvania State University, University Park, Pennsylvania, USA.
Janice Krieger, The Ohio State University, Columbus, Ohio, USA.
References
- Backer TE. Finding the Balance: Program Fidelity and Adaptation in Substance Abuse Prevention: A State-Of-The-Art Review. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration; Rockville, MD: 2001.
- Basen-Engquist K, O'Hara-Tompkins N, Lovato CY, Lewis MJ, Parcel GS, Gingiss P. The effect of two types of teacher training on implementation of Smart Choices: a tobacco prevention curriculum. Journal of School Health. 1994;64(8):334–339. doi: 10.1111/j.1746-1561.1994.tb03323.x.
- Baumrind D. The development of instrumental competence through socialization. In: Pick A, editor. Minnesota Symposia on Child Psychology. Vol. 7. University of Minnesota Press; Minneapolis, MN: 1973. pp. 3–46.
- Beets M, Flay B, Vuchinich S, Acock A, Li KK, Allred C. School climate and teachers' beliefs and attitudes associated with implementation of the positive action program: a diffusion of innovations model. Prevention Science. 2008;9(4):264–275. doi: 10.1007/s11121-008-0100-2.
- Bond GR, Evans L, Salyers MP, Williams J, Kim HW. Measurement of fidelity in psychiatric rehabilitation. Mental Health Services Research. 2000;2(2):75–87. doi: 10.1023/a:1010153020697.
- Botvin GJ. Advancing prevention science and practice: challenges, critical issues, and future directions. Prevention Science. 2004;5(1):69–72. doi: 10.1023/b:prev.0000013984.83251.8b.
- Breitenstein SM, Fogg L, Garvey C, Hill C, Resnick B, Gross D. Measuring implementation intervention fidelity: ensuring application to practice for youth and families in a community-based parenting intervention. Nursing Research. 2010a;59(3):158–165. doi: 10.1097/NNR.0b013e3181dbb2e2.
- Breitenstein SM, Gross D, Garvey CA, Hill C, Fogg L, Resnick B. Implementation fidelity in community-based interventions. Research in Nursing & Health. 2010b;33(2):164–173. doi: 10.1002/nur.20373.
- Caldwell L, Smith E, Collins L, Graham J, Lai M, Wegner L, Vergnani T, Matthews C, Jacobs J. Translational research in South Africa: evaluating implementation quality using a factorial design. Child and Youth Care Forum. 2012;41(2):119–136. doi: 10.1007/s10566-011-9164-4.
- Chen H. Theory-Driven Evaluations. Sage; Thousand Oaks, CA: 1990.
- Colby M, Hecht M, Miller-Day M, Krieger J, Syvertsen A, Graham J, Pettigrew J. Adapting school-based substance use prevention curriculum through cultural grounding: an exemplar of adaptation processes for rural schools. American Journal of Community Psychology. 2013;51(1-2):190–205. doi: 10.1007/s10464-012-9524-8.
- Collins LM, Baker TB, Mermelstein RJ, Piper ME, Jorenby DE, Smith SS, Christiansen BA, Schlam TR, Cook JW, Fiore MC. The multiphase optimization strategy for engineering effective tobacco use interventions. Annals of Behavioral Medicine. 2011;41(2):208–226. doi: 10.1007/s12160-010-9253-x.
- Collins LM, Murphy SA, Strecher V. The Multiphase Optimization Strategy (MOST) and the Sequential Multiple Assignment Randomized Trial (SMART): new methods for more potent e-health interventions. American Journal of Preventive Medicine. 2007;32:S112–S118. doi: 10.1016/j.amepre.2007.01.022.
- Dane AV, Schneider BH. Educational environments for students with emotional and behavioral disorders. In: Brooks BL, Sabatino DA, editors. Contemporary Interdisciplinary Interventions for Children With Emotional/Behavioral Disorders. Carolina Academic Press; Durham, NC: 1998. pp. 113–142.
- Dariotis JK, Bumbarger BK, Duncan LG, Greenberg MT. How do implementation efforts relate to program adherence? Examining the role of organizational, implementer, and program factors. Journal of Community Psychology. 2008;36(6):744–760.
- Downer JT, Locasale-Crouch J, Hamre B, Pianta R. Teacher characteristics associated with responsiveness and exposure to consultation and online professional development resources. Early Education & Development. 2009;20(3):431–455. doi: 10.1080/10409280802688626.
- Dryfoos J. Adolescents at Risk: Prevalence and Prevention. Oxford University Press; New York, NY: 1990.
- Durlak JA. Why program implementation is important. Journal of Prevention & Intervention in the Community. 1998;17(2):5–18.
- Durlak JA, Weissberg RP, Pachan M. A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. American Journal of Community Psychology. 2010;45:294–309. doi: 10.1007/s10464-010-9300-6.
- Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Education Research. 2003;18:237–256. doi: 10.1093/her/18.2.237.
- Dusenbury L, Brannigan R, Hansen WB, Walsh J, Falco M. Quality of implementation: developing measures crucial to understanding the diffusion of preventive interventions. Health Education Research. 2005;20(3):308–313. doi: 10.1093/her/cyg134.
- Dusenbury L, Hansen WB, Jackson-Newsom J, Pittman DS, Wilson CV, Nelson-Simley K, Ringwalt C, Pankratz M, Giles SM. Coaching to enhance quality of implementation in prevention. Health Education. 2010;110(1):43–60. doi: 10.1108/09654281011008744.
- Ennett ST, Haws S, Ringwalt CL, Vincus AA, Hanley S, Bowling JM, Rohrbach LA. Evidence-based practice in school substance use prevention: fidelity of implementation under real-world conditions. Health Education Research. 2011;26(2):361–371. doi: 10.1093/her/cyr013.
- Ennett ST, Ringwalt CL, Thorne J, Rohrbach LA, Vincus A, Simons-Rudolph A, Jones S. A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science. 2003;4(1):1–14. doi: 10.1023/a:1021777109369.
- Fagan A, Hanson K, Hawkins J, Arthur M. Bridging science to practice: achieving prevention program implementation fidelity in the community youth development study. American Journal of Community Psychology. 2008;41(3-4):235–249. doi: 10.1007/s10464-008-9176-x.
- Fraser BJ. Classroom environment instruments: development, validity and applications. Learning Environments Research. 1998;1(1):7–34.
- Freiberg HJ, Lamb SM. Dimensions of person-centered classroom management. Theory Into Practice. 2009;48(2):99–105.
- Gottfredson D. Schools and Delinquency. Cambridge University Press; Cambridge: 2001.
- Greenberg M, Domitrovich C, Graczyk P, Zins J. The Study of Implementation in School-Based Preventive Interventions: Theory, Research and Practice. Center for Mental Health Services, Substance Abuse and Mental Health Services Administration, US Department of Health and Human Services; Washington, DC: 2001.
- Griner D, Smith TB. Culturally adapted mental health interventions: a meta-analytic review. Psychotherapy: Theory, Research, Practice, Training. 2006;43(4):531–548. doi: 10.1037/0033-3204.43.4.531.
- Hayes AF, Krippendorff K. Answering the call for a standard reliability measure for coding data. Communication Methods and Measures. 2007;1(1):77–89.
- Hecht ML, Miller-Day MA. 'Applied' aspects of the drug resistance strategies project. Journal of Applied Communication Research. 2010;38(3):215–229. doi: 10.1080/00909882.2010.490848.
- Hecht ML, Graham JW, Elek E. The drug resistance strategies intervention: program effects on substance use. Health Communication. 2006;20(3):267–276. doi: 10.1207/s15327027hc2003_6.
- Hecht ML, Marsiglia FF, Elek E, Wagstaff DA, Kulis S, Dustman P, Miller-Day M. Culturally-grounded substance use prevention: an evaluation of the keepin' it R.E.A.L. curriculum. Prevention Science. 2003;4(4):233–248. doi: 10.1023/a:1026016131401.
- Hill LG, Maucione K, Hood BK. A focused approach to assessing program fidelity. Prevention Science. 2007;8(1):25–34. doi: 10.1007/s11121-006-0051-4.
- Hohmann AA, Shear MK. Community-based intervention research: coping with the 'noise' of real life in study design. American Journal of Psychiatry. 2002;159(2):201–207. doi: 10.1176/appi.ajp.159.2.201.
- Kelly JA, Heckman TG, Stevenson LY, Williams PN, Ertl T, Hays RB, Leonard NR, O'Donnell L, Terry MA, Sogolow ED, Neumann MS. Transfer of research-based HIV prevention interventions to community service providers: fidelity and adaptation. AIDS Education and Prevention. 2000;12(5):87–98.
- Kirschner PA, Sweller J, Clark RE. Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist. 2006;41(2):75–86.
- Knoche LL, Sheridan SM, Edwards CP, Osborn AQ. Implementation of a relationships-based school readiness intervention: a multidimensional approach to fidelity measurement for early childhood. Early Childhood Research Quarterly. 2010;25(3):299–313. doi: 10.1016/j.ecresq.2009.05.003.
- Lillehoj CJ, Griffin KW, Spoth R. Program provider and observer ratings of school-based preventive intervention implementation: agreement and relation to youth outcomes. Health Education & Behavior. 2004;31(2):242–257. doi: 10.1177/1090198103260514.
- Moore JE, Bumbarger BK, Rhoades BL. Examining adaptations of evidence-based programs in natural contexts. Journal of Primary Prevention. In press. doi: 10.1007/s10935-013-0303-6.
- Odom SL, Fleming K, Diamond K, Lieber J, Hanson M, Butera G, Horn E, Palmer S, Marquis J. Examining different forms of implementation in early childhood curriculum research. Early Childhood Research Quarterly. 2010;25(3):314–328. doi: 10.1016/j.ecresq.2010.03.001.
- Ozer EJ, Wanis MG, Bazell N. Diffusion of school-based prevention programs in two urban districts: adaptations, rationales, and suggestions for change. Prevention Science. 2010;11(1):42–55. doi: 10.1007/s11121-009-0148-7.
- Paulson SE, Marchant GJ, Rothlisberg BA. Early adolescents' perceptions of patterns of parenting, teaching, and school atmosphere: implications for achievement. The Journal of Early Adolescence. 1998;18(1):5–26.
- Pettigrew J, Miller-Day M, Shin Y, Hecht ML, Krieger JR, Graham JW. Describing teacher-student interactions: a qualitative assessment of teacher implementation of the 7th grade keepin' it REAL substance use intervention. American Journal of Community Psychology. 2013;51(1-2):43–56. doi: 10.1007/s10464-012-9539-1.
- Ringwalt CL, Ennett S, Vincus A, Simons-Rudolph A. Students' special needs and problems as reasons for the adaptation of substance abuse prevention curricula in the nation's middle schools. Prevention Science. 2004a;5(3):197–206. doi: 10.1023/b:prev.0000037642.40783.95.
- Ringwalt CL, Ennett S, Johnson R, Rohrbach LA, Simons-Rudolph A, Vincus A, Thorne J. Factors associated with fidelity to substance use prevention curriculum guides in the nation's middle schools. Health Education & Behavior. 2003;30(3):375–391. doi: 10.1177/1090198103030003010.
- Ringwalt C, Hanley S, Ennett ST, Vincus AA, Bowling JM, Haws SW, Rohrbach LA. The effects of No Child Left Behind on the prevalence of evidence-based drug prevention curricula in the nation's middle schools. Journal of School Health. 2011;81(5):265–272. doi: 10.1111/j.1746-1561.2011.00587.x.
- Ringwalt CL, Pankratz MM, Hansen WB, Dusenbury L, Jackson-Newsom J, Giles SM, Brodish PH. The potential of coaching as a strategy to improve the effectiveness of school-based substance use prevention curricula. Health Education & Behavior. 2009a;36(4):696–710. doi: 10.1177/1090198107303311.
- Ringwalt CL, Vincus A, Ennett S, Johnson R, Rohrbach LA. Reasons for teachers' adaptation of substance use prevention curricula in schools with non-white student populations. Prevention Science. 2004b;5(1):61–67. doi: 10.1023/b:prev.0000013983.87069.a0.
- Rohrbach LA, Graham JW, Hansen WB. Diffusion of a school-based substance abuse prevention program: predictors of program implementation. Preventive Medicine. 1993;22(2):237–260. doi: 10.1006/pmed.1993.1020.
- Rohrbach LA, Grana R, Sussman S, Valente TW. Type II translation: transporting prevention interventions from research to real-world settings. Evaluation & the Health Professions. 2006;29(3):302–333. doi: 10.1177/0163278706290408.
- Rohrbach LA, Gunning M, Sun P, Sussman S. The Project Towards No Drug Abuse (TND) dissemination trial: implementation fidelity and immediate outcomes. Prevention Science. 2010;11(1):77–88. doi: 10.1007/s11121-009-0151-z.
- Sloboda Z, Stephens RC, Stephens PC, Grey SF, Teasdale B, Hawthorne RD, Williams J, Marquette JF. The adolescent substance abuse prevention study: a randomized field trial of a universal substance abuse prevention program. Drug and Alcohol Dependence. 2009;102(1-3):1–10. doi: 10.1016/j.drugalcdep.2009.01.015.
- Sweller J, Kirschner PA, Clark RE. Why minimally guided teaching techniques do not work: a reply to commentaries. Educational Psychologist. 2007;42(2):115–121.
- Szulanski G, Winter S. Getting it right the second time. Harvard Business Review. 2002;80(1):62–69.
- Tobler NS, Stratton HH. Effectiveness of school-based drug prevention programs: a meta-analysis of the research. The Journal of Primary Prevention. 1997;18(1):71–128.
- Tobler NS, Roona MR, Ochshorn P, Marshall DG, Streke AV, Stackpole KM. School-based adolescent drug prevention programs: 1998 meta-analysis. The Journal of Primary Prevention. 2000;20(4):275–336.
- Walker JMT. Looking at teacher practices through the lens of parenting style. The Journal of Experimental Education. 2008;76(2):218–240.
- Wentzel KR. Are effective teachers like good parents? Teaching styles and student adjustment in early adolescence. Child Development. 2002;73(1):287–301. doi: 10.1111/1467-8624.00406.
- Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology. 2008;41(3-4):327–350. doi: 10.1007/s10464-008-9165-0.
- Ringwalt C, Vincus A, Hanley S, Ennett S, Bowling J, Rohrbach L. The prevalence of evidence-based drug use prevention curricula in US middle schools in 2005. Prevention Science. 2009b;10(1):33–40. doi: 10.1007/s11121-008-0112-y.
- Substance Abuse and Mental Health Services Administration. NREPP: SAMHSA's National Registry of Evidence-based Programs and Practices. Available at: www.nrepp.samhsa.gov/ (accessed 30 June 2011).