Abstract
Translational research (Pentz, Jasuja, Rohrbach, Sussman, & Bardo, 2006; Woolf, 2008) is concerned with moving advances in prevention science into everyday practice in communities, yet there are few models for ensuring this transfer of knowledge. Communities That Care (CTC) provides a planned, structured, and data-driven system that trains community prevention coalitions to select evidence-based programs and replicate them with strong implementation fidelity. This paper describes the implementation of the CTC prevention system in 12 communities participating in the Community Youth Development Study.
The results indicated that intervention communities enacted, on average, 90% of the core components of the CTC system, and achieved high rates of implementation fidelity when replicating school, afterschool, and parent training programs. These results held over time; communities successfully launched their prevention coalitions and programs and maintained the quality of their prevention services over five years. These results indicate that the CTC system can be used to foster translational research.
Evidence from randomized, controlled evaluations and other research trials indicates that interventions focused on reducing risk factors and enhancing protective factors can prevent adolescent problem behaviors. School-based curricula, certain structured afterschool activities, community- and home-based parent training programs, and community-wide prevention strategies have all been found to be effective in reducing adolescent drug use, delinquency, violence, and other problem behaviors (Hawkins & Catalano, 2004; Mihalic, Fagan, Irwin, Ballard, & Elliott, 2004; Sherman, Gottfredson, MacKenzie, Eck, Reuter, & Bushway, 1997; Welsh & Farrington, 2006).
Despite their availability, such programs have not been widely disseminated in communities; many schools and service agencies use prevention approaches that have little or no evidence of effectiveness (Ennett, Ringwalt, Thorne, Rohrbach, Vincus, Simons-Rudolph, & Jones, 2003; Gottfredson & Gottfredson, 2002; Kumpfer & Alvarado, 2003; Wandersman & Florin, 2003). For example, Kumpfer and Alvarado (2003) estimated that only about 10% of practitioners implement family-focused programs that have evidence of effectiveness, and a national study of schools (Ringwalt, Ennett, Vincus, Thorne, Rohrbach, & Simons-Rudolph, 2002) found that 82% reported using drug prevention curricula, but only 27% were implementing programs that had evidence of success in reducing substance use.
Even when organizations select a tested, effective prevention approach, they often do not implement it with fidelity and fail to adhere to the standards delineated by program designers (Gottfredson & Gottfredson, 2002; Hallfors & Godette, 2002; Kumpfer & Alvarado, 2003). Kumpfer and Alvarado (2003) reported that only about one-fourth of practitioners implement family-focused programs with integrity, and a national assessment of school-based prevention programming (Hallfors & Godette, 2002) found that only 19% of districts were implementing curricula with fidelity, while the other districts delivered programs with untrained teachers, without the required materials, or with misspecification of the population to be served (e.g., targeting high-risk students with universal programs).
The development and testing of strategies for disseminating effective preventive interventions and improving the quality of community-based replications of these interventions are priorities for prevention research (Glasgow, Lichtenstein, & Marcus, 2003; Pentz et al., 2006; Saul, Duffy, Noonan, Lubell, Wandersman, Flaspohler, Stillman, Blachman, & Dunville, 2008; Spoth & Greenberg, 2005; Wandersman, 2003). Few models have been identified that are effective in facilitating translational research (Pentz et al., 2006; Rohrbach et al., 2006; Woolf, 2008) or in helping to bridge the gap between science and practice (Arthur & Blitz, 2000; Saul et al., 2008). More research is needed to investigate processes through which practitioners gain access to tested and effective interventions and put them into everyday use.
Community coalitions have the potential for translating scientific principles into effective prevention action (Butterfoss, Goodman, & Wandersman, 1993; Specter, 2008; Wandersman, 2003; Wandersman & Florin, 2003). While some research has cautioned that community coalition initiatives are difficult to enact (Wandersman & Florin, 2003), and some studies have indicated that coalition efforts have not achieved significant improvements in healthy youth behavior (Collins, Johnson, & Becker, 2007; Flewelling, Austin, Hale, LaPlante, Liebig, Piasecki, & Uerz, 2005; Hallfors, Cho, Livert, & Kadushin, 2002), there is emerging evidence that coalitions can prevent the development of youth drug use and delinquency (Feinberg, Greenberg, Osgood, Sartorius, & Bontempo, 2007; Spoth, Redmond, Shin, Greenberg, Clair, & Feinberg, 2007). The PROSPER (Promoting School/community-university Partnerships to Enhance Resilience) model, for example, demonstrated improvements in youth outcomes using a collaborative approach that linked research scientists with community teams comprised of staff from local land-grant university extension offices, teachers or administrators from public schools, health and social service providers, parents, and youth (Spoth, Guyll, Lillehoj, Redmond, & Greenberg, 2007). A randomized, controlled evaluation of this model showed that the 14 PROSPER communities successfully implemented selected parent training and school-based drug prevention curricula (Spoth et al., 2007) and achieved greater reductions in student substance use initiation and past-year marijuana and inhalant use compared to the 14 control communities (Spoth et al., 2007).
Evidence that coalitions can be effective and recognition that the widespread reduction of youth problem behaviors necessitates community-wide prevention efforts have increased calls for community-driven, collaborative approaches to prevention (Specter, 2008; Woolf, 2008). Yet, more information is needed to understand the extent to which communities can effectively form, maintain, and sustain coalitions, the challenges faced during coalition operations, and coalition-driven efforts to select and successfully implement tested and effective prevention strategies. This paper describes these issues using data from a process evaluation of 12 communities utilizing the Communities That Care (CTC) prevention system (Hawkins & Catalano, 1992). Unlike the PROSPER model, which focuses on improving youth outcomes through land-grant university extension services, educational infrastructures, and school-based preventive interventions (Spoth, Greenberg, Bierman, & Redmond, 2004), CTC relies on broad-based, diverse community coalitions to implement a range of prevention practices to address community-specific needs (Hawkins & Catalano, 1992; Hawkins, Catalano, & Arthur, 2002). It provides communities with an organizational structure and methodology for facilitating the transmission of prevention science concepts and practices to community action (Hawkins & Catalano, 1992).
The utility of the CTC system in reducing youth problem behaviors is being investigated in a randomized, controlled evaluation of 24 communities in the Community Youth Development Study (Hawkins, Catalano, Arthur, Egan, Brown, Abbott, & Murray, 2008). Previous papers have described implementation of the CTC process in the 12 intervention communities during the first 2 to 3 years of this study, when coalitions were forming and new prevention strategies were being adopted and implemented (Fagan, Hanson, Hawkins, & Arthur, 2008; Quinby, Fagan, Hanson, Brooke-Weiss, Arthur, & Hawkins, 2008). This paper extends this research by evaluating communities’ fidelity to the CTC system and selected prevention programs over five years. Two research questions are addressed: (a) To what extent did communities faithfully replicate and maintain the core components of the CTC system? (b) To what extent were the specific prevention programs selected by communities implemented and maintained with fidelity?
The answers to these questions provide information on the utility of relying on community-based coalitions to enact translational research. We describe implementation of the CTC system over five years in order to identify challenges associated with early coalition formation, organization, goal-setting, and planning, all of which can be difficult to achieve (Wandersman & Florin, 2003), and to investigate coalition maintenance over time. Long-term coalition sustainability has received little study, but it is recognized that maintenance is difficult because coalitions may face turnover in membership, shifts in goals and priorities, and waxing and waning of members’ time and support for prevention (Feinberg, Meyer Chilenski, Greenberg, Spoth, & Redmond, 2007; Leviton, Herrera, Pepper, Fishman, & Racine, 2006).
We also describe the processes by which coalitions manage the delivery of specific prevention activities. Although a focus on implementing tested, effective programs has been linked to coalition effectiveness (Flewelling et al., 2005; Hallfors et al., 2002), there has been little research regarding how well coalitions oversee program replications. In addition, while attention to program implementation issues is increasing (Durlak & DuPre, 2008; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005), many evaluations have focused on barriers to program adoption or on challenges faced during early stages of implementation. Comparatively little is known regarding the maintenance and sustainability of preventive innovations (Saul et al., 2008), particularly programs overseen by local coalitions. A five-year analysis of implementation is important because it often takes several years before new preventive innovations are fully adopted and integrated into agencies and organizations, and studies that do not investigate implementation processes over several years may fail to accurately measure the true extent of fidelity (Fixsen et al., 2005).
Method
The Community Youth Development Study
The Community Youth Development Study (CYDS) is a 10-year community randomized trial of the efficacy of the Communities That Care prevention operating system in reducing adolescent delinquency and drug use. Twenty-four small- to medium-sized communities in seven states participated in the study, 12 implementing CTC and 12 control sites conducting prevention services as usual. From 2003 to 2008, each of the 12 intervention communities was provided with funding for a full-time CTC Coordinator and up to $75,000 annually to replicate research-based prevention programs targeting fifth- to ninth-grade students and their families (Hawkins, Catalano, Arthur, Egan, Brown, Abbott, & Murray, 2008).
Training and technical assistance (TA) in the CTC model was provided by certified CTC trainers and research staff at the Social Development Research Group (SDRG) at the University of Washington. SDRG staff provided TA primarily via weekly or biweekly phone calls to the local CTC Coordinators. In total, 2,295 calls were conducted, and each site received an average of 191 telephone TA contacts over the first 5 years of the study. Site visits were made one to three times per year, with 145 total visits during the study (an average of 12 per community). In addition, written or email communication was conducted as needed to discuss CTC implementation issues.
The Communities That Care System
The Communities That Care (CTC) operating system provides a planned and structured framework for diverse community partners to utilize advances in prevention science. CTC coalition members participate in a series of six CTC training workshops in which they learn to enact the five phases of the system: (a) assessing community readiness to undertake collaborative prevention efforts; (b) forming a diverse and representative prevention coalition (i.e., the CTC Board); (c) using community-level epidemiologic data to assess prevention needs; (d) choosing evidence-based prevention policies, practices, and programs to implement, based on the data assessment; and (e) implementing the new innovations with fidelity, in a manner congruent with the programs’ theory, content, and methods of delivery.
By bringing together multiple partners, the CTC Board allows for a pooling of financial and human resources to undertake prevention, and the group training workshops provide coalition members with a common language and dedicated time to discuss community prevention needs. CTC trains community members to base their actions on evidence, rather than on political views or personal whims. The system relies on the collection and analysis of community surveillance data using the CTC Youth Survey. The CTC Youth Survey can be administered in local schools to students in Grades 6 through 12 in order to provide valid and reliable self-reported measures of community, family, school, peer, and individual risk and protective factors, as well as drug use and delinquency (Arthur, Hawkins, Pollard, Catalano, & Baglioni, 2002; Glaser, Van Horn, Arthur, Hawkins, & Catalano, 2005). Coalitions analyze their data and identify areas of greatest need (i.e., elevated risk factors and depressed protective factors), then select programs and policies with evidence of effectiveness (as described in the CTC Prevention Strategies Guide, https://preventionplatform.samhsa.gov) to address these needs. After programs are selected, communities attend the last CTC training workshop to learn strategies for monitoring program implementation quality.
During the maintenance phase of CTC, coalitions are expected to ensure ongoing recruitment and training of coalition members, and to repeat administration of the CTC Youth Survey, analyze these data, and revise prevention activities when necessary to ensure success in meeting prevention goals. In this manner, the CTC protocols provide communities with a logical, specific monitoring and feedback system to ensure that prevention efforts are implemented with fidelity (Fixsen et al., 2005).
Measures
The extent to which the five phases of CTC were fully implemented in the 12 intervention communities is the focus of this paper. The instruments used to measure implementation fidelity of the CTC system as a whole (see Quinby et al., 2008) and the selected prevention programs (see Fagan et al., 2008; Fagan, Hanson, Hawkins, & Arthur, 2008) are summarized below.
CTC System Fidelity
Implementation of the CTC system by intervention communities was assessed according to completion of milestones and benchmarks, which are the core elements associated with the five phases of CTC. The milestones are goals to be met by communities, and the benchmarks are the specific actions that community members take or conditions that must be present to achieve those goals. To illustrate, during Phase Two, a CTC milestone is: “A Community Board has been developed.” One benchmark in this process is: “Potential Community Board members have been identified and recruited from a diverse, representational list.”
The extent of benchmark completion in each phase of CTC was rated by both the CTC Community Coordinators and the TA providers. Coordinators indicated the extent to which their coalitions completed each benchmark using a dichotomous measure (not achieved or achieved). TA providers rated levels of benchmark completion using a 4-point scale (not at all achieved, somewhat achieved, mostly achieved, or completely achieved), based on their telephone and in-person technical assistance contacts with sites. Coordinators provided ratings of all benchmarks at four time points: December 2004 (Year 1.5, when most communities were just beginning Phase 5); June 2006 (end of Year 3); June 2007 (end of Year 4); and March 2008 (spring of Year 5). TA provider ratings occurred during the same periods, except for the last rating, which occurred in December 2007, at the end of regular TA provision. At each time point, all benchmarks were assessed by both groups, except that at Time 1, communities were rated only through Milestone 5.3 because they had not yet finished Phase 5.
To calculate the degree of CTC implementation, the TA providers’ ratings were recoded as dichotomous measures, contrasting benchmarks that were not or only somewhat achieved with those that were mostly or completely achieved. Then, within each group of raters, scores on the 1 to 8 benchmarks comprising each milestone were averaged across communities. Comparisons of TA providers’ and coordinators’ ratings showed high rates of agreement between the two groups at each time point: across all 12 intervention sites, 88% of benchmarks (range 70–100%) were scored the same at Year 1.5, 94% (range 69–100%) at Year 3, 85% (range 56–100%) at Year 4, and 81% (range 46–100%) at Year 5. Given the similarity in ratings, the overall CTC implementation score was calculated by averaging scores across the two groups of raters. The resulting measure represents the percentage of benchmarks achieved for each milestone in the CTC process across all 12 communities.
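The recoding, averaging, and rater-agreement steps above can be sketched in a few lines. This is an illustrative Python sketch, not study code; the function names and example ratings are hypothetical.

```python
# Illustrative sketch of the CTC implementation scoring described above.
# TA providers' 4-point ratings are dichotomized (0 = not/somewhat achieved,
# 1 = mostly/completely achieved), benchmark scores are averaged within a
# milestone, and the two rater groups' scores are averaged. Data are hypothetical.

def dichotomize_ta(rating):
    """Map a 4-point TA rating (1-4) to 0/1: 3 (mostly) or 4 (completely) = achieved."""
    return 1 if rating >= 3 else 0

def milestone_score(benchmark_scores):
    """Percentage of benchmarks achieved for one milestone (list of 0/1 values)."""
    return 100.0 * sum(benchmark_scores) / len(benchmark_scores)

# Hypothetical milestone with four benchmarks, rated by both groups:
coordinator = [1, 1, 0, 1]             # dichotomous coordinator ratings
ta_raw = [4, 3, 2, 4]                  # 4-point TA provider ratings
ta = [dichotomize_ta(r) for r in ta_raw]

# Inter-rater agreement: share of benchmarks scored the same by both groups
agreement = 100.0 * sum(c == t for c, t in zip(coordinator, ta)) / len(ta)

# Overall implementation score: average of the two groups' milestone scores
overall = (milestone_score(coordinator) + milestone_score(ta)) / 2
```

Under these hypothetical ratings, both groups score the milestone at 75% achieved and agree on all four benchmarks.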
Prevention Program Fidelity
Implementing and monitoring fidelity of prevention programs are major activities that occur in Phase 5 of CTC, which occurred in Years 2 through 5 of the study. Prevention program fidelity was assessed using Phase 5 ratings on the milestones and benchmarks, as described above, and additional instruments which allowed more detailed measurement of the multiple components of implementation fidelity considered important in the literature (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005).
Adherence
Adherence to the programs’ required objectives and core components was measured by fidelity checklists completed by local program implementers. These checklists identified the content and activities to be taught each time the program met (for more structured programs; e.g., classroom-based curricula) or the core components to be implemented during the entire program (for less structured programs; e.g., the Big Brothers/Big Sisters mentoring program). Implementers rated whether or not each objective was taught each session or whether each component was achieved during implementation. The adherence score was calculated as the number of objectives taught or components achieved divided by the total possible, expressed as a percentage. For example, an adherence score of 60% indicated that 6 of 10 required objectives were taught.
Program adherence was calculated as the percentage of objectives met out of all objectives that were to be taught, with reports from all implementers in all communities teaching the program combined. When checklists were not returned, all items on the adherence checklists were treated as missing and not counted in the overall adherence score. When checklists were returned but adherence data on the forms were missing, omitted items were counted as unmet objectives. Missing data were minimal; the percentage of checklists and/or data missing ranged from 2.1% to 8.2% across the four years of delivery of preventive interventions.
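The pooled adherence calculation and its missing-data rules might be sketched as follows (illustrative Python; the data structure and values are hypothetical, not drawn from the study):

```python
# Illustrative adherence calculation following the rules described above:
# an unreturned checklist is excluded entirely, while omitted items on a
# returned checklist count as unmet objectives. Each checklist is a list of
# True/False/None per objective; None marks an item left blank on the form.

def adherence(checklists):
    """Percentage of objectives taught, pooled across returned checklists."""
    met = total = 0
    for cl in checklists:
        if cl is None:          # checklist never returned: exclude from the score
            continue
        for item in cl:
            total += 1          # blank items still count in the denominator...
            if item is True:    # ...but only explicit "met" ratings count as taught
                met += 1
    return 100.0 * met / total

sessions = [
    [True, True, False, None],  # returned; the blank item counts as unmet
    None,                       # not returned; excluded entirely
    [True, True, True, True],
]
rate = adherence(sessions)
```

With these hypothetical checklists, 6 of 8 counted objectives are met, so the pooled adherence score is 75%.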
Community coalition members and implementers’ supervisors were trained to conduct observations of 10% to 15% of program sessions in order to validate self-reported adherence information and provide information on implementation quality and participant responsiveness. Observers completed the same fidelity checklists as program implementers. A reliability score was calculated by comparing the number of objectives on which the observer and implementer agreed on the level of coverage (i.e., both rated the objective as “met” or both rated the objective as “not met”). For example, if a program session had 10 objectives, and both raters indicated that 7 objectives were met but disagreed as to whether the other 3 objectives were covered, the level of agreement would be 70%. Agreement scores were based on all observations conducted for each program, across all communities that implemented it. Missing data were not included in the agreement calculations.
Delivery

Delivery (also referred to as “dosage” in the literature) of the required number, length, and frequency of sessions was assessed using the dates and times of program sessions listed on the fidelity checklists. The overall delivery score was an average of: (a) the percentage of required sessions actually taught versus the total number of sessions; (b) the actual length of the program session compared to the required length (only assessed when classes were shorter than required); and (c) whether or not the specified frequency of sessions was delivered (a dichotomous measure; either achieved or not achieved). Programs for which a required number, length, or frequency of sessions was not specified by developers were coded as missing that component. Delivery scores were calculated each time a program was offered, then averaged across all communities implementing that program.
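A minimal sketch of the three-component delivery score, assuming hypothetical inputs and function names (illustrative only, not study code):

```python
# Illustrative delivery ("dosage") score: the mean of (a) percent of required
# sessions taught, (b) session length relative to the requirement (penalized
# only when sessions ran short), and (c) whether the required frequency was met.
# Components the developer did not specify are treated as missing and skipped.

def delivery_score(sessions_taught=None, sessions_required=None,
                   actual_length=None, required_length=None,
                   frequency_met=None):
    components = []
    if sessions_required is not None:
        components.append(100.0 * sessions_taught / sessions_required)
    if required_length is not None:
        # Only shortfalls count against the score; running long is not penalized.
        components.append(100.0 * min(actual_length, required_length) / required_length)
    if frequency_met is not None:
        components.append(100.0 if frequency_met else 0.0)
    return sum(components) / len(components)

# Hypothetical program: 9 of 10 sessions taught, 45 of 50 required minutes
# per session, weekly frequency met -> mean of 90, 90, and 100.
score = delivery_score(9, 10, 45, 50, True)
```

If a developer specified only the number of sessions, the score would rest on that single component, mirroring the missing-component rule above.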
Quality of delivery
The quality of implementation delivery focused on the teaching skills of the presenter and was assessed by the same community observers who rated program adherence. In addition to completing the fidelity checklists, these observers rated the degree to which the implementer provided clear explanations, kept on time, seemed rushed or hurried, and used stories to illustrate points; as well as the implementer’s knowledge of the program, enthusiasm, poise/confidence, rapport with participants, and ability to answer questions. These nine items, and one item regarding the overall quality of the session, were each rated on a 5-point scale in which higher scores indicated a higher quality delivery of the material. Scores on these items were combined and averaged across all communities implementing each program to form a “quality of delivery” scale (Cronbach’s alpha 0.87 – 0.90 across the 4 years).
Participant responsiveness
Participant responsiveness was also rated by the community observers, who indicated the degree to which participants understood the material and participated in the lesson, both assessed on a 5-point scale. These two items were combined and scores were averaged across all implementers and communities conducting the observed programs.
Program participation
Participation in school programs was based on the number and percentage of students in the targeted grade who attended school each day during the semester(s) in which the program was taught, as reported by school officials. Program implementers recorded attendance at each session for parent training and afterschool programs. Overall program participation refers to the number of students (in afterschool and school-based programs) or families (for parent training programs) who attended at least one program session. Retention refers to the percentage of all participants who attended at least 60% of all sessions that were offered.
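The participation and retention definitions above can be sketched as follows (illustrative Python; the attendance data and names are hypothetical):

```python
# Illustrative participation and retention counts, following the definitions
# above: a participant counts as served after attending at least one session;
# "retained" means attending at least 60% of the sessions offered.

def participation_and_retention(attendance, sessions_offered):
    """attendance: dict mapping participant id -> number of sessions attended."""
    served = [p for p, n in attendance.items() if n >= 1]
    retained = [p for p in served
                if attendance[p] / sessions_offered >= 0.60]
    retention_rate = 100.0 * len(retained) / len(served) if served else 0.0
    return len(served), retention_rate

# Hypothetical parent training program with 7 sessions offered:
families = {"A": 7, "B": 2, "C": 0, "D": 5}   # sessions attended per family
served, retention = participation_and_retention(families, 7)
```

Here three of four families attended at least one session, and two of those three met the 60% retention threshold.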
Results
Implementation of the CTC System
Table 1 lists the milestones that represent the core components of the CTC system, and the percentage of benchmarks comprising each milestone that were rated as achieved by the intervention communities using combined responses from Community Coordinators and Technical Assistance providers. The results indicate a very high level of implementation of the CTC system during both the adoption and maintenance stages of implementation. Approximately 90% of the benchmarks were completed across all four time points and each phase of CTC. These scores did not vary much across the four time points (see Table 1). There was a small increase in the percentage of benchmarks completed from Year 1.5 (93%) to Year 3 (96%), followed by small decreases at Year 4 (89%) and Year 5 (87%).
Table 1.
Percentage of CTC Benchmarks Achieved in Each Milestone for Intervention Communities (n=12) in the Community Youth Development Study
| CTC Milestone | Time 1 (Year 1.5) | Time 2 (Year 3) | Time 3 (Year 4) | Time 4 (Year 5) |
|---|---|---|---|---|
PHASE ONE | 89 | 96 | 91 | 91 |
1.1 The community is organized to begin CTC | 93 | 97 | 95 | 93 |
1.2 The parameters of the CTC effort have been defined | 85 | 98 | 93 | 95 |
1.3 Community readiness issues have been identified | 86 | 96 | 89 | 88 |
1.4 Community readiness issues have been analyzed and either addressed, or a plan for addressing them has been developed | 92 | 94 | 88 | 90 |
PHASE TWO | 90 | 94 | 84 | 84 |
2.1 Key Leaders (positional and informal) have been engaged | 93 | 94 | 78 | 81 |
2.2 A Community Board has been developed to facilitate assessment, prioritization, selection, implementation, and evaluation of tested, effective programs, policies and practices | 96 | 96 | 89 | 88 |
2.3 The community has been educated and involved in the CTC process | 82 | 92 | 85 | 82 |
PHASE THREE | 100 | 99 | 88 | 83 |
3.1 The Community Board has the capacity to conduct a community assessment and prioritization | 100 | 100 | 90 | 89 |
3.2 Community assessment information has been collected and prepared for prioritization | 100 | 98 | 90 | 83 |
3.4 Priority risk and protective factors have been identified | 100 | 100 | 90 | 88 |
3.5 A resource assessment and gaps analysis has been conducted | 99 | 100 | 82 | 72 |
PHASE FOUR | 96 | 98 | 94 | 89 |
4.1 The Community Board has the capacity to create a focused community action plan | 89 | 97 | 99 | 86 |
4.2 The desired outcomes of the plan have been identified, based on the community assessment data | 100 | 100 | 100 | 96 |
4.3 Tested, effective programs, policies, and practices have been selected to address priority risk factors and protective factors and fill gaps | 94 | 100 | 98 | 88 |
4.4 Implementation plans for each program, policy, or practice to be implemented have been developed | 100 | 100 | 97 | 94 |
4.5 An evaluation plan has been developed | 100 | 100 | 100 | 100 |
4.6 A written community action plan has been developed | 92 | 92 | 71 | 73 |
PHASE FIVE | 90 | 91 | 85 | 83 |
5.1 The role of the Key Leader Board, the Community Board, and stakeholder groups in implementing and evaluating the plan has been specified | 86 | 99 | 90 | 79 |
5.2 Implementers of new programs, policies, or practices have the necessary skills, expertise, and resources to implement with fidelity | 98 | 92 | 88 | 83 |
5.3 Implement new programs, policies, or practices with fidelity | 86 | 95 | 93 | 93 |
5.4 Program-level outcome evaluations are conducted for each program cycle | - | 100 | 96 | 93 |
5.5 Systematic and comprehensive actions are taken to inform the community about the prevention programs and to engage community members in those programs | - | 81 | 77 | 78 |
5.6 Systematic and comprehensive actions are taken to inform the community about the CTC effort and the Social Development Strategy, and to engage community members in supporting healthy youth development | - | 75 | 60 | 56 |
5.7 The CTC Board remains active, holding regular Board and Work Group meetings | - | 86 | 84 | 79 |
5.8 Community-level assessments are conducted at least every 2 years | - | 99 | 91 | 91 |
5.9 Observed improvements in risk and protective factors and child and adolescent well-being are shared and celebrated | - | 96 | 88 | 96 |
TOTAL | 93 | 96 | 89 | 87 |
NOTE: Scores are based on combined ratings from Technical Assistance providers and local CTC Coordinators
The results clearly indicated that across communities, the majority of CTC core components were achieved. There were only two cases in which average scores on milestones were below 60%, an implementation threshold identified in a meta-analysis of intervention research (Durlak & DuPre, 2008) as associated with a greater likelihood of achieving positive results among participants. Both cases involved the same milestone: at Times 3 and 4, communities had some difficulty achieving Milestone 5.6. This component of CTC advocates community-wide adoption of the Social Development Strategy (Catalano & Hawkins, 1996), in which coalitions encourage community members to provide young people with opportunities and skills to be involved in prosocial activities and then to recognize youth for their efforts.
When completion of the individual CTC benchmarks was examined (results not shown), the results indicated that only 6 (2%) of the 373 rated benchmarks were achieved by fewer than 60% of the communities at any of the four time points. More conservatively, only 18 (5% of the total) benchmarks were completed by fewer than 70% of the communities. Of these 18 cases, four were the benchmarks associated with Milestone 5.6, described above. Five cases were associated with repeating the core CTC trainings. Although all communities received all CTC trainings during the first 18 months of the study, the CTC system does not specify a particular time frame for retraining community members, and some coalitions did not repeat trainings in later years. The other relatively low-rated benchmarks involved creating a specific plan for youth involvement in the coalition, assessing community resources in Phase 3, distributing the coalition’s prevention plan to the larger community, reaching the designated population with prevention programming, and encouraging coalition members and key leaders in the community to recruit children and families into prevention programs. All but one of the 18 cases received these low benchmark ratings at the third or fourth time points, suggesting that communities had somewhat greater challenges maintaining these prevention activities over time than implementing them initially.
Implementation of Prevention Programs
During Years 2–5 of the study, communities implemented a variety of prevention programs as part of their implementation of the CTC system. Based on their analyses of local student survey data and programs available to target their prioritized risk and protective factors, the 12 intervention communities selected a unique combination of 1 to 4 prevention programs to implement during Year 2. In subsequent years, 10 of the 12 communities added one or more new programs and most continued their original selections, though some programs were discontinued. As shown in Table 2, 17 different school-based, afterschool, and parent training programs were selected and implemented using project funding during the five years. In most cases, the school-based interventions selected by communities were universal interventions implemented school-wide or in all classrooms serving the targeted grade(s); the afterschool programs were selective interventions, serving youth at elevated risk because of academic difficulties (for tutoring programs) or those from single-parent homes (for Big Brothers/Big Sisters); and the parenting programs were universal interventions for all families with children in the target age group. As shown in Table 2, about half of the programs were implemented by multiple communities, resulting in 27 to 34 program replications each year of the study across the 12 CTC communities.
Table 2.
Prevention Programs Implemented, with Average Adherence¹ and Delivery² Scores, in Years 2–5 of the Community Youth Development Study
| Program Name | Communities Yr 2 | Communities Yr 3 | Communities Yr 4 | Communities Yr 5 | Adherence¹ Yr 2 | Adherence¹ Yr 3 | Adherence¹ Yr 4 | Adherence¹ Yr 5 | Delivery² Yr 2 | Delivery² Yr 3 | Delivery² Yr 4 | Delivery² Yr 5 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
School-based | ||||||||||||
All Stars Core | 1 | 1 | 1 | 1 | 93 | 98 | 99 | ns | 93 | 96 | 100 | ns |
Life Skills Training | 2 | 2 | 3 | 2 | 89 | 90 | 91 | 93 | 90 | 88 | 95 | 91 |
Lion’s-Quest Skills for Adolescence | 2 | 3 | 3 | 3 | 73 | 82 | 82 | 92 | 94 | 79 | 76 | na |
Program Development Evaluation | 1 | 1 | 0 | 0 | 93 | 54 | - | - | na | na | - | - |
Project Alert | 0 | 1 | 1 | 1 | - | 95 | 100 | ns | - | 83 | 98 | ns |
Olweus Bullying Prevention | 0 | 1 | 1 | 1 | - | 100 | 100 | ns | - | 68 | 56 | ns |
Project Towards No Drug Abuse | 0 | 0 | 0 | 1 | - | - | - | 58 | - | - | - | 97 |
Afterschool | ||||||||||||
Stay SMART | 3 | 3 | 1 | 1 | 98 | 96 | 97 | 100 | 99 | 97 | 100 | 100 |
Participate and Learn Skills | 1 | 1 | 1 | 2 | 80 | 80 | 100 | ns | 97 | 100 | 100 | ns |
Big Brothers/Big Sisters | 2 | 2 | 2 | 1 | 90 | 100 | 90 | 93 | 75 | 93 | 94 | 92 |
Tutoring | 4 | 6 | 6 | 7 | 91 | 97 | 94 | 92 | 94 | 93 | 89 | 95 |
Valued Youth | 1 | 1 | 1 | 0 | 77 | 95 | 97 | - | 92 | 100 | 100 | - |
Parent Training | ||||||||||||
Strengthening Families 10–14 | 2 | 3 | 3 | 2 | 94 | 95 | 93 | 94 | 100 | 100 | 95 | 100 |
Guiding Good Choices | 6 | 6 | 7 | 7 | 99 | 99 | 97 | 99 | 99 | 100 | 97 | 97 |
Parents Who Care | 1 | 1 | 0 | 0 | 87 | 94 | - | - | 94 | 100 | - | - |
Family Matters | 1 | 1 | 2 | 2 | 93 | 98 | 94 | 96 | 100 | 100 | 100 | 100 |
Parenting Wisely | 0 | 1 | 1 | 2 | - | ns | 100 | 100 | - | na | na | na |
Overall Score | 91 | 94 | 93 | 93 | 94 | 93 | 93 | 95 |
The percentage of required material or core components delivered, averaged across all communities implementing the program
The percentage of the required number, length, and frequency of sessions that were achieved, averaged across all communities implementing the program
ns: no fidelity forms submitted
na: not applicable
Program adherence rates are shown in Table 2. Overall adherence ranged from 91% to 94% across the four years of program implementation. Implementers reported teaching most of the required objectives and delivering the majority of core components in all programs. The validity of these self-reports was assessed against observations by community coalition members and implementers' supervisors. Averaged across all programs and communities, the percentage of material scored identically by observers and implementers was 93% in Year 2, 95% in Year 3, 92% in Year 4, and 97% in Year 5, indicating strong correspondence between observer and implementer reports of program adherence and bolstering confidence in the validity of the implementer reports.
Adherence scores did not generally differ across programs or program types, with two exceptions among the school-based interventions. The Program Development Evaluation (PDE) program was successfully launched in one community in Year 2, but adherence dropped from 93% to 54% in Year 3. PDE is a multicomponent program in which schools conduct a comprehensive needs assessment, select interventions to address problem areas, and then evaluate and refine their action plan. The decline in adherence occurred in part because the small, rural community that selected the program could not hire a coordinator with the expertise required to conduct the program evaluation. Given this lack of capacity to fully implement the model, the CTC coalition in this community decided to discontinue PDE after Year 3. In a different community, Project Towards No Drug Abuse was adopted in Year 5. Its low adherence score (58%) was largely attributable to the regular omission of a game intended to review previously taught course content. Because the game is built into the start and end of each of the 12 sessions, its omission substantially lowered the adherence score.
The program delivery results in Table 2 indicate that program staff successfully delivered the required amount of programming. Across all programs and years, about 94% of the required number, length, and frequency of sessions were achieved. Communities met delivery requirements for parent training programs and most expectations for afterschool programs. Among the school-based programs, the community implementing the Olweus Bullying Prevention Program found it difficult to consistently offer the weekly classroom-based sessions in which teachers and students discuss bullying behaviors and their consequences: 68% of teachers taught weekly sessions in Year 3, and 56% did so in Year 4. While delivery scores for this program were not high, adherence was 100% in Years 3 and 4, as the school did enact all of the core components of the intervention, including school-wide processes, individual meetings with bullies and victims, and the classroom sessions. The final delivery challenge involved the Lion's Quest Skills for Adolescence program, taught in three communities. This program requires 40 sessions, delivered once or twice per week throughout the school year. Across sites, regular delivery of lessons was often interrupted by school holidays, teacher illness, special events (e.g., field trips or assemblies), academic testing, or other issues, which made it difficult for schools in these three communities to achieve both the required number of lessons and the expected frequency of delivery.
According to observer reports, implementers delivered all programs with high quality. As shown in Table 3, across all programs and years of implementation, the average quality score exceeded 4.0 on a 5-point scale, with higher scores representing higher quality of delivery. Scores improved somewhat from Year 2 (4.37) to Year 3 (4.55), then remained consistently high for the remaining two years (4.52 in each year). These results did not vary much by program. Project Alert was the only program delivered with a quality score below 4.0, though its rating (3.79) was still above the scale midpoint.
Table 3.
Observed Quality of Delivery and Participant Responsiveness of Prevention Programs in Years 2–5 of the Community Youth Development Study

| Program Name | Quality of Delivery: Y2 | Y3 | Y4 | Y5 | Participant Response: Y2 | Y3 | Y4 | Y5 |
|---|---|---|---|---|---|---|---|---|
| All Stars Core | 4.61 | 4.75 | 4.51 | ns | 4.41 | 4.75 | 4.40 | ns |
| Life Skills Training | 4.30 | 4.20 | 4.31 | 4.20 | 4.25 | 4.05 | 4.33 | 4.17 |
| Lion's Quest Skills for Adolescence | 4.08 | 4.38 | 4.51 | 4.59 | 4.23 | 4.26 | 4.52 | 4.70 |
| Project Alert | - | ns | 3.79 | ns | - | ns | 3.64 | ns |
| Olweus Bullying Prevention Program | - | 4.80 | 4.70 | ns | - | 4.73 | 4.68 | ns |
| Participate and Learn Skills | 4.72 | 4.60 | 4.56 | ns | 4.71 | 4.61 | 4.59 | ns |
| Stay SMART | 4.30 | 4.67 | 4.56 | 4.88 | 4.12 | 4.46 | 4.40 | 4.63 |
| Strengthening Families 10–14 | 4.48 | 4.41 | 4.55 | 4.80 | 4.61 | 4.61 | 4.65 | 4.71 |
| Guiding Good Choices | 4.27 | 4.61 | 4.65 | 4.67 | 4.28 | 4.54 | 4.62 | 4.52 |
| Parents Who Care | 4.17 | 4.73 | - | - | 4.00 | 4.50 | - | - |
| Overall Score | 4.37 | 4.55 | 4.52 | 4.52 | 4.34 | 4.46 | 4.52 | 4.49 |

Note: Ratings were averaged across all communities implementing each program; scores are based on a 1–5 scale (from lower to higher quality of delivery and responsiveness).
ns: no observation forms submitted
Participant responsiveness was rated by observers as the degree to which participants understood the material and participated in the lesson. Across all programs and years, the average responsiveness score ranged from 4.34 to 4.52 on a 5-point scale, indicating that programs implemented in the intervention communities were well received by participants (see Table 3). These scores were consistently high across programs and across the study years.
The last measured aspect of prevention program implementation fidelity was participation. As shown in Table 4, participation in school programs increased nearly fourfold during Years 2–5, from 1,432 to 5,705 students. According to the National Center for Education Statistics, the total eligible population of sixth-, seventh-, and eighth-grade students in 2005–2006 (the most recent year of available data) across all 12 intervention communities was 10,031. Using this figure, the proportion of youth in Grades 6–8 reached by school programs increased from 14% in Year 2 to 57% in Year 5.
Table 4.
Prevention Program Participation¹ and Retention² in Years 2–5 of the Community Youth Development Study, by Program Type

| Year | School: Participation¹ (N) | Retention² (%) | Afterschool: Participation¹ (N) | Retention² (%) | Parent Training: Participation¹ (N) | Retention² (%) |
|---|---|---|---|---|---|---|
| Year 2 | 1,432 | 96 | 546 | 77 | 517 | 79 |
| Year 3 | 3,886 | 91 | 612 | 81 | 665 | 78 |
| Year 4 | 5,165 | 95 | 589 | 65 | 476 | 79 |
| Year 5 | 5,705 | 94 | 448 | 70 | 379 | 75 |

¹ Participation: the number of students or families attending at least one program session
² Retention: the percentage of participants who attended at least 60 percent of all sessions that were offered
Note: The total eligible population of 6th-, 7th-, and 8th-grade students in 2005–2006 (the most recent year of available data) across all 12 intervention communities was 10,031 (source: the National Center for Education Statistics).
The increase in participation reflects the increase in the number of communities implementing school-based prevention programs over the study. Only five communities initially adopted school-wide interventions and classroom-based curricula intended to prevent adolescent drug use and delinquency, whereas all intervention communities were implementing at least one school program by Year 5 (Fagan, Hawkins, & Catalano, 2008). The participation rates generally represent unique students, as schools served new, incoming cohorts of students with programming in years following program adoption. However, a few communities also served the same students in multiple years, either because the program involved booster sessions (Life Skills Training and Project Alert) or because it was a school-wide initiative that involved all students every year (the Olweus Bullying Prevention Program and Program Development Evaluation).
Participation in afterschool and parent training programs, shown in Table 4, was much more modest. Afterschool programs served approximately 500 students each year, while about 400 to 600 families participated in parent training programs each year. Rates were similar during Years 2–4 but dropped in Year 5. This decrease likely reflects measurement differences rather than a lack of success: because the research study ended in March of Year 5, that year's rates did not include participation during the spring months, whereas other years were based on a full school year of program delivery. Based on the National Center for Education Statistics population estimate, afterschool programs reached 4.5% to 6.1% of the middle school population in the intervention communities, while parent training programs reached 3.8% to 6.6% of the families of middle school students.
Retention of participants was high across all years of program implementation and all types of programs (see Table 4). In school-based programs, nearly all students participated in most lessons, which was expected given that all programs were offered during school hours. Retention in afterschool and parent training programs was also good, with 70% to 80% of participants attending at least 60% of the offered sessions. These results indicate that once students and families were recruited into programs, they attended most sessions.
The Challenge of Participant Recruitment
Participation in afterschool programs was expected to be lower than in school-based programs, given that the former typically targeted at-risk youth while the latter targeted universal populations; participation in parent training programs, however, was lower than expected. The CTC system does not specify a desired reach of prevention programming, but communities participating in this project understood that, in order to achieve community-wide improvements in student outcomes, they would need to reach a relatively large proportion of the population with prevention services. To this end, most communities set goals of providing services to about 20% of families in the targeted age group each year, but these goals were rarely met. Recruitment challenges in parent-focused programs are well documented in prevention research (Bauman, Ennett, Foshee, Pemberton, & Hicks, 2001; Dumka, Garza, Roosa, & Stoerzinger, 1997; Heinrichs, Bertram, Kuschel, & Hahlweg, 2005; Spoth & Redmond, 2002), and recruitment of families into this study's universal parenting programs was very difficult. Although coalition members enacted multiple, varied, and repeated marketing strategies, offered a range of incentives for participation, and attempted to remove barriers to participation, recruitment remained a significant challenge.
Given the considerable human and financial resources needed for recruitment, and their inability to achieve recruitment goals, one community decided to discontinue its parent training program after two years, while another community dropped two of the three family-focused programs it had selected. Three other communities decided that, in addition to their group-based parent training program, they would offer a home-based parent training option (either Family Matters or Parenting Wisely) to try to surmount participation barriers associated with group-based programs (e.g., transportation, child care, competing activities, stigma, etc.). Recruitment challenges also prompted two communities to discontinue their afterschool programs (Stay SMART and Big Brothers/Big Sisters), and a third to replace an afterschool program (Stay SMART) with a school-based curriculum (the Life Skills Training Program).
Including the PDE termination described earlier, eight programs were discontinued in five different communities during the study.^iv In each case, the decision to stop programming was made locally by coalitions and the agencies responsible for service delivery as part of the implementation of the CTC system. CTC trains coalitions to monitor implementation fidelity and make changes when necessary to ensure that prevention goals are being met. In this study, all of these programs were discontinued after repeated attempts by coalitions to overcome obstacles associated with program delivery, most notably recruitment challenges. In all cases, these communities chose to re-invest their prevention resources in programs they believed they were better equipped to implement.
Discussion
This paper has described the implementation of the Communities That Care system by the 12 intervention communities participating in the Community Youth Development Study. The data indicated that all 12 communities faithfully enacted the CTC prevention system and achieved high rates of implementation fidelity when replicating a variety of school, afterschool, and parent training programs. These results held over time: communities successfully formed coalitions, launched prevention programs, and maintained the quality of their prevention services over the five years of the study. The only significant and consistent challenge faced by communities was recruiting sufficient numbers of participants into universal parent training programs and, to a lesser extent, selective afterschool programs.
The findings from this study help to document the extent to which communities implemented the intervention being evaluated in the Community Youth Development Study. The CYDS has shown improvements in targeted risk factors and in the initiation and prevalence of drug use and delinquency for youth in Grades 5 through 8 in intervention versus control communities (Hawkins, Brown, Oesterle, Arthur, Abbott, & Catalano, 2008; Hawkins, Oesterle, Brown, Arthur, Abbott, Fagan, & Catalano, In Press). This process evaluation helps to improve confidence that the student outcomes can be attributed to the implementation of CTC in the 12 intervention sites.
More generally, the findings contribute to emerging knowledge regarding translational research and indicate that coalitions have promise for reducing the development of problem behaviors. In this study, communities relied on the CTC model to form broad-based, prevention-oriented coalitions whose primary goal was to use epidemiologic data to select tested, effective prevention strategies that targeted their community-specific needs and to implement these activities with fidelity. CTC sites were successful in forming and maintaining active coalitions. Intervention communities implemented about 90% of the core components of the CTC model at each of the four time points assessed. Moreover, CTC coalitions were able to ensure very high rates of adherence and delivery of most program sessions, and observers rated implementers as effective facilitators who engaged participants. These results are more favorable than the low rates of adherence and dosage reported from community agencies replicating programs outside the tight controls of scientific efficacy trials (Gottfredson & Gottfredson, 2002; Hallfors & Godette, 2002; Kumpfer & Alvarado, 2003). While the CTC sites in this study were participating in an efficacy trial, communities were responsible for all aspects of program implementation, including arranging program training workshops, hiring and supervising staff, and taking corrective actions when necessary to improve delivery.
While most of the implementation fidelity scores were high, it should be noted that the methodology used to assess implementation quality in this study has limitations. Measures of CTC system and program implementation were based in part on self-reports of project staff, who may have overestimated their performance due to social desirability (Lillehoj, Griffin, & Spoth, 2004; Melde, Esbensen, & Tusinski, 2006). Furthermore, self-reports of program implementation were validated against information collected by community observers, who may also have been motivated to report favorable outcomes. Ideally, objective, trained research staff would have observed program sessions and rated adherence to CTC system- and program-level requirements. However, in this project, implementation monitoring procedures were deliberately designed so that they could be understood, utilized, and sustained by community-based coalitions. Rather than relying on videotapes of community meetings or university research staff to conduct ratings, procedures typically used in efficacy research, we designed an implementation monitoring system that could be used by any community utilizing the CTC prevention system.
Despite the scientific limitations of these measures, the program-level adherence rates reported here are comparable to those demonstrated in two similar research projects. In the Blueprints Initiative (Elliott & Mihalic, 2004), university staff indicated that 74% of the 42 sites involved in the project successfully implemented all the core components of the eight replicated violence prevention programs (Mihalic & Irwin, 2003), and community observers rated teachers as delivering an average of 81% to 86% of all required material in the Life Skills Training drug prevention curriculum in over 400 schools (Fagan & Mihalic, 2003). In the PROSPER trial, implementer adherence was rated by trained community members, who indicated that an average of 90% of program content was delivered by implementers of parent training programs and school-based curricula (Spoth et al., 2007).
The current process evaluation was conducted under conditions which may have made system and program fidelity more likely. All intervention communities had full-time, paid coordinators, participated in all required CTC training workshops, received up to $275,000 over 4 years to enact prevention programs, and had regular phone contact and periodic site visits from technical support providers for the 5 years of the study. Other research has indicated that monetary resources, training workshops, and proactive technical assistance may be necessary to ensure successful coalition operations and high quality prevention programming (Feinberg, Bontempo, & Greenberg, 2008; Greenberg, Feinberg, Gomez, & Osgood, 2005; Mihalic & Irwin, 2003). Given that these elements were present in the current study, and that positive outcomes were seen in both this process evaluation and in student outcomes through Grade 8 (Hawkins et al., In Press), we recommend that future translational research also ensure that financial incentives, training, and TA are provided to communities.
Even with these resources, community coalitions and program practitioners in this study faced significant challenges when implementing the CTC system and prevention programs, and it is important that communities be aware of these obstacles. Some CTC benchmarks were difficult for some coalitions to achieve, particularly promoting widespread adoption of the Social Development Strategy (which encouraged adults to provide opportunities and reinforcement for youth prosocial involvement in communities), repeating CTC training workshops to ensure that coalition members and community stakeholders understood the CTC prevention model, and publicizing prevention plans and activities throughout the community. In addition, many coalitions struggled to convince schools to adopt new programs, and it took four years before all communities had done so. Reports from program facilitators and observers indicated that some implementers did not teach all required program lessons or material, and some failed to facilitate lessons in a high-quality manner that elicited strong participant responsiveness. Furthermore, it was very difficult for communities to achieve desired rates of program participation, especially in universal parent training programs.
These challenges sometimes led coalitions to discontinue particular prevention programs. However, in contrast to other evaluations of coalition-based prevention efforts (Collins et al., 2007; Flewelling et al., 2005; Hallfors et al., 2002; Wandersman & Florin, 2003), the problems did not result in systematic program drift or failure to achieve overall prevention goals. By implementing CTC faithfully (i.e., ensuring careful prevention planning, actively monitoring implementation efforts, and taking corrective actions when necessary), the intervention communities were proactive and took action to solve problems. Even when the challenges were great enough to lead to program discontinuation, the overall CTC prevention effort was not significantly damaged; instead, communities shifted their resources to other tested prevention activities they felt had a greater likelihood of success.
While it is clearly not easy to translate scientific protocols and recommendations into community-based practices, or to achieve community-wide improvements in youth outcomes, the findings from this study are encouraging. These data suggest that the CTC system has potential for increasing community capacity to enact successful prevention activities. CTC is now owned by the federal government, and CTC training materials are available from the Center for Substance Abuse Prevention's Prevention Platform (https://preventionplatform.samhsa.gov/Macro/CSAP/dss_portal/Templates_redesign/start.cfm). These accessible, free materials provide a unique opportunity to disseminate the model widely throughout the United States. Findings from this process evaluation, as well as early findings demonstrating improved youth outcomes, indicate that communities committed to implementing the CTC system in its entirety, and given adequate training, technical assistance, and resources, have great potential to promote healthy youth development and prevent problem behaviors among young people community-wide.
Acknowledgments
This work was supported by a research grant from the National Institute on Drug Abuse (R01 DA015183-01A1) with co-funding from the National Cancer Institute, the National Institute of Child Health and Human Development, the National Institute of Mental Health, and the Center for Substance Abuse Prevention. The content of this paper is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies. Findings from this paper were presented at the 2008 Society for Prevention Research conference in San Francisco, CA. The authors wish to acknowledge the contributions of the communities participating in the Community Youth Development Study.
Footnotes
i. Two TA providers shared oversight of eight intervention communities during Year 1. During the first assessment they conducted independent ratings of benchmark completion in these communities, then reached agreement through discussion when ratings differed. The other four communities were rated by the one staff member who provided TA, and for the remainder of the study, one TA provider per community rated benchmark completion.

ii. Observations were not conducted for two self-administered parent programs (Family Matters and Parenting Wisely); three programs with one-on-one or small-group administration, which made observation overly intrusive (Big Brothers/Big Sisters, tutoring, and Valued Youth); and two school-wide interventions (Program Development Evaluation and the Olweus Bullying Prevention Program). Observations were conducted for the classroom component of Olweus and for the Participate and Learn Skills program to assess quality of delivery and participant responsiveness, not adherence.

iii. Some community coalitions implemented these programs or one other (Project Northland's Class Action curriculum) using sources of funding other than the study. These programs are not listed in Table 2 because program implementation data were not analyzed as part of the research project (though they may have been collected and/or analyzed by local CTC coalitions).

iv. After their monitoring data failed to demonstrate significant improvements in participants, one other community replaced the Valued Youth tutoring program with a local tutoring model they considered more cost-effective.
References
- Arthur MW, Blitz C. Bridging the gap between science and practice in drug abuse prevention through needs assessment and strategic community planning. Journal of Community Psychology. 2000;28(3):241–255. [Google Scholar]
- Arthur MW, Hawkins JD, Pollard JA, Catalano RF, Baglioni AJ. Measuring risk and protective factors for substance use, delinquency, and other adolescent problem behaviors: The Communities That Care Youth Survey. Evaluation Review. 2002;26(6):575–601. doi: 10.1177/0193841X0202600601. [DOI] [PubMed] [Google Scholar]
- Bauman KE, Ennett ST, Foshee VA, Pemberton M, Hicks K. Correlates of participation in a family-directed tobacco and alcohol prevention program for adolescents. Health Education and Behavior. 2001;28(4):440–461. doi: 10.1177/109019810102800406. [DOI] [PubMed] [Google Scholar]
- Butterfoss FD, Goodman RM, Wandersman A. Community coalitions for prevention and health promotion. Health Education Research. 1993;8(3):315–330. doi: 10.1093/her/8.3.315. [DOI] [PubMed] [Google Scholar]
- Catalano RF, Hawkins JD. The social development model: A theory of antisocial behavior. In: Hawkins JD, editor. Delinquency and crime: Current theories. New York: Cambridge University Press; 1996. pp. 149–197. [Google Scholar]
- Collins D, Johnson K, Becker BJ. A meta-analysis of direct and mediating effects of community coalitions that implemented science-based substance abuse prevention interventions. Substance Use and Misuse. 2007;42:985–1007. doi: 10.1080/10826080701373238. [DOI] [PubMed] [Google Scholar]
- Dumka LE, Garza CA, Roosa MW, Stoerzinger HD. Recruitment and retention of high-risk families into a preventive parent training intervention. The Journal of Primary Prevention. 1997;18(1):25–39. [Google Scholar]
- Durlak JA, DuPre EP. Implementation matters: A review of the research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology. 2008;41(3–4):327–350. doi: 10.1007/s10464-008-9165-0. [DOI] [PubMed] [Google Scholar]
- Dusenbury L, Brannigan R, Hansen WB, Walsh J, Falco M. Quality of implementation: Developing measures crucial to understanding the diffusion of preventive interventions. Health Education Research. 2005;20(3):308–313. doi: 10.1093/her/cyg134. [DOI] [PubMed] [Google Scholar]
- Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prevention Science. 2004;5(1):47–53. doi: 10.1023/b:prev.0000013981.28071.52. [DOI] [PubMed] [Google Scholar]
- Ennett ST, Ringwalt C, Thorne J, Rohrbach LA, Vincus AA, Simons-Rudolph A, et al. A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science. 2003;4(1):1–14. doi: 10.1023/a:1021777109369. [DOI] [PubMed] [Google Scholar]
- Fagan AA, Hanson K, Hawkins JD, Arthur MW. Implementation fidelity of prevention programs replicated in the Community Youth Development Study. American Journal of Community Psychology. 2008;41(3–4):235–249. doi: 10.1007/s10464-008-9176-x. [DOI] [PubMed] [Google Scholar]
- Fagan AA, Hanson K, Hawkins JD, Arthur MW. Implementing effective community-based prevention programs in the Community Youth Development Study. Youth Violence and Juvenile Justice. 2008;6(3):256–278. [Google Scholar]
- Fagan AA, Hawkins JD, Catalano RF. Using community epidemiologic data to improve social settings: The Communities That Care prevention system. In: Shinn M, Yoshikawa H, editors. The power of social settings: Promoting youth development by changing schools and community programs. New York: Oxford University Press; 2008. pp. 292–312. [Google Scholar]
- Fagan AA, Mihalic S. Strategies for enhancing the adoption of school-based prevention programs: Lessons learned from the Blueprints for Violence Prevention replications of the Life Skills Training Program. Journal of Community Psychology. 2003;31(3):235–254. [Google Scholar]
- Feinberg M, Bontempo D, Greenberg M. Predictors and level of sustainability of community prevention coalitions. American Journal of Preventive Medicine. 2008;34:495–501. doi: 10.1016/j.amepre.2008.01.030. [DOI] [PubMed] [Google Scholar]
- Feinberg M, Greenberg MT, Osgood DW, Sartorius J, Bontempo D. Effects of the Communities That Care model in Pennsylvania on youth risk and problem behaviors. Prevention Science. 2007;8(4):261–270. doi: 10.1007/s11121-007-0073-6. [DOI] [PubMed] [Google Scholar]
- Feinberg M, Meyer Chilenski S, Greenberg M, Spoth RL, Redmond C. Community and team member factors that influence the operations phase of local prevention teams: The PROSPER project. Prevention Science. 2007;8(3):214–226. doi: 10.1007/s11121-007-0069-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005. (FMHI Publication #231) [Google Scholar]
- Flewelling RL, Austin D, Hale K, LaPlante M, Liebig M, Piasecki L, et al. Implementing research-based substance abuse prevention in communities: Effects of a coalition-based prevention initiative in Vermont. Journal of Community Psychology. 2005;33(3):333–353. [Google Scholar]
- Glaser RR, Van Horn ML, Arthur MW, Hawkins JD, Catalano RF. Measurement properties of the Communities That Care Youth Survey across demographic groups. Journal of Quantitative Criminology. 2005;21(1):73–102. [Google Scholar]
- Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. American Journal of Public Health. 2003;93(8):1261–1267. doi: 10.2105/ajph.93.8.1261. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Gottfredson DC, Gottfredson GD. Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime and Delinquency. 2002;39(1):3–35. [Google Scholar]
- Greenberg MT, Feinberg ME, Gomez BJ, Osgood DW. Testing a community prevention focused model of coalition functioning and sustainability: A comprehensive study of Communities That Care in Pennsylvania. In: Stockwell T, Gruenewald P, Toumbourou J, Loxley W, editors. Preventing harmful substance use: The evidence base for policy and practice. West Sussex, England: John Wiley and Sons; 2005. pp. 129–142. [Google Scholar]
- Hallfors D, Cho H, Livert D, Kadushin C. Fighting back against substance use: Are community coalitions winning? American Journal of Preventive Medicine. 2002;23(4):237–245. doi: 10.1016/s0749-3797(02)00511-1. [DOI] [PubMed] [Google Scholar]
- Hallfors D, Godette D. Will the “Principles of Effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research. 2002;17(4):461–470. doi: 10.1093/her/17.4.461. [DOI] [PubMed] [Google Scholar]
- Hawkins J, Catalano R, Arthur M, Egan E, Brown E, Abbott R, et al. Testing Communities That Care: Rationale and design of the Community Youth Development Study. Prevention Science. 2008;9(3):178–190. doi: 10.1007/s11121-008-0092-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hawkins JD, Brown E, Oesterle S, Arthur M, Abbott R, Catalano R. Early effects of Communities That Care on targeted risks and initiation of delinquent behavior and substance use. Journal of Adolescent Health. 2008;43(1):15–22. doi: 10.1016/j.jadohealth.2008.01.022. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hawkins JD, Catalano RF. Communities That Care: Action for drug abuse prevention. San Francisco, CA: Jossey-Bass Publishers; 1992. [Google Scholar]
- Hawkins JD, Catalano RF. Communities That Care Prevention Strategies Guide. South Deerfield, MA: Channing Bete Company, Inc; 2004. [Google Scholar]
- Hawkins JD, Catalano RF, Arthur MW. Promoting science-based prevention in communities. Addictive Behaviors. 2002;27:951–976. doi: 10.1016/s0306-4603(02)00298-8. [DOI] [PubMed] [Google Scholar]
- Hawkins JD, Oesterle S, Brown EC, Arthur MW, Abbott RD, Fagan AA, et al. Results of a type 2 translational research trial to prevent adolescent drug use and delinquency: A test of Communities That Care. Archives of Pediatrics &amp; Adolescent Medicine. doi: 10.1001/archpediatrics.2009.141. (In Press) [DOI] [PMC free article] [PubMed] [Google Scholar]
- Heinrichs N, Bertram H, Kuschel A, Hahlweg K. Parent recruitment and retention in a universal prevention program for child behavior and emotional problems: Barriers to research and program participation. Prevention Science. 2005;6(4):276–286. doi: 10.1007/s11121-005-0006-1. [DOI] [PubMed] [Google Scholar]
- Kumpfer KL, Alvarado R. Family-strengthening approaches for the prevention of youth problem behaviors. American Psychologist. 2003;58(6–7):457–465. doi: 10.1037/0003-066X.58.6-7.457. [DOI] [PubMed] [Google Scholar]
- Leviton LC, Herrera C, Pepper SK, Fishman N, Racine DP. Faith in action: Capacity and sustainability of volunteer organizations. Evaluation and Program Planning. 2006;29:201–207. [Google Scholar]
- Lillehoj CJ, Griffin KW, Spoth R. Program provider and observer ratings of school-based preventive intervention implementation: Agreement and relation to youth outcomes. Health Education and Behavior. 2004;31(2):242–257. doi: 10.1177/1090198103260514. [DOI] [PubMed] [Google Scholar]
- Melde C, Esbensen FA, Tusinski K. Addressing program fidelity using onsite observations and program provider descriptions of program delivery. Evaluation Review. 2006;30(6):714–740. doi: 10.1177/0193841X06293412. [DOI] [PubMed] [Google Scholar]
- Mihalic S, Fagan AA, Irwin K, Ballard D, Elliott D. Blueprints for Violence Prevention. Washington, DC: Office of Juvenile Justice and Delinquency Prevention; 2004. [Google Scholar]
- Mihalic S, Irwin K. Blueprints for Violence Prevention: From research to real-world settings: Factors influencing the successful replication of model programs. Youth Violence and Juvenile Justice. 2003;1(4):307–329. [Google Scholar]
- Pentz MA, Jasuja GK, Rohrbach LA, Sussman S, Bardo MT. Translation in tobacco and drug abuse prevention research. Evaluation and the Health Professions. 2006;29(2):246–271. doi: 10.1177/0163278706287347. [DOI] [PubMed] [Google Scholar]
- Quinby R, Fagan AA, Hanson K, Brooke-Weiss B, Arthur MW, Hawkins JD. Installing the Communities That Care prevention system: Implementation progress and fidelity in a randomized controlled trial. Journal of Community Psychology. 2008;36(3):313–332. [Google Scholar]
- Ringwalt C, Ennett ST, Vincus AA, Thorne J, Rohrbach LA, Simons-Rudolph A. The prevalence of effective substance use prevention curricula in U.S. middle schools. Prevention Science. 2002;3(4):257–265. doi: 10.1023/a:1020872424136. [DOI] [PubMed] [Google Scholar]
- Rohrbach LA, Grana R, Sussman S, Valente TW. Type II translation: Transporting prevention interventions from research to real-world settings. Evaluation and the Health Professions. 2006;29(3):302–333. doi: 10.1177/0163278706290408. [DOI] [PubMed] [Google Scholar]
- Saul J, Duffy J, Noonan R, Lubell K, Wandersman A, Flaspohler P, et al. Bridging science and practice in violence prevention: Addressing ten key challenges. American Journal of Community Psychology. 2008;41(3–4):197–205. doi: 10.1007/s10464-008-9171-2. [DOI] [PubMed] [Google Scholar]
- Sherman LW, Gottfredson DC, MacKenzie D, Eck J, Reuter P, Bushway S, editors. Preventing crime: What works, what doesn’t, what’s promising: A report to the United States Congress. Washington, DC: U.S. Department of Justice, Office of Justice Programs; 1997. [Google Scholar]
- Specter A. Making youth violence prevention a national priority. American Journal of Preventive Medicine. 2008;34(3S):S3–4. doi: 10.1016/j.amepre.2007.12.018. [DOI] [PubMed] [Google Scholar]
- Spoth RL, Greenberg M, Bierman K, Redmond C. PROSPER community-university partnership model for public education systems: Capacity-building for evidence-based, competence-building prevention. Prevention Science. 2004;5(1):31–39. doi: 10.1023/b:prev.0000013979.52796.8b. [DOI] [PubMed] [Google Scholar]
- Spoth RL, Greenberg MT. Toward a comprehensive strategy for effective practitioner-scientist partnerships and larger-scale community health and well-being. American Journal of Community Psychology. 2005;35(3/4):107–126. doi: 10.1007/s10464-005-3388-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Spoth RL, Guyll M, Lillehoj CJ, Redmond C, Greenberg M. PROSPER study of evidence-based intervention implementation quality by community-university partnerships. Journal of Community Psychology. 2007;35(8):981–999. doi: 10.1002/jcop.20207. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Spoth RL, Redmond C. Project Family prevention trials based in community-university partnerships: Toward scaled-up preventive interventions. Prevention Science. 2002;3(3):203–222. doi: 10.1023/a:1019946617140. [DOI] [PubMed] [Google Scholar]
- Spoth RL, Redmond C, Shin C, Greenberg M, Clair S, Feinberg M. Substance use outcomes at eighteen months past baseline from the PROSPER community-university partnership trial. American Journal of Preventive Medicine. 2007;32(5):395–402. doi: 10.1016/j.amepre.2007.01.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wandersman A. Community science: Bridging the gap between science and practice with community-centered models. American Journal of Community Psychology. 2003;31(3–4):227–242. doi: 10.1023/a:1023954503247. [DOI] [PubMed] [Google Scholar]
- Wandersman A, Florin P. Community intervention and effective prevention. American Psychologist. 2003;58(6–7):441–448. doi: 10.1037/0003-066x.58.6-7.441. [DOI] [PubMed] [Google Scholar]
- Welsh B, Farrington DP, editors. Preventing crime: What works for children, offenders, victims and places. Berlin, Germany: Springer; 2006. [Google Scholar]
- Woolf SH. The meaning of translational research and why it matters. Journal of the American Medical Association. 2008;299(2):211–213. doi: 10.1001/jama.2007.26. [DOI] [PubMed] [Google Scholar]