Author manuscript; available in PMC: 2014 Apr 29.
Published in final edited form as: Clin Pediatr (Phila). 2011 Nov;50(11):995–1000. doi: 10.1177/0009922811407183

Implementation of effective health innovations and pediatricians

Bonita Stanton, Linda Kaljee, Sonja Lunn, Lynette Deveaux, Xiaoming Li, Xinguang Chen, Sylvie Naar-King, Carole Harris, Ambika Mathur, Deepak Kamat
PMCID: PMC4004516  NIHMSID: NIHMS574924  PMID: 22008708

Introduction

Over the last decade much attention has been directed to the need for the public to benefit from medical and behavioral research. The recognized importance of moving scientific discovery to clinical and public health practice has inspired countless publications and serves as the foundation for the National Institutes of Health (NIH) Roadmap Initiative.1, 2, 3 "Translation Research" can be categorized into "Translation 1" activities (developing and determining the efficacy of interventions derived from more basic biomedical, behavioral or epidemiologic discovery) and "Translation 2" activities (bringing efficacious programs to practice).4, 5 Under this broad mandate, Translation 2 activities include at least two phases: 1) further effectiveness trials of efficacious interventions; and 2) the diffusion (defined as the uncontrolled or natural spread of innovations), dissemination (defined as purposeful or deliberate efforts to move an innovation) and implementation (defined as adoption or utilization of the innovation) of efficacious programs that may or may not have been further validated as effective.6, 7, 8

Pediatricians may be interested and/or involved in many aspects of translation research depending on the nature of their practice, but all will be involved to some degree in implementation, whether as a researcher, a health care provider, or both. While probably aware of the importance of implementing new advances in health care, many clinicians may not be aware of the issues encountered in this final and critically important step of the research chain. Until recently, implementation generally had been conceptualized as the final step of the broader processes of diffusion or dissemination; it typically had not been the primary focus of the research or field activity6, 7, 9 and had received little attention during medical school or residency.

Implementation of an effective intervention, whether a procedure, a medication, a device or a program designed to change, prevent or reinforce a behavior (behavioral intervention), requires focused attention on identifying exactly what aspects of the intervention have contributed to efficacious outcomes and therefore need to be retained in subsequent implementations to disparate audiences. While it may seem logical that to retain its effectiveness an intervention must be implemented (reproduced) “exactly as it was” when it was demonstrated to be effective, clinicians and scientists have come to realize that often it is not clear what it means to reproduce an intervention “exactly like it was” when it was demonstrated to work. Key questions for the implementation component of translational research are:

1. What occurs in practice when a new product or intervention is introduced?

2. What are the critical components of the intervention that resulted in its effectiveness?

3. What factors are related to increased fidelity?

Addressing these three questions about any intervention that has been shown to improve health outcomes under different conditions is complex; the process is particularly vexing with regard to interventions that are largely or exclusively behavioral in nature, as is the case for many prevention programs, adherence programs, etc. Behavioral interventions frequently involve multiple sessions delivered by interventionists with different levels of training and experience and differing personality and communication styles. Whereas quite specific requirements and procedures have been established by regulatory organizations such as the Food and Drug Administration for defining safe and effective regimens for the delivery of new medications, vaccines and medical devices10, the process for defining safe and effective behavioral interventions has not been subject to the same scrutiny and rigor. Effective behavioral interventions have been developed that address important pediatric public health issues including obesity treatment and prevention11, medication compliance12, violence, pregnancy and STD prevention13, and responding to bullying14; lagging behind is guidance to pediatricians regarding their implementation in a manner that will retain their effectiveness. Accordingly, for the remainder of this paper we shall discuss the important area of implementation by addressing the three questions noted above, with a particular focus on interventions that are behavioral in nature.

What occurs in practice when a new product or intervention is introduced?

There is ample documentation that when effective biomedical interventions are delivered in a practice (as opposed to a research) setting, they generally are not delivered as they were under the research conditions in which they were found to be effective15. For example, an estimated two-thirds of Americans who are prescribed medications fail to take some or all of the medication16. This simple, common occurrence, well known to most clinicians, illustrates many of the issues confronted in implementing medical innovations. Determining whether in fact a patient has been compliant with a medication prescription is not necessarily straightforward. Beyond the basic question of whether the medication was taken at all, what additional parameters must be met for a patient to be "compliant"? If the patient takes a medication one hour late, is he still compliant? If yes, then what about two hours late? If in a 10-day course of medication the patient misses one dose, is she compliant? Does it matter if the missed dose was on day 2 versus day 9? Do these variations from protocol reflect a lack of compliance on the part of the physician; that is, did the physician describe exactly when the medication should be taken, ascertain whether or not the patient understood, and assess the issues that the patient foresaw that might make following this regimen difficult?

As complex as assessing the adequacy of adherence to a medication prescription (a largely biomedical intervention) may be, it may be even more difficult to assess adherence to an intervention that is primarily behavioral. First, the process for defining the parameters of the effective intervention that is to be replicated has not been well delineated; that is, how do we decide what part or parts of a behavioral intervention need to be identical in order to represent a "replication" rather than a "reinvention" (or adaptation) of a previously successful intervention? A vast if not universal experience in the implementation of previously developed behavioral interventions is that new "adaptations" or "reinventions" will occur as the intervention is implemented in a new setting. For example, a process and outcome evaluation of a crime prevention program, "Community Works," conducted across 14 schools found that only 16% of the sessions were delivered "with a high degree of fidelity," defined as "delivered in a manner intended by program developers".17, p. 725 The authors observed that the majority of the curricular changes appeared to be dictated by local needs and resources. Further, they suggested that some changes are needed and recommended that program developers articulate "acceptable degrees of variation," although this is not further specified.17, p. 735

The recognition that behavioral interventions are changed as they are implemented is not new. A Rand report issued in the 1970s entitled "Implementation of Educational Innovation" found that among a representative sample of US teachers, 79.8% reported adapting their prevention curricula18. Indeed, a vast literature exists on the phenomenon of local change to interventions. Whether it is called "adaptation," "reinvention," "cultural adaptation" or "culturally appropriate modification," all of these terms acknowledge that some change appears to be part of the process of implementing interventions18-23.

What are the critical components of the intervention that resulted in its effectiveness?

Translation research is predicated on the importance of effective interventions being adopted by users; as a corollary, the more widespread the use of new medical innovations, the greater the likelihood that the intervention is reaching the target population(s) and improving health. However, as indicated in the examples above, part of the process of adopting new innovations appears to be some degree of adaptation or modification of these interventions. If this is the case, then what is meant by "fidelity of implementation"? When is some change necessary, and when does that change become too much? Despite widespread acceptance of the importance of "fidelity" in intervention delivery, implementation fidelity has been loosely defined. Definitions applied to implementation fidelity have included combinations of each of the following characteristics: adherence to the theoretical guidelines underpinning the intervention; attention to completeness and dosage of content; retention of unique features of the program; the quality of the delivery of the intervention; and engagement of the participants.23-26 There is a need to achieve consensus around an operational definition of "implementation fidelity" before we can begin to address what and how much change should be tolerated.

An operational definition of fidelity must address the dilemma that while some degree of adaptation is inevitable (and probably necessary) when an intervention is implemented in new settings,18-23 modification cannot be limitless or undefined and still be considered consistent with "fidelity". Therefore, implementation researchers have moved toward new strategies for identifying those elements of effective interventions that are believed to be critical to their positive impacts on health.

In the study of the school-based crime-prevention program "Community Works" noted above17, the researchers asked the developers of the intervention program to specifically articulate its key elements. They note that "ambiguity on the part of program providers concerning vital versus adaptable program components decreases the probability a program will be properly transferred across sites".17, p. 736 While this approach offers a partial solution to the question of "what is the intervention," it presumes that the intervention developers actually know which elements of their intervention are "critical" and which are adaptable, and have communicated this information in a format that is accessible to new implementers. To the contrary, this assumption has limited empirical backing. Program developers and program implementers operating in isolation from one another have been observed to permit deletions, alterations and even additions that may seriously undermine intervention effectiveness27. There has been less concern about the impact of additions to an intervention during adaptation (such as more information, or new stories or activities to underscore a point); in fact, a small body of research suggests that local additions to an intervention may not detract from, and may even enhance, effectiveness28, 29. As an example, in a new setting, additional activities were added to an HIV prevention curriculum targeting adolescents to address a specific local issue concerning the intersection of alcohol use and sexual risk behaviors30. An effectiveness study revealed that these additions did not appear to undermine the efficacy of the intervention and contributed toward an additional positive health outcome (decreased alcohol use).

The Centers for Disease Control and Prevention (CDC) is engaged in identifying the critical components of effective interventions. The CDC has developed specific programs, including the "Diffusion of Effective Behavioral Interventions" (DEBI) program, which actively identifies interventions that have been demonstrated through randomized controlled trials to be effective and specifically promotes dissemination of these effective programs. The DEBI initiative is developing a structured process to codify critical aspects of effective behavioral interventions by identifying essential components or "Core Elements". The Core Elements are those components of the intervention thought to be critical aspects of the original effective program without which it is unlikely that the program would remain effective. Core Elements are identified by the developers of the intervention and by DEBI staff who are experienced in intervention implementation and packaging30, 31. Core Elements have been categorized into implementation elements (the logistics of establishing a positive learning environment), pedagogy elements (the teaching processes used), and content elements (factual information, values and other knowledge transferred)32. The DEBI paradigm advises that in implementing the effective program, if changes are made from the original, the new implementation protocol should retain all of the identified Core Elements.

The framework provided by the CDC's DEBI program offers a reasonable state-of-the-art definition of fidelity of intervention delivery, one that recognizes both the importance of remaining true to the original effective intervention and the need for some change to reflect differing circumstances or conditions. The DEBI process specifically focuses attention on adherence to the Core Elements of an intervention as identified by a consortium of persons knowledgeable about the theoretic underpinning and performance of the intervention (the program developers), individuals knowledgeable about intervention implementation, and individuals knowledgeable about packaging. However, this definition of fidelity is "experimental": we have yet to determine how accurately we can identify Core Elements and/or whether other activities or aspects of the intervention that are not identified as Core Elements, and are therefore deleted or changed, may adversely affect outcomes. It would seem prudent, therefore, that during this phase of defining terms, researchers and implementers attempt to include and monitor all activities in curricula/interventions that have been demonstrated to be effective (even those not considered to be Core Elements). Further, it would be prudent to document and describe activities/lessons that are added, as such additions may influence (positively or negatively) intervention impact.

A related question is how much cultural adaptation can be done to a Core Element before it no longer effectively serves its purpose in the intervention. For example, a core element in an HIV intervention might be inclusion of a hands-on condom-use exercise. In the original intervention the condom was placed on a model. In adaptations, would it undermine intervention effectiveness to instead place the condom on a banana or cucumber (as is commonly done) if this was felt to be less explicit and therefore more culturally acceptable? Or, in more culturally conservative settings, would further modification of the exercise, for example using pictures of a condom and asking the respondents to select accurate pictures and line them up in proper order, still be regarded as adherence to the core element? These are important questions that need to be carefully considered in modifying effective interventions lest their impact be unintentionally reduced.

Factors impacting fidelity

Factors which could be modified during training and adoption

Potentially modifiable characteristics that have been identified as associated with increased fidelity18, 19, 21, 23, 25, 33-41 include those that increase the confidence of the individual conducting the intervention and the perceived relevance, ownership and/or benefit of the intervention to the implementers. Examples of strategies that might increase confidence include: 1) intensive in-service training that itself follows a set curriculum (compared to no training, training that is variable in nature, non-participatory training and/or brief training); 2) less curricular discretion (e.g., a more detailed script or lesson plan); and 3) actual practice in teaching the curriculum and in using interactive methods.

Perceived relevance can be increased by supporting the perception that the situation being addressed in the intervention applies to the community in which the intervention is being adopted. Ownership (a sense on the part of the implementers that they or their community had significant input into the development of the intervention) can be enhanced by incorporating community contextual factors. Support from local authorities (e.g., the school principal or the clinic director) for the importance of the intervention being adopted serves to indicate the relative benefit to be derived from teaching this curriculum as opposed to spending more time on another subject (such as math or reading, or, in a clinic setting, other aspects of anticipatory guidance). Also relevant to the question of benefit is the perception of the effectiveness of currently available (in this case HIV prevention) curricula; if the existing curriculum is perceived as adequate, substitution with a new one is likely to be more difficult. Organizational characteristics, including receptivity to innovation and decreased turmoil, have also been associated with increased fidelity. Finally, there are factors associated with the intervention itself that may influence fidelity23, 31, 42-45. Complex programs are less likely to be successfully implemented; as a corollary, programs that are packaged and present the materials in a straightforward manner are more faithfully executed. Detailed instruction manuals for the trainers increase fidelity. Organizations concerned with fidelity of replication are aware of this reality.

An agenda for the future

The importance of research on and implementation of behavioral interventions, and the need to understand their relevance, adaptation and dissemination, should be an integral component of the pediatrician's training process and should be introduced at the very earliest stages of the pediatric health care provider's training. Training in intervention protocols should be introduced in the regular medical school curriculum and/or in longitudinal co-curricular studies. Increasingly, courses such as Bench-to-Bedside are used to train medical students to understand the principles of translating bench research into practical clinical applications; such courses should be extended to include training in creating, understanding and using the concepts of relevance, core fidelity, acceptable adaptation and dissemination of behavioral interventions in future clinical practice. Establishment of workshops, seminars and certificate courses that provide hands-on training in these areas would be particularly relevant in building a cadre of pediatricians who could perform research and aid in the implementation and dissemination of behavioral interventions. Effective communication skills taught in these sessions would enhance communication between pediatric care providers and their patients, thereby increasing compliance with (adherence to) behavioral interventions. Training of this nature should be vertically integrated into the career of the pediatrician-in-training, starting with the medical school curriculum, spanning residency and sub-specialty training, and extending through life-long learning mechanisms such as continuing medical education in pediatrics.

Acknowledgments

This work was supported by the National Institutes of Health (R01 HD064350 and R01 MH069229). We thank Madeline Balice for her editorial assistance with this manuscript.

References

1. Zerhouni E. Clinical research at a crossroads: the NIH roadmap. J. Investig. Med. 2006;54:73. doi: 10.2310/6650.2006.X0016.
2. Zerhouni E. Translational research: moving discovery to practice. Clin. Pharmacol. Ther. 2006;81:126–28. doi: 10.1038/sj.clpt.6100029.
3. Woolf SH. The Meaning of Translational Research and Why It Matters. JAMA. 2008;299(2):211–213. doi: 10.1001/jama.2007.26.
4. Solomon J, Card JJ, Malow RM. Adapting efficacious interventions: Advancing translational research in HIV prevention. Eval Health Prof. 2006;29:1–33. doi: 10.1177/0163278706287344.
5. Sussman S, Valente TW, Rohrbach LA, Skara S, Pentz MA. Translation in the health professions: converting science into action. Eval Health Prof. 2006;29(1):7–32. doi: 10.1177/0163278705284441.
6. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion Theory and Knowledge Dissemination, Utilization, and Integration in Public Health. Annual Rev. Public Health. 2009;30:151–74. doi: 10.1146/annurev.publhealth.031308.100049.
7. Dearing JW. Evolution of Diffusion and Dissemination Theory. J Public Health Management Practice. 2008;14:99–108. doi: 10.1097/01.PHH.0000311886.98627.b7.
8. Dearing JW. Applying Diffusion of Innovation Theory to Intervention Development. Research on Social Work Practice. 2009:1–16. doi: 10.1177/1049731509335569.
9. Rotheram-Borus MJ, Swendeman D, Chovnick G. The past, present, and future of HIV prevention: integrating behavioral, biomedical, and structural intervention strategies for the next generation of HIV prevention. Annual Review of Clinical Psychology. 2009;5:143–67. doi: 10.1146/annurev.clinpsy.032408.153530.
10. http://www.fda.gov/BiologicsBloodVaccines/DevelopmentApprovalProcess/default.htm.
11. Magarey AM, Perry RA, Baur LA, Steinbeck KS, Sawyer M, Hills AP, Wilson G, Lee A, Daniels LA. A Parent-Led Family-Focused Treatment Program for Overweight Children Aged 5 to 9 Years: The PEACH RCT. Pediatrics. 2011;127:214–22. doi: 10.1542/peds.2009-1432.
12. Berben L, Bogert L, Leventhal ME, Fridlund B, Jaarsma T, Norekvål TM, Smith K, Strömberg A, Thompson DR, De Geest S. Which interventions are used by health care professionals to enhance medication adherence in cardiovascular patients? A survey of current clinical practice. Eur J Cardiovasc Nurs. 2011 Mar;10(1):14–21. doi: 10.1016/j.ejcnurse.2010.10.004.
13. Lyles CM, Kay LS, Crepaz N, Herbst JH, Passin WF, Kim AS, Rama SM, Thadiparthi S, DeLuca JB, Mullins. Best-Evidence Interventions: Findings from a systematic review of HIV behavioral interventions for US populations at high risk, 2000-2004. American Journal of Public Health. 2007;97:133–143. doi: 10.2105/AJPH.2005.076182.
14. Stagg SJ, Sheridan D. Effectiveness of bullying and violence prevention programs. AAOHN J. 2010;58:419–24. doi: 10.3928/08910162-20100916-02.
15. Song MK, Happ MB, Sandelowski M. Development of a tool to assess fidelity to a psycho-educational intervention. J Adv Nurs. 2010;66:673–682. doi: 10.1111/j.1365-2648.2009.05216.x.
16. Ansell BJ. Not getting to goal: the clinical costs of noncompliance. J Manag Care Pharm. 2008;14(6 Suppl B):9–15. doi: 10.18553/jmcp.2008.14.S6-B.9.
17. Melde C, Esbensen FA, Tusinski K. Addressing program fidelity using onsite observations and program provider descriptions of program delivery. Evaluation Review. 2006;30:714–740. doi: 10.1177/0193841X06293412.
18. Berman P, McLaughlin M. Implementation of educational innovation. Education Forum. 1976;40:345–370.
19. Ringwalt C, Vincus A, Ennett S, Johnson R, Rohrbach A. Reasons for teachers' adaptation of substance use prevention curricula in schools with non-white student populations. Prevention Science. 2004;5:61–67. doi: 10.1023/b:prev.0000013983.87069.a0.
20. Bell SG, Newcomer SF, Bachrach C, Borawski E, Jemmott JL, Morrison D, Stanton B, Tortolero S, Zimmerman R. Challenges in Replicating Interventions. Journal of Adolescent Health. 2007;40:514–20. doi: 10.1016/j.jadohealth.2006.09.005.
21. Galbraith J, Stanton B, Boekeloo B, King W, Desmond S, Howard D, Black MM, Carey JW. Exploring implementation and fidelity of evidence-based behavioral interventions for HIV prevention: Lessons learned from the Focus on Kids diffusion case study. Health Educ Behav. 2008 Apr 29 (OnlineFirst). doi: 10.1177/1090198108315366.
22. Hallfors D, Godette D. Will the 'Principles of Effectiveness' improve prevention practice? Early findings from a diffusion study. Health Education Research. 2002;17:461–470. doi: 10.1093/her/17.4.461.
23. Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res. 2003;18:237–256. doi: 10.1093/her/18.2.237.
24. Perrin KM, Burke SG, O'Connor D, Walby G, Shippey C, Pitt S, McDermott RJ, Forthofer MS. Factors contributing to intervention fidelity in a multi-site chronic disease self-management program. Implement Sci. 2006;1:26. doi: 10.1186/1748-5908-1-26.
25. Backer TE. Finding the balance: Program fidelity and adaptation in substance abuse prevention: A state-of-the-art review. Rockville, MD: Center for Substance Abuse Prevention. Available at: www.samhsa.gov/cents/csap/modelprograms/pdfs/Findingbalance/pdf.
26. Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clinical Psychology Review. 1998;18:23–45. doi: 10.1016/s0272-7358(97)00043-3.
27. Stanton B, Guo J, Cottrell L, Galbraith J, Li X, Gibson C, et al. The complex business of adapting effective interventions to new settings: An urban to rural transfer. J Adol Health. 2005;37:163.e17–163.e26. doi: 10.1016/j.jadohealth.2004.10.005.
28. Blakely CH, Mayer JP, Rand GG, Schmitt N, Davidson W, Roitman D, et al. The fidelity-adaptation debate: Implications for the implementation of public sector social programs. Am J Community Psychology. 1987;15:253–268.
29. Mayer JP, Blakely CH, Davidson WS II. Social program innovation and dissemination: A study of organizational change processes. Policy Studies Review. 1986;6(2):273–286.
30. Harshbarger C, Simmons G, Coelho H, Sloop K, Collins C. An empirical assessment of implementation, adaptation, and tailoring: The evaluation of CDC's national diffusion of VOICES/VOCES. AIDS Education and Prevention. 2006;18(Suppl A):184–197. doi: 10.1521/aeap.2006.18.supp.184.
31. McKleroy VS, Galbraith JS, Cummings B, et al. Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Education and Prevention. 2006;18:59–74. doi: 10.1521/aeap.2006.18.supp.59.
32. Education, Training and Research Associates and CDC. Adaptation guidance for science-based pregnancy, STD and HIV prevention education programs for adolescents. Scotts Valley, CA: ETR Associates; in press.
33. Dworkin SL, Pinto RM, Hunter J, Rapkin B, Remien RH. Keeping the spirit of community partnerships alive in the scale up of HIV/AIDS prevention: Critical reflections on the roll out of DEBI (Diffusion of Effective Behavioral Interventions). Am J Community Psychol. 2008;42:51–59. doi: 10.1007/s10464-008-9183-y.
34. Kelly JA, Heckman TG, Stevenson LY, Williams PN, Ertl T, Hays RB, et al. Transfer of research-based HIV prevention interventions to community service providers: Fidelity and adaptation. AIDS Education and Prevention. 2000;12(Suppl A):87–98.
35. Dusenbury L, Brannigan R, Hansen WB, Walsh J, Falco M. Quality of implementation: developing measures crucial to understanding the diffusion of prevention interventions. Health Educ Res. 2005;20:308–313. doi: 10.1093/her/cyg134.
36. Mihalic SF, Fagan AA, Argamaso S. Implementing the LifeSkills training drug prevention program: factors related to implementation fidelity. Implementation Science. 2008;3:5. doi: 10.1186/1748-5908-3-5. Available at: www.implementationscience.com/content/3/1/5.
37. Elliot D, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prevention Science. 2004;5:47–53. doi: 10.1023/b:prev.0000013981.28071.52.
38. Rohrbach LA, Graham JW, Hansen WB. Diffusion of a school-based substance abuse prevention program: predictors of program implementation. Preventive Medicine. 1993;22:237–260. doi: 10.1006/pmed.1993.1020.
39. Fors SW, Doster ME. Implication of results: factors for success. Journal of School Health. 1985;55:332–334. doi: 10.1111/j.1746-1561.1985.tb05658.x.
40. Parcel GS, Ross JG, Lavin AT, Portnoy B, Nelson GD, Winters F. Enhancing implementation of the Teenage Health Teaching Modules. Journal of School Health. 1991;61:35–38. doi: 10.1111/j.1746-1561.1991.tb07857.x.
41. Perry CL, Murray DM, Griffin G. Evaluating the statewide dissemination of smoking prevention curricula: factors in teacher compliance. Journal of School Health. 1990;60:501–504. doi: 10.1111/j.1746-1561.1990.tb05890.x.
42. Gottfredson GD. A theory-ridden approach to program evaluation: a method for stimulating researcher-implementer collaboration. American Psychologist. 1984;39:1101–1112.
43. Durlak JA. Why program implementation is important. Journal of Prevention and Intervention in the Community. 1998;17:5–18.
44. Centers for Disease Control and Prevention. Diffusion of Effective Behavioral Interventions (DEBI). http://www.cdc.gov/hiv/topics/research/prs/evidence-based-interventions.htm. Last accessed November 17, 2010.
45. Centers for Disease Control and Prevention. Questions and answers: REP process. www.cdc.gov/hiv/topics/prev_prog/rep/resources/qa/process.htm. Last accessed November 17, 2010.
