Abstract
The concern that addiction treatment be grounded in science has been recognized and enthusiastically endorsed in both the clinical and research communities. With recognition of the gap between knowledge development and application, there has been a recent emphasis on developing strategies for more effective application, i.e., for the incorporation of evidence-based practice into routine clinical programming. This has translated into a need to develop strategies designed to achieve organizational change, and into a field of study whose objective is to better understand how to expedite change in treatment organizations and their clinical practices. This paper focuses on the roles and responsibilities of researchers, practitioners, and the federal government in changing practice and applying new knowledge to improve treatment. Although great strides have been made to shift the emphasis from dissemination of knowledge to its application, much remains to be done to develop and test additional application strategies specific to the substance abuse treatment field. Future considerations for implementation research are discussed.
Keywords: implementation, diffusion, evidence-based, dissemination, technology transfer, substance abuse treatment
1. Introduction
The ultimate concern of health services research is improving the delivery of health care services to reduce disease and promote greater well-being (Steinwachs & Hughes, 2008). In the addictions, the objective of health services research has been to improve treatment delivery while enabling individuals to become drug free and socially productive. Because of the presumed link between research advances and improvements in health, the U.S. annually invests large sums to support research and knowledge development. Federal agencies responsible for the conduct of research, as well as the researchers funded to develop new knowledge, have operated for the most part in isolation from the front line of health care service delivery, promoting a long-standing gulf between research and practice that has only lately received recognition and attention. As a consequence of that belated attention, there has been a significant monetary and intellectual commitment to examining ways of increasing the rate of adoption of treatment innovations (Rogers, 2003) and strategies for reducing the well-documented distance between research and the application of new knowledge (Lamb, Greenlick, & McCarty, 1998). Helping to trigger that action is the concern that, although a number of evidence-based practices and programs have been identified, the extent of their use by community-based treatment programs across the U.S. has been far less than desirable (Institute of Medicine, 2005). Some of the conditions that both permit and maintain the gulf between research and practice are described below.
1.1 Research and clinical practice: Different principles
Research is purposefully painstaking and never hurried; it typically takes years to generate evidence, which is often deemed promising but inconclusive, with the intervention or clinical issue said to require further study. Clinical services, in contrast, occur in real time, and practitioners must respond to immediate and presenting problems. Thus, knowledge development can be measured in years, a luxury not available to clinicians. In addition, researchers and clinicians are responsive to different audiences and are recipients of professional literature with different objectives. Scientific journals emphasize methods and statistics with little attention to the details of an intervention’s implementation, its integration into existing programs, or its costs – all of which are central to the concerns of clinicians and should appear in the clinicians’ professional literature (Kegeles et al., 2000).
1.2 Federal isolation of responsibilities and functions
Originally, the National Institute on Drug Abuse (NIDA) had the capacity for both knowledge development and application (Brown & Flynn, 2002). Research and services were administered through a single organization until Congress separated these functions through the Alcohol, Drug Abuse and Mental Health Administration Reorganization Act of 1992. This legislation billeted each function in its own federal agency (i.e., NIDA – research; Center for Substance Abuse Treatment/CSAT – services) without demanding or creating a mechanism for transferring the findings developed by NIDA into practice and services by CSAT. Even with this divorce, significant efforts at knowledge application (implementing science-based practices) have been realized, but those efforts are hampered by the splitting of the responsibilities and functions of knowledge development and application between agencies.
2. Progress toward knowledge application
With the advent of implementation science as a significant National Institutes of Health (NIH) concern, researchers have begun to develop and test strategies designed to transfer empirically tested interventions into clinical practice. Clear distinctions have been drawn between knowledge dissemination research and knowledge application research. According to the National Institutes of Health (2009), knowledge dissemination research involves identifying mechanisms and approaches that improve the packaging and conveyance of evidence-based innovations, whereas implementation or knowledge application research involves the study of methods to improve the uptake of evidence-based interventions.
2.1 Federal responses to formal separation of responsibilities
As an Institute of the NIH, NIDA has a primary mission of research (i.e., knowledge development) with some assumed responsibility for dissemination; its sister agency CSAT, part of the Substance Abuse and Mental Health Services Administration (SAMHSA), has the responsibility to improve and expand treatment services, i.e., to support the work of knowledge application. Together they have developed several collaborative strategies, thereby bridging their separate responsibilities for knowledge development and application. These initiatives include the NIDA and SAMHSA Blending Initiative to accelerate the translation of research into practice (NIDA; http://www.nida.nih.gov/blending/; Condon, Miner, Balmer, & Pintello, 2008); the SAMHSA/CSAT Practice Improvement Collaboratives (PIC; http://csat.samhsa.gov/pic/index.html) to improve treatment through the adoption of evidence-based practices; and the now defunct National Institute on Alcohol Abuse and Alcoholism (NIAAA)/SAMHSA/CSAT Researcher in Residence Program to encourage the adoption of research-based improvements (Hilton, 2001).
2.2 SAMHSA Initiatives
Two other important knowledge application initiatives by SAMHSA are the establishment of an online library for evidence-based practices and the creation of regional transfer centers. The National Registry of Evidence-based Programs and Practices (NREPP) is the searchable online library of empirically supported interventions that has considerably advanced knowledge application. The other substantial initiative is the set of regional Addiction Technology Transfer Centers (ATTCs). While NREPP primarily provides a list of available interventions, the ATTCs, with their modest levels of funding, provide the interpersonal activities that can bring science to services.
3. Conceptualization of change processes
Change is a difficult process (Thompson, 2010); there are many theories regarding implementation and effective change processes (Damschroder et al., 2009); and technology transfer requires organizational behavior change (Davis & Salasin, 1977). Despite the array of frameworks and models of implementation for changing practice (e.g., Elwyn, Taubert, & Kowalczuk, 2007; Flynn & Simpson, 2009; Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou, 2004; Heidenreich, Sahay, & Massie, 2009; Kerner & Hall, 2009; Ogden, Hagen, & Askeland, 2009; Proctor et al., 2009; Simpson, 2009), a modest number of elements are seen as common to the change process. Across most conceptual models there are three basic ideas: a) planning, b) acting or implementing, and c) evaluating. Major differences between models typically lie in the number and detail of steps specified in the action phase, or in the principles described as integral to the process. All view change as dynamic and participatory. Most focus primarily on the action phase and include presentation of a rationale for change, oriented toward engaging staff around the need to increase personal and organizational effectiveness and showing how the proposed intervention can accomplish that; a period of adoption, which includes time for staff decision-making and a trial period for the new intervention; implementation, in which the intervention is put into place with the expectation that it will become part of routine clinical practice; sustainability, which involves a system to maintain the change with attention to the fidelity of the model adopted; and, finally, an assessment of the intervention’s impact on effectiveness.
Adaptation of the chosen intervention to the demands of the host organization will likely occur during the processes of adoption, implementation, and sustainability. The degree of adaptation seen as necessary will dictate whether the model has become so different from its research-based original that it must be regarded as essentially new and requiring its own evaluation. Unfortunately, criteria do not exist for determining when adaptation crosses over into innovation, but those judgments are significant to the process. Where staff believe they are implementing the core treatment model and find that it fails their needs and expectations, there is a risk of losing confidence both in the intervention and in novel treatments suggested to them later. There is a risk, as well, of giving the intervention a “bad name” with the colleagues with whom they have reason to discuss it.
Early adopters, who put their stamp of approval on an innovation, have been characterized as organizations ready for change and accepting of new ideas (Rogers, 2003). Just as failure can exert a negative influence, success can exert a positive one. Identifying early adopters and assisting them in their efforts to adopt and implement innovations can demonstrate to others that change is feasible in their own organizations. Change agents can have a greater impact on adoption and implementation by first working with the organizations most likely to achieve success, gaining positive experience before attempting to encourage change in organizations less ready for and accepting of new ideas. Finally, the complexity of organizational change processes should be obvious, but the state of knowledge regarding these processes is such that our efforts involve both art and science. The need to expand the science base is well recognized. Until then, our efforts are likely to continue to involve a degree of art when preparing organizations for change and assisting them with change processes.
4. Changing Organizations
In order to change practices through the adoption of new clinical initiatives, a program must be ready to accept that change and then be prepared to adopt and maintain the new intervention. Keys to successful change include the relevance, utility, and effectiveness of the proposed initiative, as well as support, sufficient resources, and expertise to enact the change in practice.
4.1 Readiness for change
Before giving up the comfort of the status quo and moving toward new initiatives, program staff must accept that there is a need for change and that the change proposed will be worth their effort. Where the need for change is accepted, and the initiative proposed is believed responsive to that need, the initiative must still pass a test of feasibility. Staff must believe that change is needed to increase their own and the organization’s effectiveness, that the chosen initiative will meet that need, and that the initiative is within their reach in terms of skills and time and within the reach of the organization in terms of the resources demanded. If there is no apparent need for change (i.e., no recognized inadequacies or inefficiencies), or if the demands of the intervention – even where the need for change is accepted – are seen as overtaxing the individual and/or the organization, it is unlikely there will be a willingness to change established clinical routines.
Thus, preparing the organization can be seen as a considerable undertaking, but until recently it received little attention from the research and service communities. Much effort and many resources have been concentrated on knowledge development, with little regard given to strategies to prepare organizations and their staff to accept and adopt the knowledge developed. Institutionalizing a process to assure staff understanding of and support for change is easily overlooked amid demands to adopt empirically supported interventions. Without organizational and staff preparation, there is the likelihood that the process will not occur or will be slow at best, that staff will be unenthusiastic, that fidelity to the new protocol will be diminished, and that maintenance and sustainability of the change will be neglected.
4.2 Preparing for uptake
As noted above, in accepting the potential contribution of new initiatives, program staff accept a responsibility to undertake the incorporation of new skills. The organization accepts a responsibility to provide training to support skills development, together with administrative support sufficient to achieve adoption and implementation, including, importantly, appropriate clinical supervision to guarantee fidelity to the model proven effective in other settings. Training activities typically require manuals and materials developed for step-by-step implementation of an intervention, as well as an expert trainer who conveys confidence in the initiative’s capacity to make a positive impact on program effectiveness. Beyond formal training sessions, there is typically a need for other activities – homework assignments, technical assistance, booster sessions, coaching, and so forth – to enhance learning. This will involve additional time and resources to support a continuing education process, particularly where programs are challenged by staff turnover and need training for new staff to ensure maintenance and sustainability of the initiative. Incentives can also be used to increase motivation for the use of innovations (Miller, Sorensen, Selzer, & Brigham, 2006), but these should not preclude the organizational development efforts needed to increase readiness for change and acceptance of an innovation. It behooves us as well to remember that clinical staff have chosen a helping profession; if the proposed change will benefit their clients, a change that promises greater effectiveness and better outcomes will carry strong intrinsic motivation for its adoption.
With the advent of the Internet, online learning, and computer-based training, the field is now positioned more than ever to provide training on demand. Instruction can be individually tailored, and interpersonal aspects of the training and learning process can be simulated to help maintain and increase knowledge and skills. It remains to be seen to what extent Internet training can serve as a reasonable substitute for interpersonal activities, or whether it will remain an impersonal approach less effective than traditional approaches to adoption and implementation. Strategies for presenting booster sessions and technical support will also be needed. Because existing research strongly supports the use of interpersonal contact in technology transfer (e.g., Fairweather, 1980; Fairweather, Sanders, & Tornatzky, 1974; Hall, Sorensen, & Loeb, 1988; Sorensen et al., 1988; Stevens & Tornatzky, 1980), attention must be given to the development of effective interactive models for the Internet.
A great deal of attention has been given in the literature to the training needed for skills acquisition to enable implementation, but much less regard has been given to the supervision needed to ensure continuing fidelity to the innovation. Without adequate supervisor training, monitoring, and support, there may be “drift” from the intervention model, attenuating its effectiveness.
5. Implementation science
The knowledge base for guiding adoption and implementation of evidence-based practices is still modest but growing, and has been described as being in an “embryonic” state (Proctor et al., 2009). Ever since the IOM report (Lamb et al., 1998) raised awareness of the gap between research and practice, the addictions field has stepped up its research on the organizations and processes needed to apply new knowledge effectively. Until these efforts bear fruit, we will continue to rely primarily on work from the health and service fields to guide the diffusion of innovations in substance abuse treatment centers. While the implementation science base is developing, several issues are worthy of consideration.
5.1 Resources and applicability
As with adoption, the demands that knowledge application places on an organization must be realistic in terms of the human and material resources it can bring to bear. Implementation is not cheap (Klein & Knight, 2005), and demands for change should be made only in association with a realistic assessment of the cost to the organization in time, money, and personnel. Attention to costs during the knowledge development process has been limited, and price tags for new interventions are typically missing. The organization must be enabled to understand whether the intervention is feasible in terms of the time, personnel, and equipment demanded, and must be assured it has those resources in sufficient supply to support the process of adopting the intervention. Arguably, it is the responsibility of the treatment organization to judge its capacity to adopt the evidence-based initiative in question, and the responsibility of the federal agencies that supported knowledge development, and are expected to support knowledge application, to contribute to the implementation effort. It is past time to abandon the practice of earlier knowledge development efforts, which accepted as sufficient the support of treatment research even where interventions were fated to life spans limited to the length of their grants because their demands on program resources exceeded the capacity of all but a few well-endowed treatment programs. It is past time, too, to stop expecting programs to adopt new initiatives and to develop the organizational change activities required of them using only their own limited resources.
5.2 Practice to Research
For too many years the field has been inundated by catch phrases such as “bench to bedside,” “research to practice,” “science to services,” and “technology transfer.” These phrases imply governance by science over the substance abuse field: information is expected to flow in only one direction, from research to practice. This ignores and denies the contributions to science that can be made by clinicians. Practice should be afforded the opportunity to influence research, if only because the more relevant the clinical research, the greater the chances of its adoption. Thus, there is a need to develop protocols for learning from practitioners their information needs and concerns.
6. Conclusions
Until recently, most of the emphasis in the field has been on knowledge development and dissemination through print media. The newly recognized need for knowledge application, and for implementation science to support those efforts, has stimulated both a healthy reappraisal of federal, researcher, and program administration responsibility for the adoption of evidence-based interventions, and a recognition that changing clinical practice involves an effort to bring about organizational change. It remains to be seen whether the current enthusiasm for knowledge application, as a research topic and as a clinical concern, will be adequately supported over time and maintained as a priority. Treatment programs are unlikely to have the resources to undertake change by themselves – to identify and pay appropriate change agents to guide staff consideration of the need for change and of the appropriateness of the intervention proposed to meet that need. Nor do they have the resources to develop training manuals and measures of fidelity, or to provide training and ongoing support to clinical supervisors to sustain the new clinical programs they have adopted. Treatment programs alone cannot guarantee that issues and problems emanating from clinical practice find their way onto research agendas and permit clinical concerns to become the focus of clinical research. It will be incumbent on the appropriate federal agencies to work together to lead this effort if we are to realize the potential all see as both positive and essential to effective clinical care.
Acknowledgments
Funding Source: This work was funded in part by the National Institute on Drug Abuse (Grant 2R01DA013093-11). The interpretations and conclusions, however, do not necessarily represent the position of the NIDA, NIH, or Department of Health and Human Services.
Role of Funding Source: The NIDA had no role in the design and writing of the manuscript, or in the decision to submit the manuscript for publication in Addictive Behaviors.
Conflict of Interest
Both authors declare that they have no conflict of interest.
Contributors: Authors Flynn and Brown designed the study and wrote the manuscript. Both Flynn and Brown contributed to and have approved the final manuscript.
References
- Brown BS, Flynn PM. The federal role in drug abuse technology transfer: A history and perspective. Journal of Substance Abuse Treatment. 2002;22(4):245–257. doi:10.1016/s0740-5472(02)00228-3.
- Condon TP, Miner LL, Balmer CW, Pintello D. Blending addiction research and practice: Strategies for technology transfer. Journal of Substance Abuse Treatment. 2008;35:156–160. doi:10.1016/j.jsat.2007.09.004.
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009;4(50). doi:10.1186/1748-5908-4-50.
- Davis HR, Salasin SE. Applied social research: In combat with waste and suffering. International Journal of Comparative Sociology. 1977;19:107–113.
- Elwyn G, Taubert M, Kowalczuk J. Sticky knowledge: A possible model for investigating implementation in healthcare contexts. Implementation Science. 2007;2(44). doi:10.1186/1748-5908-2-44.
- Fairweather GW. The Fairweather Lodge: A twenty-five year retrospective. San Francisco: Jossey-Bass; 1980.
- Fairweather GW, Sanders DH, Tornatzky LG. Creating change in mental health organizations. New York: Pergamon; 1974.
- Flynn PM, Simpson DD. Adoption and implementation of evidence-based treatment. In: Miller PM, editor. Evidence-based addiction treatment. San Diego, CA: Elsevier; 2009. pp. 419–437.
- Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly. 2004;82(4):581–629. doi:10.1111/j.0887-378X.2004.00325.x.
- Hall SM, Sorensen JL, Loeb PC. Development and diffusion of a skills-training intervention. In: Baker TB, Cannon DS, editors. Assessment and treatment of addictive disorders. New York: Praeger; 1988.
- Heidenreich P, Sahay A, Massie B. A conceptual model of VA implementation with a focus on heart failure and the role of QUERI. 2009. www.queri.research.va.gov/chf/products/white-paper-impl.pdf.
- Hilton ME. Researcher in residence program: Experiences from New York State. Rockville, MD: National Institute on Alcohol Abuse and Alcoholism; 2001.
- Institute of Medicine. Improving the quality of health care for mental and substance-use conditions: Quality Chasm Series. Washington, DC: The National Academies Press; 2005.
- Kegeles SM, Rebchook GM, Hays RB, Terry MA, O’Donnell L, Leonard NR, Kelly JA, Neumann MS. From science to application: The development of an intervention package. AIDS Education and Prevention. 2000;12(Supplement A):62–74.
- Kerner JF, Hall KL. Research dissemination and diffusion: Translation within science and society. Research on Social Work Practice. 2009;19(5):519–530.
- Klein KJ, Knight AP. Innovation implementation: Overcoming the challenge. Current Directions in Psychological Science. 2005;14:243–246.
- Lamb S, Greenlick MR, McCarty D. Bridging the gap between practice and research. Washington, DC: Institute of Medicine, National Academy Press; 1998.
- Miller WR, Sorensen JL, Selzer JA, Brigham GS. Disseminating evidence-based practices in substance abuse treatment: A review with suggestions. Journal of Substance Abuse Treatment. 2006;31:25–39. doi:10.1016/j.jsat.2006.03.005.
- National Institutes of Health. 2009. Retrieved October 11, 2010, from http://grants.nih.gov/grants/guide/pa-files/PAR-10-038.html.
- Ogden T, Hagen KA, Askeland E. Implementing and evaluating evidence-based treatments of conduct problems in children and youth in Norway. Research on Social Work Practice. 2009;19(5):582–591.
- Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health. 2009;36:24–34. doi:10.1007/s10488-008-0197-4.
- Rogers EM. Diffusion of innovations. 5th ed. New York: The Free Press; 2003.
- Simpson DD. Organizational readiness for stage-based dynamics of innovation implementation. Research on Social Work Practice. 2009;19(5):541–551.
- Sorensen JL, Hall SM, Loeb P, Allen T, Glaser EM, Greenberg PD. Dissemination of a job seekers’ workshop to drug treatment programs. Behavior Therapy. 1988;19:143–155.
- Steinwachs DM, Hughes RG. Health services research: Scope and significance. In: Hughes RG, editor. Patient safety and quality: An evidence-based handbook for nurses (AHRQ Publication No. 08-0043). Rockville, MD: Agency for Healthcare Research and Quality; 2008.
- Stevens WF, Tornatzky LG. The dissemination of evaluation: An experiment. Evaluation Review. 1980;4:339–354.
- Thompson JM. Understanding and managing organizational change: Implications for public health management. Journal of Public Health Management and Practice. 2010;16(2):167–173. doi:10.1097/PHH.0b013e3181c8cb51.
