Abstract
Background
Mental health and substance abuse are among the most commonly reported reasons for visits to Federally Qualified Health Centers (CHCs), yet only 6.5% of encounters are with on-site behavioral health specialists. Rural CHCs are significantly less likely to have on-site behavioral specialists than urban CHCs. Given this shortage of mental health specialists in rural areas, one of the most promising approaches to improving mental health outcomes is to help rural primary care providers deliver evidence-based practices (EBPs). Despite the scope of these problems, no research has developed an effective implementation strategy for facilitating the adoption of mental health EBPs in rural CHCs.
Objectives
To describe the conceptual components of an Implementation Partnership that focuses on the adaptation and adoption of mental health EBPs by rural CHCs in Arkansas.
Methods
We present a conceptual model that integrates seven separate frameworks: 1) Jones and Wells’ Evidence-Based Community Partnership Model, 2) Kitson’s Promoting Action on Research Implementation in Health Services (PARiHS) implementation framework, 3) Sackett’s definition of evidence-based medicine, 4) Glisson’s organizational social context model, 5) Rubenstein’s Evidence-Based Quality Improvement (EBQI) facilitation process, 6) Glasgow’s RE-AIM evaluation approach, and 7) Naylor’s concept of shared decision making.
Conclusions
By integrating these frameworks into a meaningful conceptual model, we hope to develop a successful Implementation Partnership between an academic health center and small rural CHCs to improve mental health outcomes. Findings from this Implementation Partnership should have relevance to hundreds of clinics and millions of patients, and could help promote the sustained adoption of EBPs across rural America.
Keywords: Implementation, CBPR, FQHCs, Rural, Quality improvement
Introduction
Federally Qualified Health Centers (CHCs) are the nation’s largest and fastest-growing primary care (PC) network, providing services to over 20 million Americans living in medically underserved inner-city neighborhoods (47%) and rural communities (53%). CHCs provide services to 10% of rural Americans, 14% of minorities, and 20% of the uninsured. CHCs represent the country’s PC safety net—they treat all patients equally, serve all patients regardless of ability to pay, and aim to eliminate health disparities. Mental health problems are common among CHC patients. In our OUTREACH study (R01 MH076908), the prevalence of depression was noted to be 16% in rural CHC settings, and mental health and substance abuse are among the most commonly reported reasons for visits to CHCs, yet only 6.5% of encounters are with on-site behavioral health specialists.1 Rural CHCs are significantly less likely to have on-site behavioral specialists than urban CHCs,2 and linkages with Community Mental Health Centers are often inadequate.3 Despite the scope of these problems, no research has developed an effective implementation strategy for facilitating the adoption of mental health evidence-based practices (EBPs) in rural CHCs.
Although mental health outcomes are equally poor in rural and urban areas, quality improvement (QI) strategies should not necessarily be the same.4 Due to the lack of mental health specialists in rural areas, one promising approach to improving mental health outcomes is to help rural PC providers deliver EBPs.2,4-6 Compared to urban areas, PC providers in rural areas play an even larger role in the de facto mental health care system,6-8 yet rural PC providers frequently lack the expertise, time, and resources to effectively treat mental health and substance use disorders. The relatively large patient panels of rural PC physicians9 and short encounter times10 make it challenging to provide high quality mental health care during an encounter with multiple competing demands.11 Moreover, the linkages between rural PC practices and distant specialty mental health care practices are weak in most rural areas, making referrals infeasible10 and use of off-site mental health specialists unlikely.12
Another major barrier to implementing EBPs in small rural clinics is that efficacy and effectiveness researchers typically design and test EBPs in large urban clinics. Thus, the efficacy and effectiveness trials that constitute the evidence base are not necessarily generalizable to rural clinics serving disadvantaged populations, and the interventions are not necessarily feasible to implement. Organizational theory and experience suggest that adaptation to local context is critical to adoption and sustainability.13-16 Because researchers cannot conduct randomized controlled trials for every intervention in every possible clinical setting, it is necessary for CHCs to develop an internal capacity to adapt EBPs based on their own preferences, needs, and capacities and to evaluate the clinical impact of adoption using locally collected data. Moreover, the availability of timely, locally collected outcomes data should promote EBP sustainability, perhaps even more than external data from published randomized controlled trials.17,18 However, small rural clinics often lack the centralized infrastructure (e.g., dedicated staff with QI or evaluation expertise) needed to coordinate implementation evaluation efforts.19,20
In this article, we describe the conceptual components of an Implementation Partnership that focuses on the adaptation and adoption of mental health EBPs by rural CHCs in Arkansas. While the overarching conceptual components of this Implementation Partnership would be the same in urban CHCs, we focus on rural CHCs because of the higher level of adaptation required in those settings. With funding from the National Institute of Mental Health (R24 MH085104), the Implementation Partnership embeds research faculty from an Academic Health Center (the University of Arkansas for Medical Sciences [UAMS] Department of Psychiatry) into the Arkansas community health centers and the Arkansas primary care association, the Community Health Centers of Arkansas, Inc. (CHCA), which is responsible for coordinating quality improvement activities in CHCs. Scholars and policy makers have argued that funding partnerships to embed researchers within clinical organizations is a promising approach for implementing EBPs and advancing the emergent field of implementation science.21-29
Methods
The conceptual model for this rural Implementation Partnership is presented in Figure 1. Partnership building is guided by the principles of the Evidence-Based Community Partnership Model.22 Implementation strategies are based on the Promoting Action on Research Implementation in Health Services (PARiHS) model,30 including facilitation via the Evidence Based Quality Improvement (EBQI) model.31,32 Implementation outcomes are measured using the RE-AIM framework.33-36 Facilitation processes and implementation outcomes both reciprocally influence the success of the partnership. The evaluation of the Implementation Partnership itself is guided by the principles of shared decision-making.37 This project received approval from the University of Arkansas for Medical Sciences Institutional Review Board.
Figure 1. Conceptual Frame for an Implementation Partnership between Community Health Centers and an Academic Health Center.

Partnership Building
One of the main barriers to EBP implementation has been the lack of collaborative efforts between academic researchers and community stakeholders, and the subsequent failure of researchers to address community needs.23 Research conceived and conducted within a clinical-research partnership has the potential to increase the rate of uptake of research findings dramatically.38 The President’s New Freedom Commission on Mental Health specifically charges researchers to work in conjunction with community partners to bridge the gap between science and service.39 Participatory research within a partnership builds on the strengths of the various partners (researchers, clinicians, patients, administrators). The reciprocity, equity, and transparency of participatory research over the long run are needed to build the trust required to understand and address health disparities among disadvantaged populations.40 Despite strong calls for participatory research, there is a paucity of systematic studies on collaborative partnerships between mental health services researchers and rural community stakeholders.37,41 Although embedding researchers within clinical organizations is a highly promising approach to understanding and facilitating implementation, we know little about how to make such partnerships effective.25
Differences in goals, methods, and timelines between scientists and community stakeholders create fundamental challenges to sustaining partnerships. Scientists are expected to apply meticulous research methodologies to generate incremental advances in knowledge, while providers are expected to quickly solve acute problems in their clinics.42 Partnerships must be forged that align the goals, methods, and timelines of researchers with those of community stakeholders. Successful collaborations require researchers to revise expectations concerning scientific rigor (i.e., internal validity).17,18 To be successful, the researchers in the Implementation Partnership will need to demonstrate flexibility in specifying the goals, methods, and timeframes of the research to fit the needs and priorities of the community stakeholders.43,44
The conceptualization of the Implementation Partnership is based on Wells’ Evidence-Based Community Partnership Model which “…differs from a fully participatory process, which could lead to more sustained change, but not necessarily to use of evidence-informed strategies.” This integrated model blends the traditions of evidence-based QI and participatory research.22 The Implementation Partnership is structured according to the 12 guiding principles outlined by Jones and Wells: 1) shared decision-making and power between academics and community stakeholders; 2) use of written agreements and standard operating procedures; 3) frequent communication and direct forms of conflict resolution; 4) transparency; 5) understandability; 6) resource sharing; 7) respect for community values and timeframes; 8) scientific integrity; 9) academic productivity; 10) mutual reliance; 11) awareness of history and culture; and 12) development of capacity and leadership among community stakeholders.43
The structure of the Implementation Partnership includes a Steering Committee, EBQI Teams (focused on specific clinical disorders), two Advisory Boards and six Support Cores: 1) Clinical Training Core, 2) Data & Technology Core, 3) Evaluation Core, 4) Technical Writing Core, 5) Financing Core, and 6) Sustainability Core. The National Advisory Board includes clinical research experts in relevant clinical disorders and in implementation science. The Community Advisory Board includes members of the patient-majority community boards, including representation from minority groups. Our academic-community partnership initially developed through the NIMH-funded OUTREACH study (R01 MH076908). We developed this study to determine whether it is more effective for small rural PC clinics to provide collaborative care services on-site or to contract with an off-site depression care team via telemedicine. To initiate the partnership formation, we first presented the proposed study to the CHCA Executive Director and Clinical QI Manager and later presented a preliminary study design to the CHC Medical Directors at a quarterly Provider Board meeting. In partnership with local CHC staff, the OUTREACH team adapted the collaborative care intervention for CHCs and jointly developed the study design. This participatory process resulted in a more feasible study design. Through this RCT, the on-site research and clinical staff in the CHCs became more engaged with the academic medical researchers leading to our strong current academic-community partnership.
A Steering Committee composed of faculty from the academic health center, staff from the Arkansas primary care association (CHCA), and Arkansas community health center staff/patients governs the Implementation Partnership. A key component of successful participatory research models is written agreements that outline the partnership’s mission, common values, responsibilities, leadership structure, operating procedures, data ownership, product review, guidelines for publications and press releases, and conflict resolution procedures. We base our Implementation Partnership’s operating procedures on the Collaboration Agreement developed by Jones and Wells.43 The Steering Committee will conduct strategic planning based on ongoing formative evaluation of the partnership to re-evaluate priorities and progress toward goals, conduct an annual review of budget and personnel allocation to ensure they are aligned with priorities, serve as the Data and Safety Monitoring Board, and disseminate findings to stakeholders. It will be critical to disseminate results to CHC administrators, providers, and patients in a timely manner and to tailor the information to meet the administrative and clinical needs of all the CHC stakeholders.
PARiHS Implementation Framework
In this study, we use the PARiHS framework to guide our implementation strategy.30 The PARiHS framework proposes that successful implementation of EBPs is a function of: 1) evidence, 2) context, and 3) facilitation.45 Researchers have traditionally considered evidence to be the clinical outcomes of interventions tested in randomized controlled efficacy trials and then re-evaluated in effectiveness studies conducted under “real world” conditions.21 We adopt Sackett’s definition of evidence-based medicine: “the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient. It means integrating individual clinical expertise with the best available external clinical evidence from systematic research.”46 We adopt this definition for our proposed participatory research because it weighs equally the importance of research findings and clinical experience. We also argue that clinicians and policy makers will be more likely to rely on locally generated “practice-based evidence” than on the scientifically generated evidence base.21,22,47-50
Organizational Context, including organizational culture, climate, and capacity,51-58 is expected to influence the adoption of EBPs. Culture is generally conceptualized as the values and expectations of an organization.59 Climate is generally conceptualized as the activities and experiences of workers.59 Capacity is generally conceptualized as the ability to make changes (e.g., resources, skills, etc.).60 We expect that, compared to large urban clinics, small rural clinics have very different organizational culture, climate, and capacity. CHC culture and capacity are being assessed quantitatively using the Organizational Social Context survey,59,61 which is being administered anonymously to clinical staff. During site visits, researchers are also conducting non-participant observations of clinical and administrative operations to better understand each program’s common and accepted ways of doing things and to get a sense of staff cohesion, conflict, and burnout. In addition, key informant interviews with clinical and administrative leaders are addressing organizational capacity for QI and organizational culture and climate. The quantitative and qualitative data are being summarized into organizational profiles that will be used to assess capacity for EBP adoption, to customize facilitation methods for each CHC, and to aid in interpretation of implementation outcomes.
Facilitation typically involves a set of integrated strategies including identifying and engaging key stakeholders at all organizational levels, problem identification and resolution, assistance with technical issues, academic detailing, marketing, staff training, audit and feedback, quality improvement, and fostering role modeling.62,63 We are using a well-validated facilitation method known as Evidence-Based Quality Improvement (EBQI).64,65 EBQI was developed by Rubenstein and colleagues based on the findings of the Mental Health Awareness Project, which compared two quality improvement strategies for depression in PC.66 Clinics were randomized to a top-down or a bottom-up model.66 The top-down approach involved centralized experts implementing depression evidence-based practices with some input from local PC staff. The bottom-up approach involved local clinical staff implementing depression evidence-based practices with some input from experts. The bottom-up quality improvement teams had both the best and worst outcomes in terms of fidelity to the evidence base.66 This finding suggests that the bottom-up approach has the best potential for quality improvement, but is subject to substantial variation depending on local climate, culture, and capacity.66 Based on these findings, the EBQI model was developed to combine centralized strategic decision-making with local tactical decision-making.67 There is a growing consensus among implementation experts68-70 and frontline clinicians and managers66,67,71 that quality improvement strategies incorporating both top-down and bottom-up approaches hold the most promise for sustained implementation of evidence-based practices.
In EBQI, both researchers (clinical experts, implementation experts) and local staff participate fully in the quality improvement process, with the researchers facilitating rather than dictating implementation efforts.66,67,71 Thus, EBQI is intended to foster a researcher/clinician partnership that promotes buy-in.72,73 While emphasizing the involvement of outside experts and empirical evidence, EBQI stresses that an organization’s own healthcare professionals and staff are best positioned to improve their systems.72 Clinicians and administrators contribute the local knowledge needed to tailor the evidence-based practice to their own particular needs and organizational capabilities. Researchers contribute knowledge of the evidence base and the tools needed for successful implementation. In addition to providing expertise, researchers in the EBQI model also facilitate problem solving and provide ongoing technical support. EBQI also emphasizes continuously revising the adapted evidence-based practice based on feedback during Plan-Do-Study-Act cycles, and thus should lead to adapted evidence-based practices that are robust, user-friendly, and feasible to deploy in real-world practice settings.
Evaluation of Implementation Outcomes
The evaluation of implementation outcomes is based on the RE-AIM Framework (Reach, Effectiveness, Adoption, Implementation, Maintenance).33-36 Adoption represents the absolute number/proportion of staff who use the EBP.33 Reach represents the absolute number/proportion of eligible patients who receive the EBP.33 Implementation represents the fidelity of the EBP as implemented in routine care.33 Effectiveness represents the clinical impact (on patient outcomes) of the EBP as implemented in routine care settings.35 Maintenance represents the degree to which the implementation of the EBP is sustained.33 To have an impact on health of the target population, an EBP must be adopted by providers, reach a large proportion of the targeted patient population, be implemented with fidelity, effectively improve outcomes, and be maintained after research funds are withdrawn.
For each adopted EBP, implementation outcomes initially will be measured during a pilot test conducted at one clinic of each CHC system. Once the Plan-Do-Study-Act cycle is completed, an implementation trial will be conducted in which clinics will be randomized to implementation sites or wait list control sites. CHC staff will collect all implementation outcome data during the pilot tests and the implementation trials. The Data and Technology Core will develop procedures for collecting, tracking, storing, and analyzing data for these evaluations, as well as evaluating and deploying technologies to support data collection and analysis efforts. These implementation outcome data will provide the CHCs with internally valid local evidence about the adapted EBP in a timely manner. In Table 1, we display how both the implementation framework and evaluation will be used in an example demonstrating the implementation of an EBP for alcohol use disorders in rural CHCs.
Table 1.
PARiHS Implementation and RE-AIM Evaluation Frameworks: An Example with Alcohol Use Disorders (AUDs)
| Implementation Framework (PARiHS) | Examples: Implementation of an EBP for Alcohol Use Disorders (AUDs) |
|---|---|
| Evidence | |
| Context | |
| Facilitation using Evidence-Based Quality Improvement (EBQI) | |
| Implementation Outcomes Evaluation Framework (RE-AIM) | Examples: Evaluation of Implementation of an EBP for Alcohol Use Disorders (AUDs) |
| Provider Adoption | Adoption will be defined as the percent of PC providers who screen their patients and provide advice, brief counseling, outcomes monitoring, and/or referral to those screening positive. |
| Patient Reach | Reach will be defined by the percent of patients screened for AUDs and the percent of those screening positive who received advice to reduce intake, brief counseling, outcomes monitoring, and/or referral for special alcohol treatment. |
| Implementation Fidelity | Fidelity will be defined as the accurate scoring of screeners and concordance with brief intervention protocols. |
| Clinical Effectiveness | Effectiveness will be measured using a site-level randomized study design. |
| Maintenance | Maintenance will be defined as continuous implementation of the EBP during the year-long Implementation Trial and intent to sustain implementation after research funding ceases, as measured during brief phone interviews with Executive Directors and Medical Directors. |
Evaluation of Partnership
We are conducting a formative evaluation of the Implementation Partnership. Ongoing evaluation will enable us to determine which elements of the partnership are successful and which need improvement. We will use data from the formative evaluation to revise the activities and policies of the partnership in order to make it function more effectively and equitably. The formative evaluation focuses on: 1) the extent of shared decision-making, 2) the efficiency and equity of the EBQI facilitation process, and 3) CHC capacity to conduct their own QI evaluations.
The shared decision-making component of the formative evaluation is based on Naylor’s six elements common to participatory research: 1) identification of clinical issues to target; 2) definition of clinical and research goals; 3) sources of funding; 4) specification of evaluation methodologies; 5) definition of outcome measures; and 6) sustainability of changes and spread to other areas.37 Using the quantitative scale developed by Naylor to measure the extent of researcher/community participation, the Steering Committee members will rate the level of participation for each element along a continuum (researcher driven, cooperation by community partners, equal participation by community partners, or community driven). Researchers will explore these ratings during follow-up key informant interviews. In contrast to formal qualitative analysis methods, we will utilize rapid content analysis techniques so that we can feed back information to the Steering Committee in real time.74,75
We will also conduct ongoing formative evaluations of the EBQI process during the implementation initiatives to ensure that it operates efficiently and equitably. We define the EBQI process as being “efficient and equitable” under the following conditions: 1) the correct stakeholder groups are represented, 2) all stakeholders are engaged and participate fully, 3) all stakeholders consider the process balanced across research/clinical needs, 4) participants reach consensus concerning the adaptation of the EBP, 5) community participants’ burden is at an acceptable level, and 6) all stakeholders report high levels of satisfaction with the process.14,43,44 To explore these issues, we will conduct interviews with the members of the EBQI teams. Again, we will utilize rapid content analysis techniques so that we can feed back timely information to the EBQI Teams and inform their implementation process in real time.74,75
For the goal of developing an internal capacity for evaluation, the Steering Committee will set yearly goals for CHC staff taking methods courses, receiving training from UAMS faculty, and conducting valid evaluations of the implementation initiatives. We will interview UAMS research faculty conducting the trainings regarding the progress of CHC staff, and will evaluate data analyses conducted by CHC staff for completeness/accuracy.
Conclusions
Previous efforts to implement EBPs for depression in CHCs have proved challenging. For example, the Institute for Healthcare Improvement’s (IHI) Breakthrough Series model used a series of learning sessions and action periods to implement Chronic Care models for specific disorders, including a Depression Collaborative.76 The Breakthrough Series allowed CHC staff to consult with national experts and learn from each other’s experiences implementing the Chronic Care models. For the Depression Collaboratives, CHCs collected data on performance measures (e.g., documented PHQ9 reassessment, and 50% reduction in PHQ9 score). The initial outcomes of the IHI Depression Breakthrough Series were good;77 however, the CHCs sustained few of the depression care improvements over time.78
Considering these less than optimal outcomes, the field clearly needs to develop, test, refine, and disseminate new and more intensive QI models for deploying mental health EBPs in Federally Qualified Health Centers, especially those serving rural populations.79 In this paper, we have presented a conceptual model that integrates seven separate frameworks. By integrating these various frameworks into a meaningful conceptual model, we hope to develop a successful Implementation Partnership between an academic health center and small rural CHCs to achieve and sustain improvements in mental health outcomes. Findings and products from the Implementation Partnership in Arkansas should have relevance to hundreds of clinics and millions of patients, and could help promote the sustained adoption of EBPs across rural America.
Acknowledgments
The work of our partnership is supported through the Interventions and Practice Research Infrastructure Program (R24 MH085104 awarded to Drs. Fortney and Curran and Ms. Mouden) from the National Institute of Mental Health (NIMH). We thank the other members of the Implementation Partnership Steering Committee: Terry Hill, RNP (ARCare, Inc.), Holli Banks-Giles, MD (East Arkansas Family Health Center), Lela Alston (community board representative with Jefferson Comprehensive Care System, Inc.), and Micah Hester, PhD (UAMS Medical Ethicist), for their helpful comments on the manuscript. Support has also been provided in part by the Arkansas Biosciences Institute, the major research component of the Arkansas Tobacco Settlement Proceeds Act of 2000 (Dr. Hunt).
Reference List
- 1.National Association of Community Health Centers. US Health Center Fact Sheet 2009. 2010 [Google Scholar]
- 2.Druss BG, Bornemann T, Fry-Johnson YW, McCombs HG, Politzer RM, Rust G. Trends in Mental Health and Substance Abuse Services at the Nation’s Community Health Centers: 1998-2003. Am J Public Health. 2006;96:1779–1784. doi: 10.2105/AJPH.2005.076943. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Institue of Medicine. America’s Health Care Safety Net. Washington, D.C.: 2000. [Google Scholar]
- 4.Rost K, Fortney J, Fischer E, Smith J. Use, quality and outcomes of care for mental health: The rural perspective. Med Care Res Rev. 2002;59(3):231–265. doi: 10.1177/1077558702059003001. [DOI] [PubMed] [Google Scholar]
- 5.Geller JM, Muus KJ. The role of rural primary care physicians in the provision of mental health services: Letter to the field no. 5. [1/14/02];1997 Available at: http://www.wiche.edu/MentalHealth/Frontier/index.htm.
- 6.Geller JM. Rural primary care providers’ perceptions of their roles in the provision of mental health services: Voices from the plains. J Rural Health. 1999;15(3):326–334. doi: 10.1111/j.1748-0361.1999.tb00754.x. [DOI] [PubMed] [Google Scholar]
- 7.Fox J, Merwin E, Blank M. De facto mental health services in the rural south. Journal of Healthcare for the Poor and Underserved. 1995;6(4):434–468. doi: 10.1353/hpu.2010.0003. [DOI] [PubMed] [Google Scholar]
- 8.Hartley D, Bird DC, Dempsey P. Rural mental health and substance abuse. In: Ricketts T, editor. Rural Health in the United States. New York: Oxford University Press; 1999. pp. 159–178. [Google Scholar]
- 9.Federal Office of Rural Health Policy. Facts about rural physicians. U.S. Department of Health and Human Services; 1997. [Google Scholar]
- 10.Reschovsky JD, Staiti AB. Access and quality: Does rural America lag behind? Health Aff. 2005;24(4):1128–1139. doi: 10.1377/hlthaff.24.4.1128. [DOI] [PubMed] [Google Scholar]
- 11.Rost K, Nutting P, Smith J, Coyne JC, Cooper-Patrick L, Rubenstein L. The role of competing demands in the treatment provided primary care patients with major depression. Arch Fam Med. 2000;9(2):150–154. doi: 10.1001/archfami.9.2.150. [DOI] [PubMed] [Google Scholar]
- 12.Hauenstein EJ, Petterson S, Rovnyak V, Merwin E, Heise B, Wagner D. Rurality and mental health treatment. Admin Ment Health. 2007;34(3):255–267. doi: 10.1007/s10488-006-0105-8. [DOI] [PubMed] [Google Scholar]
- 13.Stange KC, Goodwin MA, Zyzanski SJ, Dietrich AJ. Sustainability of a practice-individualized preventive service delivery intervention. Am J Prev Med. 2003;25:296–300. doi: 10.1016/s0749-3797(03)00219-8.
- 14.Parker LE, dePillis E, Altschuler A, Rubenstein LV, Meredith LS. Balancing participation and expertise: A comparison of locally and centrally managed health care quality improvement within primary care practices. Qualitative Health Research. 2007;17:1268–1279. doi: 10.1177/1049732307307447.
- 15.Hagedorn H, Hogan M, Smith JL, Bowman C, Curran GM, Espadas D, et al. Lessons learned about implementing research evidence into clinical practice. Experiences from VA QUERI. J Gen Intern Med. 2006;21(Suppl 2):S21–S24. doi: 10.1111/j.1525-1497.2006.00358.x.
- 16.Sharek PJ, Mullican C, Lavanderos A, Palmer C, Snow V, Kmetik K, et al. Best practice implementation: Lessons learned from 20 partnerships. Joint Commission Journal on Quality and Patient Safety. 2007;33:16–26. doi: 10.1016/s1553-7250(07)33120-6.
- 17.Hohmann AA, Shear MK. Community-based intervention research: Coping with the “noise” of real life in study design. Am J Psychiatry. 2002;159(2):201–207. doi: 10.1176/appi.ajp.159.2.201.
- 18.Duan N, Gonzales J, Braslow J, Chambers D, Kravitz R. Evidence in mental health services research: What types, how much, and then what? Opening remarks. Presentation at the NIMH 15th International Conference on Mental Health Services Research; Washington, D.C.; April 2002.
- 19.Rye CB, Kimberly JR. The adoption of innovations by provider organizations in health care. Med Care Res Rev. 2007;64:235–278. doi: 10.1177/1077558707299865.
- 20.Shortell SM, Zazzali JL, Burns LR, Alexander JA, Gillies RR, Budetti PP, et al. Implementing evidence-based medicine: The role of market pressures, compensation incentives, and culture in physician organizations. Med Care. 2001;39(7 Suppl 1):I62–I78.
- 21.Sullivan G, Duan N, Mukherjee S, Kirchner J, Perry D, Henderson K. The role of services researchers in facilitating intervention research. Psychiatr Serv. 2005;56(5):537–542. doi: 10.1176/appi.ps.56.5.537.
- 22.Wells K, Miranda J, Bruce ML, Alegria M, Wallerstein N. Bridging community intervention and mental health services research. Am J Psychiatry. 2004;161(6):955–963. doi: 10.1176/appi.ajp.161.6.955.
- 23.Westfall JM, Mold J, Fagnan L. Practice-based research--“Blue Highways” on the NIH roadmap. J Am Med Assoc. 2007;297:403–406. doi: 10.1001/jama.297.4.403.
- 24.National Advisory Mental Health Council Services Research and Clinical Epidemiology Workgroup. The Road Ahead: Research Partnerships to Transform Services. Washington, D.C.: 2006.
- 25.Rubenstein LV, Pugh J. Strategies for promoting organizational and practice change by advancing implementation research. J Gen Intern Med. 2006;21(Suppl 2):S58–S64. doi: 10.1111/j.1525-1497.2006.00364.x.
- 26.Demakis JG, McQueen L, Kizer KW, Feussner JR. Quality Enhancement Research Initiative (QUERI): A collaboration between research and clinical practice. Med Care. 2000;38(6, suppl I):I17–I25.
- 27.Stetler CB, Mittman BS, Francis J. Overview of the VA Quality Enhancement Research Initiative (QUERI) and QUERI theme articles: QUERI Series. Implementation Science. 2008;3(8):1–9. doi: 10.1186/1748-5908-3-8.
- 28.Kizer KW, Demakis JG, Feussner JR. Reinventing VA health care: Systematizing quality improvement and quality innovation. Med Care. 2000;38(suppl I):I7–I16.
- 29.Francis J, Perlin JB. Improving performance through knowledge translation in the Veterans Health Administration. J Contin Educ Health Prof. 2006;26:63–71. doi: 10.1002/chp.52.
- 30.Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: Theoretical and practical challenges. Implementation Science. 2008;3(1):1–12. doi: 10.1186/1748-5908-3-1.
- 31.Rogers EM. Diffusion of Innovations. 3rd ed. New York: The Free Press, A Division of Simon & Schuster, Inc.; 1983.
- 32.Rogers EM. Diffusion of Innovations. 4th ed. New York, NY: The Free Press; 1995.
- 33.Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: The RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327. doi: 10.2105/ajph.89.9.1322.
- 34.Glasgow RE, McKay HG, Piette JD, Reynolds KD. The RE-AIM framework for evaluating interventions: What can it tell us about approaches to chronic illness management? Patient Education & Counseling. 2001;44(2):119–127. doi: 10.1016/s0738-3991(00)00186-5.
- 35.Glasgow RE. Translating research to practice: Lessons learned, areas for improvement, and future directions. Diabetes Care. 2003;26(8):2451–2456. doi: 10.2337/diacare.26.8.2451.
- 36.Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–1267. doi: 10.2105/ajph.93.8.1261.
- 37.Naylor PJ, Wharf-Higgins J, Blair L, Green L, O’Connor B. Evaluating the participatory process in a community-based heart health project. Social Science & Medicine. 2002;55:1173–1187. doi: 10.1016/s0277-9536(01)00247-7.
- 38.Nutting PA, Beasley JW, Werner JJ. Practice-based research networks answer primary care questions. J Am Med Assoc. 1999;281(8):686–688. doi: 10.1001/jama.281.8.686.
- 39.New Freedom Commission on Mental Health. Achieving the Promise: Transforming Mental Health Care in America. Final Report. Rockville, MD: 2003. Report No. SMA-03-3832.
- 40.Corbie-Smith G, Williams IC, Blumenthal C, Dorrance J, Estroff SE, Henderson G. Relationships and communication in minority participation in research: Multidimensional and multidirectional. Journal of the National Medical Association. 2007;99:489–498.
- 41.Garland AF, Plemmons D, Koontz L. Research-practice partnership in mental health: Lessons from participants. Administration and Policy in Mental Health and Mental Health Services Research. 2006;33:517–528. doi: 10.1007/s10488-006-0062-2.
- 42.Spoth R. Opportunities to meet challenges in rural prevention research: Findings from an evolving community-university partnership model. J Rural Health. 2007;23(Suppl):42–54. doi: 10.1111/j.1748-0361.2007.00123.x.
- 43.Jones L, Wells K. Strategies for academic and clinician engagement in community-participatory partnered research. J Am Med Assoc. 2007;297(4):407–410. doi: 10.1001/jama.297.4.407.
- 44.Parker LE, Kirchner JE, Bonner LM, Fickel JJ, Ritchie MJ, Simons CE, et al. Creating a quality improvement dialogue: Utilizing knowledge from frontline staff, managers, and experts to foster healthcare quality improvement. Qualitative Health Research. 2009;19(2):229–242. doi: 10.1177/1049732308329481.
- 45.Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: A conceptual framework. Qual Health Care. 1998;7(3):149–158. doi: 10.1136/qshc.7.3.149.
- 46.Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence based medicine: What it is and what it isn’t. Br Med J. 1996;312:71–72. doi: 10.1136/bmj.312.7023.71.
- 47.Hay MC, Weisner TS, Subramanian S, Duan N, Niedzinski EJ, Kravitz RL. Harnessing experience: Exploring the gap between evidence-based medicine and clinical practice. J Eval Clin Pract. 2008;14:707–713. doi: 10.1111/j.1365-2753.2008.01009.x.
- 48.Druss BG. Medicine-based evidence in mental health. Psychiatr Serv. 2005;56:543. doi: 10.1176/appi.ps.56.5.543.
- 49.Manderscheid RW. Some thoughts on the relationships between evidence based practices, practice based evidence, outcomes, and performance measures. Admin Policy Ment Health. 2006;33:646–647. doi: 10.1007/s10488-006-0056-0.
- 50.Margison FR, Barkham M, Evans C, McGrath G, Clark JM, Audin K, et al. Measurement and psychotherapy: Evidence-based practice and practice-based evidence. Br J Psychiatry. 2000;177:123–130. doi: 10.1192/bjp.177.2.123.
- 51.Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: A conceptual framework. Qual Health Care. 1998;7(3):149–158. doi: 10.1136/qshc.7.3.149.
- 52.Parker LE, Fickel JJ, Yano EM, Simon C, Bonner LM, Ritchie MJ, et al. Organizational context and adoption of new clinical practices. 2009.
- 53.Steckler A, Goodman RM, Kegler MC. Mobilizing organizations for health enhancement: Theories of organizational change. In: Glanz K, Rimmer BK, Lewis FM, editors. Health Behavior and Health Education: Theory, Research and Practice. San Francisco, CA: Jossey-Bass; 2002. pp. 335–360.
- 54.Glisson C, Green P. The effects of organizational culture and climate on the access to mental health care in child welfare and juvenile justice systems. Administration and Policy in Mental Health and Mental Health Services Research. 2006;33(4):433–448. doi: 10.1007/s10488-005-0016-0.
- 55.Owen RR, Williams DK, Thrush CR, Hudson TJ, Armitage TL, Thapa P. The effect of guideline implementation strategies for schizophrenia on symptom outcomes. 2005.
- 56.Glisson C. The organizational context of children’s mental health services. Clinical Child and Family Psychology Review. 2002;5(4):233–253. doi: 10.1023/a:1020972906177.
- 57.Glisson C, James LR. The cross-level effects of culture and climate in human service teams. Journal of Organizational Behavior. 2002;23:767–794.
- 58.Schutte K, Yano EM, Kilbourne AM, Wickrama B, Kirchner JE, Humphreys K. Organizational contexts of primary care approaches for managing problem drinking. J Subst Abuse Treat. In press. doi: 10.1016/j.jsat.2008.09.002.
- 59.Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood KE, Mayberg S, et al. Assessing the Organizational Social Context (OSC) of mental health services: Implications for research and practice. Administration and Policy in Mental Health. 2008;35:98–113. doi: 10.1007/s10488-007-0148-5.
- 60.Litaker D, Ruhe M, Flocke S. Making sense of primary care practices’ capacity for change. Translational Research. 2008;152:245–253. doi: 10.1016/j.trsl.2008.09.005.
- 61.Glisson C, Schoenwald SK, Kelleher K, Landsverk J, Hoagwood KE, Mayberg S, et al. Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Admin Policy Ment Health. 2008;35:124–133. doi: 10.1007/s10488-007-0152-9.
- 62.Stetler CB, Legro MW, Rycroft-Malone J, et al. Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science. 2006;1:23. doi: 10.1186/1748-5908-1-23.
- 63.Curran GM, Mukherjee S, Allee E, Owen RR. A process for developing implementation interventions: QUERI Series. Implementation Science. 2008;3:17. doi: 10.1186/1748-5908-3-17.
- 64.Rubenstein LV, Parker LE, Meredith LS, et al. Understanding team-based quality improvement for depression in primary care. Health Serv Res. 2002;37(4):1009–1029. doi: 10.1034/j.1600-0560.2002.63.x.
- 65.Rubenstein LV, Mittman BS, Yano EM, Mulrow CD. From understanding health care provider behavior to improving health care: The QUERI framework for quality improvement. Quality Enhancement Research Initiative. Med Care. 2000;38(6 Suppl 1):I129–I141.
- 66.Rubenstein LV, Parker LE, Meredith LS, Altschuler A, dePillis E, Hernandez J, et al. Understanding team-based quality improvement for depression in primary care. Health Serv Res. 2002;37(4):1009–1029. doi: 10.1034/j.1600-0560.2002.63.x.
- 67.Parker LE, dePillis E, Altschuler A, Rubenstein LV, Meredith LS. Balancing participation and expertise: A comparison of locally and centrally managed health care quality improvement within primary care practices. Qualitative Health Research. 2007;17:1268–1279. doi: 10.1177/1049732307307447.
- 68.Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly. 2004;82(4):581–629. doi: 10.1111/j.0887-378X.2004.00325.x.
- 69.Ginsburg LR, Lewis S, Zackheim L, Casebeer A. Revisiting interaction in knowledge translation. Implementation Science. 2007;2:34. doi: 10.1186/1748-5908-2-34.
- 70.Stange KC, Goodwin MA, Zyzanski SJ, Dietrich AJ. Sustainability of a practice-individualized preventive service delivery intervention. Am J Prev Med. 2003;25:296–300. doi: 10.1016/s0749-3797(03)00219-8.
- 71.Parker LE, Kirchner JE, Bonner LM, Fickel JJ, Ritchie MJ, Simons CE, et al. Creating a quality improvement dialogue: Utilizing knowledge from frontline staff, managers, and experts to foster healthcare quality improvement. Qualitative Health Research. 2009;19(2):229–242. doi: 10.1177/1049732308329481.
- 72.Rubenstein LV, Mittman BS, Yano EM, Mulrow CD. From understanding health care provider behavior to improving health care: The QUERI framework for quality improvement. Quality Enhancement Research Initiative. Med Care. 2000;38(6 Suppl 1):I129–I141.
- 73.Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB. Interventions in organizational and community context: A framework for building evidence on dissemination and implementation in health services research. Admin Policy Ment Health. 2008;35(1-2):21–37. doi: 10.1007/s10488-007-0144-9.
- 74.Sobo E, Simmes D, Landsverk J, Kurtin P. Rapid assessment with qualitative telephone interviews: Lessons from an evaluation of California’s Healthy Families program & Medi-Cal for children. American Journal of Evaluation. 2003;24(3):399–408.
- 75.Sobo EJ. Parents’ perceptions of pediatric day surgery risks: Unforeseeable complications, or avoidable mistakes? Social Science & Medicine. 2005;60:2341–2350. doi: 10.1016/j.socscimed.2004.10.006.
- 76.Landon BE, Hicks LS, O’Malley AJ, Lieu TA, Keegan T, McNeil BJ, et al. Improving the management of chronic disease at community health centers. N Engl J Med. 2007;356:921–934. doi: 10.1056/NEJMsa062860.
- 77.Katzelnick DJ, Von Korff M, Chung H, Provost LP, Wagner EH. Applying depression-specific change concepts in a collaborative breakthrough series. Joint Commission Journal on Quality and Patient Safety. 2005;31:386–397. doi: 10.1016/s1553-7250(05)31052-x.
- 78.Meredith LS, Mendel P, Pearson M, Wu S, Joyce G, Straus JB, et al. Implementation and maintenance of quality improvement for treating depression in primary care. Psychiatr Serv. 2006;57(1):48–55. doi: 10.1176/appi.ps.57.1.48.
- 79.Shojania KG, Grimshaw JM. Still no magic bullets: Pursuing more rigorous research in quality improvement. Am J Med. 2004;116:778–780. doi: 10.1016/j.amjmed.2004.03.003.
