Author manuscript; available in PMC: 2013 Jun 3.
Published in final edited form as: Eval Program Plann. 2011 Mar 2;34(4):366–374. doi: 10.1016/j.evalprogplan.2011.02.003

Employing Policy and Purchasing Levers to Increase the Use of Evidence-Based Practices in Community-Based Substance Abuse Treatment Settings: Reports from Single State Authorities

Traci R Rieckmann 1, Anne E Kovas 2, Elaine F Cassidy 3, Dennis McCarty 1
PMCID: PMC3670771  NIHMSID: NIHMS468948  PMID: 21371753

Abstract

State public health authorities are critical to the successful implementation of science-based addiction treatment practices by community-based providers. The literature to date, however, lacks examples of state-level policy strategies that promote evidence-based practices (EBPs). This mixed-methods study documents changes in two critical state-to-provider strategies aimed at accelerating use of evidence-based practices: purchasing levers (financial incentives and mechanisms) and policy or regulatory levers. A sample of 51 state representatives was interviewed. Single State Authorities for substance abuse treatment (SSAs) that fund providers directly or through managed care were significantly more likely to have contracts that required or encouraged evidence-based interventions, as compared to SSAs that fund providers indirectly through sub-state entities. Policy levers included EBP-related legislation, language in rules and regulations, and evidence-based criteria in state plans and standards. These differences in state policy are likely to result in significant state-level variations in both the extent to which EBPs are implemented by community-based treatment providers and the quality of implementation.

Keywords: evidence-based practices, implementation, evaluation, purchasing, policy

1. Introduction

1.1. State of evidence-based practice implementation

Quality services for alcohol and drug treatment depend on implementation of practices that integrate best research evidence, clinical expertise, and patient values (Institute of Medicine, 2001, 2006; President's New Freedom Commission on Mental Health, 2003; Substance Abuse and Mental Health Services Administration, 2002). However, there is a gap between identification and implementation of effective practices (Garner, 2009; Glasgow, Lichtenstein, & Marcus, 2003; Miller, Sorensen, Selzer, & Brigham, 2006). A recent review also reported that only nine of 65 (14%) substance abuse treatment research studies specifically focused on evidence-based treatment implementation (Garner, 2009). In order for any new intervention to achieve widespread clinical use, stakeholders must be convinced of its effectiveness and support its use, providers must have adequate and dependable funding, and programs must engage in planning, needs assessment, and implementation activities (Ling, Cunningham-Rathner, & Rawson, 2004; Rugs, Hills, Moore, & Peters, 2011; Thomas, Wallack, Lee, McCarty, & Swift, 2003). One important part of this process that has not been examined sufficiently in addiction science is the role of public health policy to accelerate use of evidence-based practices and improve the quality of care at the community level. Such policies have the potential to create significant systems-wide change and long-term effects at relatively low costs (Brownson, Chriqui, & Stamatakis, 2009). The literature to date, however, lacks examples of policy changes that promote use of evidence-based practices by community service providers.

The translation of research to practice is complicated by a multidimensional service delivery system that varies by state, county, and provider, as well as diverse and unpredictable funding streams (Author, 2010). In addition, there are no specific federal policies or mandates for governing and evaluating use of evidence-based practices (EBPs). There are, however, an increasing number of initiatives to promote dissemination of EBPs. The Substance Abuse and Mental Health Services Administration (SAMHSA) maintains an online National Registry of Evidence-Based Programs and Practices (http://nrepp.samhsa.gov) and publishes the Treatment Improvement Protocol Series (TIPS) best practice guidelines. SAMHSA and the Addiction Technology Transfer Centers (ATTCs) also collaborate with the National Institute on Drug Abuse’s Clinical Trials Network (NIDA CTN) in the NIDA/SAMHSA Blending Initiative, which seeks to advance the diffusion of research into practice (Condon, Miner, Balmer, & Pintello, 2008). The National Quality Forum also recently disseminated a consensus report that provides a framework for the continuum of addiction services and recommendations for improving the quality of care (NQF, 2007). Finally, the Robert Wood Johnson Foundation (RWJF) leads the Advancing Recovery Initiative to encourage system change through partnerships of state and provider agencies and promotes implementation of evidence-based practices within community-based treatment programs (Author, 2009). However, to date, little is known about individual state initiatives to promote implementation of EBPs and the impact of such efforts on substance abuse treatment services.

1.2. Single State Authorities and substance abuse treatment

States play an important role in adoption of new clinical practices in community-based treatment programs. As the largest source of public funding for substance abuse treatment (40% of total public funding in 2003), state and local governments provide leadership to drive change and influence the quality of care in substance abuse and mental health services (Boyle, 2009; Ducharme & Abraham, 2008; Mark, Levit, Vandivort-Warren, Coffey, Buck, et al., 2007).

The Hughes Act (Comprehensive Alcohol Abuse and Alcoholism Prevention, Treatment and Rehabilitation Act of 1970, P.L. 91-616) required each state to establish a Single State Authority (SSA) to administer federal substance abuse funds and develop and regulate services (Author, 2010). Although the basic framework applies to all states, individual states’ public substance abuse systems vary in organization and financing (Gold, Glynn, & Mueser, 2006) and must work within the confines of competing financial and political challenges and priorities (Lynde, 2005). Thus, the SSAs in each state and the District of Columbia work directly (and indirectly through sub-state entities) to provide services at regional, county, and local levels, and to facilitate the development, implementation, and evaluation of prevention and treatment services for addiction.

The current project examines two critical state mechanisms: purchasing levers and policy or regulatory strategies that directly impact programs, providers, and ultimately clients in addiction services. States may employ a range of policies and strategies, including rules, regulations, standards, authorizations, and state plans, to define, structure, and monitor services (Chriqui, Eidson, McBride, Scott, Cappocia, et al., 2006; Chriqui, Terry-McElrath, McBride, Eidson, & VanderWaal, 2007; Chriqui, Terry-McElrath, McBride, & Eidson, 2008). States also can implement purchasing levers that control treatment provider resources and performance standards. This approach is consistent with findings from Gelber and Rinaldo (2005) and recommendations from Rosenbloom and colleagues in their “Blueprint for the States” (Rosenbloom, Leis, Shah, & Ambrogi, 2006). In the mental health field, Rapp and colleagues identified seven tasks for implementing and sustaining EBPs, including strategic planning, by which state mental health authorities (SMHA) position themselves to meet new challenges (Rapp, Bond, Becker, Carpinello, Nikkel, et al., 2005). Additionally, to assess the impact of the SMHAs’ leadership role in implementing EBPs, Finnerty, Rapp and colleagues (2009) developed the State Health Authority Yardstick (SHAY), which captures EBP mandates, or explicit policies and regulations that may be incorporated into contracts and other mechanisms (Finnerty, Rapp, Bond, Lynde, Ganju, et al., 2009). In fact, research from mental health suggests that state efforts may be shifting away from leaders who advocate for EBP implementation toward more explicit fiscal and regulatory drivers (Bruns, Hoagwood, Rivard, Wotring, Marsenich, et al., 2008).

Little research, however, examines the use, implementation, and impact of policy and purchasing levers aimed at improving quality of care in substance abuse treatment services. In a seminal report in the field of addiction services, the National Quality Forum’s Consensus Standards (2007) recommended four evidence-based practice domains (identification of substance use disorder, treatment initiation and engagement, behavioral and pharmacological therapies, and continuing care). An initial report (NQF, 2005) also recommended that purchasers of service (i.e., SSAs) use regulatory strategies and financial incentives to promote EBPs. Because these are the first consensus standards on the use of evidence-based treatment for substance use disorders, evaluation of the specific financial and regulatory policies and the associated planning and evaluation activities is warranted.

1.3. Purchasing levers and evidence-based practice

When purchasing substance abuse treatment services, SSAs use competitive grants or contracts to finance and manage service delivery and their relationship with public sector providers. Because SSAs both hold the funds and provide oversight for addiction services, each SSA’s decisions directly affect what happens each day in community-based treatment programs. More specifically, SSAs may fund providers directly by negotiating contracts with specific programs, through a managed care organization, or indirectly through transfer funds with counties or sub-state entities. Funding providers directly (or by directing a managed care organization to provide services) places contractual authority at the SSA level and makes providers directly responsible to the SSA, which creates a mechanism for influencing and evaluating service delivery (Marton, Daigle, & de la Gueronniere, 2005).

As the primary purchasers of public sector treatment services, SSAs can use contractual requirements as purchasing levers to promote the use of science-based addiction treatment practices, and thus their decisions impact client outcomes (Boyle, 2009; Finnerty, Rapp, Bond, Lynde, Ganju, et al., 2009; Finnerty, Rapp, & Bond, 2005; Institute of Medicine, 2006; Rapp, Bond, Becker, Carpinello, Nikkel, et al., 2005). Contractual requirements specifically state the providers’ obligations for clinical care and outcome monitoring, and even which specific interventions or practices (e.g., Cognitive Behavioral Therapy, Motivational Interviewing) are required within the available resources (Center for Substance Abuse Treatment, 1998; Marton, Daigle, & de la Gueronniere, 2005; Miller, Zweben, & Johnson, 2005). As legally binding financial documents, contracts offer a practical, enforceable approach to EBP implementation. Grants and contracts awarded annually or biannually offer flexibility to revise provider language in response to changing needs and new technologies, and to ensure compliance and monitoring of services.

Marton and colleagues’ (2005) environmental scan of state purchasing levers for EBPs in substance abuse treatment suggested that the contractual process is the first and best opportunity to develop and define specific SSA goals and process outcomes to ensure high quality treatment services. By prioritizing EBPs, SSAs show their commitment to a particular standard of service, which in turn affects providers’ perceptions regarding the suitability of individual treatment modalities (Marton, Daigle, & de la Gueronniere, 2005).

1.4. Policy levers and evidence-based practice

Policy levers, regulations, standards and rules may also influence the type of treatment services clients receive. Through passage of state legislation and “standardization of criteria”, state purchasers and policy makers articulate clinical practice requirements and the regulatory mechanisms that will be used to evaluate compliance (Chriqui, Eidson, McBride, Scott, Cappocia, et al., 2006; Chriqui, Terry-McElrath, McBride, & Eidson, 2008; Marton, Daigle, & de la Gueronniere, 2005; Rapp, Bond, Becker, Carpinello, Nikkel, 2005). This mechanism can incorporate explicit specifications for EBP use into legislative language or regulations that community-based providers must adhere to if funded by the state. Standardization as a lever may arise through state legislation and resulting statutes, or within the SSA through publicly reviewed changes to regulations and administrative rules for substance abuse treatment services (Marton, Daigle, & de la Gueronniere, 2005). Statutes, regulations, and rules are codified into law and set health and safety standards. The SSA is responsible for interpreting and enforcing these standards (Gostin, 2002) within the confines of competing standards, varying social and political support, and limited resources. Thus, legislative mandates appear to correspond with the current state of the field and recommendations for accelerating the use of EBPs (Rosenbloom, Leis, Shah, & Ambrogi, 2006).

Standards defined by law may also be incorporated into state or strategic plans. These plans do not hold the force of law, but can be used to outline priorities and detail strategies within a defined timeframe and within current resource levels. State or strategic plans also may include achievable actions and deliverables, making them practical working documents. As such these tools also drive clinical decision-making and use of specific practices within community-based treatment settings.

Evaluation of state-specific activities designed to promote the use of evidence-based practice provides opportunities for comparison between states, and within individual states over time, to further document development, program-level implementation, and client impacts from state-driven strategies.

2. METHODS

This project is part of a longitudinal concurrent mixed-methods study assessing SSA strategies to improve quality of care in substance abuse treatment. This paper presents quantitative and qualitative data collected during the second phase of the study. Mixed-methods studies are especially relevant for translational and systems-oriented projects because the findings provide a detailed view or narrative that may complement findings from quantitative scales (Bernard, 1994; Briody & Baba, 1991). We also anticipated that some strategies and policies might be complicated, multifaceted, and might cross governing bodies, agencies, treatment programs, and state legal and fiscal entities. Finally, using both the quantitative findings and qualitative discussions applies a pragmatic approach that synthesizes consensus and conflict (Creswell, 2009; Morgan, 2008) to create a more complete understanding of the evidence-based practice movement and the role of the SSA.

2.2. Participants and procedures

This study employed an expert sampling strategy to obtain data from the most knowledgeable representatives, those whose particular job role dictates a unique understanding of the topic matter (Trochim & Donnelly, 2006). As is standard in qualitative research (Creswell, 2007; Crabtree & Miller, 1999), purposeful sampling ensured that we included the key policy makers, those determining where and how treatment funds are spent, as well as the individuals and teams implementing such policies. Thus, the expert sample includes SSA directors, deputies, and/or managers from each state and Washington, D.C. (N = 51). SSA representatives were identified from contacts made during the initial brief interviews (Author, 2009) and verified using the 2008 Single State Agency Directory published by SAMHSA. Identification of the appropriate representative started with the SSA director in each state, and in most cases, the director or deputy director (n = 36, 70.6%) completed the interview. Participants from the remaining SSAs were designated program managers knowledgeable about EBPs (n = 15, 29.4%). In some instances (n = 9, 17.6%) more than one individual contributed to the discussion. We were able to obtain a complete sample with representation from all 50 states and Washington, D.C.

2.3. Survey and key informant interview procedures

SSA representatives were contacted by email and phone to schedule 60 to 90 minute telephone interviews to answer questions about strategies to increase the use of EBPs. Information sheets were provided prior to each interview to review the purpose of the study, procedures, and future use of the data in reports and manuscripts. Semi-structured interviews and brief surveys were completed between April and October 2008. Study methods were reviewed and approved by the Oregon Health & Science University Institutional Review Board.

SSA representatives each described their office’s structure, the SSA’s placement in the state hierarchy, and whether the state authorities for mental health (SMHA) and substance abuse were co-located. The brief survey included Likert-type questions in which respondents ranked implementation of the National Quality Forum’s (2005, 2007) substance abuse EBP domains and their state’s use of strategies to increase implementation. The NQF domains were adapted slightly into five EBP categories: screening and brief intervention, psychosocial interventions, medication-assisted treatment, wraparound services, and aftercare and recovery management. Surveys also included yes/no questions on program authorization and provider certification. Specific to the current analysis, SSA representatives were asked, “How do state funds reach treatment providers?” Responses were categorized into (a) direct contracts or grants to providers; (b) contracts administered by a managed care company; or (c) indirect funding through counties or other jurisdictions. To identify purchasing levers, representatives were asked, “Do provider contracts/grants include language regarding use of EBPs for substance abuse treatment?” Finally, to identify policy levers, questions included, “Does your state have any legislative policy in place regarding EBPs for substance abuse treatment?” When appropriate, representatives were asked for documentation regarding EBP-related legislation, regulations, and contract language. Statutes, rules, and regulations were verified by examining documentation provided by the SSA representative and primary policy documents from state legislative websites. All primary documents were studied for inclusion of any EBP-related language.
The remainder of the interview included broad questions that allowed participants to respond objectively and subjectively to the topics: organizational structure, authorization/licensure, treatment provider funding, regulations and legislation affecting EBPs, staff functions and EBPs including EBP selection and monitoring of practices, vulnerable populations, and implementation of EBPs including SSA efforts with providers.

2.4. Data analysis

Quantitative data were entered into SPSS version 17.0 for analysis. Data entry was validated by a research assistant familiar with the instrument. After generating frequency tables, chi-square cross-tabulation analyses, one-way analysis of variance (ANOVA), and correlations were used to examine relationships between variables of interest.

To examine possible regional variations, states were categorized as Midwest, Northeast, South, or West, based on U.S. Census Bureau definitions. A second variable coded the SSA’s location in the state hierarchy: independent SSA (e.g., cabinet-level, reporting directly to Governor); under SMHA; or under Department of Health (or other umbrella agency such as Department of Human Services, Department of Public Health). Responses to queries about provider contract administration were categorized into a binary variable: direct contracts (including managed care contracts) versus indirect contracts. Managed care arrangements were included as direct contracts because the SSA retains control of services, just as in direct contracts with providers. Finally, responses to queries about EBP language in provider contracts/grants were categorized into a binary variable: contract language requires or encourages providers to use EBPs (generally or specifically) versus no EBP contract language. For simplicity, the term “contract” included monies awarded to providers through both competitive grants and negotiated contracts based on client volume and service expectations.

Qualitative data were analyzed with ATLAS.ti version 6.1. Research staff coded transcripts through an iterative process of data review and classification, with both individual coding and team review and refinement. Although qualitative research does not lend itself to measures of statistical significance, we employed a type of scoring to evaluate the importance of results (Luborsky, 2005). First, the most frequently coded categories or concepts were identified as important themes, leading to additional in-depth analysis and interpretation. Second, the importance of each theme was evaluated by examining direct statements indicating belief, practice, or experience related to implementing EBPs for substance abuse treatment. Third, research staff members each coded two transcripts, and the PI and/or qualitative analysis leader then met with the coders to review consistency on this initial effort. Fourth, at the conclusion of document coding, 24% of documents were selected for “check-coding”: each of these documents was coded by a separate analyst, and code choices were compared to identify discrepancies between rater and check-rater. These calculations showed strong inter-rater/check-rater consistency (82%) between coders’ and check-coders’ choices. Finally, as is standard with mixed-methods projects, the quantitative and qualitative analyses were conducted separately and results compared for convergence when interpreting findings and drawing conclusions (Rossman & Wilson, 1985). Thus, recurrent themes and compelling discussions complement quantitative findings to create a comprehensive presentation of state-provider processes and strategies aimed at increasing the use of EBPs.
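The check-coding consistency described above is a simple percent-agreement figure. A minimal sketch of that calculation (the code labels below are hypothetical illustrations, not the study's actual codebook):

```python
# Illustrative percent-agreement ("check-coding") calculation.
# The code labels are made-up examples, not the study's actual codes.

def percent_agreement(rater_codes, check_codes):
    """Share of coding decisions on which rater and check-rater agree."""
    if len(rater_codes) != len(check_codes):
        raise ValueError("code lists must be the same length")
    matches = sum(1 for a, b in zip(rater_codes, check_codes) if a == b)
    return matches / len(rater_codes)

rater = ["purchasing_lever", "policy_lever", "policy_lever", "barrier"]
check = ["purchasing_lever", "policy_lever", "barrier", "barrier"]
print(round(percent_agreement(rater, check), 2))  # 0.75
```

Applied to the study's check-coded subset, this statistic is what yields the reported 82% consistency; more conservative chance-corrected indices (e.g., Cohen's kappa) would require the full cross-tabulation of code choices.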

3. RESULTS

3.1. Use of purchasing levers

More than half of SSAs issued contracts directly to providers (n = 27, 52.9%) or through managed care contracts (n = 7, 13.7%). The remaining SSAs distributed funds to sub-state entities, such as counties or regional offices, which in turn contract with providers (n = 17, 33.3%). See Table 1. Approximately two-thirds of SSAs (n = 32, 62.8%) incorporated EBP requirements or specifications into their contract language. Of those, approximately half (n = 15, 29.4% of total) require providers to use EBPs. Examples of SSA contract or RFP language include: (1) “Providers must use state of the art, evidenced-based counseling and therapeutic modalities that are based on current research”; (2) “Practitioners providing this service are expected to maintain knowledge and skills regarding current research trends in best/evidence-based practices”; and (3) “The [SSA] is committed to purchasing a continuum of services comprised of models that have been demonstrated to be effective. The [managed care organization], in the delivery of services, shall work with the [SSA] to continue and improve the ongoing implementation of evidence-based practices, including promising and best practices.” Some states also provide a list of EBPs or refer providers to EBP lists published by SAMHSA, NIDA, or the Addiction Technology Transfer Centers.

Table 1.

Relationship of EBP contract language to contracting method and to SMHA co-location

SSA characteristic (N = 51)    No EBP language in      Contracts/RFPs require    Total         p-value
                               contracts/RFPs          or encourage EBPs

Provider contracting                                                                           p = 0.03
   Direct (a)                  9 (26.5%)               25 (73.5%)                34 (66.7%)
   Indirect (b)                10 (58.8%)              7 (41.2%)                 17 (33.3%)
                                                                                 51 (100%)
Co-location with SMHA                                                                          p = 0.42
   Combined                    10 (34.5%)              19 (65.5%)                29 (56.9%)
   Separate                    9 (40.9%)               13 (59.1%)                22 (43.1%)
                                                                                 51 (100%)

(a) Direct contracting includes 27 (52.9%) SSAs that issue the majority of contracts directly to providers, and 7 (13.7%) SSAs that contract with managed care organizations to issue direct contracts to providers.

(b) Indirect contracting includes 17 (33.3%) SSAs that issue the majority of contracts to counties or other sub-state entities.

A second contract language option, used by one-third of SSAs (n = 17, 33.3%), is to encourage (not require) the community-based treatment providers to use EBPs. Examples include: (1) “[The SSA] funds a comprehensive substance abuse treatment infrastructure, guided by evidence-based practices, data-driven processes, and outcomes-based planning and evaluation” and (2) “[Grant applications should be] written to reflect utilization of best practices in providing these services. Best practices refer to services that reflect research based findings and prevention model programs.” The specific implementation tasks, infrastructure resources and service delivery shifts necessary for providers to accelerate use of these practices are not usually included in these contracts. The remaining SSAs (n = 19, 37.3%) reported that EBP language was not written into provider contracts.

States that either require or encourage EBPs in their contract language were grouped for comparison with SSAs with no EBP contract language. Table 1 examines the relationship between SSAs with EBP language and their method of contracting (direct or managed care versus indirect via sub-state entities). SSAs that fund providers directly or through managed care were significantly more likely to have contracts that required or encouraged EBPs (25 of 34, 73.5%), as compared to SSAs that fund providers indirectly through sub-state entities (7 of 17, 41.2%).
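The Table 1 comparison can be reproduced from the cell counts alone. The sketch below computes an uncorrected Pearson chi-square with only the standard library; the paper reports p = 0.03, and this uncorrected statistic gives p ≈ 0.02, so the study's exact procedure (e.g., a continuity correction or exact test) may have differed slightly:

```python
# Sketch of the Table 1 chi-square test: direct vs. indirect contracting
# by presence of EBP contract language. Standard library only.
import math

observed = [[9, 25],   # direct contracting: no EBP language, EBP language
            [10, 7]]   # indirect contracting

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the expected count under independence.
stat = sum(
    (observed[i][j] - row_totals[i] * col_totals[j] / n) ** 2
    / (row_totals[i] * col_totals[j] / n)
    for i in range(2) for j in range(2)
)

# With 1 degree of freedom, the chi-square tail probability reduces
# to the complementary error function.
p_value = math.erfc(math.sqrt(stat / 2))
print(f"chi2 = {stat:.2f}, p = {p_value:.3f}")  # chi2 = 5.07, p = 0.024
```

Either way the conclusion matches the text: direct-contracting SSAs were significantly more likely (at the 0.05 level) to have EBP contract language, while the co-location comparison (p = 0.42) was not significant.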

Over half of SSA offices are combined with state mental health authority (SMHA) offices (n = 29, 56.9%). Chi-square analysis showed no differences in EBP contract language between SSAs co-located with SMHAs and SSAs located separate from SMHAs. Additionally, having contract language that required or encouraged EBPs was not associated with SSA location in the state hierarchy or with geographic region. Finally, there were no differences in findings between data collected from SSA directors versus other SSA representatives.

3.2. Use of policy levers and regulation

Policy levers include state legislation, often codified in statute; rules and regulations; and state plans and standards. Regarding state legislative policy related to EBP implementation, five (9.8%) SSAs had current legislation addressing use of EBPs: Oregon, North Carolina, Alaska, Wisconsin, and Idaho. Specific details about each state are included below, as legislation is a matter of public record. The most prominent initiative was Oregon Revised Statutes 182.515 and 182.525 (Senate Bill 267), which require the use of EBPs. The “Mandatory Expenditures for Evidence-Based Programs” statute, passed in 2003 and implemented in 2005, requires the SSA to use increasing portions of its budget to purchase EBPs for treatment and prevention services, culminating in 75% by 2009–2011. As part of this mandate, the SSA is required to make biennial reports to the legislature. Further, the Oregon SSA has conducted a provider survey to evaluate implementation. This evaluation effort accomplished several goals: it documented public dollars spent providing EBPs; ranked the most-used EBPs statewide (Motivational Interviewing, American Society of Addiction Medicine – Patient Placement Criteria, Integrated Dual Diagnosis Treatment for co-occurring disorders, Cognitive Behavior Therapy, and Solution-Focused Brief Therapy); indicated outcome results (improved client outcomes and employment/school were most common); and identified provider strategies to meet fidelity (clinical supervision and quality assurance activities were most common).

North Carolina’s Session Law 2001-437 (House Bill 381), “An Act to Phase in Implementation of Mental Health System Reform at the State and Local Level”, was enacted in 2001 and implemented in 2005. The legislative intent was to develop and implement a state plan that promotes best practices, within available resources. The overall goal was for the state to endorse practices and provider choice through training and funding while still allowing some flexibility. North Carolina created new EBP-based service definitions that were integrated into the service array for basic and enhanced services packages. The resulting service options allow providers to offer evidence-based services that may qualify for reimbursement by both Medicaid and state funds. Implementation of North Carolina’s legislative mandate is complemented by contracting criteria. North Carolina’s SSA contracts indirectly with providers through local management entities (LMEs). SSA-LME contracts require that at least one provider in each LME use an EBP from an approved list. This criterion allows the SSA to endorse and assist provider implementation of EBPs without forcing the issue. Furthermore, the North Carolina Practice Improvement Collaborative of stakeholders – researchers, providers, consumers, advocates – reviews and makes annual recommendations for new EBPs. Although the legislation appropriated no funds, the North Carolina SSA state plan is supported financially by the North Carolina Mental Health Trust Fund.

Alaska’s Chapter 59 SLA 07 (Senate Bill 100), the “Substance Abuse/Mental Health Programs” bill, was enacted in 2007 but has not been implemented. The legislative intent was to develop and implement a substance abuse treatment system using evidence-based best practices or, when EBPs are unavailable, research-based, consensus-based, or promising practices, in that order of preference. The policy also required a procedure for adapting the practices to new situations and for collaboration with consumer-based programs. However, no funds were allocated to implement the legislation.

Wisconsin Statute 46.86 (Assembly Bill 133) was enacted in 1999 and took effect the following year. Based on evidence about wraparound services and women’s treatment, this law authorized women’s treatment programs and specified use of wraparound EBPs. Specifically, SSA resources allocated in the form of grants to counties and private entities must provide community-based alcohol and other drug abuse treatment programs for women that “emphasize parent education, vocational and housing assistance and coordination with other community programs and with treatment under intensive care.” This statute also required that funded programs offer aftercare services to women and their children for “at least 2 years after the date on which a woman has left the program.”

Finally, Idaho Code 39-303 (House Bill 833) was enacted in 2006 and established an Interagency Committee on Substance Abuse Prevention and Treatment. The statute empowers the committee to “Research, share, discuss and promote the use of best practices.” Our respondent characterized the intent of the Idaho bill in this way: “Our legislature is, at this point in time, very bought into treating substance use disorders… they want to make sure what money is being spent is being spent in the best way possible, which means you need to move to evidence-based practice. And we have been doing evidence-based practice in Idaho for many years. It’s just never really been formalized in a statute like that. The legislature is asking us to formally move in that direction so they’re comfortable with spending the kind of money that they’re now spending on treatment… And so, there was an even greater want and need to move, to make sure that we’re doing evidence-based practices.”

3.2.1. Rules and regulations

An EBP-specific legislative mandate is only one of several regulatory pathways state officials may pursue to require or promote the use of specific practices. Alternatively, under the SSA’s statutory authority to establish policies, the SSA leadership may promulgate its own rules or regulations related to EBPs. [Note that specific state names are omitted here and below to protect SSA representative confidentiality. Legislation results above are matters of public record.] For example, one Northeastern SSA representative reported a recent amendment to their rules and regulations for licensing behavioral healthcare organizations: “All services shall be organized and delivered according to evidence-based and best practice standards and guidelines, when available.” This SSA representative also noted that, although provider contracts do not specifically require EBPs, the SSA prioritizes proposals from agencies that use EBPs.

Other states are moving toward regulatory changes. For example, a Western SSA representative anticipated “a very strong mandate within the rules around evidence-based practice. And I think that if our regulatory writers, our legislators, had their druthers, it’s all we would be paying for.” However, other states described difficulty in using policy change to promote practices. As one Western SSA representative noted, “We can certainly suggest that we participate or give input into a certain bill, but it has to be at the request of the legislature. We can’t just show up to hearings… as a state employee and give input. We have to be invited.” A Midwestern SSA representative reported: “I’m sure you can understand that within state governments it’s literally about a five year process to get even one rule changed.”

3.2.2. State plans and standards

SSAs may use state plans as another policy lever for promoting use of specific interventions in community-based treatment programs. For example, in 2006, a Northeastern SSA collaborated in a task force to develop a state plan to address substance abuse. Their published state plan includes a goal to “Reduce the incidence, burden and progression of substance use disorders by integrating best available science and evidence-based programming into prevention, clinical practice and policy”. The SSA representative also noted that providers must demonstrate that they use best practices, and the SSA allows providers to choose the practices that work for them.

Alternatively, an SSA may incorporate science-based addiction services into its standards of care. For example, a Western SSA’s published standards promote “a full continuum of quality, research-based, best practice substance abuse services”. However, this SSA’s representative reported that the standards are now outdated, as the SSA’s relationship with the SMHA has shifted from a combined office, to separate offices, and back to a combined office. These changes required the SSA to rewrite and consolidate standards for the state, although the most recent standards draft retains the language noted above. In view of these changes, the SSA representative stated that they have opted instead to place language requiring EBPs in provider contracts, which are more easily changed to suit the SSA’s specific needs. Efforts to revise the standards will progress slowly over time and, again, such policy and regulatory changes will likely influence services in community-based treatment programs. Finally, a Southern SSA incorporated EBPs into its published standards document: “The [treatment] approach is based on 12-Step facilitation therapy, cognitive-behavioral, motivational, and insight-oriented techniques according to each client’s individual needs. These best practices counseling standards can be applied in any level of care and throughout the continuum of addictions treatment.”

4. DISCUSSION

Within the current fiscal, political, and social climate, substance abuse treatment services cannot be allowed to languish. Guidance on practices, benchmarks, and factors to consider during implementation of science-based interventions is available from the NQF standards, the NIDA/SAMHSA Blending Initiative, SAMHSA’s online NREPP database and TIP publications, and key reports (Institute of Medicine, 2001, 2006; President's New Freedom Commission on Mental Health, 2003; SAMHSA, 2002; NQF, 2005, 2007). Each of these critical reports confirms that states and providers must work in partnership, a central focus of the ongoing RWJF Advancing Recovery initiative.

Results from this mixed-methods project reveal that many states have begun to employ a variety of policy and purchasing levers to increase the use of EBPs in community-based substance abuse treatment settings. Although the methods vary, these contracting, legislative, regulatory, and strategic planning approaches aim to shift services and accelerate implementation of innovative strategies. Findings are based on brief surveys and interviews with state officials, reflecting their understanding of what is actually happening in treatment, supplemented by primary review of statutory and regulatory law and of policy documents available online.

4.1. Purchasing levers

A majority of SSAs contract directly with substance abuse treatment providers or through managed care organizations. SSAs that fund providers directly have more control over contractual language (Marton, Daigle, & de la Gueronniere, 2005), and most SSAs with direct contracts report that provider contracts either require or encourage EBP use. In contrast, SSAs that fund providers indirectly through sub-state entities may have less influence over services and contract language, although this degree of control varies by state. As noted, using provider contracts to promote EBPs offers flexibility to change contract language annually, or as necessary, to meet changing needs.

Despite the likely role of contract specifications in improving the quality of care, adhering to them is not an easy or clear process for most providers. Barriers to implementation of provider contracts with EBP language include inadequate links between primary care and psychosocial services, inadequate translation of positive research findings into evidence-based practice, and insufficient organizational and administrative support (Thomas, Bremer, & Engleby, 2004). Table 2 summarizes the barriers and facilitators to implementation that emerged from the qualitative analysis of SSA interviews. Future research should examine the impact of policy changes as well as how service providers addressed barriers and engaged the resources and support of facilitating factors.

Table 2.

Facilitators and barriers to implementation of evidence-based practices

Facilitator: Direct contracting between the SSA and publicly funded treatment providers
Barrier: Inadequate ability of the state office to lead EBP implementation, monitoring, and fidelity

Facilitator: Inclusion of contract or grant language that requires or encourages providers to use evidence-based practices with their clients
Barrier: Insufficient funds for providers to implement evidence-based practices

Facilitator: State legislation that develops and implements a state plan that promotes best practices, with adequate funding
Barriers: Unfunded mandates; unsupportive policy makers; inflexibility of laws

Facilitator: Collaborations of stakeholders (including providers, administrators, policy makers, researchers, consumers and their families)
Barrier: Implementation without consideration of all key factors and entities

Facilitator: Adequate methods of data collection, analysis, and dissemination for evaluation
Barrier: Limited resources for monitoring outcomes

What remains unclear is the extent to which contract language actually influences how innovations and new practices are translated into daily clinical care. Specification in materials and contracts may be a first step toward improving services, but actual shifts in client outcomes and documentation of the work being done in each session or group remain difficult to assess without standardization in the field.

4.2. Policy levers

Findings from this project confirm that few states have current or planned legislative or regulatory mandates for EBP implementation. Oregon remains the only state with a law requiring the substance abuse agency to show that it is spending a majority of state dollars on evidence-based programs. This centralized accountability system gives the SSA legitimacy to lead EBP implementation (Isett, Burnam, Coleman-Beattie, Hyde, Morrissey, et al., 2008). The Oregon SSA continues to work with stakeholders to develop plans to meet the mandate’s requirements. This group reviews and selects practices for the SSA’s list of approved EBPs, holds regular meetings, prepares a bi-monthly newsletter, and addresses implementation issues such as Medicaid billing. This regulatory method is one approach described by Rapp et al. (2005) in their examination of the role of the state mental health authority in promoting improved client outcomes through EBPs. Involving stakeholders – including providers, administrators, policy makers, researchers, and consumers – in a community coalition promotes consensus, which is key to successful implementation (Rugs, Hills, Moore, & Peters, 2011). However, with a very limited budget, the Oregon SSA’s evaluation of this mandate was limited to a brief provider survey, which appears to have been conducted once. The lack of sufficient evaluation data makes it difficult to draw formal conclusions.

Alaska also attempted to mandate EBP use, but no state funds were allocated to support the mandate. With financial support, Alaska’s EBP law would have been an ambitious attempt to accelerate EBP use within the state’s publicly funded substance abuse treatment programs and would have provided an interesting contrast to Oregon’s mandate. North Carolina, also working within restricted funding, has a state plan that seeks to integrate EBPs through legislative funding of the Mental Health Trust Fund. However, appropriations to the trust fund are often significantly reduced by competing demands on the state budget. Nevertheless, the North Carolina legislature continues to fund the Practice Improvement Collaborative, the SSA’s mechanism for studying and disseminating practices. Funding likely will remain a key issue in states’ efforts to improve quality of care through policy (Bruns, Hoagwood, Rivard, Wotring, Marsenich et al., 2008; Cooper & Aratani, 2009; Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou, 2004; Simpson, 2002). Indeed, it is recommended that EBP translation and implementation leaders consider funding and policy issues early in the process (Brown & Flynn, 2002; Rugs, Hills, Moore, & Peters, 2011; Woolf, 2008).

Several states noted inclusion of EBP language in their state plans, strategic plans, or standards. However, because use of the term “standards” varies widely – from state regulatory mandates to committee recommendations – it is difficult to analyze specific standards. Nevertheless, incorporation of EBP language into state standards does indicate progress toward EBP implementation, which should translate into improved quality of care.

Each of these policy levers has specific advantages and disadvantages. Although legislative mandates for specific practices may appear to be a straightforward lever that would require implementation of particular EBPs, at least for state-funded services, managing and monitoring adherence and fidelity to such a requirement is a resource-intensive activity. For example, monitoring compliance with a requirement to provide motivational enhancement therapy (MET) requires a mechanism for auditing and providing on-going training regarding this counseling approach.

Because few states have legislative or regulatory mandates, and because such politically driven, state-wide approaches are relatively new, it is premature to fully assess the impact of EBP mandates. Because many state data systems cannot track which practices clients receive, it is also difficult to ascertain the impact on clinical outcomes. In our interviews, state officials reported an increasing need for consistent, easy-to-implement databases that can inform decision-making.

Additionally, for many states, legislation or regulation probably is not the ideal course for mandating specific practices, as law is inflexible and not easily adapted to new technologies (Jacobson, 2008). Requiring specific practices also carries political costs, particularly when those practices are perceived as difficult or costly to deliver. Consistent with the literature, findings from this study confirm that administrative resources and support from the governor and legislature are critical in accelerating the use of EBPs (Rosenbloom, Leis, Shah, & Ambrogi, 2006; Rugs, Hills, Moore, & Peters, 2011).

Finally, although regulatory strategies and legislation are at times effective mechanisms for change, states have limited resources to closely monitor treatment programs for use of specific practices (D’Aunno, 2006). Performance-based contracting may be an alternative that merits greater attention, because it targets outcomes rather than the use of specific practices. One state in the study currently employs performance-based contracting and has been very satisfied with the outcomes. Although this state does not require EBPs in its contract language, the SSA proactively meets with providers to discuss effectiveness, and providers have shown greater acceptance of EBPs, especially proven psychosocial interventions. Because the state employs performance-based contracting, the SSA has noted that providers see the value in EBPs and pay close attention to them because they affect the bottom line.

4.3. Lessons learned

Overall, it is clear that the emphasis on improving quality of care and efficiency in services continues within the field of substance abuse treatment, and that additional research is needed. In particular, Garner has noted the relative paucity of EBP implementation research in substance abuse treatment (Garner, 2009). While specific strategies differ, reports from representatives of state substance abuse authorities (SSAs) indicate that policy or regulatory changes and financial incentives are two levers states are using to increase adoption of new practices. Such changes will result in state-level variation in the extent to which empirically based practices are used by community-based providers and in the overall quality and fidelity of implementation. Thus, the results of this study offer a framework for other states to consider, as well as the opportunity to continue tracking the implementation of these policy levers and the National Quality Forum standards. Consideration of the state’s role, the need for infrastructure and resources for up-to-date data systems, and assistance with understanding and responding to specific contract language and administrative rules are critical to the successful and sustained implementation of science-based practices.

Nevertheless, these findings are preliminary. Evaluation of these strategies should include tracking client-level outcomes (Shannon, Walker, & Blevins, 2009). An evaluation infrastructure is necessary to meet the call for EBP implementation (Institute of Medicine, 2001, 2006; President's New Freedom Commission on Mental Health, 2003; Substance Abuse and Mental Health Services Administration, 2002). States need help planning and executing evaluation in partnership with providers. Individual agencies may need technical assistance to guide program planning and outcomes evaluation. And all stakeholders, including providers, administrators, policy makers, researchers, consumers, and their families, should provide feedback regarding the impact of state policy implementation and technical assistance. As health care reform arrives, new policies will require significant shifts in documentation, clinical care, and integration of services. Ongoing process and outcomes evaluation of these behavioral health policies is critical for the well-being of clients and communities and for the planning of future policy makers.

4.4. Limitations

Although the small sample limits the quantitative analyses that can be completed, participation from all state authorities means that the assessment is comprehensive. The quantitative questions were also broad and may not have captured subtle details of purchasing and policy practices. Although the qualitative questions elicited rich information about these levers, it was sometimes difficult to categorize state initiatives because of each SSA’s unique characteristics. Because each SSA differs (e.g., in structure and hierarchy, financing, changing political drivers, and populations served), comparisons may be limited. Nevertheless, this study’s strengths were its examination of purchasing and policy levers as they impact practice and its inclusion of data from the whole population of Single State Authorities for substance abuse treatment.

Acknowledgements

This work was supported by the Robert Wood Johnson Foundation, Awards 58839 and 63878.

The authors acknowledge the contributions of the Addiction Technology Transfer Centers, Jamie Chriqui, Caitlin Rasplica, Marisa Gholson, Holly Fussell, and Sandy Kennard.

Biographies

Traci Rieckmann, Ph.D.

Traci Rieckmann, Ph.D. is a Research Assistant Professor in the Department of Public Health and Preventive Medicine at Oregon Health & Science University. Dr. Rieckmann’s research focuses on the organization and delivery of drug abuse treatment services, primarily in the translation of research to practice and the implementation of evidence-based practices. Her work also focuses on assessing disparities in access and retention in care and adapting practices for American Indian/Alaskan Native communities. Dr. Rieckmann earned her Ph.D. at the University of Utah and her clinical work has focused primarily on adolescents and adults with substance abuse and co-occurring disorders.

Anne Kovas, M.P.H.

Anne Kovas, M.P.H. is a Research Associate in the Department of Psychiatry at Oregon Health & Science University. She received a B.A. degree in Biology from the University of Virginia, and an M.P.H. in Epidemiology and Biostatistics from Oregon Health & Science University. Her M.P.H. thesis examined buprenorphine for acute heroin detoxification in a community-based treatment center. Her work in the Department of Psychiatry is primarily post-award management. Her work with colleagues in the Department of Public Health & Preventive Medicine has examined state approaches to evidence-based practice implementation.

Elaine Cassidy, Ph.D.

Elaine F. Cassidy, Ph.D., is a research and evaluation consultant at the OMG Center for Collaborative Learning, where she manages projects related to child and adolescent health promotion. Prior to joining OMG, she served as a program officer in research and evaluation at the Robert Wood Johnson Foundation, where she oversaw research and evaluation activities for the Addiction Prevention and Treatment and Vulnerable Populations portfolios. She holds a M.S.Ed. in psychological services from the University of Pennsylvania and a Ph.D. in school, community, and child-clinical psychology from the University of Pennsylvania.

Dennis McCarty, Ph.D.

Dennis McCarty, Ph.D., is a Professor in the Department of Public Health & Preventive Medicine at Oregon Health & Science University. He collaborates with policy makers in state and federal government and with community based programs to examine the organization, financing, and quality of publicly funded prevention and treatment services for alcohol and drug disorders. He is the Principal Investigator for the Oregon/Hawaii Node of the National Drug Abuse Treatment Clinical Trials Network and the national evaluation for the Network for the Improvement of Addiction Treatment. His Ph.D. in Social Psychology is from the University of Kentucky.

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Contributor Information

Traci R. Rieckmann, Email: rieckman@ohsu.edu.

Anne E. Kovas, Email: kovasa@ohsu.edu.

Elaine F. Cassidy, Email: elaine@omgcenter.org.

Dennis McCarty, Email: mccartyd@ohsu.edu.

References

  1. Bernard HR. Research methods. In: Bernard HR, editor. Anthropology: Qualitative and quantitative approaches. Thousand Oaks, CA: Sage Publications, Inc.; 1994. [Google Scholar]
  2. Boyle M. Response 2: The states are the key factor in successful EBP adoption and technology transfer. The Bridge. 2009;1(2) [Google Scholar]
  3. Briody E, Baba ML. Explaining differences in repatriation experiences: The discovery of coupled and decoupled systems. American Anthropologist. 1991;93(2):322–344. [Google Scholar]
  4. Brown B, Flynn P. The federal role in drug abuse technology transfer: A history and perspective. Journal of Substance Abuse Treatment. 2002;22:245–257. doi: 10.1016/s0740-5472(02)00228-3. [DOI] [PubMed] [Google Scholar]
  5. Brownson RC, Chriqui JF, Stamatakis KA. Understanding evidence-based public health policy. American Journal of Public Health. 2009;99(9):1576–1583. doi: 10.2105/AJPH.2008.156224. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Bruns EJ, Hoagwood KE, Rivard JC, Wotring J, Marsenich L, Carter B. State implementation of evidence-based practice for youths, part II: Recommendations for research and policy. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47:499–504. doi: 10.1097/CHI.0b013e3181684557. [DOI] [PubMed] [Google Scholar]
  7. Center for Substance Abuse Treatment. Contracting for managed substance abuse and mental health services: A guide for public purchasers (TAP 22) Rockville, MD: Substance Abuse and Mental Health Services Administration; 1998. [PubMed] [Google Scholar]
  8. Chriqui JF, Eidson SK, McBride DC, Scott W, Cappocia V, Chaloupka FJ. Assessing state regulation of outpatient substance abuse treatment programs in the U.S. along a quality continuum. Chicago, IL: National Program Office, University of Chicago, Institute for Health Research and Policy; 2006. [Google Scholar]
  9. Chriqui JF, Terry-McElrath Y, McBride DC, Eidson SS. State policies matter: The case of outpatient drug treatment program practices. Journal of Substance Abuse Treatment. 2008;35:13–21. doi: 10.1016/j.jsat.2007.08.012. [DOI] [PubMed] [Google Scholar]
  10. Chriqui JF, Terry-McElrath Y, McBride DC, Eidson SS, VanderWaal CJ. Does state certification or licensure influence outpatient substance abuse treatment program practices? Journal of Behavioral Health Services and Research. 2007;34:309–328. doi: 10.1007/s11414-007-9069-z. [DOI] [PubMed] [Google Scholar]
  11. Condon TP, Miner LL, Balmer CW, Pintello D. Blending addiction research and practice: Strategies for technology transfer. Journal of Substance Abuse Treatment. 2008;35:156–160. doi: 10.1016/j.jsat.2007.09.004. [DOI] [PubMed] [Google Scholar]
  12. Cooper JL, Aratani Y. The status of states' policies to support evidence-based practices in children's mental health. Psychiatric Services. 2009;60(12):1672–1675. doi: 10.1176/ps.2009.60.12.1672. [DOI] [PubMed] [Google Scholar]
  13. Crabtree BF, Miller WL. Doing Qualitative Research. 2nd ed. Thousand Oaks, CA: Sage Publications, Inc.; 1999. [Google Scholar]
  14. Creswell JW. Qualitative inquiry & research design: Choosing among five approaches. 2nd ed. Thousand Oaks, CA: Sage Publications, Inc.; 2007. [Google Scholar]
  15. Creswell JW. Research design: Qualitative, quantitative, and mixed methods approaches. 3rd ed. Los Angeles, CA: Sage Publications, Inc.; 2009. [Google Scholar]
  16. D’Aunno T. The role of organization and management in substance abuse treatment: Review and roadmap. Journal of Substance Abuse Treatment. 2006;31:221–233. doi: 10.1016/j.jsat.2006.06.016. [DOI] [PubMed] [Google Scholar]
  17. Ducharme LJ, Abraham AJ. State policy influence on the early diffusion of buprenorphine in community treatment programs. Substance Abuse Treatment, Prevention, and Policy. 2008;3:17–27. doi: 10.1186/1747-597X-3-17. [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Finnerty MT, Rapp CA, Bond GR, Lynde DW, Ganju V, Goldman HH. The state health authority yardstick (SHAY) Community Mental Health Journal. 2009;45:228–236. doi: 10.1007/s10597-009-9181-z. [DOI] [PubMed] [Google Scholar]
  19. Finnerty M, Rapp C, Bond G. State health authority yardstick (SHAY): Impact of state level action on the quality and penetration of EBPs in the community. Baltimore, MD: NRI Conference on State Mental Health Agency Services Research, Program Evaluation, and Policy; 2005. [Google Scholar]
  20. Garner BR. Research on the diffusion of evidence-based treatments within substance abuse treatment: A systematic review. Journal of Substance Abuse Treatment. 2009;36:376–399. doi: 10.1016/j.jsat.2008.08.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. American Journal of Public Health. 2003;93:1261–1267. doi: 10.2105/ajph.93.8.1261. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Gold PB, Glynn SM, Mueser KT. Challenges to implementing and sustaining comprehensive mental health service programs. Evaluation & the Health Professions. 2006;29:195–218. doi: 10.1177/0163278706287345. [DOI] [PubMed] [Google Scholar]
  23. Gostin LO. Public health law and ethics: A reader. Berkeley, CA: University of California Press; 2002. [Google Scholar]
  24. Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Quarterly. 2004;82:581–625. doi: 10.1111/j.0887-378X.2004.00325.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press; 2001. [PubMed] [Google Scholar]
  26. Institute of Medicine. Improving the quality of health care for mental and substance-use conditions. Washington, DC: National Academies Press; 2006. [PubMed] [Google Scholar]
  27. Isett KR, Burnam MA, Coleman-Beattie B, Hyde PS, Morrissey JP, Magnabosco JL, Rapp C, Ganju V, Goldman HH. The role of state mental health authorities in managing change for the implementation of evidence-based practices. Community Mental Health Journal. 2008;44:195–211. doi: 10.1007/s10597-007-9107-6. [DOI] [PubMed] [Google Scholar]
  28. Jacobson PD. Transforming clinical practice guidelines into legislative mandates: Proceed with abundant caution (commentary) Journal of the American Medical Association. 2008;299:208–210. doi: 10.1001/jama.2007.12. [DOI] [PubMed] [Google Scholar]
  29. Ling W, Cunningham-Rathner J, Rawson R. Diffusion of substance abuse treatment: Will buprenorphine be a success? Journal of Psychoactive Drugs, SARC Supplement. 2004;2:115–117. doi: 10.1080/02791072.2004.10400046. [DOI] [PubMed] [Google Scholar]
  30. Luborsky MR. Hip fracture: Cultural loss and long-term reintegration. NIH/NIA 1 R01 AG023572-01 A2. 2005 [Google Scholar]
  31. Lynde D. State mental health authority (SMHA): Lessons we are learning implementing evidence-based practices. Baltimore, MD: NRI Conference on State Mental Health Agency Services, Research, Program Evaluation, and Policy; 2005. February 6–8, 2005. [Google Scholar]
  32. Mark TL, Levit KR, Vandivort-Warren R, Coffey RM, Buck JA SAMHSA Spending Estimates Team. Trends in spending for substance abuse treatment, 1986–2003. Health Affairs (Millwood) 2007;26:1118–1128. doi: 10.1377/hlthaff.26.4.1118. [DOI] [PubMed] [Google Scholar]
  33. Marton A, Daigle J, de la Gueronniere G. Identifying state purchasing levers for promoting the use of evidence-based practice in substance abuse treatment. Hamilton, NJ: Center for Health Care Strategies; 2005. [Google Scholar]
  34. McCarty D, Rieckmann T. The treatment system for alcohol and drug disorders. In: Levin BL, Petrila J, Hennessy KD, editors. Mental Health Services: A Public Health Perspective. Third Ed. New York: Oxford University Press; 2010. [Google Scholar]
  35. Miller WR, Sorensen JL, Selzer JA, Brigham GS. Disseminating evidence-based practices in substance abuse treatment: A review with suggestions. Journal of Substance Abuse Treatment. 2006;31:25–39. doi: 10.1016/j.jsat.2006.03.005. [DOI] [PubMed] [Google Scholar]
  36. Miller WR, Zweben J, Johnson WR. Evidence-based treatment: Why, what, where, when and how? Journal of Substance Abuse Treatment. 2005;29:267–276. doi: 10.1016/j.jsat.2005.08.003. [DOI] [PubMed] [Google Scholar]
  37. Morgan D. Pragmatism as a philosophical foundation for mixed methods research. In: Plano Clark V, Creswell JC, editors. The mixed methods reader. Thousand Oaks, CA: Sage Publications, Inc.; 2008. pp. 29–65. [Google Scholar]
  38. National Quality Forum. Evidence-based treatment practices for substance use disorders. Washington, DC: National Quality Forum; 2005. [Google Scholar]
  39. National Quality Forum. National voluntary consensus standards for the treatment of substance use conditions: Evidence-based treatment practices. Washington, DC: National Quality Forum; 2007. [Google Scholar]
  40. President's New Freedom Commission on Mental Health. Achieving the promise: Transforming mental health care in America. DHHS pub.no. SMA-03-3832. Rockville, MD: U.S. Department of Health and Human Services; 2003. [Google Scholar]
  41. Rapp CA, Bond GR, Becker DR, Carpinello S, Nikkel R, Gintoli G. The role of state mental health authorities in promoting improved client outcomes through evidence-based practice. Community Mental Health Journal. 2005;41:347–362. doi: 10.1007/s10597-005-5008-8. [DOI] [PubMed] [Google Scholar]
  42. Rieckmann TR, Kovas AE, Fussell HE, Stettler NM. Implementation of evidence-based practices for treatment of alcohol and drug disorders: The role of the state authority. Journal of Behavioral Health Services & Research. 2009;36(4):407–419. doi: 10.1007/s11414-008-9122-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. Rosenbloom DL, Leis RG, Shah P, Ambrogi R. Blueprint for the states: Policies to improve the way states organize and deliver alcohol and drug prevention and treatment: The findings of a national policy panel. Boston, MA: Join Together; 2006. [Google Scholar]
  44. Rossman GB, Wilson BL. Numbers and words: Combining quantitative and qualitative methods in a single large-scale evaluation study. Evaluation Review. 1985;9(5):627–643. [Google Scholar]
  45. Rugs D, Hills HA, Moore KA, Peters RH. A community planning process for the implementation of evidence-based practice. Evaluation and Program Planning. 2011;34:29–36. doi: 10.1016/j.evalprogplan.2010.06.002. [DOI] [PubMed] [Google Scholar]
  46. Shannon LM, Walker R, Blevins M. Developing a new system to measure outcomes in a service coordination program for youth with severe emotional disturbance. Evaluation and Program Planning. 2009;32:109–118. doi: 10.1016/j.evalprogplan.2008.09.006. [DOI] [PubMed] [Google Scholar]
  47. Simpson DD. A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment. 2002;22:171–182. doi: 10.1016/s0740-5472(02)00231-3. [DOI] [PubMed] [Google Scholar]
  48. Substance Abuse and Mental Health Services Administration. Report to congress on the prevention and treatment of co-occurring substance abuse disorders and mental disorders. Rockville, MD: Substance Abuse and Mental Health Services Administration, Office of Applied Studies; 2002. [Google Scholar]
  49. Thomas CP, Wallack SS, Lee S, McCarty D, Swift R. Research to practice: Adoption of naltrexone in alcoholism treatment. Journal of Substance Abuse Treatment. 2003;24:1–11. [PubMed] [Google Scholar]
  50. Thomas M, Bremer R, Engleby C. Assertive community treatment in a capitated managed care system. Princeton, NJ: Center for Health Care Strategies; 2004. [Google Scholar]
  51. Trochim W, Donnelly JP. The research methods knowledge base. Mason, OH: Atomic Dog; 2006. [Google Scholar]
  52. Woolf SH. The meaning of translational research and why it matters. Journal of the American Medical Association. 2008;299:211–213. doi: 10.1001/jama.2007.26. [DOI] [PubMed] [Google Scholar]
