Published in final edited form as: J Psychoactive Drugs. 2011 Sep;Suppl 7:27–39. doi: 10.1080/02791072.2011.601988

Legislating Clinical Practice: Counselor Responses to an Evidence-Based Practice Mandate

Traci Rieckmann, Luke Bergmann, Caitlin Rasplica
PMCID: PMC3721513  NIHMSID: NIHMS474362  PMID: 22185037

Abstract

The demand to connect research findings with clinical practice for patients with substance use disorders has accelerated state and federal efforts focused on implementation of evidence-based practices (EBPs). One unique state-driven strategy is Oregon’s Evidence-Based Practice mandate, which ties state funds to specific treatment practices. Clinicians play an essential role in implementing shifts in practice patterns and use of EBPs, but little is understood about how legislative efforts impact clinicians’ sentiments and decision-making. This study presents longitudinal data from focus groups and interviews completed during the planning phase (n = 66) and early implementation of the mandate (n = 73) to investigate provider attitudes toward this policy change. Results reflect three emergent themes: (1) concern about retaining individualized treatment and clinical latitude, (2) distrust of government involvement in clinical care, and (3) the need for accountability and credibility for the field. We conclude with recommendations for state agencies considering EBP mandates.

Keywords: counselors, evidence-based practice, implementation, legislation, policy mandate


After decades of allegiance to treatment modalities rooted in personal experience and testimony, the science of addiction treatment is shifting, with greater emphasis being placed on performance and outcomes. Simultaneously, treatment options are expanding and substance abuse providers and researchers have begun to emphasize evidence-based practice (EBP) as the key to the continued advancement of the field (Garner 2009; Amodeo, Ellis & Samet 2007; Miller et al. 2006). Health care reform, including the Patient Protection and Affordable Care Act (PPACA 2010), also ensures greater attention to establishing standards, a continued examination of treatment services, and monitoring of client level outcomes (NQF 2007). Monitoring the impact and implementation of these changes on addiction services is critical as states, providers, funding entities, and the federal government all work to respond to the shifting health care climate.

HISTORY AND CONTEXT

The evidence-based medicine movement has brought significant changes to health care practices, along with some controversy. Inaugurated by Canadian physicians and researchers in the 1990s in biomedical domains, the evidence-based movement has spread across the globe to allied fields and behavioral health (Evidence-Based Medicine Working Group 1992). It has been suggested that the emphasis on evidence-based practices emerged from the concern that social service interventions, including substance abuse treatment and mental health therapy, were being practiced without consideration for effectiveness and outcomes (Regehr, Stern & Shlonsky 2007; Gambrill 2001). EBPs are generally regarded as the gold standard for clinical practice, having been shown to be both medically effective and cost-effective across multiple randomized clinical trials (RCTs). As Webb (2001) describes in his work on validity and evidence-based practice, the perspective that an effective practice is “ultimately to be delivered by research informed evidence which is underpinned by rigorous and effective methodologies is deeply appealing to our contemporary technocratic culture.” Further, proponents of the EBP movement argue that standardization of care and potential cost savings through improved outcomes make a strong case for adoption of EBPs.

On the other hand, as EBPs have spread into diverse clinical settings, a growing chorus of critics has expressed concerns about their proliferation. A number of scholars have noted that clinicians have had success with many practices that lack randomized trial data (Landsman 2006). Critics have also pointed to the problems inherent in making individual treatment decisions informed by population-based (epidemiological) data. They note that the disjuncture between a clinician’s charge to treat individual patients and the population-based data with which EBPs are tested and categorized raises ethical and clinical concerns about whether individual patients will receive the most appropriate care (Rolfe 1999).

Perhaps the most abiding reservations about EBPs among practicing clinicians concern their possible interference with what is referred to as “clinical expertise,” “practice wisdom,” “clinical judgment,” “craft knowledge” or “experiential knowledge” (Staller 2006; Dybicz 2004; Thomas 2004; Addis, Wade & Hatgis 1999). If clinical practices are prescribed, or in some cases even scripted, critics reason, what place is there for clinical judgment and for responding to the personal idiosyncrasies that every client presents? Such concerns may be particularly important to clinicians who use behavioral interventions, including substance abuse counselors, for whom the “art” of practice is seen as especially complex and significant (Gray & McDonald 2006). Moreover, substance abuse treatment workers may feel protective of their clinical expertise because they tend to feel undervalued as a workforce, often laboring in agencies that are struggling to stay afloat. Not surprisingly, the push for EBPs in substance abuse treatment has been met with mixed reactions among direct care workers. While some eagerly embrace the most recent therapeutic techniques, others, many of whom have been working as treatment providers for years and/or are in recovery from addiction themselves, view emergent modalities with considerable suspicion. In short, as the field of substance abuse treatment moves toward use of EBPs, many clinicians find themselves in tenuous positions.

Implementation of an evidence-based mandate poses significant challenges; the most daunting may be securing staff willingness to buy into the process (Aarons 2004; Willenbring et al. 2004; Klein & Sorra 1996). Other empirical work has demonstrated that clinicians play a critical role in successful and sustained implementation of new practices (Damschroder et al. 2009; Proctor et al. 2007; Knudsen et al. 2005; Aarons & Palinkas 2007; Simpson 2002). As Lipsky (1980) concluded, understanding why policy objectives are not met requires knowledge of how the rules are experienced by workers (direct service providers) and what pressures they encounter. Thus, understanding how treatment providers respond to policy mandates is critical for policy makers; it is also critical to review how various states and municipalities are considering or implementing approaches to accelerating the use of EBPs within complex systems and restricted budgets.

THE EVIDENCE-BASED POLICY EXPERIMENT

Previous work has demonstrated that there are substantial time lags in the translation of clinical research into practice in substance abuse treatment (Miller, Zweben & Johnson 2005; Marinelli-Casey, Domier & Rawson 2002; IOM 1998). Further, it is well-established that changing clinical practice and implementing new interventions is challenging and complex (Miller et al. 2006; Marinelli-Casey, Domier & Rawson 2002; McLellan 2002). At the organizational level, issues surrounding compensation or funding (Galanter et al. 2000), insurance coverage (Rawson et al. 1998), treatment philosophy (Knudsen et al. 2005; McGovern et al. 2004; Thomas et al. 2003), and program resources (Knudsen et al. 2005) are critical, but the support of frontline clinicians is especially important for successful implementation (Proctor et al. 2007; McGovern et al. 2004; Willenbring et al. 2004; Marinelli-Casey, Domier & Rawson 2002). More specifically, training, experience, attitudes, and beliefs among clinicians all are extremely influential when working to advance the use of treatment innovations (Fuller et al. 2007; McCarty et al. 2007; Miller et al. 2006; Knudsen et al. 2005; Aarons 2004; Forman, Bovasso & Woody 2001).

To address these challenges, state and local governments are creating policies and state plans, and in some cases passing laws, to accelerate the use of EBPs and improve the quality of care in behavioral health services (Rieckmann et al. in press; Boyle 2009; Rieckmann et al. 2009; Chriqui et al. 2007; Ducharme & Abraham 2008; Mark et al. 2007). Leading this EBP policy charge is Oregon’s State Legislature, which passed a bill mandating that substance abuse treatment agencies receiving state funds provide EBPs. Consistent with its role as a trailblazer in establishing a state health plan and in tackling controversial issues, from assisted suicide to medical marijuana, Oregon is the first state to pass a legislative mandate for EBPs, Senate Bill 267 (SB 267), now formally known as Oregon Revised Statute 182.525 (ORS 182.525). Effective July 2005, the legislation mandated that 25% of state-purchased substance abuse and mental health services be identified as evidence-based, a figure that rose to 50% in the second biennium (2007 to 2009), and finally to 75% in 2011. According to SB 267, an “evidence-based program” is one that “incorporates significant and relevant practices based on scientifically based research; and is cost effective” (SB 267, Section 3 No. 3a and 3b). The Oregon Addictions and Mental Health Division (AMH), within the Department of Human Services (Oregon’s single state authority, or SSA), is required by ORS 182.525 to report to the legislature on the proportion of funds that support evidence-based practices. AMH began this planning and preparation process in 2003, using input from the literature, stakeholders, and some providers to establish a definition of EBPs and generate a list of practices that qualify as evidence-based.

Because the mandate is unfunded, AMH has been able to provide only minimal training and oversight to assist treatment agencies with compliance and adoption of new interventions (Magnabosco 2006; Gelber & Rinaldo 2005; Rapp et al. 2005). Overall, the emphasis of this unique legislation is less on facilitation or direct support and more on compelling providers to marshal their own resources to develop greater familiarity and expertise with EBPs, and ultimately to incorporate more EBPs into treatment. Further, the response to this mandate may influence other states and regulating bodies as they seek to improve services and adapt to the changing behavioral health environment. In this article, we analyze qualitative data from focus groups and interviews with clinicians in Oregon to explore their reactions to and perspectives on addiction services and this mandate, as well as the process taken to implement EBPs in the state.

METHODS

This study employed a mixed-methods prospective evaluation of the implementation of the Oregon evidence-based mandate; for this article, we chose to focus on the qualitative findings for several important reasons. First, the EBP mandate in Oregon involves the unfolding of complex political, social, and logistical processes in real time. Therefore, it was important to identify and describe clinicians’ unanticipated responses to the mandate and their improvised ways of framing ideas and experiences resulting from it. Moreover, the use of qualitative methods improved validity, as clinicians were concerned about evaluation and feared monetary ramifications associated with the mandate. Quantitative methods alone may have been significantly affected by socially desirable responding due to these concerns, and would not have provided an in-depth understanding of providers’ responses to the mandate and EBPs overall. The highly interactive qualitative methods employed for this project allowed us to establish rapport and trust with study participants.

Conceptual Approach

The study design, measures, and analysis were informed by several critical theoretical perspectives and the current research from implementation science regarding evidence-based practices. Primarily, our examination of the clinicians’ response to this mandate corresponds with provider, organizational, and systems characteristics from the Technology Diffusion Model, which emerged from classical diffusion theory (Thomas et al. 2003; Rogers 2003; 1995) and core components of the Implementation Science Model (Fixsen et al. 2005). According to Thomas and colleagues, clinician characteristics, organizational factors, and system and technology characteristics all influence adoption of innovations. Further, as Fixsen and colleagues assert, implementation of innovative practices requires comprehensive training and behavior change within the workforce, and the “core components” that drive implementation are staff selection, training, consultation, and coaching (e.g., supervision, feedback, and emotional support; Fixsen et al. 2005). Thus, human resource factors, including buy-in, positive attitudes, knowledge of the interventions, and intention to use such tools, influence full-scale adoption.

Our work is also informed by key concepts and interpretive frameworks associated with the work of Pierre Bourdieu (1984, 1977) and other “practice theorists” (Wacquant 2002; de Certeau 1984; Ortner 1984). Examination of social capital and administrative power and their influence on the processes of institutional change is critical to a complete understanding of the response to this legislative mandate. According to Bourdieu, social capital operates across overlapping social “fields,” such as geographic communities, professional and educational settings, and even the clinical milieu, and is often instrumental in furthering specific institutional or social agendas (Bourdieu 1977).

Participants and Procedures

Using purposive criterion sampling, agencies were selected to achieve a geographically representative sample of substance abuse treatment programs involved in the implementation of ORS 182.525; they were recruited by the principal investigator (T. Rieckmann) through direct communication. Participating agencies were located in six geographic regions throughout the state and varied in size, location, services provided, funding streams, and clientele. Data collection was completed during the planning phase that followed passage of the legislation but preceded the date the mandate went into effect (2005), as well as during a follow-up phase corresponding with the early implementation of the mandate (2007–2008). The full study included semistructured, open-ended individual interviews and focus groups, each lasting approximately 90 minutes, with stakeholders (AMH staff, representatives of county governments, and treatment advisors), directors of treatment agencies, and clinicians at each of the participating agencies and within American Indian treatment programs. Data were collected during site visits to each of the treatment programs and tribal communities, where clinician focus groups were held separately from agency director and clinical supervisor interviews in order to keep responses as candid as possible. This study presents data from the direct service providers (e.g., counselors, supervisors) employed within the treatment programs who participated in the site visits during both the planning phase and the early implementation follow-up. The majority of participants were present for both phases, but because some providers were no longer employed at the agencies during the early implementation follow-up, and because of the importance of letting all eligible providers who were willing and able share their experiences, we were unable to match focus group participants over time.

All of the participating treatment programs provided drug and alcohol treatment services, while some offered additional mental health and integrated care services. Sixty-three percent provided both outpatient and residential services, and 89% provided some treatment services for youth. Phase 1 (Planning Phase: 01/2005–08/2005) consisted of seven focus groups and four interviews conducted with direct service providers at 13 treatment agencies (n = 51). Two additional focus groups and three interviews (n = 15) were also completed with American Indian treatment programs during Phase 1 of this project. Phase 2 (Early Implementation: 08/2007–04/2008) consisted of ten focus groups and three interviews with direct service providers at the same 13 treatment agencies (n = 58) and three focus groups and three interviews at American Indian treatment programs (n = 15). This study includes data from direct service providers who participated in focus groups and interviews in the planning phase (n = 66) and early implementation phase (n = 73) for ORS 182.525. Agency directors and community and government stakeholders also participated in the study, but their data are not included in this analysis.

Open-ended interview and focus group questions from the initial planning phase addressed experience to date with EBPs; the influence of the mandate; EBPs and diverse populations; and overall planning and preparation experiences (using the query, “What are your thoughts about monitoring and tracking the use of EBP in your agency?”). Phase 2 interview and focus group questions, posed three years after initial implementation, addressed changes and early experiences in working to implement EBPs, including staff selection and training; organizational structure and culture; supervision; and fidelity monitoring (using the query, “How has feedback (supervision; coaching) influenced your acceptance or use of EBPs?”). This study was approved and overseen by the Oregon Health & Science University Institutional Review Board.

Analysis

To prepare interview responses for analysis, audiotapes were transcribed verbatim into individual computer text files. Analysis of transcriptions was then facilitated by ATLAS.ti qualitative software and corresponded with the work of Luborsky and Rubenstein (1995), in which two indexes of significance are employed: frequency of statements and direct statements of salience or meaning. Thus, the most frequently mentioned or coded categories were deemed important, as were direct, explicit comments emphasizing the importance of, value of, or beliefs about a specific practice.

Categories for qualitative data were determined through an iterative process of systematic data review and (re)classification (Good & Good 1982; Agar 1980). Subsequent to the application of closed codes, grouping and selective codes were applied in ATLAS.ti in order to identify patterns and relationships between categories and themes (Strauss & Corbin 1990). Finally, content and thematic analysis was used to extract recurrent and salient matters (Creswell 2007). To ensure inter-rater reliability, we used a sequential coding process. First, all research staff were involved in the development and refining of the coding scheme. Second, all staff independently coded substantial sections of multiple transcripts and then met to compare and discuss coding decisions. Third, all staff coded two transcripts and then met with the PI to review consistency of initial coding before continuing to complete the rest of the coding for the project. Finally, 15% of documents were check-coded by the PI or project director for inter-rater consistency (Lincoln & Guba 1985). This process showed a strong degree of consistency across coders.
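For readers interested in how such a consistency check can be quantified, the sketch below computes two common agreement statistics, percent agreement and Cohen’s kappa, over paired code assignments. This is a minimal illustration under stated assumptions, not a description of the study’s actual procedure: the article does not report which statistic was used, and the function names, code labels, and example data are hypothetical.

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Proportion of transcript segments assigned the same code by both coders."""
    matched = sum(a == b for a, b in zip(codes_a, codes_b))
    return matched / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e),
    where p_e is the agreement expected from each coder's marginal code frequencies."""
    n = len(codes_a)
    p_o = percent_agreement(codes_a, codes_b)
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in freq_a.keys() | freq_b.keys())
    return (p_o - p_e) / (1 - p_e)

# Hypothetical code assignments by the PI and a staff coder for one
# check-coded transcript (labels invented for illustration only).
pi_codes    = ["distrust", "fidelity", "latitude", "fidelity", "latitude"]
staff_codes = ["distrust", "fidelity", "latitude", "distrust", "latitude"]
print(f"agreement = {percent_agreement(pi_codes, staff_codes):.2f}")  # 0.80
print(f"kappa     = {cohens_kappa(pi_codes, staff_codes):.2f}")       # 0.71
```

In practice, a statistic of this kind computed over the 15% of check-coded documents would put a number on the degree of consistency across coders reported above.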

RESULTS

After the review of focus group and interview transcripts, participant responses were organized into three emergent, recurrent themes that reflect a broad framework for examining clinicians’ response to the legislative mandate and the use of EBPs. The three themes were (1) concerns about retaining individualized treatment and clinical latitude, (2) distrust of government involvement in clinical issues, and (3) need for accountability and credibility for the field. Each theme includes additional subcategories or domains and a related description of the more detailed context reflected by the participants’ comments (see Table 1). Direct quotations and summaries of comments within each of the themes and domains are described below.

TABLE 1.

Framework of Clinician Perspectives about SB 267 and Evidence-Based Practice

Concerns about Retaining Individualized Treatment and Clinical Latitude
  Congruence with client needs: Skepticism persists about the fit of EBPs with particular client populations and geographic regions.
  Research limitations: Concerns about the limited research for diverse populations, as well as a lack of applied research focused on implementation.
  Manualized treatment: Frustration regarding the constraint of clinical care and narrow treatment scope, as well as a lack of resources.

Distrust of Government Involvement in Clinical Issues
  Expertise of the legislature: Legislators lack adequate knowledge of the field to make recommendations and determine what constitutes an EBP.
  Motivation behind the mandate: Lack of clarity about the mandate, its expectations, and its scope led to increased suspicion toward state officials.
  Defining EBPs: Confusion persists concerning what constitutes an EBP, who determines this, and how EBPs will continue to be developed, assessed, and revised.

Need for Accountability and Credibility for the Field
  Fidelity: Concerns about compliance diminished, although limited resources and a lack of tools to measure practice implementation remained fairly consistent over time.
  Standardized services: Belief that the mandate may level the playing field through standard practice guidelines and greater consistency for providers.
  Legitimacy and professionalism: Counselors were hopeful that the policy would bring credibility to the addiction field.

Individualized Treatment and Clinical Latitude

Many of the clinicians reported concerns about EBPs that seemed to precede the state mandate; thus, such concerns might prevail even in the absence of pressure from the state government. The broadest of these was the congruence of practices with client needs, or how EBPs fit (or fail to fit) the needs of particular patients. Evidence-based practices are often developed and promulgated by the research community, which many clinicians consider to be out of touch with the nature of clinical work or the requirements of their patient populations. Clinicians expressed this disconnect in two related ways. First, clinicians at participating agencies were keenly aware of the idiosyncrasies of their organizations, client demographics, and social contextual circumstances, and frequently noted that designated EBPs tested on demographically different subjects likely would not work with clients from their communities. The following remark, made during the planning phase by a frontline counselor at an agency in rural Oregon, was typical among clinicians:

I think it has to do with evidence according to whom. Most people’s evidence doesn’t work for our community. We’re predominantly [working with] Caucasian clients abusing methamphetamine and cannabis. Our youth are deep into the alcohol and cannabis and it’s part of their culture. We’re not dealing with heroin addicts or a lot of gangs.

Respondents’ concerns regarding the fit of EBPs with the range of clients also seemed to be perceived as limiting their clinical latitude and restricting services. This frustration seemed to persist into the early implementation phase, as illustrated by the following quote from a counselor that was recorded approximately three years after the mandate began:

… because it’s so structured, and typically, a manualized treatment program is for a very specific population of people, persons that fit specific criteria … oftentimes, not everybody fits in that category.

Second, as they worked to implement EBPs, clinicians felt burdened with the need to adapt materials and interventions. Adjusting each EBP to be congruent with various program and client needs requires extensive time, leaving clinicians discouraged, as their comments from approximately three years into the implementation of the mandate reflect:

You can’t take something from a very controlled residential setting that works there and extrapolate that to outpatient. There are too many differences. I’d like to see some studies and some information of best practices that comes from methadone that works in outpatient … or vice versa. I mean, in other words, I’m a little bit frustrated.

Results of this study indicate that in addition to the paucity of research about EBPs for specific cultural groups and clinical populations, clinicians also reported a lack of evidence about methods for implementing EBPs in real-world settings. These concerns converge in the second domain, which reflects recurrent statements about research limitations and the paucity of applied research. Many respondents, in both the planning phase and implementation phase of the mandate, were quick to call into question EBPs that have not been tested among a range of client populations, as exemplified by a counselor from a tribal treatment program who was interviewed approximately three years after the policy came into effect:

Let’s talk a minute about what evidence-based is. It’s a treatment modality or approach that has some research behind it that shows it works. So then we have to think about, well, where did they do this research? Well, I can tell you from my experience and the disparity of the Native peoples that this study did not come from Native peoples.

Regarding how to implement EBPs in clinics that are not part of a controlled, funded, randomized trial, clinicians’ comments clearly reflect frustration and discontent with the mandate. A counselor at initial implementation noted that “drastic modifications” were required to implement new practices, demanding excessive time and energy from already strained staff and systems. Similarly, three years after initial implementation, no funding or support was available for this work, as described by a counselor from a rural agency:

It’s been a process over a couple years like, well, how do we implement this new policy? The Legislature got together and developed it and then voted on it. And here it is, thrown out to everybody to do. Nobody really knew what to do or what that meant.

Finally, the third domain within the individualized treatment and clinical latitude theme was the tension between the proliferation of manualized treatments for substance abuse and a pervasive emphasis on “individualized treatment plans” among clinicians in the field. A number of EBPs, including many identified by AMH, follow very particular protocols, often specifying both the content of the intervention and precisely how it must be implemented (e.g., the Matrix Model and Multidimensional Family Therapy). During the planning phase, clinicians reported that the pressure to use only manualized treatments directly contradicts the creation of treatment plans that are tailored to each client’s specific needs, circumstances, and history. The following exchange between clinicians at an agency in an urban area is suggestive of this tension:

R#7: My fear is that sometimes when a manual comes out, clinicians read that manual frontwards and backwards. But all they’re doing is going through the motions … It’s not like the stuff’s no good, I’ve never heard that—but clinicians don’t want to be locked in.

R#1: You touched on something so important. I can take the book and I can read it, but one of the most VALUABLE things for me here is that when client issues come up … being able to process things and get their feedback and try to integrate that … is just SO valuable. And I will NEVER give that up.

When using their own sense of clinical expertise and practice wisdom, counselors are able to establish treatment plans and corresponding interventions that fit the unique circumstances of their clients. For clinicians in Oregon, the crux of the conflict appears to be between the use of EBPs and the importance of authenticity, unconditional regard for the client, and relationship building. Indeed, some providers felt that the structure and content of clinical practice is nearly irrelevant (EBP or not) when weighed against the importance of the “genuineness” of the practitioner. This is reflected in the following quote from the planning phase in 2005:

It’s the practitioner rather than the practice that matters, as far as it being effective or not. It doesn’t matter as much whether you’re a true Big Book thumper that is presenting AA material if you really truly believe in what you’re doing and you have a passion for it, then you’re going to do well with the clientele that you’re working with.

Although some of these concerns about manualized treatment persisted well into the implementation phase (nearly three years after initiation of the mandate), there were fewer such comments overall in Phase 2, and it appears that, at some level, counselors were beginning to resolve this conflict:

For me, evidence-based practices add a little solidity to things. It tells me, okay, there’s possibly been some research there…. So, it’s kind of nice having that, and at the same time, knowing research is still going on.

Distrust of Government

Over the course of the study, clinicians’ comments and discussions reflected a second broad theme, distrust of the state government. Within this overarching sense of suspicion, several key domains emerged through repetitive, salient comments from study participants. The first domain, expertise of the legislature, was often voiced as an explicitly prejudicial stand against the meddling of politicians in clinical matters, as a counselor from an agency in rural Oregon suggested during the planning phase:

Why are politicians—and I have a natural distrust of politicians, that’s my prejudice—why are they getting involved? It bothers me when people who aren’t involved in doing the work start telling us how to do the work.

A similar perspective was noted in a comment made during the planning phase:

It’s really sad to me that the government would be telling us how to conduct our treatment. We should be able to do that on our own because we know better than anyone how to best serve our clients. We’re in this mess because the legislature did not believe that we knew what we were doing; I believe that very firmly…. You do all of this stuff and say it works. You don’t have any proof that it works…. The evidence-based practice thing came along as something to use as a guide, and so that bill was passed. Now the state is sitting there trying to figure out how to respond to it. They haven’t really got a ghost of a clue.

This perceived lack of knowledge and limited respect for the government officials charged with policy development and distribution of funds for addiction treatment services continued to be apparent during the follow-up or early implementation phase of the study, as noted by a counselor in an urban treatment agency:

These are folks who aren’t treatment providers. They’re legislators. And they listen to whoever is talking in their ear, and then they pass laws and make policy that doesn’t really have a lot to do with what actually happens when you’re sitting in an office talking to them [clients].

An important aspect of this sort of a priori resistance to the mandate was that it seemed to amplify the negative reaction toward the state legislature, leaving the strong perception among providers that legislators were uninformed and generally had no idea what addiction treatment was like on a day-to-day basis. Results of our study suggest that the lack of appropriated funds, and the single state authority’s subsequent confusion about how to implement a broad policy change without set-aside resources or a plan, converged and failed to convince frontline staff that this mandate would benefit treatment program clients and providers.

Closely following this sense that the legislature was uninformed and lacked adequate knowledge of substance abuse treatment services was the second domain, motivation behind the mandate. This subtheme surfaced repeatedly in the transcripts, as providers felt that the mandate was unclear and had passed without careful attention and input. While many providers may have a sophisticated understanding of how state policy is created and enacted, they tended not to differentiate the actions of one governmental body from another in evaluating the state mandate. Thus, the overall distrust seemed to intensify clinicians’ negative responses to both the legislature and AMH.

During the planning phase, clinicians’ comments suggested an impression that the EBP mandate was shoved through the legislature in a hurry, for the sake of political expediency, but without consideration for the challenges the mandate could raise. Respondents reported that the EBP mandate had been “gut and stuffed” into Senate Bill 267, most of which dealt with unrelated matters, and therefore was passed into law without debate, planning, or refinement. During Phase 1, one counselor expressed frustration with this lack of knowledge and consideration in the development of the mandate:

We’re not even sure how to define it [SB 267]. The problem is that the legislature sometimes gets their foot stuck in their mouth on things and it gets pushed through. Then it’s left up to the lawyers to decide what it means and what it doesn’t mean.

Indeed, the sentiment among clinicians was that the way in which the mandate was passed was having significant negative consequences on how it was being implemented. The following comment from a clinician during the planning year highlights confusion and discontent about how to institute the mandate:

A better way to have gone about this would have been a budget appropriation to explore different treatment modalities and then to come back to the legislature with a comprehensive report on what the current thinking is in alcohol and drug treatment. It would have been preferable to approach it that way instead of just slamming something out and say it’s evidence-based practice and you’ve got to do it. Now we’re pressured.

Approximately three years later, during implementation, distrust of the motivations behind the mandate, distrust of government, and frustration still remained:

The other thing is, is that it’s easy for them [legislators] because they sit somewhere else. They don’t have to do all this. If they had to come and do some of this stuff, I’m sure that would make a big difference on how much paperwork we actually had to do. We’re in the trenches and they aren’t. They need to stay out of what we do.

The third domain that extends the overarching theme of distrust of the government is confusion around what constitutes evidence and who gets to define what evidence-based practices are for substance use disorders. Frequently, as reported above under the theme of individualized treatment and clinical latitude, clinicians felt that there were not enough practices to select from, doubted the evidence that was available, and felt that the approved EBPs required significant adaptation. Apparently, a sufficient definition of evidence-based practices that could have provided a solid guide for providers was not available at the outset.

Moreover, the imposition of both the mandate and the pressure to define and then increase use of EBPs converged for many providers, each seeming to catalyze and accentuate concerns overall. The blurring of these pressures as well as the lack of consensus about what constitutes an EBP are noted in statements such as “They’re telling us what to do.” It seems likely that the “they” to whom this counselor is referring is the legislature, but the referent might also be a vague amalgamation of AMH and the research community, which they perceive to be championing EBPs.

The following interaction among clinicians at an urban agency in the planning phase in 2005 reveals a similar blurring of sentiments about EBPs and the state mandate:

R#1: So whose evidence is it? Like I said, the motivational interviewing comes to mind because it’s been shoved down our throat. I think that it’s useful information, but I don’t think it necessarily fits every situation.

R#3: It’s contradictory because in the same breath we’re being told, go where the client is, but if you have to use certain evidence-based practices—sometimes you’re not allowed to do that.

R#5: … addiction is very complex with so many facets. I’m hearing that what works here doesn’t work there. I don’t think the government belongs there anyway.

Later, during the early implementation of the mandate, clinicians’ comments shifted from a focus on defining EBPs to implementation challenges and a continued frustration with a lack of consensus between the single state authority and the legislature about EBPs, as reflected by the following comment from an urban provider in 2008:

I think they [the single state authority] don’t understand evidence-based practice in the way the legislature wants to implement it; it just kind of goes in one ear and out the other, and they hope that management is doing what they need to implement the EBP.

Overall, in terms of identifying EBPs and sharing this information, AMH made great strides over time and provided a full list of practices and links to trainings, materials, and other opportunities. It may be that frontline clinicians were just becoming aware of this information as full implementation of the mandate took place in 2007 and 2008, and our results may reflect a movement toward greater comfort with and acceptance of the AMH definitions of what constitutes an EBP.

Accountability and Credibility

A final emergent theme from the focus groups and interviews revealed that in spite of all of their concerns about the state mandate, clinicians expressed hopes that the new policy might lead to some positive outcomes. Although they described significant challenges they would face in working to improve the quality of services (fidelity and accountability) and in shifting the perceived lack of legitimacy and otherwise negative views of the field, they felt the mandate might help move these processes along.

First, in terms of accountability and monitoring, there were concerns about having the resources needed to comply with the mandate, as well as about the consequences of failing to comply. This uncertainty surfaced in numerous discussions about fidelity, the AMH plan for evaluating compliance, and the impact on clinical care; yet at the same time, clinicians also felt that it was important to respond and to document the good work they do on a daily basis. Such sentiments are clear in the cheerful reflection from a counselor in a remote part of the state during the planning phase of the study:

Good things do come out of it. It has heightened an awareness that we are accountable for what we do. It isn’t a Ouija Board magic effect. It isn’t just all going to a big AA-meeting-in-the-sky kind of thing. There are very specific things that people do that have specific results, and it’s legitimate to ask for accountability for tax dollars being spent on treatment.

Further, in terms of benchmarks and which outcomes to focus on, a counselor with experience using motivational interviewing (MI), who had attended presentations about the efficacy of MI, discussed during the planning phase her concerns about the government’s support of the use of MI in substance abuse treatment:

… it didn’t matter which [MI or treatment as usual] they were using, it all equaled the same outcome. And what really was the predominant factor was the relationship with the client…. So, if in fact, the evidence is really that our relationship with the client OUTWEIGHED what we’re doing … why are we focusing [in the mandate] on … the substance abuse manual? Why aren’t we focusing on advancing that relationship with our clients? Well, because they can’t put it in numbers…. But if you’re using that MI manual, I CAN gauge and mathematically compute what you’re doing …

In her recollection, clinical data from a randomized trial of motivational interviewing demonstrated that the “relationship with the client,” a proxy for clinical “genuineness,” rather than adherence to an MI protocol, is the most significant factor in clinical success. But this counselor sees the allure for researchers and state government in subjecting clinical work to quantitative measurement. This policy allows the state to control what clinicians are “doing with [their] clients.”

Similarly, several years later during early implementation, a counselor from an urban program expressed concerns about the lack of overall system support for implementation and fidelity monitoring:

I don’t think the infrastructure is in place to really create incentives for implementation and fidelity. And I don’t think there is really enough definition of what fidelity is and what it would really look like in this context.

Overall comments and concerns about fidelity monitoring persisted from the planning phase through early implementation, and were most consistently linked with a lack of funding, infrastructure, and tools. The economic downturn and associated budget cuts that began in the late 2000s may have influenced the concerns about compliance with the mandate, as the state could not afford to complete yearly site visits or reviews. During the later implementation phase, participants’ responses reflected less anxiety about compliance but they continued to acknowledge the need for fidelity tools and monitoring in spite of limited resources.

A second domain within this theme of accountability and credibility was found in numerous responses from clinicians emphasizing the importance of developing a more standardized set of practices. Clinicians were optimistic about standardization improving outcomes and providing a greater accountability for the most effective use of state funds and reimbursements, as one counselor from an urban agency reported:

I do believe that we, as an industry, need to move forward as a discipline into some scientific place that’s away from this AA culture of sponsors becoming clinicians and “what works for me will work for you.” You know what I mean? I’m in favor of it, but I have reservations…. I do think as a field, we still fight for that credibility.

Additionally, some clinicians thought the mandate would bring some leverage to the field, offering resources to clinicians, as recognized by a counselor from an urban treatment program during early implementation of the mandate:

… there are really good clinicians that can just kind of handle process intensive work, and have really good results with it. Then there are others that can’t. So I think when you have that manual to pull from, it kind of levels out the playing field.

At the same time, clinicians were at times enthusiastic about their potentially growing social capital, and the possibility that the state mandate might afford the relatively under-respected and undercompensated field of addiction treatment greater legitimacy and professionalism, as suggested by a counselor in the early implementation phase:

I find it something to be able to gauge the clients by something that they feel can provide or help them see the actual change and movement in themselves, where before it was all kind of subjective.

A counselor in rural Oregon reflected this pragmatic orientation: “No matter what the original motivation was, if we do it, and hopefully the outcomes support what we do, then we have credibility. In some ways, we will have more credibility than some of the other fields [in health care].” Clinicians reported that they are optimistic that the passage of the mandate is a sign that state government is paying greater attention to substance abuse treatment, and they appreciate how the mandate has increased the visibility of the field generally:

I think it’s great that they’re having best practices that are evidence-based, because it professionalizes our profession. For a long time, drug and alcohol clinicians were kind of the stepchildren of clinicians. Over the years as we’ve been doing this, and we do have an impact, and we do know what we’re doing, it kind of allows us to have a better reputation and at the same time professionalizes us more.

In many instances, as clinicians articulated their hopes for increased credibility or legitimacy, they simultaneously evoked hopes for a more equitable distribution of money through the treatment field. In part, underpaid clinicians are hoping that the state mandate may somehow lead to better compensation for frontline staff. One counselor wondered: “If dollars will move towards the agencies that are the most effective, then yeah, it’s really exciting and enthusiastic to throw our weight behind how good can we be. I don’t see why you wouldn’t want to get into that kind of work.” At the same time, clinicians recognized that the infusion of more money into substance abuse treatment would need to be handled with care. The state mandate may mean greater credibility for those providing treatment, but clinicians are also wary that certain agencies and practitioners may be left behind or alienated. Clinicians at rural and less well-funded agencies worry that because they are already resource-poor, they will be at a disadvantage in trying to make adjustments to EBPs. Many clinicians at such agencies complained that a lack of access to training and EBP programs was particularly difficult for them, requiring that they travel long distances, often at great expense and professional inconvenience.

For most clinicians we spoke with, optimism about the potential social capital in their field from the state EBP mandate is tempered with concern about the potentially corrupting influence of money in medical care. A counselor from a financially healthy, urban agency in 2007–2008 expressed both his hopes and his worries:

On the one hand, I think it’s really positive overall because it gets the whole field focused in this direction, but I also know that one of the ways that conversations and change is shaped is by attaching dollars to it. Then I have this thought of what’s gonna be the overall outcome of this whole thing, not only regionally but then nationally? Are we trying as a substance abuse and mental health field to mirror the medical model of treatment? If that’s the case, I have some real concerns about it.

DISCUSSION

Overall, our findings confirm previous research suggesting both that there is a paucity of EBPs for specific populations (Miller et al. 2006; Anthony, Rogers & Farkas 2003; Rogers 2003; Drake et al. 2001) and that clinical expertise and professional development are significant concerns for clinicians (Knudsen et al. 2005; Thomas et al. 2003). Our findings also extend the literature by examining these issues within the frame of clinical practice legislation. Results suggest that clinicians have strong opinions about the involvement of the government in clinical matters, the use of manualized practices, the application of evidence in real-world settings, and their autonomy as clinicians. They also expressed optimism about ways in which this mandate might increase the credibility of the field and provide direction and even cohesion. Thus, for clinicians in Oregon, experiences and attitudes about clinical decision-making are now intertwined with reactions to Senate Bill 267, and those participating in the present study appeared to have very complex and, at times, conflicted feelings about EBPs in the context of the mandate.

Consistent with a growing body of research regarding clinicians’ attitudes toward EBPs and the critical influence these perspectives have on the integration of new tools in clinical care (Aarons, Sommerfeld & Walrath-Greene 2009; Aarons 2004; McGovern et al. 2004; Willenbring et al. 2004), our work reinforces the need to examine such attitudes within the full context of state and federal guidelines, policy, and other such initiatives aimed at reducing costs and improving the quality of services. Results of this study also confirm the importance of clinicians’ sense of competence; thus, training, supervision, and feedback are critical when working to shift practice patterns (Fixsen et al. 2005; Obert et al. 2005). Similarly, the acceptance of innovations is enhanced with increased exposure to the efficacy of EBPs (Herbeck, Hser & Teruya 2008). Further, the overall response to the mandate during both the planning and early implementation phases of the project appeared to correspond with diffusion of innovation research, in which early adopters of the mandate emerged alongside those more hesitant to engage (Rogers 2003, 1995). When encouraged to discuss EBPs more specifically, we found that some clinicians were able to embrace the most recent therapeutic techniques, while others, many of whom have long tenures as treatment providers, perceived the emergent modalities with considerable suspicion (Knudsen et al. 2005; Aarons 2004; Marinelli-Casey, Domier & Rawson 2002; Roman & Johnson 2002). Clinicians were concerned about how EBPs might compromise their clinical expertise and contradict their efforts to tailor treatments to individual needs. Respondents in this study feared a mechanistic type of treatment, in which adherence to highly structured manuals was of greater value than attending to relationship dynamics and processes of continued healing over time (Graybeal 2007). These factors ultimately limit full-scale adoption, as clinician attitudes and corresponding intention to use innovative practices are critical to successful implementation (Fixsen et al. 2005; Thomas et al. 2003; Rogers 2003, 1995).

Moreover, clinicians expressed serious distrust of the state’s involvement in their work, and this compounded their worries about how EBPs constrain their clinical practices. Many felt that the state government had simply overstepped its bounds. Furthermore, and perhaps posing the most serious ramifications for the state mandate, clinicians worried that the use of EBPs in substance abuse treatment subjected their clinical work to the state government’s measurement and control. In turn, they felt that the state lacked the direction and resources necessary to track progress and monitor fidelity, which seemed initially to be a key component of the mandate.

With the passage of a mandate, there is a conflation and mutual accentuation of the influence of the state and the research community over clinicians’ daily practice. Where there are already pronounced worries about how EBPs might be problematic, a state mandate has the strong potential to exacerbate these sentiments and further prejudice providers against EBPs. In Oregon’s case, the state government aligned itself directly with EBPs in the hopes of improving outcomes for clients and diminishing costs for payers, and is invoking its juridical and pecuniary power in mandating the use of EBPs. Thus, the interests and political power of substance abuse treatment stakeholders and government officials involved in the EBP movement are hardly invisible to clinicians. Specifically, clinicians in Oregon understand that the state government’s promotion of EBPs is loaded with both political possibilities and liabilities. Clinicians’ responses to this mandate and their use of EBPs are thus tied to their own social capital, that is, their actual and potential resources, social networks, and institutional relationships (Bourdieu 1977). In the present case, clinicians generally have low wages, may be in recovery, and lack political influence, and thus may have very limited access to social capital. Our findings suggest that they were clearly aware of the connection between their lack of power relative to the state’s administrative and political power and the infringement upon their professional values and practices. These findings correspond with a previous study of counselors faced with shifting to a managed care model, which found that the change raised significant concerns about the counselors’ sense of authenticity and about diagnostic ambiguities that may not jibe with the use of manualized interventions or restrictive practice guidelines (Kirschner & Lachicotte 2001).

As our data suggest, substance abuse treatment workers are wary of the state’s involvement in their work and of the intrusion of politics into clinical practice. In spite of such distrust, the clinicians seemed to respond well to our research initiative, and during the early implementation phase of the study, their greater sense of acceptance and knowledge about the mandate and methods for responding to it was notable. Furthermore, in spite of their reservations, clinicians still expressed optimism that the state government’s support of EBPs may lead to greater social and institutional legitimacy for the field of substance abuse treatment.

Research and civic institutions that hope to increase EBP use in substance abuse treatment should be heartened that the treatment community wants to engage in conversation with researchers and the state to improve the standing of substance abuse treatment in health services and health care. It may be good news for government payers that the state mandate could be a promising means of promoting dissemination of EBPs in substance abuse treatment, given the relatively low costs of implementing such an unfunded mandate.

Study Limitations

We acknowledge that our study has several limitations that reflect the challenges of policy implementation research and analysis. First, findings are based on data from only a portion of programs, and from staff present at the time of the site visits, which may not reflect all programs or staff. Second, the study is qualitative and may not generalize as broadly as population-based, multistate, quantitative studies. This methodology was selected to provide an in-depth understanding of the issues faced by programs and clinicians responding to one of the first evidence-based practice mandates in the country. In spite of such limitations, we believe our findings contribute to the literature and reflect the important perspective of direct service providers.

Conclusion

In the future, state-level policy makers would benefit from fully engaging clinicians when developing this type of initiative. Frontline staff and providers from all levels of service delivery have valuable insights about how to conduct training, define evidence, measure fidelity, and address concerns regarding compliance and trust in government officials. Indeed, clinicians faced with responding to this mandate were forthright about the likelihood that the policy’s implementation would be undermined without their enthusiastic support. Thus, their involvement will help clarify understanding of the policy and improve clinician buy-in, supporting successful implementation of innovations and EBPs. Overall, policy makers would likely find a more immediate and positive response to such legislation with greater inclusion of providers at the onset (creation of the mandate) and with more careful attention to detailed plans and to the impact of the mandate on individual clients and their treatment providers.

References

  1. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
  2. Aarons G, Palinkas L. Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health & Mental Health Services Research. 2007;34:411–19. doi: 10.1007/s10488-007-0121-3.
  3. Aarons GA, Sommerfeld DH, Walrath-Greene CM. Evidence-based practice implementation: The impact of public versus private sector organization type on organizational support, provider attitudes, and adoption of evidence-based practice. Implementation Science. 2009;4:83. doi: 10.1186/1748-5908-4-83.
  4. Addis ME, Wade WA, Hatgis C. Barriers to dissemination of evidence-based practices: Addressing practitioners' concerns about manual-based psychotherapies. Clinical Psychology: Science and Practice. 1999;6:430–41.
  5. Agar M. The Professional Stranger. New York: Academic Press; 1980.
  6. Amodeo M, Ellis MA, Samet JH. Introducing evidence-based practices into substance abuse treatment using organization development methods. American Journal of Drug and Alcohol Abuse. 2007;32(4):555–60. doi: 10.1080/00952990600920250.
  7. Anthony W, Rogers ES, Farkas M. Research on evidence-based practices: Future directions in an era of recovery. Community Mental Health Journal. 2003;39(2):101–14. doi: 10.1023/a:1022601619482.
  8. Bourdieu P. Distinction: A Social Critique of the Judgment of Taste. Cambridge: Harvard University Press; 1984.
  9. Bourdieu P. Outline of a Theory of Practice. Cambridge: Cambridge University Press; 1977.
  10. Boyle M. Response 2: The states are the key factor in successful EBP adoption and technology transfer. The Bridge. 2009;1(2). Available at http://www.attcnetwork.org/find/news/attcnews/epubs/v1i2_article02.asp.
  11. de Certeau M. The Practice of Everyday Life. Berkeley: University of California Press; 1984.
  12. Chriqui JF, Terry-McElrath Y, McBride DC, Eidson SS, VanderWaal CJ. Does state certification or licensure influence outpatient substance abuse treatment program practices? Journal of Behavioral Health Services and Research. 2007;34(3):309–28. doi: 10.1007/s11414-007-9069-z.
  13. Creswell J. Qualitative Inquiry and Research Design: Choosing Among Five Approaches. 2. Thousand Oaks, CA: Sage Publications; 2007.
  14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009;4(50):1–15. doi: 10.1186/1748-5908-4-50.
  15. Drake RE, Goldman HH, Leff HS, Lehman AF, Dixon L, Mueser KT, Torrey WC. Implementing evidence-based practices in routine mental health service settings. Psychiatric Services. 2001;52(2):179–82. doi: 10.1176/appi.ps.52.2.179.
  16. Ducharme LJ, Abraham AJ. State policy influence on the early diffusion of buprenorphine in community treatment programs. Substance Abuse Treatment, Prevention, and Policy. 2008;3:17. doi: 10.1186/1747-597X-3-17.
  17. Dybicz P. An inquiry into practice wisdom. Families in Society: The Journal of Contemporary Social Services. 2004;85(2):197–203.
  18. Evidence-Based Medicine Working Group. Evidence-based medicine: A new approach to teaching the practice of medicine. Journal of the American Medical Association. 1992;268(17):2420–25. doi: 10.1001/jama.1992.03490170092032.
  19. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005.
  20. Forman R, Bovasso G, Woody G. Staff beliefs about addiction treatment. Journal of Substance Abuse Treatment. 2001;21(1):1–9. doi: 10.1016/s0740-5472(01)00173-8.
  21. Fuller B, Rieckmann T, Nunes E, Miller M, Arfken C, Edmundson E, McCarty D. Organizational readiness for change and opinions toward treatment innovations. Journal of Substance Abuse Treatment. 2007;33:183–92. doi: 10.1016/j.jsat.2006.12.026.
  22. Galanter M, Keller DS, Dermatis H, Egelko S. The impact of managed care on substance abuse treatment: A report of the American Society of Addiction Medicine. Journal of Addictive Diseases. 2000;19(3):13–34. doi: 10.1300/J069v19n03_02.
  23. Gambrill E. Evaluating the quality of social work education. Journal of Social Work Education. 2001;37:418–29.
  24. Garner BR. Research on the diffusion of evidence-based treatments within substance abuse treatment: A systematic review. Journal of Substance Abuse Treatment. 2009;36:376–99. doi: 10.1016/j.jsat.2008.08.004.
  25. Gelber S, Rinaldo D. State Substance Abuse Agencies and their Placement within Government: Impact on Organizational Performance and Collaboration in 12 States. Rockville, MD: Substance Abuse and Mental Health Services Administration, United States Department of Health and Human Services; 2005.
  26. Good B, Good M. Toward a meaning-centered analysis of popular illness categories. In: Marsella A, White G, editors. Cultural Conceptions of Mental Health and Therapy. Dordrecht: Reidel; 1982.
  27. Gray M, McDonald C. Pursuing good practice? The limits of evidence-based practice. Journal of Social Work. 2006;6(1):7–20.
  28. Graybeal C. Evidence for the art of social work. Families in Society: The Journal of Contemporary Social Services. 2007;88(4):513–23.
  29. Herbeck DM, Hser Y, Teruya C. Empirically supported substance abuse treatment approaches: A survey of treatment providers' perspectives and practices. Addictive Behaviors. 2008;33(5):699–712. doi: 10.1016/j.addbeh.2007.12.003.
  30. Institute of Medicine (IOM). Bridging the Gap Between Practice and Research: Forging Partnerships with Community-Based Drug and Alcohol Treatment. Washington, DC: National Academy Press; 1998.
  31. Kirschner S, Lachicotte W. Managing managed care: Habitus, hysteresis and the ends of psychotherapy. Culture, Medicine and Psychiatry. 2001;25:441–56. doi: 10.1023/a:1013068803396.
  32. Klein KJ, Sorra JS. The challenge of innovation implementation. Academy of Management Review. 1996;21(4):1055–80.
  33. Knudsen H, Ducharme LJ, Roman PM, Link T. Buprenorphine diffusion: The attitudes of substance abuse treatment counselors. Journal of Substance Abuse Treatment. 2005;29(2):95–106. doi: 10.1016/j.jsat.2005.05.002.
  34. Landsman GH. What evidence, whose evidence? Physical therapy in New York State's Clinical Practice Guideline and in the lives of mothers of disabled children. Social Science & Medicine. 2006;62(11):2670–80. doi: 10.1016/j.socscimed.2005.11.028.
  35. Lincoln YS, Guba EG. Naturalistic Inquiry. Newbury Park, CA: Sage Publications; 1985.
  36. Lipsky M. Street-level Bureaucracy: Dilemmas of the Individual in Public Services. New York: Russell Sage; 1980.
  37. Luborsky M, Rubinstein RL. Sampling in qualitative research: Rationale, issues, and methods. Research on Aging. 1995;17(1):89–113. doi: 10.1177/0164027595171005.
  38. Magnabosco J. Innovations in mental health services implementation: A report on state-level data from the U.S. Evidence-Based Practices Project. Implementation Science. 2006;1:13. doi: 10.1186/1748-5908-1-13.
  39. Marinelli-Casey P, Domier CP, Rawson RA. The gap between research and practice in substance abuse treatment. Psychiatric Services. 2002;53(8):984–87. doi: 10.1176/appi.ps.53.8.984.
  40. Mark TL, Levit KR, Coffey RM, McKusick DR, Harwood HJ, King EC, Bouchery E, Genuardi JS, Vandivort-Warren R, Buck JA, Ryan K. National Expenditures for Mental Health Services and Substance Abuse Treatment, 1993–2003. SAMHSA Publication No. SMA 07-4227. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2007.
  41. McCarty D, Fuller B, Arfken C, Miller M, Nunes E, Edmundson E, Copersino M, Floyd A, Forman R, Laws R, Magruder K, Oyama M, Prather K, Sindelar J, Wendt W. Direct care workers in the National Drug Abuse Treatment Clinical Trials Network: Characteristics, opinions and beliefs. Psychiatric Services. 2007;58:181–90. doi: 10.1176/appi.ps.58.2.181.
  42. McGovern MP, Fox TS, Xie H, Drake RE. A survey of clinical practices and readiness to adopt evidence-based practices: Dissemination research in an addiction treatment system. Journal of Substance Abuse Treatment. 2004;26:305–12. doi: 10.1016/j.jsat.2004.03.003.
  43. McLellan AT. Technology transfer and the treatment of addiction: What can research offer practice? Journal of Substance Abuse Treatment. 2002;22(4):169–70. doi: 10.1016/s0740-5472(02)00240-4.
  44. Miller W, Zweben J, Johnson W. Evidence-based treatment: Why, what, where, when and how? Journal of Substance Abuse Treatment. 2005;29(4):267–76. doi: 10.1016/j.jsat.2005.08.003.
  45. Miller WR, Sorensen JL, Selzer JA, Brigham GS. Disseminating evidence-based practices in substance abuse treatment: A review with suggestions. Journal of Substance Abuse Treatment. 2006;31:25–39. doi: 10.1016/j.jsat.2006.03.005.
  46. National Quality Forum (NQF). National Voluntary Consensus Standards for the Treatment of Substance Use Conditions: Evidence-Based Treatment Practices. Washington, DC: National Quality Forum; 2007.
  47. Obert JL, Brown AH, Zweben J, Christian D, Delmhorst J, Minsky S, Morrisey P, Vandersloot D, Weiner A. When treatment meets research: Clinical perspectives from the CSAT Methamphetamine Treatment Project. Journal of Substance Abuse Treatment. 2005;28(3):231–37. doi: 10.1016/j.jsat.2004.12.008.
  48. Ortner S. Theory in anthropology since the sixties. Comparative Studies in Society and History. 1984;26(1):126–66.
  49. Patient Protection and Affordable Care Act of 2010 (PPACA). Pub. L. 111–148, 124 Stat. 119, H.R. 3590. (2010).
  50. Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy in Mental Health. 2007;34:479–88. doi: 10.1007/s10488-007-0129-8.
  51. Rapp CA, Bond GR, Becker DR, Carpinello DR, Nikkel RE, Gintoli G. The role of state mental health authorities in promoting improved client outcomes through evidence-based practice. Community Mental Health Journal. 2005;41(3):347–63. doi: 10.1007/s10597-005-5008-8.
  52. Rawson RA, Hasson A, Huber A, McCann MJ, Ling W. A 3-year progress report on the implementation of LAAM in the United States. Addiction. 1998;93:533–40. doi: 10.1046/j.1360-0443.1998.9345338.x.
  53. Regehr C, Stern S, Shlonsky A. Operationalizing evidence-based practice: The development of an institute for evidence-based social work. Research on Social Work Practice. 2007;17(3):408–16.
  54. Rieckmann T, Kovas AE, Cassidy EF, McCarty D. Employing policy and purchasing levers to increase the use of evidence-based practices in community-based substance abuse treatment settings: Reports from Single State Authorities. Evaluation and Program Planning. doi: 10.1016/j.evalprogplan.2011.02.003. In press.
  55. Rieckmann T, Kovas A, Fussell H, Stettler N. Implementation of evidence-based practices for treatment of alcohol and drug disorders: The role of the state authority. Journal of Behavioral Health Services Research. 2009;36(4):407–19. doi: 10.1007/s11414-008-9122-6.
  56. Rogers EM. Diffusion of Innovations. 5. New York: Free Press; 2003.
  57. Rogers EM. Diffusion of Innovations. 4. New York: Free Press; 1995.
  58. Rolfe G. Insufficient evidence: The problems of evidence-based nursing. Nurse Education Today. 1999;19:433–42. doi: 10.1054/nedt.1999.0346.
  59. Roman PM, Johnson JA. Adoption and implementation of new technologies in substance abuse treatment. Journal of Substance Abuse Treatment. 2002;22:211–18. doi: 10.1016/s0740-5472(02)00241-6.
  60. Simpson DD. A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment. 2002;22:171–82. doi: 10.1016/s0740-5472(02)00231-3.
  61. Staller K. Railroads, runaways, & researchers: Returning evidence rhetoric to its practice base. Qualitative Inquiry. 2006;12(3):503–22.
  62. Strauss A, Corbin J. Basics of Qualitative Research. 2. Thousand Oaks, CA: Sage Publications; 1990.
  63. Thomas G. Introduction: Evidence and practice. In: Thomas G, Pring R, editors. Evidence-Based Practice in Education. New York: Open University Press; 2004.
  64. Thomas CP, Wallack SS, Lee S, McCarty D, Swift R. Research to practice: Adoption of naltrexone in alcoholism treatment. Journal of Substance Abuse Treatment. 2003;24:1–11.
  65. Wacquant L. Scrutinizing the street: Poverty, morality, and the pitfalls of urban ethnography. American Journal of Sociology. 2002;107(6):1468–1532.
  66. Webb S. Some considerations on the validity of evidence-based practice in social work. British Journal of Social Work. 2001;31:57–79.
  67. Willenbring ML, Kivlahan D, Kenny M, Grillo M, Hagedorn H, Postier A. Beliefs about evidence-based practices in addiction treatment: A survey of Veterans Administration program leaders. Journal of Substance Abuse Treatment. 2004;26:79–85. doi: 10.1016/S0740-5472(03)00161-2.
