Author manuscript; available in PMC: 2012 Jun 1.
Published in final edited form as: Psychol Addict Behav. 2011 Jun;25(2):225–237. doi: 10.1037/a0022398

Implementing Evidence-based Psychosocial Treatment in Specialty Substance Use Disorder Care

Jennifer K Manuel 1, Hildi J Hagedorn 2, John W Finney 3
PMCID: PMC3119356  NIHMSID: NIHMS297146  PMID: 21668085

Abstract

Implementing evidence-based psychosocial or behavioral treatments for clients with substance use disorders (SUDs) presents significant challenges. In this article, we first identify the treatments for which there is some consensus that sufficient empirical support exists to designate them as “evidence-based,” and then briefly consider the nature of that evidence. Following that, we review data from a Substance Abuse and Mental Health Services Administration survey on the extent to which these evidence-based treatments (EBTs) are used in SUD treatment in the United States. The main focus of the article is a review of 21 studies attempting to implement EBTs from which we glean information on factors associated with more and less successful implementation. We conclude that more conceptually-driven, organizationally-focused (not just individual-provider-focused) approaches to implementation are needed and that, at least with some providers in some organizational contexts, it may be more effective to implement evidence-based practices or processes (EBPs) rather than EBTs.

Keywords: Substance use disorders, psychosocial treatment, behavioral treatment, evidence-based treatment, implementation


As in other areas of health care, substantial attention has been focused in the past decade on providing evidence-based care for persons with substance use disorders (SUDs; i.e., alcohol and illicit drug use disorders). However, implementing evidence-based psychosocial or behavioral treatments can be challenging (Condon, Miner, Balmer, & Pintello, 2008). “Implementation” refers to the incorporation and use over time of a treatment in routine practice in a setting or by providers not using the treatment previously as the result of some active change process. Implementation can be differentiated from “dissemination,” which connotes more passive diffusion of information about a new treatment, and from “adoption,” which connotes initial, rather than sustained, use of a treatment.

In this article, we first identify the psychosocial and behavioral treatments for SUDs for which some consensus exists that there is sufficient empirical support to designate them as “evidence-based,” briefly consider the nature of that evidence base, and provide some data on the extent to which these evidence-based treatments (EBTs) are being used in SUD treatment in the United States. We then review research on the implementation of EBTs to extract information on interventions that have been used to implement EBTs and on factors associated with more and less successful implementation. The review highlights the need for more conceptually-driven, organizationally-focused (not just individual-provider-focused) approaches to implementation. We also raise the possibility that, at least in some situations, it may be more effective to implement evidence-based practices or processes (EBPs), rather than EBTs.

Consensus Evidence-Based Treatments for SUDs

Consensus exists that several psychosocial treatments or interventions for SUDs are “evidence-based.” The evidence base cited consists of findings from either individual studies or meta-analyses of studies that largely were randomized controlled trials in which individuals exposed to these psychosocial interventions had significantly better substance use outcomes either at the end of the treatment phase or at later follow-ups relative to some comparison condition. Additional criteria, such as positive findings by independent research groups and a manual to guide the provision of treatment, often need to be met for a treatment to be designated “evidence-based” or “empirically-validated” (e.g., McCrady, 2000).

Consensus evidence-based SUD treatments include behavioral couples therapy (although the validity of some studies is in question, sufficient other evidence supports this treatment), cognitive-behavioral therapy (including relapse prevention), contingency management, motivational enhancement/motivational interviewing, and 12-step facilitation treatment. Brief interventions for alcohol misuse and alcohol use disorders also are widely viewed as evidence-based (see Williams et al., this issue). Although as Miller, Sorensen, Selzer and Brigham (2006) note, lists of evidence-based SUD treatments vary across sources, need to be continuously updated and “should not be reified,” the treatments listed above have been designated as “evidence based” consistently in reviews of treatment research (e.g., McGovern & Carroll, 2003), as well as by such bodies as the National Quality Forum (NQF, 2007).

Many of these treatments have proven superior in “efficacy” trials optimized to find a significant treatment effect. For example, participants in such trials often must meet diagnostic criteria only for the specific substance use disorder (e.g., alcohol dependence) that the treatment is intended to address; persons with other drug use disorders (with the usual exception of marijuana dependence) and serious psychiatric disorders, as well as those who might prove difficult to follow because of the absence of a stable residence, often are excluded (Humphreys, Weingardt, Horst, Joshi, & Finney, 2005). Once enrolled in treatment, efforts (e.g., reminder calls) may be made to ensure that participants attend therapy sessions and receive a sufficient dose of treatment. Therapists (sometimes graduate students or individuals with master's degrees or PhDs) are thoroughly trained and monitored (e.g., with audiotapes of therapy sessions and weekly meetings with a therapy supervisor) to ensure treatment fidelity and therapist competence during the trial. Given these research features, there is understandable concern about the generalizability of findings from such trials to routine community care settings (Miller et al., 2006).

Adoption of Consensus EBTs

How widely are the evidence-based treatments provided in SUD programs in the United States? That is a difficult question to answer, but some preliminary data are available. The National Survey of Substance Abuse Treatment Services (N-SSATS; Substance Abuse & Mental Health Services Administration, 2007) was administered between March and October of 2007 (March 30, 2007, was the reference date for information reported). Of 15,188 eligible facilities in the sample, questionnaires were returned by representatives of 14,359; after reviewing responses, 13,648 of the programs were determined to meet eligibility criteria (e.g., were offering SUD treatment or detoxification services). Virtually all respondents (96.3%) stated that at their facility substance abuse counseling was employed “often” and two-thirds indicated a 12-step treatment (not 12-step facilitation) was used “often.” Neither substance abuse counseling nor 12-step treatment has been designated as “evidence-based.” However, among designated EBTs, relapse prevention was said to be used “often” at 90.7% of facilities. The percentages of agencies reporting “often” using other EBTs were 68.5% for cognitive-behavioral therapy, 55.7% for motivational interviewing, and 19.5% for contingency management. Only small percentages of respondents indicated they were unfamiliar with most of the EBTs (e.g., 3.1% for motivational interviewing; 1.8% for cognitive-behavioral therapy), although 19.3% were not familiar with contingency management.

The extent to which EBTs were said to be used in U.S. treatment programs in this survey may strike some readers as surprising. The adoption of EBTs in the U.S. may have been accelerated by the fact that some state-level alcohol and drug treatment authorities are now requiring that the programs they fund use some EBTs (McLellan, Kemp, Brooks, & Carise, 2008; Rieckmann, Kovas, Fussell, & Stettler, 2009). Nevertheless, it seems highly likely that the SAMHSA data are skewed toward overstatement of use of and familiarity with EBTs due to social desirability bias (Eliason, Arndt & Schut, 2005; Hartzler, Baer, Dunn, Rosengren & Wells, 2007).

Although the landscape of SUD treatment in the U.S. may be in the process of some change toward greater use of EBTs, implementing new treatment approaches, especially if existing approaches are well-established, can be difficult (Condon et al., 2008). We turn next to studies of the implementation of EBTs to try to glean more empirically-based knowledge about the necessary and sufficient ingredients of interventions to successfully implement EBTs. Our aims were to (a) determine if implementation interventions (e.g., training workshops) were associated with improved therapist and/or client outcomes, (b) compare the effectiveness of training workshops with enhancements (e.g., additional coaching) versus unenhanced training workshops, (c) explore outcomes of online training in EBTs, (d) consider the outcomes of studies that employed organizationally-focused implementation approaches, as opposed to intervening with individual providers, and (e) describe barriers to implementation that were identified in the studies.

Methods

The current review focuses on studies implementing psychosocial interventions in routine, specialty SUD treatment settings.

Study Eligibility Criteria

To be included in the review, studies must have reported on the implementation process or on specific therapist and/or client outcomes following implementation of a new SUD treatment. Studies were excluded if they were surveys of attitudes toward evidence-based interventions or surveys of the extent of adoption of EBTs, or if they implemented brief interventions for alcohol misuse (see Williams et al., this issue). Studies that evaluated training for health care providers working in settings other than SUD specialty treatment (see Walters, Matson, Baer, & Ziedonis, 2005) or implementation of pharmacotherapies for SUDs (see Gordon et al., this issue) also were not included. Two additional studies were excluded that implemented a specific component of an EBT rather than the complete EBT (e.g., Carise et al., 2009; Weingardt, Villafranca, & Levin, 2006).

We first identified those studies in the reviews by Garner (2009), Miller et al. (2006) and Walters et al. (2005) on the dissemination of EBTs that met the specified eligibility criteria for the current review. Seven of the studies in the current review also were reviewed by Garner (2009). The remaining studies from Garner (2009) were excluded because they focused on provider attitudes toward EBTs, had implemented pharmacotherapy without also implementing a psychosocial intervention, or did not specifically examine the process of implementation or therapist or client outcomes post-implementation. Six studies from the Miller et al. (2006) review were eligible for the current review. Eight of the studies included in the current review also had been reviewed by Walters et al. (2005). Other studies considered by Walters et al. (2005) were excluded primarily because of their focus on a brief intervention or because the implementation interventions took place in a medical setting. Overall, 14 of the 21 studies reviewed here had been included in one or more of the three prior reviews.

The seven other relevant studies were identified through a systematic search using the Medline and PsycINFO databases. Those databases were searched for English language articles published through April 2010, using the following terms: “adopting, adoption, diffusion, dissemination, implementing, implementation, implementation science, implementation research, technology transfer, knowledge transfer, drug rehabilitation, practice to research, and substance use.”

Outcome Assessment

The outcomes assessed varied across studies. Many used instruments to assess providers’ knowledge and self-report of skills before and after a training workshop. One study examined therapist adherence from adolescent clients’ and their adult caregivers’ perspectives (Henggeler, Sheidow, Cunningham, Donohue, & Ford, 2008). Twelve studies included behavioral coding systems that assessed interactions between providers and actual clients or standardized patients. Only four studies examined client outcomes – client treatment adherence or retention (Ball, Martino, Nich, Frankforter, Van Horn, Crits-Christoph, et al., 2007/Martino, Ball, Nich, Frankforter, & Carroll, 2008; Keller & Galanter, 1999), and/or substance use (Ball et al., 2007/Martino, Ball et al., 2008; Keller & Galanter, 1999; Liddle et al., 2006; Morgenstern, Blanchard, Morgan, Labouvie, & Hayaki, 2001).

Results

Overview of Studies Reviewed

Table 1 describes the 21 studies reviewed. Each study included some type of workshop or didactic training that varied in length from one day (Henggeler, Chapman et al., 2008) to one week (Riley, Rieckmann, & McCarty, 2008). The treatment modalities or EBTs that were implemented varied. Ten studies (48%) examined the implementation of Motivational Interviewing (MI) or Motivational Enhancement Therapy (MET). Five studies (24%) tested Contingency Management and two (10%) examined cognitive-behavioral therapy. Combined MET/CBT, Network Therapy, Multidimensional Family Therapy, and an evidence-based manual for co-occurring disorders were each implemented in one study. Nine of the 21 studies employed comparison groups, whereas the others were single-group, pre-post studies. We examined (a) the outcomes of implementation interventions, (b) the effects of enhancements (e.g., additional coaching) to training workshops, (c) studies of online training, (d) organizationally-focused implementation efforts, and (e) barriers to implementation that were identified in the studies.

Table 1.

Studies of the Implementation of Evidence-based Psychosocial Treatments

Andrzejewski, Kirby, Morral, & Iguchi, 2001
Participants: 10 counselors at a community-based methadone clinic.
Implementation: Group training sessions on CM, weekly individual training sessions. Evaluated impact of graphical feedback and a cash prize drawing on providers' use of CM, post-CM training.
Follow-up: Baseline, feedback, and cash drawing phases.
Outcome: After graphical feedback, counselor implementation of CM increased from 42% to 71% (measured by proportion of CM criteria met). Implementation of CM increased from 42% to 81% when the cash prize drawing was offered to counselors.

Baer, Rosengren, Dunn, Wells, Ogle, & Hartzler, 2004
Participants: 22 addiction treatment program providers.
Implementation: 14-hour MI training workshop.
Follow-up: Baseline, post-workshop, and 2-month follow-up. Helpful Responses Questionnaire (HRQ); sessions coded with the MISC.
Outcome: HRQ scores and ratio of reflections to questions higher at post-workshop and follow-up, with some decay from post-workshop to follow-up. Increase at post-workshop in % Open Questions, % MI Adherent, and global ratings of Egalitarian and Spirit.

Baer, Wells, Rosengren, Hartzler, Beadnell, & Dunn, 2009
Participants: 144 providers from 6 substance abuse treatment facilities.
Implementation: 2-day MI workshop (MIW) vs. context-tailored training (CTT). CTT included five 2-3 hour group training sessions at 2-week intervals, a recorded skills-practice interview with a standardized patient between each session, and feedback from the trainer on practice interviews at each training session. An “MI Champion” assisted trainers to sustain interest and MI skills.
Follow-up: Baseline, post-training, and 3-month follow-up. Sessions with standardized clients were coded using the MITI.
Outcome: MIW showed greater pre- to post-training gains than CTT on MI Spirit, but also greater reductions from post-training to follow-up, leaving no group difference at follow-up. Significant improvements in ratio of reflections to questions at post-training with slight decay at follow-up, but no difference between conditions.

Ball, Martino, et al., 2007; Martino, Ball, Nich, Frankforter, & Carroll, 2008
Participants: 35 therapists from five outpatient SUD treatment programs.
Implementation: MET vs. TAU. 16-hour workshop training; audiotaped cases supervised by expert trainers until minimal standards achieved; supervisors rated sessions, gave feedback, and coached therapists. Post-training, monthly consultation calls for supervisors with expert trainers. Therapists received biweekly supervision. TAU therapists discussed patients' goals, encouraged 12-step attendance, and promoted abstinence.
Follow-up: 4-week treatment phase. Client motivation rated in the first/last 5 minutes of sessions. Measured client retention and substance use; urine screens each week. Used Independent Tape Rater Scale for therapist adherence/competence.
Outcome: Higher fundamental/advanced MI adherence/competence and lower MI-inconsistent adherence scores for the MET condition. In the MET condition, therapist competence in fundamental/advanced MI skills was positively related to in-session client change and client abstinence. MI-inconsistent behavior was negatively correlated with treatment retention.

Henggeler, Chapman et al., 2008
Participants: 432 public sector therapists from 44 state treatment agencies in South Carolina.
Implementation: 1-day training workshop promoting CM (CBT skills and monitoring techniques with voucher incentives) use with adolescent substance abusers.
Follow-up: Evaluated monthly (1-6 months post-workshop) through participant interviews. Measured adoption and implementation. Organizational Readiness to Change Scale.
Outcome: 58% of providers reported CM use. Increased adoption for those with more education, experience, favorable attitudes toward behavioral therapies and treatment manuals, work in mental health, and willingness to adopt if mandated by leadership. Higher organizational readiness to change was linked to more training and use of CM with higher fidelity.

Henggeler, Sheidow et al., 2008
Participants: 30 providers using multisystemic therapy to treat adolescents at imminent risk for out-of-home placement; 70 youth and their caregivers.
Implementation: All participated in a 2-day workshop, then randomized to Workshop-only (W-only) or Intensive Quality Assurance (IQA). W-only: CM materials and access to a CM expert. IQA: weekly expert case consultation, provider development plans, and quarterly booster training. Organizational consultation to identify/resolve barriers to implementation, with feedback/positive reinforcement for CM use/adherence. Supervisors provided feedback on sessions using CM checklists, resolved barriers to use, and gave feedback to improve CM use.
Follow-up: Youth and caregivers rated therapists monthly on their adherence to cognitive-behavioral techniques and monitoring techniques while actively in treatment. Data collected on each therapist for 15 months (5-month baseline, 4-month post-workshop, 6-month sustainability phase).
Outcome: Youth reported that therapists in the IQA group significantly increased CM techniques from baseline to month 4; no change for W-only therapists. CM technique level at month 4 was maintained through the 6-month sustainability phase. Caregivers reported that IQA therapists initially increased CM techniques but returned to baseline levels by month 4; W-only therapists had a slight decrease in use of CM techniques but returned to baseline levels by month 4. No significant differences between groups from month 5 on. No change from baseline for either condition on CM monitoring techniques.

Keller & Galanter, 1999
Participants: 5 therapists from community outpatient SUD programs; cocaine-dependent patients.
Implementation: Treatment manual in Network Therapy (NT) plus 10 hour-long didactic sessions, followed by one hour of group case supervision per week while treating clients.
Follow-up: Weekly urine toxicology results and length of stay in treatment in weeks.
Outcome: NT clients had more negative urine screens during treatment compared to TAU clients. NT clients completed 13.9 weeks of treatment compared to 10.7 for TAU clients (nonsignificant difference).

Kellogg et al., 2005
Participants: Seven methadone clinics in the Chemical Dependency Treatment Services of the New York City Health and Hospitals Corporation.
Implementation: Presentations to clinic leadership and staff at each clinic covering the theory, practice, and research findings on CM, as well as client experiences from other CM projects within the CTN. Leadership and staff held meetings to discuss opposition or barriers to implementing CM.
Follow-up: 1-year follow-up; qualitative data including progress reports and interviews with staff, administrators, and patients.
Outcome: At 1 year, CM programs were established at 5 of 7 clinics. A variety of behaviors were targeted (e.g., group attendance, negative urine screens, meeting vocational goals), with different reinforcers (e.g., movie passes, transportation vouchers).

Liddle et al., 2002, and Liddle et al., 2006
Participants: Adolescent day treatment program, based on a model of providing patients with positive reinforcement for effective coping behaviors and social skills. Ten staff members and 104 patients and their caregivers.
Implementation: Multiphase training in Multidimensional Family Therapy (MDFT). Baseline Phase: direct observation, informal interviews, brainstorming meetings, evaluation of patient outcomes and organizational climate; determine areas of change from the clinic's perspective; identify necessary adaptations of MDFT. Training/Exposure Phase: group training for all staff; cotherapy sessions with social workers; meetings with program and medical directors to monitor progress. Implementation Phase: regular supervision and booster trainings, weekly meetings to identify barriers/solutions, and review of “scorecards” as a reminder of tasks to be completed and an accountability instrument for supervision purposes. Durability/Practice Phase: 12 months; supervision withdrawn, extent of regular use of MDFT assessed.
Follow-up: 12-month baseline phase, 6-month training phase, 12-month Implementation Phase. 12-month Durability/Practice Phase assessed therapy contacts, patient substance use, and sessions via the Therapist Behavior Rating Scale.
Outcome: Therapists offered more weekly sessions, had more extrafamilial contacts, and addressed adolescent school problems. Patients had higher rates of abstinence during treatment and fewer out-of-home placements in the Implementation and Durability phases compared to the Baseline phase.

Miller & Mount, 2001
Participants: 22 probation officers and community correction counselors from the Washington County, Oregon probation department.
Implementation: 15-hour workshop training in MI, book, therapist manual, and trainee handbook; six optional 90-minute biweekly follow-up discussions were offered but drew minimal participation.
Follow-up: Pre-training, post-training, and 4-month Helpful Responses Questionnaire (HRQ) follow-up. Sessions rated using the MISC coding system.
Outcome: Trainees reported substantial increases in understanding basic ideas/principles of MI and in feeling proficient to use MI. The HRQ showed large increases in total reflections, MI-consistent responses, reflection-to-question ratio, and % MI-consistent responses, and decreases in MI-inconsistent responses. MISC coding showed increases in reflective listening responses, ratio of reflections to questions, and MI-consistent responses at post-training, declining at follow-up.

Miller, Yahne, Moyers, Martinez, & Pirritano, 2004
Participants: 140 licensed health professionals treating 5+ individual SUD clients per week.
Implementation: Evaluation of five different methods of training in MI: (1) 2-day workshop only; (2) workshop + feedback on taped client sessions; (3) workshop + 6 biweekly individual telephone coaching sessions; (4) workshop + feedback and coaching; (5) self-training (manual and training videos).
Follow-up: Audiotaped work samples coded at baseline, post-training, and 4, 8, and 12 months using the MISC.
Outcome: At 4 months, no significant change in skills for the self-training group; workshop-only had nonsignificant increases in skill; the 3 enhanced training groups showed skill increases. At 4 and 8 months, only the enhanced training groups met proficiency standards for the MI spirit rating and % MI-consistent responses. At 4 months, only workshop + feedback + coaching had significantly better client responses compared to baseline.

Mitcheson, Bhavsar, & McCambridge, 2009
Participants: 30 adolescent treatment practitioners working in London.
Implementation: Examined an immediate vs. a delayed 2-day MI workshop. Training included up to four supervision sessions (2 individual and 2 group), self-directed study materials, and the option of feedback on sessions.
Follow-up: 3 and 6 months post-workshop; sessions coded using the MITI v2.
Outcome: At 3 months, therapist behaviors differed between the immediate and delayed (control) groups on the MI spirit global rating. No significant difference for MI spirit after controlling for baseline MI spirit.

Morgenstern, Morgan, McCrady, Keller, & Carroll, 2001; Morgenstern, Blanchard, Morgan, Labouvie, & Hayaki, 2001
Participants: 29 substance abuse counselors and 252 individuals seeking treatment for substance use at programs in New Jersey.
Implementation: 35 hours of didactic training in CBT Coping Skills Training (CBTCST), followed by supervision of 3-4 cases of twelve-session individual treatments with 1 hour of individual and 1 hour of group supervision per week using videotaped sessions and feedback. Evaluated 2 training conditions compared to TAU: (1) high-standardization CBT (HSCBT), in which supervisors reviewed session tapes weekly and provided feedback on adherence to the manual and skillfulness; and (2) low-standardization CBT (LSCBT), in which therapists were instructed to use clinical judgment in selecting/sequencing CBT interventions, were allowed to incorporate other techniques, and received group supervision to assist with integrating CBT into practice.
Follow-up: Baseline, end of treatment, and 6 months post-treatment. Sessions were rated with the MTRS and ITRS coding systems.
Outcome: 90% of trained counselors were rated as adequate in their adherence to protocol and skillfulness in performing CBTCST by supervisors and trained raters. Counselors reported high levels of satisfaction with training, clinical utility, and confidence using the model. HSCBT had higher scores on therapist CBT ratings than LSCBT, which was in turn higher than TAU. Mean scores of adherence and skillfulness were above the threshold for competence for HSCBT therapists. Treatment condition was not significantly related to % days abstinent or negative consequences at post-treatment or 6-month follow-up. The strongest predictors of outcomes were attendance and therapeutic alliance.

Moyers et al., 2008
Participants: 129 substance abuse providers from 54 different Air Force bases.
Implementation: Assessed MI workshop training vs. MI workshop plus training enrichments (including feedback on a post-training work sample and up to six consultation phone calls) vs. self-directed MI training (received book and training videotapes).
Follow-up: Sessions coded post-training and at 4, 8, and 12 months via the MITI.
Outcome: Improvements in all measures of MI skill from pre- to post-training work sample. Decline in skills from post-training to 4 months. No significant differences in outcomes between the workshop-only and workshop plus training enrichments groups.

Peters et al., 2005
Participants: 71 counselors in 10 SUD programs from three publicly funded substance abuse treatment agencies and one jail-based treatment program.
Implementation: Experimental condition (evidence-based co-occurring disorders treatment manual) vs. comparison condition (counselors received written training materials only). (Implementation details provided in text.)
Follow-up: One and 6 months post-implementation phase. Questioned counselors regarding past, current, and intended future use of the manual. Assessed attitudes toward, practices related to, and knowledge related to co-occurring disorders.
Outcome: A significantly higher percentage of counselors in the Opinion Leader condition reported use of the manual in the past 6 months, current use, and planned future use compared to the workshop-only group. Knowledge of co-occurring disorders increased from baseline to post-implementation but did not differ across groups. Measures of attitudes related to treatment of co-occurring disorders did not differ between groups.

Riley, Rieckmann, & McCarty, 2008
Participants: 9 agencies that received implementation grants from the Center for Substance Abuse Treatment Effective Adolescent Treatment initiative.
Implementation: MET/CBT for adolescents. Practitioners completed a 1-week training program followed by submission of treatment tapes for certification of proficiency. Monthly conference calls to discuss issues related to compliance and hear expert presentations. Supervisors received training, submitted tapes to become certified for supervisory proficiency, and provided periodic supervision of counselors.
Follow-up: Interviews conducted during site visits and quarterly conference calls. Site visits occurred during the first 6 months of funding and at the beginning of the second year of funding.
Outcome: The MET/CBT protocol was fully implemented in the 9 sites during the early period of the grant. The MET portion was readily adopted, but adaptations were made to the CBT portion by 8 of 9 sites to suit individual, community, and client circumstances. Monitoring of counselor adherence through supervision decreased over time, which may have led to decreased fidelity. Conflict was apparent between beliefs about treatment effectiveness (e.g., longer is better) and MET/CBT-5 (5 sessions), despite evidence that it is as effective as longer versions and more cost-effective.

Rubel, Sobell, & Miller, 2000
Participants: 44 participants.
Implementation: 2-day MI workshop.
Follow-up: Pre- and post-training surveys.
Outcome: Higher MI knowledge scores following the workshop. Increase in the proportion of motivational statements in response to self-report clinical vignettes following the workshop.

Schoener, Madeja, Henderson, Ondersma, & Janisse, 2006
Participants: 10 therapists and 28 clients from two mental health clinics.
Implementation: 2-day MI workshop adapted to treatment of clients with co-occurring disorders; supervision by study staff every other week for 16 weeks.
Follow-up: All individual sessions were audiotaped and randomly selected for MISC coding.
Outcome: Significant changes in MISC scores on six of seven MI fidelity measures following training, including Client Change Talk. Global ratings of MI Spirit and Empathy improved at post-training but still fell below standards for proficiency.

Shafer, Rhode, & Chong, 2004
Participants: SUD treatment providers throughout Arizona participated in five workshops. The number of providers at each workshop ranged from 92 to 351, with 145 participating in all five workshops.
Implementation: Training in MI consisted of five 3-hour video workshops broadcast monthly over 5 months. Workshops included lecture, demonstrations, small-group activities completed at the local reception site, and homework assignments. 23 participants were identified as intensive evaluators and asked to submit audiotaped counseling sessions.
Follow-up: Pre/post surveys, including the Helpful Responses Questionnaire (HRQ) and tests of MI and SUD knowledge. Intensive Evaluators were asked to submit 3 sessions (prior to the first training, 2 weeks after the last training, and 4 months after the last training); 9 of 23 complied. Sessions coded with the MISC.
Outcome: Participants scored higher on MI knowledge, rated their own skill higher, and improved reflective listening skills (HRQ) post-training. Tapes for the 9 Intensive Evaluators showed they were using MI-consistent behaviors and complex reflections prior to training. No significant improvement in MI skills at 4 months. No relationship between participants' self-ratings of MI proficiency and MI tape ratings.

Sholomskas et al., 2005
Participants: 78 providers employed full-time treating a substance-using population.
Implementation: CBT interventions: (1) manual only (M-only; 20 hours requested for reading/practicing); (2) manual + web-based training (M+WB; interactive training with knowledge tests and virtual role plays; 20 hours on the website requested); (3) manual + training seminar + supervision (M+TS; 3-day didactic training; audiotaped sessions for 3 months rated on adherence/skill; up to three 1-hour telephone supervision sessions provided).
Follow-up: Assessments completed at baseline, 4 weeks, and 3 months. Tapes coded with the YACS.
Outcome: At the post-training assessment (4 weeks), for both adherence and skill, M+TS had significantly greater gains in CBT skills than M-only. Differences between M+WB and M-only were not significant. All groups had significant increases in knowledge scores. The manual-only group perceived more barriers to using CBT in their clinical setting.

Squires, Gumbley, & Storti, 2008
Participants: 54 community-based substance abuse treatment agencies in New England.
Implementation: CM via the Science to Service Laboratory (SSL): 1-day “exposure” meeting to determine if the SSL process was relevant to agencies. Agency directors were asked to designate a 3-member implementation team and sign a statement committing necessary staff and resources for the duration of the project (9 months). A Technology Transfer Specialist (TTS) was assigned to each agency for support. Regional 1-day staff trainings on CM and development of an implementation plan. Identification of an internal “champion” to work with the external TTS during implementation. Organizational work groups to promote cross-agency communication and problem solving.
Follow-up: Primary outcome was completion of the SSL intervention and implementation of CM.
Outcome: Of 28 agencies, 26 completed the training program and implemented CM into routine practice. Non-adopter agencies were more likely to experience turnover in an organizational management position key to the project, critical staff (internal champion), and TTS, and were far more likely to have never signed a letter of commitment. Directors ranked “relative advantage over existing practice” as the most important aspect of a potential new EBP, a “menu of options” as the most useful tool for selecting a new EBP to implement, and “patient outcome data” as the number one choice for determining the success of implementation.

Notes. CBT, cognitive behavioral therapy; CM, contingency management; CTN, Clinical Trials Network; ITRS, improving addiction counseling through technology transfer tape rating scale (Morgenstern, Morgan, Keller, & Carroll, 2001); MET, Motivational Enhancement Therapy; MI, motivational interviewing; MISC, Motivational Interviewing Skills Code (Moyers, Martin, Catley, Harris & Ahluwalia, 2003); MITI, Motivational Interviewing Treatment Integrity Coding System (Moyers, Martin, Manuel, Hendrickson, & Miller, 2005); MTRS, MATCH videotape rating scale (Carroll et al., 1998); SUD, substance use disorder; TAU, treatment-as-usual; YACS, Yale Adherence Competence Scale (Carroll et al., 2000).

Outcomes of Implementation Interventions

In the implementation studies that employed provider self-reports, providers reported increases in knowledge and understanding of the new intervention (Peters et al., 2005; Rubel, Sobell & Miller, 2000; Shafer, Rhode & Chong, 2004; Sholomskas et al., 2005), gains in skill (Baer et al., 2004; Miller & Mount, 2001; Rubel et al., 2000; Shafer et al., 2004), and a willingness to adopt, or actual adoption of, the new intervention (Andrzejewski, Kirby, Morral & Iguchi, 2001; Henggeler, Chapman et al., 2008; Kellogg et al., 2005; Squires, Gumbley, & Storti, 2008). However, two studies (Miller & Mount, 2001; Shafer et al., 2004) observed no relationship between provider self-report of practice and objective ratings of provider practice, calling into question the validity of trainee self-reports.

Among the 12 studies using behavioral coding systems to assess interactions between providers and actual clients or standardized patients, some found that provider training in EBTs led to an immediate increase in provider skills, but that these skills decayed by the follow-up periods (Baer et al., 2004; Baer et al., 2009; Miller & Mount, 2001; Miller et al., 2004; Moyers et al., 2008). However, Sholomskas et al. (2005) found that providers trained in CBT increased their adherence and skills from the post-training period to follow-up. Nonetheless, coding data suggest that skill gains are not uniform across providers, nor are they necessarily maintained over time.

Only four studies examined clients’ outcomes. Clients who participated in Network Therapy that had been implemented in a community-based SUD program had more negative urine screens during treatment and completed more sessions of therapy as compared to treatment-as-usual clients (Keller & Galanter, 1999). Ball et al. (2007) found that newly implemented MI was superior to counseling as usual in reducing substance use, with advanced MI skill being positively related to increased client motivation and a higher percentage of negative drug screens (Martino et al., 2008). Liddle et al. (2006) found a higher rate of abstinence among adolescents following implementation of Multidimensional Family Therapy (MFT) in comparison to the abstinence rate prior to the implementation of MFT. However, clients in a CBT condition implemented in a community program by Morgenstern, Blanchard et al. (2001) had similar substance use outcomes when compared to those in the treatment-as-usual condition.

Enhancements to Training Workshops

Fifteen (71%) of the studies offered participants some type of training enrichment; however, only four (Baer et al., 2009; Henggeler, Sheidow et al., 2008; Miller et al., 2004; Moyers et al., 2008) compared training workshops with and without enhancements. Miller et al. (2004) found that workshop training plus enhancements resulted in significantly higher MI skills when compared to workshop alone, and Henggeler, Sheidow, et al. (2008) found training enhancements resulted in greater use of CM techniques. In contrast, Baer et al. (2009) and Moyers et al. (2008) found no benefits of training enhancements.

Online Training

Although most workshops were offered in-person, more recent studies (Shafer et al., 2004; Sholomskas et al., 2005) have begun to examine the efficacy of online and distance training methods as a way to reduce training barriers. The studies indicate that these methods may have some utility in training providers in EBTs. Shafer et al. (2004) found improved MI knowledge and skills after training (although these gains were not sustained at a 4-month follow-up). In a comparative study, Sholomskas et al. (2005) found that providers receiving a manual plus web-based training had slightly, but not significantly, higher adherence and CBT skill ratings than providers who received only the CBT manual. However, providers in the manual plus web-based training condition scored slightly lower on adherence and skills than providers who received the CBT manual, a 3-day training workshop, and supervision.

Provider versus Organizational Implementation Focus

Thirteen studies focused on training individual providers who were recruited within or across SUD treatment organizations. In two additional studies (Andrzejewski et al., 2001; Keller & Galanter, 1999), all providers within an agency were trained, but the authors did not describe specific agency-level efforts to implement the EBT. In contrast to these provider-focused studies, six studies (Henggeler, Sheidow et al., 2008; Kellogg et al., 2005; Liddle et al., 2002, 2006; Peters et al., 2005; Riley et al., 2008; Squires et al., 2008) addressed implementation efforts at the organization or systems level, and included supervisors, program directors, or staff representation in the training and implementation efforts.

We describe an example of an organizationally-focused implementation (Peters et al., 2005) in more detail to illustrate the complexity and comprehensiveness of such implementation efforts. Implementation began with a community needs assessment to identify the target of the implementation effort – patients with co-occurring substance use and psychiatric disorders. Next, opinion leaders, who were paid $500 for their involvement, were selected from each of 10 treatment programs, based on their peers’ perceptions of their expertise and knowledge.

Training was offered in three phases. First, counselors in both the implementation and control conditions received four hours of training in general assessment and treatment of co-occurring disorders. In the second phase, opinion leaders in the implementation condition participated in a two-day workshop that addressed use of a co-occurring disorders treatment manual and ways to engage counselors in the implementation process. In the third phase, opinion leaders provided a 90-minute training workshop to the counselors in their own treatment programs, followed by weekly meetings in which ongoing consultation was provided to counselors.

The implementation intervention was more effective than the control condition in which providers received general training and written treatment materials without any formal training in the empirically-supported treatment of co-occurring disorders. All of the counselors (100%) in the experimental group reported using the manual during the three-month implementation period versus only 19% of the counselors in the comparison group. When counselors were assessed post-implementation, 48% of those in the implementation condition reported current use of the manual versus 19% in the control condition. Significantly more counselors in the implementation condition (95%) than in the comparison condition (52%) reported plans to use the manual in the future. Features of the implementation interventions in the other five organizationally-focused studies (Henggeler, Sheidow et al., 2008; Kellogg et al., 2005; Liddle et al., 2002; 2006; Riley et al., 2008; Squires et al., 2008) are provided in Table 1 – e.g., adapting the EBT or other intervention to suit the organization’s needs and setting (Liddle et al., 2002; 2006) and use of “technology transfer specialists” (Squires et al., 2008).

Overall, the six studies employing agency- or system-focused implementation approaches had successful outcomes. Most agencies reported adoption of the new treatment as a result of the training or implementation effort (Kellogg et al., 2005; Peters et al., 2005; Riley et al., 2008; Squires et al., 2008). Providers were more likely to adopt an EBT if they were supported or mandated to do so by agency leaders or supervisors (Henggeler, Sheidow, et al., 2008; Peters et al., 2005; Squires et al., 2008). Nonetheless, Riley et al. (2008) reported that ongoing supervision, and therefore possibly treatment fidelity, decreased over time, indicating potential difficulties sustaining adoption within an agency. Only one of the organizationally-focused implementation studies measured client outcomes. As noted earlier, Liddle et al. (2006) found that when Multidimensional Family Therapy was implemented, clients had increased rates of abstinence.

Barriers to Implementation

Barriers to the adoption and implementation of evidence-based treatments were identified in several of the studies reviewed. Some indicated that providers’ and agencies’ orientations and approaches were in conflict with the evidence-based treatment (Baer et al., 2009; Keller & Galanter, 1999; Miller et al., 2004; Riley et al., 2008). Other organizational barriers noted included a lower level of organizational readiness to change (Baer et al., 2009) and high staff turnover (Riley et al., 2008). Poor therapist compliance with learning tasks, possibly due to lack of provider motivation to learn a new treatment, was pointed to as a significant barrier by Andrzejewski et al. (2001), Mitcheson et al. (2009), and Moyers et al. (2008). Other provider factors raised as barriers to implementing EBTs were time demands and heavy case loads (Henggeler, Chapman et al., 2008).

Discussion

This review considered 21 studies that examined the implementation of evidence-based psychosocial SUD interventions into front-line clinical settings. Motivational interviewing or motivational enhancement therapy and contingency management were the most prevalently implemented EBTs, comprising 72% of the psychosocial interventions. The workshop training format was used in all studies, although the length of the workshops varied from one day to one week. The studies tended to use pre- and post-training assessments of provider knowledge and self-reported skills, and/or behavioral coding systems to capture changes in therapist factors. Most showed improved knowledge and skills after training workshops. However, the extent to which improvement was sustained was less frequently studied and not always supported (e.g., Shafer et al., 2004). Thus, training should be viewed as an ongoing process, with continuous opportunity for discussion and learning, rather than a time-limited activity.

Of the four studies comparing workshop-only training with workshops plus such training enhancements as feedback, coaching, and ongoing supervision, two found enhancements associated with more substantial training gains (Henggeler, Sheidow et al., 2008; Miller et al., 2004), but the other two (Baer et al., 2009; Moyers et al., 2008) did not. Thus, in contrast to Miller et al.’s (2006) conclusion that workshop enhancements resulted in increased acquisition of provider skills, more recent evidence provides a mixed picture of the effectiveness of training enhancements. However, the number of studies directly comparing enhancements with a workshop-only condition is very small, precluding firm conclusions.

One contributing factor accounting for variability across studies in training gains, including those comparing training workshops versus workshops with enhancements, may be variation in the level of providers’ pre-training general clinical and EBT-related skills. Despite offering similar workshops and training enhancements, and despite a similar lack of experience with MI, providers in the Miller et al. (2004) study were more highly motivated and clinically skilled prior to training in MI than providers in the Moyers et al. (2008) study. Providers in the Miller et al. (2004) study who received workshop training and enhancements had increased gains in clinical skills, unlike the less motivated and skilled providers in the Moyers et al. (2008) study. In contrast, Baer et al. (2009) reported that providers who demonstrated higher baseline MI skills in their MI workshop study had less improvement in MI skill post-training, which they attributed to a possible ceiling effect. It is difficult to synthesize these between- and within-study findings. However, it may be that a minimal level of motivation and skill is needed for the successful adoption of motivational interviewing (and other EBTs). On the other hand, very high pre-existing levels of motivation and EBT-related skill may leave little room for improvement with training.

While many of the providers in the studies volunteered to receive training in the EBT, ostensibly out of interest in learning more about the treatment, they may have found the EBT incompatible with their background and theoretical orientation (Baer et al., 2009; Keller & Galanter, 1999; Miller et al., 2004; Riley et al., 2008). Fortunately, there is a variety of EBTs for providers to choose from and some may be more congruent with providers’ preferences and perceived needs within the agency. Hence, training may need to be adapted to providers’ pre-existing clinical skill, theoretical orientation, motivation levels and EBT preferences. Organizational factors also may contribute to variability in provider gains across studies.

Many of the studies we reviewed recruited a subset of providers from a substance use treatment agency. Often, these selected providers were the only individuals trained from their work setting, resulting in a number of impediments to implementation and sustainability. For example, Miller et al. (2004) reported that MI, in which providers were trained, conflicted with the confrontational, authoritarian orientation in the practice setting from which providers had been recruited. However, in that study, MI training with coaching and/or feedback was sufficiently potent to improve and sustain the ratio of MI consistent to MI inconsistent provider behaviors. In other cases, agency administrative personnel may not even have been aware of, or supportive of, the individual providers’ efforts. Providers may have returned from their training highly motivated to integrate their new skills into their clinical setting. However, they then may have encountered barriers that they did not anticipate. Without leadership and peer support and encouragement, providers may have been unable to overcome obstacles to using the new treatment approach. In the absence of more information on the organizational context to which providers returned, it is not possible to know to what extent organizational factors may have affected training outcomes.

Organizationally-focused Implementation Interventions

Agency administrative support and ongoing supervision may not be a panacea for provider conflicts and time demands when attempting to implement new treatments, but they can facilitate discourse between agency leaders and staff regarding providers’ apprehensions or conflicts about learning a new treatment. Such support may be necessary to overcome barriers to EBT implementation. Six studies that we reviewed focused on the agency as the locus of change (Henggeler, Sheidow et al., 2008; Kellogg et al., 2005; Liddle et al., 2002, 2006; Peters et al., 2005; Riley et al., 2008; Squires et al., 2008) rather than individual providers. In two of these studies (Liddle et al., 2002, 2006; Peters et al., 2005), implementation was an extended process that began with an assessment of organizational needs and climate, provider and client factors, and, in the Liddle et al. (2002) study, some local adaptation of the empirically-supported intervention. In addition, staff training was ongoing in most of these six organizationally-focused studies, relying heavily on feedback and supervision. Although these organizationally-focused studies reported positive outcomes, including one (Liddle et al., 2006) with better client outcomes, there is a need for studies directly comparing the effectiveness of organizationally-focused versus individual provider-focused implementation.

Conceptually-driven Implementation Interventions

The variability in findings across studies, and our inability to explain much of that variation, suggest that both the process of implementing SUD EBTs and studies of their implementation need to be more firmly grounded in theories or conceptual models of implementation. Consideration of contextual or organizational factors, provider characteristics, the nature of the intervention and the process of implementation, as emphasized in such models, may identify “moderators” of training effects and different combinations of factors (e.g., training enhancements versus no enhancements) that are necessary and sufficient to effect sustained implementation. Although several of the reviewed studies made mention of conceptual models of implementation (particularly those of Simpson et al., 2002, and Rogers, 2003), some of the organizationally-focused implementations (e.g., Liddle et al., 2002, 2006; Peters et al., 2005; Squires et al., 2008) were rare examples of a conceptual model actually being used to inform an implementation strategy.

The importance of organizational barriers and facilitators is emphasized in many conceptual models of implementation (see the Consolidated Framework for Implementation Research [CFIR], Damschroder et al., 2009; Damschroder & Hagedorn, this issue). The CFIR breaks down organizational factors into those in the “outer” and “inner” settings. A myriad of factors comprises the inner setting – e.g., organization of services, organizational culture, implementation climate, resources, and leadership. Based on the studies reviewed here, successful implementation appears more likely when the implementation effort addresses at least some of these inner setting factors: assessing organizational needs, discussing with staff and administrators how an EBT fits into the organization, and considering potential ways in which the EBT may need to be adapted to best suit the organizational climate (Henggeler, Sheidow et al., 2008; Kellogg et al., 2005; Riley et al., 2008). Although it can be expensive and time-consuming to evaluate the organizational climate, request feedback from staff regarding potential areas for change, and identify “champions” or advocates for change within the agency (a “process” factor in the CFIR), these components appear to be associated with successful adoption and implementation of EBTs in the research reviewed here.

Factors in the “outer setting” likewise are important to consider in mounting, conducting, and sustaining implementation. For example, Peters et al. (2005) and Liddle et al. (2002, 2006) conducted needs assessments to identify areas requiring improvement as perceived by the clinical community. Likewise, sustaining evidence-based changes in treatment may ultimately depend on the concordance of such changes with the policies of higher-level entities, such as health care systems or state authorities, within which the treatment organization is embedded. Recognizing any discordance between implementation plans and outer setting policies and addressing it, if possible, prior to launching an implementation effort can preempt a great deal of frustration.

Questioning Assumptions

Our review, thus far, has been consistent with two prevalent assumptions regarding the implementation of evidence-based SUD treatments: (1) EBTs have been shown to be better than treatment as usual (TAU) in community treatment programs, and (2) what should be implemented are EBTs (i.e., complete treatment modalities or approaches) guided by a (perhaps adapted) treatment manual originally developed for the efficacy or effectiveness research in which the EBT was shown to yield better outcomes than some comparison condition. Not surprisingly, these assumptions have been questioned, not only in the SUD treatment field, but also in the broader psychotherapy field (e.g., Westen, Novotny, & Thompson-Brenner, 2004).

Treatment as usual

Some proponents of EBT implementation recognize that there is not overwhelming and compelling evidence that the adoption of EBTs will lead to better client outcomes than treatment as usual (e.g., Miller et al., 2006). Providers’ and policy-makers’ positive perceptions of evidence strength and quality supporting a new intervention and the intervention’s perceived relative advantage over current practices are intervention characteristics purported to facilitate implementation success according to several conceptual models of implementation (Damschroder et al., 2009). Part of the problem in the SUD field is that the comparison condition in many treatment trials is some other active treatment, not treatment as usual (Swearingen, Moyer, & Finney, 2003). One study reviewed here (Morgenstern, Blanchard et al., 2001) found no better client outcomes after an EBT (cognitive-behavioral treatment) was implemented in some sites and compared with TAU at other sites, but another (Ball et al., 2007) found MET superior to counseling as usual.

The providers delivering EBTs can have a major impact on their effectiveness. Although we do not have a complete picture of how providers delivering TAU in routine care may differ from therapists in treatment trials, certainly some counselors in SUD treatment programs in the U.S. have lower levels of education than clinical trial therapists, and high staff turnover is common in community programs (Kerwin, Walker-Smith, & Kirby, 2006; McLellan, Carise, & Kleber, 2003). We do not know enough about the extent to which such providers may be able to effectively learn and apply EBTs that typically have been applied by master’s level therapists or graduate students in efficacy trials, or about what the outcomes would be of EBTs offered by community providers in their routine settings. The NIDA Clinical Trials Network (CTN) is addressing this question, but some differences are still evident between counselors participating in the CTN and counselors at other public and private treatment programs (Knudsen, Ducharme, & Roman, 2007).

Treatment modalities and treatment manuals

Psychosocial interventions tend to be complicated and dynamic. Providers have reacted against the notion that treatment can be delivered over a standardized period (e.g., the 12 weeks common in treatment efficacy trials) while rigidly following a treatment manual, regardless of a client’s current state or presenting problems during a particular session (Westen et al., 2004). However, some treatment manuals provide more flexibility, such as the manual for the Combined Behavioral Intervention from Project COMBINE (NIAAA, 2004). More fundamentally, several observers (e.g., Bühringer, Wittchen, Gottlebe, Kufeld, & Goschke, 2008; Moos, 2003, 2007a, 2007b; Read, Kahler, & Stevenson, 2001) have suggested that it may be more productive to implement evidence-based practices or processes, instead of evidence-based treatment packages. For example, Moos (2007a) argues: “Rather than focusing so heavily on understanding specific types or orientations of treatment, such as CBT or TSF [Twelve-Step Facilitation], training should emphasize common treatment processes, such as promoting support, goal direction, and structure in treatment and clients’ life contexts, enhancing clients’ involvement in new rewarding activities, and building their self-efficacy and coping skills” (p. 119).

To be consistent with prior acronyms and avoid confusion, we will term this the “evidence-based practice” (EBP) approach, with the term “practice” encompassing both the therapeutic process used and the goal (e.g., enhancing coping skills) to which it is directed. Focusing on EBPs (e.g., increasing coping skills) might seem to run counter to some reviews (e.g., Apodaca & Longabaugh, 2009; Morgenstern & Longabaugh, 2000; Morgenstern & McKay, 2007) that have reported little support for processes or mediating variables by which such EBTs as cognitive-behavioral treatment (Morgenstern & Longabaugh, 2000) and motivational interventions (Apodaca & Longabaugh, 2009) were thought to exert their effects. However, the linkages in mediational chains that had more support in those reviews were those between some putative mediators (e.g., coping skills) and alcohol and other drug use outcomes (see also, Finney, Moos, & Humphreys, 1999). Although different techniques in efficacy trials and effectiveness studies may have been equally effective in, say, imparting new coping skills, some of those techniques nevertheless may augment the existing practices of community providers to improve their clients’ coping skills.

Not only would a focus on EBPs lead naturally to more individualized treatment approaches, it also might facilitate more widespread implementation and sustained adoption of evidence-based SUD care. The treatment processes listed by Moos (2007a) – promoting support, alternative goals, new activities, self-efficacy and coping skills – are common to many treatment approaches (which may explain why TAU has sometimes been found to be as effective as an EBT [e.g., Morgenstern, Blanchard et al., 2001]). For example, one can conceptualize 12-step treatment as involving, in part, learning coping skills – e.g., talking with a sponsor to obtain social support when experiencing strong craving to drink or use (Gifford, Ritsher, McKellar, & Moos, 2006). Some providers in community programs and other SUD care settings may be much more willing to adopt and integrate new, evidence-based practices to augment those they currently use than they are to completely jettison the techniques they have used over an extended period and adopt a new treatment approach in wholesale fashion (Gifford & Humphreys, 2007; Weingardt & Gifford, 2007). Focusing on EBPs as part of an implementation approach scores high on “adaptability” and “trialability,” characteristics that conceptual models of implementation have emphasized as being associated with more successful sustained adoption (e.g., Rogers, 2003). In addition, it may be easier to train providers in specific, focused EBPs than in full EBTs, and to evaluate their fidelity and competence in applying EBPs.

For at least some providers in some organizational contexts, the EBP approach, which treats providers as collaborators with existing expertise, is likely to be more effective in improving the quality of routine SUD care than a purely “top-down” EBT approach (Weingardt & Gifford, 2007). SUD treatment is different from most areas of medical care. Although a surgeon may have some preference for employing a technique with which he or she is familiar, an SUD treatment provider may be ideologically committed to a particular treatment approach. Similarly, although a surgeon is unlikely to have personally received and experienced a positive outcome from a particular surgical technique he or she uses with patients, 40%-55% of SUD treatment counselors are in recovery (Knudsen et al., 2007), in many cases after receiving treatment with an approach that they are now employing with their clients. Thus, beginning with SUD treatment providers “where they are” may be particularly important in effectively implementing evidence-based innovations in SUD care.

Future Directions

The studies reviewed, along with the literature on implementation science (e.g., Damschroder & Hagedorn, this issue), suggest that the probability of successful implementation is increased when the entire agency, rather than individual providers, is the focal point. Thus, future research should focus on implementation interventions that engage the entire agency and contrast their outcomes with those of an individual-provider approach. As noted earlier, conceptual models of implementation point to important factors in organizational “inner” and “outer” settings (Damschroder & Hagedorn, this issue) that should be addressed to facilitate implementation and overcome barriers. This process may be most successful when it begins with an assessment of the needs of the agency and its clients, a discussion of how a new intervention may best fit into the agency, and, if necessary, a strategic adaptation of the EBT. Training in the intervention should include ongoing supervision, coaching, feedback from taped sessions, and opportunities for discussion of implementation barriers.

Future research also should consider the role of adaptation within implementation efforts. Adaptations of EBTs for use within particular settings may be a necessary component of implementing new treatments (Henggeler, Sheidow et al., 2008; Kellogg et al., 2005; Riley et al., 2008) and may facilitate increased adoption and provider motivation to use the new treatment. Nonetheless, adaptations require a delicate balance between maintaining the integrity of the EBT and meeting the needs of providers and the overall organization. Adaptations of EBTs may be particularly necessary when working with special populations (e.g., minority ethnic groups), thereby allowing for more individualized approaches to meeting their specific needs.

Objective evaluation is an important component of assessing the outcomes of newly implemented treatment interventions. Accordingly, researchers should incorporate methods that extend beyond therapists’ self-report assessments. Hartzler et al. (2007) found that providers overestimated their MI skill prior to training in MI and underestimated their MI skill following training. The ideal way to monitor therapist competency is to randomly select a sample from a larger pool of taped sessions (Miller et al., 2006). This approach raises the issue of provider compliance with session taping, which has been relatively poor in past studies. If providers were approached as having existing expertise, if they had the support of their agency, and if they then were offered ongoing supervision and feedback from their taped sessions in a collaborative format, they might be more likely to become invested in the implementation process and its evaluation. The ultimate test, however, is whether newly implemented treatment practices, which have been shown to improve client outcomes in efficacy and effectiveness trials, are associated with improved client outcomes when delivered in community settings.

Acknowledgments

The authors would like to thank Rudolf Moos and Elizabeth Gifford for their insightful comments on earlier versions of this manuscript. Preparation of this manuscript was supported by the U.S. Department of Veterans Affairs Substance Use Disorder Quality Enhancement Research Initiative and Health Services Research and Development Service, and by National Institute on Alcohol Abuse and Alcoholism Grant No. AA-008689. The views expressed are those of the authors and do not necessarily represent those of the Department of Veterans Affairs or any other U.S. government entity.

Footnotes

Publisher's Disclaimer: The following manuscript is the final accepted manuscript. It has not been subjected to the final copyediting, fact-checking, and proofreading required for formal publication. It is not the definitive, publisher-authenticated version. The American Psychological Association and its Council of Editors disclaim any responsibility or liabilities for errors or omissions of this manuscript version, any version derived from this manuscript by NIH, or other third parties. The published version is available at www.apa.org/pubs/journals/adb

Contributor Information

Jennifer K. Manuel, HSR&D Center for Health Care Evaluation, VA Palo Alto Health Care System

Hildi J. Hagedorn, VA Substance Use Disorders Quality Enhancement Research Initiative, Minneapolis VA Health Care System

John W. Finney, HSR&D Center for Health Care Evaluation, VA Palo Alto Health Care System

References

  1. Andrzejewski M, Kirby KC, Morral AR, Iguchi MY. Technology transfer through performance management: The effects of graphical feedback and positive reinforcement on drug treatment counselors’ behavior. Drug and Alcohol Dependence. 2001;63:179–186. doi: 10.1016/s0376-8716(00)00207-6. [DOI] [PubMed] [Google Scholar]
  2. Apodaca TR, Longabaugh R. Mechanisms of change in motivational interviewing: A review and preliminary evaluation of the evidence. Addiction. 2009;104(5):705–715. doi: 10.1111/j.1360-0443.2009.02527.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Baer JS, Rosengren DB, Dunn CW, Wells EA, Ogle RL, Hartzler B. An evaluation of workshop training in motivational interviewing for addiction and mental health clinicians. Drug and Alcohol Dependence. 2004;73:99–106. doi: 10.1016/j.drugalcdep.2003.10.001. [DOI] [PubMed] [Google Scholar]
  4. Baer JS, Wells EA, Rosengren DB, Hartzler B, Beadnell B, Dunn CW. Agency context and tailored training in technology transfer: A pilot evaluation of motivational interviewing training for community counselors. Journal of Substance Abuse Treatment. 2009;37:191–202. doi: 10.1016/j.jsat.2009.01.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Ball SA, Martino S, Nich C, Frankforter TL, Van Horn D, Crits-Christoph P, Woody GE, Obert JL, Farentinos C, Carroll KM. Site matters: Motivational enhancement therapy in community drug abuse clinics. Journal of Consulting and Clinical Psychology. 2007;75:556–567. doi: 10.1037/0022-006X.75.4.556. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Bühringer G, Wittchen H-U, Gottlebe K, Kufeld C, Goschke T. Why people change? The role of cognitive-control processes in the onset and cessation of substance abuse disorders. International Journal of Methods in Psychiatric Research. 2008;17(S1):S4–S15. doi: 10.1002/mpr.246. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Carise D, Brooks A, Alterman A, McLellan AT, Hoover V, Forman R. Implementing evidence-based practices in community treatment programs: Initial feasibility of a counselor “toolkit”. Substance Abuse. 2009;30:239–243. doi: 10.1080/08897070903041194. [DOI] [PubMed] [Google Scholar]
  8. Carroll KM, Connors GJ, Cooney NL, DiClemente CC, Donovan DM, Kadden RR, Longabaugh RL, Rounsaville BJ, Wirtz PW, Zweben A. Internal validity of Project MATCH treatments: Discriminability and integrity. Journal of Consulting and Clinical Psychology. 1998;66:290–303. doi: 10.1037//0022-006x.66.2.290. [DOI] [PubMed] [Google Scholar]
  9. Carroll KM, Nich C, Sifry R, Frankforter T, Nuro KF, Ball SA, et al. A general system for evaluating therapist adherence and competence in psychotherapy research in the addictions. Drug and Alcohol Dependence. 2000;57:225–238. doi: 10.1016/s0376-8716(99)00049-6. [DOI] [PubMed] [Google Scholar]
  10. Condon TP, Miner LL, Balmer CW, Pintello D. Blending addiction research and practice: strategies for technology transfer. Journal of Substance Abuse Treatment. 2008;35(2):156–160. doi: 10.1016/j.jsat.2007.09.004. [DOI] [PubMed] [Google Scholar]
  11. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009;4:50. doi: 10.1186/1748-5908-4-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Damschroder LJ, Hagedorn HJ – this issue.
  13. Eliason MJ, Arndt S, Schut A. Substance abuse counseling: What is treatment as usual? Journal of Addictive Diseases. 2005;24(Suppl 1):33–51. [Google Scholar]
  14. Finney JW, Moos RH, Humphreys K. A comparative evaluation of substance abuse treatment: II. Linking proximal outcomes of 12-step and cognitive-behavioral treatment to substance use outcomes. Alcoholism: Clinical and Experimental Research. 1999;23(3):537–544. doi: 10.1111/j.1530-0277.1999.tb04150.x. [DOI] [PubMed] [Google Scholar]
  15. Garner BR. Research on the diffusion of evidence-based treatments within substance abuse treatment: a systematic review. Journal of Substance Abuse Treatment. 2009;36:376–399. doi: 10.1016/j.jsat.2008.08.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Gifford EV, Humphreys K. The psychological science of addiction. Addiction. 2007;102(3):352–361. doi: 10.1111/j.1360-0443.2006.01706.x. [DOI] [PubMed] [Google Scholar]
  17. Gifford EV, Ritsher JB, McKellar JD, Moos RH. Acceptance and relationship context: a model of substance use disorder treatment outcome. Addiction. 2006;101(8):1167–1177. doi: 10.1111/j.1360-0443.2006.01506.x. [DOI] [PubMed] [Google Scholar]
  18. Gordon, A. et al. – this issue.
  19. Hartzler B, Baer JS, Dunn C, Rosengren DB, Wells E. What is seen through the looking glass: The impact of training on practitioner self-rating of motivational interviewing skills. Behavioural and Cognitive Psychotherapy. 2007;35:431–445. [Google Scholar]
  20. Henggeler SW, Chapman JE, Rowland MD, Halliday-Boykins CA, Randall J, Shackelford J, et al. Statewide adoption and initial implementation of contingency management for substance-abusing adolescents. Journal of Consulting and Clinical Psychology. 2008;76:556–567. doi: 10.1037/0022-006X.76.4.556. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Henggeler SW, Sheidow AJ, Cunningham PB, Donohue BC, Ford JD. Promoting the implementation of an evidence-based intervention for adolescent marijuana abuse in community settings: Testing the use of intensive quality assurance. Journal of Clinical Child and Adolescent Psychology. 2008;37:682–689. doi: 10.1080/15374410802148087. [DOI] [PubMed] [Google Scholar]
  22. Humphreys K, Weingardt KR, Horst D, Joshi AA, Finney JW. Prevalence and predictors of research participant criteria in alcohol treatment outcome studies: 1970-1998. Addiction. 2005;100:249–257. doi: 10.1111/j.1360-0443.2005.01175.x. [DOI] [PubMed] [Google Scholar]
  23. Keller DS, Galanter M. Technology transfer of network therapy to community-based addictions counselors. Journal of Substance Abuse Treatment. 1999;16:183–189. doi: 10.1016/s0740-5472(98)00044-0. [DOI] [PubMed] [Google Scholar]
  24. Kellogg SH, Burns M, Coleman P, Stitzer ML, Wale JB, Kreek MJ. Something of value: The introduction of contingency management interventions into the New York City Health and Hospital Addiction Treatment Service. Journal of Substance Abuse Treatment. 2005;28:57–65. doi: 10.1016/j.jsat.2004.10.007. [DOI] [PubMed] [Google Scholar]
  25. Kerwin ML, Walker-Smith K, Kirby KC. Comparative analysis of state requirements for the training of substance abuse and mental health counselors. Journal of Substance Abuse Treatment. 2006;30:173–181. doi: 10.1016/j.jsat.2005.11.004. [DOI] [PubMed] [Google Scholar]
  26. Knudsen HK, Ducharme LJ, Roman PM. Research network involvement and addiction treatment center staff: Counselor attitudes toward buprenorphine. American Journal on Addictions. 2007;16:365–371. doi: 10.1080/10550490701525418. [DOI] [PubMed] [Google Scholar]
  27. Liddle HA, Rowe CL, Gonzalez A, Henderson CE, Dakof GA, Greenbaum PE. Changing provider practices, program environment, and improving outcomes by transporting multidimensional family therapy to an adolescent drug treatment setting. The American Journal on Addictions. 2006;15:102–112. doi: 10.1080/10550490601003698. [DOI] [PubMed] [Google Scholar]
  28. Liddle HA, Rowe CL, Quille TJ, Dakof GA, Mills DS, Sakran E, et al. Transporting a research-based adolescent drug treatment into practice. Journal of Substance Abuse Treatment. 2002;22:231–243. doi: 10.1016/s0740-5472(02)00239-8. [DOI] [PubMed] [Google Scholar]
  29. Martino S, Ball SA, Nich C, Frankforter TL, Carroll KM. Community program therapist adherence and competence in motivational enhancement therapy. Drug and Alcohol Dependence. 2008;96:37–48. doi: 10.1016/j.drugalcdep.2008.01.020. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. McCrady BS. Alcohol use disorders and the Division 12 Task Force of the American Psychological Association. Psychology of Addictive Behaviors. 2000;14:267–276. [PubMed] [Google Scholar]
  31. McGovern MP, Carroll KM. Evidence-based practices for substance use disorders. Psychiatric Clinics of North America. 2003;26(4):991–1010. doi: 10.1016/s0193-953x(03)00073-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. McLellan AT, Carise D, Kleber H. Can the national addiction treatment infrastructure support the public’s demand for quality care? Journal of Substance Abuse Treatment. 2003;25:117–121. [PubMed] [Google Scholar]
  33. McLellan AT, Kemp J, Brooks A, Carise D. Improving public addiction treatment through performance contracting: the Delaware experiment. Health Policy. 2008;87:296–308. doi: 10.1016/j.healthpol.2008.01.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Miller WR, Mount KA. A small study of training in motivational interviewing: Does one workshop change clinician and client behavior? Behavioral and Cognitive Psychotherapy. 2001;29:457–471. [Google Scholar]
  35. Miller WR, Sorensen JL, Selzer JA, Brigham GS. Disseminating evidence-based practices in substance abuse treatment. Journal of Substance Abuse Treatment. 2006;31:25–39. doi: 10.1016/j.jsat.2006.03.005. [DOI] [PubMed] [Google Scholar]
  36. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. Journal of Consulting and Clinical Psychology. 2004;72:1050–1062. doi: 10.1037/0022-006X.72.6.1050. [DOI] [PubMed] [Google Scholar]
  37. Mitcheson L, Bhavsar K, McCambridge J. Randomized trial of training and supervision in motivational interviewing with adolescent drug treatment practitioners. Journal of Substance Abuse Treatment. 2009;37:73–78. doi: 10.1016/j.jsat.2008.11.001. [DOI] [PubMed] [Google Scholar]
  38. Moos RH. Addictive disorders in context: Principles and puzzles of effective treatment and recovery. Psychology of Addictive Behaviors. 2003;17:3–12. doi: 10.1037/0893-164x.17.1.3. [DOI] [PubMed] [Google Scholar]
  39. Moos RH. Theory-based active ingredients of effective treatments for substance use disorders. Drug and Alcohol Dependence. 2007a;88:109–121. doi: 10.1016/j.drugalcdep.2006.10.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
  40. Moos RH. Theory-based processes that promote remission of substance use disorders. Clinical Psychology Review. 2007b;27:537–551. doi: 10.1016/j.cpr.2006.12.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Morgenstern J, Blanchard KA, Morgan TJ, Labouvie E, Hayaki J. Testing the effectiveness of cognitive-behavioral treatment for substance abuse in a community setting: Within treatment and posttreatment findings. Journal of Consulting and Clinical Psychology. 2001;69:1007–1017. doi: 10.1037//0022-006x.69.6.1007. [DOI] [PubMed] [Google Scholar]
  42. Morgenstern J, Longabaugh R. Cognitive-behavioral treatment for alcohol dependence: a review of evidence for its hypothesized mechanisms of action. Addiction. 2000;95(10):1475–1490. doi: 10.1046/j.1360-0443.2000.951014753.x. [DOI] [PubMed] [Google Scholar]
  43. Morgenstern J, McKay JR. Rethinking the paradigms that inform behavioral treatment research for substance use disorders. Addiction. 2007;102(9):1377–1389. doi: 10.1111/j.1360-0443.2007.01882.x. [DOI] [PubMed] [Google Scholar]
  44. Morgenstern J, Morgan TJ, McCrady BS, Keller DS, Carroll KM. Manual-guided cognitive-behavioral therapy training: A promising method for disseminating empirically supported substance abuse treatments to the practice community. Psychology of Addictive Behaviors. 2001;15:83–88. [PubMed] [Google Scholar]
  45. Moyers TB, Manuel JK, Wilson PG, Hendrickson SML, Talcot W, Durand P. A randomized trial investigating training in motivational interviewing for behavioral health providers. Behavioral and Cognitive Psychotherapy. 2008;36:149–162. [Google Scholar]
  46. Moyers TB, Martin T, Catley D, Harris KJ, Ahluwalia JS. Assessing the integrity of motivational interviewing interventions: Reliability of the motivational interviewing skills code. Behavioural and Cognitive Psychotherapy. 2003;31:177–184. [Google Scholar]
  47. Moyers TB, Martin T, Manuel JK, Hendrickson SML, Miller WR. Assessing competence in the use of motivational interviewing. Journal of Substance Abuse Treatment. 2005;28:19–26. doi: 10.1016/j.jsat.2004.11.001. [DOI] [PubMed] [Google Scholar]
  48. National Institute on Alcohol Abuse and Alcoholism. COMBINE Monograph Series, Volume 1: Combined Behavioral Intervention Manual: A Clinical Research Guide for Therapists Treating People With Alcohol Abuse and Dependence. 2004. NIH Pub. No. 04-5288. [Google Scholar]
  49. National Quality Forum. National voluntary consensus standards for the treatment of substance use conditions: Evidence-based treatment practices. 2007. Retrieved from http://www.rwjf.org/pr/product.jsp?id=20611.
  50. Peters RH, Moore KA, Hills HA, Young MS, LeVasseur JB, Rich AR, et al. Use of opinion leaders and intensive training to implement evidence-based co-occurring disorders treatment in the community. In: Edmundson E, McCarty D, editors. Implementing evidence-based practices for treatment of alcohol and drug disorders. Haworth Press; New York: 2005. pp. 53–74. [Google Scholar]
  51. Read JP, Kahler CW, Stevenson JF. Bridging the gap between alcoholism treatment research and practice: Identifying what works and why. Professional Psychology. 2001;32(3):227–238. [Google Scholar]
  52. Rieckmann TR, Kovas AE, Fussell HE, Stettler NM. Implementation of evidence-based practices for treatment of alcohol and drug disorders: The role of the State Authority. The Journal of Behavioral Health Services & Research. 2009;36(4):407–419. doi: 10.1007/s11414-008-9122-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Riley KJ, Rieckmann T, McCarty D. Implementation of MET/CBT5 for adolescents. The Journal of Behavioral Health Services & Research. 2008;32:207–215. doi: 10.1007/s11414-008-9111-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Rogers EM. Diffusion of innovations. 5th edition Free Press; New York, NY: 2003. [Google Scholar]
  55. Rubel EC, Sobell LC, Miller WR. Do continuing education workshops improve participants’ skills? Effects of a motivational interviewing workshop on substance abuse counselors’ skills and knowledge. The Behavior Therapist. 2000;23:73–77. [Google Scholar]
  56. Schoener EP, Madeja CL, Henderson MJ, Ondersma SJ, Janisse JJ. Effects of motivational interviewing training on mental health therapist behavior. Drug and Alcohol Dependence. 2006;82:269–275. doi: 10.1016/j.drugalcdep.2005.10.003. [DOI] [PubMed] [Google Scholar]
  57. Shafer MS, Rhode R, Chong J. Using distance education to promote the transfer of motivational interviewing skills among behavioral health professionals. Journal of Substance Abuse Treatment. 2004;26:141–148. doi: 10.1016/S0740-5472(03)00167-3. [DOI] [PubMed] [Google Scholar]
  58. Sholomskas DE, Syracuse-Stewart G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don’t train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting and Clinical Psychology. 2005;73:106–115. doi: 10.1037/0022-006X.73.1.106. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Simpson DD. A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment. 2002;22:171–182. doi: 10.1016/s0740-5472(02)00231-3. [DOI] [PubMed] [Google Scholar]
  60. Squires DD, Gumbley SJ, Storti SA. Training substance abuse treatment organizations to adopt evidence-based practices: The Addiction Technology Transfer Center of New England Science to Service Laboratory. Journal of Substance Abuse Treatment. 2008;34:293–301. doi: 10.1016/j.jsat.2007.04.010. [DOI] [PubMed] [Google Scholar]
  61. Substance Abuse & Mental Health Services Administration. 2007 National Survey of Substance Abuse Treatment Services (N-SSATS). 2007. Retrieved September 17, 2010, from http://www.oas.samhsa.gov/nssats2k7/NSSATS2k7Tbl4.10.htm
  62. Swearingen CE, Moyer A, Finney JW. Alcoholism treatment outcome studies, 1970-1998: An expanded look at the nature of the research. Addictive Behaviors. 2003;28:415–436. doi: 10.1016/s0306-4603(01)00251-9. [DOI] [PubMed] [Google Scholar]
  63. Walters ST, Matson SA, Baer JS, Ziedonis DM. Effectiveness of workshop training for psychosocial addiction treatments: A systematic review. Journal of Substance Abuse Treatment. 2005;29:283–293. doi: 10.1016/j.jsat.2005.08.006. [DOI] [PubMed] [Google Scholar]
  64. Weingardt KR, Gifford EV. Expanding the vision of implementing effective practices. Addiction. 2007;102:864. [Google Scholar]
  65. Weingardt KR, Villafranca SW, Levin C. Technology-based training in cognitive behavioral therapy for substance abuse counselors. Substance Abuse. 2006;27(3):19–25. doi: 10.1300/J465v27n03_04. [DOI] [PubMed] [Google Scholar]
  66. Westen D, Novotny CM, Thompson-Brenner H. The empirical status of empirically supported psychotherapies: Assumptions, findings, and reporting in controlled clinical trials. Psychological Bulletin. 2004;130:631–663. doi: 10.1037/0033-2909.130.4.631. [DOI] [PubMed] [Google Scholar]
  67. Williams, E. et al. – this issue.
