Addiction Science & Clinical Practice. 2010 Dec;5(2):40–43.

Response: fidelity and flexibility

Michael Shopshire, Michael Levy, Carrie Dodrill
PMCID: PMC3120120  PMID: 22002452

Michael Shopshire: Programs and counselors are clearly interested in and even excited by motivational interviewing (MI) and other evidence-based practices. They attend trainings and say they implement evidence-based practices, but we don’t know what they’re actually doing in their sessions. They may not really be implementing the practices in the way that the creators intended or in a way that is supported by evidence.

I try not to be rigid about following treatment manuals. Speaking as one who has developed a manual-based treatment, I really believe that it’s useful for a clinician to make a treatment his or her own. However, you still need to be sure that you maintain the basic mechanism of change that makes the treatment work. As I’ve trained people in my cognitive-behavioral anger management treatment, some clinicians have said, “Well, I do your anger management treatment, but I only do the parts of it that I like.” There really is a bottom line: Either you are teaching a client a cognitive-behavioral anger management strategy, or you’re doing something that isn’t evidence-based at all. I’ve had people say, “Oh, I just let my clients have a temper tantrum, so they get their anger out in a cathartic way.” Well, wait a minute, that’s something the manual says you’re not supposed to do. If you do that, you’re no longer doing what researchers consider an effective approach.

So the question becomes, how do we make sure that people do what is prescribed in the treatment manual and don’t introduce contradictory practices or water down the treatment? Part of the formula is training, so that front-line clinicians know how to do the treatment in the first place; the other part is adherence, so that clinicians apply it correctly and consistently in practice. That’s where supervision is critical.

Dr. Martino’s product, Motivational Interviewing Assessment: Supervisory Tools for Enhancing Proficiency (MIA:STEP), is a good example of how one can take an evidence-based treatment and come up with procedures for supervising clinicians’ performance. It’s innovative: one of the few instances in which researchers have made a concerted effort to develop a training course for supervisors.

To date, the California–Arizona Node of the NIDA Clinical Trials Network (which is now part of the Western States Node) has conducted about three trainings in MIA:STEP. We took a two-step approach. First, because we had found that a lot of clinicians said they’d taken classes on MI but weren’t comfortable enough to actually implement it, we hired an advanced trainer from the Motivational Interviewing Network of Trainers to prepare the front-line clinical staff. Then, in our second step, we tried to attract the clinicians’ supervisors to complete the supervisor training. Unfortunately, that didn’t go as well as we had hoped. Only a few supervisors attended. We will follow up with those programs to see whether the supervisors were actually able to implement the MIA:STEP procedures and, where they were not, to identify the reasons.

The low response from supervisors is very understandable. Programs these days are very busy treating their clients and dealing with various challenges. They may be coping with funding constraints and just trying to get by. Implementing something new and complicated may not be seen as a top priority compared with giving their clients the basic services they need. So even though programs are interested in learning about MI, they may not follow through and implement it in the precise manner that’s prescribed by the treatment manual. Some programs appear interested in MI because of mandates, rather than because they’re convinced it can improve their outcomes. As long as they feel that way, they may not see that it’s worth the effort that’s required to implement it with the fullest possible fidelity.

The supervision model that’s embodied in MIA:STEP is something that’s very familiar to researchers. The supervisor sits in on a session or listens to a tape, decides whether each transaction between the counselor and client is consistent with the treatment manual, and rates the transaction on adherence and competence. As researchers, we’re very aware of how to come up with competency and adherence measures and do this kind of rating. It’s a very microlevel critique. Rating portions of two session tapes might take 2 to 3 hours, and then it takes more time to give the clinicians their feedback. As researchers we’re used to it. However, it’s different from the kind of supervision that most programs do, and programs may find it too complicated and time-consuming to implement. The supervisor has to become almost an expert on the intervention to be able to recognize which interactions follow the manual and which don’t. It’s very difficult to get front-line clinicians engaged to this extent, especially because there’s usually no one they can directly bill for it.
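To make concrete what that microlevel critique involves, here is a minimal sketch in Python of the kind of bookkeeping it implies: one rating per counselor–client transaction, rolled up into session-level feedback. The field names and the 1-to-7 scales are illustrative assumptions for this sketch, not the actual MIA:STEP rating instrument.

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative sketch only: the fields and 1-7 scales below are assumptions,
# not the actual MIA:STEP rating forms.

@dataclass
class TransactionRating:
    minute: float      # position in the session tape
    strategy: str      # e.g., "open question", "reflection"
    adherence: int     # 1 (not at all consistent with the manual) .. 7
    competence: int    # 1 (very poor execution) .. 7 (excellent)

def summarize(ratings):
    """Collapse per-transaction ratings into session-level feedback scores."""
    return {
        "n_transactions": len(ratings),
        "mean_adherence": round(mean(r.adherence for r in ratings), 2),
        "mean_competence": round(mean(r.competence for r in ratings), 2),
    }

session = [
    TransactionRating(3.5, "open question", adherence=6, competence=5),
    TransactionRating(7.0, "reflection", adherence=7, competence=6),
    TransactionRating(12.2, "confrontation", adherence=2, competence=3),
]
print(summarize(session))
# {'n_transactions': 3, 'mean_adherence': 5.0, 'mean_competence': 4.67}
```

Even in this toy form, it is easy to see why rating every transaction on a tape, and then walking the clinician through the results, consumes hours of a supervisor’s week.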

There may be ways to ease the burden on supervisors’ time. They might listen to only parts of sessions, or they might collect tapes from all sessions and randomly select a few to evaluate. Clinicians might be motivated to adhere consistently if they knew that any one of their sessions might be evaluated. It may also be possible to use an outside agency to provide expert review of sessions. In multisite clinical trials, session tapes are often sent to a central location for review and feedback by experts in the research protocol. Still, programs might be wary of such an arrangement. Some criticize the evidence-based approach on the grounds that these treatments always seem to come out of elite universities, and it appears to them that businesses are built around the treatments. The treatment manuals must be purchased and the trainers must be paid; if programs were then asked to pay a company to do the rating as well, they might feel that the effort is motivated more by financial profit than by a genuine interest in improving treatment outcomes.
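The random-selection idea mentioned above could be as simple as the following Python sketch; the directory layout, file format, and two-tapes-per-clinician quota are hypothetical assumptions, not a prescribed procedure.

```python
import random
from pathlib import Path

# Hypothetical layout: session_tapes/<clinician>/<session>.mp3, one audio
# file per recorded session. The directory name, file format, and the
# per-clinician quota are assumptions for this sketch.
TAPE_DIR = Path("session_tapes")
TAPES_PER_CLINICIAN = 2

def pick_tapes_for_review(seed=None):
    """Randomly select a few tapes per clinician, so any given session
    might be the one evaluated without the supervisor hearing them all."""
    rng = random.Random(seed)
    selections = {}
    for clinician_dir in sorted(TAPE_DIR.iterdir()):
        if not clinician_dir.is_dir():
            continue
        tapes = sorted(clinician_dir.glob("*.mp3"))
        k = min(TAPES_PER_CLINICIAN, len(tapes))
        selections[clinician_dir.name] = rng.sample(tapes, k)
    return selections

if __name__ == "__main__":
    # Requires the hypothetical session_tapes/ directory to exist.
    for clinician, tapes in pick_tapes_for_review(seed=1).items():
        print(clinician, "->", [t.name for t in tapes])
```

Because clinicians cannot predict which sessions will be drawn, the incentive to adhere applies to every session, while the supervisor’s listening time stays bounded.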

Dr. Martino’s article points in the right direction and gives hope that we can come up with innovative ways to overcome these obstacles. Maybe we can develop self-paced online training alternatives for clinicians. Maybe we can make supervision easier by training clinicians to a higher level of skill, which will in turn increase adherence and fidelity before supervisor ratings are implemented. Ultimately, we need to convince programs of the importance of this kind of supervision and work toward a cultural shift, so that there’s a spirit of striving to adhere.

Michael Levy: Dr. Martino’s article lays out very well the different approaches to training and the incredible challenges to implementing evidence-based practices in real-world settings. CAB’s programs have experience with a number of evidence-based treatments, including methadone and buprenorphine, contingency management, Seeking Safety, the Adolescent Community Reinforcement Approach coupled with Assertive Continuing Care (ACRA/ACC), and Motivational Enhancement Therapy/Cognitive Behavioral Therapy 5 (MET/CBT 5). Our training approaches for these treatments have differed. For contingency management, I attended some trainings, and we gave a couple of in-house trainings using materials provided by the Addiction Technology Transfer Center. For the ACRA/ACC, one supervisor attended a workshop and became certified, then trained our staff. He used digital recorders to review their work and gave them feedback. His reviews were monitored, in turn, by the outside agency that developed ACRA/ACC.

We’re currently using the MIA:STEP model to train supervisors to work with their clinicians in MI. The clinicians have taken workshop training, but without ongoing supervisor monitoring, you don’t really know how well they sustain what they’ve learned; it’s kind of a black box. Supervisor support is critical: someone has to look very closely at what people do, code the work, and give them feedback, like, “Here’s something you could do a little better.”

Clinical supervisors are busy people. When you’re rolling out something like this, it’s much more manageable for each supervisor to review tapes of a couple of clinicians at first, rather than his or her whole group, and to listen to maybe 15 minutes of each tape. When the first clinicians are doing well, supervisors can move on to a couple of others, and so on. Otherwise, the clinical supervisors will be overwhelmed.

Recording patient sessions has not been the norm in our organization. It represents a cultural shift. Many people were and are scared about it. However, we haven’t had much resistance. A key to successfully introducing any new practice is that the counselors have to really want to do it. For example, we started Seeking Safety at a time when our counselors were looking for ways to help women who struggled with trauma and substance use disorders, so the staff really were invested and eager to do it. Many of our clients grapple with ambivalence about change, so when I was ready to introduce MI and said, “Hey, do people want to get trained in this really cool process to assist people who are ambivalent about changing?” I got a lot of buy-in. If I hadn’t presented it in terms of how it can help counselors with a challenge that they all face, but instead had just said, “This is what we’re going to do,” I think we’d have been doomed to fail.

Dr. Martino talks about organizational culture in his paper. I think this is another reason we haven’t had much resistance to session monitoring. CAB is known for not doing business just one way, but for trying to do cutting-edge, state-of-the-art things. When we hire people, they’re aware that we embrace a lot of different treatment modalities, and we aim for them to be skilled in a lot of different things to best serve our clients. Although recording patient sessions is now scary for some staff, it will eventually become an established part of this culture, and new clinicians coming on will see it as just a feature of the way we do things here.

Some counselors have spoken about feeling, at times, inhibited by the supervisory oversight. They feel self-conscious knowing that they will be rated, and that hinders their work a little bit. But, once they reach a level of proficiency and adherence, the frequency of reviews drops, they can make the intervention more their own, and it starts to feel more comfortable.

How much flexibility needs to be built into an evidence-based treatment to make it as good as possible? I think there should be a fair amount, because, for example, it’s important to meet patients where they are. I could be following a manual and thinking, “This is what I’m going to do,” but when that client comes in, he or she is in a totally different place. If I don’t adjust and work a little differently, I might not engage the client, or the client might not be happy with that day’s session. I sometimes tell counselors to regard a new evidence-based practice as something new to put into their tool kit, one more thing they can use along with the other things they do. In practice, they draw from this, they draw from that, and they eventually make the intervention their own.

I was struck by Dr. Martino’s comment that counselors’ performance of MI may waver when the clinical going gets tough. He cites a study in which counselors demonstrated effective use of MI when clients expressed high levels of ability to change, but the counselors performed the intervention poorly when clients said they found it difficult not to drink. That is a point worth thinking about. It suggests that counselors who are going to deliver a treatment in community programs may require a higher level of training than those who administer it in clinical trials in research settings. The reason for this would be that the people who volunteer to participate in clinical trials may be more ready to change than those in the community programs, many of whom are there because a spouse or parole officer has given them an ultimatum.

Dr. Martino gives a good account of what we know about training for evidence-based practices, but it’s worth pointing out that evidence-based practices are only a part of the puzzle of how to help people recover. Many things go into a quality treatment program. Our staff get training in our treatment philosophy, quality management, Addictions 101, and the importance of customer service. We view and discuss a tape of the Stanford prison experiment (Haney, Banks, and Zimbardo, 1973) to increase awareness of the power we have over patients and the need to take care not to misuse it, even with the best intentions. There’s a lot of research that supports the importance of nonspecific variables, such as the quality of the therapeutic alliance, in patient outcomes. We could use more research that looks at what clinicians are actually doing moment to moment in therapy, because I think a lot of people are doing pretty good work. They’ve never written it up in a manual, but I think some treatment as usual is pretty good stuff.

Carrie Dodrill: I think the best clinical skill set is to be able to draw from an armamentarium of evidence-supported procedures and adapt to individual situations and things that are observed in sessions. It’s better to be flexible and apply a variety of evidence-based processes than to just do the same workbook with every person in the same way all the time.

I’m always fascinated by the many people who come to MI training believing that they’re already using the technique. They think MI is so basic that sometimes they don’t pay attention in the beginning. However, if you record and listen to exactly what they say in their sessions, it’s not MI. After they’ve gotten some feedback, they realize, “Oh, that’s what you mean,” and that it’s not so easy. For example, they’re doing reflective listening, which does become basic when you practice it. But they’re not sustaining a two-to-one ratio of reflections to questions, and that’s hard to do without sounding robotic.
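That two-to-one benchmark becomes easy to monitor once sessions are coded utterance by utterance. Here is a minimal Python sketch, assuming a coded transcript is available; the behavior-code labels are loose stand-ins for MI coding categories, not any official coding instrument.

```python
# Minimal sketch: check the reflection-to-question ratio in a coded session.
# The behavior-code labels are illustrative stand-ins, not an official MI
# coding scheme.

REFLECTIONS = {"simple_reflection", "complex_reflection"}
QUESTIONS = {"open_question", "closed_question"}

def reflection_question_ratio(codes):
    """Return reflections per question; the benchmark discussed here is 2.0."""
    reflections = sum(1 for c in codes if c in REFLECTIONS)
    questions = sum(1 for c in codes if c in QUESTIONS)
    return float("inf") if questions == 0 else reflections / questions

# A toy coded session: four reflections to two questions.
session_codes = [
    "open_question", "simple_reflection", "complex_reflection",
    "closed_question", "simple_reflection", "simple_reflection",
]
print(f"ratio = {reflection_question_ratio(session_codes):.1f} (target: 2.0)")
```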

So I agree with Dr. Martino that workshops are necessary for MI training, but not sufficient. You get the most behavior change with the blended approach, with ongoing supervision. The supervision can be done individually or in a group, by phone or in person, by recording or by directly observing counseling sessions, but one way or another, it’s indispensable to review actual sessions.

Counselors will be pretty nervous unless they are confident that their supervisors will score their tapes without bias. Anxiety about their scores and keeping their jobs might disrupt their learning. To alleviate that fear, in the Screening, Brief Intervention, and Referral to Treatment (SBIRT) project in Houston, a separate team of trainers and coaches reviewed tapes and provided skill scores for the providers every quarter. The others on the team and I were not the providers’ direct supervisors and had no say over whether they kept their jobs. Some providers still felt concerned that the scores would affect their job security, but overall I think they were comfortable and glad to have a chance to make mistakes and develop their skills in a somewhat protected setting. If you’re going to use an in-house rater, it helps to have someone who is highly trained and expert in the approach that he or she rates. That removes some of the subjectivity from the scoring.

As far as I know, no one in the Houston SBIRT program has been fired solely for not meeting MI performance criteria; the one or two who lost their jobs were also failing to meet other performance criteria. A couple of counselors were put on probation when they couldn’t use the skills after taking training that seemed to be sufficient for everyone else. These counselors got some extra training and feedback. I recall that they came up to par, though perhaps one of them dropped below par again and hovered right on the edge.

Dr. Martino mentions the idea of giving clinicians cash prizes for learning treatments. We talked about doing this in the SBIRT project but ultimately couldn’t find a way to do it within the county rules on pay and promotions. For private organizations, though, I think it’s a great idea. Incentives can never hurt.

When we talk about training examples in the addiction field, we’re usually talking about MI training. That’s because MI developers have said from the outset that anyone can learn to do the technique, and they’ve created an extensive set of resources for training both supervisors and front-line providers. That’s not the case with some other evidence-based approaches that seem to presume that only a subset of people with certain types of training really can do them. For those treatments to have the best chance to be utilized to full advantage, it’s important that providers be exposed to them in their graduate training.

One type of training Dr. Martino doesn’t mention is team-based learning. I had a very good experience training teams of medical residents in what to do with patients who misuse alcohol. There was an initial lecture or workshop for a few hours, and then three annual booster sessions, each with a refresher lecture and a case example. The residents formed teams to go over each example and answer questions about it based on what they had learned. Teams that got all the questions right won a prize of candy or a healthy snack. We also gave prizes to the resident who had done the most screenings for alcohol misuse and for other achievements. The residents found the training very engaging, and we felt it was successful.

Implementing a new practice puts considerable demands on an organization. Leaders have to believe that doing so will improve their operation or outcomes. They have to build time for training into clinicians’ schedules and into their budget. They absolutely must obtain buy-in from the people who are going to be trained in the new practice. I’ve seen cases where these things didn’t happen for practices that have strong evidence supporting their efficacy, and that’s unfortunate.

REFERENCE

1. Haney C, Banks C, Zimbardo P. Interpersonal dynamics in a simulated prison. International Journal of Criminology and Penology. 1973;1(1):69–97.

