Author manuscript; available in PMC 2010 Apr 13.
Published in final edited form as: Clin Psychol (New York). 2010 Mar 10;17(1):77–81. doi: 10.1111/j.1468-2850.2009.01196.x

When Technology Fails: Getting Back to Nature

Aaron Hogue 1
PMCID: PMC2853972  NIHMSID: NIHMS189083  PMID: 20396629

Abstract

Research on substance use disorders has produced a slew of disappointments in studies designed to confirm basic principles of the technology approach to treatment dissemination. These setbacks should inspire addictions science to pursue complementary paths of inquiry that focus on evidence-based practices delivered under naturalistic conditions. This will require larger accommodations to, and closer partnerships with, the indigenous cultures of everyday care.

Keywords: alcohol users, substance use disorders, technology, treatment dissemination


Carey, Henson, Carey, and Maisto (2010) examine mechanisms of change in a brief motivational intervention for college-age alcohol users by analyzing mediational effects in two key domains: drinking motivation and drinking norms. The study contains most of the prized features of rigorous mechanisms research: an empirically based treatment (EBT), significant outcome effects in the study sample, mediator constructs with strong conceptual and procedural links to the centerpiece EBT, multidimensional operationalization of study variables, multiple measurement points for mediators and outcomes, and state-of-the-science longitudinal analytic techniques. The study also contains the all-too-common result for mechanisms studies involving substance use treatments: disappointing effects for the presumptive mediators (in this case, readiness to change and perceived costs/benefits of drinking).

Addictions research has witnessed a slew of disappointments in small- and large-scale studies designed to confirm basic principles of the technology approach to treatment dissemination. The technology approach attempts to articulate EBT implementation in precise fashion so that the curative elements of a given model can be specified, evaluated, and replicated (see Carroll & Nuro, 2002; Rounsaville, Carroll, & Onken, 2001). This includes standardizing the model in a treatment manual; identifying the population for whom treatment is intended; documenting procedures for selecting, training, and supervising practitioners; and monitoring implementation with valid fidelity tools. Randomized controlled trials, client-treatment matching studies, and mechanisms of change analyses are the research designs of choice for verifying and explaining treatment effects. The exacting standards of the technology approach are intended to promote the feasibility and strength of EBTs when delivered in various clinical settings.

TECHNOLOGY GLITCHES

The technology approach has spearheaded enormous gains in laboratory-based research on EBT implementation (via efficacy trials) and has started to make headway in real-world settings (via effectiveness studies). However, several noteworthy glitches in the technology approach have persistently occurred in addictions treatment. Because many of these glitches have arisen in high-profile, well-controlled studies, they are not easily dismissed as random or marginal. Three glitches seem particularly troublesome.

Mechanisms Do Not (Always) Perform

As Morgenstern and McKay (2007) point out, a cornerstone premise of the technology approach is the specificity hypothesis: EBTs produce impacts largely due to the curative effects of model-specific, theory-based techniques that differ from common elements or placebo effects of psychotherapy. The specificity hypothesis is the underlying premise of both mediational research and client-treatment matching studies. In the substance use field, tests of mediation have yielded positive and negative results in equal abundance (Morgenstern & McKay, 2007), and matching studies have failed across the board to confirm hypotheses about which treatments are best suited for which clients (Carroll & Rounsaville, 2007; Morgenstern & McKay, 2007). Even surefire hypotheses regarding the benefits of strong fidelity to core EBT ingredients are frequently upended. Recent studies of both adolescent (Hogue et al., 2008) and adult substance users (Barber et al., 2006) have reported a curvilinear relation between treatment adherence and some client outcomes: Too much adherence (as well as too little) can be a bad thing. And therapist competence in delivering EBTs is often weakly related or unrelated to outcome—when competence can be reliably assessed and differentiated from adherence, a difficult trick to master (Barber, Sharpless, Klostermann, & McCarthy, 2007).

Technology Does Not (Easily) Transfer

The task of transporting EBTs to everyday settings—known as technology transfer—has proven formidable. A host of factors influences the amenability of community agencies to adopting EBTs, including strength of partnership between EBT developers and providers, belief by clinical personnel in the value of integrating EBTs into existing services, and suitability of agency resources (including personnel) and organizational context for implementing EBTs with fidelity (Simpson, 2002). It appears that technology transfer can work under conditions of extensive support from model developers utilizing quality assurance (QA) “superstructures” to cultivate training, implementation, and monitoring activities on-site. QA superstructures invariably contain four components (see Fixsen, Naoom, Blase, Friedman, & Wallace, 2005): (a) guidelines for selecting adoption-ready sites and identifying qualified staff for training; (b) standardized training toolkits that include a treatment manual, protocol for training workshops, demonstration videos and clinician workbooks, on-site supervision procedures, and fidelity checklists; (c) procedures for ongoing training and consultation from model experts that include observational coaching of clinic cases; and (d) continuous quality improvement procedures to evaluate implementation data, feed selected data back to therapists, and buttress organizational support. However, commitment to QA superstructures demands substantial and costly changes in agency infrastructure, administrative and clinical supervision, material resources, and ongoing technical support. And as yet there is no evidence that EBTs can be sustained in usual care after external support ends, or what level of partnership is needed to maintain a “good enough” QA structure indefinitely.

EBTs Do Not (Uniformly) Outperform TAU

Perhaps the most surprising technology glitch has been the strong showing of treatment as usual (TAU) conditions in EBT effectiveness research. Available drug treatment services have repeatedly produced outcomes on par with competing EBTs transported with great care into existing agencies (e.g., Miller, Yahne, & Tonigan, 2003; Morgenstern, Blanchard, Morgan, Labouvie, & Hayaki, 2001; Westerberg, Miller, & Tonigan, 2000). Also, a manualized version of drug counseling (aka 12 Step Model), the most widely practiced approach in the substance abuse treatment system, has matched or exceeded various EBTs in two multisite controlled trials (Crits-Christoph et al., 1999; Project MATCH Research Group, 1997), such that 12 Step merits serious consideration as an EBT itself. And recent efforts by the Clinical Trials Network of the National Institute on Drug Abuse to test the effectiveness of motivational interviewing (MI) in usual care for adult substance users have logged modest EBT victories or split decisions versus TAU: MI delivered during an initial evaluation session produced better early retention in treatment but no superiority in one- or three-month outcomes (Carroll et al., 2006); three-session MI was superior in three-month outcomes for primary alcohol use but not primary drug use (with findings further complicated by site effects), and there were no differences in retention (Ball et al., 2007); and retention and outcome effects for MI were virtually indistinguishable from TAU in a Hispanic sample (Carroll et al., 2009).

ORGANIC ALTERNATIVES

Acknowledging that technology glitches are persistent and problematic is not tantamount to disparaging the technology approach. There is little reason to doubt that ongoing advances in mechanisms of change research will yield better understanding of how and why treatments work—and so forth for other technology facets. For example, germane to Carey et al. (2010), MI has a strong record of success in mediational studies, and results from these and also from therapist training studies have been translated into meaningful improvements in the theory and practice of MI (Miller & Rose, 2009). The mixed findings by Carey et al. (2010) do not detract from this body of work so much as challenge the architects of MI (and its offshoots) to understand them in context and upgrade model development as needed.

Nevertheless, the technology setbacks encountered for all varieties of EBTs should inspire addictions science to pursue complementary paths of inquiry that are considerably less traveled but potentially as rewarding. These paths would converge in focusing on evidence-based practices (EBPs) delivered under naturalistic conditions. As described below, moving efficiently from lab-developed EBTs to practice-friendly EBPs may require larger accommodations to, and closer partnerships with, the indigenous cultures of everyday care (see also Southam-Gerow, 2004, for a similar perspective regarding mental health treatment).

Live More Simply

Concerns about the feasibility of EBT technology transfer have led researchers in both mental health and substance use to advocate a “core elements” approach to increasing use of EBPs in agency settings. The core elements approach emphasizes dissemination of reduced sets of essential treatment techniques common across EBT models for similar populations. The benefits of shifting away from wholesale name-brand EBTs toward distilled core techniques could be profound (Chorpita, Daleiden, & Weisz, 2005): unify and simplify the task of transporting curative ingredients of EBTs into routine care with fidelity; forego the herculean demand to master a different treatment manual for each clinical disorder; retain the importance of provider judgment about duration, intensity, and sequencing of EBPs; and provide evidence-based options for client groups with diagnostic complexity and/or for whom no manualized EBT exists. Disseminating EBT core elements could also enhance dissemination of full-scale EBTs by augmenting the basic technical competencies of community practitioners and galvanizing the process of adapting discrete manuals to fit usual care (Chorpita & Daleiden, 2009). Core techniques might also be deployed as a first-line option in primary behavioral care, with nonresponders referred to more comprehensive, EBT-based specialty care (Carroll & Rounsaville, 2006). Note that the core elements approach subscribes (at least in principle) to the specificity hypothesis, and progress in generating an evidence base for core techniques in various client populations appears linked to progress in mechanisms of change research.

Grow What the Soil Will Sustain

As mentioned above, QA superstructures that accompany complex EBTs may ultimately prove unsustainable in many clinical settings. It seems likely that model-specific algorithms for reducing, modifying, and reinventing EBTs will become de rigueur to supplement existing QA protocols (Garner, 2009). Also, interactive computer-based training and distance learning methods hold great promise for increasing the accessibility and perhaps precision of technology transfer (Weingardt, 2004). On the other hand, QA superstructures may thrive in large government-operated sectors of care where substance use is prevalent: criminal justice, juvenile justice, welfare, child welfare, even schools. Each sector presents a unique service context—and dissemination opportunity—with regard to resource availability, organizational capacities, and barriers to effective treatment implementation (Institute of Medicine, 2006). Because government is often the sole funder of services and a primary stakeholder in the accountability and quality of those services, strong research-government partnership increases the likelihood that EBTs will take root and be sustained at a systems level (Morgenstern, Hogue, Dauber, Dasaro, & McKay, 2009).

Learn the Ecosystem

Technology-driven effectiveness studies exert multifaceted top-down influence over treatment implementation. Little is known about whether EBTs can be delivered with fidelity in unadulterated field settings, that is, without significantly changing the working conditions of line therapists. To remedy this, naturalistic studies featuring observational data collection are needed to investigate what interventions work (if any), and how they work, in community agencies, including whether (some) agencies already incorporate EBPs in everyday practice (Weingardt & Gifford, 2007). Does TAU produce good outcomes (or not) via adapted versions of EBTs? Nonspecific “common factors” such as therapeutic alliance? Placebo effects coupled with client self-change processes? Some combination? It is possible that TAU is predominantly bereft of EBPs (e.g., Santa Ana et al., 2008), but that remains to be seen. Another underutilized strategy to increase knowledge about success and failure in usual care is patient-focused research, which includes examination of client self-change, assessment of treatment responsiveness on a session-by-session basis, and feedback of responsiveness data to therapists that permits midstream adjustments (Morgenstern & McKay, 2007; Orford, 2008). In the long run, generating evidence that EBPs are feasible and cost-effective in pure field conditions may be the best approach to encourage mainstream treatment agencies to adopt and support EBTs.

CONCLUSION

Will the hoped-for transition from lab-based EBTs to real-world EBPs lead to improvements in standard care outcomes? Lamentably few studies in the addictions field address this question (Carroll & Rounsaville, 2007). The technology approach has admirably led the charge in EBT dissemination research, but shortcomings are apparent, and a diversification in approach seems sensible. Organic alternatives cannot replace technology-driven methods, nor will they solve or prevent future technology glitches. However, they may well increase the accuracy and efficiency with which we determine how and where to plant EBTs for sustainable yield.

Acknowledgments

Preparation of this article was supported by grant R01 DA019607 from the National Institute on Drug Abuse.

References

  1. Ball SA, Martino S, Nich C, Frankforter TL, Van Horn D, Crits-Christoph P, et al. Site matters: Multisite randomized trial of motivational enhancement therapy in community drug abuse clinics. Journal of Consulting and Clinical Psychology. 2007;75:556–567. doi: 10.1037/0022-006X.75.4.556.
  2. Barber JP, Gallop R, Crits-Christoph P, Frank A, Thase M, Weiss RD, et al. The role of therapist adherence, therapist competence, and alliance in predicting outcome of individual drug counseling: Results from the National Institute on Drug Abuse Collaborative Cocaine Treatment Study. Psychotherapy Research. 2006;16:229–240.
  3. Barber JP, Sharpless B, Klostermann S, McCarthy KS. Assessing intervention competence and its relation to therapy outcome: A selected review derived from the outcome literature. Professional Psychology: Research and Practice. 2007;38:493–500.
  4. Carey KB, Henson JM, Carey MP, Maisto SA. Perceived norms mediate effects of a brief motivational intervention for sanctioned college drinkers. Clinical Psychology: Science and Practice. 2010;17:59–72. doi: 10.1111/j.1468-2850.2009.01194.x.
  5. Carroll KM, Ball SA, Nich C, Martino S, Frankforter TL, Farentinos C, et al. Motivational interviewing to improve treatment engagement and outcome in individuals seeking treatment for substance abuse: A multisite effectiveness study. Drug and Alcohol Dependence. 2006;81:301–312. doi: 10.1016/j.drugalcdep.2005.08.002.
  6. Carroll KM, Martino S, Ball SA, Nich C, Frankforter T, Anez LM, et al. A multisite randomized effectiveness trial of motivational enhancement therapy for Spanish-speaking substance users. Journal of Consulting and Clinical Psychology. 2009;77:993–999. doi: 10.1037/a0016489.
  7. Carroll KM, Nuro KF. One size cannot fit all: A stage model for psychotherapy manual development. Clinical Psychology: Science and Practice. 2002;9:396–406.
  8. Carroll KM, Rounsaville BJ. Behavioral therapies: The glass would be half full if only we had a glass. In: Miller WR, Carroll KM, editors. Rethinking substance abuse: What the science shows and what we should do about it. New York: Guilford; 2006. pp. 223–239.
  9. Carroll KM, Rounsaville BJ. A vision of the next generation of behavioral therapies research in the addictions. Addiction. 2007;102:850–862. doi: 10.1111/j.1360-0443.2007.01798.x.
  10. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology. 2009;77:566–579. doi: 10.1037/a0014565.
  11. Chorpita BF, Daleiden EL, Weisz JR. Identifying and selecting the common elements of evidence-based interventions: A distillation and matching model. Mental Health Services Research. 2005;7:5–20. doi: 10.1007/s11020-005-1962-6.
  12. Crits-Christoph P, Siqueland L, Blaine JD, Frank A, Luborsky L, Onken LS, et al. Psychosocial treatments for cocaine dependence: National Institute on Drug Abuse Collaborative Cocaine Treatment Study. Archives of General Psychiatry. 1999;56:493–502. doi: 10.1001/archpsyc.56.6.493.
  13. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005. (FMHI Publication #231)
  14. Garner BR. Research on the diffusion of evidence-based treatments within substance abuse treatment: A systematic review. Journal of Substance Abuse Treatment. 2009;36:376–399. doi: 10.1016/j.jsat.2008.08.004.
  15. Hogue A, Henderson CE, Dauber S, Barajas PC, Fried A, Liddle HA. Treatment adherence, competence, and outcome in individual and family therapy for adolescent behavior problems. Journal of Consulting and Clinical Psychology. 2008;76:544–555. doi: 10.1037/0022-006X.76.4.544.
  16. Institute of Medicine. Improving the quality of healthcare for mental and substance-use conditions. Washington, DC: National Academy Press; 2006.
  17. Miller WR, Rose GS. Toward a theory of motivational interviewing. American Psychologist. 2009;64:527–537. doi: 10.1037/a0016830.
  18. Miller WR, Yahne CE, Tonigan JS. Motivational interviewing in drug abuse services: A randomized trial. Journal of Consulting and Clinical Psychology. 2003;71:754–763. doi: 10.1037/0022-006x.71.4.754.
  19. Morgenstern J, Blanchard KA, Morgan TJ, Labouvie E, Hayaki J. Testing the effectiveness of cognitive-behavioral treatment for substance abuse in a community setting: Within treatment and posttreatment findings. Journal of Consulting and Clinical Psychology. 2001;69:1007–1017. doi: 10.1037//0022-006x.69.6.1007.
  20. Morgenstern J, Hogue A, Dauber S, Dasaro C, McKay JR. A practical clinical trial of coordinated care management to treat substance use disorders among public assistance beneficiaries. Journal of Consulting and Clinical Psychology. 2009;77:257–269. doi: 10.1037/a0014489.
  21. Morgenstern J, McKay JR. Rethinking the paradigms that inform behavioral treatment research for substance use disorders. Addiction. 2007;102:1377–1389. doi: 10.1111/j.1360-0443.2007.01882.x.
  22. Orford J. Asking the right questions in the right way: The need for a shift in research on psychological treatments for addiction. Addiction. 2008;103:875–885. doi: 10.1111/j.1360-0443.2007.02092.x.
  23. Project MATCH Research Group. Matching alcoholism treatments to client heterogeneity: Project MATCH posttreatment drinking outcomes. Journal of Studies on Alcohol. 1997;58:7–29.
  24. Rounsaville BJ, Carroll KM, Onken LS. A stage model of behavioral therapies research: Getting started and moving on from Stage I. Clinical Psychology: Science and Practice. 2001;8:133–142.
  25. Santa Ana EJ, Martino S, Ball SA, Nich C, Frankforter TL, Carroll KM. What is usual about “treatment-as-usual”? Data from two multisite effectiveness trials. Journal of Substance Abuse Treatment. 2008;35:369–379. doi: 10.1016/j.jsat.2008.01.003.
  26. Simpson DD. A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment. 2002;22:171–182. doi: 10.1016/s0740-5472(02)00231-3.
  27. Southam-Gerow MA. Some reasons that mental health treatments are not technologies: Toward treatment development and adaptation outside labs. Clinical Psychology: Science and Practice. 2004;11:186–189.
  28. Weingardt KR. The role of instructional design and technology in the dissemination of empirically supported, manual-based therapies. Clinical Psychology: Science and Practice. 2004;11:313–331.
  29. Weingardt KR, Gifford EV. Expanding the vision of implementing effective practices: Commentaries on Carroll & Rounsaville. Addiction. 2007;102:863–869.
  30. Westerberg VS, Miller WR, Tonigan JS. Comparison of outcomes for clients in randomized versus open trials of treatment for alcohol use disorders. Journal of Studies on Alcohol. 2000;61:720–727. doi: 10.15288/jsa.2000.61.720.
