Author manuscript; available in PMC 2012 Mar 1.
Published in final edited form as: Clin Psychol (New York). 2011 Mar 1;18(1):36–40. doi: 10.1111/j.1468-2850.2010.01232.x

Using Treatment Integrity Methods to Study the Implementation Process

Bryce D McLeod 1, Nadia Y Islam 1

Abstract

The last decade has witnessed increased interest in the implementation and dissemination of evidence-based treatments (EBTs) for youth. Nakamura et al. (this issue) detail lessons learned over the past decade from the large-scale implementation of EBTs for children in Hawaii. This commentary discusses how lessons from Hawaii’s initiative can help inform the next generation of implementation research. Specifically, we focus on how treatment integrity models and methods designed to characterize core aspects of treatment delivery can be used to study the implementation process. Collecting treatment integrity data with the interactive online reporting systems developed by this research group offers researchers a way to determine how best to implement EBTs in community-based service settings with integrity and skill.

Keywords: Evidence-based treatment, dissemination, implementation, treatment integrity


In recent years, there has been increasing interest in transporting evidence-based treatments (EBTs) for youth emotional and behavioral problems to community service settings. Although advocates for EBTs have called for the dissemination and implementation of these treatments in service settings, others have stressed a need to better understand the implementation process before significant resources are dedicated to widespread dissemination (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Southam-Gerow et al., 2008). Over the past 10 years, Nakamura and members of the Evidence-Based Services (EBS) Committee within Hawaii’s Child and Adolescent Mental Health Division have pioneered efforts to review the youth EBT literature and design systems for widespread implementation. This work serves as an exemplar of how to achieve large-scale implementation of EBTs and how to conduct implementation research. A number of important lessons can be drawn from this effort to help maximize the yield of future implementation research.

A hallmark feature of the Hawaii initiative is the EBS Committee’s collaborative approach, which is built on interdisciplinary contribution and cooperation. Many different stakeholders make up the EBS, including mental health professionals, academics, and parents. This partnership has been maintained by a shared commitment to using empirical evidence to guide decisions regarding children’s mental health care services. An important product of this collaboration is an interactive online evidence-based services reporting application managed by the private corporation PracticeWise. This online system is designed to aid implementation efforts by allowing information about the treatment literature to be quickly disseminated to clinicians and, importantly, allowing clinicians to report on the types of therapeutic services they deliver. The end product is a powerful feedback system that can facilitate the exchange of information across a mental health service system.

An online system that allows the bidirectional flow of information represents an important tool for the various stakeholders interested in improving mental health services for youth and their families. It offers an innovative and time-efficient way to collect information about important aspects of treatment delivery. Currently, one part of the system allows direct-service providers to access summaries of the most current treatment outcome research evidence for specific youth emotional and behavioral problems. Another component of the system, called the clinical dashboard, enables direct-service providers to report on their use of specific practice elements. The clinical dashboard allows information about clinical practices and outcomes to be collected and evaluated across providers, clinics, and organizations. We believe an online tool like the clinical dashboard could be used to collect data about critical elements of treatment delivery such as dosage, structure (e.g., who attends treatment), integrity of delivery, and quality of service delivery. In our view, recording information about treatment delivery via an online reporting system could help policy makers, researchers, and service providers develop a better understanding of how to deliver EBTs in practice settings with integrity and skill.
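
To make this concrete, the sketch below shows, in Python, the kind of session-level record such a reporting system might collect and a simple aggregation across one provider's submitted sessions. All field and function names (e.g., dosage_minutes, adherence_rating, mean_adherence) are our own illustrative assumptions, not the actual PracticeWise clinical dashboard schema.

```python
from dataclasses import dataclass, field
from datetime import date
from statistics import fmean

@dataclass
class SessionReport:
    """One provider-submitted record of a treatment session.

    Hypothetical fields covering the aspects of delivery noted above:
    dosage, structure (who attends), integrity, and quality.
    """
    provider_id: str
    client_id: str
    session_date: date
    dosage_minutes: int                                  # session length
    attendees: list[str] = field(default_factory=list)   # e.g., ["youth", "caregiver"]
    practice_elements: list[str] = field(default_factory=list)
    adherence_rating: float = 0.0                        # e.g., 0-6 extensiveness of prescribed elements
    quality_rating: float = 0.0                          # e.g., 0-6 skillfulness of delivery

def mean_adherence(reports: list[SessionReport], provider_id: str) -> float:
    """Aggregate adherence across one provider's submitted sessions."""
    ratings = [r.adherence_rating for r in reports if r.provider_id == provider_id]
    return fmean(ratings) if ratings else float("nan")
```

A record of this sort could be rolled up by provider, clinic, or organization, which is what makes the dashboard useful as a feedback system.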

Increasingly, researchers are acknowledging the importance of studying the implementation process (Fixsen et al., 2005; Southam-Gerow et al., 2008). Initial enthusiasm for transporting EBTs to practice contexts has been tempered somewhat in recent years. The movement to transport EBTs to practice settings was, at least partly, based upon the assumption that doing so would improve outcomes for youth. However, simply transporting EBTs to practice settings may not improve outcomes. Two recent effectiveness trials reported that child-focused EBTs for youth anxiety and depressive disorders were not more effective than usual clinical care (see Southam-Gerow et al., 2010; Weisz et al., 2009). The good news that can be gleaned from these trials is that both EBTs and usual clinical care appear to produce remission rates comparable to those generated by EBTs in efficacy trials (over 65% in each group). However, the findings raise questions about why the EBTs did not outperform usual clinical care: Were the core components of each EBT delivered with integrity? Did usual care contain potent therapeutic interventions? These questions focus attention upon the implementation process. In order to progress, we believe it is important for investigators to ask, and answer, such questions.

Emerging methods for conducting rigorous psychotherapy process research within applied settings offer the means to answer these questions (Garland, Hurlburt, & Hawley, 2006). This approach is ideally suited to studying the integrity of implementation efforts. By using methodologies designed to assess treatment integrity in efficacy trials, implementation researchers can evaluate key aspects of treatment delivery. Treatment integrity comprises three components: treatment adherence, treatment differentiation, and therapist competence (Perepletchikova, Treat, & Kazdin, 2007; Waltz, Addis, Koerner, & Jacobson, 1993). Treatment adherence refers to the extent to which a therapist delivers the treatment as designed. Treatment differentiation refers to the extent to which a treatment under study differs from other treatments along lines defined by the treatment manual. Finally, therapist competence refers to the level of skill and degree of responsiveness demonstrated by a therapist when delivering the technical and relational elements of treatment. Each integrity component captures a unique aspect of treatment delivery that is hypothesized to be responsible for therapeutic change (Perepletchikova et al., 2007).
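
As a minimal illustration of how these three components map onto coded session data, consider the following Python sketch. The scoring rules are simplified assumptions for exposition (e.g., treating differentiation as the gap between prescribed and proscribed intervention use), not a published scoring algorithm.

```python
def _mean(values):
    """Average a collection of ratings; NaN when none are available."""
    values = list(values)
    return sum(values) / len(values) if values else float("nan")

def integrity_summary(item_scores: dict[str, float],
                      prescribed: set[str],
                      proscribed: set[str],
                      competence_ratings: list[float]) -> dict[str, float]:
    """Summarize the three integrity components for one coded session.

    item_scores maps intervention names to observer extensiveness
    ratings (e.g., 1-7). Simplified for illustration only.
    """
    # Adherence: how extensively the therapist used interventions
    # prescribed by the treatment manual.
    adherence = _mean(item_scores[i] for i in prescribed if i in item_scores)
    # Differentiation: prescribed interventions should dominate
    # interventions the manual proscribes.
    differentiation = adherence - _mean(
        item_scores[i] for i in proscribed if i in item_scores)
    # Competence: observer ratings of how skillfully and responsively
    # the interventions were delivered.
    return {"adherence": adherence,
            "differentiation": differentiation,
            "competence": _mean(competence_ratings)}
```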

We believe that implementation research would benefit from assessing the three components of treatment integrity, particularly treatment differentiation. Whereas treatment adherence assesses whether a therapist follows a particular therapeutic protocol (prescribed interventions), treatment differentiation evaluates whether (and toward which approaches) therapists deviate from that protocol (Kazdin, 1994). Treatment differentiation checks are particularly important in implementation research because: (a) direct-service clinicians working in practice settings often have diverse training backgrounds, which can increase the use of interventions from multiple theoretical orientations (e.g., Garland et al., 2010; McLeod & Weisz, 2010; Weisz et al., 2009); and (b) the usual care comparison conditions commonly used in effectiveness trials dictate that treatment differentiation measures must characterize a diverse array of therapeutic interventions (Garland et al., 2010; McLeod & Weisz, 2010). Thus, in order to capture more fully both prescribed and proscribed interventions, differentiation checks must assess a diverse array of therapeutic interventions (Waltz et al., 1993).

Until recently, the child psychotherapy field had not produced measures capable of assessing treatment differentiation in implementation research. Weersing and colleagues (2002) were the first to address this gap by developing the Therapy Procedures Checklist (TPC). The TPC was designed to assess therapists’ reports of the techniques they employ when working with child clients in usual clinical care. Items encompass the four major therapeutic models used with youths (psychodynamic, cognitive, behavioral, and family), and the measure has shown favorable psychometric properties (Baumann, Kolko, Collins, & Herschell, 2006; Weersing, Weisz, & Donenberg, 2002). The TPC has a number of significant strengths, including cost-effectiveness. However, its reliance on therapist self-report may reduce its ability to provide an objective account of actual therapist behavior (Chevron & Rounsaville, 1983; Lambert & Hill, 1994). Relying solely on therapist reports may, therefore, not provide a comprehensive description of the therapeutic techniques employed in a treatment session.

A new observational measure, based partly upon the TPC, was developed specifically to address the potential limitations of therapist report. The Therapy Process Observational Coding System for Child Psychotherapy Strategies scale (TPOCS-S; McLeod & Weisz, 2010) is an observational measure designed to characterize usual care for youth emotional and behavioral problems. The TPOCS-S is unique in that it: (a) yields quantitative data derived from direct observations of treatment sessions, (b) encompasses a range of theoretical approaches to therapy, and (c) assesses how extensively specific therapeutic interventions are employed. The TPOCS-S has demonstrated good interrater reliability, its five subscales (Behavioral, Cognitive, Psychodynamic, Client-Centered, Family) have shown good internal consistency, and analyses have supported its validity (McLeod & Weisz, 2010). The development of the TPOCS-S provides the field with the means to make child therapy as delivered in practice settings a focus of research.
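
To illustrate the kind of quantitative output an observational instrument like the TPOCS-S yields, here is a sketch that averages item-level extensiveness ratings into subscale scores and computes a crude coder-agreement statistic. The item names and the item-to-subscale mapping are hypothetical placeholders; the actual items, scoring procedures, and the intraclass correlations reported for the measure are specified in McLeod & Weisz (2010).

```python
from statistics import fmean, correlation  # correlation requires Python 3.10+

# Hypothetical item-to-subscale mapping; the real TPOCS-S item set is
# defined in McLeod & Weisz (2010) and is not reproduced here.
SUBSCALE_ITEMS = {
    "Behavioral": ["beh_1", "beh_2"],
    "Cognitive": ["cog_1", "cog_2"],
    "Psychodynamic": ["psy_1"],
    "Client-Centered": ["cc_1"],
    "Family": ["fam_1"],
}

def subscale_scores(item_ratings: dict[str, float]) -> dict[str, float]:
    """Average observer extensiveness ratings (e.g., 1-7) into the five
    subscale scores for one coded session; assumes every item was rated."""
    return {scale: fmean(item_ratings[i] for i in items)
            for scale, items in SUBSCALE_ITEMS.items()}

def coder_agreement(coder_a: dict[str, float],
                    coder_b: dict[str, float]) -> float:
    """A crude stand-in for interrater reliability: the correlation
    between two coders' ratings over shared items (published work
    typically reports intraclass correlations instead)."""
    shared = sorted(set(coder_a) & set(coder_b))
    return correlation([coder_a[i] for i in shared],
                       [coder_b[i] for i in shared])
```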

Treatment integrity checks, performed as part of the two effectiveness trials described above, illustrate the potential of using treatment integrity methods and measures to study the implementation process. Two of the three components of treatment integrity, treatment adherence and treatment differentiation, were assessed in the trials (Southam-Gerow et al., 2010; Weisz et al., 2009). One part of the integrity checks used the TPOCS-S to address two questions relevant to interpreting the study findings: (a) What interventions were used in usual care? and (b) Were the EBT and usual care conditions distinct? Across both effectiveness trials, a similar set of findings emerged. First, the usual care therapists used a wide range of interventions from multiple theoretical orientations, but generally favored non-behavioral approaches (e.g., client-centered interventions). Some usual care therapists did, however, use interventions found in the EBTs (e.g., cognitive-behavioral interventions). Accumulating evidence therefore suggests that usual care does contain evidence-based interventions (see also Garland et al., 2010; McLeod & Weisz, 2010). Second, the EBT and usual care conditions were distinct, as EBT sessions scored higher than usual care sessions on interventions prescribed by the manual (i.e., cognitive-behavioral interventions). Though the EBT and usual care conditions were distinct, not all youth in the EBT conditions received a full dose of the treatment protocols (see Southam-Gerow et al., 2010, for a discussion). Delivering EBTs in practice contexts appears to alter the content of EBTs, perhaps in ways that influence their effectiveness.
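
A differentiation check of this sort reduces, at its simplest, to a group comparison on session-level subscale scores. The sketch below shows the shape of such a test using scipy; the numbers passed in are toy placeholder values for illustration only, and the actual scores and analyses appear in Southam-Gerow et al. (2010) and Weisz et al. (2009).

```python
from scipy import stats

def differentiation_check(ebt_scores: list[float],
                          usual_care_scores: list[float]) -> None:
    """Test whether two conditions differ on a subscale measuring
    interventions prescribed by the manual (e.g., cognitive-behavioral)."""
    result = stats.ttest_ind(ebt_scores, usual_care_scores)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

# Toy placeholder values, not data from the trials cited above.
differentiation_check([4.2, 3.8, 4.5, 3.9, 4.1],
                      [1.9, 2.4, 1.6, 2.8, 2.1])
```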

Using the TPOCS-S to characterize the treatment provided in the effectiveness trials served to illustrate how therapy process models and methods can play a role in implementation research. Indeed, therapy process research provides researchers with the tools to document whether (and how) EBTs change when delivered in practice settings. Measures that capture critical aspects of treatment delivery, such as the TPOCS-S and the TPC, can therefore help researchers study the implementation process.

In order to realize the full potential of using treatment integrity methods to study the implementation process, the science and measurement of treatment integrity need to advance. Most existing integrity measures are designed to assess treatment adherence, and very few treatment differentiation or therapist competence measures exist (Perepletchikova et al., 2007). Moreover, both self- and observer-report treatment integrity measures are needed. Though the gold standard in integrity research is observational assessment (Hill, 1991; Hogue, Liddle, & Rowe, 1996; Mowbray, Holter, Teague, & Bybee, 2003), observational coding is time- and resource-intensive (Hill, 1991). Therapist-, client-, and caregiver-report measures could provide cost-effective alternatives for community stakeholders who do not have the resources to support observational coding (Fixsen et al., 2005; Mihalic, 2004; NIMH, 1999; Weersing et al., 2002). Addressing these gaps would produce a more diverse set of treatment integrity measures and generate more research opportunities, especially when combined with the advantages of using an online reporting system to collect treatment integrity data.

Using an online reporting system, such as the clinical dashboard, to collect treatment integrity data could help researchers study the implementation process. Online systems are ideal for implementation research because they are easy to use, deliver information in a timely manner, and can be adapted to meet the needs of different stakeholders. For example, data could be collected from direct-service providers prior to implementation and be used to characterize clinical practices and tailor implementation efforts. As another example, an online system could be used to support clinician training and/or monitor treatment integrity. Direct-service providers could digitally record and upload therapy sessions to a secure server and trained observers could code the sessions for various aspects of treatment delivery. Using an interactive online system to gather data about treatment delivery therefore has the potential to increase the yield of implementation research. Ultimately, this approach could help researchers identify ways to deliver EBTs in practice settings with optimal integrity and skill.
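
The sketch below illustrates, under our own assumptions, the upload-then-code workflow just described: providers register recorded sessions on a secure store, and trained observers attach integrity codes. Every class and method name here is hypothetical; this is not an API of the PracticeWise system.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class UploadedSession:
    session_id: str
    provider_id: str
    recording_uri: str                      # location on a secure server
    codes: dict[str, float] = field(default_factory=dict)

class IntegrityMonitor:
    """Tracks uploaded sessions and the observer codes attached to them."""

    def __init__(self) -> None:
        self._sessions: dict[str, UploadedSession] = {}

    def upload(self, provider_id: str, recording_uri: str) -> str:
        """A direct-service provider registers a recorded session."""
        sid = str(uuid.uuid4())
        self._sessions[sid] = UploadedSession(sid, provider_id, recording_uri)
        return sid

    def submit_codes(self, session_id: str, codes: dict[str, float]) -> None:
        """A trained observer submits ratings for a coded session."""
        self._sessions[session_id].codes.update(codes)

    def awaiting_coding(self) -> list[str]:
        """Session IDs uploaded but not yet observationally coded."""
        return [s.session_id for s in self._sessions.values() if not s.codes]
```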

In sum, Nakamura et al. are to be commended; their large-scale implementation of EBTs across the state of Hawaii has produced many innovations and provides a roadmap for future implementation efforts. This commentary serves as a call for researchers to build upon the work of the EBS and consider ways of merging models and methods from diverse fields to inform the next generation of implementation research.

Acknowledgments

Bryce D. McLeod was supported, during the writing of this commentary, by research grants from the National Institute of Mental Health (R01 MH086529) and the Virginia Commonwealth University Presidential Research Incentive Program.

References

1. Baumann BL, Kolko DJ, Collins K, Herschell AD. Understanding practitioners’ characteristics and perspectives prior to the dissemination of an evidence-based intervention. Child Abuse & Neglect. 2006;30:771–787. doi: 10.1016/j.chiabu.2006.01.002.
2. Chevron ES, Rounsaville BJ. Evaluating the clinical skills of psychotherapists: A comparison of techniques. Archives of General Psychiatry. 1983;40:1129–1132. doi: 10.1001/archpsyc.1983.01790090091014.
3. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005. FMHI Publication #231.
4. Garland AF, Brookman-Frazee L, Hurlburt MS, Accurso EC, Zoffness RJ, Haine-Schlagel R, Ganger W. Mental health care for children with disruptive behavior problems: A view inside therapists’ offices. Psychiatric Services. 2010;61:788–795. doi: 10.1176/appi.ps.61.8.788.
5. Garland AF, Hurlburt MS, Hawley KM. Examining psychotherapy processes in a services research context. Clinical Psychology: Science and Practice. 2006;13:30–46.
6. Hill CE. Almost everything you ever wanted to know about how to do process research on counseling and psychotherapy but didn’t know who to ask. In: Hill CE, Schneider LJ, editors. Research in counseling. Hillsdale, NJ: Lawrence Erlbaum Associates; 1991. pp. 85–118.
7. Hogue A, Liddle HA, Rowe C. Treatment adherence process research in family therapy: A rationale and some practical guidelines. Psychotherapy: Theory, Research, Practice, Training. 1996;33:332–345.
8. Kazdin AE. Methodology, design, and evaluation in psychotherapy research. In: Bergin AE, Garfield SL, editors. Handbook of psychotherapy and behavior change. 4th ed. Oxford, England: John Wiley & Sons; 1994. pp. 19–71.
9. Lambert MJ, Hill CE. Assessing psychotherapy outcomes and processes. In: Bergin AE, Garfield SL, editors. Handbook of psychotherapy and behavior change. 4th ed. Oxford, England: John Wiley & Sons; 1994. pp. 72–113.
10. McLeod BD, Weisz JR. The Therapy Process Observational Coding System for Child Psychotherapy Strategies scale. Journal of Clinical Child and Adolescent Psychology. 2010;39:436–443. doi: 10.1080/15374411003691750.
11. Mihalic S. The importance of implementation fidelity. Emotional and Behavioral Disorders in Youth. 2004;4:83–86, 99–105.
12. Mowbray CT, Holter MC, Teague GB, Bybee D. Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation. 2003;24:315–340.
13. National Institute of Mental Health. Bridging science and service: A report by the National Advisory Mental Health Council’s Clinical Treatment and Services Research Workgroup. Rockville, MD: 1999. NIH Publication No. 99-4353.
14. Perepletchikova F, Treat TA, Kazdin AE. Treatment integrity in psychotherapy research: Analysis of the studies and examination of the associated factors. Journal of Consulting and Clinical Psychology. 2007;75:829–841. doi: 10.1037/0022-006X.75.6.829.
15. Southam-Gerow MA, Marder AM, Austin AA. Dissemination of evidence-based manualized treatments for children and families in practice settings. In: Steele RG, Elkin TD, Roberts MC, editors. Handbook of evidence-based therapies for children and adolescents: Bridging science and practice. New York, NY: Springer Science + Business Media; 2008. pp. 447–469.
16. Southam-Gerow MA, Weisz JR, Chu BC, McLeod BD, Gordis EB, Connor-Smith JK. Does cognitive behavioral therapy for youth anxiety outperform usual care in community clinics? An initial effectiveness test. Journal of the American Academy of Child & Adolescent Psychiatry. 2010;49:1043–1052. doi: 10.1016/j.jaac.2010.06.009.
17. Waltz J, Addis ME, Koerner K, Jacobson NS. Testing the integrity of a psychotherapy protocol: Assessment of adherence and competence. Journal of Consulting and Clinical Psychology. 1993;61:620–630. doi: 10.1037//0022-006x.61.4.620.
18. Weersing RV, Weisz JR, Donenberg GR. Development of the Therapy Procedures Checklist: A therapist-report measure of technique use in child and adolescent treatment. Journal of Clinical Child Psychology. 2002;31:168–180. doi: 10.1207/S15374424JCCP3102_03.
19. Weisz JR, Southam-Gerow MA, Gordis EB, Connor-Smith JK, Chu BC, Langer DA, McLeod BD, Jensen-Doss A, Updegraff A, Weiss B. Cognitive behavioral therapy versus usual clinical care for youth depression: An initial test of transportability to community clinics and clinicians. Journal of Consulting and Clinical Psychology. 2009;77:383–396. doi: 10.1037/a0013877.
