Author manuscript; available in PMC: 2016 May 12.
Published in final edited form as: J Emot Behav Disord. 2010 Apr 16;19(3):182–192. doi: 10.1177/1063426610367793

Statewide CBT Training for Clinicians and Supervisors Treating Youth: The New York State Evidence Based Treatment Dissemination Center

Alissa A Gleacher 1, Erum Nadeem 1,2, Amanda J Moy 1, Andria L Whited 1,2, Anne Marie Albano 1, Marleen Radigan 3, Rui Wang 3, Janet Chassman 3, Britt Myrhol-Clarke 3, Kimberly Eaton Hoagwood 1,3
PMCID: PMC4865263  NIHMSID: NIHMS763282  PMID: 27182190

Abstract

In recent years, several states have undertaken efforts to disseminate evidence-based treatments to agencies and clinicians in their children's service system. In New York, the Evidence Based Treatment Dissemination Center adopted a unique translation-based training and consultation model in which an initial 3-day training was combined with a year of clinical consultation with specific clinician and supervisor elements. This model has been used by the New York State Office of Mental Health for the past 3 years to train 1,210 clinicians and supervisors statewide. This article describes the early adoption and initial implementation of a statewide training program in cognitive-behavioral therapy for youth. The training and consultation model and descriptive findings are presented; lessons learned are described. Future plans include a focus on sustainability and measurement feedback of youth outcomes to enhance the continuity of this program and the quality of the clinical services.

Keywords: evidence-based intervention, cognitive-behavioral therapy, translation-based training consultation model, clinical services, child mental health


At least a dozen states are actively responding to requests from federal leadership and national policy institutes to close the gap between research and practice by installing evidence-based services for youth and families in their systems (Bruns & Hoagwood, 2008; Drake et al., 2001). Although the range of strategies that states are pursuing to introduce these practices varies enormously from fiscal reorganization and application of outcomes monitoring systems to centralization of training and consultation functions (Bruns et al., 2008; Daleiden et al., 2006), all share the objective of improving the quality of mental health services within the public sector. In New York State (NYS), one focus of this effort has been on building an evidence-based treatment (EBT) training center within the Office of Mental Health (OMH) structure to facilitate the training in and use of EBTs for children and families. Six years ago, NYS embarked on a process of introducing EBTs to practitioners across the state by installing a central office for dissemination in the OMH structure. This article reviews the process by which the Evidence Based Treatment Dissemination Center (EBTDC), a quality improvement practice initiative in NYS, transitioned from a concept to an established functional infrastructure for the clinical workforce in licensed, New York State Office of Mental Health (NYSOMH) agencies. The focus of the EBTDC is to provide training and consultation in cognitive-behavioral therapy (CBT) for youth with internalizing and externalizing mental health problems. Although the initiative did not set out to follow Fixsen, Naoom, Blasé, Friedman, and Wallace's (2005) model of implementation, Fixsen et al.'s model provides a useful paradigm for framing the process NYSOMH followed and serves as a basis to better understand some of the successes and challenges in undertaking a project of this magnitude.

In an effort to provide effective services to a broad range of clients, increased attention has been given to the science of implementation and dissemination of innovative therapies into community settings. In a large-scale review, Fixsen et al. (2005) identified key stages of the implementation process (i.e., exploration and adoption, program installation, initial implementation, full operation, innovation, and sustainability), which are reflected in the development and formation of the dissemination model used by NYSOMH's EBTDC. To date, the primary efforts in NYS have focused on the development and installation of a program aimed at training clinicians across the state in CBT. Although the goal of the program is to assist clinics in using EBTs, in particular CBT treatment protocols within their practice setting, this article focuses on the implementation of the EBTDC itself. Therefore, this article describes (a) the EBTDC's development in relation to the adoption, program installation, and initial implementation stages of Fixsen et al.'s (2005) implementation model and (b) the training model that was developed to address the needs of NYS.

The Origins and Adoption of EBTDC

Fixsen et al. (2005) defined the “exploration and adoption” stage as the assessment of “the potential match between community needs, evidence-based practice and program needs, and community resources,” followed by “a decision to proceed (or not)” (p. 15). After September 11, 2001 (9/11), NYS faced the overwhelming challenge of delivering effective mental health treatment to a large proportion of its population whose mental health needs were unmet, many of which predated the terrorist attacks, while addressing a workforce ill prepared to respond to these demands (Hoven et al., 2005). Officials of NYSOMH, in partnership with Columbia University, decided that a provider training program in CBT for posttraumatic stress disorder (PTSD) was an efficient means to deliver mental health care that would potentially produce the greatest clinical impact. Through a successful partnership with leading childhood trauma experts, available evidence-based interventions were evaluated for treatment outcome research and ease of trainability; providers were trained, and services were furnished to 700 children and families (Hoagwood et al., 2007). Because of the success of this experience, NYS made the pivotal decision to establish a permanent infrastructure for training and dissemination of EBTs for children and youth across the state's five regions. In the first 2 years of the EBTDC program, training approaches used in the Child and Adolescent Trauma Treatment Services (CATS) Consortium were adopted (CATS Consortium, 2007, in press; Hoagwood et al., 2006). NYSOMH also decided that the training target would be reviewed biennially. The first 2 years focused on training clinicians and supervisors in CBT for PTSD (Cohen, Mannarino, & Deblinger, 2006) and depression (Stark, Curry, & Goldman, 2005), whereas the current training cycle focuses on individual CBT and parent training for disruptive behavior disorders (Lochman & Wells, 2004).

The selection of specific EBTs is guided by a Scientific Advisory Board selected by NYSOMH senior officials. It includes experts in the field of evidence-based practice and implementation, whose role is to review existing evidence on childhood EBTs in relation to topics suggested by OMH (e.g., anxiety disorders, conduct disorders). A key tool in this process is a survey in which current and past EBTDC clinician and supervisor participants are asked to nominate potential training targets and problem areas that they feel need attention. Identified areas of training interest are then matched against the empirical literature to pinpoint specific EBT protocols for future EBTDC training cycles. The Scientific Advisory Board then reviews the treatments for treatment outcome, ease of use, evidence to support use with diverse populations, cost of protocols and training, and ease of trainability. To date, no administrative data on diagnosis or service use patterns for those served in New York have been used in this process. The Scientific Advisory Board process was developed as the project transitioned into its 3rd year, when disruptive behavior disorders were identified by the community as an area of interest. Currently, the project is ending its 3rd year and beginning its 4th, and plans for the third cycle (i.e., the 5th and 6th years) are being developed.

With respect to the day-to-day clinical consultation, Columbia University and the target treatment developers have provided training and consultation through subcontracts with OMH. Within the Columbia structure, several consultants are assigned to conduct phone calls to clinician and supervisor participants. These phone calls provide participants with ongoing consultation and training, with the overall goal to assist participants in translating the CBT skills to their unique clients and clinic settings. To provide support, consistent structure, and the opportunity to further advance their knowledge of the treatments, the consultants meet with each other and the clinical director on a weekly basis. In addition, consultants meet with the treatment developers on a monthly basis to discuss the specific protocols and problem solve difficult issues and cases. In the initial year of the project, consultants were selected based on their previous involvement with the CATS Consortium and other projects. In subsequent years, a more formal process was implemented whereby the Columbia-based clinical director conducted extensive interviews and chose consultants based on their knowledge of manualized treatments, CBT, and experience in implementation of EBTs.

Establishment of the EBTDC Infrastructure and Training Program

Once the decision to adopt EBT training as a state initiative was made (including the decision to initially focus on depression and trauma), a process of program installation followed, as described by Fixsen et al. (2005). In 2005, NYSOMH established the EBTDC as well as its organizational structure. This included a structure for day-to-day program operations (e.g., identifying director and support staff), evaluation efforts (e.g., data collection databases, data reporting systems), and clinical support (i.e., establishing scientific advisory committees, creating structures for clinical consultation, establishing subcontracts with university partners). The organizational strengths of the EBTDC are that it is embedded in the OMH's central office and is connected via subcontracts with academic institutions, as well as with nationally recognized treatment developers and researchers in EBT development and deployment (see Figure 1). Drawing its funding from the NYSOMH budget, the EBTDC offers yearly training to 400 clinicians and supervisors with the express goal of increasing the quality of clinical care for youth and their families through access to EBTs. There is no cost to trainees or agencies, nor is there any monetary incentive to participate (e.g., enhanced billing rates).

Figure 1.


Evidence Based Treatment Dissemination Center (EBTDC) organizational structure and individual roles performed or designated within each core function

*Members of the EBTDC Steering Committee.

Because the EBTDC is part of the NYSOMH, state-licensed agencies have been motivated to participate in the program. State-licensed agencies are community mental health centers that have their own funding structures and that have applied for and received state licenses. Each year, clinics and staff were encouraged by multiple branches of the OMH structure (both central and field offices) to participate in the trainings. In addition to establishing program structures at the state level, expectations for participating clinics and clinicians were established during the program installation phase and continue to evolve. No financial incentives have ever been offered at the clinic level; however, at the clinician level, certificates can be earned by those who successfully complete the program. Although Fixsen et al. (2005) have suggested that there are ideal clinician and agency characteristics for participation (e.g., readiness to adopt EBTs), these characteristics were not examined before clinics were accepted for EBTDC participation because of time limitations and feasibility concerns. Acceptance into the program was based on the needs of the individual clinic populations, including availability of diagnostically and symptomatically appropriate cases. Thus, at the state level, attention was given to the establishment of critical structure (e.g., staffing, program support, evaluation infrastructure); however, individual clinics have been responsible for their own training support (e.g., phone, computer, Internet access), which likely led to variability in the local uptake and sustainability of EBTDC treatment models.

Translation-Based Training and Consultation Model (TBTCM)

Another component of the EBTDC structure was the development and installation of a training program that incorporated two components: a 3-day in-person training and biweekly follow-up phone consultation for up to 1 year. This model followed from the CATS Consortium, where there was an identified need for initial training as well as guided practice in the application of the techniques to unique clientele. The TBTCM aims to facilitate the participants’ ability to translate new skills to their practice setting using the training and consultation elements. The first component of the TBTCM is a one-time training workshop aimed at exploiting the benefits of brief trainings, including didactic instruction, demonstration (modeling), and behavioral rehearsal of the skills (Joyce & Showers, 2002). However, the TBTCM seeks to avoid the pitfalls of relying on a training workshop alone to produce lasting changes in clinicians’ behaviors and long-term use of innovative treatments (Bero et al., 1998; Bickman, 1999). In addition to training, coaching or follow-up consultation has been identified as an essential element in the installation of new practices (Ager & O'May, 2001; Kelly et al., 2000; see Joyce & Showers, 2002, for review) and therefore constitutes the second element of the TBTCM. Much of the research on this type of follow-up consultation has been conducted using in-person consultation or supervisory models. However, because of the scale of this project (statewide) and the limited number of consultants, a distance learning phone consultation model containing key elements essential to effective coaching, including supervision, assessment, and feedback (Spouse, 2001), was included. The model has evolved to incorporate separate clinician and supervisor consultation calls, which will be discussed in greater depth (see Figure 2).

Figure 2.


Translation-based training and consultation model (TBTCM) structure in disseminating evidence-based treatments

Three-Day Training Workshop

EBTDC training workshops followed the TBTCM and were conducted by the treatment developers or their PhD-level staff. The EBTDC year began with a 3-day training. The 1st day covered the theoretical foundations and principles underlying CBT, along with training in relevant assessment tools and strategies. The 2nd and 3rd days focused on in-depth training in the specific EBT protocols provided by the treatment developers. In the first 3 years, eight to nine 3-day trainings were conducted each year throughout the state, involving 40 to 50 participants in each of the trainings.

Distance Consultation

The second core component of the TBTCM is distance consultation via telephone provided by Columbia University PhD-level staff with experience in CBT, EBTs, and the specific protocols. The overall aims of the consultation calls are to help trainees effectively transport the treatment methods to complex community cases. These biweekly consultation calls include specific assistance in problem solving, treatment planning, and case conceptualization. Consultants also provide clinical trainees continued guidance on the use and incorporation of assessment measures, tools, and specific treatment techniques for trauma (e.g., exposure) and depression (e.g., behavioral activation and cognitive restructuring; Cohen et al., 2006; North et al., 2008; Stark et al., 2005). Clinical trainees are active in the learning process and are encouraged to generalize the skills across their caseload to sustain their use of the CBT protocols and skills after the end of the consultation year. This is facilitated by the last several consultation calls being aimed at generalization and sustainability.

The clinicians participating in the trainings during the 1st year attended biweekly phone consultation calls initiated at the end of the 3-day training. Over the course of the consultation call year, trainees were expected to attend 80% of calls in Year 1 and 75% of calls in Year 2. They were also expected to present three cases and describe their implementation of the treatments. Moreover, they were expected to see one case from initiation to termination using the CBT treatment.

Initially, consultation calls were staffed by four part-time PhD-level clinical consultants, with each consultant having an average of 12 participants per call (range = 7–18). In the 2nd year, five part-time PhD-level consultants held biweekly calls, which ranged from 5 to 13 participants, with an average of 8 participants per call. Although supervisors of participating clinicians were invited to participate on calls during Year 1, there were no supervisor-specific portions of the program. In Year 2, this was amended, because of feedback from trainees, to include separate consultation calls for supervisors. These were held on a monthly basis to increase general CBT skills, CBT supervision knowledge, and protocol-specific expertise. These calls were staffed by one PhD-level consultant and had a range of 8 to 16 supervisors, with an average of 11 participants on each call. The requirements for participation were the same for supervisors, with the exception of the case completion requirement.

Although each consultant has his or her own style of interaction, consultation calls follow the basic structure of agenda setting and brief check-in (e.g., taking attendance, setting an agenda), formal case presentations (clinicians present their cases in depth, for about 20 min, following a standardized case presentation form), brief case review (a brief discussion and problem solving of individual cases for all clinicians that takes about 1 to 2 min), and intervention and program issues (didactic instruction on the treatment interventions or discussion of issues). Thus, on each call, one to two formal presentations occur, and the clinicians who do not give a formal presentation briefly check in about their cases.

Consultant Supervision

The evolved TBTCM takes into account the findings that regular internal consultant meetings are important to treatment fidelity (Schoenwald, Brown, & Henggeler, 2000). Therefore, Columbia University–based consultation staff meet on a weekly basis to receive coaching from the EBTDC clinical director; they also attend monthly meetings with the treatment developers. These meetings typically consist of a discussion of common themes across consultation calls (e.g., increasing parent participation) and of difficult case issues that have arisen on the calls. Although fidelity to the protocols is discussed on clinician, supervision, and internal consultation calls, it is not formally assessed.

Initial Implementation of the EBTDC Training Program

With a general operational structure and approach to training and consultation in place, the EBTDC launched its initial implementation phase in 2006. In Fixsen et al.'s (2005) model, initial implementation involves changes in the overall practice environment and encompasses the complex process of changing skills, organizational structure, concerns among staff about change, and inherent difficulties in implementing a new practice. Once these initial challenges are successfully met, new program practices and learning can be integrated into staff practices and organizational policies and procedures. At the state level, initial implementation of EBTDC focused on the practical day-to-day operations of the program, as well as creating and maintaining the structures needed to implement trainings and consultation and to support clinics that were attempting to change clinical practice at the local level. To do this, a Core Steering Committee, composed of staff from OMH and Columbia University, met weekly to address ongoing issues. Specifically, the team focused on how to structure and provide trainings across the state, ensuring that consultation calls were progressing smoothly, making training slides for presentations, developing curriculum for Scientific Advisory Board–selected topics, clarifying project roles, and making staffing adjustments to meet project needs. As the project moved into Years 2 and 3, these structures and processes were refined.

In addition to these day-to-day operational issues, early implementation concerns centered on whether community demand would be sufficient to attract the planned 400 clinicians and supervisors per year and whether community staff would be willing to change. The program began with regional meetings across the state to provide an overview of the EBTDC and encourage participation. Interested clinics completed an application that restated the requirements of participation (e.g., attendance on calls, number of cases to be completed using EBTDC protocols) at the agency and individual levels. The requirements helped ensure that potential participants had sufficient agency support (e.g., phone and Web access, sufficient time) and appropriate cases to sustain their participation and use of the new skills.

Program Evaluation

With its concentration on initial feasibility and the large scale of this project, NYSOMH focused the majority of its data collection efforts on information about clinician and agency participation and acceptability. Initial concerns included whether there would be sufficient interest in the program to fill training target enrollments, whether clinicians would be able to fulfill their participation requirements, and whether the initiative would be continued beyond its 1st year. The data presented below describe the findings from the evaluation of this initial effort. Feasibility, acceptability, and completion rates are presented.

Participation

Retention

In the first 3 years of the project, interest and participation in the program were strong. The 1st year of EBTDC offered nine trainings across NYS for 427 individuals; the 2nd year consisted of eight trainings, with 360 individuals participating. The trainings for Year 1 and Year 2 were on the treatment of trauma and depression (see Note 1). In the 3rd year, the first EBTDC training on disruptive behavior disorder treatments exceeded the target of 400 participants with a total of 423 individuals. Moreover, 149 of the 3rd-year participants (35.2%) also attended training in the first iteration (trauma and depression treatment training) of EBTDC, suggesting that participants value the trainings and seek to continue developing their EBT skills through additional training (see Note 2). In the first 3 years, the EBTDC provided training to 1,210 clinicians and supervisors across the different training cycles (see Table 1 for a breakdown of participants). Participants in Years 1 and 2 were predominantly female (82.2%), White (74.1%) social workers (77.4%) practicing in community outpatient programs (82.2%); they had a mean age of 41 years.

Table 1.

Number of Training Participants by Year

Year 1 Year 2 Year 3a
Number of training events conducted 9 8 9
Number of clinicians attending training 333 271 312
Number of supervisors attending training 94 66 75
Number participating as supervisor and clinician N/A 23 36
Total number of attendees 427 360 423
Number of consultants 4 5 5
Average number of calls per consultant over 1 year 9.0 8.2 9.0
Range of number of calls per consultant 8–10 1–15 4–15
a. A total of 146 participants from Years 1 and 2 (trauma and depression training) took part in Year 3 (disruptive behavior disorder training).

Acceptability and feasibility

Another question of concern for both the OMH and Columbia University collaborators was whether individuals would participate over the course of the entire year and be able to successfully complete the program. To address this, several evaluation measures were collected as part of the training program. At the provider level, the primary measures of program success were consultation call attendance rates and certificate completion rates.

To earn completion certificates, clinicians were required to meet three criteria: complete the core modules of the treatment manuals with a current case, meet the consultation call attendance requirement (at least 80% in Year 1 or 75% in Year 2), and provide three case presentations during the year-long consultation calls. In response to clinician feedback, the attendance criteria were amended, and the duration of all consultation calls was reduced from 90 min in Year 1 to 60 min in Years 2 and 3. Additional Year 1 feedback identified a gap in programming that might affect sustainability. Consequently, in Year 2, the consultation call program was expanded to supervisors for monthly discussion of issues unique to them. For supervisors in Year 2, the attendance criteria were identical, with the exception that they were required to complete one presentation on a supervisory issue during the year. Participants who did not fulfill the completion criteria were divided into two categories: (a) dropouts, clinicians who notified program staff of their withdrawal with an indicated reason, as well as those who attended fewer than 50% of the calls with no explanation, and (b) noncompleters, participants who attended between 50% and 100% of calls but did not meet one or more of the completion criteria (e.g., attendance, case presentations, and case completion).

In Year 1, 92 (27.6%) of the original 333 participating clinicians dropped out of the project. Dropout rates in the 2nd year were similar, with 82 (29.4%) of the original 271 participating clinicians and 13 (17.0%) of the 75 supervisors (after role changes) dropping out of the project (see Table 2). Further analysis revealed that of those who dropped out of the program in Year 1, 21.7% cited program-specific reasons (e.g., time conflict, difficulty finding clients); one fourth cited program nonspecific reasons (e.g., maternity and sick leave); and the remaining half fell into the categories of low attendance and other or missing. In Year 2, of those who dropped out of the program, 14.6% cited program-specific reasons; 31.7% cited program nonspecific reasons; and the remainder fell into the categories of low attendance and other or missing. Although there are no normative data for attrition in a project of this scale, given the demands of participation in terms of time and clinician and agency resources, the attrition rate appears to be relatively low and speaks to the feasibility of the overall EBTDC model. Of the clinicians who did not drop out, 80.1% (n = 193) of Year 1 participants and 80.2% (n = 150) of Year 2 participants met the completion criteria for the program; about 20.0% failed to complete the program in both years.

Table 2.

Evidence Based Treatment Dissemination Center Consultation Call Completion, Dropout, and Attendance Results for Years 1 and 2

Year 1 Year 2
n % n %
Clinician completion ratea 193 80.1 150 80.2
Clinician dropout for clinician callsb 92 27.6 82 29.4
Non–program specificc (Y1 n = 92, Y2 n = 90)
    Left agency 19 20.7 20 24.4
    Maternity or medical leave 4 4.3 6 7.3
Program specific
    Low attendance 26 28.3 26 31.7
    Personal or time conflict 15 16.3 6 7.3
    Switched to E3 call N/A 4 4.9
    Agency dropout N/A 2 2.4
    Difficulty finding clients 5 5.4 N/A
    Phone problems 3 3.3 N/A
    Other or missing 20 21.7 18 22.0
    Mean clinician attendance rated 84.6 83.4
Supervisor completion ratee N/A 30 48.0
Supervisor dropout rate for supervisor callsc N/A 13 17.0
Non–program specific
    Left agency N/A 3 23.1
    Maternity or medical leave N/A 0 0.0
Program specific
    Personal or time conflict N/A 1 7.7
    Switched to E3 call N/A 1 7.7
    Agency dropout N/A 2 15.4
    No clients N/A N/A
    Phone problems N/A N/A
    Other or missing N/A 1 7.7
    Low attendance N/A 5 38.5
    Mean supervisor attendance rate N/A 70.0
a. Excludes dropouts; Y1 n = 241, Y2 n = 189.

b. Y1 n = 333, Y2 n = 306.

c. A survey of supervisors’ reasons for dropping out was not performed in Year 1, given that no formal supervisor program existed; Y2 n = 54.

d. Attendance rates refer to the average attendance across consultation call groups.

e. Supervisor completion and noncompletion rates were not measured in Year 1, given that no supervisor-specific program existed; Y2 n = 64.

Characteristics of consultants appeared related to clinicians’ attendance, dropout, and completion rates for Year 1. Attendance rates of clinicians were compared among the four consultants using a one-way analysis of variance (ANOVA), which revealed that the rate of attendance varied significantly among the consultants, F(3, 406) = 13.808, p < .001, such that one consultant had significantly lower clinician attendance rates than did the other three. Analyses also indicated differences in the dropout rates among the four consultants, F(3, 350) = 4.247, p < .006, such that one consultant had a significantly higher dropout rate than did two of the other consultants, p < .045 and p < .006. A significant difference was also found in terms of completion rates, F(3, 236) = 7.047, p < .001, such that one consultant had a significantly higher rate of completion among participating clinicians than did two of the other consultants, p < .001.
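The one-way ANOVA used above partitions variance in attendance into a between-consultant component and a within-group component; a large F statistic indicates that at least one consultant's group differs from the others. The following sketch illustrates the computation on hypothetical attendance rates (the article does not publish the underlying records, so the numbers, group sizes, and consultant labels here are illustrative only).

```python
# Sketch of a one-way ANOVA comparing clinician attendance rates
# across consultants. All data below are hypothetical; the actual
# attendance records are not reported in the article.

def one_way_anova(groups):
    """Return the F statistic and degrees of freedom for a one-way ANOVA.

    F = (between-group mean square) / (within-group mean square),
    with k - 1 and N - k degrees of freedom for k groups and
    N total observations.
    """
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total

    # Between-group sum of squares: each group mean's squared
    # deviation from the grand mean, weighted by group size.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: each observation's squared
    # deviation from its own group mean.
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)

    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within, (k - 1, n_total - k)

# Hypothetical attendance rates (%) for four consultants' call groups.
consultants = [
    [88, 91, 85, 90, 87],   # consultant A
    [84, 86, 89, 85, 88],   # consultant B
    [86, 90, 84, 88, 85],   # consultant C
    [70, 72, 68, 75, 71],   # consultant D (markedly lower, as in Year 1)
]

f_stat, (df1, df2) = one_way_anova(consultants)
print(f"F({df1}, {df2}) = {f_stat:.2f}")
```

In this toy data, consultant D's visibly lower attendance drives a large F, mirroring the Year 1 finding that one consultant's groups attended significantly less often than the other three.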

Given these results, in Year 2 greater safeguards were put in place to ensure the quality and consistency of the consultation calls. For example, a more formal method of selecting consultants evolved for Year 2. For Year 1, consultants were selected based on their past involvement with the CATS Consortium and previous Columbia University projects. In Year 2, the Columbia-based clinical director conducted extensive interviews and chose consultants based on their knowledge of manualized treatments, CBT, and experience in implementation of EBTs. Moreover, formal weekly meetings with all the consultants and the clinical director were implemented, as were monthly treatment developer calls. Results from ANOVA conducted on Year 2 data revealed no significant differences among consultants.

Satisfaction levels with the program

Through the use of self-report measures, clinicians from both Year 1 and Year 2 indicated that they were satisfied with the training program and the consultation calls. After Year 1, clinicians reported about their experience using a 5-point Likert-type scale ranging from 1 (very negative) to 5 (very positive; M = 4.1 for CBT overview, M = 4.2 for depression-specific training, M = 4.4 for trauma-focused training; North et al., 2008). Clinicians who participated in Year 2 reported similar levels of satisfaction using a 5-point Likert-type scale ranging from 1 (very negative) to 5 (very positive; M = 4.2 for CBT overview, M = 4.1 for depression-specific training, M = 4.7 for trauma-focused training).

Discussion

The installation of the EBTDC in NYSOMH provides a real-world example of the initial stages of Fixsen et al.'s (2005) model of implementation. The partnership between NYS and Columbia University has evolved, and our experience and initial findings suggest that statewide installation of a clinical quality improvement program is possible. Furthermore, the TBTCM may have promise as a means of disseminating EBTs. A critical step in the future will be to move from initial implementation to evaluation of the implementation (including implementation, treatment fidelity, and clinical outcomes) at the clinic, supervisor, clinician, and child levels.

Major Lessons Learned

Overall, the EBTDC demonstrates that large-scale, statewide dissemination is possible, but the process of program installation is complex, and periodic refinements are inherent to it. Several factors may have contributed to the EBTDC's success in training this number of clinicians in the CBT treatments and providing year-long consultation, including the culture that state support for the program creates within NYS. The EBTDC program appears acceptable to clinics and clinicians, as most participants successfully complete the program criteria. At this point, it is unclear what drives these rates, but state support and the training model may be contributing factors. The unique translation-based training approach, which combines an intensive workshop with phone consultation, may be beneficial to, or at least desired by, clinicians. The TBTCM provides an alternative to costly in-person consultation and a possible means of extending expert consultation to varied geographic locations. Although there is no standard by which participation and completion rates can be assessed, we now have some evidence that community clinicians are interested in taking part in such a training program and that clinic administrators are supportive, in that they allow their clinicians to participate. From a training perspective, we have also learned that it is possible to construct a program in which multiple consultants all follow a similar consultation call framework and model. However, the experiences of the EBTDC suggest that it is important to attend to the knowledge and experience of the consultants, such that they are well versed in the specific protocols, in the broader theoretical perspective of the treatment manuals (i.e., CBT), and in the implementation of EBTs in general.

Despite the overall success in establishing the TBTCM training model, many barriers to program installation and initial implementation were encountered. Some of the major barriers, as observed by consultants and trainers, involved the knowledge and skills of participating clinicians. For example, consultants and trainers observed that clinicians had difficulty using the assessment measures, finding appropriate cases, motivating and engaging families in treatment, incorporating CBT methods into existing skill sets, and maintaining clinician–agency engagement. These observations suggest that the success of the EBTDC program's installation varied across individual clinic settings. In some clinics, supervisors, administrators, and clinicians worked together to implement the different aspects of the training and CBT protocols; in other settings, there was far less support. These barriers illustrate the interplay between successful installation of the EBTDC at the state program level and the varying success of the installation and implementation of the actual EBT at the individual clinician level. As NYS continues to embrace EBT and commits itself to successful installation at multiple points (e.g., state and clinic), it will be important to continue examining the internal and external system supports that agencies can leverage to support the uptake of the CBT protocols, EBTs generally, and new skills.

Another critical element of EBTDC has been partnership, which in this case is a combination of a centralized office supporting overall infrastructure and day-to-day functioning, contractual arrangements with treatment developers for training and ongoing support with expert consultants, and a structured partnership with academic institutions to provide small group, intensive, distance consultation. Together, these partners have created a system that demonstrates high satisfaction at the clinician level. The level and depth of partnership, as well as the merging of the strengths and expertise of staff in each of these settings, have brought about the iterative refinements and quality improvement efforts that EBTDC has made thus far.

Several noteworthy alterations to the program were made in response to ongoing evaluation and feedback, much of it received informally by consultants and OMH program staff. For example, in Year 2, the EBTDC Core Steering Committee responded to participant and consultant feedback about a significant gap in the program (i.e., the lack of supervisor-specific consultation calls) that could potentially decrease clinic-level sustainability of the techniques over time. Consequently, the consultation call program was expanded to incorporate monthly supervisor consultation calls along with supervisor program completion criteria. These calls focus on issues unique to supervisors, with the express goal of fostering supervisors' conceptual understanding of CBT and their ability to supervise clinicians in these treatments. In the second training iteration, this program element was expanded to incorporate clinic administrators and program directors. Thus, the model evolved to take a broader view of the sustainability of EBTs within an organization and to recognize the importance of incorporating direct service providers, supervisors, and agency decision makers in the training program. Another alteration was that clinician and, subsequently, supervisor consultation calls were reduced from 90 min in Year 1 to 60 min in Years 2 and 3. Clinicians and supervisors felt that 90 min was both too long for a consultation and infeasible in an outpatient clinic setting, as it interfered with multiple client session time slots. Moving forward, EBTDC is expanding consultation calls to include quarterly special-topic discussions with treatment developers and other experts on issues such as parent engagement, developmental adaptations of treatments, advanced assessment techniques, and other topics identified through evaluation.

Future Directions

Unexamined factors may have contributed to variations in the acceptance of this initiative across agencies. Per consultant impression, clinician and agency commitment played an important role in clinician engagement; less engaged individuals or agencies had difficulty meeting completion criteria. Consultant factors also seemed to play a role in clinician participation; however, at this time it is unclear why these differences were observed. Nevertheless, efforts are under way, through microanalysis of the content and process of the consultation call procedures (Pimentel et al., 2010), to further analyze these differences. It is likely that clinician, consultant, and agency factors all play a role in clinician acceptance of the program. There is a clear need for further investigation of clinic organizational culture and structure, as well as the consultation process, to better understand dissemination.

Future investigations should examine issues of fidelity and skill attainment among participating clinicians. In the current iteration of EBTDC, it was not possible to monitor the fidelity or skill with which clinicians implemented the protocols; consultants had to rely on clinician report as the primary means of gauging how skillfully clinicians were applying these treatments with their clients. This aspect of the dissemination process needs major attention, as there is a strong need to determine appropriate and feasible methods for studying fidelity in community settings. In a similar vein, the sustainability of these treatments in these settings has not been investigated. NYSOMH plans to begin tracking the use of these treatments in participating EBTDC clinics. One grant in development seeks to conduct follow-up interviews and chart reviews with a subset of previous EBTDC clinician participants to examine whether they continue to use the treatments, and with what fidelity.

To this point, the EBTDC project has not had the resources (e.g., tracking program for child-serving clinics) to examine child and family outcomes. In the absence of ongoing outcome monitoring and feedback to clinicians, the impact of the training program on children and families is unknown. In the future, EBTDC seeks to monitor outcomes at the child and family levels to assess the impact of the program on symptomatology and functioning.

Overall, the EBTDC initiative represents an important first step in bringing an EBT training system to life within a state. Although there are many areas that need more rigorous tracking and assessment (e.g., child outcome monitoring, fidelity assessment), the training of 1,210 clinicians and supervisors and the provision of focused distance learning signify a move toward bridging the gap between science and practice. These initial lessons lay the groundwork for future years of EBTDC in both New York and the wider community of service providers with the goal of improving the quality of services through dissemination of effective therapies.

Acknowledgments

We would like to acknowledge the hard work and contribution of all the individuals at Columbia University and New York State Office of Mental Health who have been dedicated to the mission of the Evidence Based Treatment Dissemination Center since its inception. We would also like to acknowledge the dedication of past and present clinicians and supervisors.

Funding

The authors received no financial support for the research and/or authorship of this article.

Biography

Alissa A. Gleacher, PhD, is an instructor of psychology in the Department of Child Psychiatry at Columbia University. Her current research interests include evidence-based treatment development, dissemination, and training models.

Erum Nadeem, PhD, is a research scientist in the Division of Mental Health Services and Policy Research at the New York State Psychiatric Institute at Columbia University. She currently conducts research on evidence-based treatments for children in school and community settings.

Amanda J. Moy, BA, currently conducts research in the Department of Child and Adolescent Psychiatry at Columbia University.

Andria L. Whited, MSW, is the project manager for two quality improvement initiatives for the New York State Office of Mental Health at Columbia University's Research Foundation for Mental Hygiene at the New York State Psychiatric Institute.

Anne Marie Albano, PhD, ABPP, is an associate professor of clinical psychology in psychiatry at Columbia University. Her interests involve the development and dissemination of evidence-based treatments for youth with anxiety and mood disorders.

Marleen Radigan, MS, MPH, DrPH, is an assistant professor of clinical neurobiology in psychiatry at Columbia University and the deputy director for the Youth Services Research Bureau, Office of Performance Measurement and Evaluation at the New York State Office of Mental Health.

Rui Wang, MS, is a research scientist in the Youth Services Research Bureau, Office of Performance Measurement and Evaluation at the New York State Office of Mental Health.

Janet Chassman, MBA, is the director of the Evidence Based Treatment Dissemination Center, Division of Children and Family Services at the New York State Office of Mental Health.

Britt Myrhol-Clarke, MA, is a program specialist in the Division of Children and Family Services at the New York State Office of Mental Health. She has coordinated and monitored implementation of evidence-based mental health treatments for both children and adults.

Kimberly Eaton Hoagwood, PhD, is a professor of clinical psychology in psychiatry at Columbia University and the director of the Youth Services Research Bureau at the New York State Office of Mental Health. She directs numerous policy and systems research projects on implementation of effective practices for children and families and parent activation in children's mental health.

Footnotes

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the authorship and/or publication of this article.

1

Please note that 30 of those participants in Year 2 received the training as part of their participation in an offshoot project with different participation requirements and, as such, were not included in further statistical analysis.

2

The figure of 1,210 includes individuals who may have participated in different capacities (i.e., supervisor or clinician) across the different training years and different training programs (i.e., trauma or depression and disruptive behavior disorders).

References

  1. Ager A, O'May F. Issues in the definition and implementation of “best practice” for staff delivery of interventions for challenging behaviour. Journal of Intellectual & Developmental Disability. 2001;26(3):243–256. [Google Scholar]
  2. Bero LA, Grilli R, Grimshaw J, Harvey E, Oxman A, Thomson M. Closing the gap between research and practice: An overview of systematic reviews of interventions to promote the implementation of research findings. British Medical Journal. 1998;317:465–468. doi: 10.1136/bmj.317.7156.465. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Bickman L. Practice makes perfect and other myths about mental health service. American Psychologist. 1999;54(11):958–973. [Google Scholar]
  4. Bruns EJ, Hoagwood KE. State implementation of evidence-based practice for youths, part I: Responses to the state of the evidence. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47(4):369–373. doi: 10.1097/CHI.0b013e31816485f4. [DOI] [PubMed] [Google Scholar]
  5. Bruns EJ, Hoagwood KE, Rivard JC, Wotring J, Marsenich L, Carter B. State implementation of evidence-based practice for youths, part II: Recommendations for research and policy. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47(5):499–504. doi: 10.1097/CHI.0b013e3181684557. [DOI] [PubMed] [Google Scholar]
  6. CATS Consortium Implementing CBT for traumatized youth after September 11th: Lessons from the Child and Adolescent Trauma Treatments and Services (CATS) Project. Journal of Clinical Child and Adolescent Psychology. 2007;36(4):581–592. doi: 10.1080/15374410701662725. [DOI] [PubMed] [Google Scholar]
  7. CATS Consortium Outcomes of CBT for severely impacted children affected by the World Trade Center disaster. Journal of the American Academy of Child and Adolescent Psychiatry. In press. [Google Scholar]
  8. Cohen JA, Mannarino AP, Deblinger E. Treating trauma and traumatic grief in children and adolescents. Guilford; New York, NY: 2006. [Google Scholar]
  9. Daleiden EL, Chorpita BF, Donkervoet C, Arensdorf AM, Brogan M, Hamilton JD. Getting better at getting them better: Health outcomes and evidence-based practice within a system of care. Journal of the American Academy of Child and Adolescent Psychiatry. 2006;45(6):749–756. doi: 10.1097/01.chi.0000215154.07142.63. [DOI] [PubMed] [Google Scholar]
  10. Drake RE, Goldman HH, Leff HS, Lehman AF, Dixon L, Mueser KT, Torrey WC. Implementing evidence-based practices in routine mental health service settings. Psychiatric Services. 2001;52(2):179–182. doi: 10.1176/appi.ps.52.2.179. [DOI] [PubMed] [Google Scholar]
  11. Fixsen DL, Naoom SF, Blasé KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature (FMHI 231) University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network; Tampa: 2005. [Google Scholar]
  12. Hoagwood KE, Radigan M, Rodriguez J, Levitt JM, Fernandez D, Foster J. Final report on the child and adolescent trauma treatment and services (CATS) project for the substance abuse and mental health services administration (SAMHSA) Office of Mental Health; New York: 2006. [Google Scholar]
  13. Hoagwood KE, Vogel JM, Levitt JM, D'Amico PJ, Paisner WI, Kaplan SJ. Implementing an evidence-based trauma treatment in a state system after September 11: The CATS project. Journal of the American Academy of Child and Adolescent Psychiatry. 2007;46(6):773–779. doi: 10.1097/chi.0b013e3180413def. [DOI] [PubMed] [Google Scholar]
  14. Hoven CW, Duarte CS, Lucas CP, Wu P, Mandell DJ, Goodwin RD, et al. Psychopathology among New York City public school children 6 months after September 11. Archives of General Psychiatry. 2005;62:545–552. doi: 10.1001/archpsyc.62.5.545. [DOI] [PubMed] [Google Scholar]
  15. Joyce B, Showers B. Student achievement through staff development. 3rd ed. Association for Supervision and Curriculum Development; Alexandria, VA: 2002. [Google Scholar]
  16. Kelly JA, Somlai AM, DiFranceisco WJ, Otto-Salaj LL, McAuliffe TL, Hackl KL, Rompa D. Bridging the gap between the science and service of HIV prevention: Transferring effective research-based HIV prevention interventions to community AIDS service providers. American Journal of Public Health. 2000;90(7):1082–1088. doi: 10.2105/ajph.90.7.1082. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Lochman JE, Wells KC. The Coping Power program for preadolescent aggressive boys and their parents: Outcome effects at the one-year follow-up. Journal of Consulting and Clinical Psychology. 2004;72(4):571–578. doi: 10.1037/0022-006X.72.4.571. [DOI] [PubMed] [Google Scholar]
  18. North MS, Gleacher AA, Radigan M, Greene L, Levitt JM, Chassman J, Hoagwood K. The Evidence-Based Treatment Dissemination Center (EBTDC): Bridging the research-practice gap in New York State. Emotional and Behavioral Disorders in Youth. 2008;8(1):9–16. [Google Scholar]
  19. Pimentel SS, Regan J, Comer J, Hoagwood K, Albano AM, Chassman J, Wang R. Disseminating evidence-based treatments for children: A microanalysis of consultation calls as an ongoing training strategy. 2010. Unpublished manuscript.
  20. Schoenwald SK, Brown TL, Henggeler SW. Inside multisystemic therapy: Therapy, supervisory, and program practices. Journal of Emotional and Behavior Disorders. 2000;8(2):113–127. [Google Scholar]
  21. Spouse J. Bridging theory and practice in the supervisory relationship: A sociocultural perspective. Journal of Advanced Nursing. 2001;33(4):512–522. doi: 10.1046/j.1365-2648.2001.01683.x. [DOI] [PubMed] [Google Scholar]
  22. Stark K, Curry J, Goldman E. EBTDC depressive symptoms intervention manual. Integrated Psychotherapy Consortium; New York, NY: 2005. Unpublished manuscript. [Google Scholar]
