Author manuscript; available in PMC: 2013 Sep 1.
Published in final edited form as: Adm Policy Ment Health. 2013 Jul;40(4):274–285. doi: 10.1007/s10488-012-0418-8

Clinicians’ Perspectives on Cognitive Therapy in Community Mental Health Settings: Implications for Training and Implementation

Shannon Wiltsey Stirman, Ana Gutiérrez-Colina, Katherine Toder, Gregory Esposito, Frances Barg, Frank Castro, Aaron T. Beck, Paul Crits-Christoph
PMCID: PMC3434254  NIHMSID: NIHMS378540  PMID: 22426739

Abstract

Policymakers are investing significant resources in large-scale training and implementation programs for evidence-based psychological treatments (EBPTs) in public mental health systems. However, relatively little research has been conducted to understand factors that may influence the success of efforts to implement EBPTs for adult consumers of mental health services. In a formative investigation during the development of a program to implement cognitive therapy (CT) in a community mental health system, we surveyed and interviewed clinicians and clinical administrators to identify potential influences on CT implementation within their agencies. Four primary themes were identified. Two related to attitudes towards CT: (1) ability to address client needs and issues that are perceived as most central to their presenting problems, and (2) reluctance to fully implement CT. Two themes were relevant to context: (1) agency-level barriers, specifically workload and productivity concerns and reactions to change, and (2) agency-level facilitators, specifically, treatment planning requirements and openness to training. These findings provide information that can be used to develop strategies to facilitate the implementation of CT interventions for clients being treated in public-sector settings.

Keywords: Evidence-based psychosocial treatments, Cognitive therapy, Dissemination and implementation

Introduction

Cognitive therapy (CT) is among the most extensively researched psychological treatments of psychiatric disorders. Empirical support for CT has been established for a variety of mental health disorders, including depression, anxiety disorders, substance use disorders, and, when combined with standard care, schizophrenia (Beck 2005; Butler et al. 2006; Grant et al. 2011; Sensky et al. 2000). Its established effectiveness in routine care settings, cost-effectiveness, and benefits over and above usual care have also been demonstrated (Grant et al. 2011; Malik et al. 2009; Vos et al. 2005). Yet, like many other evidence-based psychological treatments (EBPTs; McHugh and Barlow 2010), CT is implemented in relatively few clinical practices. To improve consumer access to EBPTs such as CT, policymakers at federal, state, and local levels are devoting significant resources to training and implementation in public sector treatment settings (McHugh and Barlow 2010). Treatment developers and other experts in EBPTs are increasingly offering workshops, presentations, or consultation at local agencies to prepare clinicians to deliver these interventions to their consumers. However, without a clear understanding of both the context into which treatments such as CT are introduced, and the types of organizational enhancements that may be necessary for successful implementation, training efforts and implementation programs may fail to achieve their goals (Mendel et al. 2008; Stirman et al. 2010).

Prior to introducing or implementing new treatments into organizations and systems, it is critical to capture in-depth information on potential influences on success at multiple levels, including but not limited to the provider, organization, and mental health system levels (Aarons et al. 2011; Mendel et al. 2008; Raghavan et al. 2008; Stetler et al. 2006a, b). Qualitative research with practice-level stakeholders can complement the breadth of understanding afforded by quantitative data with a more fine-grained understanding of specific concerns that may need to be addressed to facilitate successful implementation (Palinkas et al. 2010). While this type of formative evaluation strategy has been widely used in the development of public health initiatives and implementation programs in other areas of healthcare (Gittelsohn et al. 2006; Stetler et al. 2006a, b), there are fewer examples of its application to the development of implementation programs in mental health.

Successful implementation has been conceptualized as a function of facilitation, evidence, and context (Helfrich et al. 2010; Stetler et al. 2011). Facilitation is the process of helping individuals and teams understand what they need to change and how to change it (Stetler et al. 2011). Prior to implementation, assessment of evidence and context can inform the development of tailored facilitation strategies (Stetler et al. 2006a, b) to optimize implementation. Evidence in this model is defined more broadly than empirical evidence to include local data, clinician perceptions, and client needs and preferences. Perceptions of EBPTs and their fit with a particular clientele or setting can influence the decision to adopt a new treatment as well as decision processes regarding actual treatment delivery (Aarons et al. 2010). The effectiveness of an EBPT may be compromised if it is poorly implemented at a local level. Resulting negative appraisals of the treatment’s effectiveness could ultimately lead to decisions to discontinue the treatment at the clinician or organization level (Aarons and Palinkas 2007; Becker and Stirman 2011). An assessment of perceptions of CT prior to implementation can inform decisions about fit and can also shape training, consultation, and facilitation strategies that target attitudes that may be incompatible with CT implementation.

Context is defined as readiness for targeted implementation at the organization, program, or team level, and includes consideration of factors such as availability of necessary resources, leadership support, and an organization’s climate and culture (Stetler et al. 2011). Organizational climate is the shared perception of the psychological impact of the work environment (e.g., stress) on the well-being of members of the organization (Glisson et al. 2008). Organizational culture describes how work is done in the organization through measurement of the behavioral expectations reported by members of the organization (Glisson et al. 2008). Recent research in child social service settings indicates that organizational climate predicts staff turnover, and that organizational culture predicts the sustainability of new programs (Glisson et al. 2008). Agencies with the worst climate and culture profiles demonstrated the poorest outcomes in these areas. This finding may be due to a variety of factors captured under these constructs, including organizational policies, lack of leadership support, high workloads, burnout, and resistance to change. There is also evidence that organizational climate may impact attitudes towards EBPTs. Burnout and lack of collegial support have been associated with less positive attitudes about behavior therapy among staff who work with individuals with severe mental illness on inpatient units (Corrigan et al. 1992, 1998). Child and adolescent mental health service providers in more constructive cultures endorse more positive attitudes toward adoption of evidence-based practices, while providers in poor organizational climates are more likely to perceive divergence between usual practice and these treatments (Aarons and Sawitzky 2006).

While research has been conducted to assess these constructs in child mental health services (Glisson et al. 2008; Jensen-Doss et al. 2009; Schoenwald et al. 2008), substance use disorder treatment settings (Lehman et al. 2002), and inpatient settings (Corrigan et al. 1992), less is known about perceptions of EBPTs and potential barriers to their implementation in organizations that offer outpatient mental health services to adults. Previous research in the adult area has examined multidisciplinary psychiatric hospital staff perceptions of behavior therapy or psychiatric rehabilitation programs (Corrigan et al. 1992, 1996; Corry and Jewell 2002). Research on training in EBPTs in adult mental health settings has investigated the impact of clinician characteristics such as prior experience with the treatment or educational background on skill acquisition (Crits-Christoph et al. 1998; James et al. 2001). Surveys of clinician perceptions have typically focused on attitudes towards unspecified evidence-based practices (Aarons et al. 2010; Aarons 2004) or new or manualized treatments in general rather than on specific treatments (cf. Borntrager et al. 2009; Cook et al. 2009; Gray et al. 2007). Most studies of this nature have not examined the role of context, and many have assessed the attitudes of providers in private practice or from a variety of service settings (Cook et al. 2009; Gray et al. 2007; Stewart et al. in press). The nature of the population served, setting characteristics, differences in workforce characteristics, and the discussion of a particular EBPT may impact the salience of specific influences on CT implementation in community mental health agencies.

In 2007, a community-academic partnership was formed to provide training in CT to providers within an urban mental health system. This program is a facet of a broader initiative to increase the use of evidence-based treatments within the system. Given CT’s strong evidence base for a variety of disorders and comorbid conditions that are common among the system’s consumers, policymakers believed that it would be efficient and beneficial to train providers in general CT theory and its applications to specific disorders. Given the dearth of pre-implementation research specific to CT or to outpatient community mental health services for adult consumers, we employed formative research strategies (1) to understand community mental health clinicians’ perspectives regarding CT and potential barriers or facilitators to implementation, and (2) to understand contextual factors at the organization and system levels that might ultimately impact implementation.

Method

Setting and Participants

This study was conducted as part of a needs assessment within the network of providers in a large urban mental health system. Within the system, virtually all clients were enrolled in a publicly-funded behavioral health program. Ninety-five clinicians and supervisors who worked with adult mental health service consumers at 12 agencies completed surveys of organizational social context and attitudes towards evidence-based practices between 2007 and 2008. Programs sampled included outpatient mental health programs and outpatient programs that treated individuals with comorbid mental health and substance use disorders. Participants at the first eight agencies were also given the option of participating in interviews. A subset of 26 clinicians and 5 supervisors (36 %) agreed to participate in an interview; an additional 8 therapists initially agreed to participate but could not be scheduled after multiple attempts.

Procedures

The study was approved by university and city Institutional Review Boards. Enrolled clinicians provided informed consent and completed a package of surveys that included a brief survey of demographics, prior training in CT, theoretical orientation, and caseload characteristics, as well as free-response questions regarding interest in, and barriers to, training in CT. To assess organizational climate and culture, we administered the Organizational Social Context Survey (OSC; Glisson et al. 2008).

The OSC assesses organizational culture on three dimensions: rigidity (the degree of discretion, flexibility, and input on decisions that is afforded to service providers), proficiency (expectations that service providers will be competent and put the well-being of their clients first), and resistance (the level of interest in new ways of providing service or receptivity to change). Glisson and colleagues (2008) described criteria for “worst” cultures as proficiency subscale scores two standard deviations below the rigidity and resistance subscale scores and criteria for “best” cultures as proficiency scores two standard deviations above the rigidity and resistance subscale scores. The OSC assesses climate on three dimensions: engagement (extent to which service providers remain personally involved and feel able to accomplish worthwhile things in their work), functionality (provider perception that they receive support and help from coworkers and administrators, and that they understand how they can work successfully within the organization), and stress (perception of emotional exhaustion and inability to get necessary things done). According to criteria established by Glisson and colleagues, “worst” climates are characterized by engagement and functionality subscale scores that are at least two standard deviations below the stress score, and “best” climates are characterized by engagement and functionality scores that are two standard deviations above stress scores (Glisson et al. 2008).
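These designation rules can be stated compactly. The formalization below uses our own notation and is offered only as an illustrative restatement of the criteria described by Glisson and colleagues (2008), with agency-level subscale scores denoted P (proficiency), R_rig (rigidity), R_res (resistance), E (engagement), F (functionality), and S (stress), and SD denoting the relevant normative standard deviation:

\[ \text{``worst'' culture: } P \le R_{rig} - 2\,SD \text{ and } P \le R_{res} - 2\,SD; \qquad \text{``best'' culture: } P \ge R_{rig} + 2\,SD \text{ and } P \ge R_{res} + 2\,SD \]

\[ \text{``worst'' climate: } E \le S - 2\,SD \text{ and } F \le S - 2\,SD; \qquad \text{``best'' climate: } E \ge S + 2\,SD \text{ and } F \ge S + 2\,SD \]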

Participants who agreed to be interviewed were scheduled for interviews at times that were convenient for them and that would not interfere with their clinical responsibilities. Semi-structured interviews were conducted over the telephone and were digitally recorded. An interview guide included scripted questions that were open-ended, and additional questions were asked to clarify the therapists’ opinions. The interview included questions about their experiences in working within the setting (e.g., What is it like to work as a clinician here? What sorts of things happen here that help you do your job? Is there anything that makes it difficult?); their perceptions of barriers and facilitators to new practices at the organization and system level (e.g., How do people here react when new practices or treatments are introduced? What types of things happen here that could make it more likely that people would use a new treatment? More difficult?); their training needs (e.g., What are the most challenging client-related issues that you encounter in your work? What type of trainings do you think would be most useful to you?); and their perceptions of CT (e.g., What are your impressions of CT? Would you be interested in seeing your agency participate in training in CT?). Participants were given the opportunity to elaborate on issues that they considered particularly relevant or about which they expressed strong sentiments. Four of the interviews were conducted in Spanish and translated/transcribed by a native Spanish speaker to eliminate potential language barriers and to capture the attitudes and opinions of a more diverse group of clinicians. Interviews typically lasted 45 min to 1 h, and clinicians received $20 for their participation. Transcripts were reviewed and checked for accuracy by at least one of the authors.

Interview transcripts and survey free responses were analyzed in a multi-step process with guidance from an investigator with experience in qualitative research (FB). The coding methodology was rooted in grounded theory (Glaser and Strauss 1967) and followed procedures outlined by Palinkas et al. (2008). The first and third authors first coded the interviews to condense the data into analyzable units. Segments of text ranging from a phrase to several paragraphs were assigned codes on the basis of a priori themes (that is, those from the interview guide) or by identifying emergent themes through open coding. Codes were also assigned to describe connections among categories and between categories and subcategories (axial coding). The final codebook consisted of 20 codes, which included a list of themes, issues, and opinions associated with the implementation of CT. Using the codebook and QSR NVivo 8, text segments were grouped into separate categories or nodes. Two raters (the second and fourth authors) participated in the development of the codebook and were trained by the first author until they agreed on 90 % of the codes assigned. They then coded all transcripts independently, meeting regularly with the first author to discuss progress and resolve any coding discrepancies. Through constant comparison of the categories with one another, the categories were further condensed into broad themes. Rater agreement for each code was calculated using Cohen’s kappa coefficient (Cohen 1968). Some variation was present across codes, but agreement ranged from substantial to perfect (κ = 0.76–1.0; Landis and Koch 1977).
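As a point of reference, Cohen’s kappa corrects raw percent agreement for the agreement expected by chance. In its basic unweighted form,

\[ \kappa = \frac{p_o - p_e}{1 - p_e}, \]

where \(p_o\) is the observed proportion of coding decisions on which the two raters agreed and \(p_e\) is the proportion of agreement expected by chance given each rater’s marginal code frequencies; the weighted variant introduced by Cohen (1968) additionally weights disagreements by their severity.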

Analyses of the surveys and interview data were first conducted independently. To generate broader a priori and emergent themes, and to understand links between findings from the two data sources, the qualitative data from the interviews and descriptive data from the survey were compared and integrated.

Results

In the full sample (N = 95), 56 % of the clinicians were paid on a fee-for-service basis, and 44 % were salaried employees at the agencies. The average age of the clinicians was 42 (SD = 11). Seventy-one percent were female, 71 % were White, 18 % were Latino/Hispanic, and 11 % were African American. Sixty-five percent had Master’s degrees, 14 % had Bachelor’s degrees, 9 % had doctoral-level degrees, 4 % were certified addictions counselors, and the remainder endorsed “other” educational backgrounds or did not provide information on their educational background. Sixty-four percent had 5 or more years of clinical experience. The interview and survey-only samples did not differ in terms of demographic variables or education level. All therapists endorsed some prior exposure to CT (either a workshop or coursework) and reported use of “elements” of CT in their practice, but the majority had no formal training or supervision in CT.

The four agencies at which clinicians were not interviewed did not have a sufficient sample size to allow for agency-level aggregation of OSC scores. Interrater agreement indices justified aggregation of all OSC scales at the remaining eight agencies (rwg = 0.71–0.91; Costa et al. 2001), with the exception of the stress subscale for one agency (Agency C). As shown in Table 1, OSC culture scores for four of the agencies (Agencies C, D, E, and F) met criteria for designation as “worst” cultures (Glisson et al. 2008). No agencies met criteria for “best” cultures. Two agencies had scores on all subscales that did not meet criteria for “best” or “worst” cultures (A and G). Two agencies met criteria for “worst” climates (D and F) and one met criteria for “best” climate (B).
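As context for this aggregation decision, the \(r_{wg}\) index compares the observed within-agency variance of clinicians’ ratings to the variance expected if responses were random. In its simplest single-item form,

\[ r_{wg} = 1 - \frac{s_x^2}{\sigma_{EU}^2}, \qquad \sigma_{EU}^2 = \frac{A^2 - 1}{12}, \]

where \(s_x^2\) is the observed within-agency variance, \(\sigma_{EU}^2\) is the variance of a uniform (no-agreement) distribution over an A-point response scale, and values approaching 1 indicate strong within-agency agreement; multi-item scale scores use an analogous form, and the specific variant applied to the OSC subscales is not detailed here.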

Table 1. Themes

Theme | Worst climate/culture | Other climates/cultures
Perceptions of cognitive therapy
 Fit with client A lot of people aren’t coming here for the long term, or to fix everything in their lives, they just want to stop hurting. And then they have all this other junk going on in their lives too, they’re mostly poor people, um, you know, and so…personal development isn’t number one on their needs list. I don’t think [CT] works for everybody, but I think if you had to pick one kind of therapy that would work with most of the people who come here that would probably be my first choice. (Agency E; Worst Culture)
Sometimes I wonder if CT is adequate for issues of trauma and things like that. And all of our clients have trauma issues, CT is really focused on the here and now, so it’s great for stabilizing a person that way. I haven’t seen CT as a helpful tool yet to go back to the old stuff…and shame. (Agency D; Worst Climate and Culture)
You know with anything you have to start with the client, and there’s some clients that are not going to be able to participate in cognitive therapy… We work with sometimes people who are really illiterate. People that are—not that I’m saying they’re stupid…this is not anything with their intellect; this is about their tools. And also we have some people who are so, so ill that, they’re so involved in just maintaining that they’re not able to engage in a cognitive treatment. (Agency A)
I think it doesn’t always get to the heart of the issue. And I think it doesn’t, it fails to assess deep enough, what’s happening unconsciously in the client, and it feels overly simplistic, to me, to think, it sort of feels top- down instead of bottom-up. (Agency B; Best Climate)
 Reluctance to fully implement The concern about the expectation, that some therapists really feel comfortable with the approach that they’ve done and we’ve had people in the field for many, many years, who are effective in their work, and so I get concerned that some of their, their creativity will get stifled. (Agency E; Worst Culture) CT has empirically based research to support it, and so, and the clinics like it [but] I would imagine if no one told you what to do in the room it would be [a question of] the therapist liking it or not…. I… have my own little bag of tricks now that I like. (Agency B; Best Climate)
[CT] is a lot of work. It’s a lot, when I’ve had six clients or four clients and I’ve done CT with each and every one of them and they have truly participated, I’m exhausted…I think the fact that cognitive therapy could probably I, I don’t want to say watered down, but I’ll say it, [could get in the way of using CT after training]. (Agency G)
Organization-level barriers
 Workload and productivity demands as barriers I think the lack of supervision as well as the supervision process when you get CT training…I think it will be hard for people to find time for it in the setting we’re talking about… People aren’t going to want to do a new therapy or do a new training if it’s going to put them at a loss…even if at the long time of it the patient is going to benefit. We’re already being stretched to the max… (Agency E; Worst Culture) I don’t think there’s anything about the way the agency is set up that would an obstacle [to training]. I don’t see that. (Agency B; Best climate)
 Reactions to change Change here is slow, and when there is change here, it usually creates instability, and that resistance to change will always be there (Agency C; Worst Culture)
Often change actually means, we’ve just turned up the speed on the treadmill. So there’s hostility. Um, I would say, one word I’d use is cynically, we react to change with cynicism…And in some regards becomes passive- aggressive non cooperation. ‘Yeah, you’re going to ask me to do this, I have no voice so I can’t protest it, but you’ll just get as much as I give you and then we’ll see what happens’. (Agency D; Worst Climate and Culture)
For the most part, are pretty good at adapting to change and you know, kind of rolling with the punches (Agency B; Best climate)
So people say ‘well that’s the way it goes, you know I mean, here they started this and then they just let it drop’. … I think sometimes perhaps the employees don’t have a great deal of faith in the fact that there is, that the change is going to be permanent or better. (Agency G)
Organization level facilitators
 Attitudes towards training I would like to get training…I think the population we work with is really challenging and I think as many different perspectives.. is helpful (Agency F; Worst Climate and Culture) [The agency] supports you if ultimately you’re doing something that’s for the good of the client then they will back you up and you will find a way to get it done (Agency H)
 Fit with documentation requirements Given all this documentation training in how to tailor…measurable goals…CT is what would work very well with this sort of documentation (Agency E; Worst Culture) To be honest with you, I think [CT is] also helpful in, in training the staff. It helps with how to write the notes, and the documentation (Agency G)

Note: Worst cultures’ proficiency subscale scores are 2 SD below rigidity and resistance subscale scores

Best cultures’ proficiency scores are 2 SD above rigidity and resistance subscale scores

Worst climates’ engagement and functionality subscale scores are 2 SD below stress scores

Best climates’ engagement and functionality scores are 2 SD above stress subscale scores

Agencies that are not designated as having worst climates or cultures did not have OSC profiles meeting these criteria

The final themes that emerged through comparison of survey and interview data are presented below and summarized in Table 1. Four primary themes were identified. Two were related to attitudes towards CT: (1) the ability of CT to address client needs and the issues perceived as most central to clients’ presenting problems, and (2) reluctance to fully implement CT. The themes relevant to context were (1) agency-level barriers, specifically workload and productivity concerns and reactions to change, and (2) agency-level facilitators, specifically treatment planning requirements and openness to training. Supporting data are presented by theme, with relevant contextual profile information and supporting interview passages.

Perception of the Ability of CT to Address Clients’ Needs

Table 1 contains examples of clinicians’ perceptions regarding the applicability of CT to their clients. While nearly all participants believed that CT could be effective for at least some clients and indicated awareness of empirical support, some did not believe that it was a good fit for all of their clients’ needs. Twenty-four participants, across the variety of climates and cultures, expressed a belief that CT can be an effective intervention. Notably, six clinicians saw the present-focused, structured aspects of the treatment as particularly appropriate for the consumers at their agencies. They expressed a belief that CT could be very helpful, particularly to clients who are faced with numerous life stressors, due to its skills-based, present-focused nature. For example, a therapist at Agency E indicated that CT would be his first choice for this reason (see Table 1), and a therapist at Agency C stated, “I can see as a therapist with experience that many of the clients’ problems are behavioral problems”. However, 11 clinicians qualified their belief in CT’s effectiveness by indicating that it might not be effective or appropriate for some clients. A clinician employed by Agency B noted, “From everything I’ve read about it, I believe that it can help a lot of people on different levels…. [but] nothing works for everybody and nothing is the answer for everybody and so I wouldn’t want to overgeneralize its value”. Some clinicians expressed concerns regarding its applicability to the population they served, and pointed out that not every client would be interested in or appropriate for CT. For example, four clinicians stated that they believed some clients want to be treated solely with medication and only come to therapy because it is an agency requirement that clients who receive medication management do so. Five cited “low-functioning clients” or clients with severe mental illness as examples of clients who might not be able to participate in or benefit from CT. Five clinicians also expressed concerns about CT’s ability to “go deep enough” to address some clients’ central problems.

Reluctance to Fully Implement CT

Eighteen clinicians, across the range of OSC climate and culture profiles, reported that they thought of CT as one of many interventions they would use with their clients. Despite endorsing only limited training in or exposure to CT on the surveys, 11 clinicians indicated that they already implement some CT interventions. However, for a variety of reasons that are described in Table 1, including the client needs and characteristics discussed above, preferences for other treatments or modalities (9 clinicians), and perceptions of CT as challenging to implement (5 clinicians), clinicians indicated that they were unlikely to fully adopt CT. Therapists expressed a range of opinions about the extent to which CT fit with their therapeutic style. Twelve clinicians expressed a belief that CT would be most useful when integrated with other practices or indicated an intention to use the elements of CT that they found useful rather than implementing the entire protocol. For example, a clinician at Agency E stated, “I think it’s just another tool and it can be very valuable [but] it would have to be implemented on a case by case basis; one part of the therapy is [what] the client brings”. Four clinicians, all at agencies with “worst” culture profiles, also expressed unease at the prospect of a mandate to use particular forms of therapy. As a clinician at Agency C emphasized, “I don’t like implementations, obligations or commitments…Nobody is going to tell me, or rather I would not accept some organization telling me what therapy should be used as a norm.”

Organization-level Barriers to Training and Implementation

Clinicians identified a number of organizational factors that might impact implementation efforts. The most frequently noted barriers, cited by 18 clinicians, were high workloads and productivity demands that leave inadequate time for supervision or training. Eleven clinicians and administrators also viewed the way in which changes are implemented, and employee reactions to these changes, as potentially undermining new practices and policies.

High Workloads and Productivity Demands

Paperwork and productivity requirements that were difficult to meet without overbooking to compensate for clients’ failure to attend scheduled appointments were commonly noted barriers. In Agency E (see Table 1), clinicians indicated that these issues left little time or energy to attend to clients’ individual needs or receive additional training, and they reported a general lack of recognition for clinicians’ work. At an extreme, in Agency D, an agency with “worst” climate and culture profiles, an administrator described system-level documentation and procedural requirements as the primary concern within the agency, which detracted from clinical care and professional development: “I often think that our biggest client, in terms of the client who receives the most energy, is [the mental health system] and the payers. They’re the ones who really get the focus and the attention”. However, at Agencies B, G, and H, clinicians did not express concern that paperwork, productivity, or documentation requirements would interfere with their ability to learn CT. Two of these agencies had culture and climate profiles that could not be characterized as “best” or “worst”, and one (Agency B; see Table 1) was characterized by a “best” climate profile.

Reactions to Change

Clinicians in Agency B also indicated relatively little resistance to change (see Table 1). In comparison, clinicians at agencies with the “worst” climate and culture profiles indicated that changes can be met with frustration and resistance. Some cited examples that could have implications for the implementation and sustainability of CT. In discussing a recent change in procedure that occurred at Agency F, both administrators and clinicians commented on the failure to involve employees in the process of making decisions about changes. An administrator recalled “A lot of frustration. Because people weren’t included in the decision making process. … I think frustration also because there certainly is no follow-through by upper management about making changes”. As a result, a clinician in that agency indicated that “sometimes when things are just handed down, if you don’t understanding where it’s coming from, people are not always open to things”.

Statements by a clinician and administrator in Agency G illustrate dynamics that may influence the likelihood that change is sustained. The clinician stated that in the past, the agency reverted back to previous policies or procedures after making changes (see Table 1). Perhaps for this reason, the administrator noted: “When I go to my staff and say, ‘This needs to change,’ they’re cooperative about it. They don’t necessarily do it, [chuckle] but they’re cooperative about it. I mean, they try their best”. In contrast to this more passive reaction at an agency with an “average” culture and climate profile, an administrator in Agency D, which met criteria for “worst” culture noted hostility and cynicism, as well as “passive-aggressive non-cooperation” (see Table 1). Such patterns and reactions could undermine efforts to implement CT or other EBPTs within an agency, particularly if clinicians are not invested in the change or involved in the decision making process. Additionally, when change is met with resistance by staff or is not consistently supported by agency leadership, clinicians may experience doubt that changes such as the implementation of a new treatment could be maintained.

Organization and Clinician-level Facilitators of Training and Implementation

Organization-level Facilitators

Factors that would facilitate training were evident in some organizational contexts. Five administrators and clinicians pointed to agency and system-level policies that could impact openness to training, such as CT’s fit with the system’s emphasis on concrete goals in required documentation and treatment planning. It appears that efforts to assist clinicians in managing documentation demands may be appreciated, and may be well-received if integrated into training and implementation efforts.

All five of the administrators, across the variety of organizational contexts, expressed openness to training in CT, as did 16 therapists. Notably, though, two clinical administrators indicated that they were not specifically interested in CT, but in training in general. However, the sentiment that clinicians were willing to try new approaches if they could benefit their clients or help them in their work with challenging populations was expressed by 14 clinicians in a variety of contexts (see Table 1).

Discussion

This descriptive study extends the literature on EBPT implementation by providing insight into potential service-level influences on the implementation of cognitive therapy. In contrast to previous work in this area, which has been conducted in children’s mental health services, private practice settings, or on inpatient units, this research provides a detailed understanding of both the social context and the perspectives of outpatient service providers who work with adults in a large, public mental health system. Further, it illustrates an initial formative assessment designed to inform the development of a training and implementation program that is responsive to potential influences on implementation (Stirman et al. 2010). While there are few such examples of the use of this strategy in the mental health literature, formative research is critical to the development of appropriate implementation and facilitation strategies (Stetler et al. 2006a, b). Two findings in particular increase insight into agency dynamics and clinician perceptions that might influence implementation if not addressed. Clinicians generally expressed very positive attitudes towards training in CT regardless of their organizational context, but asserted that if trained in CT, they would selectively implement elements of CT rather than attempting to deliver the full protocol. Additionally, clinicians and administrators described specific patterns of response to change that have the potential to undermine the implementation of an EBPT.

Clinicians’ intention to selectively implement CT has potential implications for the level of treatment fidelity that can be expected after training. Clinicians anticipated that comfort with previously learned interventions, the effort required to implement CT, specific client needs and characteristics, and concerns about autonomy may drive their decisions to implement CT. Selective implementation of interventions without a sound case conceptualization could lead to less optimal outcomes (Henggeler 2004), which could in turn reinforce clinicians’ concerns about CT’s applicability to certain patient populations. While clinicians believed CT could be helpful to at least some of their clients, their concerns about CT’s fit with clients’ needs and presenting problems were consistent with previous findings (Aarons and Palinkas 2007). Notably, the perception held by some clinicians that CT is only effective for present-focused, discrete symptoms or problems conflicts with empirical evidence that cognitive therapies can be used effectively with individuals with severe mental illness (Grant et al. 2011), those who have experienced trauma (Ehlers et al. 2003; Resick et al. 2008), and those with personality disorders and other challenging presentations (Davidson et al. 2006; Lam et al. 2003). These findings highlight the importance of providing clinicians and administrators with salient evidence regarding the applicability of CT to their clientele as they make decisions about initial adoption. In addition to empirical data for specific disorders or combinations of disorders, presenting highly relevant case material or demonstrations can allow clinicians and administrators to evaluate the applicability of CT to their consumers.

Taken together, these findings support the need for ongoing consultation after introductory training (Beidas and Kendall 2010; Rakovshik and McManus 2010). Consultation is necessary to provide support after didactic training, as clinicians attempt to develop skills (Herschell et al. 2010; Miller et al. 2004; Sholomskas et al. 2005). In the context of consultation, the opportunity to deliver a new treatment and receive guidance may impact clinicians’ treatment preferences. Skeptical clinicians may need to discuss their concerns and receive feedback from consultants as they attempt to deliver CT. Once trained to competency and given the opportunity to evaluate their own experiences with CT (Aarons and Palinkas 2007; Rycroft-Malone et al. 2004), clinicians may be more willing and able to deliver it at adequate levels of fidelity. Although some clinicians expressed concern about the implications of CT training on their autonomy, there is also evidence that ongoing fidelity monitoring and support in the form of supportive consultation can improve the quality of implementation and reduce turnover (Aarons et al. 2009).

Training consultants who provide CT to client populations that are similar to the agency’s typical consumers can ensure that training is relevant to clinician and consumer needs (Riggs et al. 2012). Selection of training cases that provide opportunities for consultation on challenging issues can further engage clinicians (Stirman et al. 2010). In light of the heterogeneous populations served by community mental health systems, some adaptation may be necessary, and may ultimately improve the likelihood that the treatment will be sustained. Emerging findings confirm that during implementation, EBPTs are often modified in response to client or organizational needs (Lundgren et al. 2011). In some cases, cognitive behavioral therapies can be implemented flexibly without a detrimental impact on treatment outcomes (DeRubeis et al. 2005; Levitt et al. 2007). Thus, training and implementation programs should be designed to support clinicians’ ability to flexibly deliver CT or integrate it with other EBPTs for severe mental illness or co-occurring problems (Turkington et al. 2006) without compromising quality or moving beyond the evidence base. Further, consultation can include facilitative strategies to address barriers to implementation (Stirman et al. 2010).

Our findings revealed some contextual factors that have the potential to influence early adoption and subsequent implementation of CT or other EBPTs. Clinicians in the agencies with the worst climate and culture profiles endorsed reactions to change that were markedly more negative than those in other contexts. In light of the high workloads, scant recognition, and lack of personal connection with clients that were evident in both the qualitative and quantitative data, some clinicians and administrators described cynicism or even hostility about change. In such contexts, clinicians may not commit to learning or using a new treatment if it is viewed as a temporary priority for the administration or as a mandate that requires additional work with no support or recognition. Consistent with previous findings, clinicians in these agencies also indicated that they may not have time to participate in preliminary consultation or ongoing EBPT support, both of which are increasingly recognized as critical to implementation and sustainability (Aarons et al. 2011; Beidas and Kendall 2010). Although all agencies in our sample were part of a publicly-funded system and clinicians described challenges meeting their paperwork requirements and agency productivity demands, these were viewed as potential implementation barriers only in the “worst” climates and cultures.

Facilitative strategies to address such barriers during CT training programs have been shown to be more effective than training alone (Kauth et al. 2010). A number of integrated and complementary strategies may be necessary, and facilitators should work with stakeholders to adopt a multi-level approach to effect change. At the system or payer level, reimbursement for lost productivity or higher reimbursement rates for CT-trained clinicians would convey support at high levels within the mental health system and make intensive training financially feasible for agencies. Although initially costly to the system, emerging evidence suggests that EBPT implementation can lead to cost-savings (Kilmer et al. 2011), particularly in integrated healthcare systems or networks in which clients tend to remain enrolled over the long-term. Given the critical role of leadership in the success of implementation efforts, management support and follow-through should be facilitated as part of an implementation program (Aarons 2006; Kauth et al. 2010). Findings on reactions to change suggest that facilitators should encourage the inclusion of clinicians in decision-making about EBPT implementation from the start. Clinician and administrator input regarding scheduling, feasible consultation formats, preparation activities, and strategies to mitigate workload and productivity demands during training can also increase the likelihood of success. Facilitators could also work with clinicians and administrators to develop streamlined templates for documentation that include checklists or prompts to document important treatment elements. While these strategies may be successful in some agencies, in settings with poorer organizational contexts, successful implementation may not be possible without more intensive organization-level intervention (Glisson et al. 2010; Hemmelgarn et al. 2006).

Some limitations to this study are important to note. We attempted to minimize the potential for common method bias (Podsakoff et al. 2003) by using a number of procedural remedies including temporal, methodological, and proximal separation of interview and survey collection, protecting respondent anonymity, attempting to reduce evaluation apprehension, and aggregating the survey scores rather than using individual-level data. However, we were not able to mitigate all potential sources of common method biases. It is also possible that our recruitment strategy resulted in a sample of clinicians with more extreme opinions about CT. An additional limitation is that most clinicians had relatively limited exposure to CT prior to the interviews. Clinician attitudes towards CT and their views on barriers to adopting CT might well be different after a higher level of exposure (e.g., training and consultation), and we are examining this possibility in ongoing research. However, for this study, we did not collect data on whether study participants subsequently received CT training. Thus, we are unable to draw conclusions about the influence of contextual factors and attitudes on subsequent training in, or use of, CT. In this project, our interest was in the types of service-level attitudes and barriers that occur early in the implementation process because such barriers, if not addressed, could undermine progress towards a more intensive phase of implementation. We also did not collect information on treatment outcomes, which ultimately is of interest in evaluating the success of the implementation of an EBPT.

Although our sample consisted of a relatively small group of public-sector clinicians in one urban setting and the sample size precluded hypothesis testing or additional exploration of survey data, our study provides a unique synthesis of data regarding perceptions of evidence and context in mental health. This study provides a nuanced view of the barriers that clinicians and administrators in different organizational social contexts perceive to be present at the provider, organization, and to some extent, the broader system level. These findings have implications for the development of facilitation strategies that are relevant to public mental health system administrators, clinicians, and investigators interested in promoting access to EBPTs such as CT. EBPT training programs should be designed with sensitivity to clinician concerns about the fit of the treatment with their own therapeutic practices, their questions about its applicability to the population they serve, and the contextual factors that may impact long term success. Further study of factors such as clinician experiences, attitudes, and perceptions of barriers throughout the course of training and implementation can shed light on whether and how they impact outcomes of interest (Palinkas et al. 2008). Training and implementation programs will benefit from additional formative research on factors that may influence their success, as well as more research on interventions to facilitate implementation of specific EBPTs.

Acknowledgments

The preparation of this article was funded in part by National Institute of Mental Health grants K99-MH080100, T32MH019836, P20 MH71905, and R24MH070698, and by the NIMH- and VA-funded Implementation Research Institute through an award from the National Institute of Mental Health (R25 MH080916-01A2) and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative (QUERI). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute of Mental Health, the National Institutes of Health, or the Department of Veterans Affairs.

Footnotes

A portion of these results was presented at the 118th annual convention of the American Psychological Association in San Diego, CA.

Disclosures None for any author.

Contributor Information

Shannon Wiltsey Stirman, Email: sws@bu.edu, Women’s Health Sciences Division, National Center for PTSD, VA Boston Healthcare System, 150 S. Huntington Ave (116B3), Boston, MA 02130, USA. Department of Psychiatry, Boston University, Boston, MA, USA.

Ana Gutiérrez-Colina, Department of Psychology, University of Georgia, Athens, GA, USA.

Katherine Toder, Department of Psychiatry, University of Pennsylvania, Philadelphia, USA.

Gregory Esposito, University of Pennsylvania, Philadelphia, PA, USA.

Frances Barg, Department of Family Medicine and Community Health, School of Medicine, University of Pennsylvania, Philadelphia, USA. Department of Anthropology, University of Pennsylvania, Philadelphia, USA.

Frank Castro, Women’s Health Sciences Division, National Center for PTSD, VA Boston Healthcare System, 150 S. Huntington Ave (116B3), Boston, MA 02130, USA. Department of Psychiatry, Boston University, Boston, MA, USA.

Aaron T. Beck, Department of Psychiatry, University of Pennsylvania, Philadelphia, USA

Paul Crits-Christoph, Department of Psychiatry, University of Pennsylvania, Philadelphia, USA.

References

  1. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The evidence-based practice attitude scale (EBPAS) Mental Health Services Research. 2004;6(2):61–74. doi: 10.1023/B:MHSR.0000024351.12294.65. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Aarons GA. Transformational and transactional leadership: Association with attitudes toward evidence-based practice. Psychiatric Services. 2006;57(8):1162–1169. doi: 10.1176/appi.ps.57.8.1162. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Aarons G, Cafri G, Lugo L, Sawitzky A. Expanding the domains of attitudes towards evidence-based practice: The Evidence based practice attitude scale-50. Administration and Policy in Mental Health and Mental Health Services Research. 2010 doi: 10.1007/s10488-010-0302-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Aarons G, Hurlburt M, Horwitz S. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(1):4–23. doi: 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Aarons GA, Palinkas LA. Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health Services Research. 2007;34:411–419. doi: 10.1176/appi.ps.62.3.255. [DOI] [PubMed] [Google Scholar]
  6. Aarons GA, Sawitzky A. Organizational climate partially mediates the effect of culture on work attitudes and staff turnover in mental health services. Administration and Policy in Mental Health and Mental Health Services Research. 2006;33(3):289–301. doi: 10.1007/s10488-006-0039-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, Chaffin MJ. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology. 2009;77(2):270–280. doi: 10.1037/a0013223. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Beck AT. The current state of cognitive therapy: A 40-year retrospective. Archives of General Psychiatry. 2005;62(9):953–959. doi: 10.1001/archpsyc.62.9.953. [DOI] [PubMed] [Google Scholar]
  9. Becker KD, Stirman SW. The science of training in evidence-based treatments in the context of implementation programs: Current status and prospects for the future. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:317–322. doi: 10.1007/s10488-011-0361-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17(1):1. doi: 10.1111/j.1468-2850.2009.01187.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Borntrager C, Chorpita BF, MacMillan-Higa C, Weisz J. Provider attitudes towards evidence based practices: Are the concerns with the evidence or the manuals? Psychiatric Services. 2009;60:677–681. doi: 10.1176/appi.ps.60.5.677. [DOI] [PubMed] [Google Scholar]
  12. Butler AC, Chapman JE, Forman EM, Beck AT. The empirical status of cognitive-behavioral therapy: A review of meta-analyses. Clinical Psychology Review. 2006;26(1):17–31. doi: 10.16/jcpr2005.07.003. [DOI] [PubMed] [Google Scholar]
  13. Cohen J. Weighted kappa: Nominal scale agreement provision for scaled disagreement or partial credit. Psychological Bulletin. 1968;70(4):213–220. doi: 10.1037/h0026256. [DOI] [PubMed] [Google Scholar]
  14. Cook JM, Biyanova T, Coyne JC. Barriers to adoption of new treatments: An internet study of practicing community psychotherapists. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36(2):83–90. doi: 10.1007/s10488-008-0198-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Corrigan PW, Kwartarini WY, Pramana W. Staff perception of barriers to behavior therapy at a psychiatric hospital. Behavior Modification. 1992;16(1):132–144. doi: 10.1177/01454455920161007. [DOI] [PubMed] [Google Scholar]
  16. Corrigan PW, McCracken SG, Kommana S, Edwards M, Simpatico T. Staff perceptions about barriers to innovative behavioral rehabilitation programs. Cognitive therapy and research. 1996;20(5):541–551. doi: 10.1177/01454455980224006. [DOI] [Google Scholar]
  17. Corrigan PW, Williams OB, McCracken SG, Kommana S, Edwards M, Brunner J. Staff attitudes that impede the implementation of behavioral treatment programs. Behavior Modification. 1998;22(4):548–562. doi: 10.1177/01454455980224006. [DOI] [PubMed] [Google Scholar]
  18. Corry R, Jewell TC. Implementation of psychiatric rehabilitation strategies in real-life treatment settings for adults with severe mental illness. Psychiatric Rehabilitation Skills. 2002;6(3):335–354. doi: 10.1080/10973430208408442. [DOI] [Google Scholar]
  19. Costa A, Roe R, Taillieu T. Trust within teams: the relation with performance effectiveness. European journal of work and organizational psychology. 2001;10:225–244. doi: 10.1080/13594320143000654. [DOI] [Google Scholar]
  20. Crits-Christoph P, Siqueland L, Chittams J, Barber JP, Beck AT, Frank A, et al. Training in cognitive, supportive-expressive, and drug counseling therapies for cocaine dependence. Journal of Consulting and Clinical Psychology. 1998;66(3):484–492. doi: 10.1037/0022-006X.66.3.484. [DOI] [PubMed] [Google Scholar]
  21. Davidson K, Norrie J, Tyrer P, Gumley A, Tata P, Murray H, et al. The effectiveness of cognitive behavior therapy for borderline personality disorder: results from the borderline personality disorder study of cognitive therapy (BOSCOT) trial. Journal of Personality Disorders. 2006;20(5):450–465. doi: 10.1521/pedi.2006.20.5.450. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. DeRubeis RJ, Hollon SD, Amsterdam JD, Shelton RC, Young PR, Salomon RM, Gallop R. Cognitive therapy vs medications in the treatment of moderate to severe depression. Archives of General Psychiatry. 2005;62(4):409. doi: 10.1001/archpsyc.62.4.409. Retrieved at http://archpsyc.ama-assn.org/cgi/content/full/62/4/417. [DOI] [PubMed] [Google Scholar]
  23. Ehlers A, Clark DM, Hackmann A, McManus F, Fennell M, Herbert C, Mayou R. A randomized controlled trial of cognitive therapy, a self-help booklet, and repeated assessments as early interventions for posttraumatic stress disorder. Archives of General Psychiatry. 2003;60(10):1024. doi: 10.1001/archpsyc.60.10.1024. Retrieved at http://archpsyc.ama-assn.org/cgi/content/full/60/10/1024. [DOI] [PubMed] [Google Scholar]
  24. Gittelsohn J, Steckler A, Johnson CC, Pratt C, Grieser M, Pickrel J, et al. Formative research in school and community-based health programs and studies: “State of the art” and the TAAG approach. Health Education & Behavior. 2006;33(1):25–39. doi: 10.1177/1090198105282412. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Glaser BG, Strauss AL. The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine Publishing Company; 1967. [Google Scholar]
  26. Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology. 2010;78(4):537. doi: 10.1037/a0019160. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Glisson C, Schoenwald SK, Kelleher K, Landsverk J, Hoagwood KE, Mayberg S, et al. Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(1–2):124. doi: 10.1007/s10488-007-0152-9. [DOI] [PubMed] [Google Scholar]
  28. Grant PM, Huh GA, Perivoliotis D, Stolar N, Beck AT. A randomized trial to evaluate the efficacy of cognitive therapy for low functioning patients with schizophrenia. Archives of General Psychiatry. 2011;69(2):121–127. doi: 10.1001/archgenpsychiatry.2011.129. [DOI] [PubMed] [Google Scholar]
  29. Gray M, Elhai JD, Schmidt LO. Trauma professionals’ attitudes towards and utilization of evidence-based practice. Behavior Modification. 2007;31:732–748. doi: 10.1177/0145445507302877. [DOI] [PubMed] [Google Scholar]
  30. Helfrich CD, Damschroder LJ, Hagedorn HJ, Daggett GS, Sahay A, Ritchie M, et al. A critical synthesis of literature on the promoting action on research implementation in health services (PARIHS) framework. Implementation Science. 2010;5:82. doi: 10.1186/1748-5908-5-82. [DOI] [PMC free article] [PubMed] [Google Scholar]
31. Hemmelgarn AL, Glisson C, James LR. Organizational culture and climate: Implications for services and interventions research. Clinical Psychology: Science and Practice. 2006;13(1):73–89. doi: 10.1111/j.1468-2850.2006.00008.x.
32. Henggeler SW. Decreasing effect sizes for effectiveness studies: Implications for the transport of evidence-based treatments: Comments on Curtis, Ronan, and Borduin. Journal of Family Psychology. 2004;18:420–423. doi: 10.1037/0893-3200.18.3.420.
33. Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review. 2010;30(4):448–466. doi: 10.1016/j.cpr.2010.02.005.
34. James IA, Blackburn IM, Milne DL, Reichfelt FK. Moderators of trainee therapists’ competence in cognitive therapy. British Journal of Clinical Psychology. 2001;40(2):131–141. doi: 10.1348/014466501163580.
35. Jensen-Doss A, Hawley KM, Lopez M, Osterberg LD. Using evidence-based treatments: The experiences of youth providers working under a mandate. Professional Psychology: Research and Practice. 2009;40(4):417. doi: 10.1037/a0014690.
36. Kauth MR, Sullivan G, Blevins D, Cully JA, Landes RD, Said Q, et al. Employing external facilitation to implement cognitive behavioral therapy in VA clinics: A pilot study. Implementation Science. 2010;5:75. doi: 10.1186/1748-5908-5-75.
37. Kilmer B, Eibner C, Ringel JS, Pacula RL. Invisible wounds, visible savings? Using microsimulation to estimate the costs and savings associated with providing evidence-based treatment for PTSD and depression to veterans of Operation Enduring Freedom and Operation Iraqi Freedom. Psychological Trauma: Theory, Research, Practice, and Policy. 2011;3(2):201. doi: 10.1037/a0020592.
38. Lam DH, Watkins ER, Hayward P, Bright J, Wright K, Kerr N, Sham P. A randomized controlled study of cognitive therapy for relapse prevention for bipolar affective disorder: Outcome of the first year. Archives of General Psychiatry. 2003;60(2):145. doi: 10.1001/archpsyc.60.2.145. Retrieved from http://archpsyc.ama-assn.org/cgi/reprint/60/2/145.
39. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159–174. doi: 10.2307/2529310.
40. Lehman WE, Greener JM, Simpson DD. Assessing organizational readiness for change. Journal of Substance Abuse Treatment. 2002;22(4):197–209. doi: 10.1016/S0740-5472(02)00233-7.
41. Levitt JT, Malta LS, Martin A, Davis L, Cloitre M. The flexible application of a manualized treatment for PTSD symptoms and functional impairment related to the 9/11 World Trade Center attack. Behaviour Research and Therapy. 2007;45(7):1419–1433. doi: 10.1016/j.brat.2007.01.004.
42. Lundgren L, Amodeo M, Cohen A, Chassler D, Horowitz A. Modifications of evidence-based practices in community-based addiction treatment organizations: A qualitative research study. Addictive Behaviors. 2011;36(6):630–635. doi: 10.1016/j.addbeh.2011.01.003.
43. Malik N, Kingdon D, Pelton J, Mehta R, Turkington D. Effectiveness of brief cognitive-behavioral therapy for schizophrenia delivered by mental health nurses: Relapse and recovery at 24 months. The Journal of Clinical Psychiatry. 2009;70(2):201–207. doi: 10.4088/JCP.07m03990.
44. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments: A review of current efforts. American Psychologist. 2010;65(2):73–84. doi: 10.1037/a0018121.
45. Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB. Interventions in organizational and community context: A framework for building evidence on dissemination and implementation in health services research. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(1):21–37. doi: 10.1007/s10488-007-0144-9.
46. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. Journal of Consulting and Clinical Psychology. 2004;72(6):1050–1062. doi: 10.1037/0022-006X.72.6.1050.
47. Palinkas L, Aarons G, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Administration and Policy in Mental Health and Mental Health Services Research. 2010;38(1):44–53. doi: 10.1007/s10488-010-0314-z.
48. Palinkas LA, Schoenwald SK, Hoagwood K, Landsverk J, Chorpita BF, Weisz JR. An ethnographic study of implementation of evidence-based treatments in child mental health: First steps. Psychiatric Services. 2008;59(7):738–746. doi: 10.1176/appi.ps.59.7.738.
49. Podsakoff PM, MacKenzie SB, Lee JY, Podsakoff NP. Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology. 2003;88(5):879–903. doi: 10.1037/0021-9010.88.5.879.
50. Raghavan R, Bright C, Shadoin A. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science. 2008;3(1):26. doi: 10.1186/1748-5908-3-26.
51. Rakovshik SG, McManus F. Establishing evidence-based training in cognitive behavioral therapy: A review of current empirical findings and theoretical guidance. Clinical Psychology Review. 2010;30(5):496–516. doi: 10.1016/j.cpr.2010.03.004.
52. Resick PA, Galovski TE, Uhlmansiek MOB, Scher CD, Clum GA, Young-Xu Y. A randomized clinical trial to dismantle components of cognitive processing therapy for posttraumatic stress disorder in female victims of interpersonal violence. Journal of Consulting and Clinical Psychology. 2008;76(2):243. doi: 10.1037/0022-006X.76.2.243.
53. Riggs S, Stirman SW, Beck AT. Training community mental health agencies in cognitive therapy for schizophrenia. The Behavior Therapist. 2012;35:34–39. Retrieved from http://www.abct.org/docs/PastIssue/35n2.pdf.
54. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B. What counts as evidence in evidence-based practice? Journal of Advanced Nursing. 2004;47(1):81–90. doi: 10.1111/j.1365-2648.2004.03068.x.
55. Schoenwald SK, Chapman JE, Kelleher K, Hoagwood KE, Landsverk J, Stevens J, et al. A survey of the infrastructure for children’s mental health services: Implications for the implementation of empirically supported treatments (ESTs). Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(1–2):84–97. doi: 10.1007/s10488-007-0147-6.
56. Sensky T, Turkington D, Kingdon D, Scott JL, Scott J, Siddle R, et al. A randomized controlled trial of cognitive-behavioral therapy for persistent symptoms in schizophrenia resistant to medication. Archives of General Psychiatry. 2000;57(2):165–172. doi: 10.1001/archpsyc.57.2.165.
57. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don’t train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting and Clinical Psychology. 2005;73(1):106–115. doi: 10.1037/0022-006X.73.1.106.
58. Stetler CB, Damschroder LJ, Helfrich CD, Hagedorn HJ. A guide for applying a revised version of the PARIHS framework for implementation. Implementation Science. 2011;6(1):99–112. doi: 10.1186/1748-5908-6-99.
59. Stetler C, Legro M, Rycroft-Malone J, Bowman C, Curran G, Guihan M, et al. Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science. 2006a;1(1):23–32. doi: 10.1186/1748-5908-1-23.
60. Stetler CB, Legro MW, Wallace CM, Bowman C, Guihan M, Hagedorn H, et al. The role of formative evaluation in implementation research and the QUERI experience. Journal of General Internal Medicine. 2006b;21:S1–8. doi: 10.1111/j.1525-1497.2006.00355.x.
61. Stewart RE, Wiltsey Stirman S, Chambless DL. A qualitative investigation of practicing psychologists’ attitudes toward research-informed practice: Implications for dissemination strategies. Professional Psychology: Research and Practice. In press. doi: 10.1037/a0025694.
62. Stirman SW, Bhar SS, Spokas M, Brown GK, Creed TA, Perivoliotis D, et al. Training and consultation in evidence-based psychosocial treatments in public mental health settings: The ACCESS model. Professional Psychology: Research and Practice. 2010;41(1):48–56. doi: 10.1037/a0018099.
63. Turkington D, Dudley R, Warman DM, Beck AT. Cognitive-behavioral therapy for schizophrenia: A review. Focus. 2006;4(2):223–233. doi: 10.1097/00131746-200401000-00002. Retrieved from http://psychiatryonline.org/article.aspx?articleid=50577.
64. Vos T, Corry J, Haby MM, Carter R, Andrews G. Cost effectiveness of cognitive-behavioural therapy and drug interventions for major depression. Australian and New Zealand Journal of Psychiatry. 2005;39(8):683–692. doi: 10.1111/j.1440-1614.2005.01652.x.
