Abstract
Among the challenges facing the mental health field are the dissemination and implementation of evidence-based practices. The present study investigated the relationships between inner context variables (i.e., adopter characteristics and individual perceptions of intra-organizational factors) and two implementation outcomes – independently rated therapist fidelity on a performance-based role-play (i.e., adherence and skill) and self-reported penetration of cognitive behavioral therapy for youth anxiety following training. A significant relationship was found between inner context variables and fidelity. Specifically, adopter characteristics were associated with adherence and skill; individual perceptions of intra-organizational factors were associated with adherence. Inner context variables were not associated with penetration. Future directions are discussed.
Keywords: evidence-based practice, inner context, implementation outcomes
The development, dissemination, and implementation of Evidence-Based Practice (EBP), defined as “the integration of the best available research with clinical expertise” (American Psychological Association, 2006), remains a focus in discourse concerning quality improvement in mental health care (Weisz, Sandler, Durlak, & Anton, 2005). Unfortunately, the promise of dissemination and implementation of EBPs has yet to be fully realized (President's New Freedom Commission on Mental Health, 2003). Various implementation strategies have been developed to improve the dissemination and implementation of EBPs in the community. An implementation strategy is a “systematic intervention process to adopt and integrate evidence-based health innovations into usual care” (Powell et al., 2011). Examples of implementation strategies include discrete strategies, such as audit and feedback, as well as more complex strategies, such as the use of community development teams to implement EBP (Saldana & Chamberlain, 2012) and the Availability, Responsiveness, and Continuity (ARC) organizational implementation strategy (Glisson et al., 2010). One frequently used discrete implementation strategy is training community therapists (Addis & Krasnow, 2000; Herschell, McNeil, & McNeil, 2004; Rakovshik & McManus, 2010; Williams, Martinez, Dafters, Ronald, & Garland, 2011).
Despite the frequent use of training as a discrete implementation strategy (Barwick et al., 2012), results of studies that have evaluated training as the sole implementation strategy have been disappointing (see review by Beidas & Kendall, 2010). Typical training efforts include printed educational materials and one-time didactic workshops; they are better than no training at improving therapist knowledge but have no significant effect on client outcomes (Beidas & Kendall, 2010; Farmer et al., 2008; Fixsen et al., 2005; Shafran et al., 2009) or therapist behavior (Hawkins & Sinha, 1998; Sholomskas et al., 2005). Ongoing support, or consultation (Edmunds, Beidas, & Kendall, 2013; Nadeem, Gleacher, & Beidas, 2013), appears to be a critical discrete implementation strategy that needs to be included in the training process (Beidas, Edmunds, Marcus, & Kendall, 2012). An additional explanation for the ineffectiveness of training alone is that it likely does not attend to contextual factors that may impact therapist behavior. Transferring innovation, such as EBP, to routine clinical use is a complex, multilevel process dependent upon more than therapists’ understanding of a treatment itself. Clinicians, their practice settings, and other factors can impact successful training (Beidas & Kendall, 2010) and implementation (Damschroder et al., 2009; Fixsen et al., 2005; Taxman & Belenko, 2012). Effective implementation may only occur when therapists are trained effectively and when the context supports behavior change (Beidas & Kendall, 2010; Sanders & Turner, 2005; Turner & Sanders, 2006). Understanding factors that impact therapist behavior and implementation outcomes will move the field forward in exploring other potential avenues for implementation interventions.
A variety of implementation outcomes are important to consider when transferring innovation to routine clinical use. Implementation outcomes are “the effects of deliberate and purposive actions to implement new treatments, practices and services” (Proctor et al., 2010, p. 65) and are distinct from service and/or client outcomes. They are the outcomes most proximal to implementation strategies and are intermediate outcomes between implementation strategies and client outcomes (Proctor et al., 2010). Two implementation outcomes are of particular interest when evaluating training and consultation as discrete implementation strategies: fidelity and penetration. Fidelity refers to “the degree to which an intervention was implemented as it was prescribed in the original protocol” (Proctor et al., 2010, p. 69), whereas penetration refers to “the integration of a practice within a service setting and its subsystems” (Proctor et al., 2010, p. 70). These outcomes are worthy of investigation given findings that fidelity decreases when EBPs are implemented in community settings (Henggeler, 2004; Henggeler, Melton, Brondino, Scherer, & Hanley, 1997) and that penetration is understudied as an implementation outcome (Proctor et al., 2010).
Fidelity is an important implementation outcome because it is the most proximal outcome to training and consultation, whereas penetration is important because it must occur for fidelity to matter (i.e., if therapists do not integrate an EBP within their practice, their fidelity is a moot point). Fidelity includes: 1) adherence (i.e., whether therapists implement prescribed treatment components), 2) competence (i.e., skill in implementing treatment), and 3) treatment differentiation (i.e., whether treatment differs from others along important dimensions; Perepletchikova, Treat, & Kazdin, 2007; Waltz, Addis, Koerner, & Jacobson, 1993). Assessment of fidelity typically includes qualitative measures of competence and quantitative measures of adherence, the latter of which sufficiently address both adherence and differentiation (Perepletchikova et al., 2007; Waltz et al., 1993). Although findings have been mixed regarding the relationship between fidelity and outcomes (Perepletchikova & Kazdin, 2005), within a randomized controlled trial, fidelity assessment helps determine treatment integrity and provides information on active treatment ingredients (i.e., which specific components are critical for improved outcomes). Within community settings, fidelity assessment can help determine what quantity and quality of active ingredients are being delivered (Schoenwald, Chapman, & Garland, in press). Both fidelity and penetration are important implementation outcomes to examine because they contribute to the research-practice gap. Poor fidelity contributes to the research-practice gap by limiting exposure to active ingredients. Similarly, lack of penetration (i.e., when an EBP does not become integrated into a service delivery setting) prevents the delivery of gold-standard treatments to the individuals who could benefit from them. Given that research shows limited penetration following initial implementation (Stirman, Cook, Calloway, Castro, & Charns, 2012), a greater understanding of factors that impact fidelity and penetration over time will allow the field to explore potential avenues for maximizing and sustaining implementation outcomes.
The Exploration, Preparation, Implementation, and Sustainment framework (EPIS; Aarons, Hurlburt, & Horowitz, 2011) is a multi-level and multi-phase conceptual model that provides a lens through which to understand both the implementation process as well as important contextual factors that impact the implementation process. The process of implementing an innovation occurs in four distinct phases: exploration, preparation, implementation, and sustainment. Furthermore, a number of contextual factors that impact each phase uniquely are grouped into the following categories: intervention characteristics, outer context, and inner context. The outer context refers to system-level factors such as the service environment, inter-organizational environment, and consumer support/advocacy. The inner context refers to adopter characteristics and intra-organizational characteristics (Aarons et al., 2011). In this manuscript, we focus on inner context factors (i.e., adopter characteristics and individual perceptions of intra-organizational characteristics) during the implementation phase.
Adopter characteristics include demographics (e.g., clinical experience) and attitudes (e.g., openness) towards EBPs. The findings from preliminary research examining the impact of demographics and attitudes on implementation outcomes are mixed. For example, therapists with more clinical experience and a graduate degree prior to training were more likely to demonstrate superior posttraining fidelity in Motivational Interviewing, as assessed via observational coding of audiotapes by trained coders (Carpenter et al., 2012). This finding suggests that tailoring training based on level of previous training and experience may be a worthy pursuit in order to maximize implementation outcomes. However, another study demonstrated varied relationships over time between therapist demographics and observational ratings of competence in implementing an EBP for adolescent substance abuse (Garner, Hunter, Godley, & Godley, 2012). Those who reported having other licenses or certifications and those who had spent longer amounts of time at their current positions were less likely to be rated as competent nine-months after training.
Similarly mixed findings are found regarding the relationship between therapist attitudes and implementation outcomes. Higher initial appeal and openness toward EBPs by clinicians participating in training on EBPs for youth predicted decreases in knowledge score commission errors following training (Lim, Nakamura, Higa-McMillan, Shimabukuro, & Slavin, 2012). However, within the same study, greater initial appeal of EBPs was associated with decreases in selected knowledge scores pertaining to anxiety and disruptive behavior techniques following training. These findings reveal the complex relationship between adopter characteristics and training outcomes and suggest the need for further investigations that take into account the impact of inner context factors.
Intra-organizational factors (e.g., culture, support, climate) are specific to the setting in which a therapist practices. The management literature suggests that organizational variables, such as management support and financial resources, have an important impact on the implementation of innovation (Klein, Conn, & Sorra, 2001). Within the mental health field, low workplace support, as reported by individuals who received training in the Triple P-Positive Parenting Program, was predictive of less self-reported implementation with clients 6 months following training (Sanders, Prinz, & Shapiro, 2009). Similarly, another study found that perceived setting barriers were negatively correlated with clinicians’ self-reported usage of cognitive-behavioral therapy (CBT) for depression (Lewis & Simons, 2011). More work is needed to determine whether these relationships remain when service delivery is measured objectively (e.g., via record or session tape review). Additional work has found that more engaged organizational climates, as rated by clinicians, are associated with desirable treatment outcomes in youth, as assessed via parent report using standardized measures (Glisson & Green, 2011). Interestingly, the relationship between organizational climate and client outcomes was not mediated by the type or perceived quality of services delivered. These findings raise additional questions regarding how intra-organizational factors interplay with each other as well as how they interact with intervention characteristics and outer context factors to impact service delivery and client outcomes. In this study, we explored individual perceptions of intra-organizational factors rather than aggregated organizational-level constructs, but we have drawn from this literature given the dearth of research on individual perceptions of intra-organizational factors.
Given the mixed findings to date and the importance of better understanding factors that contribute to implementation outcomes, the present study investigated the relationships between adopter characteristics (i.e., demographics; attitudes), individual perceptions of intra-organizational variables (e.g., organizational climate), and implementation outcomes (i.e., fidelity and penetration) for community therapists who received training and consultation in an EBP for youth anxiety. Specific implementation outcomes examined included independently rated fidelity as part of a performance based role-play (i.e., adherence and skill) and self-reported penetration (i.e., the number of anxious youth treated with the EBP divided by the number of anxious youth treated by trained providers).
METHOD
Participants
Demographics
Participants were 115 therapists from the northeast United States. Therapists were primarily female (90.4%) with a mean age of 35.93 (SD = 11.36). Ethnicity/race included: Caucasian (67.0%), African American (13.0%), Hispanic/Latino (5.2%), Asian (4.3%), Native American (.9%), and Other (5.2%). Twenty-nine percent of therapists reported being licensed. Therapists reported the following degrees: MA (38.3%), enrolled in graduate program (12.2%), MD (6.0%), PhD (5.2%), PsyD (4.3%), EdD (1.7%), and Other (24.3%).
Therapists reported a mean of 65.46 months of clinical experience (SD = 82.38) and moderate identification with CBT (M = 4.86, SD = 1.68; range = 1–7). Fifty percent of therapists had previously treated an anxious youth. Those therapists reported treating 0–2 youth (M = .11, SD = .43) and receiving zero hours of supervision around the treatment of anxious youth. Therapists reported an average current caseload of 19.48 clients (SD = 23.72) and an average of 1.57 hours of supervision weekly (SD = 2.66).
Procedure
Recruitment and Screening
The Institutional Review Board at a large northeastern university approved all procedures. Participants were recruited from the community via professional listservs, directors of clinical training programs, and word of mouth. Two-hundred and four participants initially responded to the recruitment efforts; 115 enrolled. Eighty-nine therapists did not participate in the study for the following reasons: never responded to follow-up contact after they initially emailed the principal investigator (N = 37); not available for scheduled study dates (N = 25); did not confirm workshop attendance or complete pre-assessment (N = 15); not eligible due to exclusion criteria (N = 10); and no-show or cancelled last minute prior to training workshop (N = 2).
Participants were matched to one of six training dates based on their availability. Random assignment occurred at the level of training date (i.e., training dates were randomly assigned to condition) using equal allocation concealment. This randomization process has some limitations because randomization to training date is not the same as therapist randomization to condition. On the training day, all participants consented to participate.
Inclusion/exclusion criteria
Therapists had to (a) work in the community, (b) currently or in the future plan to work with children aged 8–17 with DSM-IV anxiety disorders, (c) identify with having received training within the mental health field, (d) volunteer to participate in the workshop and follow-up consultation, (e) read and speak English, and (f) have access to a computer or telephone for consultation. The exclusion criterion was prior training of >8 hours on CBT for child anxiety. We selected this criterion based on a methodological decision made in a similar trial (see Miller, Yahne, Moyers, Martinez, & Pirritano, 2004).
Participant Flow and Retention
Individuals who completed the pretraining assessment made up the intent-to-train (ITT) sample. Of the 115 participants in the ITT sample, 113 (98%) completed the post-training assessment and 100 (87%) completed the post-consultation assessment. Recruitment began in May 2009 and was completed by September 2009. Assessments for the study began in June 2009 and were completed by January 2010. Training occurred between June 2009 and October 2009.
Compensation
Participants received: (1) free lunch provided the day of the training, (2) a therapist manual, a child workbook, and therapy materials (e.g., therapy games) following completion of the posttraining assessment, and (3) a $15 gift card following completion of the postconsultation assessment.
Measures
Inner Context Variables
Evidence-Based Practice Attitude Scale (EBPAS; Aarons, 2004)
The EBPAS is a 15-item tool that assesses participants’ attitudes towards the adoption and implementation of EBP. Four scales can be calculated: appeal, requirements, openness, and divergence (Aarons, 2004). Appeal (Cronbach’s α = .80) refers to the extent to which a therapist will adopt a new practice if it is intuitively appealing. Requirements (Cronbach’s α = .90) refers to the extent to which a therapist will adopt a new practice if required by the organization or legally mandated. Openness (Cronbach’s α = .78) is the extent to which a therapist is generally receptive to using new interventions. Divergence (Cronbach’s α = .59) is the extent to which a therapist perceives research-based treatments as not useful clinically (Aarons, 2004). For the purposes of our study, we recoded divergence so that higher scores on all subscales indicated more highly rated appeal, requirements, openness, or divergence. The EBPAS demonstrates good internal consistency (subscale alphas range from .59 to .90; Aarons & Sawitzky, 2006), and its validity is supported by its relationship with both therapist and organizational characteristics (Aarons, 2004).
Organizational Readiness for Change (ORC; Lehman, Greener, & Simpson, 2002)
The ORC contains 129 items which measure organizational characteristics on a Likert rating scale from 1 (strongly disagree) to 5 (strongly agree). The 18 scales are grouped into five major domains: (a) motivation, (b) resources, (c) staff attributes, (d) organizational climate, and (e) training climate. Psychometric properties for this instrument are strong (Lehman et al., 2002; Saldana, Chapman, Henggeler, & Rowland, 2007). There is evidence that the ORC is valid in indexing organizational-level constructs when aggregating across therapist reports (Saldana et al., 2007). Note that in this investigation, we did not have enough therapists per agency to measure organizational-level constructs; thus, we used the ORC to gauge individual perceptions of intra-organizational characteristics.
Clinician Demographics and Attitudes Questionnaire (CDAQ; Beidas, Barmish, & Kendall, 2009)
The CDAQ is a 15-item assessment tool that measures demographics, prior experience with CBT for youth anxiety, and attitudes towards empirically supported treatments in general, as well as attitudes towards CBT for youth anxiety specifically. Psychometrics on the CDAQ indicate acceptable reliability, with an intraclass correlation coefficient (ICC) of .91 and Spearman-Brown split-half reliability of .85 (Beidas et al., 2009).
Implementation Outcomes
Fidelity
Performance-Based Role Play (PBRP)
A structured PBRP (Dimeff et al., 2009) was used to assess participants’ fidelity (i.e., adherence and skill) in a simulated clinical setting. The PBRP consisted of a phone call with a standardized child client presenting for treatment for anxiety. Undergraduate research assistants blind to condition were trained to portray typical anxious youth (i.e., not overly compliant or defiant). Therapist participants were asked to take 8 minutes to prepare the client for an exposure task, a competency central to CBT. Three vignettes (used in a previous training study; Beidas et al., 2009) representative of anxious youth were created and rated by six child anxiety treatment experts to ensure comparability. Each participant was randomly assigned a vignette order for the three PBRPs. The role-plays were digitally recorded and later independently coded.
Adherence and Skill Checklist (ASCL; Beidas et al., 2009)
The ASCL was developed to measure both adherence to the content of CBT for youth anxiety and skill in its delivery. Adherence was assessed by coding the presence of six core CBT competencies: (a) identification of somatic symptoms, (b) identification of anxious cognitions, (c) relaxation, (d) coping thoughts, (e) problem-solving, and (f) positive reinforcement. Skill was evaluated on a Likert scale from 1 (not well) to 7 (very well) in response to the question: “How skillful was the clinician’s performance in preparing the child for the exposure task using the cognitive-behavioral framework?” Note that this manner of rating skill is specific to the application of the specific CBT components, making it related to adherence. In other words, if a therapist did not adhere to the core CBT competencies, they would be unable to score highly on the skill variable.
Coders (one doctoral level psychology graduate student, two post-undergraduates, one honors undergraduate) were trained through readings, didactics, and supervised practice with feedback. Coders were blind to hypotheses, training condition, and time-point of the assessment. Rated adherence (ICC = .98) and skill (ICC = .92) demonstrated strong inter-rater reliability.
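To make the scoring scheme concrete, the sketch below illustrates one plausible way an ASCL-style adherence tally could be computed from the six coded components. This is a minimal illustration under our own assumptions (hypothetical function and variable names; the published checklist’s exact scoring rules may differ), not the coding system used in the study.

```python
# Hypothetical illustration of an ASCL-style adherence tally; not the study's scoring code.
CORE_COMPONENTS = (
    "somatic_symptoms", "anxious_cognitions", "relaxation",
    "coping_thoughts", "problem_solving", "positive_reinforcement",
)

def adherence_count(coded_presence: dict) -> int:
    """Count how many of the six core CBT components a coder marked as present."""
    return sum(1 for component in CORE_COMPONENTS if coded_presence.get(component, False))

# Example: a role-play in which only relaxation and coping thoughts were observed.
example = {"relaxation": True, "coping_thoughts": True}
print(adherence_count(example))  # 2 of 6 components present
# Skill, by contrast, is a single 1-7 rating assigned by the coder; it is not derived from
# the component count, although low adherence constrains how high skill can be rated.
```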
Penetration
Identification and Treatment of Anxious Youth (ITAY; Benjamin, Beidas, Edmunds, Cohen, & Kendall, 2010)
The ITAY is a self-report measure that assesses primary treatment setting, rates of treatment use since ending consultation, types of treatment modalities used, and barriers to and facilitators of treatment use. The measure includes both closed-ended questions and 7-point Likert scales. In this study, consistent with the definition provided by Proctor and colleagues (2010), we defined penetration as the percentage of anxious youth treated with CBT over the past 3 months (i.e., the number of anxious youth treated with CBT divided by the number of anxious youth treated overall). Participants completed the ITAY 3 months following training (i.e., at the postconsultation assessment).
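Stated as a formula, this operational definition (restated here for clarity; the notation is ours, not the ITAY’s) is:

$$
\text{Penetration} = \frac{\text{number of anxious youth treated with CBT in the past 3 months}}{\text{number of anxious youth treated in the past 3 months}}
$$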
Assessment Procedure
At pretraining, participants completed measures of their demographics, attitudes, organizational characteristics, and the PBRP. Following training and 3 months of consultation, participants completed an assessment evaluating penetration of CBT for child anxiety and the PBRP.
Training and Consultation
Participants were randomized to training conditions: (1) routine training: a one-day workshop that covered the specific manual and procedures of CBT for child anxiety (i.e., training as usual), (2) computer training: computer training on CBT for child anxiety that was accomplished through a commercially-developed interactive DVD, and (3) augmented training: a one-day workshop that included a focus on principles of CBT and active learning (including behavioral role-play exercises). Given that no significant differences between the three conditions were identified on therapist fidelity, we chose to collapse across training conditions in our analyses (see Beidas et al., 2012).
Participants from all three conditions were provided weekly consultation via the WebEx virtual conferencing platform for three months following training. Participants could call in via telephone or computer to attend the 1-hour weekly virtual meeting. Those who used their computer were able to view a white-board and the individual leading the consultation via web camera. Consultation curriculum was designed with participant input and included case consultation, didactic topics (e.g., treating a client with comorbid depression), practice with concepts (e.g., relaxation), and assistance in implementation of the treatment within context (e.g., psychiatry clinic, school). Generally, consultation consisted of thirty minutes of client discussion and thirty minutes of didactics and behavioral rehearsal. Consultation included both instructor-led structured material and unstructured peer-guided material (Weingardt, Cucciare, Bellotti, & Lai, 2009). On average, the 108 consultation sessions lasted 52.57 minutes (SD = 10.79, range = 22–65) and had an average of 7.83 participants (SD = 4.52, range = 1–20). Number of cases discussed per call averaged 2.69 (SD = 1.90, range = 0–7). Participants attended an average of 7.15 consultations (SD = 3.17; range = 0–10) in the three-month period following their training (i.e., between the posttraining and postconsultation assessment). We refer the reader to Edmunds et al. (2013) for details regarding consultation.
Data analytic plan
The primary analyses were three multiple linear regressions. The three outcome variables included fidelity (i.e., adherence and skill) and penetration. The set of predictors included the following: adopter characteristics (i.e., experience, age, attitudes; as reported on the EBPAS and CDAQ) and individual perceptions of intra-organizational variables (i.e., motivation for change, resources, staff attributes, organizational climate, and training climate; as reported on the ORC).
All statistical assumptions were met, and no significant differences in inner context factors at pretraining were found when comparing conditions. ITT analyses were conducted and all participants (N = 115) were included, with the last observation carried forward for therapists who did not complete study assessment time-points (n = 13). For measures where there were missing data at pre-training, the mean of the condition was computed and imputed. Study completers were those who completed all three PBRPs. Of the 115 participants, 13 (11%) did not complete all PBRPs. There was no significant difference among conditions in study completion (χ2 (2) = .11, p = .95).
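As an illustration of this analytic approach, the sketch below shows how one of the three hierarchical regressions (e.g., the adherence model) could be specified, with baseline adherence and skill entered as covariates and the inner context variables entered as a second block. This is a minimal sketch under assumed, hypothetical column names and is not the study’s analysis code.

```python
# Minimal sketch of one hierarchical regression (adherence model); hypothetical column names.
import pandas as pd
import statsmodels.api as sm

BASELINE = ["baseline_adherence", "baseline_skill"]   # covariates entered first
INNER_CONTEXT = [                                     # adopter + intra-organizational predictors
    "experience_months", "age",
    "ebpas_appeal", "ebpas_divergence", "ebpas_openness", "ebpas_requirements",
    "cdaq_opinion", "cdaq_confidence", "cdaq_motivation", "cdaq_usefulness",
    "orc_motivation", "orc_resources", "orc_staff_attributes",
    "orc_organizational_climate", "orc_training_climate",
]

def r_square_change(df: pd.DataFrame, outcome: str) -> float:
    """Fit the baseline-only and full models and return the R-square change."""
    y = df[outcome]
    baseline_model = sm.OLS(y, sm.add_constant(df[BASELINE])).fit()
    full_model = sm.OLS(y, sm.add_constant(df[BASELINE + INNER_CONTEXT])).fit()
    return full_model.rsquared - baseline_model.rsquared

# Example usage (df holds one row per therapist after LOCF/mean imputation):
# delta_r2 = r_square_change(df, "post_adherence")
```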
RESULTS
Fidelity: Adherence and Skill
As reported previously (Beidas et al., 2012), the RM-ANOVA demonstrated that there was a main effect of time such that therapist adherence (F(2, 222) = 100.21, p < .001) and skill (F(2, 222) = 79.90, p < .001) improved after receiving the training and consultation package.
Penetration
Over the 3 months following training (i.e., the time period between the posttraining and postconsultation assessments), as measured by the ITAY (see Footnote 1), participants reported treating an average of 2.63 anxious youth (SD = 7.28, range = 0–65). A total of 242 anxious youth were treated by the sample of clinicians in three months; participants reported providing CBT to 79% of these youth (n = 191). Other modalities included relaxation alone (n = 36, 62.1%), supportive therapy (n = 29, 50%), family therapy (n = 25, 43.1%), play therapy (n = 18, 31%), peer support/group intervention (n = 14, 24.1%), and other (n = 11, 19%). Participants reported delivering about 8.11 (SD = 9.61) sessions per youth and moderate involvement of parents in treatment (M = 3.48, SD = 1.28; 7 = highest involvement).
Relationship between adopter characteristics, individual perceptions of intra-organizational characteristics, and adherence
A multiple regression examined the relationship between adherence and potential predictors, while controlling for baseline adherence and skill. A statistically significant relationship between the set of predictors and adherence was found (F(19, 108) = 2.29, p = .005). The R² change associated with the added predictors was .188. Using a proportional reduction in error interpretation of R², the information provided by the added variables reduces error in predicting adherence by 18.8%. As shown in Table 1, one intra-organizational factor (i.e., organizational climate) had a significant positive regression weight, indicating that as individual perceptions of organizational climate increased, adherence also increased. Two individual adopter characteristics had significant negative weights, indicating that as experience in months and scores on the EBPAS requirements scale increased, adherence decreased. Other predictors were not significant.
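For clarity, the R-square change statistic and its proportional reduction in error reading follow the standard definition (a restatement in our own notation, not output from the study’s analyses):

$$
\Delta R^2 = R^2_{\text{full}} - R^2_{\text{baseline}} = \frac{SS_{\text{res}}^{\text{baseline}} - SS_{\text{res}}^{\text{full}}}{SS_{\text{total}}}
$$

Thus, the reported ΔR² = .188 indicates that adding the inner context variables reduces squared prediction error for adherence by 18.8% of the total variance, relative to the model containing only the baseline covariates.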
Table 1.
Predictor Variable (N = 109) | Mean (SD) | b | β |
---|---|---|---|
Demographics | |||
Experience (months) | 65.14 (82.55) | −.006 | −.304* |
Age | 35.95 (11.44) | .015 | .110 |
EBPAS | |||
EBPAS appeal | 3.34 (.62) | .393 | .157 |
EBPAS divergence^ | .89 (.53) | −.078 | −.027 |
EBPAS openness | 3.15 (.58) | .203 | .076 |
EBPAS requirement | 2.75 (1.01) | −.465 | −.302** |
CDAQ | |||
CDAQ opinion | 5.98 (1.07) | .235 | .162 |
CDAQ confidence | 5.42 (1.49) | −.031 | −.029 |
CDAQ motivation | 6.56 (.76) | −.369 | −.181 |
CDAQ usefulness | 6.63 (.62) | −.025 | −.010 |
ORC | |||
Motivation for Change | 33.31 (5.51) | −.007 | −.024 |
Resources | 33.62 (4.83) | −.018 | −.057 |
Staff Attributes | 36.18 (3.69) | −.101 | −.240 |
Organizational Climate | 33.33 (3.11) | .129 | .259* |
Training Climate | 23.70 (6.19) | .050 | .197 |
Note. EBPAS = Evidence Based Practice Attitude Scale (Aarons, 2004); CDAQ = Clinician Demographics and Attitudes Questionnaire (Beidas et al., 2009); ORC = Organizational Readiness for Change (Lehman et al., 2002). Note that higher scores on the EBPAS reflect more highly rated appeal, divergence, openness and requirements (range 0–4). Higher scores on the CDAQ reflect more positive opinions, confidence, motivation to learn, and belief in the utility of cognitive-behavioral therapy for child anxiety (range 1–7). Higher scores on the ORC reflect more highly rated motivation for change, resources, staff attributes, organizational climate, and training climate (range 10–50).
^The mean of the divergence subscale is reverse scored.
*p ≤ .05, **p ≤ .01, ***p ≤ .001. Note these analyses control for baseline skill and adherence.
Relationship between adopter characteristics, individual perceptions of intra-organizational characteristics, and skill
A multiple regression examined the relationship between skill and potential predictors, while controlling for baseline adherence and skill. A statistically significant relationship between the set of predictors and skill was found (F(19, 108) = 1.879, p = .026). The R² change associated with the added predictors was .165. Using a proportional reduction in error interpretation of R², the information provided by the added variables reduces error in predicting skill by 16.5%. As shown in Table 2, one individual adopter characteristic had a significant negative weight, indicating that as scores on the EBPAS requirements scale increased, skill decreased (see Footnote 2). Other predictors were not significant.
Table 2.
Predictor Variable (N = 109) | Mean (SD) | b | β |
---|---|---|---|
Demographics | |||
Experience (months) | 65.14 (82.55) | −.003 | −.160 |
Age | 35.95 (11.44) | .000 | −.003 |
EBPAS | |||
EBPAS appeal | 3.34 (.62) | .419 | .182 |
EBPAS divergence^ | .89 (.53) | −.249 | −.094 |
EBPAS openness | 3.15 (.58) | .216 | .088 |
EBPAS requirement | 2.75 (1.01) | −.423 | −.300** |
CDAQ | |||
CDAQ opinion | 5.98 (1.07) | .337 | .254 |
CDAQ confidence | 5.42 (1.49) | −.057 | −.059 |
CDAQ motivation | 6.56 (.76) | −.591 | −.317 |
CDAQ usefulness | 6.63 (.62) | 1.93 | .084 |
ORC | |||
Motivation for Change | 33.31 (5.51) | −.001 | −.004 |
Resources | 33.62 (4.83) | −.016 | −.055 |
Staff Attributes | 36.18 (3.69) | −.046 | −.120 |
Organizational Climate | 33.33 (3.11) | .080 | .175 |
Training Climate | 23.70 (6.19) | .015 | .064 |
Note. EBPAS = Evidence Based Practice Attitude Scale (Aarons, 2004); CDAQ = Clinician Demographics and Attitudes Questionnaire (Beidas et al., 2009); ORC = Organizational Readiness for Change (Lehman et al., 2002). Note that higher scores on the EBPAS reflect more highly rated appeal, divergence, openness and requirements (range 0–4). Higher scores on the CDAQ reflect more positive opinions, confidence, motivation to learn, and belief in the utility of cognitive-behavioral therapy for child anxiety (range 1–7). Higher scores on the ORC reflect more highly rated motivation for change, resources, staff attributes, organizational climate, and training climate (range 10–50).
^The mean of the divergence subscale is reverse scored.
*p ≤ .05, **p ≤ .01, ***p ≤ .001. Note these analyses control for baseline skill and adherence.
Relationship between adopter characteristics, individual perceptions of intra-organizational characteristics, and penetration
A multiple regression examined the relationship between penetration and potential predictors, while controlling for baseline adherence and skill. A statistically significant relationship between the set of predictors and penetration was not found (F(19, 88) = 1.205, p = .280). Means and regression weights are provided in Table 3.
Table 3.
Predictor Variable (N = 89) | Mean (SD) | b | β |
---|---|---|---|
Demographics | |||
Experience (months) | 64.21 (81.30) | .001 | .129 |
Age | 35.54 (11.69) | .001 | .030 |
EBPAS | |||
EBPAS appeal | 3.38 (.55) | −.136 | −.161 |
EBPAS divergence^ | .91 (.54) | .206 | .239 |
EBPAS openness | 3.16 (.57) | .060 | .074 |
EBPAS requirement | 2.72 (.97) | .069 | .142 |
CDAQ | |||
CDAQ opinion | 6.02 (1.02) | −.101 | −.221 |
CDAQ confidence | 5.57 (1.35) | .013 | .038 |
CDAQ motivation | 6.57 (.71) | .130 | .196 |
CDAQ usefulness | 6.60 (.63) | −.036 | −.049 |
ORC | |||
Motivation for Change | 33.32 (5.27) | −.09 | −.098 |
Resources | 33.59 (5.08) | .019 | .203 |
Staff Attributes | 36.12 (3.69) | −.001 | −.009 |
Organizational Climate | 33.39 (3.25) | −.021 | −.148 |
Training Climate | 23.68 (6.42) | .008 | .111 |
Note. EBPAS = Evidence Based Practice Attitude Scale (Aarons, 2004); CDAQ = Clinician Demographics and Attitudes Questionnaire (Beidas et al., 2009); ORC = Organizational Readiness for Change (Lehman et al., 2002). Note that higher scores on the EBPAS reflect more highly rated appeal, divergence, openness and requirements (range 0–4). Higher scores on the CDAQ reflect more positive opinions, confidence, motivation to learn, and belief in the utility of cognitive-behavioral therapy for child anxiety (range 1–7). Higher scores on the ORC reflect more highly rated motivation for change, resources, staff attributes, organizational climate, and training climate (range 10–50).
^The mean of the divergence subscale is reverse scored.
*p ≤ .05, **p ≤ .01, ***p ≤ .001. Note these analyses control for baseline skill and adherence.
DISCUSSION
Therapist fidelity (i.e., adherence and skill), as rated by independent evaluators during a performance-based role-play, significantly improved after therapists received the training and consultation package, and therapists reported high penetration (i.e., using CBT with a majority of their anxious youth patients) following training. We were interested in exploring whether inner context variables as defined by the EPIS model, specifically adopter characteristics and individual perceptions of intra-organizational factors, were related to implementation outcomes during the implementation phase of EPIS. Adopter characteristics were associated with fidelity (both adherence and skill) but not penetration. Individual perceptions of intra-organizational variables were related to fidelity (adherence specifically) but not skill or penetration. Similar to previous investigations (e.g., Garner et al., 2012), these results suggest a nuanced relationship between adopter characteristics and individual perceptions of intra-organizational variables on the one hand and implementation outcomes on the other. These results partially support the EPIS framework, which identifies a number of inner context factors that may be important at different stages of the implementation process (exploration, preparation, implementation, and sustainment). Additionally, the results link inner context factors to particular implementation outcomes from the Proctor et al. (2010) framework, which further advances the conceptual model.
Therapists reported high penetration – in other words, using CBT for child anxiety with the majority of anxious youth with whom they worked. However, they also reported using a variety of other modalities in the treatment of anxious youth, including non-evidence-based approaches (e.g., play therapy). These reports call into question the accuracy of therapist report with regard to what they consider to be CBT. It is possible that therapists rated themselves as using CBT if they used any components of the treatment rather than the whole package. These findings are consistent with previous studies suggesting that therapists use a wide range of treatment strategies with youth (Garland et al., 2010). However, little is known about how mixing eclectic strategies with EBP affects child outcomes. Therapists reported providing about 8 sessions in 3 months, which is a lower dosage than one might expect given that CBT for child anxiety is intended to be a weekly treatment (typically ranging from 12–16 sessions in 3 months; Kendall, Hudson, Gosch, Flannery-Schroeder, & Suveg, 2008; Walkup et al., 2008).
Adopter characteristics were associated with fidelity, including both adherence and skill. Specifically, certain therapist attitudes and demographics predicted adherence. One specific attitude, being more likely to adopt an EBP if required to do so by one’s organization, was related to lower adherence and skill scores after training and consultation. This finding may be explained by the fact that virtually none of the therapists who participated were mandated or required to use this treatment; rather, they volunteered to learn it. Therefore, they may have been less motivated to apply the treatment in an adherent or skillful manner given that there were no organizational pressures to apply the treatment with adherence. Only one demographic variable was associated with adherence: as therapist experience increased, adherence to CBT for child anxiety decreased. This may be due to experience serving as a proxy for increased allegiance to other theoretical orientations. This finding is corroborated by other studies showing that therapists with more clinical experience required more intensive training to become more adherent and competent (e.g., Martino et al., 2011). However, other studies have noted that having more clinical experience may be related to higher adherence and competence after training (Carpenter et al., 2012). It may be that other factors (e.g., the type of previous experience or the type of new skills being taught) influence whether the amount of previous experience improves or hinders implementation outcomes. Other important demographic variables noted to be related to adherence and skill in similar studies, including longevity at the organization, additional treatment certifications (Garner et al., 2012), and anxiety sensitivity (Harned, Dimeff, Woodcock, & Contreras, in press), were not measured in this study. It is important to note that the literature shows the least clarity with regard to which demographics consistently predict fidelity, and this may be due to unmeasured third variables (i.e., inner- and outer-context variables). Future research that examines the interplay between and across inner- and outer-context variables on implementation outcomes is critical.
One individual perception of intra-organizational variables was associated with adherence. Interestingly, therapists who reported more positive organizational climates at their agency were more adherent to CBT for youth anxiety after training and consultation. This finding is consistent with previous research showing a relationship between therapist perceptions of certain aspects of organizational climate (e.g., job satisfaction, growth and advancement) and caregiver-reported therapist adherence in the delivery of MST (Schoenwald, Chapman, Sheidow, & Carter, 2009). One potential explanation is that therapists in more ideal organizational climates may be better able to focus on applying EBPs in an adherent manner than therapists in chaotic and less ideal climates. Grounding this finding in other research showing that more engaged organizational climates are associated with desirable treatment outcomes in youth (Glisson & Green, 2011) suggests that the increased adherence we observed in this study could potentially lead to improved treatment outcomes in the youths being treated. A mediational model was not supported in a study examining organizational climate, therapist adherence, and outcomes for youth treated with MST (Schoenwald et al., 2009). However, mediational models for other treatments, such as CBT for youth anxiety, may be an important area for future consideration.
Contrary to predictions, neither adopter characteristics nor individual perceptions of intra-organizational variables were associated with penetration. Although these findings suggest that clinicians can successfully use evidence-based treatments regardless of their demographics, attitudes, or organizational climate, they are at odds with an ecological perspective that hypothesizes the importance of inner context variables when implementing a treatment (Aarons et al., 2012). It may be that these specific inner context variables are not predictive of penetration and that other, unmeasured inner context variables (e.g., innovation-values fit) would be more predictive. The lack of findings between inner context variables and penetration could also be due to one of the following issues. First, the study was underpowered to detect significant findings on the penetration variable because fewer participants completed the penetration measure. Second, there may be a measurement problem, particularly with the measurement of intra-organizational factors, given that these were individual perceptions of intra-organizational factors rather than aggregated organizational factors. Third, penetration was measured using self-report, which is a known limitation. The current findings add to a decidedly mixed literature with regard to whether adopter characteristics and intra-organizational factors predict penetration and are somewhat inconsistent with previous studies. For example, studies have found that perceived setting barriers are negatively correlated with implementation outcomes (Harned et al., in press; Lewis & Simons, 2011). Discrepancies across studies call into question the adequacy of measurement as well as the potential failure to investigate important third variables, such as client factors. Given that no clear answer has been reached regarding the interrelations among individual perceptions of intra-organizational variables and implementation outcomes, and given the importance of fostering long-term implementation of EBPs across a variety of organizational settings, further research in this area is important.
The present study has numerous strengths, including the randomization of training conditions, high participant completion rates, and the assessment of inner context factors. Nonetheless, limitations should be noted. A first limitation has to do with the randomization process. We randomly assigned training dates to condition rather than randomly assigning therapists to condition for pragmatic reasons. However, it is possible that non-randomness could have been introduced into the study design. For example, a therapist who signed up for the first date might be more motivated than a therapist who signed up for a later date. Also, because training occurred while recruitment was still ongoing, some therapists were prevented from signing up for earlier slots. Another weakness, although typical in this literature, is that a number of measures were investigator-created (e.g., Dimeff et al., 2009). Although we used an innovative way to measure fidelity, it is not the traditional manner through which fidelity is measured and did not include therapist fidelity observed during actual therapy sessions. To our knowledge, there is no information on the convergence between fidelity as rated through a behavioral role-play and actual in-session behavior (Beidas, Cross, & Dorsey, in press). Also, our penetration measure relied on self-report rather than objective measures. This is a limitation given research suggesting low concordance between self-report and observational data of session content (Hurlburt, Garland, Nguyen, & Brookman-Frazee, 2010).
Although the study included an organizational measure, organizational climate may not have been optimally assessed given that other measures exist to assess these constructs (Glisson, 2002). Aggregating across multiple ratings of organizational factors provides a better estimate of organizational culture and climate than individual accounts, yet this approach was not used in the present study because we did not have a large enough number of participants working in the same organizations. The current study may be more accurately described as an examination of individual therapists’ perceptions of their workplaces’ organizational climate and culture. Additional potential limitations concern the study’s generalizability. Its focus on CBT for youth anxiety may not generalize to training in other EBPs. Also, as previously mentioned, the participants were self-referred and motivated to volunteer for training. Different findings may have emerged if participants had included less motivated individuals who were mandated to receive training. Further, only 54 of the 115 participants identified an anxious youth to treat during the consultation period following training. This is curious given that these individuals sought out this type of specialty training and reported that they either currently treated or planned to treat anxious youth in the future. This may be due to the difficulty of identifying “pure” anxiety cases in community settings (Beidas et al., 2012; Ehrenreich-May et al., 2010; Southam-Gerow, Silverman, & Kendall, 2006).
Conclusion and Future Directions
The EPIS framework posits that inner context variables, such as adopter characteristics and intra-organizational variables, are likely to impact implementation outcomes (Aarons et al., 2012). The present results found partial support for this perspective. Future work is necessary to further delineate the relationships among inner context variables and implementation outcomes. Future studies would benefit from seeking a wider range of participants, including self-referred and mandated participants, in order to tease apart whether inner context variables differentially influence outcomes across these groups. Improved measurement of inner context variables and implementation outcomes across different stages of the implementation process would allow for greater clarity regarding the relationships among these factors. Specifically, outcome measures should be varied (Proctor et al., 2011) and should include objective ratings of fidelity and penetration in addition to self-report measures. Measurement of organizational variables should include multiple raters within an organization in order to determine whether aggregate measures differentially predict training outcomes as compared to individual perceptions of organizational variables. In addition to investigating whether inner context variables predict implementation outcomes, investigations can examine the relations between inner context variables and specific training techniques (e.g., whether learning style moderates the relation between training techniques and outcomes). Greater understanding of these relationships is needed to devise optimal training approaches that can be used to increase the chances youth receive optimal care regardless of the setting in which they are served.
ACKNOWLEDGEMENTS
This research project was supported by the following grants from NIMH: MH083333 and MH099179 (Beidas) and MH086436 and MH063747 (Kendall). Additionally, the preparation of this article was supported in part by the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, through an award from the National Institute of Mental Health (R25 MH080916) and the Quality Enhancement Research Initiative (QUERI), Department of Veterans Affairs Contract, Veterans Health Administration, Office of Research & Development, Health Services Research & Development Service. Dr. Beidas is an IRI fellow.
Footnotes
This paper was presented at the National Institutes of Health Dissemination and Implementation Conference in 2010.
1. Ninety-two participants completed the ITAY.
2. Given the finding that both adherence and skill were negatively related to EBPAS requirements, it is of note that if a therapist was rated as very low on adherence, they would also be rated low on skill.
REFERENCES
- Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS) Mental Health Services Research. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons GA, Hurlburt M, Horowitz S. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes towards evidence-based practice. Psychological Services. 2006;3(1):61–72. doi: 10.1037/1541-1559.3.1.61. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Addis ME, Krasnow AD. A national survey of practicing psychologists' attitudes toward psychotherapy treatment manuals. Journal of Consulting and Clinical Psychology. 2000;68(2):331–339. doi: 10.1037//0022-006x.68.2.331. [DOI] [PubMed] [Google Scholar]
- American Psychological Association. Evidence-based practice in psychology. American Psychologist. 2006;61:271–285. doi: 10.1037/0003-066X.61.4.271. [DOI] [PubMed] [Google Scholar]
- Barwick MA, Schachter HM, Bennett LM, McGowan J, Ly M, Wilson A, Manion I. Knowledge translation efforts in child and youth mental health: a systematic review. Journal of Evidence-Based Social Work. 2012;9(4):369–395. doi: 10.1080/15433714.2012.663667. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, Barmish AJ, Kendall PC. Training as usual: Can therapist behavior change after reading a manual and attending a brief workshop on cognitive behavioral therapy for youth anxiety? The Behavior Therapist. 2009;32:97–101. [Google Scholar]
- Beidas RS, Cross W, Dorsey S. Show me, don’t tell me: Behavioral rehearsal as a training and analogue fidelity tool. Cognitive and Behavioral Practice. doi: 10.1016/j.cbpra.2013.04.002. in press. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: a randomizedtrial. Psychiatriac Services. 2012;63(7):660–665. doi: 10.1176/appi.ps.201100401. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17(1):1–30. doi: 10.1111/j.1468-2850.2009.01187.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Beidas RS, Suarez L, Simpson D, Read K, Wei C, Connolly S, Kendall P. Contextual factors and anxiety in minority and European American youth presenting for treatment across two urban university clinics. Journal of Anxiety Disorders. 2012;26:544–554. doi: 10.1016/j.janxdis.2012.02.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Benjamin CL, Beidas RS, Edmunds JM, Cohen J, Kendall P. Identification and treatment of anxious youth. Pennsylania: Unpublished instrument.Department of Psychology, Temple University, Philadelphia; 2010. [Google Scholar]
- Carpenter KM, Cheng WY, Smith JL, Brooks AC, Amrhein PC, Wain RM, Nunes EV. "Old dogs" and new skills: How clinician characteristics relate to motivational interviewing skills before, during, and after training. Journal of Consulting and Clinical Psychology. 2012;80(4):560–573. doi: 10.1037/a0028362. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50. doi: 10.1186/1748-5908-4-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dimeff LA, Koerner K, Woodcock EA, Beadnell B, Brown MZ, Skutch JM, Harned MS. Which training method works best? A randomized controlled trial comparing three methods of training clinicians in dialectical behavior therapy skills. Behaviour Research and Therapy. 2009;47(11):921–930. doi: 10.1016/j.brat.2009.07.011. [DOI] [PubMed] [Google Scholar]
- Edmunds JM, Beidas RS, Kendall PC. Dissemination and implementation of evidence–based practices: Training and consultation as implementation strategies. Clinical Psychology: Science and Practice. 2013;20(2):152–165. doi: 10.1111/cpsp.12031. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Edmunds JM, Kendall PC, Ringle V, Read K, Brodman DM, Pimentel SS, Beidas RS. An examination of behavioral rehearsal during consultation as a predictor of training outcomes. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40:456–466. doi: 10.1007/s10488-013-0490-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ehrenreich-May J, Southam-Gerow MA, Hourigan SE, Wright LR, Pincus DB, Weisz JR. Characteristics of anxious and depressed youth seen in two different clinical contexts. Administration and Policy in Mental Health and Mental Health Services Research. 2010 doi: 10.1007/s10488-010-0328-6. [DOI] [PubMed] [Google Scholar]
- Farmer AP, Légaré F, Turcot L, Grimshaw J, Harvey E, McGowan JL, Wolf F. Printed educational materials: effects on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews. 2008;3 doi: 10.1002/14651858.CD004398.pub2. CD004938. [DOI] [PubMed] [Google Scholar]
- Fixsen DL, Blase KA, Naoom SF, Wallace F. Core implementation components. Research on Social Work Practice. 2009;19(5):531–540. [Google Scholar]
- Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, Florida: 2005. [Google Scholar]
- Garland AF, Brookman-Frazee L, Hurlburt MS, Accurso EC, Zoffness RJ, Haine-Schlagel R, Ganger W. Mental health care for children with disruptive behavior problems: a view inside therapists' offices. Psychiatric Services. 2010;61(8):788–795. doi: 10.1176/appi.ps.61.8.788. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Garner BR, Hunter BD, Godley SH, Godley MD. Training and retaining staff to competently deliver an evidence-based practice: the role of staff attributes and perceptions of organizational functioning. Journal of Substance Abuse Treatment. 2012;42(2):191–200. doi: 10.1016/j.jsat.2011.10.016. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Glisson C. The organizational context of children's mental health services. Clinical Child and Family Psychology Review. 2002;5(4):233–253. doi: 10.1023/a:1020972906177. [DOI] [PubMed] [Google Scholar]
- Glisson C, Green P. Organizational climate, services, and outcomes in child welfare systems. Child Abuse & Neglect. 2011;35(8):582–591. doi: 10.1016/j.chiabu.2011.04.009. [DOI] [PubMed] [Google Scholar]
- Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Amstrong KS, Chapman JE. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology. 2010;78(4):537–550. doi: 10.1037/a0019160. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hawkins KA, Sinha R. Can line clinicians master the conceptual complexities of dialectical behavior therapy? An evaluation of a State Department of Mental Health training program. Journal of Psychiatric Research. 1998;32(6):379–384. doi: 10.1016/s0022-3956(98)00030-2. doi: http://dx.doi.org/10.1016/S0022-3956(98)00030-2. [DOI] [PubMed] [Google Scholar]
- Henggeler SW. Decreasing effect sizes for effectiveness studies - Implications for the transport of evidence-based treatments: Comment on Curtis, Ronan, and Borduin (2004) Journal Of Family Psychology. 2004;18(3):420–423. doi: 10.1037/0893-3200.18.3.420. [DOI] [PubMed] [Google Scholar]
- Henggeler SW, Melton GB, Brondino MJ, Scherer DG, Hanley JH. Multisystemic therapy with violent and chronic juvenile offenders and their families: The role of treatment fidelity in successful dissemination. Journal Of Consulting And Clinical Psychology. 1997;65(5):821–833. doi: 10.1037//0022-006x.65.5.821. [DOI] [PubMed] [Google Scholar]
- Herschell AD, McNeil CB, McNeil DW. Clinical child psychology's progress in disseminating empirically supported treatments. Clinical Psychology: Science and Practice. 2004;11(3):267–288.
- Hurlburt MS, Garland AF, Nguyen K, Brookman-Frazee L. Child and family therapy process: Concordance of therapist and observational perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2010;37:230–244. doi: 10.1007/s10488-009-0251-x.
- Kendall PC, Hudson J, Gosch E, Flannery-Schroeder E, Suveg C. Cognitive-behavioral therapy for anxiety disordered youth: A randomized clinical trial evaluating child and family modalities. Journal of Consulting and Clinical Psychology. 2008;76:282–297. doi: 10.1037/0022-006X.76.2.282.
- Klein KJ, Conn AB, Sorra JS. Implementing computerized technology: An organizational analysis. Journal of Applied Psychology. 2001;86(5):811–824. doi: 10.1037/0021-9010.86.5.811.
- Lehman WE, Greener JM, Simpson D. Assessing organizational readiness for change. Journal of Substance Abuse Treatment. 2002;22(4):197–209. doi: 10.1016/s0740-5472(02)00233-7.
- Lewis CC, Simons AD. A pilot study disseminating cognitive behavioral therapy for depression: therapist factors and perceptions of barriers to implementation. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(4):324–334. doi: 10.1007/s10488-011-0348-x.
- Lim A, Nakamura BJ, Higa-McMillan CK, Shimabukuro S, Slavin L. Effects of workshop trainings on evidence-based practice knowledge and attitudes among youth community mental health providers. Behaviour Research and Therapy. 2012;50(6):397–406. doi: 10.1016/j.brat.2012.03.008.
- Martino S, Canning-Ball M, Carroll K, Rounsaville B. A criterion-based stepwise approach for training counselors in motivational interviewing. Journal of Substance Abuse Treatment. 2011;40:357–365. doi: 10.1016/j.jsat.2010.12.004.
- Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. Journal of Consulting and Clinical Psychology. 2004;72(6):1050–1062. doi: 10.1037/0022-006X.72.6.1050.
- Nadeem E, Gleacher A, Beidas RS. Consultation as an implementation strategy for evidence-based practices across multiple contexts: Unpacking the black box. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40:439–450. doi: 10.1007/s10488-013-0502-8.
- Perepletchikova F, Kazdin AE. Treatment integrity and therapeutic change: Issues and research recommendations. Clinical Psychology: Science and Practice. 2005;12(4):365–383.
- Perepletchikova F, Treat TA, Kazdin AE. Treatment integrity in psychotherapy research: Analysis of the studies and examination of the associated factors. Journal of Consulting and Clinical Psychology. 2007;75(6):829–841. doi: 10.1037/0022-006X.75.6.829.
- Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, York JL. A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2011;69(2):123–157. doi: 10.1177/1077558711430690.
- President's New Freedom Commission on Mental Health. Report of the President's New Freedom Commission on Mental Health. Washington, DC: 2003.
- Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Hensley M. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(2):65–76. doi: 10.1007/s10488-010-0319-7.
- Rakovshik SG, McManus F. Establishing evidence-based training in cognitive behavioral therapy: A review of current empirical findings and theoretical guidance. Clinical Psychology Review. 2010;30(5):496–516. doi: 10.1016/j.cpr.2010.03.004.
- Saldana L, Chamberlain P. Supporting implementation: The role of community development teams to build infrastructure. American Journal of Community Psychology. 2012;50:334–346. doi: 10.1007/s10464-012-9503-0.
- Saldana L, Chapman JE, Henggeler SW, Rowland MD. The Organizational Readiness for Change scale in adolescent programs: Criterion validity. Journal of Substance Abuse Treatment. 2007;33(2):159–169. doi: 10.1016/j.jsat.2006.12.029.
- Sanders MR, Prinz RJ, Shapiro CJ. Predicting utilization of evidence-based parenting interventions with organizational, service-provider and client variables. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36(2):133–143. doi: 10.1007/s10488-009-0205-3.
- Sanders MR, Turner KMT. Reflections on the challenges of effective dissemination of behavioural family intervention: Our experience with the Triple P-Positive Parenting Program. Child and Adolescent Mental Health. 2005;10(4):158–169. doi: 10.1111/j.1475-3588.2005.00367.x.
- Schoenwald SK, Chapman JE, Sheidow AJ, Carter RE. Long-term youth criminal outcomes in MST transport: The impact of therapist adherence and organizational climate and structure. Journal of Clinical Child and Adolescent Psychology. 2009;38(1):91–105. doi: 10.1080/15374410802575388.
- Schoenwald SK, Chapman JE, Garland AF. Capturing fidelity. In: Beidas RS, Kendall PC, editors. Child and Adolescent Therapy: Dissemination and Implementation of Empirically Supported Treatments. First edition. Oxford University Press; in press.
- Shafran R, Clark DM, Fairburn CG, Arntz A, Barlow DH, Ehlers A, Wilson GT. Mind the gap: Improving the dissemination of CBT. Behaviour Research and Therapy. 2009;47(11):902–909. doi: 10.1016/j.brat.2009.07.003.
- Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don't train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting and Clinical Psychology. 2005;73(1):106–115. doi: 10.1037/0022-006X.73.1.106.
- Southam-Gerow MA, Silverman WK, Kendall PC. Client similarities and differences in two childhood anxiety disorders research clinics. Journal of Clinical Child & Adolescent Psychology. 2006;35:528–538. doi: 10.1207/s15374424jccp3504_4.
- Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science. 2012;7:17. doi: 10.1186/1748-5908-7-17.
- Taxman FS, Belenko S. Implementing evidence-based practices in community corrections and addiction treatment. New York: Springer; 2012.
- Turner KMT, Sanders MR. Dissemination of evidence-based parenting and family support strategies: Learning from the Triple P-Positive Parenting Program system approach. Aggression and Violent Behavior. 2006;11(2):176–193. doi: 10.1016/j.avb.2005.07.005.
- Walkup JT, Albano AM, Piacentini J, Birmaher B, Compton SN, Sherrill JT, …Kendall PC. Cognitive behavioral therapy, sertraline, or a combination in childhood anxiety. New England Journal of Medicine. 2008;359(26):2753–2766. doi: 10.1056/NEJMoa0804633.
- Waltz J, Addis ME, Koerner K, Jacobson NS. Testing the integrity of a psychotherapy protocol: Assessment of adherence and competence. Journal of Consulting and Clinical Psychology. 1993;61(4):620–630. doi: 10.1037//0022-006x.61.4.620.
- Weisz JR, Sandler IN, Durlak JA, Anton BS. Promoting and protecting youth mental health through evidence-based prevention and treatment. American Psychologist. 2005;60(6):628–648. doi: 10.1037/0003-066X.60.6.628.
- Williams C, Martinez R, Dafters R, Ronald L, Garland A. Training the wider workforce in cognitive behavioural self-help: the SPIRIT (Structured Psychosocial InteRventions in Teams) training course. Behavioural and Cognitive Psychotherapy. 2011;39(2):139–149. doi: 10.1017/S1352465810000445.