Abstract
The goal of this study was to identify therapist and organizational characteristics associated with participation in evidence-based practice (EBP) training initiatives sponsored by a large publicly funded mental health system. Self-report data from therapists (N = 247) nested within 28 mental health clinics were collected in 2015. Results from regression analyses indicated that length of time employed at an organization was associated with individual therapist participation in an EBP initiative. Organizational climate and implementation climate were associated with organizational participation in an EBP initiative: organizations characterized by higher levels of stress were more likely to participate in EBP initiatives, whereas organizations characterized by higher engagement and greater educational support for EBPs were less likely to participate. Implications include the need for systems to consider organizational characteristics when selecting organizations for EBP initiatives.
Keywords: Implementation, community mental health services, evidence-based practice, organizational characteristics
In recent years, there has been an emphasis on the implementation of evidence-based practices (EBPs) in routine care settings. This focus is due to a growing understanding that EBPs are not reaching the most vulnerable, particularly those served by the public health and mental health systems.1,2 For example, despite there being over 1200 evidence-based treatments for childhood psychiatric disorders,3 these treatments are rarely found in public mental health settings.3,4 Implementation science, the study of methods to promote the systematic uptake of research,5 was developed to understand how to more effectively implement EBPs in community settings. Conceptual frameworks,6,7 corroborated by empirical evidence,8–10 suggest that EBP implementation necessitates an understanding of the individuals (e.g., provider knowledge and attitudes),11–13 organizations (e.g., organizational culture and climate),14,15 and systems (e.g., financing)16,17 involved, as well as the particular stage of implementation (i.e., exploration, preparation, implementation, and sustainment; the EPIS framework).6,7 However, despite the growing body of literature on characteristics associated with implementation success, less attention has been paid to understanding which individuals and organizations choose to implement (i.e., adopt) EBPs in the community, particularly within systems that do not mandate EBP implementation.
Determining which individuals and organizations adopt EBPs is an important piece of the dissemination and implementation puzzle, particularly in the exploration and preparation phases of the implementation process. Prior research indicates that several individual clinician characteristics are associated with adoption, including positive attitudes toward EBP,18 being an older provider, and having higher educational attainment.18,19 In otherwise supportive organizations, reasons why individuals choose not to adopt EBP include barriers such as time, resources, and misconceptions about EBP.20,21 At the organizational level, the evidence is equivocal: some studies have found that adoption and implementation of EBP correlate with more positive organizational climates,22,23 whereas another study found a negative relationship.18 Importantly, it is also possible that implementing an EBP can itself disrupt organizational culture and climate.24
Given that the literature on EBP adoption is limited, conflicting, and often theoretical, empirical research investigating the characteristics of individuals and organizations that do and do not participate in EBP implementation in public behavioral health settings is needed. Such research is particularly relevant and timely given recent efforts by state and county behavioral health systems to increase the availability of EBPs through legislation, mandates, and system-funded training opportunities.25–29 The Philadelphia Department of Behavioral Health and Intellectual disAbility Services (DBHIDS) is one such behavioral health system that has engaged in an effort to encourage the implementation of EBPs. Since 2007, DBHIDS has supported and funded a number of training initiatives to increase the availability of cognitive behavioral therapy (CBT) throughout behavioral health organizations in Philadelphia. These efforts present a natural laboratory within which to investigate the characteristics of individuals and organizations that participate in EBP initiatives.
The goals of the current study are to examine (1) whether therapist characteristics predict therapist participation in EBP initiatives and (2) whether organizational characteristics predict organizational participation in EBP initiatives in public mental health settings. Variables were selected based on previous research and implementation science frameworks highlighting the impact of both individual and organizational factors on the implementation and delivery of EBPs.6,30,31 Therapist characteristics included demographics (e.g., age, educational background, and length of time at organization), burnout, attitudes toward and knowledge of EBP, and use of specific therapy skills. Organizational-level variables included organizational culture, organizational climate, and implementation climate. Organizational culture and climate are two factors shown to be related to attitudes toward adoption of innovation in general, and toward EBP in particular.19 Implementation climate, defined as staff beliefs that innovation adoption, implementation, and use are expected, rewarded, and supported by the organization,32 may also be an important predictor of participation in an EBP initiative. Given the limited existing research, hypotheses were exploratory.
Method
Setting
DBHIDS has supported four specific EBP initiatives: cognitive therapy,33,34 prolonged exposure, trauma-focused cognitive behavioral therapy,35 and dialectical behavior therapy, across a variety of care settings (e.g., outpatient, inpatient, residential), with two of the initiatives focused on child services (i.e., cognitive therapy and trauma-focused cognitive behavioral therapy). A full-time employee within DBHIDS coordinates implementation, training, and ongoing consultation by treatment developers. Collectively, the initiatives have supported and financed training, expert consultation, and ongoing technical assistance in over 50 organizations in the Philadelphia system.36 Over the course of implementation, the process employed by DBHIDS for selecting organizations to participate in initiatives has become more formal. Initially, selection of organizations was not uniform and was largely guided by DBHIDS (e.g., larger or excelling organizations were selected); more recently, organizations have applied through a competitive request-for-applications process and have been chosen for participation by DBHIDS.36
Agencies and Participants
More than 100 community mental health organizations in Philadelphia provide outpatient services to youth (data provided by the City of Philadelphia Community Behavioral Health; Sarah Chen, PhD, MSW, personal communication, October 30th, 2014). The 31 organizations that serve the largest proportion of youth were purposively sampled. Of these 31 organizations, 22 (71%) agreed to participate. Several agencies had multiple locations, resulting in 28 sites; each site is referred to as an organization from this point forward. Twenty-three of the 28 sites had participated in at least one child-focused training initiative. The sample consisted of 73 organizational leaders (i.e., 22 supervisors, 29 clinical directors, and 22 executive directors) and 247 therapists. Approximately 58% of therapists employed by the 22 organizations participated in the study. There were no exclusion criteria for therapist participation. Analyses of the organizational leader data from this sample have been reported elsewhere;37–39 data from therapists only were used for the current investigation.
Procedure
This study was approved by the University of Pennsylvania and City of Philadelphia Institutional Review Boards. The person identified as the leader of each organization was approached to solicit the organization’s participation. A one-time, two-hour meeting was scheduled with potential participants, during which lunch was provided. During this meeting, the research team gave an overview of the study, obtained written informed consent, and collected all measures. All participants were compensated $50 for participation in the study.
Measures
Individual-level measures
Participant demographics
Participants completed a brief demographics questionnaire that assessed age, race/ethnicity, educational background, years of experience, years at current organization, hours worked weekly, caseload, and worker status (independent contractor, salaried, other).40 Therapists indicated whether they had formally participated in any of the four DBHIDS EBP initiatives. Participants were asked to verbally confirm that they understood that the question about initiative participation was referring to the one-year training and ongoing consultation provided through DBHIDS.
Burnout
The Maslach Burnout Inventory Human Services Survey (MBI-HSS)41 is a 22-item self-report measure of therapist burnout with three subscales: emotional exhaustion, depersonalization, and reduced personal accomplishment. Items are rated on a 0 (Never) to 6 (Every day) scale. Higher scores on emotional exhaustion and depersonalization, and lower scores on personal accomplishment (reverse scored for the total score), indicate higher levels of burnout. The MBI is the most widely used instrument for assessing burnout and has shown satisfactory internal consistency and discriminant and factorial validity.42–44
Attitudes
The Evidence-Based Practice Attitude Scale (EBPAS)13 is a 15-item self-report measure assessing attitudes toward adoption of EBP along four theoretically derived dimensions: appeal (whether EBP is intuitively appealing), requirements (whether EBP would be used if required by the organization, supervisor, or state), openness (openness to innovation in general), and divergence (perceived divergence between EBP and current practice). Items are rated on a 0 (Not at all) to 4 (Very great extent) scale. Higher scores indicate more positive attitudes, with the exception of divergence, which is reverse coded. The EBPAS has national norms and has demonstrated validity and good internal consistency, with subscale alphas ranging from .67 to .91.45
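To make the scoring rules concrete, the following is a minimal Python sketch of subscale scoring with one reverse-coded subscale, in the spirit of the EBPAS; the item-to-subscale mapping and column names are illustrative placeholders, not the instrument’s actual item assignments:

```python
import pandas as pd

# Illustrative item-to-subscale mapping; NOT the actual EBPAS item numbers.
SUBSCALES = {
    "appeal": ["i1", "i2", "i3", "i4"],
    "requirements": ["i5", "i6", "i7"],
    "openness": ["i8", "i9", "i10", "i11"],
    "divergence": ["i12", "i13", "i14", "i15"],
}

def score_ebpas(items: pd.DataFrame) -> pd.DataFrame:
    """Compute subscale means on the 0-4 response scale, then a total
    score in which divergence is reverse-coded (4 - x) so that higher
    values uniformly indicate more positive attitudes toward EBP."""
    scores = pd.DataFrame(
        {name: items[cols].mean(axis=1) for name, cols in SUBSCALES.items()}
    )
    scores["total"] = (
        scores["appeal"] + scores["requirements"] + scores["openness"]
        + (4 - scores["divergence"])  # reverse-code on the 0-4 scale
    ) / 4
    return scores
```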
Knowledge
The Knowledge of Evidence-Based Service Questionnaire (KEBSQ)46 is a 40-item self-report instrument that measures knowledge of various evidence-based therapy techniques across several youth problem areas. Total scores range from 0 to 160, with higher scores indicating greater knowledge of EBP. Psychometric data suggest temporal stability, discriminant validity, and sensitivity to training.46
Use of therapy strategies
The Therapist Procedures Checklist – Family Revised (TPC-FR)47 is a 62-item self-report checklist measuring use of therapy strategies from four modalities: cognitive, behavioral, family, and psychodynamic. Therapists were asked to respond in reference to a representative client they were currently treating. Items are rated on a 1 (Rarely) to 5 (Most of the time) scale, with higher scores indicating greater use of the strategies in a given domain. The factor structure has been confirmed, test-retest reliability is strong, and the instrument is sensitive to within-therapist changes in strategy use.47,48
Organizational-level measures
Organizational evidence-based practices
DBHIDS provided the research team with a list of the organizations that had participated in one of the four city-sponsored EBP initiatives, the year they began participating, and their completion year (if applicable).
Organizational culture and climate
The Organizational Social Context Measurement System (OSC)49 includes 105 items measuring six dimensions of organizational culture and organizational climate in mental health and social service organizations. Organizational culture is defined as the shared beliefs and expectations of a work environment and includes three primary dimensions: proficiency, rigidity, and resistance. Proficient cultures are those in which therapists prioritize the well-being of clients and are expected to be competent. Rigid cultures are those in which therapists have little autonomy. Resistant cultures are those in which therapists are expected to show little interest in change or new ways of providing services. Organizational climate refers to shared perceptions about the work environment’s impact on worker well-being and functioning and is measured on three dimensions: engagement, functionality, and stress. Engaged climates are settings in which therapists feel they can accomplish worthwhile goals and remain invested in their work. Functional climates are those in which therapists can get their job done effectively and have a well-defined understanding of how they fit into the organization. Stressful climates are settings in which therapists feel emotionally exhausted. Organizational culture and climate scores are profiled as T scores (mean ± SD = 50 ± 10) established from a national sample of 100 mental health organizations.49 The measure has demonstrated a sound measurement model, within-system agreement, reliability, and between-system differences.50
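For reference, a T score locates a raw scale score $x$ relative to the normative sample’s mean $\mu_{\text{norm}}$ and standard deviation $\sigma_{\text{norm}}$:

$$T = 50 + 10 \cdot \frac{x - \mu_{\text{norm}}}{\sigma_{\text{norm}}}$$

so, for example, an organization scoring one normative standard deviation above the national mean on a given dimension receives a T score of 60.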
Implementation climate
The Implementation Climate Scale (ICS)32 is an 18-item self-report measure of multiple factors that contribute to successful implementation, including: organizational focus on EBP (e.g., viewing EBP implementation as important); educational support for EBP (e.g., providing support for trainings or training materials); recognition for using EBP (e.g., use of EBP held in esteem); rewards for using EBP (e.g., financial incentives for EBP use); selection of staff for EBP (e.g., selection based on previous experience or formal education in EBP); and selection of staff for openness (e.g., flexible, adaptable). Items are rated on a 0 (Not at all) to 4 (Very great extent) scale, with higher scores indicating a more positive implementation climate, that is, one in which the organization values and supports EBP. Psychometric evaluation suggests good reliability and validity.32
Data analytic plan
Organizational measures were included in analyses by aggregating individual responses within each organization, provided there was sufficient within-organization agreement. Within-group agreement statistics (awg, rwg) were used to assess this.51,52 On all organizational variables, both statistics were above the suggested .60 level;51,53 therefore, participant responses to organizational constructs were averaged within each organization. Missing data for predictor variables were less than 10%; series means were imputed for missing values. Consistent with previous studies,54 each site (K = 28), rather than each agency (K = 22), was treated as a distinct unit because sites largely differed in leadership structure, location, and staff.
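As a concrete illustration of this aggregation step, the sketch below computes a single-item rwg agreement index and applies series-mean imputation in Python. It is a simplified, hypothetical rendering (toy data, illustrative column names, and the single-item rwg rather than the multi-item rwg(j) or awg), not the study’s code:

```python
import pandas as pd

# Toy data: three therapists in each of two organizations.
df = pd.DataFrame({
    "org_id":   [1, 1, 1, 2, 2, 2],
    "ics_item": [3, 3, 4, 0, 4, 2],             # a 0-4 rated climate item
    "caseload": [30.0, None, 25.0, 40.0, 35.0, None],
})

def rwg(scores: pd.Series, n_options: int) -> float:
    """Single-item within-group agreement (James, Demaree, & Wolf, 1984):
    one minus the ratio of the observed within-group variance to the
    variance expected under uniform random responding."""
    expected_var = (n_options ** 2 - 1) / 12.0  # discrete uniform variance
    return 1.0 - scores.var(ddof=0) / expected_var

agreement = df.groupby("org_id")["ics_item"].apply(rwg, n_options=5)

# Aggregate to the organization level only where agreement clears .60.
ok = agreement[agreement >= 0.60].index
org_means = df[df["org_id"].isin(ok)].groupby("org_id")["ics_item"].mean()

# Series-mean imputation for a predictor with <10% missingness.
df["caseload"] = df["caseload"].fillna(df["caseload"].mean())
print(agreement, org_means, sep="\n")
```

In this toy example, organization 1 (ratings 3, 3, 4) clears the .60 cutoff and is aggregated, whereas organization 2 (ratings 0, 4, 2) does not.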
Initial descriptive statistics were calculated using SPSS version 23. Main analyses were conducted using PROC MIXED and PROC LOGISTIC in SAS 9.0. First, a mixed-effects linear regression model was estimated to examine whether therapist demographic variables (i.e., clinical experience, age, years at organization, hours worked weekly, caseload, and worker status [independent contractor, salaried, other]) were related to participation in EBP initiatives (continuous; 0–4 initiatives). Second, a mixed-effects logistic regression model was estimated to examine whether therapist knowledge (KEBSQ),46 attitudes (EBPAS; requirements, openness, appeal, and divergence),13 burnout (MBI; emotional exhaustion, depersonalization, and personal accomplishment),41 and self-reported use of therapy strategies (TPC-FR; behavioral, cognitive, family, psychodynamic)47 were related to therapist participation in an initiative (yes/no). Both models included random intercepts for organization, to account for the nesting of therapists within organizations, and fixed effects for therapist factors.
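As an illustration of the first, therapist-level model, here is a minimal Python analogue using statsmodels (the study itself used SAS PROC MIXED; the data and variable names below are synthetic stand-ins, and the predictor set is abbreviated):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
therapists = pd.DataFrame({                  # synthetic stand-in data
    "org_id": rng.integers(0, 28, n),
    "years_at_org": rng.exponential(3.3, n),
    "age": rng.normal(39, 12, n),
    "n_initiatives": rng.integers(0, 5, n),  # 0-4 initiatives
})

# Mixed-effects linear regression: initiative count on therapist
# demographics, with a random intercept per organization to account
# for therapists nested within organizations.
result = smf.mixedlm(
    "n_initiatives ~ years_at_org + age",    # full model adds the rest
    data=therapists,
    groups=therapists["org_id"],
).fit()
print(result.summary())
```

The second, binary-outcome model would call for a mixed-effects logistic analogue (e.g., BinomialBayesMixedGLM in statsmodels, or lme4::glmer in R), which yields odds ratios like those reported in Table 3.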
Third, twelve logistic regression models were estimated to examine whether organizational culture (OSC; proficiency, rigidity, resistance),49 organizational climate (OSC; engagement, functionality, stress),49 and implementation climate (ICS; focus on EBPs, educational support, recognition, reward, selection of staff for EBP, and selection of staff for openness)32 were related to organizational participation in an initiative (dichotomous; yes/no). Because analyses were underpowered to detect effects at the organizational level, trend-level relationships (p < .10) are reported. Adjustments for multiple models were not made, given the exploratory nature of the analyses.55
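A minimal sketch of these organization-level models in Python, again with synthetic stand-in data (one row per site; column and variable names are illustrative):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
orgs = pd.DataFrame({                        # synthetic stand-in, 28 sites
    "participated": rng.integers(0, 2, 28),  # EBP initiative yes/no
    "engagement": rng.normal(54, 9, 28),     # OSC climate T scores
    "stress": rng.normal(54, 7, 28),
})

# One bivariate logistic regression per organizational predictor,
# flagging trend-level (p < .10) relationships as in the text.
for p in ["engagement", "stress"]:
    fit = smf.logit(f"participated ~ {p}", data=orgs).fit(disp=False)
    odds_ratio = np.exp(fit.params[p])       # OR per 1-point increase
    print(f"{p}: OR = {odds_ratio:.2f}, p = {fit.pvalues[p]:.3f}, "
          f"trend = {fit.pvalues[p] < 0.10}")
```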
Results
See Table 1 for descriptive statistics for therapist demographic variables. The sample was predominantly female and ethnically diverse. The majority of therapists had master’s degrees and were either not licensed or in the process of obtaining licensure. Forty-seven percent of therapists participated in an EBP initiative (data were missing for four therapists), and 82% of organizations participated in an EBP initiative. Overall, 26.7% of therapists participated in one initiative, 13.4% in two, and 6.4% in three or more.
Table 1.
Variable | Frequency (%) |
---|---|
Genderᵃ | |
Male | 21.6% |
Female | 77.7% |
Hispanic/Latinoᵃ | |
Yes | 21.1% |
No | 76.9% |
Ethnicityᵃ | |
Asian | 4.5% |
Black or African American | 30% |
White | 40.9% |
Hispanic/Latino | 16.2% |
Multiracial | 4.0% |
Other | 2.8% |
Academic backgroundᵃ | |
Bachelor’s degree | 6.5% |
Master’s degree | 77.7% |
Doctoral degree | 13.8% |
Other | |
Licensure statusᵃ | |
Yes | 19.8% |
No | 39.3% |
In process | 38.9% |
ᵃ Does not add up to 100% because of missing responses
Demographic variables predicting individual participation
Table 2 presents the results of analyses predicting individual therapist participation in EBP initiatives. Only one variable was significantly associated with participation: as years spent at the organization increased, the number of initiatives in which therapists participated also increased.
Table 2.
Variable | M (SD) [Range] or percentage | Parameter Estimate | Standard Error |
---|---|---|---|
Dependent variable: Therapist initiative participation | .77 (1.01) [0–4] | -- | -- |
Therapist age | 38.74 (11.93) [23–76] | 1.07 | .74 |
Clinical experience | 10.05 (8.60) [0–44] | −1.31 | 1.04 |
Years at organization | 3.3 (3.9) [0–30] | .60* | .04 |
Hours worked per week | 25.13 (15.89) [0–90] | .32 | .50 |
Weekly caseload | 28.73 (20.63) [1–125] | .28 | .34 |
Worker status | salaried (20.6%); contractor (76.6%); other (9.7%) | 8.08 | 13.35 |
Abbreviations: EBP, Evidence-Based Practice
* p < .05
Therapist participation in EBP initiatives
Table 3 presents the results of analyses assessing whether therapist knowledge, attitudes, burnout, and self-reported use of therapy strategies were associated with therapist participation in EBP initiatives. Descriptive statistics for each therapist-level measure are also presented in Table 3. Nearly all scale and subscale means were in the expected range for this setting and population and in line with available norms.41,45–47 On the EBPAS, the mean for the divergence subscale was somewhat higher than national norms, indicating that therapists in this sample tended to view EBPs as less clinically useful than clinical experience.45 None of the variables were significantly related to therapist participation.
Table 3.
Variable | M (SD) [Range] or percentage | Odds Ratio | CI (95%) |
---|---|---|---|
Dependent variable: Therapist initiative participation | Yes (46.6%), No (51.8%) | -- | -- |
TPC-FR | |||
Behavioral | 2.9 (.80) [1.13–4.8] | 1.16 | .72–1.88 |
Cognitive | 3.67 (.67) [1.69–5] | 1.05 | .60–1.85 |
Psychodynamic | 3.38 (.67) [1.33–5] | .85 | .47–1.54 |
Family | 3.39 (.87) [1.07–4.93] | 1.13 | .71–1.81 |
Knowledge (KEBSQ) | 96.12 (8.9) [72–121] | .98 | .94–1.02 |
Attitudes (EBPAS) | |||
Requirements | 2.86 (1.0) [0–4] | .81 | .60–1.10 |
Appeal | 3.17 (.68) [1–4] | 1.31 | .81–2.12 |
Openness | 3.08 (.71) [.75–4] | .78 | .49–1.24 |
Divergence | 2.73 (.74) [1–4] | .95 | .63–1.43 |
Burnout (MBI) | |||
Emotional Exhaustion | 19.10 (12.5) [0–54] | 1.01 | .98–1.04 |
Depersonalization | 4.09 (4.7) [0–24] | 1.00 | .93–1.08 |
Personal Accomplishment | 38.86 (6.6) [12–48] | 1.02 | .97–1.07 |
Abbreviations: EBP, Evidence-Based Practice; TPC-FR, Therapist Procedures Checklist-Family
Revised; KEBSQ, Knowledge of Evidence-Based Service Questionnaire; EBPAS, Evidence-Based Practice Attitudes Scale; MBI, Maslach Burnout Inventory
Organizational participation in EBP initiatives
Table 4 presents the results of analyses predicting organizational participation in initiatives. Descriptive statistics for each organizational-level measure are also presented in Table 4. All means were in the expected range for this setting and population.32,49 Organizational climate was marginally related to organizational participation in EBP initiatives: each 1-point increase in engagement decreased the odds that an organization participated in an EBP initiative by 11% (OR = .89, p = .08), and each 1-point increase in stress increased the odds by 28% (OR = 1.28, p = .06). Implementation climate was also marginally related to organizational participation: each 1-point increase in educational support for EBPs decreased the odds that an organization participated by 78% (OR = .22, p = .09).
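The percentage changes reported above follow from the standard conversion of an odds ratio to a percent change in odds per 1-point increase in the predictor:

$$\%\,\Delta\text{odds} = (\mathrm{OR} - 1) \times 100$$

so OR = 0.89 corresponds to an 11% decrease, OR = 1.28 to a 28% increase, and OR = 0.22 to a 78% decrease.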
Table 4.
Variable | M (SD) [Range] or percentage | Odds Ratio | CI (95%) |
---|---|---|---|
Dependent variable: Organization EBP initiative participation | Yes (82.1%), No (17.9%) | -- | -- |
Organizational Culture (OSC)ᵃ | | |
Proficiency | 53.86 (9.3) [26.01–70.07] | .95 | .84–1.08 |
Rigidity | 58.63 (7.2) [41.74–71.49] | 1.03 | .90–1.18 |
Resistance | 62.40 (9.7) [47.92–82.02] | 1.03 | .92–1.14 |
Organizational Climate (OSC) | |||
Engagement | 54.38 (9.07) [39.80–70.22] | .89+ | .78–1.02 |
Functionality | 63.34 (9.52) [44.32–81.48] | 1.01 | .91–1.12 |
Stress | 53.89 (6.61) [39.39–66.80] | 1.28+ | .99–1.66 |
Implementation Climate (ICS) | |||
Focus on EBPs | 2.67 (.47) [1.8–3.46] | .11 | .01–1.69 |
Educational support | 2.07 (.70) [.80–3.48] | .22 | .04–1.27 |
Recognition | 1.96 (.59) [.80–3.06] | .26 | .03–2.06 |
Reward | 0.55 (.58) [0–1.90] | .77 | .15–3.9 |
Staff selection | 1.87 (.64) [0–2.98] | .44 | .08–2.42 |
Openness | 2.92 (.52) [2–3.78] | .41 | .05–3.26 |
Abbreviations: EBP, Evidence-Based Practice; OSC, Organizational Social Context Measurement
System; ICS, Implementation Climate Scale
ᵃ OSC values are T-scores
+ p < .10
Discussion
This study examined characteristics of therapists and organizations that did or did not participate in system-sponsored EBP initiatives in an urban, publicly funded behavioral health system engaged in a large-scale effort to increase the use of EBP. Consistent with themes from previous qualitative research,56 therapists’ number of years at an organization was associated with participation in a greater number of EBP initiatives. Organizational climate factors, including engagement and stress, were marginally associated with organizational EBP initiative participation, also consistent with previous research.24 Implementation climate, specifically educational support, was likewise marginally associated with organizational participation. These results suggest that organizational characteristics (i.e., inner-context variables) may differentiate those who participate in EBP initiatives, which has important implications for future efforts to take EBP implementation to scale in large systems. The results are consistent with the EPIS framework, which posits that EBP implementation and sustainment are affected by both the inner context (e.g., organizational and provider characteristics) and the outer context (e.g., system-level factors such as policy and the service environment).6 The current results provide additional evidence that inner-context variables impact EBP implementation.
The only therapist characteristic associated with participation in EBP initiatives was length of time employed at the organization. The most parsimonious explanation for this finding is that the longer a therapist is employed at an organization, the more opportunities there are to participate in EBP initiatives. Because therapist participation was determined differently across time and setting (i.e., in some instances therapists volunteered and in others they were selected by organizational leadership), additional explanations are also plausible. The results are consistent with research from the organizational psychology literature suggesting positive associations between job tenure and voluntary learning (i.e., non-mandatory training or learning opportunities).57 Staff with longer tenure may be perceived as less risky in terms of return on investment, given concerns about therapist turnover.58 They may also be viewed as more connected to the organization or more effective in their positions, and thus more likely to remain employed.59 However, the results are not consistent with literature suggesting that the longer a therapist is employed at an organization, the less open they are to EBP adoption.19 Thus, more research is necessary to better understand the relationship between job tenure and EBP adoption.
Previous literature suggests that therapist characteristics such as attitudes toward and knowledge about EBPs and level of training are related to adoption.18,19,60 Thus, it was surprising that therapist characteristics did not differentiate EBP initiative participation. It may be that the sample of therapists was heterogeneous, making it difficult to identify meaningful differences. Reasons for a heterogeneous sample include that the selection of therapists was not uniform across organizations, that therapists may not have had equal opportunity to participate depending on when they joined an organization, and that even within participating organizations not all therapists participated in the initiative. Future research clarifying individual differences in participation in system-level implementation efforts is warranted.
Organizations that participated in EBP initiatives had organizational climates characterized by lower levels of engagement and higher levels of stress. That is, therapists in participating organizations reported feeling less able to accomplish worthwhile things and remain invested in their work, and they may have been experiencing emotional exhaustion and feeling overwhelmed by work. These results are consistent with previous research. Patterson and colleagues24 compared organizational climates among mental health programs within a large organization that did and did not offer EBP and found that programs offering EBP were characterized by less engagement and more stress than programs without any EBP. Items used to measure the engagement construct in the present study included “I feel I treat some of the clients I serve as impersonal objects” (reverse coded) and “I have accomplished many worthwhile things in this job.” It is possible that higher engagement scores reflect a strong internal mission and confidence in current treatment approaches; such organizations might not see the need to bring in new models of treatment.
With regard to organizational stress, there are several potential interpretations. The implementation process itself, which requires devoting time to training, ongoing support, and other operations (e.g., altering clinical documentation practices), may increase organizational stress.24 Several items measuring the stress construct could be read as reflective of the implementation process (e.g., “Interests of the clients are often replaced by bureaucratic concerns such as paperwork” and “The amount of work I have to do keeps me from doing a good job”). Alternatively, an organization may participate because internal or system leaders believe that implementing an EBP might mitigate stress already present in the organization. Given the cross-sectional nature of this study, the directionality of this relationship cannot be determined, but the results provide fodder for future inquiry. Finally, organizations whose implementation climate provided more educational support were less likely to participate in an EBP initiative. It may be that organizations providing more internal resources, such as EBP workshops, do not feel the need to participate in system-led (or external) training opportunities. Understanding differences between organizations that seek out EBP training on their own and those that participate in system-led efforts is another interesting area for future research.
Limitations
Several limitations exist. First, selection of both individuals and organizations for participation in initiatives was not systematic. Second, the data were cross-sectional, which precludes causal inferences. Third, data were not collected on each therapist’s length of time at the organization since participating in training. Fourth, data were self-report, which can be limited by recall and social desirability biases. Fifth, analyses were underpowered to detect effects at the organizational level; however, the results identified several marginally significant findings that were corroborated by previous research. Sixth, all organizations and providers were from a single large system, potentially limiting generalizability. Finally, study analyses were exploratory, and the results require replication.
Implications for Behavioral Health
This study has important implications for understanding the implementation of EBP within large public behavioral health systems, especially when EBP adoption is not mandated. This exploratory study investigated individual and organizational factors associated with participation in EBP initiatives in a large, urban public mental health system and found that factors at both the therapist and organizational levels may be related to participation in system-sponsored EBP initiatives. The large-scale implementation of EBP requires significant investment of time and financial resources. Understanding which therapist and organizational factors impact adoption of EBP has the potential to inform how systems and organizations can most effectively allocate those resources. More research is needed to understand directionality of impact; however, results suggest that it is particularly important to attend to the climate within an organization when implementing EBPs. More systematic investigation of how individuals and organizations are selected or choose to participate in EBP initiatives is needed to help tailor implementation strategies to support large-scale implementation.
Acknowledgments
We are especially grateful for the support that the Department of Behavioral Health and Intellectual disAbility Services has provided for this project, and to the Evidence-based Practice and Innovation Center (EPIC) group. We would also like to thank the following experts who provided their consultation on this project: Dr. Steve Marcus and Dr. David Mandell.
Footnotes
Conflicts of Interest and Disclosures: Dr. Beidas receives royalties from Oxford University Press and has served as a consultant for Merck and Kinark Child and Family Services. None of the reported disclosures are related to implementation of evidence-based practices for youth in the City of Philadelphia. The remaining authors (Dr. Skriner, Dr. Benjamin Wolk, Dr. Stewart, Ms. Adams, Dr. Rubin, and Dr. Evans) have no disclosures or conflicts of interest to report.
OSC profiles for each organization are typically created by averaging responses from front-line service providers within the agency. However, two of the organizations did not have enough front-line providers to create the OSC profile without including agency leaders. Given that aggregate statistics (i.e., awg, rwg) were acceptable, and that empirical work suggests agreement between leaders and followers in small organizations is high,12 agency leaders were included in the total profile score for those two organizations only.
References
1. New Freedom Commission on Mental Health. Achieving the Promise: Transforming Mental Health Care in America. Final Report. Rockville, MD: U.S. Department of Health and Human Services; 2003.
2. U.S. Department of Health and Human Services. Mental Health: A Report of the Surgeon General. Rockville, MD: National Institute of Mental Health; 1999.
3. Weisz JR, Ng MY, Bearman SK. Odd couple? Reenvisioning the relation between science and practice in the dissemination-implementation era. Clinical Psychological Science. 2014;2(1):58–74.
4. Weisz JR, Jensen-Doss A, Hawley KM. Evidence-based youth psychotherapies versus usual clinical care: a meta-analysis of direct comparisons. American Psychologist. 2006;61(7):671–689. doi: 10.1037/0003-066X.61.7.671.
5. Eccles MP, Mittman BS. Welcome to implementation science. Implementation Science. 2006;1(1):1.
6. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(1):4–23. doi: 10.1007/s10488-010-0327-7.
7. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50–64. doi: 10.1186/1748-5908-4-50.
8. Beidas RS, Marcus S, Aarons GA, et al. Predictors of community therapists’ use of therapy techniques in a large public mental health system. JAMA Pediatrics. 2015;169(4):374–382. doi: 10.1001/jamapediatrics.2014.3736.
9. Glisson C, Hemmelgarn A. The effects of organizational climate and interorganizational coordination on the quality and outcomes of children’s service systems. Child Abuse & Neglect. 1998;22(5):401–421. doi: 10.1016/s0145-2134(98)00005-2.
10. Glisson C, Green P. Organizational climate, services, and outcomes in child welfare systems. Child Abuse & Neglect. 2011;35(8):582–591. doi: 10.1016/j.chiabu.2011.04.009.
11. Henggeler SW, Chapman JE, Rowland MD, et al. Statewide adoption and initial implementation of contingency management for substance-abusing adolescents. Journal of Consulting and Clinical Psychology. 2008;76(4):556–567. doi: 10.1037/0022-006X.76.4.556.
12. Beidas RS, Edmunds J, Ditty M, et al. Are inner context factors related to implementation outcomes in cognitive-behavioral therapy for youth anxiety? Administration and Policy in Mental Health and Mental Health Services Research. 2014;41(6):788–799. doi: 10.1007/s10488-013-0529-x.
13. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
14. Glisson C, Dukes D, Green P. The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse & Neglect. 2006;30(8):855–880. doi: 10.1016/j.chiabu.2005.12.010.
15. Peterson AE, Bond GR, Drake RE, et al. Predicting the long-term sustainability of evidence-based practices in mental health care: an 8-year longitudinal analysis. Journal of Behavioral Health Services & Research. 2014;41(3):337–346. doi: 10.1007/s11414-013-9347-x.
16. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science. 2008;3(1):26–34. doi: 10.1186/1748-5908-3-26.
17. Willging CE, Green AE, Gunderson L, et al. From a “perfect storm” to “smooth sailing”: policymaker perspectives on implementation and sustainment of an evidence-based practice in two states. Child Maltreatment. 2015;20(1):24–36. doi: 10.1177/1077559514547384.
18. Williams JR, Blais MP, Banks D, et al. Predictors of the decision to adopt motivational interviewing in community health settings. Journal of Behavioral Health Services & Research. 2014;41(3):294–307. doi: 10.1007/s11414-013-9357-8.
19. Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychological Services. 2006;3(1):61–72. doi: 10.1037/1541-1559.3.1.61.
20. Gallo KP, Barlow DH. Factors involved in clinician adoption and nonadoption of evidence-based interventions in mental health. Clinical Psychology: Science & Practice. 2012;19(1):93–106.
21. Stewart RE, Chambless DL, Baron J. Theoretical and practical barriers to practitioners’ willingness to seek training in empirically supported treatments. Journal of Clinical Psychology. 2012;68(1):8–23. doi: 10.1002/jclp.20832.
22. Hemmelgarn AL, Glisson C, James LR. Organizational culture and climate: implications for services and interventions research. Clinical Psychology: Science & Practice. 2006;13(1):73–89.
23. Wang W, Saldana L, Brown CH, et al. Factors that influenced county system leaders to implement an evidence-based program: a baseline survey within a randomized controlled trial. Implementation Science. 2010;5(1):72–79. doi: 10.1186/1748-5908-5-72.
24. Patterson DA, Dulmus CN, Maguin E. Empirically supported treatment’s impact on organizational culture and climate. Research on Social Work Practice. 2012;22(6):665–671. doi: 10.1177/1049731512448934.
25. Beidas RS, Aarons GA, Barg FK, et al. Policy to implementation: evidence-based practice in community mental health - study protocol. Implementation Science. 2013;8(1):38–46. doi: 10.1186/1748-5908-8-38.
26. Beidas RS, Stewart RE, Adams DR, et al. A multi-level examination of stakeholder perspectives of implementation of evidence-based practices in a large urban publicly-funded mental health system. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(6):893–908. doi: 10.1007/s10488-015-0705-2.
27. Chorpita BF, Yim LM, Donkervoet JC, et al. Toward large-scale implementation of empirically supported treatments for children: a review and observations by the Hawaii Empirical Basis to Services Task Force. Clinical Psychology: Science & Practice. 2002;9(2):165–190.
28. Dorsey S, Berliner L, Lyon AR, et al. A statewide common elements initiative for children’s mental health. Journal of Behavioral Health Services & Research. 2016;43(2):246–261. doi: 10.1007/s11414-014-9430-y.
29. Southam-Gerow MA, Daleiden EL, Chorpita BF, et al. MAPping Los Angeles County: taking an evidence-informed model of mental health care to scale. Journal of Clinical Child and Adolescent Psychology. 2014;43(2):190–200. doi: 10.1080/15374416.2013.833098.
30. Glisson C. The organizational context of children’s mental health services. Clinical Child and Family Psychology Review. 2002;5(4):233–253. doi: 10.1023/a:1020972906177.
31. Gotham HJ. Diffusion of mental health and substance abuse treatments: development, dissemination, and implementation. Clinical Psychology: Science & Practice. 2004;11(2):160–176.
32. Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implementation Science. 2014;9(1):157–167. doi: 10.1186/s13012-014-0157-1.
33. Stirman SW, Kimberly J, Cook N, et al. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science. 2012;7(1):17–35. doi: 10.1186/1748-5908-7-17.
34. Stirman SW, Spokas M, Creed TA, et al. Training and consultation in evidence-based psychosocial treatments in public mental health settings: the ACCESS model. Professional Psychology: Research & Practice. 2010;41(1):48–56. doi: 10.1037/a0018099.
35. Beidas RS, Adams DR, Kratz HE, et al. Lessons learned while building a trauma-informed public mental health system in the city of Philadelphia. Evaluation and Program Planning. 2016;59:21–32. doi: 10.1016/j.evalprogplan.2016.07.004.
36. Powell BJ, Beidas RS, Rubin RM, et al. Applying the policy ecology framework to Philadelphia’s behavioral health transformation efforts. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(6):909–926. doi: 10.1007/s10488-016-0733-6.
37. Beidas RS, Williams NJ, Green PD, et al. Concordance between administrator and clinician ratings of organizational culture and climate. Administration and Policy in Mental Health and Mental Health Services Research. 2016 Nov 5:1–10. doi: 10.1007/s10488-016-0776-8. Available online first.
38. Aarons GA, Ehrhart MG, Torres EM, et al. The humble leader: association of discrepancies in leader and follower ratings of implementation leadership with organizational climate in mental health. Psychiatric Services. 2017;68(2):115–122. doi: 10.1176/appi.ps.201600062.
39. Torres EM, Ehrhart MG, Beidas RS, et al. Validation of the Implementation Leadership Scale (ILS) with supervisors’ self-ratings. Community Mental Health Journal. 2017 Feb 8:1–5. doi: 10.1007/s10597-017-0114-y. Available online first.
40. Weisz JR. Therapist Background Questionnaire. Los Angeles: University of California; 1997.
41. Maslach C, Jackson SE, Leiter MP. Maslach Burnout Inventory Manual. Consulting Psychologists Press; 1996.
42. Maslach C, Schaufeli WB, Leiter MP. Job burnout. Annual Review of Psychology. 2001;52:397–422. doi: 10.1146/annurev.psych.52.1.397.
43. Schaufeli WB, Bakker AB, Hoogduin K, et al. On the clinical validity of the Maslach Burnout Inventory and the Burnout Measure. Psychology and Health. 2001;16(5):565–582. doi: 10.1080/08870440108405527.
44. Glass DC, McKnight JD. Perceived control, depressive symptomatology, and professional burnout: a review of the evidence. Psychology and Health. 1996;11(1):23–48.
45. Aarons GA, Glisson C, Hoagwood KE, et al. Psychometric properties and U.S. national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychological Assessment. 2010;22(2):356–365. doi: 10.1037/a0019188.
46. Stumpf RE, Higa-McMillan CK, Chorpita BF. Implementation of evidence-based services for youth: assessing provider knowledge. Behavior Modification. 2009;33(1):48–65. doi: 10.1177/0145445508322625.
47. Weersing VR, Weisz JR, Donenberg GR. Development of the Therapy Procedures Checklist: a therapist-report measure of technique use in child and adolescent treatment. Journal of Clinical Child and Adolescent Psychology. 2002;31(2):168–180. doi: 10.1207/S15374424JCCP3102_03.
48. Kolko DJ, Cohen JA, Mannarino AP, et al. Community treatment of child sexual abuse: a survey of practitioners in the National Child Traumatic Stress Network. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36(1):37–49. doi: 10.1007/s10488-008-0180-0.
49. Glisson C, Landsverk JA, Schoenwald S, et al. Assessing the Organizational Social Context (OSC) of mental health services: implications for research and practice. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(1–2):98–113. doi: 10.1007/s10488-007-0148-5.
50. Glisson C, Green PD, Williams NJ. Assessing the Organizational Social Context (OSC) of child welfare systems: implications for research and practice. Child Abuse & Neglect. 2012;36(9):621–632. doi: 10.1016/j.chiabu.2012.06.002.
51. Brown RD, Hauenstein NMA. Interrater agreement reconsidered: an alternative to the rwg indices. Organizational Research Methods. 2005;8(2):165–184.
52. James LR, Demaree RG, Wolf G. Estimating within-group interrater reliability with and without response bias. Journal of Applied Psychology. 1984;69(1):85–98.
53. Bliese PD. Within-group agreement, non-independence, and reliability: implications for data aggregation and analysis. In: Klein K, Kozlowski S, editors. Multilevel Theory, Research, and Methods in Organizations. San Francisco: Jossey-Bass; 2000. pp. 349–380.
54. Aarons GA, Sommerfeld DH, Hecht DB, et al. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: evidence for a protective effect. Journal of Consulting and Clinical Psychology. 2009;77(2):270–280. doi: 10.1037/a0013223.
55. Rothman KJ. No adjustments are needed for multiple comparisons. Epidemiology. 1990;1(1):43–46.
56. Beidas RS, Stewart RE, Wolk CB, et al. Independent contractors in public mental health clinics: implications for use of evidence-based practices. Psychiatric Services. 2016;67(7):710–717. doi: 10.1176/appi.ps.201500234.
57. Birdi K, Allan C, Warr P. Correlates and perceived outcomes of four types of employee development activity. Journal of Applied Psychology. 1997;82(6):845–857. doi: 10.1037/0021-9010.82.6.845.
58. Beidas RS, Marcus S, Wolk CB, et al. A prospective examination of clinician and supervisor turnover within the context of implementation of evidence-based practices in a publicly-funded mental health system. Administration and Policy in Mental Health and Mental Health Services Research. 2015;43(5):640–649. doi: 10.1007/s10488-015-0673-6.
59. Shore LM, Barksdale K, Shore TH. Managerial perceptions of employee commitment to the organization. Academy of Management Journal. 1995;38(6):1593–1615.
60. Aarons GA. Measuring provider attitudes toward evidence-based practice: consideration of organizational context and individual differences. Child and Adolescent Psychiatric Clinics of North America. 2005;14(2):255–271. doi: 10.1016/j.chc.2004.04.008.