Abstract
Importance
Few studies have examined the effects of both clinician and organizational characteristics on the use of evidence-based practices in mental healthcare. Improved understanding of these factors could guide future implementation efforts to ensure effective adoption, implementation, and sustainment of evidence-based practices.
Objective
To estimate the relative contribution of clinician and organizational factors to clinician self-reported use of cognitive-behavioral, family, and psychodynamic techniques within the context of a large-scale effort to increase use of evidence-based practices in an urban public mental health system serving youth and families.
Design
Observational and cross-sectional. Data collected in 2013.
Setting
Twenty-three organizations.
Participants
We used purposive sampling to recruit the 29 largest child-serving agencies, which together serve approximately 80% of youth receiving publicly funded mental health care. The final sample included 19 agencies with 23 sites, 130 therapists, 36 supervisors, and 22 executive administrators.
Main Outcome Measures
Clinician self-reported use of cognitive-behavioral, family, and psychodynamic techniques, as measured by the Therapist Procedures Checklist – Family Revised.
Results
Linear mixed-effects regression models were used; models included random intercepts for organization to account for nesting of clinicians within organization. Clinician factors accounted for the following percentage of the overall variation: cognitive-behavioral (16%), family (7%), psychodynamic (20%). Organizational factors accounted for the following percentage of the overall variation: cognitive-behavioral (23%), family (19%), psychodynamic (7%). Older clinicians and clinicians with more open attitudes were more likely to endorse use of cognitive behavioral techniques, as were those in organizations that had spent fewer years participating in evidence-based practice initiatives, had more resistant cultures, and had more functional climates. Female clinicians were more likely to endorse use of family techniques, as were those in organizations employing more fee-for-service staff and with more stressful climates. Clinicians with more divergent attitudes and less knowledge about evidence-based practices were more likely to use psychodynamic techniques.
Conclusions & Relevance
This study suggests that both clinician and organizational factors are important in explaining clinician behavior and the use of evidence-based practices, but that their relative importance varies by therapeutic technique.
Implementation science frameworks posit that both clinician (e.g., knowledge and attitudes) and organizational (e.g., culture and climate) characteristics affect the delivery of evidence-based practices (EBPs) in health and mental health.1 Little is known about the relative contributions of these two sets of characteristics. This study estimates the relative contribution of clinician and organizational factors to clinicians’ use of cognitive-behavioral, family, and psychodynamic therapy techniques, within the context of a large-scale effort to increase the use of cognitive-behavioral therapy (CBT) in an urban public mental health system.
Literature supports the role of both clinician and organizational factors in the delivery of children’s mental health services. For example, clinician factors such as attitudes towards EBPs2,3 predict the extent to which clinicians deliver EBPs as designed. Similarly, organizational factors such as organizational culture (i.e., shared employee perceptions around expectations and norms)4 and organizational climate (i.e., psychological impact of the work environment on individual well-being)5 have been linked to quality of services6 and youth mental health outcomes.6,7 Previous research has largely focused on either clinician or organizational factors. Both sets of studies find evidence for the predictive validity of their constructs of interest. Clinician and organizational factors are correlated,8 making it difficult to disentangle the individual contribution of each set. Further, different outcomes have been examined in these two sets of studies, making it challenging to compare results.
To address these issues, we measured the relationship between clinician and organizational characteristics and use of therapy techniques among clinicians in an urban public mental health system engaged in a large-scale effort to increase the use of CBT. We explore the relative contribution of clinician and organizational characteristics to therapist self-reported use of cognitive-behavioral, family, and psychodynamic techniques, which are widely endorsed by therapists in usual care.9 CBT has a large body of evidence to support its effectiveness in addressing youth psychiatric disorders.10 Family therapy has also been found to be effective for some youth psychiatric disorders, particularly when family-oriented components are combined with cognitive-behavioral techniques.11 Psychodynamic therapy has little evidence to support its effectiveness for youth psychiatric disorders.10
Methods
Setting
Since 2007, the Philadelphia Department of Behavioral Health and Intellectual disAbility Services has supported the implementation of therapy techniques based on the principles of CBT in the public mental health system. Implementation support includes a full-time city employee who coordinates implementation, and training and consultation by treatment developers.
Agencies and Participants
There are over 100 community mental health agencies in Philadelphia that provide outpatient services to youth (Cathy Bolton, PhD, email communication, January 3rd, 2013). We used purposive sampling to recruit the 29 largest child-serving agencies, which serve approximately 80% of youth receiving publicly funded mental health care. Of these 29 agencies, 18 (62%) agreed to participate. Additionally, one agency involved in EBP efforts asked to participate, resulting in a final sample of 19 agencies (23 sites, 130 therapists, 36 supervisors, 22 executive administrators). Each site (N = 23), rather than each agency (N = 19), was treated as a distinct organization because of different leadership structures, locations, and staff. Going forward, we refer to site as “organization.” The leader of each organization was invited to participate as the “executive administrator.” There were no exclusion criteria for participation. Of the organizations enrolled in this study, 16 had participated in city-sponsored EBP initiatives.
Procedure
All procedures were approved by appropriate Institutional Review Boards. We approached the executive administrator of each organization for participation. Executive administrators completed their questionnaires using REDCap, a secure web-based application that supports online data collection.12 For supervisors and clinicians, we scheduled a one-time, two-hour meeting at each organization, at which we provided lunch, obtained informed consent, and completed data collection. Approximately 60% of therapists employed by the 23 organizations participated in the study. Participants received $50.
Measures
Participant Characteristics
Executive administrators provided information on their age, gender, ethnicity/race, and educational background. Clinician and supervisor demographics were assessed using the Therapist Background Questionnaire (TBQ),13 a 21-item demographics questionnaire. We also asked participants to report on employment status (i.e., fee-for-service or salaried).
Clinician attitudes were assessed using the Evidence-Based Practice Attitude Scale (EBPAS),14 a 15-item self-report questionnaire that assesses constructs related to the appeal of EBP, requirements to use EBP, general openness to new practices, and divergence between EBP and usual practice. Each subscale is the average of its items, each rated on a continuum from 0 (not at all) to 4 (very great extent). The EBPAS demonstrates good internal consistency15 and validity.16
Clinician knowledge about EBP was measured using the Knowledge of Evidence-Based Services Questionnaire (KEBSQ),17 a 40-item self-report instrument. Knowledge is scored on a continuum from 0–160, where higher scores indicate more knowledge of evidence-based services for youth. Psychometric data suggest temporal stability, discriminative validity, and sensitivity to training.17
Organizational Characteristics
Supervisors provided information on the number of clinicians in their unit and their employment status. From that information, we determined program size (i.e., number of clinicians in their unit) and percentage of fee-for-service staff. We also assessed the number of years the organization had formally been involved in city-sponsored EBP initiatives.
Organizational culture and climate were measured from the perspectives of clinicians, supervisors, and executive administrators using the Organizational Social Context Measurement System (OSC),18 a 105-item measure of the social context of mental health organizations. Organizational culture includes proficiency, rigidity, and resistance; climate includes engagement, functionality, and stress. Proficient cultures are those in which clinicians prioritize the well-being of clients and are expected to maintain their competencies. Rigid cultures are those in which clinicians have little autonomy and discretion. Resistant cultures are those in which clinicians are apathetic toward change. Engaged climates are those in which clinicians feel they can accomplish worthwhile things and remain invested in their work. Functional climates are those in which clinicians feel they are able to get their jobs done effectively. Stressful climates are those in which clinicians feel emotionally exhausted. Organizational culture and climate are measured with T-scores with a mean of 50 and standard deviation of 10, based on a normed sample of 100 community mental health clinics.18 The OSC has strong psychometric properties.19
Implementation climate was measured from the perspectives of clinicians, supervisors, and executive administrators using the Implementation Climate Scale (ICS),20 an 18-item scale that measures climate for EBP implementation, including focus on EBP, educational support for EBP, recognition for using EBP, rewards for using EBP, selection of staff for EBP, and general organizational openness. Each subscale is the average of its items, each rated on a continuum from 0 (not at all) to 4 (very great extent). Psychometrics are strong, with excellent reliability and validity.20
Implementation leadership was measured from the perspective of clinicians, who rated their direct supervisor, using the Implementation Leadership Scale (ILS),21 a 12-item scale that measures leader behaviors relevant to EBP implementation, including proactive, knowledgeable, supportive, and perseverant leadership. Each subscale is the average of its items, each rated on a continuum from 0 (not at all) to 4 (very great extent). The ILS has strong psychometric properties, with excellent internal consistency and validity.21
Dependent Variables
Clinician use of cognitive-behavioral, family, and psychodynamic techniques was measured using the Therapist Procedures Checklist - Family Revised (TPC-FR),22 a 62-item self-report checklist. Clinicians were asked to respond in reference to a representative client they were currently treating and could endorse using strategies from all three families of techniques. Each dependent variable is the average of the items within that factor, rated on a continuum from 1–5 where the anchors refer to 1 (rarely); 2 (seldom); 3 (sometimes); 4 (often); 5 (most of the time). Higher scores are indicative of more utilization of the set of techniques. The factor structure has been confirmed, test-retest reliability is strong, and the instrument is sensitive to within-therapist changes in technique use.22,23
Data Analytic Plan
Organizational measures were constructed by aggregating individual responses within each organization when there was sufficient agreement among individuals. To determine agreement, we used within-group agreement statistics (awg, rwg).24,25 On all organizational variables, these statistics were above the suggested .60 level;25,26 therefore, participant responses to organizational constructs were averaged within each organization. Missing data for both predictor and outcome variables were minimal (<10%); series means were imputed for missing predictor variables.
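To illustrate the aggregation step, the sketch below computes the multi-item within-group agreement index rwg(J) for one organization under the conventional uniform null distribution (James, Demaree, & Wolf, 1984).24 The example data, the 3-item subscale, and the application of the .60 cutoff are hypothetical placeholders; the awg index also used in the study is not shown.

```python
import numpy as np

def rwg_j(item_scores: np.ndarray, n_options: int) -> float:
    """Multi-item within-group agreement index rwg(J).

    item_scores: raters x items matrix of responses from one organization.
    n_options:   number of response options (e.g., 5 for a 0-4 scale),
                 which defines the uniform "no agreement" null variance.
    """
    n_items = item_scores.shape[1]
    # Variance expected if raters responded at random (uniform null distribution).
    null_var = (n_options ** 2 - 1) / 12.0
    # Mean of the observed within-group variances across items.
    mean_obs_var = item_scores.var(axis=0, ddof=1).mean()
    ratio = 1.0 - mean_obs_var / null_var
    return (n_items * ratio) / (n_items * ratio + mean_obs_var / null_var)

# Hypothetical example: 6 clinicians in one organization rating a 3-item
# subscale on a 0-4 response scale; aggregate only if agreement >= .60.
org_responses = np.array([
    [3, 2, 3],
    [3, 3, 3],
    [2, 2, 3],
    [3, 3, 2],
    [2, 3, 3],
    [3, 2, 2],
])
agreement = rwg_j(org_responses, n_options=5)
if agreement >= 0.60:
    # Organization-level score = mean across raters and items.
    org_score = org_responses.mean()
```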
We used three sets of linear mixed-effects regression models to determine the associations of clinician and organizational factors (independent variables) with self-reported use of cognitive-behavioral, family, and psychodynamic techniques (dependent variables). All linear mixed-effects models included random intercepts for organization to account for nesting of clinicians within organizations and fixed effects for clinician and organizational factors. Clinician factors included participant demographics [gender, age, clinical experience, employment status (i.e., fee-for-service or salaried)], attitudes [EBPAS subscales] and knowledge [KEBSQ total score]. Organizational factors included organization demographics [cumulative years organization participated in city-sponsored EBP initiatives, program size (i.e., number of therapists in the unit), organization type (percentage of fee-for-service staff)], implementation climate [ICS subscales], implementation leadership [ILS subscales], and organizational culture and organizational climate [OSC subscales]. Dependent variables included use of cognitive-behavioral [TPC-CBT], family [TPC-Family], and psychodynamic [TPC-Psychodynamic] techniques. Therapist case-mix and clinician ethnicity were initially included in the model as covariates but were removed because associated coefficients were not statistically significant.
Analyses were conducted using PROC MIXED in SAS 9.0. Four separate models were estimated for each dependent variable. In the first, unconditional model, only the organization random effect was included to provide an estimate of both the organizational and residual variance. This model allowed us to estimate how much of the total variance could be attributed to the organization and to calculate the intraclass correlation coefficient (ICC). In the second model, only clinician fixed effects were included to estimate the total variance attributable to clinician fixed effects; this model allowed us to calculate the proportion of the total variation explained by clinician factors. In the third model, only organizational fixed effects were included to estimate the total variation attributable to organizational fixed effects. In the fourth model, both clinician and organizational fixed effects were included; these models are reported in the Results. Our goal was to understand how much of the overall variation in each dependent variable was explained by the set of clinician factors and by the set of organizational factors, respectively (i.e., separately). The proportion of variation explained by clinician factors was calculated by subtracting the total variation of the clinician-factors model (Model 2) from the total variation of the unconditional model (Model 1) and dividing by the total variation of Model 1, i.e., %Var_clinician = (Var_unconditional − Var_clinician) / Var_unconditional; the same technique was used separately to calculate the proportion of variation explained by organizational factors, i.e., %Var_organization = (Var_unconditional − Var_organization) / Var_unconditional.27 Our analyses of the contributions of clinician and organizational factors focus on the unique effect of each factor after controlling for all other factors in the model.
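For concreteness, the following is a minimal re-expression of Models 1 and 2 and the variance-explained calculation in Python with statsmodels; the published analyses used SAS PROC MIXED, and the file name and variable names (clinician_data.csv, tpc_cbt, org_id, and the listed predictors) are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data set: one row per clinician, with the TPC-FR cognitive-behavioral
# score (tpc_cbt), clinician-level predictors, and an organization identifier (org_id).
df = pd.read_csv("clinician_data.csv")

def variance_components(result):
    """Return (organization intercept variance, residual variance) for a fitted model."""
    org_var = float(result.cov_re.iloc[0, 0])  # random-intercept variance for organization
    resid_var = float(result.scale)            # residual (within-organization) variance
    return org_var, resid_var

# Model 1: unconditional model with a random intercept for organization only.
m1 = smf.mixedlm("tpc_cbt ~ 1", df, groups=df["org_id"]).fit(reml=True)
org1, res1 = variance_components(m1)
icc = org1 / (org1 + res1)  # e.g., .13 / (.13 + .35) is approximately .27 (Table 2, cognitive-behavioral)

# Model 2: clinician fixed effects only (organization random intercept retained).
m2 = smf.mixedlm(
    "tpc_cbt ~ gender + age + experience + openness + divergence + knowledge",
    df, groups=df["org_id"],
).fit(reml=True)
org2, res2 = variance_components(m2)

# %Var_clinician = (Var_unconditional - Var_clinician) / Var_unconditional
pct_var_clinician = ((org1 + res1) - (org2 + res2)) / (org1 + res1)
print(f"ICC = {icc:.2f}; variance explained by clinician factors = {pct_var_clinician:.0%}")
```

Applying the same subtraction to a model containing only organizational fixed effects yields %Var_organization.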
Results
Participants
Table 1 provides demographic information about clinicians. Half of executive administrators were male; they identified as Asian (9%), African American (18%), Caucasian (55%), Multiracial (9%), or missing ethnicity/race (9%). Fifteen percent identified as Hispanic/Latino. Supervisors were primarily female (69%); they identified as African American (17%), Caucasian (56%), Hispanic/Latino (14%), other (3%), or missing (10%).
Table 1.
Variable | N | Median [Inter-Quartile Range; 25th–75th percentile] or percentage (%) |
---|---|---|
Dependent Variable – Therapist Procedures Checklist – Family Revised | ||
| ||
Cognitive-behavioral Techniquesa | 127 | 3.15 [2.78–3.70] |
Family Techniquesa | 127 | 3.47 [3.00–3.80] |
Psychodynamic Techniquesa | 127 | 3.57 [2.89–4.25] |
| ||
Clinician Factors | ||
| ||
Demographics | ||
Ethnicity/Raceb | 123 | Asian (5%), African-American (22%), Caucasian (55%), Hispanic/Latino (11%), Multiracial (4%), Other (4%) |
Educational Backgroundb | 124 | Bachelors (4%), Masters (86%), Doctoral (10%) |
Years at Current Organization | 124 | 2.00 [1.00–4.00] years |
Genderb | 129 | Male (23%); Female (76%) |
Age (Years) | 122 | 35.00 [29.00–47.00] years |
Years Clinical Experience | 122 | 5.00 [2.00–10.00] years |
Employment Statusb | 119 | FFS (52%); Salaried (40%) |
Attitudes | ||
Requirements: Extent to which a therapist would adopt EBP if it were requiredc | 129 | 3.00 [2.00–3.67] |
Appeal: Extent to which a therapist would adopt EBP if it were appealingc | 129 | 3.25 [2.67–3.75] |
Openness: Extent to which a therapist is open to trying EBPc | 129 | 3.00 [2.50–3.75] |
Divergence: Extent to which EBP is not clinically usefulc | 129 | 1.25 [.75–1.75] |
Knowledge | ||
Total Knowledge of EBP for Youthd | 127 | 94.00 [89.50–101.00] |
| ||
Organizational Factors | Org. N | |
| ||
Cumulative Years Participating in EBP initiativese | 23 | 3.00 [0.00–5.00] years |
Program Size (Number of Therapists) | 23 | 9.50 [7.00–25.00] therapists |
Percentage of Staff that are Employed Using a Fee-for-Service Model | 23 | 65% |
Implementation Climate | ||
Focus on EBPs: Extent to which an organization values and emphasizes EBPc | 23 | 2.38 [1.79–2.89] |
Educational support: Extent to which an organization provides educational support for EBPc | 23 | 1.58 [1.25–1.95] |
Recognition: Extent to which an organization recognizes staff implementing EBPc | 23 | 2.00 [1.70–2.61] |
Reward: Extent to which an organization financially rewards staff implementing EBP c | 23 | .39 [.31–.95] |
Staff selection: Extent to which an organization selects staff based on ability to implement EBPc | 23 | 2.33 [2.00–2.90] |
Openness: Extent to which an organization is generally open to innovationc | 23 | 2.92 [2.33–3.42] |
Implementation Leadership | ||
Proactive: Extent to which leader developed a plan to facilitate EBP implementationc | 23 | 2.12 [1.75–2.89] |
Knowledgeable: Extent to which leader is knowledgeable about EBPc | 23 | 2.89 [2.25–3.33] |
Supportive: Extent to which leader is supportive around EBP implementationc | 23 | 3.04 [2.67–3.44] |
Perseverant: Extent to which leader is perseverant through ups and downs of EBP implementationc | 23 | 2.79 [2.36–3.33] |
Organizational Social Context | ||
Proficient culture: Extent to which clinicians are expected to remain knowledgeable and competentf | 23 | 55.60 [45.83–59.40] |
Rigid culture: Extent to which clinicians have little autonomy and discretionf | 23 | 57.97 [52.95–63.18] |
Resistant culture: Extent to which clinicians are apathetic to changef | 23 | 64.22 [56.82–74.70] |
Engaged climate: Extent to which clinicians feel like they can accomplish worthwhile thingsf | 23 | 54.17 [48.82–58.72] |
Functional climate: Extent to which clinicians feel like they can function effectivelyf | 23 | 62.14 [55.33–72.19] |
Stressful climate: Extent to which clinicians are emotionally exhaustedf | 23 | 55.46 [51.82–59.15] |
a. Measured on a continuum from 1–5 where the anchors refer to 1 (rarely); 2 (seldom); 3 (sometimes); 4 (often); 5 (most of the time). Higher scores are indicative of more utilization of the set of techniques.
b. Does not sum to 100% due to missing data.
c. Measured on a continuum from 0–4 where the anchors refer to 0 (not at all); 1 (slight extent); 2 (moderate extent); 3 (great extent); 4 (very great extent). Higher scores are indicative of more positive attitudes, implementation climate, and implementation leadership.
d. Measured on a continuum from 0–160 where higher scores are indicative of more knowledge of evidence-based services for youth.
e. To calculate this variable, we added up the total number of years spent participating in EBP initiatives. For example, if an agency participated in one initiative for 2 years and another initiative for 3 years, the total score for this variable would be 5 years. Higher numbers are indicative of more time spent in EBP initiatives.
f. Organizational culture and climate are measured with T-scores with a mean of 50 and standard deviation of 10 based on a normed sample of 100 community mental health clinics. Higher scores on proficiency, engagement, and functionality are reflective of more positive culture or climate. Higher scores on rigidity, resistance, and stress are reflective of more negative culture and climate.
Abbreviations: EBP = evidence-based practice; FFS = fee-for-service; * = adds up to greater than 100% because of rounding error; ^ = not included in linear mixed-effects model.
See Table 1 for descriptive statistics for all variables included in the models. See eTable 1 for a correlation matrix documenting correlations between predictors and outcomes.
Clinician Use of Cognitive-Behavioral Techniques
See Table 2 for model parameters. Organizational factors accounted for 23% of the variance in clinicians’ use of cognitive-behavioral techniques; clinician factors accounted for 16%. Three organizational variables were associated with use of cognitive-behavioral techniques. Clinicians in organizations that had participated for fewer years in city-sponsored EBP initiatives, had more resistant cultures, and had more functional climates were more likely to use cognitive-behavioral techniques. Two clinician variables were associated with use of cognitive-behavioral techniques. Older clinicians and clinicians with more open attitudes towards new practices were more likely to use cognitive-behavioral techniques.
Table 2.
Variable | Cognitive-Behaviorala | Psychodynamica | Familya |
---|---|---|---|
Variance Components in Random Effects Only Model | |||
Organizational variance | .13 | .04 | .17 |
Residual variance | .35 | .39 | .75 |
ICC | .27 | .09 | .19 |
Clinician factors parameter estimates from regression analyses [confidence intervals, alpha = .05]
| |||||||||
---|---|---|---|---|---|---|---|---|---|
Demographics | Mean Diffb | Lower CI | Upper CI | Mean Diffb | Lower CI | Upper CI | Mean Diffb | Lower CI | Upper CI |
Male Gender (male vs. female) | −.13 | −.41 | .15 | −.26 | −.56 | .03 | −.49 | −.90 | −.08 |
Age (years; 1-year difference) | .02 | .00 | .03 | .01 | −.01 | .02 | .01 | −.01 | .03 |
Clinical Experience (years; 1-year difference) | −.00 | −.02 | .02 | −.01 | −.03 | .15 | .03 | −.00 | .06 |
Salaried Employment Status (salaried vs. FFS) | −.03 | −.34 | .29 | −.01 | −.32 | .33 | .03 | −.42 | .48 |
Attitudes | |||||||||
Requirements: Extent to which a therapist would adopt EBP if it were required (1-point difference)c | −.04 | −.15 | .08 | −.01 | −.13 | .11 | −.08 | −.24 | .08 |
Appeal: Extent to which a therapist would adopt EBP if it were appealing (1-point difference)c | .07 | −.12 | .27 | .09 | −.11 | .30 | .19 | −.10 | .48 |
Openness: Extent to which a therapist is open to trying EBP (1-point difference)c | .21 | .03 | .40 | .13 | −.06 | .32 | .06 | −.21 | .32 |
Divergence: Extent to which EBP is not clinically useful (1-point difference)c | .14 | −.02 | .30 | .23 | .06 | .40 | .05 | −.18 | .28 |
Knowledge | |||||||||
Total Knowledge of EBP for Youth (1-point difference)d | .00 | −.01 | .01 | −.02 | −.03 | −.00 | −.01 | −.03 | .01 |
Organizational factors parameter estimates from regression analyses [confidence intervals, alpha = .05]
| |||||||||
---|---|---|---|---|---|---|---|---|---|
Mean Diffb | Lower CI | Upper CI | Mean Diffb | Lower CI | Upper CI | Mean Diffb | Lower CI | Upper CI | |
Cumulative Years Participating in EBP initiatives (1-year difference)e | −.13 | −.22 | −.04 | −.05 | −.14 | .04 | −.12 | −.24 | .01 |
Program size (number of therapists; 1-therapist difference) | −.02 | −.04 | .00 | −.01 | −.03 | .01 | .01 | −.04 | .02 |
Organization Type (% FFS staff; 1-percentage difference) | .75 | −.08 | 1.58 | .55 | −.30 | 1.41 | 1.26 | .07 | 2.46 |
Implementation Climate | |||||||||
Focus on EBP: Extent to which an organization values and emphasizes EBP (1-point difference)c | −.13 | −.70 | .43 | .10 | −.49 | .69 | .01 | −.81 | .82 |
Educational support: Extent to which an organization provides educational support for EBP (1- point difference)c | −.27 | −.94 | .39 | .24 | −.45 | .93 | −.20 | −1.17 | .76 |
Recognition: Extent to which an organization recognizes staff implementing EBP (1-point difference)c | .19 | −.37 | .75 | .13 | −.45 | .71 | .13 | −.68 | .93 |
Reward: Extent to which an organization financially rewards staff implementing EBP (1-point difference)c | .04 | −.49 | .56 | −.06 | −.61 | .48 | .51 | −.24 | 1.27 |
Staff selection: Extent to which an organization selects staff based on ability to implement EBP (1-point difference)c | −.01 | −.65 | .62 | −.37 | −1.03 | .29 | −.76 | −1.68 | .16 |
Openness: Extent to which an organization is generally open to innovation (1-point difference)c | −.40 | −.99 | .18 | .13 | −.47 | .73 | .21 | −.64 | 1.04 |
Implementation Leadership | |||||||||
Proactive: Extent to which leader developed a plan to facilitate EBP implementation (1-point difference)c | .08 | −.45 | .61 | .01 | −.54 | .56 | −.04 | −.80 | .73 |
Knowledgeable: Extent to which leader is knowledgeable about EBP (1-point difference)c | .01 | −.36 | .38 | .11 | −.28 | .49 | −.24 | −.78 | .29 |
Supportive: Extent to which leader is supportive around EBP implementation (1-point difference)c | −.43 | −1.05 | .19 | −.42 | −1.06 | .23 | −.25 | −1.14 | .65 |
Perseverant: Extent to which leader is perseverant through ups and downs of EBP implementation (1-point difference)c | .35 | −.45 | 1.16 | .16 | −.67 | 1.00 | .54 | −.62 | 1.70 |
Organizational Social Context | |||||||||
Proficient culture: Extent to which clinicians are expected to remain knowledgeable and competent (1-point difference)f | −.03 | −.08 | .01 | −.02 | −.06 | .02 | −.02 | −.08 | .04 |
Rigid culture: Extent to which clinicians have little autonomy and discretion (1-point difference)f | −.02 | −.05 | .01 | −.01 | −.05 | .02 | −.04 | −.08 | .01 |
Resistant culture: Extent to which clinicians are apathetic to change (1-point difference)f | .05 | .02 | .08 | .02 | −.01 | .05 | .03 | −.02 | .07 |
Engaged climate: Extent to which clinicians feel like they can accomplish worthwhile things (1-point difference)f | .01 | −.04 | .05 | .01 | −.03 | .05 | .03 | −.03 | .09 |
Functional climate: Extent to which clinicians feel like they can function effectively (1-point difference)f | .09 | .02 | .16 | .03 | −.04 | .10 | .08 | −.02 | .17 |
Stressful climate: Extent to which clinicians are emotionally exhausted (1-point difference)f | .05 | −.00 | .11 | .03 | −.03 | .09 | .12 | .04 | .20 |
a. Measured on a continuum from 1–5 where the anchors refer to 1 (rarely); 2 (seldom); 3 (sometimes); 4 (often); 5 (most of the time). Higher scores are indicative of more utilization of the set of techniques.
b. Mean diff refers to the mean difference in the outcome between groups (for categorical variables) or associated with a 1-point difference (for continuous variables).
c. Measured on a continuum from 0–4 where the anchors refer to 0 (not at all); 1 (slight extent); 2 (moderate extent); 3 (great extent); 4 (very great extent). Higher scores are indicative of more positive attitudes, implementation climate, and implementation leadership.
d. Measured on a continuum from 0–160 where higher scores are indicative of more knowledge of evidence-based services for youth.
e. To calculate this variable, we added up the total number of years spent participating in EBP initiatives. For example, if an agency participated in one initiative for 2 years and another initiative for 3 years, the total score for this variable would be 5 years. Higher numbers are indicative of more time spent in EBP initiatives.
f. Organizational culture and climate are measured with T-scores with a mean of 50 and standard deviation of 10 based on a normed sample of 100 community mental health clinics. Higher scores on proficiency, engagement, and functionality are reflective of more positive culture or climate. Higher scores on rigidity, resistance, and stress are reflective of more negative culture and climate.
Abbreviations: ICC = intraclass correlation coefficient; EBP = evidence-based practice; FFS = fee-for-service
Clinician Use of Family Techniques
Organizational variables accounted for 19% of the variance in the use of family techniques; clinician variables accounted for 7%. Two organizational variables were associated with use of family techniques: clinicians in organizations employing more fee-for-service staff and in organizations with more stressful climates were more likely to use family techniques. One clinician variable was associated with use of family techniques: female clinicians were more likely to use family techniques.
Clinician Use of Psychodynamic Techniques
Clinician factors accounted for 20% of the variance in the use of psychodynamic techniques; organizational factors accounted for 7%. Two clinician factors were associated with use of psychodynamic techniques: clinicians who perceived greater divergence between evidence-based practices and their current practice, and clinicians with less knowledge about EBP, were more likely to use psychodynamic techniques.
Variance Attributable to Clinician and Organizational Factors Collectively
For use of cognitive-behavioral techniques, collectively clinician and organizational factors explained 30% of the overall variation; for use of psychodynamic strategies, collectively clinician and organizational factors explained 18% of the overall variation; for use of family strategies, collectively clinician and organizational factors explained 26% of the overall variation (data not shown).
Discussion
This study provides information on what predicts clinicians’ use of therapy techniques in a large public mental health system supporting implementation of CBT. Organizational factors accounted for more of the variance than clinician factors in clinicians’ use of cognitive-behavioral and family techniques. Conversely, clinician factors accounted for more of the variance in their use of psychodynamic techniques. CBT and family therapy are both evidence-based approaches for childhood disorders, whereas psychodynamic techniques have less rigorous evidence to support them.10 Taken collectively, these findings suggest that organizational factors are more likely to drive use of EBPs, whereas clinician attributes are more likely to drive use of non-EBP therapy techniques.
Consistent with the literature,16 clinicians with more open attitudes were more likely to use cognitive-behavioral techniques. However, inconsistent with the literature,14 older clinicians were more likely to use cognitive-behavioral techniques. Older therapists may have more experience changing treatment modalities according to demand because they have been in the system longer. Paradoxically, clinicians in organizations that had spent fewer years participating in city-sponsored EBP initiatives were more likely to use cognitive-behavioral techniques. It is possible that organizations that seek out CBT training have clinicians who are less experienced in CBT. Alternatively, organizations spending more years participating in EBP initiatives may experience “evidence-based practice fatigue” (i.e., innovation fatigue28): the stress of the competing demands of difficult human service jobs, coupled with a lack of clarity about how these initiatives fit the therapist’s role. Further research is needed to understand potential unintended consequences of EBP efforts such as innovation fatigue.
Organizations with more resistant cultures and organizations with more functional climates were more likely to have clinicians who endorsed using cognitive-behavioral techniques. Organizations with more resistant cultures may be more likely to participate in initiatives to increase use of innovation. Clinicians’ perceptions that they are functioning effectively appear to contribute to more use of cognitive-behavioral techniques. Organizations with more stressful climates and with more fee-for-service staff were more likely to have clinicians who used family techniques. It is possible that organizations with more stressful climates serve a more chaotic population, suggesting the potential utility of family techniques, which are indicated for youth with psychiatric disorders and chaotic family environments.29
Consistent with the literature, clinicians who were less knowledgeable about EBP and held more divergent attitudes towards EBP were more likely to use psychodynamic techniques.30 Also consistent with previous findings,9 clinicians reported using cognitive-behavioral and other therapy techniques at the same time (i.e., eclecticism), suggesting a potential exnovation problem. Exnovation refers to “the process whereby an organization decides to divest itself of an innovation that it had previously adopted”.31 Further research is needed to understand how organizations can plan for EBP fit with existing practices.
Interestingly, implementation climate and leadership, constructs hypothesized to be related to implementation outcomes,8,32 did not predict use of EBP. However, the present study did not examine more complex interactive or mediational processes to account for how these constructs may work together with molar culture or climate to predict the outcome utilized in this study.33
Some study limitations should be noted. First, the primary outcome variables are based on clinician self-report, and clinicians are not always accurate reporters of their use of therapeutic techniques.34,35 Second, we did not have 100% participation, creating potential selection bias at both the therapist and organizational levels. Third, because of sample size limitations, we used a random-intercepts-only model and did not allow slopes to vary by organization.
These findings offer important implications. Organizational variables accounted for more of the variance than individual variables in predicting use of EBP, suggesting that organizational-level implementation strategies36 will be more effective in increasing the use of EBP than implementation strategies that directly target the clinician. Clinician factors accounted for more of the variance than organizational factors in clinician use of psychodynamic techniques. Efforts to implement EBP may therefore need to be accompanied by efforts to encourage clinicians to divest themselves of outdated practices. While these two activities may be seen as two sides of the same coin, the results of this study suggest that the process of exnovation may be driven by different factors than the process of implementation. Of perhaps equal importance, the variables included in our measurement model represent many of the constructs posited to predict implementation, and yet separately they accounted for at most 23% of the overall variation in outcomes, suggesting that there are a number of unmeasured constructs. Finally, this study highlights the need for prospective studies to test the relative contributions and interactions of clinician-focused and organization-focused implementation strategies on adoption, fidelity, and sustainment of EBP.
These findings suggest that clinician and organizational factors have a nuanced impact on clinicians’ use of therapy techniques. This study suggests that both “where you work” and “who you are” matter in understanding clinician behavior in context, and that efforts to improve the effectiveness of implementation strategies should consider both.37
Supplementary Material
Acknowledgments
Funding. This research project was supported by a grant from NIMH (K23 MH099179; Beidas). Additionally, the preparation of this article was supported in part by the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, through an award from the National Institute of Mental Health (R25 MH080916) and a Quality Enhancement Research Initiative (QUERI) contract from the Department of Veterans Affairs, Veterans Health Administration, Office of Research & Development, Health Services Research & Development Service. Dr. Beidas was an IRI fellow from 2012–2014.
Role of the funder/sponsor. The funding source had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and the decision to submit the manuscript for publication.
Additional acknowledgements. We are especially grateful for the support that the Department of Behavioral Health and Intellectual disAbility Services has provided for this project, and for the Evidence Based Practice and Innovation (EPIC) group. We would also like to thank the following experts who provided their time and input on this project: Dr. Marc Atkins, Dr. Ross Brownson, Dr. David Chambers, Dr. Bruce Chorpita, Dr. Charles Glisson, Dr. Nicholas Ialongo, Dr. John Landsverk, Dr. Enola Proctor, and Dr. Ronnie Rubin.
Footnotes
Author Contributions. Dr. Beidas is the principal investigator. She designed the study, was the primary writer of the manuscript, and approved all changes. Dr. Beidas had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Dr. Mandell is the primary mentor for Dr. Beidas’s K23 award which provides support for all study activities. Dr. Marcus provided support around data analysis and interpretation. The following authors (Aarons, Barg, Evans, Hurford, Hadley, Hoagwood, Marcus, Schoenwald) have provided input into the design of the study and read and/or edited the manuscript. Ms. Walsh and Ms. Adams have coordinated the study and have contributed to the design of the study. All authors reviewed and provided feedback for this manuscript. The final version of this manuscript was vetted and approved by all authors.
Conflict of Interest Disclosures. Dr. Beidas receives royalties from Oxford University Press and has served as a consultant for Kinark Child and Family Services. Dr. Marcus has received grant support from Ortho-McNeil Janssen and Forest Research Institute and has served as a consultant to AstraZeneca and Alkermes. Dr. Schoenwald is a stakeholder and board member of MST Services, LLC. None of the reported disclosures are related to implementation of evidence-based practices for youth in the City of Philadelphia. The following authors have no disclosures to report: (Dr. Aarons; Ms. Adams; Dr. Barg; Dr. Evans; Dr. Hadley; Dr. Hoagwood; Dr. Hurford; Dr. Mandell; Ms. Walsh).
References
- 1.Aarons GA, Hurlburt M, McCue Horwitz S. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Hlth. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7.
- 2.Henggeler SW, Chapman JE, Rowland MD, et al. Statewide adoption and initial implementation of contingency management for substance-abusing adolescents. J Consult Clin Psychol. 2008;76(4):556–567. doi: 10.1037/0022-006X.76.4.556.
- 3.Beidas RS, Edmunds J, Ditty M, et al. Are inner context factors related to implementation outcomes in cognitive-behavioral therapy for youth anxiety? Adm Policy Ment Hlth. 2014;41(6):788–799. doi: 10.1007/s10488-013-0529-x.
- 4.Glisson C, Dukes D, Green P. The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse Neglect. 2006;30:855–880. doi: 10.1016/j.chiabu.2005.12.010.
- 5.Peterson AE, Bond GR, Drake RE, McHugo GJ, Jones AM, Williams JR. Predicting the long-term sustainability of evidence-based practices in mental health care: an 8-year longitudinal analysis. J Behav Health Ser R. 2014;41(3):337–346. doi: 10.1007/s11414-013-9347-x.
- 6.Glisson C, Hemmelgarn A. The effects of organizational climate and interorganizational coordination on the quality and outcomes of children’s service systems. Child Abuse Neglect. 1998;22:401–421. doi: 10.1016/s0145-2134(98)00005-2.
- 7.Glisson C, Green P. Organizational climate, services, and outcomes in child welfare systems. Child Abuse Neglect. 2011;35:582–591. doi: 10.1016/j.chiabu.2011.04.009.
- 8.Aarons GA, Glisson C, Green PD, et al. The organizational social context of mental health services and clinician attitudes toward evidence-based practice: a United States national study. Implementation Sci. 2012;7(56):1–15. doi: 10.1186/1748-5908-7-56.
- 9.Garland AF, Brookman-Frazee L, Hurlburt MS, et al. Mental health care for children with disruptive behavior problems: a view inside therapists’ offices. Psychiatr Serv. 2010;61(8):788–795. doi: 10.1176/appi.ps.61.8.788.
- 10.Chambless DL, Ollendick TH. Empirically supported psychological interventions: controversies and evidence. Annu Rev Psychol. 2001;52:685–716. doi: 10.1146/annurev.psych.52.1.685.
- 11.Mattejat F. Evidence-based family therapy. Which family-based interventions are empirically supported? Kindh Entwickl. 2005;14(1):3–11.
- 12.Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap) - a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–381. doi: 10.1016/j.jbi.2008.08.010.
- 13.Weisz JR. Therapist Background Questionnaire. Los Angeles, CA: University of California; 1997.
- 14.Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
- 15.Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes towards evidence-based practice. Psychol Serv. 2006;3(1):61–72. doi: 10.1037/1541-1559.3.1.61.
- 16.Aarons G. Measuring provider attitudes toward evidence-based practice: consideration of organizational context and individual differences. Child Adol Psych Cl. 2005;14(2):255–271. doi: 10.1016/j.chc.2004.04.008.
- 17.Stumpf RE, Higa-McMillan CK, Chorpita BF. Implementation of evidence-based services for youth: assessing provider knowledge. Behav Modif. 2009;33(1):48–65. doi: 10.1177/0145445508322625.
- 18.Glisson C, Landsverk J, Schoenwald S, et al. Assessing the organizational social context (OSC) of mental health services: implications for research and practice. Adm Policy Ment Hlth. 2008;35(1–2):98–113. doi: 10.1007/s10488-007-0148-5.
- 19.Glisson C, Green P, Williams NJ. Assessing the Organizational Social Context (OSC) of child welfare systems: implications for research and practice. Child Abuse Neglect. 2012;36(9):621–632. doi: 10.1016/j.chiabu.2012.06.002.
- 20.Aarons GA. Organizational Climate for Evidence-Based Practice Implementation: Development of a New Scale. Presented at the Association for Behavioral and Cognitive Therapies; Toronto, Ontario; 2011.
- 21.Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implementation Sci. 2014;9(45). doi: 10.1186/1748-5908-9-45.
- 22.Weersing VR, Weisz JR, Donenberg GR. Development of the Therapy Procedures Checklist: a therapist-report measure of technique use in child and adolescent treatment. J Clin Child Adolesc Psychol. 2002;31(2):168–180. doi: 10.1207/S15374424JCCP3102_03.
- 23.Kolko DJ, Cohen JA, Mannarino AP, Baumann BL, Knudsen K. Community treatment of child sexual abuse: a survey of practitioners in the National Child Traumatic Stress Network. Adm Policy Ment Hlth. 2009;36(1):37–49. doi: 10.1007/s10488-008-0180-0.
- 24.James LR, Demaree RG, Wolf G. Estimating within-group interrater reliability with and without response bias. J Appl Psychol. 1984;69(1):85–98.
- 25.Brown RD, Hauenstein NMA. Interrater agreement reconsidered: an alternative to the r(wg) indices. Organ Res Methods. 2005;8(2):165–184.
- 26.Bliese P. Within-group agreement, non-independence, and reliability: implications for data aggregation and analysis. In: Klein K, Kozlowski S, editors. Multilevel Theory, Research, and Methods in Organizations. 1st ed. San Francisco, CA: Jossey-Bass; 2000. pp. 349–380.
- 27.Singer JD. Using SAS PROC MIXED to fit multilevel models, hierarchical models, and individual growth models. J Educ Behav Stat. 1998;23(4):323–355.
- 28.Lindsay J, Perkins C, Karanjikar M. Conquering Innovation Fatigue: Overcoming the Barriers to Personal and Corporate Success. Hoboken, NJ: Wiley; 2009.
- 29.Sexton T, Turner CW. The effectiveness of functional family therapy for youth with behavioral problems in a community practice setting. J Fam Psychol. 2010;24(3):339–348. doi: 10.1037/a0019406.
- 30.Nakamura BJ, Higa-McMillan CK, Okamura KH, Shimabukuro S. Knowledge of and attitudes towards evidence-based practices in community child mental health practitioners. Adm Policy Ment Hlth. 2011;38(4):287–300. doi: 10.1007/s10488-011-0351-2.
- 31.Kimberly JR, Evanisko MJ. Organizational innovation: the influence of individual, organizational, and contextual factors on hospital adoption of technological and administrative innovations. Acad Manage J. 1981;24(4):689–713.
- 32.Aarons GA, Sommerfeld DH. Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. J Am Acad Child Psy. 2012;51(4):423–431. doi: 10.1016/j.jaac.2012.01.018.
- 33.Ehrhart MG, Schneider B, Macey WH. Organizational Climate and Culture: An Introduction to Theory, Research, and Practice. New York, NY: Routledge; 2014.
- 34.Bellg AJ, Borrelli B, Resnick B, et al. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol. 2004;23(5):443–451.
- 35.Borrelli B, Sepinwall D, Ernst D, et al. A new tool to assess treatment fidelity and evaluation of treatment fidelity across 10 years of health behavior research. J Consult Clin Psychol. 2005;73(5):852–860. doi: 10.1037/0022-006X.73.5.852.
- 36.Glisson C, Schoenwald SK, Hemmelgarn A, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010;78(4):537–550. doi: 10.1037/a0019160.
- 37.Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Hlth. 2011;38(2):65–76. doi: 10.1007/s10488-010-0319-7.