Psychiatry Journal
2014 Mar 3;2014:802983. doi: 10.1155/2014/802983

Organizational Factors Influencing Implementation of Evidence-Based Practices for Integrated Treatment in Behavioral Health Agencies

Caroline A Bonham 1,*, David Sommerfeld 2, Cathleen Willging 3, Gregory A Aarons 2
PMCID: PMC3989772  PMID: 24772411

Abstract

Objective. In recent years, New Mexico has prioritized integrated treatment for cooccurring mental health and substance use disorders within its public behavioral health system. This report describes factors likely to be important when implementing evidence-based practices (EBPs) in community agencies. Methods. Our mixed-method research design consisted of observations, semistructured interviews, and surveys undertaken with employees at 14 agencies at baseline and after 18 months. We developed a typology of four agency types based on iterative coding and analysis of observations and interviews. We then examined survey data from employees at the four exemplar agencies to validate qualitative findings. Results. Financial resources and strong leadership impacted agency capacity to train providers and implement EBPs. Quantitative analysis of service provider survey responses from these agencies (N = 38) supported qualitative findings and demonstrated significant mean score differences in leadership, organizational climate, and attitudes toward EBPs in the anticipated directions. Conclusion. The availability of strong leadership and financial resources were key components of initial implementation success in this study of community agencies in New Mexico. Reliance on external funding alone poses risks to sustainment when demoralizing work climates precipitate employee turnover. Strong agency leadership does not always compensate for deficient financial resources in vulnerable communities.

1. Introduction

Despite substantial comorbidity of mental health and substance use disorders [1], integrated treatment is rarely offered in community settings [2]. Since integrated treatment is associated with improved mental health and substance use outcomes [3], as well as reductions in hospitalizations and arrests [4], this failure represents a significant public health concern.

When treating clients with cooccurring disorders, providers must select appropriate clinical interventions [5]. Several specific evidence-based practices (EBPs) fall under the rubric of integrated treatment. EBPs can be defined as “approaches to prevention or treatment that are based in theory and have undergone scientific evaluation” [6]. For example, Dialectical Behavioral Therapy [7] and Seeking Safety [8] have been shown to improve both mental health and substance abuse outcomes in populations with cooccurring disorders. These interventions and similar EBPs help providers deliver comprehensive treatment that addresses both disorders simultaneously. In recent years, research into EBP implementation has become recognized as an important area of study in order to improve health outcomes by identifying how to best translate effective clinical interventions into the US healthcare system [9, 10].

In 2004, the predominantly rural state of New Mexico (NM) prioritized integrated treatment and use of EBPs within the public sector and obtained federal funding to provide training in these modalities. However, implementation of new practices often involves organizational change and is rarely a simple problem that can be addressed with a single series of training sessions [11]. Organizational factors are important predictors of implementation [12, 13]. Agency culture and climate can affect provider attitudes toward adopting EBPs [14]. Positive leadership styles are associated with supportive organizational climates and receptive staff attitudes toward EBPs [15]. Thus, agency leadership is one potential route for modifying organizational climate and encouraging utilization of EBPs [16, 17].

Our study is informed by the Consolidated Framework for Implementation Research in Health Services [18], which suggests that leadership, resources, and access to knowledge and information play crucial roles in determining whether an organization is ready to adopt new practices. Our mixed-method study clarifies how these factors influence delivery of EBPs for integrated treatment in community agencies and identifies agency profiles that can facilitate or hinder these changes.

2. Methods

2.1. Study Design and Sampling

We employed a mixed-method research design between 2006 and 2008 as part of a long-term study of broader reform efforts in NM and their effects on services for adults with serious mental illness (SMI). This research took place in three rural counties and three counties with metropolitan areas; each was chosen on the basis of geographical diversity and catchment area characteristics (e.g., predominance of ethnic minority versus white residents). We collected qualitative and quantitative data on organizational leadership, climate and finances, training, and provider attitudes toward EBPs and use of these interventions from the 14 behavioral health agencies that provided the majority of services within each county. Although relatively small compared with agencies elsewhere in the nation, these agencies reflect the state's range of provider organizations.

Within each agency, we used a purposive sampling approach to identify and recruit participants [19]. We first interviewed a lead administrator, who then referred us to all service providers (e.g., psychiatrists, social workers, counselors, and case managers) and support staff who worked with adults with SMI. Characteristics of participating providers are presented in Table 1.

Table 1.

Demographics of sample.

Characteristic %
Gender
 Male 28
 Female 72
Position at agency
 Clinician 56
 Support staff 24
 Upper level administrator 20
Education
 <8th grade 0.5
 Completed high school 11
 Some college 30
 Completed college 20
 Some graduate education 4
 Completed master's degree 28
 Completed doctorate (PhD, MD, or EdD) 6
 Missing/other 5
Ethnicity
 Non-Hispanic White 44
 Hispanic 39
 Native American 17
 Other 3
Mean age of participants: 46 years

As shown in Table 2, we conducted semistructured interviews and quantitative surveys with participants at baseline (Time 1) and after 18 months (Time 2). One of the original agencies closed, resulting in a total of 13 agencies at Time 2.

Table 2.

Sequence of data collection.

Time 1 (baseline): 14 agencies
 Qualitative data
  Observations: 24 hours at each agency
  Semistructured interviews: direct service providers (n = 110), support staff (n = 41), upper level administrators (n = 39)

Time 2 (18 months later): 13 agencies
 Qualitative data
  Observations: at least 8 hours at each agency
  Semistructured interviews: direct service providers (n = 93), support staff (n = 30), upper level administrators (n = 27)
 Quantitative data
  Attitudes towards evidence-based practices (n = 34)
  Transformational leadership (n = 38)
  Demoralizing climate (n = 38)

2.2. Qualitative Data Collection and Analysis

We developed semistructured interview protocols tailored to administrators, providers, and staff. At each wave of data collection, the research team conducted observations within each agency to better understand contextual factors affecting EBP implementation [20]. When conducting qualitative interviews at Time 2, interviewers used a slightly modified interview protocol and asked about recent agency changes. The research team took extensive field notes, which were typed and uploaded to an electronic database. All participants agreed to have their interviews recorded; interviews lasted 45–60 minutes and were professionally transcribed.

The first and third authors coded the interviews and observations using NVivo software [21]. We used a descriptive coding scheme based on questions from the interview protocol and the broader domains of our conceptual model. The resulting codes centered on leadership, training, financial resources, EBP utilization, and workplace morale. When analyzing observation data, we first used open coding to locate themes. We then used focused coding to determine which themes emerged frequently and which represented particular concerns [22]. Discrepancies in coding and analysis were resolved during team meetings through a process of consensus by comparing and contrasting the content of reports, field notes, and interview transcripts [23].

Next, we identified distinct agency profiles associated with the likelihood of implementing EBPs related to integrated treatment. Interventions in the National Registry of Evidence-Based Programs and Practices were identified as EBPs [6]. All references within the interview transcripts to these EBPs were tallied for each agency at both Time 1 and Time 2. Agencies were considered to be in the process of implementing an EBP if multiple providers at Time 2 referred to using the intervention and if observation notes from that agency provided additional confirmation that the EBP was routinely available. We examined all references to EBPs within the interview transcripts and observation notes to identify patterns of potential influences.
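The implementation criterion described above can be sketched as a simple decision rule. The function and data below are purely illustrative (the study's actual coded data are not reproduced here): an EBP counts as implemented only when at least two providers mention using it and observation notes independently confirm its routine availability.

```python
# Hypothetical sketch of the study's tally-and-confirm rule; all names,
# providers, and EBP labels below are illustrative, not study data.

def implemented_ebps(provider_mentions, observation_confirmed):
    """provider_mentions: (provider_id, ebp_name) pairs coded from transcripts.
    observation_confirmed: set of EBP names confirmed as routinely available."""
    providers_per_ebp = {}
    for provider_id, ebp in provider_mentions:
        providers_per_ebp.setdefault(ebp, set()).add(provider_id)
    # Require multiple distinct providers AND observational confirmation
    return {
        ebp for ebp, providers in providers_per_ebp.items()
        if len(providers) >= 2 and ebp in observation_confirmed
    }

mentions = [("p1", "Seeking Safety"), ("p2", "Seeking Safety"),
            ("p1", "DBT"), ("p3", "MET")]
confirmed = {"Seeking Safety", "DBT"}
print(implemented_ebps(mentions, confirmed))  # only Seeking Safety qualifies
```

In this toy example, DBT is observationally confirmed but mentioned by a single provider, so it is excluded, mirroring the paper's requirement that multiple providers refer to an intervention.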

Through this analysis, we identified four agencies that best illustrated each of the profiles and exemplified the patterns manifest across the entire dataset. These four agencies served as instrumental case studies providing insight and context into conditions that promoted or hindered EBP implementation [24]. The interviews and observation notes from the remaining agencies were then reexamined to ensure that conclusions from the case studies could be generalized across sites. Table 3 summarizes agency characteristics.

Table 3.

Characteristics of agencies.

Agency Rural or urban setting Agency size^a Degree of staff turnover^b Number of EBPs at Time 1 Number of EBPs at Time 2
Agency increased uptake of EBPs
 *Serves homeless population Urban Medium Low 3 5
 **Community mental health center Rural Medium High 0 1
 ***Community mental health center Rural Large Moderate 2 4
 Substance abuse treatment facility Urban Medium Moderate 1 3
 Community mental health center Urban Medium Moderate 0 2
 Community mental health center Rural Large Moderate 0 1
 Community mental health center Rural Small High 2 3
Agency decreased use of EBPs
 ****Substance abuse treatment facility Rural Large Moderate 2 0
 Community mental health center Urban Large Moderate 1 0
 Small group practice Rural Small N/A 2 Closed
Agency had no change in use of EBPs
 Community mental health center Urban Large Low 2 2
 Substance abuse treatment facility Rural Large Moderate 2 2
 Small group practice Urban Small High 0 0
 Serves homeless population Urban Small Low 0 0

*Agency A: strongly facilitative.

**Agency B: leader driven.

***Agency C: resource driven.

****Agency D: resource deprived.

^a Small agency: less than 8 employees; medium agency: 9–14 employees; large agency: 15–21 employees.

^b Low turnover: less than 25%; moderate turnover: 25–50%; high turnover: more than 50%.

After analyzing the qualitative data, we hypothesized that organizational climate was a function of (1) leadership and (2) access to resources, and that (3) providers at agencies with positive work climates would be favorably disposed toward use of EBPs. We then assessed these hypotheses with Time 2 quantitative data from the four agencies serving as case studies.

2.3. Quantitative Data and Analysis

For the quantitative investigation, we compared the average ratings of leadership, demoralizing climate, and attitudes toward EBPs from the providers employed by the four agencies (N = 38). Missing data regarding attitudes toward EBPs reduced the total number of provider ratings to 34 for this measure. Each measure is described below.

2.3.1. Transformational Leadership

The Multifactor Leadership Questionnaire (MLQ) [25] is a widely used measure of leadership in organizations. The MLQ scores have been associated with organizational climate in behavioral health agencies [14]. For this research, we focused on transformational leadership, comprised of five subscales including idealized influence attributed (four items, α = .91), idealized influence behavioral (three items, α = .86), inspirational motivation (four items, α = .94), intellectual stimulation (four items, α = .93), and individual consideration (four items, α = .87). We asked participants to indicate the extent to which their supervisor demonstrated each behavior measured by the MLQ. These behaviors were rated on a 5-point scale ranging from 0, “not at all,” to 4, “to a very great extent.”

2.3.2. Demoralizing Climate

Demoralizing climate was measured by items from the Children's Services Survey [26], adapted from studies of diverse workplaces, and used previously to assess climate in public mental health agencies [27]. The subscales measuring demoralizing climate include depersonalization (five items, α = .85), emotional exhaustion (six items, α = .94), and role conflict (nine items, α = .88). As with the MLQ, each statement was rated on the same 5-point scale described above.

2.3.3. Attitudes toward EBPs

The Evidence-Based Practice Attitudes Scale (EBPAS) is a 15-item measure that assesses mental health and social service provider attitudes toward adopting EBPs [28]. EBPAS items are also rated with the same 5-point scale. Cronbach's alpha reliability for the overall EBPAS is good (α = .79), with subscale alphas ranging from 0.66 to 0.93 [29]. The EBPAS total scale score (used in the present study) represents participants' global attitudes toward adoption of EBPs.
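The internal-consistency (Cronbach's alpha) values reported for these scales can be computed from item-level ratings with the standard formula. The sketch below uses made-up ratings on the 0–4 scale (rows are respondents, columns are items); it is not the study's data, only an illustration of how such alphas are derived.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score),
    where k is the number of items (columns)."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative ratings: 5 respondents x 4 items on the 0-4 response scale
ratings = np.array([[4, 3, 4, 4],
                    [2, 2, 1, 2],
                    [3, 3, 3, 4],
                    [1, 0, 1, 1],
                    [4, 4, 3, 4]])
print(round(cronbach_alpha(ratings), 2))  # high alpha: items covary strongly
```

Because the fabricated items track each other closely, alpha here comes out near the top of the range; real subscale alphas, like the 0.66–0.93 reported above, vary with item heterogeneity and sample size.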

We conducted ANOVAs to assess differences in mean ratings of transformational leadership, demoralizing climate, and EBPAS scores at the four selected agencies. When significant differences were found, we used post hoc analyses of all pairwise comparisons to identify which specific typologies differed from each other on the measure, applying the Bonferroni adjustment to minimize the inflated risk of Type I error. To explicitly examine the role of leadership, we also conducted t-tests comparing the combined service provider ratings from the two agencies with strong leadership to those from the two agencies with weaker leadership.
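The analytic sequence just described (omnibus ANOVA, Bonferroni-adjusted pairwise comparisons, and a two-group contrast) can be sketched with SciPy. The ratings below are simulated with the group sizes from Table 4; the means and spreads are invented for illustration only.

```python
from itertools import combinations

import numpy as np
from scipy import stats

# Simulated provider ratings per agency (group sizes match Table 4;
# means/SDs are illustrative assumptions, not the study's data).
rng = np.random.default_rng(0)
agencies = {
    "A": rng.normal(3.0, 0.5, 10),  # strongly facilitative
    "B": rng.normal(2.6, 0.5, 5),   # leader driven
    "C": rng.normal(2.0, 0.5, 15),  # resource driven
    "D": rng.normal(1.8, 0.5, 8),   # resource deprived
}

# Step 1: one-way ANOVA across the four agencies
f_stat, p_val = stats.f_oneway(*agencies.values())

# Step 2: if significant, all pairwise t-tests with a Bonferroni
# adjustment (6 comparisons, so each is tested at alpha / 6)
alpha = 0.05
if p_val < alpha:
    pairs = list(combinations(agencies, 2))
    for a, b in pairs:
        t, p = stats.ttest_ind(agencies[a], agencies[b])
        print(a, b, "significant" if p < alpha / len(pairs) else "n.s.")

# Step 3: contrast the two strong-leadership agencies (A, B) with
# the two weaker-leadership agencies (C, D)
strong = np.concatenate([agencies["A"], agencies["B"]])
weak = np.concatenate([agencies["C"], agencies["D"]])
t_stat, p_contrast = stats.ttest_ind(strong, weak)
```

Note that applying the Bonferroni correction here means dividing the significance threshold by the six possible agency pairs, which is conservative but straightforward when the number of comparisons is small.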

3. Qualitative Results

Four distinct agency types were associated with the adoption of EBPs related to integrated treatment within the study period: strongly facilitative, resource driven, leader driven, and resource deprived. The following case studies illustrate how leadership, access to resources, and training opportunities can influence the implementation process.

3.1. Strongly Facilitative (Agency A)

“Strongly facilitative” agencies implemented EBPs through supportive leadership and external funding. Agency A was led by a popular administrator, who ensured that providers were trained in EBPs and had time for weekly supervision. These efforts helped to establish a constructive organizational climate where providers described feeling motivated and challenged by their work. By Time 2, providers regularly incorporated five EBPs into clinical work. All data from this agency indicated that the EBPs were part of routine care. During interviews, providers spoke about these interventions knowledgably; the agency's website provided clear summaries of the EBPs; and observation data confirmed that providers had access to necessary materials. This agency participated in the National Institute on Drug Abuse Clinical Trials Network. Through this network, agency personnel identified funding opportunities and successfully competed for external grants that covered the high cost of training and buffered the agency from lost income when providers did not generate revenue through the provision of direct services.

One provider underscored the importance of a positive organizational climate when working in a field with many clinical challenges.

  • “I like the staff I work with, I wouldn't be working here if it wasn't that way…There's a lot of laughter, and that means a lot to me…The stressor is the nature of the work…Working with mental illness, and drug and alcohol clients—it's draining. You're dealing with crises a lot.”

Unlike personnel in other agencies, all providers at this agency commented on the high workplace morale; the majority also felt prepared to deliver integrated treatment.

3.2. Leader Driven (Agency B)

At “leader driven” agencies, strong managers initiated EBPs despite limited funding. Agency B served an economically challenged rural region with high rates of substance use. The clinic had minimal infrastructure and considerable financial stress. When interviewed at Time 1, providers were largely unfamiliar with EBPs and had difficulty naming any interventions based on research. Eighteen months later, the agency was consistently offering one new EBP. However, because of financial constraints, none of the providers received formal training and they expressed doubts about their abilities to deliver this EBP effectively.

  • “It would've been nice to have some training…They had talked about training, but then they just threw us in there. We reviewed (the manual) and just started piecing it together, and we probably didn't do it right at first.”

Just prior to the launch of this new EBP, the agency hired a popular clinical director who prioritized education and, in response to provider requests, instituted regular supervision sessions. However, he worried that these sessions were insufficient for teaching new material.

  • “Folks need help, support, and training. They need more time with their supervisor. The supervisor needs more training and more help. It's just that if it's not billable, we can't do it. That's what it boils down to…(we) can't bill for training, can't get reimbursed for training.”

When first learning interventions, both providers and supervisors needed opportunities to ensure that their knowledge of the practice was complete. Without time, funding, and educational resources, it was difficult to implement EBPs with fidelity.

Despite receiving personal support from the director, providers at this agency felt overwhelmed. Ultimately there was high turnover. Providers described a lack of community resources that hampered appropriate treatment for people with cooccurring disorders. When asked why work was stressful, they emphasized daunting community needs rather than specific complaints about internal agency management.

3.3. Resource Driven (Agency C)

Resource driven agencies relied primarily on external funding and workshops to implement new practices. Agency C served a rural community and operated several clinic sites. After learning that EBPs were eligible for higher reimbursement rates, the lead administrator hired staff to pursue grant opportunities to diversify the organization's funding portfolio. These resources allowed the agency to send providers to trainings. As in the case of the leader driven agency, few had familiarity with EBPs at Time 1. However, by Time 2, many clinicians had adopted up to four EBPs. They spoke positively about these changes, asserting that EBPs provided structure to an otherwise chaotic work environment.

Time constraints impeded the dissemination process. Although group supervision sessions were potentially available, most providers could not attend them regularly because of clinical commitments and the logistics of traveling between offices.

At this agency, providers routinely expressed frustration with their work environment. These frustrations reflected the challenges of rural practice: working in small teams, staffing several offices, and providing services to clients with complex needs over a large geographical area.

The relatively rapid uptake of several practices confirms that access to additional financial resources facilitated implementation. Nevertheless, a number of providers confided that they were thinking of leaving the agency, which would disrupt efforts to sustain these changes.

3.4. Resource Deprived (Agency D)

In “resource deprived” settings, weak leadership and inadequate funding discouraged the use of EBPs. Our fourth case study involved a rural agency with a history of collaborating with university researchers on the adaptation of existing EBPs for different cultural populations. During Time 1, providers were enthusiastic about EBPs and spoke positively about their relationship with the university. However, by Time 2, the research relationships were in transition as agency leadership changed. Previous grants were ending and program funding was unlikely to be renewed. At this time, providers were more skeptical about the value of EBPs. They also felt disconnected from the world of academia and research. One provider explained the following.

  • “Evidence-based practices show significance, but I don't see the benefits at a local level…(Researchers) come in and use their evidence-based approaches…With the information they collect, they take it back…I don't see what in the world they do with it.”

Finally, in the climate of fiscal stress and leadership changes, many providers expressed low morale. Due to financial problems, the agency lost several employees. Some found better paying jobs; the contracts of others were not renewed when grants ended. Although at Time 1 several providers used EBPs regularly, by Time 2, none of the providers were using these interventions. Some said it was simply not possible to deliver EBPs without sufficient staff. In this setting, turnover among both providers and leadership emerged as a barrier to implementation efforts.

4. Quantitative Results

Based on our qualitative analysis, we had anticipated that the strongly facilitative and leader driven agencies would have higher scores for transformational leadership than the resource driven and resource deprived agencies. We additionally expected that the strongly facilitative agency would have the lowest demoralizing climate score and the resource deprived agency the highest. Finally, we anticipated that the relationship between agency leadership and resources would be evident in provider attitudes toward EBPs: providers at the strongly facilitative agency would report more positive experiences with EBPs, whereas providers at the resource deprived agency might be more skeptical. As shown in Table 4, survey results indicated that ratings of agency leadership and organizational climate were generally consistent with the qualitative findings. The ANOVA test for transformational leadership did not identify significant differences between the four agencies, likely due to the small sample size within each agency. However, as anticipated, the mean scores for the two agencies with stronger leadership were higher than for the two agencies with weaker leadership. Significant differences emerged when we grouped service providers from the strongly facilitative and leader driven agencies and compared them to their counterparts at the resource driven and resource deprived agencies (t = 2.06, P = 0.046).

Table 4.

Service provider ratings of transformational leadership, demoralizing climate, and EBP attitude by agency.

Agency N Mean S.D. S.E. F Sig.
Transformational leadership
 A—strongly facilitative 10 2.55 0.69 0.22
 B—leader driven 5 2.62 1.58 0.71 1.36 0.27
 C—resource driven 15 1.87 1.15 0.30
 D—resource deprived 8 1.78 1.11 0.39
Demoralizing climate
 A—strongly facilitative 10 0.61 0.39 0.12
 B—leader driven 5 1.69 0.5 0.22 3.91 0.02
 C—resource driven 15 1.21 0.63 0.16
 D—resource deprived 8 1.36 0.92 0.33
Attitudes towards EBPs
 A—strongly facilitative 8 3.08 0.41 0.14
 B—leader driven 5 2.67 0.46 0.21 5.03 0.01
 C—resource driven 13 2.61 0.45 0.13
 D—resource deprived 8 2.2 0.5 0.18

The results for demoralizing climate partially met expectations since we found significant differences in scores across the four agencies (F = 3.91, P = 0.02). While the strongly facilitative agency had the lowest demoralizing climate score, post hoc analysis indicated that the primary difference was between the strongly facilitative and leader driven agencies (P = 0.024).

Quantitative analysis of service provider attitudes toward EBPs also supported the patterns found in the qualitative results. We found significant differences between the four agencies (F = 5.03, P = 0.01). Post hoc analysis using the Bonferroni correction indicated that service providers at the strongly facilitative agency had more favorable attitudes toward EBPs than providers from the resource deprived agency (P = 0.003).

5. Discussion

According to the Consolidated Framework for Implementation Research, the three major components that predict an organization's readiness for implementation are leadership engagement, availability of resources, and access to information and knowledge. We used multiple methods to examine how these three factors shaped organizational climate and prepared providers for the implementation of new practices in publicly funded agencies in NM. Our mixed-method approach used quantitative data to examine and validate our qualitatively derived organizational typology. In keeping with previous studies [14, 30, 31] and our qualitative findings, the survey results suggested that the presence of strong leadership and adequate financial resources affect provider attitudes toward EBPs and facilitate implementation of these innovations.

There were consistent patterns in qualitative data linking use of EBPs with leadership and organizational climate across the 14 participating agencies. In agencies that had implemented multiple EBPs by Time 2, providers frequently described a sense of camaraderie at work which they generally attributed to the presence of supportive leaders.

The higher transformational leadership scores at the strongly facilitative and the leader driven agencies corresponded with qualitative descriptions of supportive leaders who prioritized individual supervision, were available when questions arose, and made efforts to praise and reward employees for their work. These descriptions are consistent with the construct of transformational leadership which includes close supervisory relationships and is exemplified by leaders who make efforts to meet the needs of supervisees through individualized feedback, support, and motivation [32, 33]. Transformational leadership has previously been associated with positive attitudes toward EBPs [15] and organizational support for EBP [31]; our study emphasizes its particular importance during active EBP implementation.

Our findings regarding the association between transformational leadership and demoralizing climate suggest that this relationship is indirect. In our study, the leader driven agency provided an example of strong leadership amidst a demoralizing work climate. The qualitative data from this site indicated that the consistently low ratings of organizational climate were in response to the pervasive poverty of the region rather than reflecting agency leadership. And, although we had hypothesized that the resource deprived agency would have the most demoralizing climate, it appeared that lacking either leadership or financial resources was similarly detrimental to organizational climate. In chronically underfunded settings with considerable unmet need, an overemphasis on leadership may divert attention from the enduring problem of scarce resources. Interestingly, other studies of EBP implementation in substance use treatment settings have noted that the lack of financial resources is a major contributor to stress among front line staff yet can go unrecognized as a barrier by agency leadership [34].

Our third hypothesis was that the readiness of an organization to implement new EBPs would reflect a combination of agency leadership, access to resources, and training opportunities as exemplified by the strongly facilitative agency. Indeed, this agency implemented the most EBPs and demonstrated the highest scores on the EBPAS scale by Time 2. The strongly facilitative agency also experienced the least provider turnover during the study period which corresponded with the positive organizational climate and also facilitated the uptake of new EBPs.

Through the qualitative data collected at all sites, we observed a bidirectional relationship between organizational climate and provider turnover. Poor organizational climates tended to precipitate turnover. Recent turnover and a lack of a consistent workforce likewise contributed to a stressful work climate. The agencies in this study experienced substantial changes in staffing over the course of 18 months. This high level of turnover is not uncommon in community agencies [26, 35, 36]. Further analysis confirmed that turnover was precipitated by negative organizational climate but was somewhat ameliorated by strong leadership [4]. When implementing new EBPs, ongoing staff turnover may be especially problematic if agencies rely primarily upon a “resource driven” approach requiring substantial investments in introductory training sessions. While such trainings can facilitate initial implementation, they are unlikely to encourage sustainment of new practices in demoralizing climates with high turnover.

The experience at the leader driven agency suggests that a supportive supervisor can spearhead implementation initiatives. However, the agency's limited capacity meant that only a single EBP was initiated, whereas agencies with higher levels of external funding were able to implement multiple EBPs during the eighteen-month timeframe. Additionally, the agency's inability to send providers to training sessions raises concerns about whether EBPs can be delivered with fidelity in settings under such financial strain. This experience mirrors previous research findings that agency prioritization and trainings are initial facilitators of EBP implementation, but that adequate funding is essential for long term sustainment [30, 37]. Additionally, other work supports the premise that efforts to implement EBPs without adequate resources are associated with increased likelihood of adaptations and ultimately decreased fidelity [38]. When planning implementation strategies in underresourced rural areas in particular, capacity building is paramount. Initial costs can be considerable when carrying out new practices [39], and periods of reduced revenue are common during startup phases [40, 41].

Other considerations when implementing EBPs include the relationships between agencies and researchers. Community-based participatory research is recognized as a facilitator for the adoption of science-based interventions in community settings [42, 43]. The experience at the strongly facilitative agency supports this model. Proactive leadership promoted collaboration with researchers which led to grant funding and strengthened implementation efforts. Additionally, we observed that clinicians with personal experience of research were more positively disposed toward EBPs, a finding replicated in other settings [44]. However, principles of community-based participatory research emphasize long-term relationships between community members and academics [45]. While such a partnership had met with success in the resource deprived agency, this relationship was not sustained when organizational leadership changed. Subsequently, when providers were reinterviewed at Time 2, they expressed distrust about the research process. This distrust appeared to correspond with their diminished use of EBPs. Leadership turnover has also been identified as a critical factor in sustainment in an 8-year longitudinal study of EBP implementation [46].

States have an influential role in the implementation of EBPs in public settings [47, 48], and ongoing state funding and state sponsored training are associated with long term sustainment [49]. The relatively rapid increase in EBP provision in agency settings in NM reflects policy changes and investment in initial training made at the state level. However, this process is incomplete and we do not know the degree to which EBPs were implemented with fidelity. Integrated treatment can be difficult to implement with high fidelity [50]. In this study, we did not assess fidelity outcomes but ethnographic findings published elsewhere suggest that fidelity and sustainment will be long-term concerns across the state [51].

Strengths of our study include its longitudinal design and focus on community agencies. However, the observational design and the small sample of agencies and providers limit the generalizability of our findings. We focused on publicly funded agencies serving adults with serious mental illness and cooccurring substance use disorders; we did not assess the experiences of independent practitioners, primary care providers who deliver limited behavioral health services, or providers specializing in treating youth. Finally, this research may not fully generalize to other states.

6. Conclusions

In this study, agency leadership and the availability of external financial resources influenced EBP uptake. These factors facilitated provider education, a necessary step in changing clinical practice. While funding facilitated initial training sessions, consistent leadership nurtured supportive work climates, minimized provider turnover, and encouraged sustainment. Clinical managers encouraged new practices through regular supervision sessions. Yet strong agency leadership did not always compensate for a lack of resources and existing infrastructure. Future reform should consider these disparities when allocating fiscal resources to vulnerable communities.

Acknowledgments

Funding was provided by the National Institute of Mental Health (R01MH072961; R01MH076084), the Substance Abuse and Mental Health Services Administration, and the Robert Wood Johnson Foundation Clinical Scholars Program.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

  • 1.Kessler RC, Wai TC, Demler O, Walters EE. Prevalence, severity, and comorbidity of 12-month DSM-IV disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry. 2005;62(6):617–627. doi: 10.1001/archpsyc.62.6.617. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Substance Abuse and Mental Health Services Administration. Results from the 2009 National Survey on Drug Use and Health: Mental Health Findings. Rockville, Md, USA: Office of Applied Studies (OAS); 2010. (NSDUH Series H-39, HHS No. SMA 10-4609). http://oas.samhsa.gov/nsduh/2k9nsduh/mh/2k9mhresults.pdf. [Google Scholar]
  • 3.Drake RE, Mercer-McFadden C, Mueser KT, McHugo GJ, Bond GR. Review of integrated mental health and substance abuse treatment for patients with dual disorders. Schizophrenia Bulletin. 1998;24(4):589–608. doi: 10.1093/oxfordjournals.schbul.a033351. [DOI] [PubMed] [Google Scholar]
  • 4.Aarons GA, Sommerfeld DH, Willging CE. The soft underbelly of system change: the role of leadership and organizational climate in turnover during statewide behavioral health reform. Psychological Services. 2011;8(4):269–281. doi: 10.1037/a002619. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Drake RE, Essock SM, Shaner A, et al. Implementing dual diagnosis services for clients with severe mental illness. Psychiatric Services. 2001;52(4):469–476. doi: 10.1176/appi.ps.52.4.469. [DOI] [PubMed] [Google Scholar]
  • 6.Substance Abuse and Mental Health Services Administration. National Registry of Evidence-Based Practices. 2013. http://www.nrepp.samhsa.gov/ [Google Scholar]
  • 7.Harned MS, Chapman AL, Dexter-Mazza ET, Murray A, Comtois KA, Linehan MM. Treating co-occurring axis I disorders in recurrently suicidal women with borderline personality disorder: a 2-year randomized trial of dialectical behavior therapy versus community treatment by experts. Journal of Consulting and Clinical Psychology. 2008;76(6):1068–1075. doi: 10.1037/a0014044. [DOI] [PubMed] [Google Scholar]
  • 8.Hien DA, Cohen LR, Miele GM, Litt LC, Capstick C. Promising treatments for women with comorbid PTSD and substance use disorders. American Journal of Psychiatry. 2004;161(8):1426–1432. doi: 10.1176/appi.ajp.161.8.1426. [DOI] [PubMed] [Google Scholar]
  • 9.Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. American Journal of Public Health. 2012;102(7):1274–1281. doi: 10.2105/AJPH.2012.300755. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Wisdom J, Chor K, Hoagwood K, Horwitz S. Innovation adoption: a review of theories and constructs. Administration and Policy in Mental Health and Mental Health Services Research. 2013:1–23. doi: 10.1007/s10488-013-0486-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Torrey WC, Tepper M, Greenwold J. Implementing integrated services for adults with co-occurring substance use disorders and psychiatric illnesses: a research review. Journal of Dual Diagnosis. 2011;7(3):150–161. [Google Scholar]
  • 12.Sanders MR, Prinz RJ, Shapiro CJ. Predicting utilization of evidence-based parenting interventions with organizational, service-provider and client variables. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36(2):133–143. doi: 10.1007/s10488-009-0205-3. [DOI] [PubMed] [Google Scholar]
  • 13.Fuller BE, Rieckmann T, Nunes EV, et al. Organizational readiness for change and opinions toward treatment innovations. Journal of Substance Abuse Treatment. 2007;33(2):183–192. doi: 10.1016/j.jsat.2006.12.026. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychological Services. 2006;3(1):61–72. doi: 10.1037/1541-1559.3.1.61. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiatric Services. 2006;57(8):1162–1169. doi: 10.1176/appi.ps.57.8.1162. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Fearing G, Barwick M, Kimber M. Clinical transformation: manager's perspectives on implementation of evidence-based practice. Administration and Policy in Mental Health and Mental Health Services Research. 2013:1–14. doi: 10.1007/s10488-013-0481-9. [DOI] [PubMed] [Google Scholar]
  • 17.Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34(5):479–488. doi: 10.1007/s10488-007-0129-8. [DOI] [PubMed] [Google Scholar]
  • 18.Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4, article 50 doi: 10.1186/1748-5908-4-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Johnson JC. Selecting Ethnographic Informants. Newbury Park, Calif, USA: Sage; 1990. [Google Scholar]
  • 20.Maxwell JA. Designing a qualitative study. In: Bickman L, Rog DJ, editors. Handbook of Applied Social Research Methods. Thousand Oaks, Calif, USA: Sage; 1998. pp. 69–100. [Google Scholar]
  • 21.Bazeley P. Qualitative Data Analysis with NVivo. Los Angeles, Calif, USA: SAGE; 2007. [Google Scholar]
  • 22.Emerson RM, Fretz RI, Shaw LL. Writing Ethnographic Fieldnotes. Chicago, Ill, USA: University of Chicago Press; 1995. [Google Scholar]
  • 23.Sandelowski M, Barroso J. Writing the proposal for a qualitative research methodology project. Qualitative Health Research. 2003;13(6):781–820. doi: 10.1177/1049732303013006003. [DOI] [PubMed] [Google Scholar]
  • 24.Stake RE. Case studies. In: Denzin NK, Lincoln YS, editors. Handbook of Qualitative Research. Thousand Oaks, Calif, USA: Sage; 1994. pp. 237–239. [Google Scholar]
  • 25.Bass B, Avolio B. Technical Report. New York, NY, USA: Center for Leadership Studies, Binghamton University; 1995. MLQ: multifactorial leadership questionnaire. [Google Scholar]
  • 26.Glisson C, Dukes D, Green P. The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse and Neglect. 2006;30(8):855–880. doi: 10.1016/j.chiabu.2005.12.010. [DOI] [PubMed] [Google Scholar]
  • 27.Aarons GA, Sawitzky AC. Organizational climate partially mediates the effect of culture on work attitudes and staff turnover in mental health services. Administration and Policy in Mental Health and Mental Health Services Research. 2006;33(3):289–301. doi: 10.1007/s10488-006-0039-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the evidence-based practice attitude scale (EBPAS) Mental Health Services Research. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Aarons GA, McDonald EJ, Sheehan AK, Walrath-Greene CM. Confirmatory factor analysis of the evidence-based practice attitude scale (EBPAS) in a geographically diverse sample of community mental health providers. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34(5):465–469. doi: 10.1007/s10488-007-0127-x. [DOI] [PubMed] [Google Scholar]
  • 30.Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, Williams J. Long term sustainability of evidence based practices in community mental health agencies. Administration and Policy in Mental Health and Mental Health Services Research. 2012:1–9. doi: 10.1007/s10488-012-0461-5. [DOI] [PubMed] [Google Scholar]
  • 31.Swain K, Whitley R, McHugo GJ, Drake RE. The sustainability of evidence-based practices in routine mental health agencies. Community Mental Health Journal. 2010;46(2):119–129. doi: 10.1007/s10597-009-9202-y. [DOI] [PubMed] [Google Scholar]
  • 32.Corrigan PW, Garman AN. Transformational and transactional leadership skills for mental health teams. Community Mental Health Journal. 1999;35(4):301–312. doi: 10.1023/a:1018757706316. [DOI] [PubMed] [Google Scholar]
  • 33.Howell JM, Hall-Merenda KE. The ties that bind: the impact of leader-member exchange, transformational and transactional leadership, and distance on predicting follower performance. Journal of Applied Psychology. 1999;84(5):680–694. [Google Scholar]
  • 34.Lundgren L, Chassler D, Amodeo M, D’Ippolito M, Sullivan L. Barriers to implementation of evidence-based addiction treatment: a national study. Journal of Substance Abuse Treatment. 2012;42(3):231–238. doi: 10.1016/j.jsat.2011.08.003. [DOI] [PubMed] [Google Scholar]
  • 35.Aarons GA, Wells RS, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: a multiple stakeholder analysis. American Journal of Public Health. 2009;99(11):2087–2095. doi: 10.2105/AJPH.2009.161711. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Gallon SL, Gabriel RM, Knudsen JRW. The toughest job you’ll ever love: a Pacific Northwest Treatment Workforce Survey. Journal of Substance Abuse Treatment. 2003;24(3):183–196. doi: 10.1016/s0740-5472(03)00032-1. [DOI] [PubMed] [Google Scholar]
  • 37.Torrey WC, Bond GR, McHugo GJ, Swain K. Evidence-based practice implementation in community mental health settings: the relative importance of key domains of implementation activity. Administration and Policy in Mental Health and Mental Health Services Research. 2012;39(5):353–364. doi: 10.1007/s10488-011-0357-9. [DOI] [PubMed] [Google Scholar]
  • 38.Lundgren L, Amodeo M, Chassler D, Krull I, Sullivan L. Organizational readiness for change in community-based addiction treatment programs and adherence in implementing evidence-based practices: a national study. Journal of Substance Abuse Treatment. 2013;45(5):457–465. doi: 10.1016/j.jsat.2013.06.007. [DOI] [PubMed] [Google Scholar]
  • 39.Essock SM, Goldman HH, Van Tosh L, et al. Evidence-based practices: setting the context and responding to concerns. Psychiatric Clinics of North America. 2003;26(4):919–938. doi: 10.1016/s0193-953x(03)00069-8. [DOI] [PubMed] [Google Scholar]
  • 40.Goldman HH, Ganju V, Drake RE, et al. Policy implications for implementing evidence-based practices. Psychiatric Services. 2001;52(12):1591–1597. doi: 10.1176/appi.ps.52.12.1591. [DOI] [PubMed] [Google Scholar]
  • 41.Herschell AD, Kogan JN, Celedonia KL, Gavin JG, Stein BD. Understanding community mental health administrators’ perspectives on dialectical behavior therapy implementation. Psychiatric Services. 2009;60(7):989–992. doi: 10.1176/appi.ps.60.7.989. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Chambers DA, Azrin ST. Research and services partnerships: partnership: a fundamental component of dissemination and implementation research. Psychiatric Services. 2013;64(6):509–511. doi: 10.1176/appi.ps.201300032. [DOI] [PubMed] [Google Scholar]
  • 43.Lindamer LA, Lebowitz B, Hough RL, et al. Establishing an implementation network: lessons learned from community-based participatory research. Implementation Science. 2009;4(1, article 17) doi: 10.1186/1748-5908-4-17. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Palinkas LA, Schoenwald SK, Hoagwood K, Landsverk J, Chorpita BF, Weisz JR. An ethnographic study of implementation of evidence-based treatments in child mental health: first steps. Psychiatric Services. 2008;59(7):738–746. doi: 10.1176/ps.2008.59.7.738. [DOI] [PubMed] [Google Scholar]
  • 45.Israel BA, Schulz AJ, Parker EA, Becker AB, Allen A, Guzman JR. Critical issues in developing and following community-based participatory research principles. In: Minkler M, Wallerstein N, editors. Community-Based Participatory Research for Health. San Francisco, Calif, USA: Jossey-Bass; 2003. pp. 56–73. [Google Scholar]
  • 46.Peterson A, Bond G, Drake R, McHugo G, Jones A, Williams J. The Journal of Behavioral Health Services & Research. 2013. Predicting the long-term sustainability of evidence-based practices in mental health care: an 8-year longitudinal analysis; pp. 1–10. [DOI] [PubMed] [Google Scholar]
  • 47.Isett KR, Burnam MA, Coleman-Beattie B, et al. The role of state mental health authorities in managing change for the implementation of evidence-based practices. Community Mental Health Journal. 2008;44(3):195–211. doi: 10.1007/s10597-007-9107-6. [DOI] [PubMed] [Google Scholar]
  • 48.Isett KR, Burnam MA, Coleman-Beattie B, et al. The state policy context of implementation issues for evidence-based practices in mental health. Psychiatric Services. 2007;58(7):914–921. doi: 10.1176/ps.2007.58.7.914. [DOI] [PubMed] [Google Scholar]
  • 49.Jones A, Bond G, Peterson A, Drake R, McHugo G, Williams J. Role of state mental health leaders in supporting evidence-based practices over time. The Journal of Behavioral Health Services & Research. 2013:1–9. doi: 10.1007/s11414-013-9358-7. [DOI] [PubMed] [Google Scholar]
  • 50.McHugo GJ, Drake RE, Whitley R, et al. Fidelity outcomes in the national implementing evidence-based practices project. Psychiatric Services. 2007;58(10):1279–1284. doi: 10.1176/ps.2007.58.10.1279. [DOI] [PubMed] [Google Scholar]
  • 51.Willging CE, Waitzkin H, Lamphere L. Transforming administrative and clinical practice in a public behavioral health system: an ethnographic assessment of the context of change. Journal of Health Care for the Poor and Underserved. 2009;20(3):866–883. doi: 10.1353/hpu.0.0177. [DOI] [PMC free article] [PubMed] [Google Scholar]

Articles from Psychiatry Journal are provided here courtesy of Wiley
