Author manuscript; available in PMC: 2020 Sep 1.
Published in final edited form as: Adm Policy Ment Health. 2019 Sep;46(5):678–687. doi: 10.1007/s10488-019-00946-x

A Mixed Methods Study of Organizational Readiness for Change and Leadership during a Training Initiative within Community Mental Health Clinics

Victoria Stanhope 1, Abigail Ross 2, Mimi Choy-Brown 3, Lauren Jessell 4
PMCID: PMC6689447  NIHMSID: NIHMS1532302  PMID: 31218480

Abstract

This longitudinal mixed-methods study explored variation in organizational readiness for change and leadership behavior across seven organizations during a 12-month training initiative in person-centered care planning. Quantitative data were used to examine trajectories of organizational readiness for change and leadership behavior over time, and qualitative data explored provider perspectives on the trajectory of these organizational factors during the 12-month training initiative. Findings indicated that levels of organizational readiness for change and leadership behavior varied across clinics, but most clinics experienced a significant positive change at the mid-point of the training. Organizational readiness for change was positively correlated with leadership behaviors across time. Provider focus group findings gave insight into providers’ initial resistance to adopting the new practice and their increasing receptivity in the second six months, driven by greater understanding of the practice and leadership endorsement. Increasing provider openness to a new practice prior to training and maintaining consistently engaged leadership have the potential to improve the efficiency of a training initiative.

Keywords: organizational factors, leadership, workforce training, organizational readiness for change, mental health services

Background

One of the most commonly used implementation strategies to adopt a new practice is workforce training. It is estimated that organizations in the United States spend up to $126 billion per year on training with the goal of promoting practice innovations and improving service quality (Paradise, 2007). Training is defined as “a systematic approach to learning and development to improve individual, team and organizational effectiveness” (Aguinis & Kraiger, 2008, p. 452). Within mental health services, training has been identified as a key implementation strategy in the translation of evidence-based practices and can be delivered in a variety of types and formats (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Powell et al., 2015). With such a reliance on training to drive innovation and the scarcity of funds in public mental health agencies, it is critical that the impact of training is maximized (Beidas, Edmunds, Marcus, & Kendall, 2012; Nadeem et al., 2013). This requires building an evidence base on the optimal conditions, timing, and dosage for training to sustainably change practice behaviors.

Training

Often training is a multifaceted implementation strategy composed of several discrete activities, which can include: development and distribution of educational materials; training providers and leadership in vivo or by webinar; train-the-trainer models; external and internal technical assistance; consultation; continuous training; and facilitation (Powell et al., 2015). Overall, the critical ingredients for an effective training are to provide information designed to change knowledge, attitudes, and skills; supply opportunities for in-vivo practice; and offer performance feedback (Salas, Tannenbaum, Kraiger, & Smith-Jentsch, 2012). Research has shown that “one-shot” trainings comprised of a single educational workshop for providers do not bring about sustained behavior change (Beidas & Kendall, 2010; Nadeem et al., 2013). In a systematic review of 49 studies of educational meetings and workshops, the authors concluded that these training strategies alone are not effective in changing complex professional behaviors (Forsetlund et al., 2009). To sustain knowledge and skills gained from an educational meeting or workshop, additional strategies such as technical assistance and facilitation are needed to give providers the opportunity to rehearse and receive constructive feedback (Herschell, Kolko, Baumann, & Davis, 2010; Lyon, Stirman, Kerns, & Bruns, 2011). Through ongoing expert consultation, providers can work through the issues they encounter as they implement the new practice. Consultation techniques include clarification of the practice, case consultation, and problem solving (Nadeem et al., 2013). Consultants are distinguished from supervisors or coaches, as they are not responsible for the delivery of care in that setting; therefore, trainees may have more freedom to accept or reject their feedback.
A randomized clinical trial of training plus consultation for providers of children’s mental health services found that every additional hour of consultation led to a corresponding increase in adherence and skill (Beidas et al., 2012). Overall, the addition of consultation calls to in-person training has been shown to improve uptake of practice (Herschell, McNeil, & McNeil, 2004; Miller, Yahne, Moyers, Martinez, & Pirritano, 2004).

Leadership

Research suggests that leadership behavior impacts organizational readiness for change (Aarons & Sommerfield, 2012; Battilana, Gilmartin, Sengul, Pache, & Alexander, 2010) and that leaders play a critical role in facilitating the adoption of evidence-based practices (EBPs) (Guerrero, Padwa, Fenwick, Harris, & Aarons, 2016). How leaders communicate innovation efforts to their staff and how they signal their support and investment in the proposed practice innovation can influence provider motivation and a sense of efficacy. Overall, leaders set the tone for an implementation by clearly signaling that they are paying attention to the change activities occurring in their agency and are taking the time to engage and support these activities. Qualitative research has revealed the importance of certain leadership behaviors, such as being hands-on and demonstrating personal investment in EBP activities, in sustaining provider adherence to EBPs over time (Stetler, Ritchie, Rycroft-Malone, & Charns, 2015).

Defined as “primary embedding mechanisms” in organizational theory, leadership behaviors cue providers to the relative importance of implementation within an organization (Schein, 2010). These behaviors include demonstrating their own knowledge of the practice, role modeling, allocating the necessary resources for implementation, and problem solving to overcome barriers. Aarons and colleagues (2014) built on organizational theory to identify four dimensions of implementation leadership: proactivity (anticipating and addressing implementation challenges), perseverance (commitment to EBP implementation regardless of magnitude of challenges), knowledge (understanding EBP and implementation issues), and supportiveness (capacity to support use of EBPs by clinicians who are tasked with delivering them) (Aarons, Ehrhart, & Farahnak, 2014; Guerrero et al., 2016). These behaviors help to establish provider “buy-in” to the proposed change, which (while not sufficient for implementation success) is necessary to ensure change motivation and efficacy.

Organizational Readiness for Change

Providers’ perceptions of their motivation and capability to implement a new practice together comprise the concept of organizational readiness for change (Weiner, 2009). Weiner (2009) defines these concepts as a shared “change commitment” and “change efficacy” among providers. Organizational readiness for change has been associated with key elements necessary for a successful training initiative, including organizational stability, patient engagement in treatment (Lehman et al., 2002), organizational commitment (Ingersoll et al., 2000), and motivation (Fuller, et al., 2007; Saldana et al., 2007).

Change commitment, often characterized as motivation, refers to the extent to which providers value the proposed change and how much effort they are prepared to invest in pursuing the steps to implement that change. Many factors have been found to influence levels of motivation, including how in control providers feel in relation to decision-making, their conscientiousness, age, level of anxiety, and job involvement (Colquitt, LePine, & Noe, 2000). Communications about the importance of the training effort to the agency, such as framing it as an opportunity rather than as a burden, have been shown to enhance provider motivation (Salas et al., 2012). An ongoing factor in community mental health settings, driven by high workloads and low salaries, is emotional exhaustion, which has been associated with negative attitudes towards EBPs (Barnett et al., 2017). In this regard, lack of receptivity to training may be less about its content and more about the lack of time to attend trainings and participate in consultation.

Change efficacy refers to how capable providers collectively feel to implement a new practice. Building upon Bandura’s concept of collective efficacy, this sense of shared confidence often predicts greater engagement in a change effort (Weiner, 2009). However, a sense of self-efficacy is only a positive influence when the perception is accurate. One study found that when therapist confidence was high, more training or consultation was felt to be unnecessary (Miller & Mount, 2001). This may be particularly true for practices that are less well defined and perceived as socially desirable, which is the case for delivering person-centered care. A major concern for providers being trained in person-centered care is their belief that they are “already doing it,” while objective measures of competency have not supported this perception (Matthews, Stanhope, Choy-Brown & Doherty, 2018). Therefore, an understanding of the new practice and a willingness to engage in building capacity to implement this practice must precede an authentic or accurate sense of self-efficacy.

In this study, community mental health care providers participated in a training initiative to implement person-centered care planning (PCCP), an emerging recovery-oriented, evidence-based practice (Adams & Grieder, 2014; Tondora, Miller, Slade & Davidson, 2014). In a systematic review of 19 studies examining the effectiveness of person-centered planning interventions in medical and mental health settings, Coulter and colleagues (2015) found modest effects on physical health outcomes but that interventions did decrease depression and increase consumer confidence and ability to manage health. In a randomized controlled trial across multiple states, PCCP was found to increase service engagement and adherence to medication (Stanhope, Ingoglia, Schmelter, & Marcus, 2013).

Investment in a training initiative is a considerable burden for low-resourced community mental health agencies, both in purchasing the necessary resources and in the time spent building provider capacity. As the research on training has demonstrated, much of this investment has resulted in negligible gains in changing professional behaviors, particularly when these behaviors are multi-faceted (Forsetlund et al., 2009). Therefore, increasing our understanding of the conditions that make trainings efficient can help agencies get the most return when they invest their limited resources. This mixed methods study had two aims: 1) utilize quantitative data to examine the trajectory of organizational readiness for change and leadership behaviors over a 12-month training initiative and 2) utilize qualitative data to provide insight into the trajectory of organizational readiness for change and leadership behaviors from the provider perspective.

Methods

Study Design

This longitudinal mixed-methods study was situated within a larger hybrid study of PCCP within community mental health clinics in which 14 sites in two states were randomly assigned to PCCP or treatment-as-usual conditions (Stanhope, Tondora, Davidson, Choy-Brown, & Marcus, 2015). This study focused only on the experimental sites (N=7) using a sequential mixed-methods design with two distinct phases and multiple perspectives (Creswell & Clark, 2007). The first phase involved quantitative data collection using consultant perspectives and provider behavior to assess leadership and organizational readiness for change over time. The second qualitative phase explored provider perspectives on leadership and organizational readiness for change to further interpret the quantitative findings. Informed consent was obtained from all individual participants in the study. The study was approved by a university Institutional Review Board.

Study Setting

The study was set within seven community mental health clinics randomized to the PCCP condition. These clinics were from two states, serving overall approximately 8,000 service users and providing a range of services including outpatient therapy, crisis intervention, medication management, case management, residential programs, community support programs, and rehabilitation services. Within these clinics, leadership, supervisors, and direct care staff participated in the study. These study participants came from various disciplines including social work, psychology and counseling. Leadership was defined as those in executive leadership positions such as the executive director, medical director, operations director, quality improvement director, and clinical services director. Supervisors were staff who supervised direct care staff and direct care staff were clinicians who provided services to consumers but did not have supervisory duties.

Intervention

Person-centered care planning is a recovery-oriented intervention designed to meet policy mandates to individualize treatment and empower consumers to make decisions about their own care (Substance Abuse and Mental Health Service Administration, 2012). Person-centered care planning provides a framework for the collaborative co-creation of a recovery-oriented treatment plan that is driven by an individual’s most valued life goals (Tondora et al., 2014). Providers learn how to elicit and empathize with their client’s subjective experiences as a whole person and how to help consumers identify and articulate their interests, preferences, and personal recovery goals. Symptoms and impairments are reframed as barriers to goal attainment, and providers identify short-term, realistic, and measurable objectives, while keeping objectives explicitly connected to longer term aspirations and expanding the action network to include natural supporters as well as professional providers (Tondora et al., 2014).

Training Initiative

The training initiative for PCCP consisted of discrete strategies based on the ERIC model (Powell et al., 2015). Each of the seven sites randomized to the PCCP condition received a 2-day dynamic in-person training with distribution of educational materials, followed by monthly TA calls (external facilitation) with PCCP consultants over a 12-month period. The consultants were the primary developers of the PCCP intervention and had extensive experience training providers across the United States. Supervisors were the primary target of the PCCP training with the expectation that they would train their direct care teams in PCCP. In addition to supervisors, two direct care staff from each team participated in the initial training session. At the in-person training, participants were educated on mental health recovery and the principles of person-centered care planning. They then had behavioral rehearsal exercises in which they developed person-centered service plans in teams with feedback from the trainers. Each training participant also received a PCCP provider manual and other training materials to reinforce learning. The first call was for supervisors only to discuss their progress in training their teams and to troubleshoot potential barriers in the implementation process. For the second monthly call, one supervisor and their team developed a service plan and submitted it ahead of time to the consultants for consultant and peer feedback. This team was composed of all direct care providers who worked in the supervisor’s program and reported directly to that supervisor. All clinics received the full dose of the training with one in-person training and 24 technical assistance calls.

Quantitative Methods

Data Sources

The two sources of data for the quantitative phase of the study were consultant ratings and leadership attendance on TA calls.

Once every month, the consultants rated the agencies based on their two TA calls conducted with each clinic. As consultants evaluated multiple sites simultaneously, the ratings were kept brief for feasibility. Implementation leadership was measured by four items with a ten-point Likert scale (1=low; 10=high) corresponding to the four domain indicators of implementation leadership: the degree to which executive leadership was knowledgeable about PCCP, the degree to which executive leadership was proactively engaged in the implementation process, the degree to which executive leadership persevered in the face of challenges to the PCCP implementation process, and the degree to which executive leadership was supportive of employee efforts to implement PCCP (Aarons et al., 2014). Overall Cronbach’s alpha for the implementation leadership items was .94. Organizational readiness for change was measured by a single-item ten-point Likert scale (1=low; 10=high) in which consultants made a global assessment of a clinic’s readiness to implement the PCCP intervention.
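For readers wanting to reproduce the reliability statistic reported above, Cronbach’s alpha for a small set of Likert items can be computed as sketched below. This is a minimal illustration using invented ratings, not the study’s data; `cronbach_alpha` is a hypothetical helper name.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_observations, n_items) matrix of ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented monthly consultant ratings: rows are observations, columns are the
# four implementation-leadership items (knowledge, proactivity, perseverance,
# supportiveness), each on a 1-10 scale.
ratings = np.array([
    [6, 7, 6, 7],
    [5, 5, 6, 5],
    [8, 8, 7, 8],
    [4, 5, 4, 5],
    [7, 6, 7, 6],
])
print(round(cronbach_alpha(ratings), 2))  # → 0.94
```

With items this strongly intercorrelated, alpha lands in the same high range (.94) as the value reported for the implementation-leadership items.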

Attendance was recorded for each call by study staff. Leadership attendance on each of the 24 TA calls was recorded as a “yes” if any leadership were present on the call and “no” if no leadership were present on the call.

Analyses

Implementation leadership, leadership attendance, and organizational readiness for change means were calculated for each month over 12 months. Missing data were handled using multiple imputation (Rubin, 1987). Pearson’s r correlations were used to examine the relationships between the two leadership variables and organizational readiness for change over the 12 time points. Means and standard deviations of implementation leadership and organizational readiness for change were calculated for the first (months 1-6) and second (months 7-12) halves of the TA period. For leadership attendance, the proportion of calls attended by leadership in each half of the TA period was calculated by dividing the number of calls attended by leadership by the total number of calls occurring during that half (12 per half). Paired sample t-tests were used to examine changes in organizational readiness for change and leadership behavior indicators across the two phases of the TA period. McNemar’s chi square test was used to compare within-group leadership attendance across the first and second halves of the TA period.
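The analyses described above can be sketched as follows. All numbers are invented for illustration (this is not the study’s dataset), and `mcnemar_exact` is a hypothetical helper implementing the exact binomial form of McNemar’s test rather than the chi-square approximation.

```python
import numpy as np
from scipy import stats

# Invented per-clinic mean readiness ratings for months 1-6 vs. months 7-12
# (seven clinics, so a paired-sample t-test has df = 6).
first_half = np.array([4.1, 3.8, 5.0, 4.6, 4.2, 5.3, 4.5])
second_half = np.array([5.9, 5.2, 6.4, 6.1, 5.6, 6.8, 5.8])
t, p = stats.ttest_rel(second_half, first_half)

# Invented monthly means over the 12 time points for readiness and leadership,
# correlated with Pearson's r.
readiness = np.array([4.0, 4.2, 4.4, 4.5, 4.7, 4.9, 5.6, 5.8, 6.0, 6.1, 6.2, 6.4])
leadership = np.array([5.2, 5.4, 5.5, 5.7, 5.8, 6.0, 6.9, 7.1, 7.2, 7.3, 7.4, 7.5])
r, p_r = stats.pearsonr(readiness, leadership)

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar p-value for paired yes/no data,
    where b = yes->no discordant pairs and c = no->yes discordant pairs."""
    return stats.binomtest(min(b, c), b + c, 0.5).pvalue

# e.g., 1 call changed from leadership-present to absent, 10 from absent to present
print(t > 0, p < 0.05, r > 0.9, mcnemar_exact(1, 10) < 0.05)
```

The key point the sketch illustrates is that each test operates on paired observations from the same clinics (or the same calls) across the two halves of the TA period.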

Qualitative Methods

Sample

The sample for the qualitative phase of the study comprised data from 15 focus groups conducted within the community mental health clinics randomized to the PCCP condition (N=7). One supervisor focus group and one direct care focus group were conducted at each clinic, except for one clinic in which two direct care focus groups were conducted due to the large number of direct care staff participating. A total of 104 providers participated in the qualitative phase of the study. Forty clinical supervisors participated across the seven supervisor focus groups and 64 direct care staff participated in the eight direct care focus groups. Focus groups consisted of three to twelve participants each. Of the 104 providers in the sample, the majority were female (N=82, 78.8%) and white (N=59, 56.7%), and three (2.9%) identified as Hispanic. The average age of providers was 44.37 (SD=11.43). Forty (38.5%) had a Bachelor’s degree and 56 (53.8%) had a Master’s degree. Years working at the agency ranged from less than one year to 28 years (M=7.59, SD=6.45).

Procedures

At the conclusion of the 12-month training initiative for each site, focus groups were conducted to explore barriers and facilitators to implementation. Focus groups were conducted using a semi-structured interview guide that was informed by previous literature on providers’ concerns during PCCP implementation (Tondora, Miller, & Davidson, 2012). Questions from the interview guide included: Who has been involved in implementing PCCP within this organization? Following the PCCP training, how prepared did you feel to help staff implement this practice? Overall topical domains addressed during the focus groups included: perspectives on PCCP, experiences with implementing PCCP and training staff, and perceptions of leadership. Each of the focus groups was conducted by two masters-level interviewers with experience working in community mental health clinics and with training in the PCCP intervention. Focus groups lasted approximately 60 minutes. All participants received $20 compensation for their time.

Qualitative Analyses

Focus groups were digitally recorded and transcribed verbatim with names and identifying information removed. Transcripts were then entered into Atlas-TI for data management and analysis. Thematic analysis was conducted by three independent researchers to generate categorical codes for similar ideas and patterns throughout the data and to create an initial codebook (Boyatzis, 1998). Weiner’s (2009) theory of organizational readiness and implementation leadership were employed as sensitizing concepts, with codes reflecting change efficacy, change motivation, and leadership behavior (Aarons et al., 2014; Weiner, 2009). Two of the researchers first worked iteratively, reviewing and coding all of the transcripts to develop and refine the codebook. Subcategories and higher level themes emerged through this process. The third researcher assisted in resolving any inconsistencies in coding through a process of consensus coding in collaboration with the other two researchers. To ensure trustworthiness throughout the coding process, the researchers engaged in rigor strategies, including weekly team debriefings and the use of an audit trail (Padgett, 2008).

Mixed Methods Integration

Quantitative and qualitative data were integrated during the analytic and interpretation phases using an elaborative design (Palinkas et al., 2011). After the initial inductive thematic analysis, qualitative data ‘chunks’ related to the quantitative findings were extracted to gain further depth of understanding as to how providers perceived organizational readiness for change. By mixing these two sources of data, this study triangulates the multiple perspectives of consultants, supervisors, and providers with observed leadership behaviors during the training initiative. This integration approach serves the function of complementarity, in which both methods are used to answer questions related to variation in organizational readiness for change and leadership behavior over time (Creswell & Clark, 2011).

Results

Quantitative Results

Longitudinal analyses demonstrated that organizational readiness for change followed an overall positive trajectory over the 12-month period (see Figure 1), with a marked upward shift at the 6-month time point.

Figure 1. Mean Consultant Ratings of Organizational Readiness for Change by Month.

Paired sample t-tests revealed significantly higher organizational readiness for change during the second six months of the training initiative compared to the first six months (t=2.63, df=6, p=.04), increasing from an average of 4.52 (2.78) in the first six months to 5.94 (2.01) in the second six months. Implementation leadership was significantly higher during the second six months of the training initiative compared to the first six months (t=4.08, df=6, p=.04), increasing from an average of 5.71 (1.63) in the first six months to 7.30 (1.10) during months 7-12. Leadership attendance increased significantly during the second phase of the training initiative (p<.001), with the average proportion of leadership attendance increasing from 45% in the first half of the training period to 74% in months 7 to 12. Implementation leadership (r=.49, p<.01) and leadership attendance (r=.29, p<.01) were both positively correlated with organizational readiness for change across all 12 time points.

Qualitative Results

The qualitative findings provided insight into the variation in organizational readiness for change and leadership behavior over the 12-month training initiative. There were similar themes across the majority of clinics reflecting the quantitative findings, but also some variation at the site level. Provider perceptions of implementation revealed their attitudes about the PCCP intervention, the influence of the larger agency context, and how leadership behavior shaped their understanding and response to the training initiative (see Table 1).

Table 1:

Themes of change motivation, change efficacy, and implementation leadership during the first six months and the second six months

Change Motivation

First six months:
I think in the overall what I have to do and juggle and balance, I don’t think—I don’t think everyone’s on the same page that that is the best method. Um, so again, it’s a lot of pushback of us to like everybody else, like, “No, this isn’t the way it should be.” (Direct Care/Site 2)
The other thing that I remember about the first email… said, “You’ve been chosen for this great opportunity,” and we’re all going, “Oh, no.” But she tried really hard to frame it as something really good. But there was moaning and groaning. (Supervisor/Site 3)

Second six months:
I think, um, for my program, I think there’s not—I think person-centered, again, like I said, it was very delayed, my response to it, but it’s been very positive and I don’t think it’s something that’s going to fall by the wayside. (Supervisor/Site 6)
Um, the other thing is, is I’ve been doing treatment plans for a, a long time, so it was, it was kinda—initially it wasn’t—but towards the end it got a little refreshing, um, because you—I got caught in, in—this is the way I do it. (Supervisor/Site 7)

Change Efficacy

First six months:
I mean, I feel kind of like, um—of course [pause] most of, you know, if you ask nine out of ten clinicians, of course they’re going to be, like, “I’m really happy with the way things are going in our program, we don’t need to change anything.” (Supervisor/Site 7)
Um, so I definitely felt the staff were open but a lot of them, again, folk who had been here a little bit longer, it was kind of like, “How is this different again? Or are we just calling it a different name?” (Supervisor/Site 6)

Second six months:
Got an email that we were doing this and I remember the reaction of the people was, “Person-centered treatment plans? We always did do person-centered treatment plans. We don’t need to learn that.” Come to find out it is a different, it is a little bit of a different process. (Supervisor/Site 3)
And it wasn’t just the same old stuff, either, like DMHS trains us, and it was actually fun activities and getting the staff to think about it and getting that ah-ha moment for them because of the material that we had. (Supervisor/Site 8)

Implementation Leadership

First six months:
But am I understanding what it—uh, the, the purpose of all of it and how it’s—no. I’m just doing it. [laughs] But I’m not really comprehending the importance of it. (Direct Care/Site 2)
Like I’d go to my supervisor and she’d be stressed out and then they’d call up here to [name 5], who’s like the mini second in command up here and you couldn’t get her and you’re leaving a message. So it kind of pitted people against each other and I’d say kind of a lack of leadership all through the summer that, where do you go? (Direct Care/Site 2)

Second six months:
F3: but I think that also, you know, having [leader] and [leader] involved showed a buy-in, showed some support, you know. She always is, she gets involved where she can and—physically and/or not just, “Yeah, that’s great, do that.” (Supervisor/Site 8)
I think the leadership promotes that, you know, um, in it’s all, it’s always a learning experience. You know, every day is a learning experience and I, I think, um, that’s truly based on the leadership and how the leadership is leading. Yeah. (Supervisor/Site 12)

Change Motivation

Providers described their reluctance to engage with the intervention during the first six months, which reflected both a resistance to a new practice and particular reservations about PCCP. The reluctance to engage was due to feeling that they were already overburdened by competing priorities and that this was an additional training workshop they had to attend. Many of the agencies were involved in other initiatives, some of which were being mandated at the state level, and PCCP was seen as yet another demand. This frustration was intensified by the perception that they had no choice in whether they participated. However, providers also described how this initial resistance decreased over time as they became more open to the practice itself. What they initially perceived to be an onerous task related to paperwork (the service plan), they gradually came to see as a valuable and rewarding clinical activity. Greater understanding about PCCP and what it entailed increased motivation as its perceived benefits outweighed concerns about competing demands. Once the practice was understood to improve their day-to-day work, providers became more open and viewed it as sustainable.

Change Efficacy

When providers spoke about their initial response to training in PCCP, many articulated the belief that they were already delivering person-centered care and questioned their need for training. The perception among providers that they had a high degree of efficacy in this practice was an initial barrier. Related to this was a more general feeling (particularly among providers who had been at an agency for some time) that no practices were really novel, which led to an overall cynicism about any new initiative.

The shift in the second six months in organizational readiness for change was expressed by one provider as “the aha moment” when staff fully understood the intervention and how it differed from what they were doing previously. This shift occurred as providers attended TA calls, during which they went through the process of developing their own person-centered care plans. At that time, there was also the recognition of the rewards gained by this new way of practicing, which in turn led to greater organizational readiness for change as providers became enthusiastic about PCCP. While the majority of the agencies experienced this shift, there were a few sites that continued to see themselves as already proficient in person-centered care and believed the training to be superfluous.

Implementation Leadership

Providers perceived that while leaders made the initial decision to implement PCCP, most were not going to be involved in the training process. The majority of providers experienced PCCP implementation as a top-down decision from executive leadership that was essentially communicated to them as a “fait accompli.” This created a lack of buy-in from the providers, partly because they had no say in the matter but also because they did not understand the larger context for the implementation. In turn, they felt the leadership was out of touch with their context, namely the daily demands and pressures of direct clinical practice. Beyond the initial directive to train in PCCP, providers experienced a lack of leadership involvement in the ongoing day-to-day implementation process, which created a lack of direction and tension.

The quantitative findings demonstrated a shift in leadership involvement through increased attendance in the second half of the training initiative. Though providers did not speak specifically to leadership attendance, they expressed an awareness of leadership involvement and took it as a signal of support for the implementation effort. Appreciation for leadership engagement was largely articulated by the supervisors, who as middle managers were looking for cues of buy-in from their executive leadership. One supervisor described a particular leader as a champion of PCCP and explained how that encouragement helped the implementation process. There was variation across the sites, with some experiencing consistent support from leadership throughout the training process while others felt a lack of leadership in the early stages of the training initiative.

Discussion

The study explored the variation in organizational readiness for change and leadership over time. The quantitative findings indicated a greater organizational readiness for change in the second six months than the first six months of the training initiative. The qualitative findings provided insight into these different stages of implementation. In the first six months, providers described their general resistance to taking on a new initiative amid other competing initiatives and regulatory demands, indicating a low degree of change motivation. At the outset, this resistance was not offset by a perceived value in changing their practice, because many of them felt they were already practicing person-centered care. This finding echoed results from previous studies revealing concerns about implementing PCCP specifically related to the belief among providers that their practice is already person-centered (Tondora et al., 2012). The observed shift in the second six months appeared to reflect a deeper understanding of the practice and an appreciation for how it could enhance their work with service users.

Overall, the lack of organizational readiness for change in the first six months of this training initiative can be ascribed to an absence of buy-in by providers, who could be understood as being in a pre-contemplation stage with regard to adopting PCCP (DiClemente & Prochaska, 1998). Taken together, these findings suggest the critical importance of identifying and addressing stage of change and sources of resistance prior to training by assessing barriers and facilitators (Powell et al., 2015). Involving providers in decisions from the outset, addressing concerns about competing demands, and giving providers a deeper sense of what the practice entails might have significantly improved initial receptivity to the training, thereby making the training initiative more efficient.

In terms of leadership, both the subjective measure (consultant ratings of implementation leadership) and the objective measure (leadership attendance on the calls) were correlated with an increase in organizational readiness during the last six months. This suggests a relationship between increased visibility and involvement by leadership in the implementation and a greater organizational readiness to implement PCCP. The connection between providers' willingness to implement the practice and the role of leadership was corroborated in the focus groups. Providers were attuned to the way leadership framed the implementation. They perceived the training initiative to be a top-down decision with no input from the clinical staff. This, together with their belief that leadership did not understand the daily realities of their work, clearly contributed to a lack of buy-in by providers at the beginning of the training. In contrast, leaders who actively engaged in the implementation effort and promoted a sense of excitement around the practice change were perceived positively by providers. These leaders played the role of "champions": leaders who signal their commitment to a new practice and encourage similar commitment in their staff (Greenhalgh, 2004).

Limitations

There were several limitations to this study. The measure of organizational readiness for change did not capture actual uptake of PCCP. Organizational readiness for change was measured by only a single global item, and implementation leadership by only four items. The brevity of the consultant ratings reflected the frequency with which they were completed, but it decreased the validity and reliability of the measure. Overall, consultant ratings of organizational readiness for change and leadership may have been subject to bias, as the consultants were also responsible for providing the initial two-day in-person training and leading the TA calls. However, findings were triangulated with an objective measure of leadership participation and provider perceptions from the focus group data. Attendance on the calls by leaders did not capture the extent of their participation. The focus groups may also have been subject to social desirability bias, as providers were with their peer group; however, the data reflected a variance of perceptions that suggested trustworthiness. While focus group data were used to interpret the quantitative data, providers did not necessarily discuss the implementation within a precise timeframe and may have had biased recall; therefore, mapping the focus group data onto the training time period was an approximation. However, the questions concerning the training were structured in such a way as to prompt providers to describe the training initiative in narrative form, focusing on how the process unfolded over time.

Conclusions

Training remains the predominant implementation strategy for new practice initiatives in human service settings, and its high cost in both provider time and resources is rarely reimbursable through traditional funding streams. Therefore, understanding the conditions that maximize the efficiency of training activities is critical. This study showed that, in the first six months, the training initiative had limited effect on agencies' readiness to implement PCCP, owing to a lack of provider buy-in and leadership endorsement. To address these barriers when preparing for a training initiative, agencies should consider employing a broader range of implementation strategies beyond training activities. Strategies to involve providers in decision-making from the outset, coordinate the new practice with other agency demands, and assess providers' stage of change with respect to the proposed practice have the potential to increase buy-in and, ultimately, the efficiency of the training. Leadership behavior is also important, as providers often take their cues from leaders when considering how much to commit to a training initiative. Organizational theories have posited that leadership visibility and role modeling play a central role in creating an organization that is ready to learn (Salas et al., 2012).

Therefore, while many agencies focus primarily on the delivery of training activities, this study suggests that preparation for a training process may be as important, hence the inclusion of planning stages in implementation frameworks (Damschroder et al., 2009; Fixsen et al., 2005). Trainers should assess organizational readiness for change at the outset and calibrate the timing of their training initiative accordingly, particularly external trainers who enter a setting with little knowledge of the context and provider perceptions. Some training models will proceed only after an assessment is made and an agency is deemed ready (Yanosy, 2018). By paying attention to facilitators and barriers from the outset, agencies can focus on creating optimal conditions for a new implementation effort and maximizing the efficiency of their training initiative.

Acknowledgments

Funding: This research was funded by the National Institute of Mental Health (NIMH), Grant No. R01MH099012. The content is the sole responsibility of the authors and does not necessarily reflect the official views of NIMH.

Footnotes

Publisher's Disclaimer: This Author Accepted Manuscript is a PDF file of an unedited peer-reviewed manuscript that has been accepted for publication but has not been copyedited or corrected. The official version of record published in the journal is kept up to date and may therefore differ from this version.

Ethical approval: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Conflict of Interest: The authors declare that they have no conflict of interest.

References

  1. Aarons GA (2006). Transformational and transactional leadership: Association with attitudes toward evidence-based practice. Psychiatric Services, 57(8), 1162–1169.
  2. Aarons G, Ehrhart M, & Farahnak L (2014). The implementation leadership scale (ILS): Development of a brief measure of unit level implementation leadership. Implementation Science, 9(1), 45. doi:10.1186/1748-5908-9-45
  3. Aarons G, Ehrhart M, Farahnak L, & Hurlburt M (2015). Leadership and organizational change for implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10(1), 11. doi:10.1186/s13012-014-0192-y
  4. Aarons GA, & Sommerfeld D (2012). Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. Journal of the American Academy of Child & Adolescent Psychiatry, 51(4), 423–431.
  5. Adams N, & Grieder D (2014). Treatment planning for person-centered care: The road to mental health and addiction recovery: Mapping the journey for individuals, families and providers (2nd ed.). Academic Press.
  6. Aguinis H, & Kraiger K (2008). Benefits of training and development for individuals and teams, organizations, and society. Annual Review of Psychology, 60(1), 451–474. doi:10.1146/annurev.psych.60.110707.163505
  7. Barnett M, Brookman-Frazee L, Regan J, Saifan D, Stadnick N, & Lau A (2017). How intervention and implementation characteristics relate to community therapists' attitudes toward evidence-based practices: A mixed methods study. Administration and Policy in Mental Health and Mental Health Services Research, 44(6), 824–837. doi:10.1007/s10488-017-0795-0
  8. Battilana J, Gilmartin M, Sengul M, Pache A-C, & Alexander JA (2010). Leadership competencies for implementing planned organizational change. The Leadership Quarterly, 21, 422–438. doi:10.1016/j.leaqua.2010.03.007
  9. Beidas RS, Edmunds JM, Marcus SC, & Kendall PC (2012). Training and consultation to promote implementation of an empirically supported treatment: A randomized trial. Psychiatric Services, 63(7), 660–665. doi:10.1176/appi.ps.201100401
  10. Beidas RS, & Kendall PC (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17(1), 1–30. doi:10.1111/j.1468-2850.2009.01187.x
  11. Boyatzis RE (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.
  12. Colquitt JA, LePine JA, & Noe RA (2000). Toward an integrative theory of training motivation: A meta-analytic path analysis of 20 years of research. Journal of Applied Psychology, 85(5), 678–707.
  13. Coulter A, Entwistle VA, Eccles A, Ryan S, Shepperd S, & Perera R (2015). Personalised care planning for adults with chronic or long-term health conditions. Cochrane Database of Systematic Reviews, 3.
  14. Creswell JW, & Clark VLP (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.
  15. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50.
  16. DiClemente CC, & Prochaska JO (1998). Toward a comprehensive, transtheoretical model of change: Stages of change and addictive behaviors. In Treating addictive behaviors (2nd ed., pp. 3–24). New York, NY: Plenum Press.
  17. Fixsen DL, Naoom SF, Blase KA, Friedman RM, & Wallace F (2005). Implementation research: A synthesis of the literature (FMHI Publication No. 231). Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network. Retrieved October 11, 2018, from http://www.fpg.unc.edu/~nirn/resources/publications/Monograph/index.cfm
  18. Forsetlund L, Bjørndal A, Rashidian A, Jamtvedt G, O'Brien MA, Wolf FM, . . . Oxman AD (2009). Continuing education meetings and workshops: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews, 2. doi:10.1002/14651858.CD003030.pub2
  19. Greenhalgh T (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581.
  20. Guerrero EG, Padwa H, Fenwick K, Harris LM, & Aarons GA (2016). Identifying and ranking implicit leadership strategies to promote evidence-based practice implementation in addiction health services. Implementation Science, 11(1), 69. doi:10.1186/s13012-016-0438-y
  21. Herschell AD, Kolko DJ, Baumann BL, & Davis AC (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30(4), 448–466.
  22. Herschell AD, McNeil CB, & McNeil DW (2004). Clinical child psychology's progress in disseminating empirically supported treatments, 267.
  23. Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC.
  24. Lyon A, Stirman S, Kerns S, & Bruns E (2011). Developing the mental health workforce: Review and application of training approaches from multiple disciplines. Administration and Policy in Mental Health and Mental Health Services Research, 38(4), 238–253. doi:10.1007/s10488-010-0331-y
  25. Matthews EB, Stanhope V, Choy-Brown M, & Doherty M (2018). Do providers know what they do not know? A correlational study of knowledge acquisition and person-centered care. Community Mental Health Journal, 54(5), 514–520. doi:10.1007/s10597-017-0216-6
  26. Miller W, & Mount K (2001). A small study of training in motivational interviewing: Does one workshop change clinician and client behavior? Behavioural and Cognitive Psychotherapy, 29(4), 457–471. doi:10.1017/S1352465801004064
  27. Miller WR, Yahne CE, Moyers TB, Martinez J, & Pirritano M (2004). A randomized trial of methods to help clinicians learn motivational interviewing, 1050.
  28. Nadeem E, Gleacher A, Pimentel S, Hill LC, McHugh M, & Hoagwood KE (2013). The role of consultation calls for clinic supervisors in supporting large-scale dissemination of evidence-based treatments for children. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 530–540. doi:10.1007/s10488-013-0491-7
  29. Padgett DK (2008). Qualitative methods in social work research (2nd ed.). Thousand Oaks, CA: Sage.
  30. Palinkas LA, Horwitz SM, Chamberlain P, Hurlburt MS, & Landsverk J (2011). Mixed-methods designs in mental health services research: A review. Psychiatric Services, 62(3), 255–263.
  31. Paradise A (2007). State of the industry: ASTD's annual review of trends in workplace learning and performance. Alexandria, VA: ASTD.
  32. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, … Kirchner JE (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(1), 1.
  33. Rubin DB (1987). Multiple imputation for nonresponse in surveys. New York: John Wiley.
  34. Salas E, Tannenbaum SI, Kraiger K, & Smith-Jentsch K (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13, 74–101.
  35. Schein E (2010). Organizational culture and leadership. San Francisco: Wiley.
  36. Stanhope V, Ingoglia C, Schmelter B, & Marcus SC (2013). Impact of person-centered planning and collaborative documentation on treatment adherence. Psychiatric Services, 64(1), 76–79.
  37. Stanhope V, Tondora J, Davidson L, Choy-Brown M, & Marcus SC (2015). Person-centered care planning and service engagement: A study protocol for a randomized controlled trial. Trials, 16(1), 180. doi:10.1186/s13063-015-0715-0
  38. Stetler C, Ritchie J, Rycroft-Malone J, & Charns M (2015). Leadership for evidence-based practice: Strategic and functional behaviors for institutionalizing EBP. Worldviews on Evidence-Based Nursing, 11(4), 219–226.
  39. Substance Abuse and Mental Health Services Administration. (2012). SAMHSA's definition and guiding principles of recovery – answering the call for feedback. Retrieved from http://blog.samhsa.gov/2011/12/22/samhsa%E2%80%99s-definition-and-guiding-principles-of-recovery-%E2%80%93-answering-the-call-for-feedback/
  40. Tondora J, Miller R, & Davidson L (2012). The top ten concerns about person-centered care planning in mental health systems. The International Journal of Person Centered Medicine, 2(3), 410–420.
  41. Tondora J, Miller R, Slade M, & Davidson L (2014). Partnering for recovery in mental health: A practical guide to person-centered planning. John Wiley & Sons.
  42. Weiner B (2009). A theory of organizational readiness for change. Implementation Science, 4, 67–75.
  43. Yanosy S (2018). The Sanctuary Model: A trauma responsive framework for organizational practice. Paper presented at the Visiting Scholars Lecture, Fordham Graduate School of Social Service, New York, NY.