Abstract
Objective
Although shared decision-making (SDM) is a key element of client-centered care, it has not been widely adopted. Accordingly, interventions have been developed to promote SDM. The aim of this study was to explore the implementation process of one SDM intervention, CommonGround, which utilizes peer specialists and a computerized decision support center to promote SDM.
Methods
As part of a larger study, CommonGround was implemented in four treatment teams in a community mental health center. The implementation process was examined through semi-structured interviews with 12 staff members who were integral to the CommonGround implementation. Responses were analyzed using content analysis. Program fidelity and client program use were also examined.
Results
Although key informants identified several client and staff benefits of using CommonGround, including improved treatment engagement and the availability of peer specialists, most clients did not use CommonGround consistently throughout the implementation. Key informants and fidelity reports indicated a number of program barriers (e.g., technological difficulties, increased staff burden) and contextual barriers (e.g., poor fit with service structure, decision support center location, low staff investment, high turnover) to the successful implementation of CommonGround. Strategies used to maximize the implementation by increasing awareness, buy-in, and utilization are also reported.
Conclusions and Implications for Practice
This implementation of CommonGround achieved limited success, partly as a result of program and contextual barriers. Future implementations may benefit from incorporating the strategies identified here in order to obtain the full program benefits.
Keywords: severe mental illness, communication, shared decision-making, client-centered care, computerized intervention
Client-centered care is a primary goal in healthcare (Institute of Medicine, 2001; National Research Council, 2006) with two main components: involving clients in their own care and individualizing care to client needs and preferences (Robinson, Callister, Berry, & Dearing, 2008). A core practice of client-centered care is shared decision-making (SDM), in which providers and clients collaboratively plan and address treatment goals (Charles, Gafni, & Whelan, 1997).
Literature demonstrates the potential for SDM to enhance client-centered care for people with severe mental illness (e.g., see Deegan & Drake, 2006; Fukui et al., 2014; Matthias et al., 2014; Park et al., 2014; Stacey et al., 2016). Research also shows that people with severe mental illness want to be involved in decision-making (Adams, Drake, & Wolford, 2007; Eliacin, Salyers, Kukla, & Matthias, 2015; Mahone et al., 2011; Matthias, Salyers, Rollins, & Frankel, 2012; Woltmann & Whitley, 2010). However, reports indicate that SDM does not occur regularly in psychiatric care (De las Cuevas, Peñate, Perestelo-Pérez, & Serrano-Aguilar, 2013; Goss et al., 2008; Matthias et al., 2014; Salyers et al., 2012). As a result, interventions have been proposed to increase SDM in psychiatry (Alegría et al., 2008; Bartels et al., 2013; Hamann, Cohen, Leucht, Busch, & Kissling, 2007; Hamann et al., 2011), but these have not been widely adopted.
One SDM intervention, CommonGround, was developed by and for people with mental illness (Deegan, 2010). CommonGround unites support from peer specialists, computer technology, and provider- and client-coaching to promote SDM (Deegan, 2010; Deegan, Rapp, Holter, & Riefer, 2008). The CommonGround program, housed in a Decision Support Center (DSC), is designed to help clients with agenda-setting, talking to providers, and decision-making to support medication management goals. Videos of other clients’ recovery stories are available along with psychoeducational resources. Peer specialists facilitate access to the CommonGround program, helping clients complete a “health report” prior to meeting with their psychiatric provider that describes symptoms, medications, concerns, and appointment goals. The report integrates “personal medicine” (self-initiated activities that promote wellness, e.g., gardening or working) and conveys a “power statement” describing the client’s goals for psychiatric medication (Deegan, 2010). Health reports can be accessed by providers, and shared decisions about treatment course can be entered. Pilot testing of CommonGround found improvements in communication and SDM (Campbell, Holter, Manthey, & Rapp, 2014; Deegan et al., 2008).
CommonGround was recently implemented in a community mental health center (CMHC) in Indiana. This implementation effort was informed by three key principles from the National Implementing Evidence-Based Practice (EBP) Project (Torrey, Lynde, & Gorman, 2005). First, relevant stakeholders (e.g., funders, administrators, clinicians, clients) must agree with the intervention’s value and be motivated to restructure current services to accommodate the implementation. Second, appropriate resources and supports for implementation (e.g., training, opportunities to observe the intervention at a model site) must be provided. Third, once the intervention is established, the agency and stakeholders should receive ongoing feedback about the implementation processes and intervention fidelity, along with resources to improve implementation (Torrey et al., 2001).
In the current CommonGround implementation, clients reported improved symptoms and recovery perceptions, and providers reported increased client involvement in treatment (Salyers et al., in press). Although CommonGround holds promise for wider-scale implementation, no work has yet focused on the implementation process for this complex intervention. Our aim was to explore the current CommonGround implementation process and identify strategies to enhance the program’s impact at future sites. We used a mixed-methods approach, triangulating data from three sources: health report completion data over the course of the implementation, qualitative interviews with staff, and three program fidelity reports.
Method
Setting
We implemented CommonGround in a large, urban CMHC. This site was chosen because it is the largest and oldest CMHC in Indiana, and the research team has a long-standing relationship with this CMHC, including implementation of other recovery-oriented practices. Two Assertive Community Treatment (ACT) teams and two outpatient teams were chosen by agency leaders to implement CommonGround. One hundred sixty-seven clients participated in the larger study at baseline, 105 at 12 months, and 83 at 18 months. For additional details regarding the main client outcome study, see Salyers et al. (in press).
Implementation Plan
Implementation planning began with the proposal, with the researchers working with an agency leader to identify target programs. Once funded (March, 2012), two managers (one ACT, one outpatient) and research staff visited an established CommonGround site in Kansas (August, 2012). An Implementation Coach was paired with the agency to support implementation. This coach received training from a CommonGround Specialist from the original pilot study (Deegan et al., 2008) and had monthly consultation calls early in this implementation. We developed a stakeholder steering committee (September, 2012) to guide our implementation efforts and ensure the implementation plan aligned with stakeholders’ goals and the agency’s capabilities. CMHC staff training occurred prior to CommonGround implementation (February-March, 2013). Individual training sessions were held for psychiatric providers (including new providers when teams experienced turnover). Four group trainings for new staff and refreshers for existing staff were provided throughout the implementation period. The Implementation Coach also visited the site monthly for six months after opening the DSC (May-November, 2013). A leadership team consisting of the research project manager, the Implementation Coach, agency managers, and team leaders met monthly to oversee the process. The CommonGround Specialist conducted three fidelity visits to document the degree of implementation and provide corrective feedback to agency leadership.
The initial target date for availability of the CommonGround program was April 1, 2013, but due to construction delays, agency use of the CommonGround program began on May 29th, 2013; use continued through March 5th, 2015. One ACT team moved to a new building in February, 2014. A second DSC was created there, and CommonGround was available to enrolled clients when each DSC was staffed by a peer specialist (20 hours per week). Key informant interviews were conducted between January, 2015 and May, 2015.
Data Sources
CommonGround usage data
From May, 2013 to February, 2015 (the end of the study observation period), the program developers (Pat Deegan PhD & Associates, LLC) tracked the number of times each enrolled client completed a health report.
Key informant interview
A semi-structured interview guide was developed in an iterative process between several co-authors. Interview questions asked about staff members’ roles at the CMHC, interaction with CommonGround, perceptions of the program’s benefits and drawbacks as well as implementation barriers and facilitators, the extent to which CommonGround was used and discussed by their fellow staff members, and suggested changes to the program or implementation process.
Fidelity reports
CommonGround fidelity was assessed with the 13-item CommonGround Fidelity Scale (Fukui et al., in press) at 6 months (12/15/13), 12 months (6/20/14), and 2 years (6/2/15) post-implementation. This scale was developed to assess fidelity to critical elements of the CommonGround intervention in five areas: structure, process, peer support, direct service staff integration, and supervision. Items have variable 5-point response options, with higher scores reflecting better adherence. Fidelity assessments were conducted through observation, interviews with staff and clients who used CommonGround, and review of CommonGround data for a subset of clients. Fidelity reports were provided to the agency and included scores, comments, and implementation suggestions. At the final 2-year assessment, item ratings were not assigned because the CMHC had largely ceased use of CommonGround; however, comments and implementation suggestions were still provided.
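For readers unfamiliar with this scoring convention, the sketch below illustrates how an average item score (reported in the Results) is derived. It is a minimal illustration, not the study's scoring code; the item ratings are hypothetical, and we assume the summary score is the simple mean of the 13 item ratings.

```python
# Minimal illustration of the fidelity scoring convention: 13 items,
# each rated on a 1-5 scale, summarized as an average item score.
# The ratings below are hypothetical, not data from this study.
ratings = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3, 4, 3, 4]  # one rating per item

assert len(ratings) == 13 and all(1 <= r <= 5 for r in ratings)
average_item_score = sum(ratings) / len(ratings)
print(f"Average item score: {average_item_score:.2f} out of 5")
```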
Procedures
For the key informant interviews, twelve staff members considered integral to the implementation of CommonGround at the CMHC participated (we were unable to contact two additional staff). Interviews averaged 30 minutes; participants were compensated $25. Roles included supervisors, peer specialists, registrars, psychiatrists, and upper management (exact numbers are withheld to ensure confidentiality). On average, participants had been working at the CMHC for 3 years. All procedures were approved by the Indiana University-Purdue University Indianapolis institutional review board.
Analyses
CommonGround usage data were aggregated at the team level. We computed descriptive statistics and used analysis of variance (ANOVA) to compare teams on the number of health reports completed per client.
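For illustration only, a minimal sketch of this kind of team-level comparison follows. It is not the study's analysis code; the data layout, team labels, and column names ("team", "n_reports") are assumptions made for the example.

```python
# Illustrative sketch: descriptive statistics aggregated at the team level,
# followed by a one-way ANOVA comparing teams on the number of health
# reports completed per client. All values here are hypothetical.
import pandas as pd
from scipy import stats

# Hypothetical per-client usage data: one row per enrolled client.
usage = pd.DataFrame({
    "team":      ["ACT-1"] * 4 + ["ACT-2"] * 4 + ["OP-3"] * 4 + ["OP-4"] * 4,
    "n_reports": [8, 12, 3, 6,   2, 1, 0, 3,    1, 0, 2, 1,    0, 1, 3, 1],
})

# Team-level descriptive statistics (mean, SD, median, N).
print(usage.groupby("team")["n_reports"].agg(["mean", "std", "median", "count"]))

# One-way ANOVA on reports completed, grouped by team.
groups = [g["n_reports"].to_numpy() for _, g in usage.groupby("team")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F({len(groups) - 1}, {len(usage) - len(groups)}) = {f_stat:.2f}, p = {p_value:.3f}")
```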
Key informant interviews were transcribed and de-identified before being coded by co-authors trained in qualitative research. A codebook was established using conventional content analysis (Hsieh & Shannon, 2005). All codes were developed from the data and discussed and refined within the team; no preconceived categories were used, allowing codes to emerge from the interviews (Kondracki, Wellman, & Amundson, 2002). Attention was paid to negative cases and disconfirming evidence throughout the coding process (Mays & Pope, 2000). Once initial coding was completed, each category was examined in-depth and summarized by a team member.
For fidelity reports, the first author examined scores, comments, and suggestions and summarized the data across the three assessments. This coding was audited by the last author to ensure confidence in the data (Creswell & Miller, 2000). Fidelity data served as a source of triangulation with the key informant interviews, ensuring comprehensiveness, encouraging reflexivity, and enhancing validity by providing the perspective of an outside observer (Mays & Pope, 2000).
Results
CommonGround Usage
As shown in Table 1, over 70% of clients on ACT teams completed a health report at least once over the 20-month observation period, whereas approximately half of outpatient clients completed at least one health report. Teams differed significantly on number of reports completed (F(3, 163) = 40.64, p < .001); one ACT team had greater per-client use (mean = 7.1, compared with 1.7, 1.2, and 1.2 for the other teams), indicating more frequent health report completion.
Table 1.
CommonGround Health Report Completion Aggregated at the Team Level
| Team | Mean (SD) | Median | Number who completed at least one report | Maximum reports completed |
|---|---|---|---|---|
| 1 (ACT) (N=31) | 7.06 (5.15) | 8 | 24 (77.4%) | 15 |
| 2 (ACT) (N=45) | 1.67 (1.64) | 1 | 32 (71.1%) | 7 |
| 3 (Outpatient) (N=45) | 1.16 (1.38) | 1 | 25 (55.6%) | 5 |
| 4 (Outpatient) (N=46) | 1.24 (1.58) | 1 | 26 (56.5%) | 6 |
Fidelity Reports
Based on fidelity ratings, the program achieved a moderate level of implementation at 6 months, with an average item score of 3.23 out of 5. A score of 3 on an item represents the mid-point, typically indicating that more than half of the specific item criteria were implemented (Fukui et al., in press). At 12 months, the average item score was similar (3.15), indicating implementation had not improved. Scores were not assigned at 2 years. Comments and suggestions from all three fidelity reports are integrated with the key informant interviews below.
Qualitative Results
Benefits of the program
Key informants identified benefits of CommonGround, particularly the breadth and relevance of material (e.g., budgeting, nutrition, relationships). The videos of recovery stories were also described as being normalizing for clients: “The videos are really important because the clients can see that they’re not the only one” (participant 10). Some staff felt the videos could increase client treatment engagement: “If we let them see through their own eyes from somebody else with their same experience, they might be able to buy-in and utilize the information” (4). Relatedly, staff described the CommonGround program as a non-threatening platform for clients that opened the door to productive conversations. The software was also appreciated as a time-saving resource that improved treatment consistency, especially when staff had gaps in their knowledge:
Sometimes people get stuck because if you’re working with hallucinations or you’re working with delusions, not every [staff] comes to this clinic with a great deal of mental health background…so there is kind of a learning curve so it can provide resources for people to have those conversations and to make the therapeutic relationship (1).
Furthermore, given high staff turnover, CommonGround made it easier for new staff to pick up where others left off.
CommonGround was also valued for better preparing clients. Visiting the DSC was viewed as a productive way for clients to use their time when waiting for medication appointments, which ultimately helped to ensure questions and concerns were addressed. CommonGround facilitated the flow of appointments and kept client goals salient, as noted by an informant: “It helped make sure that we were addressing what was important to them and not just what was important to [providers]” (6).
Finally, informants identified benefits of having peer specialists on staff, including the importance of peers in the broader team context:
Another thing that comes out of that place is to where a person does begin to talk and engage with specialist because they are feeling comfortable. But that bridges to possibilities with actually getting treatment from a clinician…so it opened communication with the rest of the team (11).
In addition to direct client benefits, the peer specialists also brought a new perspective to other staff members, sometimes combatting stereotypes about mental illness. The importance of peer specialists was supported in fidelity reports, where the assessor noted that the CMHC had “two very skilled peer specialists working in the DSCs.”
Barriers to implementation – CommonGround program-level factors
Informants spoke extensively about implementation barriers, but relatively few were attributed specifically to characteristics of CommonGround. One broad criticism was that CommonGround is a complex intervention requiring staff time and investment, adding duties to already busy schedules: “Our care coordinators, I feel like they have so much they have to do already that trying to say, ‘Hey, make sure you get people in for CommonGround’ would just feel overwhelming to them” (6). One informant commented, “[Trying to] get everybody to do the [health report] every time they go to a doctor’s appointment, it just seemed very impossible” (12). Further, new staff required training in CommonGround, which was viewed as secondary to other duties. This often led to program use falling by the wayside: “So, kind of thinking, what is this CommonGround thing? They get so much new, anywhere you go. You get a lot thrown at you. Head kind of spins. And then just that’s something they don’t have to do” (3).
Technology-related issues, particularly as they added to staff burden, were also discussed. The most common technology-related barriers concerned computer literacy for both staff and clients. One staff member suggested having in-house tech support to mitigate these concerns. Staff also noted that some clients with low computer literacy found CommonGround intimidating, even with peer specialists’ help. Some staff also mentioned difficulties opening another program on already-slow computers, remembering another password, and dealing with a system not designed to interface with their electronic medical records. One informant spoke about difficulty with the medication list embedded in the CommonGround program: “If we weren’t diligent about updating it, it would often then not be correct because it doesn’t interface within your systems…That occasionally caused some confusion” (6). CommonGround was also viewed as difficult for community-based staff (particularly ACT teams) to use; staff described concerns about unreliable internet connectivity and safety using computers in the field: “We can take the laptops into the community obviously. But I think some of the apartments we go into, it’s better to leave it in the trunk” (8).
Barriers to implementation – Contextual factors
Though some barriers could be attributed to the CommonGround program, most were inherent to the service setting or implementation design. Organizational barriers related to a lack of fit within the existing service structure. Program developers recommended coordinating a 30-minute DSC appointment with the peer provider prior to the medication visit, but decentralized scheduling complicated this process. Teams that scheduled blocks for appointment times struggled with clients not coming early enough to use the system. One team initially operated on a walk-in basis, which sometimes meant clients were not appropriately directed to the DSC: “The doctor could grab them before you even let them know that they haven’t met with CommonGround” (12).
Staff also commented that it was difficult to integrate CommonGround into a crisis-driven setting, where focus was on addressing clients’ immediate needs. One outpatient team member said, “We’re a crisis-driven clinic and you could use this [resource], and you could use that [resource], but then they’re like ‘well they don’t have a house,’ so some of that stuff gets in the way” (1).
The fidelity assessor also commented that the CommonGround program was never fully integrated into services at the CMHC; this was particularly evident at the 12-month fidelity review in power statements that had not changed (these should be updated periodically to ensure they reflect current client goals) and in inconsistent staff recording of shared decisions over time. For example, at 2 years:
Four of the [client records] we reviewed did not complete a health report. Of the six that did complete a report, three had no shared decision documented on the report. Three did have shared decisions, and all three were written by one prescriber.
The assessor noted uneven team participation in CommonGround, with one prescriber more involved than others, consistent with the CommonGround usage data.
Several staff commented on issues related to the study design of selecting specific teams for CommonGround implementation. One informant said, “I would have scaled back and maybe only included either ACT or outpatient” (3), while another said, “We could have done it on all four [outpatient] teams instead of just 2 teams, then the whole building would be doing it, and that might have made a difference” (8). The fidelity assessor noted that study constraints were key to the ongoing lack of integration:
The direct service staff do not consistently review their clients' CommonGround [health] reports. When they do, it is typically because a client brings the report to them. One of the difficulties for direct service staff is that only a few clients on their caseloads [were recruited for the study to] use CommonGround. Because of this, CommonGround has not become integrated with the work they typically engage in with clients.
The location of the DSC itself posed barriers, first through construction delays, and later because of its placement relative to prescriber offices and other staff:
The building at that time was undergoing a lot of construction and that kind of pushed back the start date a little, a lot it seemed unfortunately… it was really hard then to plan trainings and to get excited about it and then it’s not happening (3).
Regarding the placement of the DSC, “I think maybe if somehow [the DSC] was on that same floor, I think there would’ve been a lot more interest and, I think clients would’ve probably used it more” (9). Concerns with DSC location became particularly salient when one ACT team moved to a new building, resulting in reduced staffing at both DSCs:
[The peer specialist is] only part-time. So some of the days, the clients do come in asking for her, but then she’s not here. So, I just try to instruct them to come back on the days that she is here (5).
This could make client appointments difficult to coordinate: “We could only schedule CommonGround on Monday, Wednesdays, and Fridays. And that got to be confusing” (6). The fidelity assessor also commented on this issue: “This led to confusion on the part of both staff and CommonGround users, because the DSC was not consistently open during clinic appointments. It appears that this impacted the motivation of both staff and clients to use CommonGround.”
Staffing issues, both lack of investment and turnover, were viewed as major barriers. Some reported that staff might not recognize the benefits of using CommonGround: “I think it’s underutilized because people don’t understand the richness of it, and they think they have sheets and ideas, but I don’t think they realize how much is on there” (1). Informants also discussed the importance of getting buy-in from team leaders, care coordinators, and prescribers, but this was sometimes challenging: “we just had such a difficult time I think getting the prescribers on board” (10). Staff turnover at the CMHC contributed to the lack of staff investment and a general lack of excitement about CommonGround. The loss of many initial staff, combined with construction delays, resulted in apathy towards the project and “the impression that this is just like an administrative thing – a box that we check” (4). The fidelity assessor noted several staffing-related challenges at the 2-year assessment:
Staff participation has been impacted by changes in prescribers (e.g., previous prescriber used CommonGround, but the current one does not), lack of integration of CommonGround in the service delivery system so that it became a routine service, turnover of care coordinators, and part-time staffing of the DSC.
Strategies to maximize implementation
Staff across teams talked about several implementation strategies they used to increase awareness, buy-in, and use of CommonGround. One key strategy, mentioned by nearly all staff members, was having the team leader remind or encourage staff to use CommonGround during supervision or team meetings. Prescribers could also be influential. One informant noted how a prescriber would remind clients to use the DSC by saying, “You’ve got a CommonGround appointment. You need to go downstairs and do that” (9). Peer specialists were also able to increase staff awareness of CommonGround by attending team meetings. Peer specialists created lists of scheduled clients who should visit the DSC on a given day, despite difficulties with non-centralized scheduling. Receptionists and registrars also played a role in increasing client participation: “The registrar for that team got very invested and was very helpful, saying, you can’t go see [your prescriber] yet. Let me take you down to see [the peer specialist], which was helpful” (3).
In attempting to overcome barriers to using CommonGround outside the CMHC, several staff mentioned printing handouts so they were easily accessible and could be taken into the community. Additionally, efforts were made to better integrate CommonGround with existing technology: “we ended up putting something in our electronic medical record that would flag people that were in CommonGround, which was really helpful for a lot of folks” (3). Indeed, this, along with better peer specialist integration, was noted by the fidelity assessor at 12 months: “We saw evidence that [CMHC] has taken steps to improve the visibility of the program by inviting peer specialists to the team meetings and including a ‘pop-up’ feature on the EMR to notify staff of CommonGround users.”
Discussion
To our knowledge, this is the first study to report in-depth on the implementation process of the CommonGround program. Considering the low use of the program in this implementation, it is unsurprising that key informants reported numerous barriers, though several benefits of the program were identified. Many contextual barriers identified by key informants were supported by evidence from fidelity reports. Overall, the evidence indicates that the CommonGround program never reached full integration into services at the CMHC and, in fact, had ceased operation by the 2-year fidelity assessment. This is consistent with another study of CommonGround that found key concepts of health reports were not routinely discussed during psychiatric appointments, suggesting incomplete implementation (Campbell et al., 2014).
Despite struggles with implementation, CommonGround, like other SDM interventions designed for people with mental illness (Bartels et al., 2013; Hamann et al., 2011), was generally liked, and informants reported numerous benefits, including the wide array of valuable information, normalizing and empowering videos for clients, and the potential to increase client treatment engagement. Within CommonGround, peer providers were especially appreciated, consistent with past literature indicating additional gains associated with peer-based interventions for client recovery (Fuhr et al., 2014). It is important to note that the CommonGround program is more complex and covers a wider range of materials than other SDM interventions for this population, which have mainly employed time-limited psychoeducational or training approaches (Alegría et al., 2008; Bartels et al., 2013; Hamann et al., 2007; Hamann et al., 2011). While some barriers found in this implementation could apply to any new program being added to busy clinician schedules, the complexity of the program and the coordination of many people needed to integrate CommonGround into routine services were difficult to overcome in a crisis-driven service setting, and staff struggled with the technology-based intervention. Although one ACT team was able to overcome these barriers, future studies may consider ways to enhance availability of CommonGround in the field, while recognizing potential issues of internet access and providing real-time technical support as needed.
Context-related barriers, such as construction delays, staff turnover and lack of buy-in, the addition of a second service site (without additional peer specialist time), and recruitment limitations that prevented CommonGround access for all clients, also made implementation more difficult. Some of these barriers are consistent with previous research, including difficulty coordinating client time in the DSC prior to medication appointments (Deegan et al., 2008) and issues with buy-in from key staff, particularly psychiatrists (Campbell et al., 2014). The greatest issue in this implementation was a lack of investment from multiple key staff members, resulting in a lack of synergy. According to Rapp and colleagues (2010), “synergy emerges when all the key players are fulfilling their necessary role and meeting expectations” (p. 117). Although some staff at the CMHC were enthusiastic about the CommonGround program, this enthusiasm was not sustained throughout the organization, as evidenced by the absence of ongoing updates to key elements of the CommonGround program and the lower priority of CommonGround training compared to other training provided to new hires. Further, the agency did not set expectations that aligned with the implementation process, despite suggestions from the fidelity assessor.
Considering these findings in the context of the principles put forth in the National Implementing EBP Project, the lack of staff synergy could have resulted from failure in the first and/or second principles. It could be that the research team was not able to successfully show the benefit of the intervention from the start, failing to motivate staff at various levels to change the existing service structure enough to accommodate the intervention. Alternatively, although there may have been initial motivation, more resources and supports may have been needed to sustain motivation and enthusiasm for the intervention in the face of implementation barriers.
Despite the lack of synergy throughout the agency, one team was able to overcome many implementation barriers and achieve integration of CommonGround resources. This team’s success seemed to be the product of a very supportive psychiatrist, team leader, and registrar, all of whom made efforts to get clients to the DSC prior to appointments. Over time, participation in CommonGround became routine for these staff and clients. Importantly, there was no turnover among the psychiatrist, team leader, or registrar on this team during this implementation. This suggests that a few consistent people who are motivated to use the program (i.e., champions) could be key to successful CommonGround implementation. Although we cannot determine whether the success was related to the specific roles or characteristics of the people performing those roles (e.g., social capital, leadership qualities), given the importance of champions to the implementation process (Damschroder, Banaszak-Holl, et al., 2009), further research could examine the qualities of effective CommonGround champions.
Looking at the broader implementation literature, our emergent findings map onto four of the five constructs in the Consolidated Framework for Implementation Research (CFIR): intervention characteristics, characteristics of individuals, the implementation process, and the inner setting (Damschroder, Aron, et al., 2009). Regarding the intervention itself, key informants suggested that the intervention was not perceived as adaptable and, for some, was overly complex. Individual staff characteristics also played a role, as staff held varying knowledge and beliefs about the intervention. The implementation process was also impacted by unforeseen issues, including the addition of a new building midway through the study and turnover. However, within the CFIR model, the inner setting appeared to play the largest role in implementation: the program did not fit neatly into an organizational structure that was crisis-oriented. There was little pressure to change current services, and no policies or incentives were established to encourage participation and routine program use.
Our findings can also be contextualized with regard to other computerized interventions in mental health. Several feasibility studies of computerized interventions have reported staff reluctance or lack of buy-in similar to that discussed by informants in this study (Koivunen, Hätönen, & Välimäki, 2008; Kuosmanen, Jakobsson, Hyttinen, Koivunen, & Välimäki, 2010). One recent study (Priebe et al., 2015) of a computer-mediated intervention designed to enhance client-centeredness and provide structure for client encounters in the UK (DIALOG+) found similar results: although clients seemed to benefit from the intervention, implementation was variable, with nearly a third of the sample never exposed to DIALOG+. For CommonGround, limited program engagement may have been less of a concern for psychiatric providers, as clients still attended medication appointments regardless of whether they engaged with CommonGround, and the DSC was located away from the providers. Thus, more research is needed on ways to work effectively with staff and organizational contexts to support increased buy-in and implementation of computerized interventions.
One alternative to the implementation approach taken here that may be useful in future efforts is the learning collaborative model (IHI, 2003). Within this model, smaller-scale tests of the intervention are conducted prior to larger implementation, using the Plan-Do-Study-Act (PDSA) method that has been applied to other healthcare initiatives (Berwick, 1998). Considering the agency-wide complexities of CommonGround, conducting smaller-scale implementations, studying the results, and adapting further implementations to the agency’s specific needs could improve integration of the service and save time and costs. Indeed, one recent study describing the implementation of decision support aids in 52 CMHCs suggests that this implementation strategy was successful (MacDonald-Wilson, Hutchison, Karpov, Wittman, & Deegan, 2016). This also aligns with some staff feedback in our study that taking on four teams in the agency was too many.
Overall, this implementation of CommonGround attained limited success, achieving integration into routine services in only one of four teams. Although our study is limited to one organization and one commercial, computerized intervention, we were able to integrate data from multiple sources, and several practical suggestions for future implementation efforts emerged. Specific to CommonGround, the program was generally liked, but real-time technical support would be beneficial. More generally, with such a complex intervention requiring cooperation from multiple staff levels, our results highlight the importance of synergy across staff and a supportive inner setting. Future implementations of CommonGround and other computerized interventions in mental health settings should consider investing additional time and resources in gaining staff buy-in, and may benefit from identifying champions of the program early on to assist in ongoing integration of the intervention into routine services.
Acknowledgments
Research reported in this publication was supported by the National Institute of Mental Health of the National Institutes of Health under Award number R34MH093563 (A Pilot Test of CommonGround Based Shared Decision-Making). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. We thank the staff at Midtown Community Mental Health Center for their help in completing this research.
Contributor Information
Kelsey A. Bonfils, Indiana University-Purdue University Indianapolis, Department of Psychology.
Kimberly C. Dreison, Indiana University-Purdue University Indianapolis, Department of Psychology.
Lauren Luther, Indiana University-Purdue University Indianapolis, Department of Psychology.
Sadaaki Fukui, University of Kansas School of Social Welfare, Center for Mental Health Research and Innovation.
Abigail E. Dempsey, Indiana University-Purdue University Indianapolis, Department of Psychology.
Charles A. Rapp, University of Kansas School of Social Welfare, Center for Mental Health Research and Innovation.
Michelle P. Salyers, Indiana University-Purdue University Indianapolis, Department of Psychology.
References
- Adams J, Drake R, Wolford G. Shared decision-making preferences of people with severe mental illness. Psychiatric Services. 2007;58(9):1219–1221. doi: 10.1176/ps.2007.58.9.1219.
- Alegría M, Polo A, Gao S, Santana L, Rothstein D, Jimenez A, Normand S-L. Evaluation of a patient activation and empowerment intervention in mental health care. Medical Care. 2008;46(3):247–256. doi: 10.1097/MLR.0b013e318158af52.
- Bartels SJ, Aschbrenner KA, Rolin SA, Hendrick DC, Naslund JA, Faber MJ. Activating older adults with serious mental illness for collaborative primary care visits. Psychiatric Rehabilitation Journal. 2013;36(4):278–288. doi: 10.1037/prj0000024.
- Berwick DM. Developing and testing changes in delivery of care. Annals of Internal Medicine. 1998;128(8):651–656. doi: 10.7326/0003-4819-128-8-199804150-00009.
- Campbell SR, Holter MC, Manthey TJ, Rapp CA. The effect of CommonGround software and Decision Support Center. American Journal of Psychiatric Rehabilitation. 2014;17(2):166–180.
- Charles C, Gafni A, Whelan T. Shared decision-making in the medical encounter: What does it mean? (or it takes at least two to tango). Social Science and Medicine. 1997;44(5):681–692. doi: 10.1016/s0277-9536(96)00221-3.
- Creswell JW, Miller DL. Determining validity in qualitative inquiry. Theory Into Practice. 2000;39(3):124–130.
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50–64. doi: 10.1186/1748-5908-4-50.
- Damschroder LJ, Banaszak-Holl J, Kowalski CP, Forman J, Saint S, Krein SL. The role of the “champion” in infection prevention: Results from a multisite qualitative study. Quality and Safety in Health Care. 2009;18(6):434–440. doi: 10.1136/qshc.2009.034199.
- De las Cuevas C, Peñate W, Perestelo-Pérez L, Serrano-Aguilar P. Shared decision making in psychiatric practice and the primary care setting is unique, as measured using a 9-item Shared Decision Making Questionnaire (SDM-Q-9). Neuropsychiatric Disease and Treatment. 2013;9:1045–1052. doi: 10.2147/NDT.S49021.
- Deegan PE. A web application to support recovery and shared decision making in psychiatric medication clinics. Psychiatric Rehabilitation Journal. 2010;34(1):23–28. doi: 10.2975/34.1.2010.23.28.
- Deegan PE, Drake R. Shared decision making and medication management in the recovery process. Psychiatric Services. 2006;57(11):1636–1639. doi: 10.1176/ps.2006.57.11.1636.
- Deegan PE, Rapp C, Holter M, Riefer M. Best practices: A program to support shared decision making in an outpatient psychiatric medication clinic. Psychiatric Services. 2008;59(6):603–605. doi: 10.1176/ps.2008.59.6.603.
- Eliacin J, Salyers MP, Kukla M, Matthias MS. Patients’ understanding of shared decision-making in a mental health setting. Qualitative Health Research. 2015;25(5):668–678. doi: 10.1177/1049732314551060.
- Fuhr DC, Salisbury TT, De Silva MJ, Atif N, van Ginneken N, Rahman A, Patel V. Effectiveness of peer-delivered interventions for severe mental illness and depression on clinical and psychosocial outcomes: A systematic review and meta-analysis. Social Psychiatry and Psychiatric Epidemiology. 2014;49(11):1691–1702. doi: 10.1007/s00127-014-0857-5.
- Fukui S, Salyers M, Matthias M, Collins L, Thompson J, Coffman M, Torrey W. Predictors of shared decision making and level of agreement between consumers and providers in psychiatric care. Community Mental Health Journal. 2014;50(4):375–382. doi: 10.1007/s10597-012-9584-0.
- Fukui S, Salyers MP, Rapp C, Goscha R, Young L, Mabry A. Supporting shared decision-making beyond consumer-prescriber interactions: Initial development of the CommonGround fidelity scale. American Journal of Psychiatric Rehabilitation. (in press). doi: 10.1080/15487768.2016.1197864.
- Goss C, Moretti F, Mazzi MA, Del Piccolo L, Rimondini M, Zimmermann C. Involving patients in decisions during psychiatric consultations. The British Journal of Psychiatry. 2008;193(5):416–421. doi: 10.1192/bjp.bp.107.048728.
- Hamann J, Cohen R, Leucht S, Busch R, Kissling W. Shared decision making and long-term outcome in schizophrenia treatment. The Journal of Clinical Psychiatry. 2007;68(7):992–997. doi: 10.4088/jcp.v68n0703.
- Hamann J, Mendel R, Meier A, Asani F, Pausch E, Leucht S, Kissling W. ‘How to speak to your psychiatrist’: Shared decision-making training for inpatients with schizophrenia. Psychiatric Services. 2011;62(10):1218–1221. doi: 10.1176/ps.62.10.pss6210_1218.
- Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qualitative Health Research. 2005;15(9):1277–1288. doi: 10.1177/1049732305276687.
- Institute for Healthcare Improvement. The breakthrough series: IHI’s collaborative model for achieving breakthrough improvement (IHI Innovation Series white paper). Boston, MA: Institute for Healthcare Improvement; 2003.
- Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press; 2001.
- Koivunen M, Hätönen H, Välimäki M. Barriers and facilitators influencing the implementation of an interactive Internet-portal application for patient education in psychiatric hospitals. Patient Education and Counseling. 2008;70(3):412–419. doi: 10.1016/j.pec.2007.11.002.
- Kondracki NL, Wellman NS, Amundson DR. Content analysis: Review of methods and their applications in nutrition education. Journal of Nutrition Education and Behavior. 2002;34(4):224–230. doi: 10.1016/s1499-4046(06)60097-3.
- Kuosmanen L, Jakobsson T, Hyttinen J, Koivunen M, Välimäki M. Usability evaluation of a web-based patient information system for individuals with severe mental health problems. Journal of Advanced Nursing. 2010;66(12):2701–2710. doi: 10.1111/j.1365-2648.2010.05411.x.
- MacDonald-Wilson KL, Hutchison SL, Karpov I, Wittman P, Deegan PE. A successful implementation strategy to support adoption of decision making in mental health services. Community Mental Health Journal. 2016. Advance online publication. doi: 10.1007/s10597-016-0027-1.
- Mahone IH, Farrell S, Hinton I, Johnson R, Moody D, Rifkin K, Barker MR. Shared decision making in mental health treatment: Qualitative findings from stakeholder focus groups. Archives of Psychiatric Nursing. 2011;25(6):e27–e36. doi: 10.1016/j.apnu.2011.04.003.
- Matthias MS, Fukui S, Kukla M, Eliacin J, Bonfils KA, Firmin RL, Salyers MP. Consumer and relationship factors associated with shared decision-making in mental health consultations. Psychiatric Services. 2014;65(12):1488–1491. doi: 10.1176/appi.ps.201300563.
- Matthias MS, Salyers MP, Rollins AL, Frankel RM. Decision making in recovery-oriented mental health care. Psychiatric Rehabilitation Journal. 2012;35(4):305–314. doi: 10.2975/35.4.2012.305.314.
- Mays N, Pope C. Assessing quality in qualitative research. BMJ. 2000;320(7226):50–52. doi: 10.1136/bmj.320.7226.50.
- National Research Council. Improving the quality of health care for mental and substance-use conditions: Quality chasm series. Washington, DC: The National Academies Press; 2006.
- Park SG, Derman M, Dixon LB, Brown CH, Klingaman EA, Fang LJ, Kreyenbuhl J. Factors associated with shared decision-making preferences among Veterans with serious mental illness. Psychiatric Services. 2014;65(12):1409–1413. doi: 10.1176/appi.ps.201400131.
- Priebe S, Kelley L, Omer S, Golden E, Walsh S, Khanom H, McCabe R. The effectiveness of a patient-centred assessment with a solution-focused approach (DIALOG+) for patients with psychosis: A pragmatic cluster-randomised controlled trial in community care. Psychotherapy and Psychosomatics. 2015;84(5):304–313. doi: 10.1159/000430991.
- Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, Holter M. Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal. 2010;46(2):112–118. doi: 10.1007/s10597-009-9238-z.
- Robinson JH, Callister LC, Berry JA, Dearing KA. Patient-centered care and adherence: Definitions and applications to improve outcomes. Journal of the American Academy of Nurse Practitioners. 2008;20(12):600–607. doi: 10.1111/j.1745-7599.2008.00360.x.
- Salyers MP, Fukui S, Bonfils KA, Firmin RL, Luther L, Goscha RJ, Rapp CA, Holter MC. Consumer outcomes in implementing CommonGround as an approach to shared decision-making. Psychiatric Services. (in press).
- Salyers MP, Matthias MS, Fukui S, Holter MC, Collins L, Rose N, Torrey WC. A coding system to measure elements of shared decision making during psychiatric visits. Psychiatric Services. 2012;63(8):779–784. doi: 10.1176/appi.ps.201100496.
- Stacey G, Felton A, Morgan A, Stickley T, Willis M, Diamond B, Dumenya J. A critical narrative analysis of shared decision-making in acute, in-patient mental health care. Journal of Interprofessional Care. 2016;30(1):35–41. doi: 10.3109/13561820.2015.1064878.
- Torrey WC, Drake RE, Dixon L, Burns BJ, Flynn L, Rush AJ, Klatzker D. Implementing evidence-based practices for persons with severe mental illnesses. Psychiatric Services. 2001;52(1):45–50. doi: 10.1176/appi.ps.52.1.45.
- Torrey WC, Lynde DW, Gorman P. Promoting the implementation of practices that are supported by research: The National Implementing Evidence-Based Practice Project. Child and Adolescent Psychiatric Clinics of North America. 2005;14(2):297–306. doi: 10.1016/j.chc.2004.05.004.
- Woltmann EM, Whitley R. Shared decision making in public mental health care: Perspectives from consumers living with severe mental illness. Psychiatric Rehabilitation Journal. 2010;34(1):29–36. doi: 10.2975/34.1.2010.29.36.
