Author manuscript; available in PMC: 2018 Mar 1.
Published in final edited form as: Psychiatr Serv. 2016 Dec 1;68(3):299–302. doi: 10.1176/appi.ps.201500468

Consumer outcomes after implementing CommonGround as an approach to shared decision making

Michelle P Salyers 1, Sadaaki Fukui 2, Kelsey A Bonfils 1, Ruth L Firmin 1, Lauren Luther 1, Rick Goscha 2, Charles A Rapp 2, Mark C Holter 2
PMCID: PMC5658777  NIHMSID: NIHMS910266  PMID: 27903137

Abstract

Objective

We examined consumer outcomes before and after implementing CommonGround, a computer-based shared decision-making program.

Methods

Consumers with severe mental illness (n=167) were interviewed prior to implementation as well as 12 and 18 months later to assess changes in active treatment involvement, symptoms, and recovery-related attitudes. Providers also rated consumers on level of treatment involvement.

Results

Most consumers used CommonGround at least once (67%), but few used the program regularly. Mixed effects regression analyses showed improvement in self-reported symptoms and recovery attitudes. Self-reported treatment involvement did not change, but for a subset with the same providers over time, the providers rated consumers as more active in treatment.

Conclusions

This study adds to the growing literature on tools to support shared decision-making, showing the potential benefits of CommonGround for improving recovery outcomes. More work is needed to better engage consumers in CommonGround and to test the approach with more rigorous methods.


Medication management for people with severe mental illness (SMI) has historically been conceptualized as a set of strategies to increase compliance. The current focus, however, is on person-centered, recovery-oriented care in which effective medication management supports consumer goals; this involves complex decision-making and requires a partnership between two experts, the consumer and the provider (1). This concept of shared decision-making (SDM) is now widely recognized as an indicator of high-quality healthcare, with increasing calls for SDM in mental health settings (1). However, SDM remains relatively rare in mental health, and few studies have examined approaches specifically designed to increase SDM in these settings.

Consumers with SMI desire a role in treatment decisions (2), but several barriers impede widespread use of SDM, including provider concerns of time constraints, questions of applicability for some consumers or clinical situations, and confusion around roles and responsibilities (3). Given barriers to SDM, decision-support tools may facilitate more effective and efficient clinical consultation, while promoting reciprocal exchange of information and preferences to improve consumer outcomes.

One promising decision-support system is CommonGround, which integrates computer technology, decision-support tools, peer support, and provider and consumer training (4). Initial pilot work with CommonGround among people with SMI suggested improved consumer-provider communication, shared treatment decisions, and an increased focus on recovery-oriented goals (4–6). Two other CommonGround evaluations reported mixed findings: one showed significantly improved symptoms and functioning and fewer consumer concerns about side effects with use of CommonGround (7), whereas the other did not show improvements in medication adherence over a 6-month follow-up (8). Taken together, these findings indicate that more research is needed to investigate the impact of CommonGround on consumer outcomes.

Within the CommonGround program, computer kiosks offer self-guided discovery modules designed to help individuals learn about recovery, identify strategies to reach recovery goals, and monitor and share progress. “Personal medicine” (self-identified strategies that provide meaning and help consumers stay well) and a “power statement” (goals for psychiatric medication use in the recovery context) are developed in CommonGround (4). Prior to a psychiatric visit, consumers complete a one-page health report, with assistance from peer providers, that integrates the power statement and personal medicine with current symptoms and concerns to facilitate more efficient communication with providers. The health report highlights the area(s) consumers most want to discuss during limited appointment times and assists in clarifying consumer and provider roles in the decision-making process. CommonGround was designed to overcome common obstacles for people with SMI, such as low literacy, limited computer skills, and potentially elevated symptoms, by providing peer-guided, computer-based tools in accessible language (4).

Our objective was to implement CommonGround in a new service setting -- an urban community mental health center (CMHC) -- and to examine outcomes of consumers with SMI engaged in assertive community treatment (ACT) or outpatient services who had access to the program. Because CommonGround prompts consumers to take a greater role in treatment decisions, we expected consumers to report an increased desire for autonomy in treatment decisions and to show greater activation in treatment. Further, CommonGround provides concrete tools to identify and address medication concerns and to integrate personal medicine and consumer preferences about medication into decision-making, which should contribute to reduced symptoms. Finally, given CommonGround’s emphasis on recovery, particularly through peer providers who model recovery (1), we hypothesized that consumers would report greater levels of recovery and hope. This study extends prior work by implementing the approach in a new setting and by assessing a broader range of recovery-related consumer outcomes.

Method

We implemented CommonGround in two outpatient clinics and two ACT teams serving adults with SMI in an urban CMHC. Due to staff turnover, there were eight different psychiatric providers over the study period. Visits with providers generally entailed check-ins, medication management, and discussion of consumer concerns. The CommonGround program was offered at decision-support centers (DSCs) staffed by peer providers.

Research assistants approached potential participants upon arrival for a psychiatric visit. The assistants described the study, screened interested participants, and completed an informed consent process. Eligibility criteria included receipt of psychiatric services from the CMHC, English fluency, ability to provide informed consent, and willingness to be interviewed three times and have three psychiatric provider visits audiotaped (baseline, 12 months, and 18 months). Consumers were not eligible if they were planning to leave the CMHC or change providers during the study timeframe. Assistants audiotaped the psychiatric visit and conducted an interview. Providers were asked to complete a brief measure assessing consumer involvement after the visit. Twelve-month and 18-month interviews were scheduled to coincide with psychiatric appointments. Consumers were paid $20 for each interview. All procedures were approved by the [university] Institutional Review Board.

We gathered demographic variables and obtained psychiatric diagnoses through agency records. The Patient Activation Measure for Mental Health (PAM-MH; 9) assessed activation in mental health treatment. The Autonomy Preference Index (API) assessed preferences related to autonomy in medical decision-making (10) with two subscales: information seeking and decision-making autonomy. We assessed symptoms with a subscale of the “How I Am Doing” scale from the CommonGround program (7). We used the 24-item Recovery Assessment Scale (RAS) to measure perceived level of recovery from psychiatric illness (11), and hope was assessed with the State Hope Scale (12). Providers rated consumer involvement in visits with a 6-item questionnaire developed for this study, indicating the extent to which the consumer and provider worked together in the session on 4-point response scales. All measures have been used previously in this population and showed good reliability.

Mixed effects regressions were used to examine changes in consumer outcomes over time, controlling for age, race, gender, and clinic type. Frequency of CommonGround health report completion, an indicator of intervention exposure intensity and the most critical indicator of program engagement (13), was also controlled for. Further, for consumers who had the same psychiatric provider over the 18 months (n=37), we examined whether providers perceived changes in consumer involvement over time. Because most consumers had different providers over time due to turnover, provider effects were not controlled for. Multiple imputation was used for missing data.
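
For readers interested in the general shape of such an analysis, the sketch below fits a random-intercept mixed-effects model with the statsmodels library on simulated long-format data. It is an illustration only: the variable names, covariate set, and coding of time are hypothetical and do not reproduce the study's actual model or data.

```python
# Illustrative sketch only: a random-intercept mixed-effects regression of the
# kind described above, fit with statsmodels on simulated data. All variable
# names and values are hypothetical; they do not reproduce the study's model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for pid in range(167):                    # 167 baseline participants
    person_int = rng.normal(0, 0.4)       # person-level random intercept
    age = rng.integers(18, 65)
    female = rng.integers(0, 2)
    reports = rng.integers(0, 8)          # health reports completed (exposure)
    for month in (0, 12, 18):             # assessment waves
        rows.append({"pid": pid, "month": month, "age": age,
                     "female": female, "reports": reports,
                     "ras_total": 3.8 + person_int + 0.005 * month
                                  + rng.normal(0, 0.3)})
df = pd.DataFrame(rows)

# Fixed effects for time and covariates; random intercept for each consumer.
model = smf.mixedlm("ras_total ~ month + age + female + reports",
                    data=df, groups=df["pid"])
print(model.fit().summary())
```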

Results

Over half of participating consumers were male (56.9%), over half were African American (54.8%), and over half had completed high school or some college (58.1%). Most participants were diagnosed with schizophrenia (67.6%). There were 167 participants at baseline, 105 at 12 months, and 83 at 18 months (50% dropout). Dropout was not significantly related to consumer demographics or baseline outcomes.

Regarding intervention exposure, 60 people (36%) never completed a health report, 34 (20%) completed one report, 24 (14%) completed two reports, 13 (8%) completed three reports, and 36 (22%) completed more than three reports during the study period. Among participants who were in the study for 18 months, those who had the same providers (n=37) completed the health report about twice as often (M±SD=6.2±4.9) as those with different providers (M±SD=2.5±1.6; t=−4.42, df=42.5, p<.001).
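
The fractional degrees of freedom reported above are consistent with a Welch (unequal-variance) t-test; a minimal sketch of that comparison on simulated data follows. The group sizes and the data themselves are assumptions for illustration, not the study's records.

```python
# Minimal sketch of the group comparison reported above, using Welch's t-test
# (which the fractional degrees of freedom imply). Data are simulated to match
# the reported means and SDs only roughly; group sizes are assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
same_provider = rng.normal(6.2, 4.9, size=37)   # reports completed, same provider
diff_provider = rng.normal(2.5, 1.6, size=46)   # reports completed, different providers

# equal_var=False requests Welch's test; the sign of t depends on group order.
t, p = stats.ttest_ind(same_provider, diff_provider, equal_var=False)
print(f"Welch's t = {t:.2f}, p = {p:.4f}")
```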

Consumer outcomes over time are shown in the Table. Self-reported patient activation and autonomy preferences did not change over time. However, consumers who had the same provider over 18 months showed significant improvement in provider perceptions of consumer involvement over time (β=.13). Among the entire sample, self-reported symptoms also improved over time (β=.08). Recovery attitudes showed significant improvement in RAS overall mean scores (β=.06) and the subscale “No domination by symptoms” (β=.15). Improvement in two other RAS subscales was marginally significant: “Personal confidence and hope” (β=.06) and “Reliance on others” (β=.07). Hope did not change over time.

Table.

Mixed Effects Regression Results

Variable                                                   Baseline (N=167)   12 Month (N=105)   18 Month (N=83)    Significance
                                                           M        SD        M        SD        M        SD        (mixed effects reg.)
Involvement
  Patient Activation Measure                               55.37    13.34     54.21    12.46     54.68    14.79     .57
  API: Decision-making                                     2.42     .83       2.51     .91       2.45     1.04      .78
  API: Information-seeking                                 4.39     .49       4.20     .54       4.27     .57       .59
  Providers' perception of consumer involvement (n = 37)   3.54     .42       3.65     .44       3.80     .36       .01
Symptoms
  HIAD (Symptom subscale)                                  3.53     .94       3.61     .95       3.69     .91       .02
Recovery Attitudes
  RAS: Total                                               3.84     .53       3.91     .64       3.97     .63       .02
  RAS: Personal Confidence/Hope                            3.83     .65       3.88     .79       3.95     .74       .06
  RAS: Willingness to ask for help                         4.17     .68       4.22     .71       4.26     .68       .24
  RAS: Goal and success orientation                        4.11     .62       4.09     .75       4.13     .67       .74
  RAS: Reliance on others                                  3.83     .75       3.78     .93       3.97     .78       .05
  RAS: No domination by symptoms                           3.13     .91       3.53     .94       3.44     1.01      <.01
  Hope                                                     2.91     .64       2.90     .66       2.95     .72       .55

Note. API=Autonomy Preference Index. Mean scores range from one to five, with higher scores indicating greater preferences for autonomy in decision-making or information-seeking. HIAD=How I Am Doing scale. Mean scores range from one to five, with higher scores indicating less severe symptoms. RAS=Recovery Assessment Scale. Mean scores range from one to five, with higher scores indicating greater perceptions of recovery. The Patient Activation Measure has possible scores from 1 to 100, with higher scores indicating greater patient activation. The State Hope Scale has possible mean scores from one to five, with higher scores indicating greater hope. The scale to assess providers’ perceptions of consumer involvement has possible mean scores from one to four, with higher scores indicating greater involvement in the visit.

Discussion

In this uncontrolled study, consumers reported improvements in symptoms and perceived recovery attitudes over time after implementing CommonGround in the context of ongoing mental health services. However, most measures of treatment involvement did not change. In addition, use of CommonGround was variable, with a large proportion (36%) never completing a health report.

In terms of positive changes in consumer outcomes, our findings are consistent with previous work showing improved symptoms and functioning in participants using CommonGround (7). The present study extends these findings by showing improvement on a recovery-related measure independent of the administrative data tracked in the CommonGround system. Another important contribution is that we examined the use of CommonGround in the context of both ACT and outpatient treatment teams. Given some of the uncertainty that has surrounded the feasibility of SDM with people with SMI (14), the present findings suggest that decision-aid technology with a support system that includes peer providers is promising for those involved in the most intensive community mental health services.

One unexpected finding in our study was the low rate of CommonGround health report completion across consumers overall. Given high levels of provider turnover during the study period and changing treatment team infrastructure, there were several barriers to implementation that likely influenced CommonGround use (15). Indeed, those who had the same providers completed the health report about twice as often as those with different providers, and these individuals showed positive changes in treatment involvement. It may be that provider consistency is an important mechanism that promotes both CommonGround use and consumer outcomes. Future work should seek strategies to support use of CommonGround, even in the face of turnover.

There are several limitations to this study. First, without a control group and experimental design, the causal influence of CommonGround on improved outcomes is not clear. Second, the study had a high rate of provider turnover, which affected CommonGround use as well as our ability to control for provider effects over time. Third, we had a relatively high rate of consumer dropout and a low rate of health report completion. Future work is needed to investigate factors contributing to systemic and participant-level barriers to engaging in CommonGround, including identifying subgroups for whom the intervention may be most effective. Finally, the relatively small effect sizes limit our interpretation of clinical significance and require further investigation.

Overall, our study found additional positive recovery-related outcomes after CommonGround implementation for people receiving ACT and outpatient services in a CMHC, indicating potential benefits of the program for those receiving the most intensive outpatient services. More attention to facilitating consistent use of CommonGround and a more rigorous design to evaluate its causal influence are warranted.

Acknowledgments

Research reported in this publication was supported by the National Institute of Mental Health of the National Institutes of Health under Award Number R34MH093563. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Footnotes

Disclosures and Acknowledgements: The authors declare no conflicts of interest.

Previous presentation: Preliminary data from this manuscript were presented at the 20th Annual Conference Anniversary Celebration of the Society for Social Work and Research.

References

1. Deegan PE, Drake RE. Shared decision making and medication management in the recovery process. Psychiatric Services. 2006;57:1636–9. doi:10.1176/ps.2006.57.11.1636
2. Hamann J, Cohen R, Leucht S, et al. Do patients with schizophrenia wish to be involved in decisions about their medical treatment? The American Journal of Psychiatry. 2005;162:2382–4. doi:10.1176/appi.ajp.162.12.2382
3. Gravel K, Légaré F, Graham ID. Barriers and facilitators to implementing shared decision-making in clinical practice: a systematic review of health professionals' perceptions. Implementation Science. 2006;1:16. doi:10.1186/1748-5908-1-16
4. Deegan PE, Rapp C, Holter M, et al. Best practices: a program to support shared decision making in an outpatient psychiatric medication clinic. Psychiatric Services. 2008;59:603–5. doi:10.1176/ps.2008.59.6.603
5. Campbell SR, Holter MC, Manthey TJ, et al. The effect of CommonGround software and decision support center. American Journal of Psychiatric Rehabilitation. 2014;17:166–80.
6. Goscha R, Rapp C. Exploring the experiences of client involvement in medication decisions using a shared decision making model: results of a qualitative study. Community Mental Health Journal. 2015;51:267–74. doi:10.1007/s10597-014-9759-y
7. MacDonald-Wilson KL, Deegan PE, Hutchison SL, et al. Integrating personal medicine into service delivery: empowering people in recovery. Psychiatric Rehabilitation Journal. 2013;36:258–63. doi:10.1037/prj0000027
8. Stein B, Kogan J, Mihalyo M, et al. Use of a computerized medication shared decision making tool in community mental health settings: impact on psychotropic medication adherence. Community Mental Health Journal. 2013;49:185–92. doi:10.1007/s10597-012-9528-8
9. Green C, Perrin N, Polen M, et al. Development of the Patient Activation Measure for Mental Health. Administration and Policy in Mental Health and Mental Health Services Research. 2010;37:327–33. doi:10.1007/s10488-009-0239-6
10. Ende J, Kazis L, Ash A, et al. Measuring patients’ desire for autonomy. Journal of General Internal Medicine. 1989;4:23–30. doi:10.1007/BF02596485
11. Salzer MS, Brusilovskiy E. Advancing recovery science: reliability and validity properties of the Recovery Assessment Scale. Psychiatric Services. 2014;65:442–53. doi:10.1176/appi.ps.201300089
12. Snyder CR, Sympson SC, Ybasco FC, et al. Development and validation of the State Hope Scale. Journal of Personality and Social Psychology. 1996;70:321–35. doi:10.1037//0022-3514.70.2.321
13. Fukui S, Salyers MP, Rapp C, et al. Supporting shared decision-making beyond consumer-prescriber interactions: initial development of the CommonGround fidelity scale. American Journal of Psychiatric Rehabilitation. In press. doi:10.1080/15487768.2016.1197864
14. Kaminskiy E. The elephant in the room: a theoretical examination of power for shared decision making in psychiatric medication management. Intersectionalities: A Global Journal of Social Work Analysis, Research, Policy, and Practice. 2015;4:19–38.
15. Authors. Implementing CommonGround in a community mental health center: lessons in a computerized decision support system. Under review. doi:10.1037/prj0000225
