Abstract
Objective:
The current paper describes efforts to develop and test a measure of recovery-oriented inpatient care.
Methods:
The Recovery-oriented Acute INpatient (RAIN) scale was based on prior literature and current Veterans Health Administration (VHA) policy and resources, and was further revised based on data collection from 34 VHA acute inpatient units.
Findings:
The final scale of 23 behaviorally anchored items demonstrated a four-factor structure comprising inpatient treatment planning, outpatient treatment planning, group programming, and milieu. While several items require additional revision to address psychometric concerns, the scale demonstrated adequate model fit and was consistent with prior literature on recovery-oriented inpatient care.
Conclusions and Implications for Practice:
The RAIN scale represents an important tool for future implementation and empirical study of recovery-oriented inpatient care.
The literature on operationalizing recovery-oriented outpatient care is abundant and has grown over the past 30 years. Early work focused on conceptualizing recovery (Anthony, 1993; Deegan, 1988) and explicating recovery-promoting principles for the mental health system (Hogan, 2003; Jacobson, 2004; Jacobson & Greenley, 2001). Other work has attempted to develop measures of recovery orientation (Williams et al., 2012) and recovery climate and culture (Evans et al., 2020). Meanwhile, a wealth of literature has linked specific elements of recovery-oriented services in outpatient mental health settings with better functional and clinical outcomes. For example, interdisciplinary treatment teams (Bond et al., 2000; Mueser et al., 1998), self-management programs (McGuire et al., 2017; Mueser et al., 2006), and shared decision-making (Hamann et al., 2007) have been found to lead to better patient outcomes (Malinovsky et al., 2013). However, definitions of recovery-oriented care remain myriad, underscoring the need for clear, measurable operationalizations.
The importance of clear model definitions and related measures has been emphasized within the implementation science literature (Damschroder et al., 2009) and is best represented by the use of fidelity scales (Bond et al., 2000). Fidelity scale development generally requires identification of essential elements, which are based on gold-standard programs, extant empirical support, or stakeholder ratings (Bond et al., 2000; Bond & Drake, 2020; Koop et al., 2004). These essential elements are then translated into a validated fidelity measurement tool. Examples of fidelity scales in recovery-oriented services include those for illness management and recovery (Egeland et al., 2019; McGuire et al., 2012; Mueser et al., 2002), wellness recovery action planning (Cook et al., 2012), recovery-oriented assertive community treatment (Cuddeback et al., 2013; Monroe-DeVita et al., 2011; Moser et al., 2013), and supported employment (Bond et al., 1997; Bond et al., 2008; Bond et al., 2012).
While operationalization of outpatient recovery-oriented services has progressed, there has been much less attention to operationalizing and measuring recovery-oriented inpatient mental health services. Indeed, many authors have noted the relative neglect of recovery-oriented services in the inpatient setting (Simpson et al., 2017; Waldemar et al., 2016; Waldemar et al., 2018; Zuehlke et al., 2016). Despite this relative neglect, some notable work has focused on recovery-oriented inpatient services (Tsai & Salyers, 2010; Tsai et al., 2010). Early work focused on assessing inpatient staff perceptions of how recovery-oriented their services were (McLoughlin & Fitzpatrick, 2008; Salyers et al., 2007), including comparisons with outpatient settings (Tsai & Salyers, 2010) and links between perceptions and types of training (Tsai et al., 2010). Other efforts have been made to explicate ideal characteristics of inpatient care from a variety of stakeholder perspectives (Foye et al., 2020; Sowers, 2005; Wyder et al., 2017). Additionally, several studies have examined the implementation of recovery-oriented care on specific inpatient units (McDonagh et al., 2019; Rabenschlag et al., 2014) or across a system of hospital units (Ahmed et al., 2013; Simpson et al., 2017; Waldemar et al., 2018). However, these efforts differ as to how recovery-oriented care was defined and what particular services were the targets of implementation. Taken together, it is clear that the expansion of recovery-oriented services in acute inpatient settings would be greatly facilitated by a comprehensive operationalization of recovery-oriented inpatient care.
The current study describes the development and testing of a recovery-oriented services scale for acute inpatient mental health care: the Recovery-oriented Acute INpatient (RAIN) scale. The study team utilized extant recovery-oriented support materials from the Veterans Health Administration (VHA). The VHA is the nation’s largest integrated healthcare system and has invested heavily in the implementation of recovery-oriented services, most recently in inpatient mental health. The study team iteratively refined the VHA-derived scale based on data collection across 34 VA acute inpatient mental health units. The overall goal was to produce a psychometrically tested scale that was comprehensive, clearly operationalized, and grounded in the practices of a diverse set of inpatient units.
Methods
Initial Scale Creation
The study team set out to develop a measure of recovery-oriented acute inpatient mental health care. The starting point for our measure development was the Inpatient Mental Health Recovery Services Checklist (Department of Veterans Affairs, 2013), a self-assessment designed to assist sites in implementing elements of recovery-oriented care delineated in the VHA Uniform Services Handbook 1160.06 and VHA Inpatient Mental Health Care policy. The checklist includes 20 items (with sub-items) grouped topically (organizational, individualized clinical services, staff training and competency, and structural elements), on which program staff can rate each element of their own program as “met,” “in development,” or “not started.” Before the initial launch of the research study, we revised the items to better fit a typical fidelity assessment that could be rated by external experts: decreasing ambiguity of item wording, separating double-barreled items, and, whenever possible, providing objective rating criteria.
Sampling
In order to capture a diversity of inpatient units, an extreme-groups sampling approach was adopted. The project manager maintained a list of VA facilities with acute inpatient mental health units and rank ordered them based on outpatient follow-up rates for FY2017 in an attempt to diversify the sample on our key outcome measure (pertinent to subsequent analyses). All raters (including the principal investigator [PI]) were blinded to condition. The project manager (JG) provided the PI with a list that included the 17 highest and 17 lowest sites, which were then invited to participate. The PI then continued recruitment in batches representing the next highest/lowest sites until the target of 34 sites was reached. The enrollment rate was 34 of 87 (39%) sites contacted. There is no definitive list of unit leadership across VHA, so the study team searched pertinent VHA mailing lists and internet/intranet sites for appropriate personnel at targeted inpatient units. Using recruitment strategies based on Dillman (1978), successful in our prior work [author cites], the PI emailed individual invitations to all identified staff at each site. If no one from the site responded to the initial request, a follow-up email was sent 1–2 weeks later, then a final request was sent in another 1–2 weeks using another means (e.g., phone, instant message). Upon receiving a response indicating initial interest, several steps were taken to obtain necessary approvals prior to site participation, including a) securing agreement to participate from a staff person with authority to grant a site visit, b) seeking approval from the VAMC Medical Director or equivalent, which generally required c) some level of local Research and Development review.
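For readers who want the extreme-groups logic stated concretely, the following sketch illustrates the rank-and-tails selection under simple assumptions; the data frame, column names, and follow-up rates are invented for illustration only and are not study data.

```python
import pandas as pd

# Hypothetical facility list: one row per VA acute inpatient unit with its
# FY2017 outpatient follow-up rate (names and values are illustrative).
facilities = pd.DataFrame({
    "facility_id": [f"site_{i:03d}" for i in range(1, 141)],
    "fy2017_followup_rate": [0.35 + 0.004 * i for i in range(140)],
})

def extreme_groups(df: pd.DataFrame, rate_col: str, n_per_tail: int) -> pd.DataFrame:
    """Rank facilities on the follow-up rate and return the lowest and highest tails."""
    ranked = df.sort_values(rate_col)
    low = ranked.head(n_per_tail).assign(tail="low")
    high = ranked.tail(n_per_tail).assign(tail="high")
    return pd.concat([low, high])

# First recruitment batch: the 17 highest and 17 lowest follow-up sites.
batch_1 = extreme_groups(facilities, "fy2017_followup_rate", n_per_tail=17)
print(batch_1["tail"].value_counts())
```

Subsequent batches would simply repeat the same selection on the facilities not yet invited, mirroring the recruitment-in-batches procedure described above.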
Sampling within participating sites involved three (non-mutually exclusive) types of participants: key staff informants (phone interviews), Veterans (phone interviews), and staff and Veterans observed during on-site visits. Staff specifically invited to participate as key informants included program coordinators, unit nurse managers, medical directors/lead psychiatrists, social workers, and local recovery coordinators (a mental health provider at each facility tasked with developing and promoting access to recovery-oriented services). Key informants were also invited to suggest other staff with critical knowledge regarding the implementation of recovery-oriented services at their unit. Key informants were contacted up to three times: by e-mail (twice) and phone or instant message (at least once). About 3 months prior to a scheduled site visit, Veterans were identified for phone interviews based on recent discharge (60 days prior) from participating units. Veterans who were hospitalized again when called for recruitment were excluded from participation so that the same time period was maintained for prior hospital experience. The team mailed a recruitment letter to eligible Veterans, and up to three recruitment phone calls were attempted. Under-represented Veterans (racial minorities and women) were recruited first, and recruitment continued until at least five Veterans were recruited for each site. Site visits involved observations and brief interviews of some staff members and Veterans on the unit. Information about the study was posted on the unit in public areas and staff break rooms. Prior to the site visit, the main point of contact at each site was encouraged to distribute information to staff who would be working on the unit during the site visit. Staff and Veterans were informed they could opt out of participation by informing the study team, and verbal informed consent was obtained prior to brief interviews (a waiver of written documentation of informed consent was in place for brief onsite encounters). No identifying information was collected from participants during site visits.
Procedures
A primary rater was assigned for each site; a secondary and sometimes a tertiary rater were assigned in 22 of 34 site visits. We included two site visitors for the first 8 visits in order to develop a process, and later we included multiple site visitors for facilities that were large or had a complex service structure (e.g., several different subunits). The primary rater coordinated data collection, conducted semi-structured phone or in-person interviews with key informants, and, prior to conducting the site visit, reviewed the clinical records of at least five Veterans’ inpatient stays, facility-level summary data for the 6-month period prior to the site visit (e.g., number of discharges, length of stay, inpatient referrals for care), and Veteran interviews. If other site visitors were included, they read transcripts of the key informant and Veteran interviews prior to the visit. Site visits were scheduled at the site’s convenience and were conducted over approximately 1.5 to 2 days. Raters focused on passive observation of clinical activities (e.g., treatment team meetings; clinical encounters such as medication management, therapy groups, biopsychosocial assessments, nursing shift change meetings); milieu (informal interactions among staff, between staff and Veterans, physical environment); and informal conversations with Veterans currently on the unit. On the second day, the rater(s) met with staff to ask clarifying questions for data triangulation and to share initial impressions as a form of informal feedback.
Following data collection, the primary rater drafted a preliminary site summary including item scores, data supporting scoring, and overall notes on the unit, incorporating narrative notes and observations from other raters. Secondary and tertiary raters (when applicable) made independent notes and provided their own scores. A scoring meeting was conducted for each site with multiple site visit raters. During this meeting raters presented their data and preliminary scores to other site visitors on the research team to develop final scores for each item, based on consensus. When deemed necessary in order to reach a consensus score, the primary rater would collect additional information to finalize scoring.
We utilized several approaches to facilitate iterative and continued improvement of the scale. Scoring meetings were held with at least four of the team’s raters after each site visit. Raters included clinical psychologists and a project manager, all of whom have at least a decade of experience in studying and implementing recovery-oriented mental health services. Raters presented data supporting and contradicting the targeted site’s implementation of recovery-oriented care for each element of the rating scale. During each meeting, raters reflected on aspects of care provided on the unit that demonstrated recovery-oriented inpatient care and noted aspects that were not captured by the extant scale items. Additionally, methods notes were kept regarding what data were used to rate specific items. Periodically, the information derived from these meetings was collated and reviewed by the full team; these notes, diagramming, and periodic reviews of the literature were used to refine the overall scale structure and scoring procedures. Substantial revisions were made in three waves occurring after the 6th, 12th, and 15th sites. The research team sent revisions to partners within the VA Office of Mental Health and Suicide Prevention who were involved in the development of the VA policies and the Toolkit, as well as to a local inpatient program coordinator, to ensure changes were consistent with the VA mission and policy and would be usable by stakeholders. The final version of the RAIN scale used to score all sites in the current sample is described below. Following the finalization of this version, we rescored all sites, translating earlier scores to the final version of the scale.
This study was approved by the Indiana University (IU) Institutional Review Board and the Richard L. Roudebush Veterans Affairs Medical Center Research and Development Committee.
Measures
Key informant interviews were semi-structured and focused on general characteristics of the unit (number of beds, common presenting concerns, etc.), current implementation of each element on the RAIN scale (described below), and factors affecting implementation (Damschroder et al., 2020). Veteran interviews were semi-structured and focused on Veteran experience of RAIN items as well as the 20-item Patient Assessment of Chronic Illness Care-Inpatient Mental Health Services, a revised version of the Patient Assessment of Care for Chronic Conditions (Glasgow et al., 2005). The measure assesses patient perceptions of services in five areas (patient activation, goal-setting, problem-solving, delivery system, and care coordination) and has been used successfully with patients with mental health diagnoses (Cabassa et al., 2014; Gensichen et al., 2011). The scale has demonstrated excellent internal consistency and validity (Glasgow et al., 2005).
Administrative data from the VA Corporate Data Warehouse (CDW) were obtained for the 6 months prior to the site visit. CDW is a national repository of clinical and administrative data for the entire VHA; clinical data originate from the VHA electronic medical record. Veterans with an inpatient discharge from the acute mental health ward of interest at each site were identified. Data were aggregated at the facility level to provide information on selected components of care that were used by the study team to inform item rating. Information included total number of admissions, number of Veterans with unique admissions, lengths of stay, mental health treatment plan completion rate, individual psychotherapy encounters, and group therapy encounters.
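As an illustration of the kind of facility-level aggregation described above, the short pandas sketch below computes comparable summaries from a discharge-level extract; the column names and rows are hypothetical assumptions and do not reflect the actual CDW schema.

```python
import pandas as pd

# Hypothetical discharge-level extract (illustrative columns, not CDW fields).
discharges = pd.DataFrame({
    "facility_id": ["A", "A", "A", "B", "B"],
    "veteran_id": [1, 2, 1, 3, 4],
    "length_of_stay_days": [6, 12, 4, 9, 3],
    "tx_plan_completed": [1, 1, 0, 1, 1],
    "group_therapy_encounters": [5, 9, 2, 7, 1],
})

# Aggregate to the facility level for the review window, mirroring the kinds of
# summaries raters reviewed: total admissions, unique Veterans, mean length of
# stay, treatment plan completion rate, and group therapy volume.
facility_summary = discharges.groupby("facility_id").agg(
    total_admissions=("veteran_id", "size"),
    unique_veterans=("veteran_id", "nunique"),
    mean_length_of_stay=("length_of_stay_days", "mean"),
    tx_plan_completion_rate=("tx_plan_completed", "mean"),
    group_encounters=("group_therapy_encounters", "sum"),
)
print(facility_summary)
```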
The final scale resulting from the iterative revisions, the Recovery-oriented Acute INpatient (RAIN) scale, consists of 23 items, each corresponding to an element of recovery-oriented inpatient care. Each item is rated based on the quality of the element demonstrated at the site and the consistency with which the element is provided across Veterans served. Items are rated on a 5-point ordinal scale with standard anchors (2.0 = excellent quality and consistency, deviations or deficits rare; 1.5 = good quality and consistency, some deviations or minor deficits; 1.0 = regular deficits in consistency OR quality; 0.5 = regular deficits in consistency AND quality; 0.0 = little or no evidence of the element), except for 8 items for which these anchors did not fit (see Supplementary Item A).
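To make the rating scheme concrete, here is a minimal sketch that encodes the standard anchors and computes a site-level mean of item ratings (the results below report item, subscale, and overall means); the item names and scores are hypothetical, not study data.

```python
from statistics import mean

# Standard RAIN anchors as described above (labels paraphrase the scale text).
ANCHORS = {
    2.0: "Excellent quality and consistency (deviations or deficits rare)",
    1.5: "Good quality and consistency (some deviations or minor deficits)",
    1.0: "Regular deficits in consistency OR quality",
    0.5: "Regular deficits in consistency AND quality",
    0.0: "Little or no evidence of the element",
}

# Hypothetical item ratings for one site; a summary score is the mean of the
# 0-2 item ratings.
site_scores = {"recovery_goal_setting": 1.0, "written_treatment_plan": 1.5,
               "sdm_medication": 0.5, "peer_support": 2.0}

assert all(score in ANCHORS for score in site_scores.values())
print(f"Site mean: {mean(site_scores.values()):.2f}")
```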
Analyses
We examined descriptive statistics and histograms for each item to identify non-normal distributions, ceiling/floor effects, and other trends. We then conducted a series of confirmatory factor analyses, which were appropriate for testing whether the observed variables represent hypothesized underlying latent constructs. First, a one-factor solution was tested to determine the extent to which the observed variables represent one latent construct corresponding to recovery-oriented care. A four-factor solution was then tested to determine whether components of recovery-oriented care corresponded to separate latent constructs. Chi-square, standardized root mean square residual (SRMR), adjusted goodness-of-fit index (AGFI), root mean square error of approximation (RMSEA), and Bentler comparative fit index (CFI) values were compared between solutions to select the best-fitting model. Lagrange multiplier test output was inspected to detect error covariances that, when estimated, improve model fit (Suhr, 1997). In the final model, six error covariances were estimated (item 14 with items 5, 8, and 19; item 4 with items 8 and 18; item 18 with item 23). Analyses were conducted using SAS software, version 9.4 (SAS Institute Inc., Cary, NC, USA).
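The analyses themselves were run in SAS (PROC CALIS, per Suhr, 1997). Purely as a rough illustration, the sketch below shows how an analogous four-factor CFA with correlated errors and full-information maximum likelihood estimation could be specified using the open-source semopy package. The column names are invented shorthands for the retained items, the factor assignments follow Table 3, and the single error covariance shown only illustrates the syntax; the model reported in the paper estimated the six covariances listed above.

```python
import pandas as pd
import semopy

# Lavaan-style description of the revised four-factor RAIN model.
# Variable names are illustrative shorthands (assumptions), not actual item labels.
MODEL_DESC = """
inpatient_planning  =~ goal_setting + written_plan + sdm_meds + sdm_inpatient + idt + family_involvement
outpatient_planning =~ sdm_outpatient + care_coordination + least_restrictive_dc + in_reach
group_programming   =~ group_volume + group_domains + programming_support + hq_programming
milieu              =~ warm_unit + autonomy + respectful_interactions + least_restrictive_behavior
respectful_interactions ~~ warm_unit
"""

def fit_rain_cfa(df: pd.DataFrame) -> pd.DataFrame:
    """Fit the four-factor CFA with FIML (to handle the few missing item scores)
    and return standard fit indices (chi-square, RMSEA, CFI, AGFI, etc.)."""
    model = semopy.Model(MODEL_DESC)
    model.fit(df, obj="FIML")
    return semopy.calc_stats(model)
```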
Results
Sample
The 34 participating acute inpatient mental health units were geographically diverse (representing 16 of 18 Veterans Integrated Service Networks, which are geographically based organizational units, and every major region of the country) and included rural (n=3, 9%) and urban (n=31, 91%) settings (as classified by Veterans’ home zip codes) (Kaboli & Glasgow, 2011). An average of 4.4 (s.d. = 0.98) key informant interviews were conducted for each site (range 3 to 7). Key informants represented the nursing, psychology, psychiatry, social work, peer support, and physical medicine/rehabilitation disciplines. An average of 5.7 (s.d. = 1.04) Veteran interviews were conducted for each site (range 4 to 9). Over half of Veterans interviewed were White (56%) and one-third were Black or African American (33%); a small number were Hispanic or Latino (9%). The majority of participating Veterans were male (79%).
Item-Level Scores
Initial item scores are displayed in Table 1. Several items demonstrated non-normal distributions. In two cases, this occurred artificially, based on scoring rules developed by the team, and could therefore be revised to increase normality. For Item 13 (Sufficient Group Variety), we rescored based on the proportion of the nine domains present at the site, multiplied by 2 so that the theoretical range (0–2) would be the same as for all other items. The revised item was more normally distributed but still substantially negatively skewed. For Item 20 (Multiple Disciplines Represented), we rescored based on the average adequacy score (0–2) across disciplines. The revised item had a wider distribution but still clustered within the 1.5–1.9 range (a brief arithmetic sketch of these two rescorings follows Table 1). Distributions for four additional items were non-normal but could not be normalized by addressing artificial scoring rules: Items 9 (Integrated Comorbid Physical Health) and 11 (Suicide Prevention) suffered from ceiling effects, and Items 10 (Individual Evidence-Based Psychotherapy) and 21 (Peer Support) were bimodal, with each element either present or absent.
Table 1.
Item | 0, n (%) | .5, n (%) | 1.0, n (%) | 1.5, n (%) | 2.0, n (%) | Mean | S.D. |
---|---|---|---|---|---|---|---|
RAIN Mean Revised | - | - | - | - | - | 1.21 | .22 |
Inpatient Treatment Planning | - | - | - | - | - | .87 | .35 |
Recovery-oriented goal setting | 5 (14.7%) | 13 (38.2%) | 10 (29.4%) | 5 (14.7%) | 1 (2.9%) | .77 | .51 |
Written Treatment Plan | 2 (5.9%) | 6 (17.6%) | 20 (58.8%) | 6 (17.6%) | 0 (0%) | .94 | .38 |
SDM for Medication Management | 8 (23.5%) | 10 (29.4%) | 9 (26.5%) | 6 (17.6%) | 1 (2.9%) | .74 | .57 |
SDM for Inpatient Treatment | 10 (29.4%) | 16 (47.1%) | 6 (17.6%) | 2 (5.9%) | 0 (0%) | .50 | .43 |
Interdisciplinary Treatment Team | 3 (8.8%) | 8 (23.5%) | 8 (23.5%) | 10 (29.4%) | 5 (14.7%) | 1.09 | .61 |
Family/Significant Other Involvement | 0 (0%) | 6 (17.6%) | 16 (47.1%) | 4 (11.8%) | 8 (23.5%) | 1.21 | .52 |
Outpatient Treatment Planning | - | - | - | - | - | 1.20 | .35 |
SDM for Outpatient Treatment | 0 (0%) | 10 (31.3%) | 12 (37.5%) | 7 (21.9%) | 3 (9.4%) | 1.05 | .48 |
Outpatient Care Coordination | 0 (0%) | 6 (17.6%) | 6 (17.6%) | 13 (38.2%) | 9 (26.5%) | 1.37 | .53 |
Least Restrictive Discharge | 0 (0%) | 10 (29.4%) | 10 (29.4%) | 10 (29.4%) | 4 (11.8%) | 1.12 | .51 |
In-Reach | 0 (0%) | 4 (11.8%) | 15 (44.1%) | 9 (26.5%) | 6 (17.6%) | 1.25 | .46 |
Group Programming | - | - | - | - | - | 1.38 | .34 |
Sufficient Volume of Group Programming | 0 (0%) | 3 (8.8%) | 10 (29.4%) | 10 (29.4%) | 11 (32.4%) | 1.43 | .49 |
Group Dimensions (Revised)ᵃ | 0 (0%) | 3 (8.8%) | 6 (17.6%) | 13 (38.2%) | 12 (35.3%) | 1.61 | .41 |
Support for Programming | 4 (11.8%) | 6 (17.6%) | 14 (41.2%) | 6 (17.6%) | 4 (11.8%) | 1.00 | .58 |
High Quality Programming | 0 (0%) | 2 (5.9%) | 6 (17.6%) | 17 (50%) | 9 (26.5%) | 1.49 | .42 |
Milieu | - | - | - | - | - | 1.36 | .36 |
Warm & Inviting Unit | 1 (2.9%) | 8 (23.5%) | 9 (26.5%) | 12 (35.3%) | 4 (11.8%) | 1.15 | .53 |
Autonomy Promoting Environment | 1 (2.9%) | 3 (8.8%) | 12 (35.3%) | 11 (32.4%) | 7 (20.6%) | 1.29 | .51 |
Respectful Therapeutic Interactions | 1 (2.9%) | 5 (14.7%) | 8 (23.5%) | 10 (29.4%) | 10 (29.4%) | 1.34 | .57 |
Behavior Managed Through Least Restrictive Means | 0 (0%) | 2 (6.1%) | 2 (6.1%) | 11 (33.3%) | 18 (54.5%) | 1.68 | .43 |
Non-Factor Items | - | - | - | - | - | | |
Integrated Care for Comorbid Physical Health | 0 (0%) | 0 (0%) | 2 (5.9%) | 7 (20.6%) | 25 (73.5%) | 1.84 | .29 |
Individual Evidence-Based Psychotherapy | 17 (50%) | 3 (8.8%) | 11 (32.4%) | 3 (8.8%) | 0 (0%) | 0.50 | .55 |
Suicide Prevention | 0 (0%) | 0 (0%) | 2 (5.9%) | 8 (23.5%) | 24 (70.6%) | 1.82 | .30 |
Disciplines’ Subjective Adequacy Revisedᵃ | 0 (0%) | 0 (0%) | 6 (17.6%) | 23 (67.6%) | 5 (14.7%) | 1.66 | .24 |
Peer Support | 5 (14.7%) | 8 (23.5%) | 8 (23.5%) | 3 (8.8%) | 10 (29.4%) | 1.07 | .73 |
Notes: SDM = Shared Decision-Making
ᵃ Frequencies for the revised items are given by score bin: 0 (0.0–.49), .5 (.5–.99), 1 (1.0–1.49), 1.5 (1.5–1.99), and 2 (2.0).
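The two rescorings noted above (Items 13 and 20) are simple arithmetic; the following is a minimal sketch with hypothetical inputs, not study data.

```python
# Item 13 (Sufficient Group Variety): proportion of the nine group domains
# present at the site, multiplied by 2 so the range matches the other 0-2 items.
def rescore_group_variety(domains_present: int, total_domains: int = 9) -> float:
    return 2 * domains_present / total_domains

# Item 20 (Multiple Disciplines Represented): mean adequacy rating (0-2)
# across the disciplines assessed for the unit (discipline names are hypothetical).
def rescore_disciplines(adequacy_by_discipline: dict[str, float]) -> float:
    return sum(adequacy_by_discipline.values()) / len(adequacy_by_discipline)

print(rescore_group_variety(7))  # ~1.56
print(rescore_disciplines({"nursing": 2.0, "psychology": 1.5, "psychiatry": 1.5,
                           "social_work": 2.0, "peer_support": 1.0}))  # 1.6
```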
Confirmatory Factor Analyses
A confirmatory factor analysis was undertaken to examine the factor structure of the RAIN scale. Both one-factor and four-factor solutions were tested. A second four-factor solution was also tested that removed items with either a ceiling effect or other problematic characteristics. The four factors corresponded to the following four subscales, which are described below: 1) Inpatient Treatment Planning, 2) Outpatient Treatment Planning, 3) Group Programming, and 4) Milieu. Three sites had partial data with one missing score each (full information maximum likelihood [FIML] estimation was used to account for missing data). Goodness-of-fit indicators were compared among the one-factor, initial four-factor, and revised four-factor solutions (see Table 2 for fit statistics). Overall, fit indices indicated poor to good fit across the three solutions (Schreiber et al., 2006), with the revised four-factor model demonstrating the best fit. The final model had a non-significant chi-square value (141.2652, df = 123, p = .12), suggesting good fit, unlike the other two tested solutions. For the one-factor solution, overall internal consistency was moderately high (Cronbach’s α = .85), suggesting that all items reflect to a greater or lesser extent the presence of recovery-oriented care; item-to-total correlations ranged from .13 to .71.
Table 2.
Fit statistic | One-factor (23 items) | Four-factor (23 items) | Four-factor (18 items) |
---|---|---|---|
Chi-square (degrees of freedom) | 423.6790 (230), p<.01 | 394.5476 (224), p<.01 | 141.2652 (123), p=.12 |
Standardized root mean square residual (SRMR) | 0.1579 | 0.1582 | 0.1224 |
Adjusted goodness-of-fit index (AGFI) | 0.4364 | 0.4506 | 0.6503 |
Root mean square error of approximation (RMSEA) (90% CI) | 0.1574 (0.1337, 0.1807) | 0.1496 (0.1251, 0.1737) | 0.066 (0, 0.11) |
Bentler comparative fit index (CFI) | 0.3058 | 0.3887 | 0.89 |
Discussion
The scale presented in this paper represents a substantive step forward in conceptualizing and operationalizing recovery-oriented care in the context of acute inpatient mental health care. Across the course of the project, substantial changes were made to produce a scale grounded in extant conceptualizations of inpatient recovery-oriented care as well as application in practice. Below we review the sub-domains of recovery-oriented inpatient care as revealed by our iterative revisions and factor analyses (inpatient treatment planning, outpatient treatment planning, group programming, and milieu) in view of prior literature (see Table 3).
Table 3.
Item | Inpatient Treatment Planning | Outpatient Treatment Planning | Group Programming | Milieu |
---|---|---|---|---|
Recovery Goal-Setting | 0.82 | | | |
Written Treatment Plan | 0.45 | | | |
SDM for Medication Management | 0.62 | | | |
SDM for Inpatient Treatment | 0.43 | | | |
Interdisciplinary Treatment Team | 0.64 | | | |
Family/Significant Other Involvement | 0.58 | | | |
SDM for Outpatient Treatment | | 0.77 | | |
Outpatient Care Coordination | | 0.51 | | |
Least Restrictive Discharge | | 0.57 | | |
In-Reach | | 0.43 | | |
Sufficient Volume of Group Programming | | | 0.70 | |
Group Domains (%) | | | 0.42 | |
Support for Programming | | | 0.72 | |
High-Quality Programming | | | 0.50 | |
Warm and Inviting Unit | | | | 0.44 |
Autonomy-Promoting Milieu | | | | 0.56 |
Respectful and Therapeutic Interactions | | | | 0.79 |
Behavior Managed Through Least Restrictive Means | | | | 0.54 |
Inpatient treatment planning encompasses the collaborative development of a plan for treatment during hospitalization centered on the patient’s long-term recovery goals. The patient is considered the center of an interdisciplinary treatment team, along with family and significant others. This domain is firmly rooted in various conceptualizations of recovery, including SAMHSA’s second recovery principle: “Recovery is person driven – each person is ultimately in charge of their own recovery, setting goals and creating a path to achieve them” (SAMHSA, 2016). Moreover, conceptualizations grounded in patient reports also include related concepts such as patient co-influence (Hansson et al., 1993), autonomy (Hopkins et al., 2009), and empowerment (Pitkänen et al., 2008). Furthermore, several authors have noted the importance of communication between disciplines, the patient, and the patient’s support network (e.g., family) (Ahmed et al., 2013; Hansson et al., 1993; Hopkins et al., 2009; McDonagh et al., 2019; Pitkänen et al., 2008; Rabenschlag et al., 2014).
Outpatient treatment planning is similar to inpatient treatment planning but differs in targeted treatment content. These items are concerned with capitalizing on the inpatient stay to join the patient in reassessing an appropriate post-discharge, outpatient plan of care; to provide information about options; and to develop a plan adequate to ensure the outpatient care is utilized. Providers (Cleary et al., 2013; McKenna et al., 2014; Sowers, 2005) and patients (Nolan et al., 2011; Walsh & Boyle, 2009) have acknowledged the important, but often omitted or short-changed, process of providing adequate information to the patient regarding the outpatient treatment options available and planned. Indeed, patient perceptions of knowledge about outpatient services, provided while inpatient, have been associated with higher satisfaction (Bowersox et al., 2013), and appropriate discharge planning was predictive of fewer readmissions (Lien, 2002). Regarding in-reach, continuity of providers across inpatient and outpatient care has been associated with less rehospitalization (Omer et al., 2015). Moreover, Boyer et al. (2000) found that communication between the inpatient and outpatient treatment teams and starting outpatient programming while the patient is still hospitalized were associated with increased likelihood of attending outpatient care once discharged. As a single system of care capable of providing both inpatient and outpatient mental health services and sharing a common medical record, VA certainly has the potential to improve this aspect of recovery-oriented care.
The group programming elements of our scale include the provision of an adequate quantity of groups with a diverse topic range, skillfully administered and supported by unit staff. Providers (Sowers, 2005) and patients (Chang et al., 2018; Foye et al., 2020; Hom et al., 2020; Waldemar et al., 2018) have noted the importance of inpatient programming, and several published efforts to implement recovery-oriented inpatient care have focused at least in part on group programming (Ahmed et al., 2013; McDonagh et al., 2019). In Hansson and colleagues’ (1993) content analysis of patient-desired elements of inpatient care, content of treatment (particularly psychosocial content) was the second most commonly cited theme. It is important to note that seminal work by Rosenhan (1973) highlighted the boredom and lack of therapeutic contact associated with inpatient mental health stays. Unfortunately, this state has not improved in some cases, leading one patient to comment on the lack of therapeutic programming: “it was like I was on vacation” (Hom et al., 2020, p. 9). Despite consensus on their importance, Cook and colleagues (2014) noted in their review that most group therapy models are constructed for long-term therapy with homogeneous groups of patients, a fundamental mismatch for acute inpatient units serving diverse patients with short stays. In response, they integrated extant literature on group therapy to develop a group therapy model appropriate for inpatient care. Furthermore, additional data are needed regarding how group programming and other elements of recovery-oriented inpatient care (e.g., recovery goal setting) can be mutually supportive.
Finally, the milieu elements concern various items aimed at creating a safe, therapeutic, and autonomy-supporting environment. These factors are consistent with several of the World Health Organization’s elements of responsiveness (Hopkins et al., 2009): respect for dignity, autonomy, prompt attention, and quality of amenities. Moreover, they are consistent with patients’ preferences (Hansson et al., 1993). Patients appreciate access to “normal” activities such as newspapers, contact with the outside world, and outdoor activities (Walsh & Boyle, 2009), but Foye and colleagues (2020) note that the restrictive nature of inpatient wards is often a barrier to self-directed engagement in such normal activities. Professional recommendations emphasize the need to manage crises with minimal use of seclusion and restraint (Ahmed et al., 2013; Parameswaran et al., 2015; Sowers, 2005). Nursing is typically the main discipline charged with the difficult task of balancing the competing demands on the inpatient milieu (e.g., safety vs. recovery) (Cleary et al., 2013).
Several elements did not fit within the four factors reviewed above. While these items may conceptually fit within the extant domains, scaling issues may have prevented them from loading. For instance, individual, evidence-based psychotherapy can serve many of the same functions as group therapy; however, the low rate at which individual therapy was offered prevented sufficient variance in the sample. Conversely, robust suicide prevention strategies, such as connecting at-risk Veterans with suicide risk prevention teams and creating a suicide prevention plan, constitute a clear VA imperative. However, from a psychometric standpoint, variance was low: fewer than 6% of sites scored below 1.5 on this item. The integrated care for comorbid physical health item also showed little variability, with 94% of sites scoring a 1.5 or 2. While improving integration of physical and mental health services for people with mental illness in outpatient settings has required legislative policy and practice transformation over decades and is still a work in progress (Druss & Goldman, 2018), inpatient units, which are heavily staffed with nurses and physicians and have access to other medical center services, may need little encouragement to address physical health comorbidities. The peer support and multiple disciplines represented items were both scored with respect to unit staffing. While certain roles are almost exclusively held by one discipline (e.g., prescribing by psychiatry and nurse practitioners), which discipline participates in other functions is much more idiosyncratic. Measurement of recovery-oriented services would likely be better served by focusing not on the presence of certain disciplines (i.e., the form of the team) but rather on who accomplishes specific, recovery-oriented tasks (McGuire et al., 2020).
Taken together, this work and the resulting scale represent an important step in operationalizing recovery-oriented inpatient care. Nonetheless, our approach is not without limitations. The purpose of the current study was to develop and iteratively refine a scale of recovery-oriented inpatient mental health care; a full study of the psychometrics of the scale was beyond the scope of the current work. Future work should examine additional psychometric properties such as inter-rater reliability and validity. To the last point, work is ongoing to test the predictive validity of the scale (relationship with Veteran outcomes). An additional limitation is that the scale was refined and tested solely in the context of VHA inpatient units and therefore requires validation in community inpatient units. Importantly, while multiple stakeholder perspectives were obtained to make ratings, differences in perceived element implementation were not systematically tracked. Future work should examine differences in perspective across disciplines within sites and their potential impact on shared or dissimilar understanding of implementation. Finally, the data collection used to score sites was very time-intensive and likely not feasible outside the context of a research study; additional work aimed at reducing participant and scorer burden would be valuable.
Conclusion
While applying recovery principles to inpatient care has lagged behind outpatient care, substantial work has provided perspectives from multiple stakeholders and examples of small-scale attempts to implement recovery-oriented care. The RAIN scale provides a clear conceptualization of recovery-oriented inpatient care, grounded in prior theory and current practice, with operationalized elements. While additional work is necessary to further revise the scale and test its validity (e.g., association with outcomes), the scale can provide a valuable tool for stakeholders wishing to develop, improve, and test recovery-oriented inpatient services.
Supplementary Material
Impact and Implications.
This study focused on developing a scale to measure recovery-oriented inpatient mental health care. The resulting 23-item scale was consistent with prior literature and comprised four general areas of recovery-oriented inpatient care: inpatient treatment planning, outpatient treatment planning, group programming, and milieu. This scale can be used to improve the recovery orientation of inpatient programming.
Acknowledgments
This work was supported by a grant from VA Health Services Research and Development (HSR&D, IIR 15-300 [$1,099,814]) and by the Department of Veterans Affairs, Health Services Research & Development Center for Health Information and Communication (CIN 13-416). The views expressed in this article are those of the authors and do not necessarily represent the views of the U.S. Department of Veterans Affairs.
The authors wish to thank Scott Patterson, Ph.D., HSPP for his assistance with this project.
Footnotes
We have no known conflict of interest to disclose.
References
- Ahmed AO, Serdarevic M, Mabe PA, & Buckley PF (2013). Triumphs and challenges of transforming a state psychiatric hospital in Georgia. International Journal of Mental Health Promotion, 15(2), 68–75.
- Anthony WA (1993). Recovery from mental illness: the guiding vision of the mental health service system in the 1990s. Psychosocial Rehabilitation Journal, 16(4), 11.
- Bond G, Williams J, Evans L, Salyers M, Kim H, Sharpe H, & Leff HS (2000). Psychiatric rehabilitation fidelity toolkit. Cambridge, MA: Human Services Research Institute.
- Bond GR, Becker DR, Drake RE, & Vogler KM (1997). A fidelity scale for the individual placement and support model of supported employment. Rehabilitation Counseling Bulletin, 40, 265–284.
- Bond GR, & Drake RE (2020, November). Assessing the fidelity of evidence-based practices: History and current status of a standardized measurement methodology. Adm Policy Ment Health, 47(6), 874–884. 10.1007/s10488-019-00991-6
- Bond GR, McHugo GJ, Becker DR, Rapp CA, & Whitley R (2008, Spring). Fidelity of supported employment: lessons learned from the National Evidence-Based Practice Project. Psychiatr Rehabil J, 31(4), 300–305. 10.2975/31.4.2008.300.305
- Bond GR, Peterson AE, Becker DR, & Drake RE (2012, August). Validation of the Revised Individual Placement and Support Fidelity Scale (IPS-25). Psychiatr Serv, 63(8), 758–763. 10.1176/appi.ps.201100476
- Bowersox NW, Bohnert AS, Ganoczy D, & Pfeiffer PN (2013). Inpatient psychiatric care experience and its relationship to posthospitalization treatment participation. Psychiatric Services, 64(6), 554–562.
- Boyer CA, McAlpine DD, Pottick KJ, & Olfson M (2000). Identifying risk factors and key strategies in linkage to outpatient psychiatric care. American Journal of Psychiatry, 157(10), 1592–1598.
- Cabassa LJ, Gomes AP, Meyreles Q, Capitelli L, Younge R, Dragatsi D, Alvarez J, Nicasio A, Druss B, & Lewis-Fernández R (2014, November). Primary health care experiences of hispanics with serious mental illness: a mixed-methods study. Adm Policy Ment Health, 41(6), 724–736. 10.1007/s10488-013-0524-2
- Chang B-H, Geller JL, & Biebel K (2018). Recovery services and outcomes in a state psychiatric hospital. Psychiatric Quarterly, 89(3), 707–716. https://link.springer.com/article/10.1007/s11126-018-9570-y
- Cleary M, Horsfall J, O’Hara-Aarons M, & Hunt GE (2013). Mental health nurses’ views of recovery within an acute setting. International Journal of Mental Health Nursing, 22(3), 205–212. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1447-0349.2012.00867.x
- Cook JA, Copeland ME, Jonikas JA, Hamilton MM, Razzano LA, Grey DD, Floyd CB, Hudson WB, Macfarlane RT, Carter TM, & Boyd S (2012, June). Results of a randomized controlled trial of mental illness self-management using Wellness Recovery Action Planning. Schizophr Bull, 38(4), 881–891. 10.1093/schbul/sbr012
- Cook WG, Arechiga A, Dobson LAV, & Boyd K (2014). Brief heterogeneous inpatient psychotherapy groups: A process-oriented psychoeducational (POP) model. International Journal of Group Psychotherapy, 64(2), 180–206.
- Cuddeback GS, Morrissey JP, Domino ME, Monroe-DeVita M, Teague GB, & Moser LL (2013, April 1). Fidelity to recovery-oriented ACT practices and consumer outcomes. Psychiatr Serv, 64(4), 318–323. 10.1176/appi.ps.201200097
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science, 4(1), 1–15.
- Damschroder LJ, Reardon CM, & Lowery JC (2020). The Consolidated Framework for Implementation Research (CFIR). In Handbook on Implementation Science. Edward Elgar Publishing.
- Deegan PE (1988). Recovery: The lived experience of rehabilitation. Psychosocial Rehabilitation Journal, 11(4), 11.
- Department of Veterans Affairs, V. H. A. (2013). VHA Handbook 1160.01, Uniform Mental Health Services in VA Medical Centers and Clinics, and VHA Handbook 1163.01, Psychosocial Rehabilitation and Recovery Services.
- Dillman DA (1978). Mail and telephone surveys: The total design method (Vol. 19). Wiley; New York.
- Druss BG, & Goldman HH (2018, December 1). Integrating health and mental health services: A past and future history. Am J Psychiatry, 175(12), 1199–1204. 10.1176/appi.ajp.2018.18020169
- Egeland KM, Heiervang KS, Landers M, Ruud T, Drake RE, & Bond GR (2019). Psychometric properties of a fidelity scale for Illness Management and Recovery. Administration and Policy in Mental Health and Mental Health Services Research, 1–9.
- Evans L, Wewiorski NJ, Ellison ML, Ni P, Harvey KLL, Hunt MG, Gorman JA, & Charns MP (2020, June 1). Development and validation of an instrument to measure staff perceptions of recovery climate and culture in mental health programs. Psychiatr Serv, 71(6), 570–579. 10.1176/appi.ps.201900181
- Foye U, Li Y, Birken M, Parle K, & Simpson A (2020). Activities on acute mental health inpatient wards: A narrative synthesis of the service users’ perspective. Journal of Psychiatric and Mental Health Nursing.
- Gensichen J, Serras A, Paulitsch MA, Rosemann T, König J, Gerlach FM, & Petersen JJ (2011, August). The Patient Assessment of Chronic Illness Care questionnaire: evaluation in patients with mental disorders in primary care. Community Ment Health J, 47(4), 447–453. 10.1007/s10597-010-9340-2
- Glasgow RE, Wagner EH, Schaefer J, Mahoney LD, Reid RJ, & Greene SM (2005, May). Development and validation of the Patient Assessment of Chronic Illness Care (PACIC). Med Care, 43(5), 436–444. 10.1097/01.mlr.0000160375.47920.8c
- Hamann J, Cohen R, Leucht S, Busch R, & Kissling W (2007). Shared decision making and long-term outcome in schizophrenia treatment. The Journal of Clinical Psychiatry.
- Hansson L, Bjorkman T, & Berglund I (1993). What is important in psychiatric inpatient care? Quality of care from the patient’s perspective. International Journal for Quality in Health Care, 5(1), 41–47.
- Hogan MF (2003). New Freedom Commission report: The president’s New Freedom Commission: recommendations to transform mental health care in America. Psychiatric Services, 54(11), 1467–1474.
- Hom MA, Bauer BW, Stanley IH, Boffa JW, Stage DRL, Capron DW, Schmidt NB, & Joiner TE (2020). Suicide attempt survivors’ recommendations for improving mental health treatment for attempt survivors. Psychological Services.
- Hopkins J, Loeb S, & Fick D (2009). Beyond satisfaction, what service users expect of inpatient mental health care: a literature review. Journal of Psychiatric and Mental Health Nursing, 16(10), 927–937. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-2850.2009.01501.x
- Jacobson N (2004). The making of mental health policy. VU University Press.
- Jacobson N, & Greenley D (2001). What is recovery? A conceptual model and explication. Psychiatric Services, 52(4), 482–485.
- Kaboli PJ, & Glasgow JM (2011). VAMC facility rurality: Comparison of three classification approaches. Veterans Administration Office of Rural Health. Washington, DC: Department of Veterans Affairs.
- Koop JI, Rollins AL, Bond GR, Salyers MP, Dincin J, Kinley T, Shimon SM, & Marcelle K (2004). Development of the DPA fidelity scale: Using fidelity to define an existing vocational model. Psychiatric Rehabilitation Journal, 28(1), 16.
- Lien L (2002). Are readmission rates influenced by how psychiatric services are organized? Nord J Psychiatry, 56(1), 23–28. 10.1080/08039480252803873
- Malinovsky I, Lehrer P, Silverstein SM, Shankman SA, O’Brien W, Samuelson T, & Van Nostrand G (2013). An empirical evaluation of recovery transformation at a large community psychiatric rehabilitation organization. Psychological Services, 10(4), 428.
- McDonagh JG, Haren WB, Valvano M, Grubaugh AL, Wainwright FC, Rhue CH, Pelic CM, Pelic CG, Koval R, & York JA (2019). Cultural change: Implementation of a recovery program in a Veterans Health Administration medical center inpatient unit. Journal of the American Psychiatric Nurses Association, 25(3), 208–217. https://journals.sagepub.com/doi/10.1177/1078390318786024?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dpubmed
- McGuire AB, Powell KG, Treitler PC, Wagner KD, Smith KP, Cooperman N, Robinson L, Carter J, Ray B, & Watson DP (2020). Emergency department-based peer support for opioid use disorder: Emergent functions and forms. Journal of Substance Abuse Treatment, 108, 82–87. https://www.journalofsubstanceabusetreatment.com/article/S0740-5472(19)30082-0/pdf
- McGuire AB, Stull LG, Mueser KT, Santos M, Mook A, Rose N, Tunze C, White LM, & Salyers MP (2012). Development and reliability of a measure of clinician competence in providing illness management and recovery. Psychiatric Services, 63(8), 772–778.
- McGuire AB, White DA, Bartholomew T, Flanagan ME, McGrew JH, Rollins AL, Mueser KT, & Salyers MP (2017). The relationship between provider competence, content exposure, and consumer outcomes in illness management and recovery programs. Administration and Policy in Mental Health and Mental Health Services Research, 44(1), 81–91.
- McKenna B, Furness T, Dhital D, Ennis G, Houghton J, Lupson C, & Toomey N (2014). Recovery-oriented care in acute inpatient mental health settings: An exploratory study. Issues in Mental Health Nursing, 35(7), 526–532. https://www.tandfonline.com/doi/full/10.3109/01612840.2014.890684
- McLoughlin KA, & Fitzpatrick JJ (2008). Self-reports of recovery-oriented practices of mental health nurses in state mental health institutes: Development of a measure. Issues in Mental Health Nursing, 29(10), 1051–1065.
- Monroe-DeVita M, Teague GB, & Moser LL (2011, January-February). The TMACT: a new tool for measuring fidelity to assertive community treatment. J Am Psychiatr Nurses Assoc, 17(1), 17–29. 10.1177/1078390310394658
- Moser LL, Monroe-DeVita M, & Teague GB (2013). Evaluating integrated treatment within assertive community treatment programs: A new measure. Journal of Dual Diagnosis, 9(2), 187–194. 10.1080/15504263.2013.779480
- Mueser K, Gingerich S, Bond G, Campbell K, & Williams J (2002). Illness management and recovery fidelity scale. Illness Management and Recovery Implementation Resource Kit. Edited by Mueser KT, Gingerich S. Rockville, MD: Substance Abuse and Mental Health Services Administration.
- Mueser KT, Bond GR, Drake RE, & Resnick SG (1998). Models of community care for severe mental illness: a review of research on case management. Schizophrenia Bulletin, 24(1), 37–74.
- Mueser KT, Meyer PS, Penn DL, Clancy R, Clancy DM, & Salyers MP (2006). The Illness Management and Recovery program: rationale, development, and preliminary findings. Schizophrenia Bulletin, 32(suppl_1), S32–S43.
- Nolan P, Bradley E, & Brimblecombe N (2011). Disengaging from acute inpatient psychiatric care: a description of service users’ experiences and views. Journal of Psychiatric and Mental Health Nursing, 18(4), 359–367. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-2850.2010.01675.x
- Omer S, Priebe S, & Giacco D (2015). Continuity across inpatient and outpatient mental health care or specialisation of teams? A systematic review. European Psychiatry, 30(2), 258–270. https://www.sciencedirect.com/science/article/abs/pii/S0924933814001448?via%3Dihub
- Parameswaran SG, Spaeth-Rublee B, & Pincus HA (2015). Measuring the quality of mental health care: consensus perspectives from selected industrialized countries. Administration and Policy in Mental Health and Mental Health Services Research, 42(3), 288–295. https://link.springer.com/article/10.1007%2Fs10488-014-0569-x
- Pitkänen A, Hätönen H, Kuosmanen L, & Välimäki M (2008). Patients’ descriptions of nursing interventions supporting quality of life in acute psychiatric wards: a qualitative study. International Journal of Nursing Studies, 45(11), 1598–1606. https://www.sciencedirect.com/science/article/abs/pii/S0020748908000643?via%3Dihub
- Rabenschlag F, Konrad A, Rueegg S, & Jaeger M (2014). A recovery-oriented approach for an acute psychiatric ward: is it feasible and how does it affect staff satisfaction? Psychiatric Quarterly, 85(2), 225–239. https://link.springer.com/article/10.1007%2Fs11126-013-9285-z
- Rosenhan DL (1973). On being sane in insane places. Science, 179(4070), 250–258. https://science.sciencemag.org/content/179/4070/250.long
- Salyers MP, Tsai J, & Stultz TA (2007). Measuring recovery orientation in a hospital setting. Psychiatric Rehabilitation Journal, 31(2), 131.
- SAMHSA. (2016). Defining Recovery: SAMHSA’s 10 Guiding Principles of Recovery.
- Schreiber JB, Nora A, Stage FK, Barlow EA, & King J (2006). Reporting structural equation modeling and confirmatory factor analysis results: A review. The Journal of Educational Research, 99(6), 323–338.
- Simpson A, Coffey M, Hannigan B, Barlow S, Cohen R, Jones A, Faulkner A, Thornton A, Všetečková J, & Haddad M (2017). Cross-national mixed methods comparative case study of recovery-focused mental health care planning and coordination in acute inpatient mental health settings (COCAPP-A). Health Services and Delivery Research, 5(26).
- Sowers W (2005). Transforming systems of care: The American Association of Community Psychiatrists guidelines for recovery oriented services. Community Mental Health Journal, 41(6), 757–774. https://link.springer.com/article/10.1007%2Fs10597-005-6433-4
- Suhr D, Keaten JA, Kelly L, & Begnal C (1997). Confirmatory factor analysis using PROC CALIS. In Proceedings of Western Users of SAS Software Conference.
- Tsai J, & Salyers MP (2010). Recovery orientation in hospital and community settings. The Journal of Behavioral Health Services & Research, 37(3), 385–399.
- Tsai J, Salyers MP, & Lobb AL (2010, December). Recovery-oriented training and staff attitudes over time in two state hospitals. Psychiatr Q, 81(4), 335–347. 10.1007/s11126-010-9142-2
- Waldemar AK, Arnfred SM, Petersen L, & Korsbek L (2016, June 1). Recovery-oriented practice in mental health inpatient settings: A literature review. Psychiatr Serv, 67(6), 596–602. 10.1176/appi.ps.201400469
- Waldemar AK, Esbensen BA, Korsbek L, Petersen L, & Arnfred S (2018). Recovery orientation in mental health inpatient settings: Inpatient experiences? International Journal of Mental Health Nursing, 27(3), 1177–1187. https://onlinelibrary.wiley.com/doi/abs/10.1111/inm.12434
- Walsh J, & Boyle J (2009). Improving acute psychiatric hospital services according to inpatient experiences. A user-led piece of research as a means to empowerment. Issues in Mental Health Nursing, 30(1), 31–38. https://www.tandfonline.com/doi/full/10.1080/01612840802500733
- Williams J, Leamy M, Bird V, Harding C, Larsen J, Le Boutillier C, Oades L, & Slade M (2012, November). Measures of the recovery orientation of mental health services: systematic review. Soc Psychiatry Psychiatr Epidemiol, 47(11), 1827–1835. 10.1007/s00127-012-0484-y
- Wyder M, Ehrlich C, Crompton D, McArthur L, Delaforce C, Dziopa F, Ramon S, & Powell E (2017, December). Nurses experiences of delivering care in acute inpatient mental health settings: A narrative synthesis of the literature. Int J Ment Health Nurs, 26(6), 527–540. 10.1111/inm.12315
- Zuehlke JB, Kotecki RM, Kern S, Sholty G, & Hauser P (2016). Transformation to a recovery-oriented model of care on a veterans administration inpatient unit. Psychiatric Rehabilitation Journal, 39(4), 361.