Abstract
Background:
Hospital-at-home (HaH) is a growing model of care that has been shown to improve patient outcomes, satisfaction, and cost-effectiveness. However, selecting appropriate patients for HaH is challenging, often requiring burdensome manual screening by clinicians. To facilitate HaH enrollment, electronic health record (EHR) tools such as best practice advisories (BPAs) can be used to alert providers to potential HaH candidates.
Objective:
To describe the development and implementation of a BPA for identifying HaH-eligible patients in Mayo Clinic’s Advanced Care at Home (ACH) program, and to evaluate provider responses and the characteristics of the patients who triggered the BPA.
Design, Setting, and Participants:
We conducted a retrospective multicenter study of hospitalized patients who triggered the BPA notification for ACH eligibility between March and December 2021 at Mayo Clinic in Jacksonville, FL and Mayo Clinic Health System in Eau Claire, WI. We extracted demographic and diagnosis data for these patients, as well as characteristics of the providers who received the BPA notification.
Intervention:
The BPA was developed based on the ACH inclusion and exclusion criteria, which were derived from clinical guidelines, literature review, and expert consensus. The BPA was integrated into the EHR and displayed a pop-up message to the provider when a patient met the criteria for ACH eligibility. The provider could choose to refer the patient to ACH, dismiss the notification, or defer the decision.
Main Outcomes and Measures:
The main outcomes were the number and proportion of BPA notifications that resulted in a referral to ACH, and the number and proportion of referrals that were accepted by the ACH clinical team and transferred to ACH. We also analyzed the factors associated with the provider’s decision to refer or not refer the patient to ACH, such as the provider’s role, location, and specialty.
Results:
During the study period, 8962 notifications were triggered for 2847 patients. Providers acknowledged 711 (7.9%) of the notifications and placed ACH consult orders for 324 unique patients (11.4%). After review by the ACH clinical team, 31 of the 324 referrals (9.6%) met clinical and social criteria and were transferred to ACH. In multivariable analysis, nurses, physician associates and nurse practitioners, and trainees had lower odds than attending physicians of referring patients to the ACH program in Wisconsin.
INTRODUCTION
Patient identification in hospital-at-home (HaH) is complex, with numerous variables, such as clinical and social stability, insurance compatibility, and home address, to consider for safely hospitalizing patients in the home setting.1 Identifying appropriate HaH patients can require burdensome manual chart review of potential candidates. This process is time intensive, which is concerning in a hospital environment where every minute matters for hospital throughput and patient care. Clinical decision support systems (CDSSs) embedded in an electronic health record (EHR) can provide best practice advisory (BPA) alerts to EHR users, prompting them to action, thereby improving patient safety and adherence to clinical guidelines or assisting with documentation.2
In 2020, Mayo Clinic implemented a HaH program covering two geographically separate sites.1,3 This program, called Advanced Care at Home (ACH), recruited patients from a 304-bed hospital in Jacksonville, FL and a 309-bed hospital in Eau Claire, WI. Patient eligibility was limited by selection criteria to ensure that patients could be safely treated at home (Table 1). At the time of ACH program launch, eligible patients were identified through manual chart review in the EHR. Because screening hundreds of patients each day across sites was laborious, an electronic registry was created in the EHR. The registry enabled more efficient chart review by acting as a filter for the initial eligibility criteria of patient geography (the patient’s home fell within an ACH care delivery zone) and payer source (Medicare or an accepted commercial payer). Applying this filter produced a manageable patient list (usually 40–50 total patients per day) for secondary review of clinical and social eligibility. However, even with this registry in place, having one to two clinical ACH providers review multiple patient records for eligibility remained time consuming. As a result, the ACH team built clinical decision support into the EHR through a BPA alert. The ACH BPA alert was triggered for hospital patients who met initial ACH screening criteria and notified the hospital-based teams that their patient potentially qualified for ACH (Table 2). The hope was that this BPA would empower the hospital-based teams to further review the patient to ensure appropriateness for home hospitalization. The BPA prompted hospital teams to consider placing an ACH consultation via an electronic consultation request. We hypothesized that the ACH BPA alert would improve the process for identifying appropriate ACH patients.
The primary aim of our study was to describe the development and effectiveness of a BPA alert resulting in patient enrollment in the ACH program. A secondary aim was to report the characteristics of patients referred to the ACH program and characteristics (such as department, role, and site) of healthcare providers referring them through the BPA.
TABLE 1.
Advanced Care at Home patient screening criteria.
| Inclusion criteria | Exclusion criteria |
|---|---|
| Clinically stable for home care | Unstable mental illness |
| Resides within 30 miles of hospital | Requires intravenous pain medication |
| Social determinants (including, but not limited to, a residence with electricity, water, heat/AC, refrigeration, and bathroom) | Requires 24 × 7 assistance with activities of daily living, without available caregiver |
| Accepted payer source/reimbursement eligible | Presence of unstable arrhythmia |
| Age 18 years and older | Requires telemetry |
| Inpatient hospital status | |
TABLE 2.
BPA alert criteria.
| Zip code within ACH service area |
| Patient has an accepted payer source |
| Clinical criteria: |
| 1. Eligible diagnosis |
| 2. Patient is not ICU level of care. |
| 3. Patient is 18 years of age or older. |
| 4. If the patient is COVID-positive, they cannot be on 5 L or more of oxygen. |
| 5. Patient does not have an order for hemodialysis or peritoneal dialysis, and neither appears on the patient's problem list. |
| Inpatient or observation status |
| Additional criteria: |
| 1. Patient is assigned to a hospital bed. |
| 2. Patient did not come to the hospital from a nursing home, hospice, or long-term care facility. |
| 3. Patient does not have an active order for discharge. |
| 4. Patient has not had surgery in the last 24 h and is not planned for surgery/procedure in the next 48 h. |
Abbreviations: BPA, best practice advisory; ICU, intensive care unit.
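Read together, the Table 2 criteria amount to a conjunction of boolean checks. A minimal, purely illustrative sketch of that logic in Python (all field names, zip codes, and payer names are hypothetical; the actual criteria are evaluated by the EHR's rule engine, not application code):

```python
from dataclasses import dataclass

# Hypothetical patient snapshot; real data live in the EHR, not in a dataclass.
@dataclass
class PatientSnapshot:
    zip_code: str
    payer: str
    age: int
    icu_level_of_care: bool
    inpatient_or_observation: bool
    assigned_hospital_bed: bool
    covid_positive: bool
    oxygen_lpm: float
    dialysis_ordered_or_on_problem_list: bool
    admitted_from_facility: bool      # nursing home, hospice, or long-term care
    active_discharge_order: bool
    surgery_last_24h_or_next_48h: bool
    eligible_diagnosis: bool

ACH_ZIP_CODES = {"32224", "54701"}              # illustrative service-area zips
ACCEPTED_PAYERS = {"Medicare", "CommercialX"}   # illustrative payer list

def should_trigger_bpa(p: PatientSnapshot) -> bool:
    """Return True only when every Table 2 criterion is met (sketch)."""
    return (
        p.zip_code in ACH_ZIP_CODES
        and p.payer in ACCEPTED_PAYERS
        and p.eligible_diagnosis
        and not p.icu_level_of_care
        and p.age >= 18
        and not (p.covid_positive and p.oxygen_lpm >= 5)
        and not p.dialysis_ordered_or_on_problem_list
        and p.inpatient_or_observation
        and p.assigned_hospital_bed
        and not p.admitted_from_facility
        and not p.active_discharge_order
        and not p.surgery_last_24h_or_next_48h
    )

if __name__ == "__main__":
    candidate = PatientSnapshot(
        zip_code="32224", payer="Medicare", age=70,
        icu_level_of_care=False, inpatient_or_observation=True,
        assigned_hospital_bed=True, covid_positive=False, oxygen_lpm=2.0,
        dialysis_ordered_or_on_problem_list=False, admitted_from_facility=False,
        active_discharge_order=False, surgery_last_24h_or_next_48h=False,
        eligible_diagnosis=True,
    )
    print(should_trigger_bpa(candidate))  # True for this hypothetical patient
```

Expressing the criteria this way also makes the Discussion's point concrete: every condition here is a coarse administrative or chart-derived flag, so the conjunction can pass for many patients who would still fail clinical and social review.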
METHODS
Patient selection and setting
This study was approved by the Mayo Clinic Institutional Review Board as a retrospective chart review under protocol number 20–010753. The study was conducted in accordance with the Declaration of Helsinki. The study was conducted between March 24 and December 31, 2021, at Mayo Clinic Hospital, Jacksonville, FL and Mayo Clinic Health System in Eau Claire, Wisconsin. Patients were included if an ACH BPA alert was triggered for them by the EHR ACH registry. Patients were excluded if admitted to ACH by means other than the ACH BPA process. Admission to the ACH program was voluntary, and patients provided oral and written informed consent to participate.
Development, logistics, and implementation of the ACH BPA process
Before the development of the ACH BPA process, members of the ACH team identified potentially eligible ACH patients through manual patient list review. This patient list indicated patients’ primary residence zip code and insurance provider. ACH team members, usually consisting of registered nurses (RNs) or advanced practice providers (APPs), typically divided the list, with each provider deeming a patient ineligible for ACH if the patient’s insurance did not cover ACH services or the patient lived outside the service area radius of the hospital. After this initial process, the list of existing potentially eligible patients was then reviewed by a multidisciplinary team of APPs, physicians, RNs, case managers, and social workers for clinical and social stability screening. The initial patient identification model presented several limitations as it was both time consuming and dependent on human judgment, potentially introducing bias into ACH patient selection. This process relied on the ACH team to “pull” patients into the program from either the emergency department (ED) or hospital floors. It did not allow an opportunity for non-ACH team members in these hospital areas to understand qualities of eligible ACH patients, nor did it allow ED or hospital floor providers to refer patients to the ACH team for review.
In March 2021, an EHR patient registry, which automatically filtered hospital patients by geography and payer source, was created to expedite this initial patient review process. Subsequently, a BPA alert was incorporated into the EHR to automatically notify anyone opening the patient’s chart that the patient met geographical, payer, and certain additional clinical and nonclinical eligibility criteria for ACH (Table 2). Along with this automatic BPA alert, a linked “ACH Consult” order was attached so that if the ED or hospital team received the alert and believed that their patient was appropriate for ACH, they could easily and quickly consult the ACH multidisciplinary team for an expedited review (Figure 1). The provider responded to the BPA alert by selecting one of the following options: “acknowledge” or “cancel/override.” By selecting “acknowledge,” the provider was directed to a tab where they could generate a consult order with ACH or “decline” ACH team review. The “cancel/override” option allowed the user to stop their involvement in the BPA process for that patient, although other providers opening that patient’s chart could still receive a separate BPA alert. Furthermore, each BPA alert allowed for an ACH consultation order, meaning multiple consultation requests could be placed by different team members on the same patient (Figure 2).
FIGURE 1.

Best practice advisory alert with linked consult order.
FIGURE 2.

Best practice advisory (BPA) trigger and options workflow describing the selection options for providers. ACH, Advanced Care at Home.
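The selection options described above (and diagrammed in Figure 2) form a two-step decision: an alert is either acknowledged or canceled/overridden, and an acknowledgment then leads either to a consult order or to a decline. A hypothetical sketch of that branching, assuming nothing about the underlying EHR build:

```python
from enum import Enum, auto
from typing import Optional

class BpaResponse(Enum):
    ACKNOWLEDGE = auto()
    CANCEL_OVERRIDE = auto()

class AckFollowUp(Enum):
    PLACE_CONSULT = auto()
    DECLINE = auto()

def handle_bpa(response: BpaResponse,
               follow_up: Optional[AckFollowUp] = None) -> str:
    """Map a provider's BPA response to the resulting action (sketch only).

    Cancel/override stops the alert only for this provider; other providers
    opening the chart can still receive their own BPA, so several consult
    orders may accumulate on the same patient.
    """
    if response is BpaResponse.CANCEL_OVERRIDE:
        return "no action; other providers may still receive the BPA"
    if follow_up is AckFollowUp.PLACE_CONSULT:
        return "ACH consult order placed for multidisciplinary review"
    return "acknowledged; ACH review declined"
```

The per-provider scoping in the comment is what produced the alert volumes reported in the Results: 8962 notifications for only 2847 patients, and 711 acknowledgments for 345 patients.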
Before rollout, the ACH clinical team met once with clinical providers from Hospital Internal Medicine to inform them of the upcoming BPA alert and provide education on the “acknowledge/cancel/consult/decline” process. Brief follow-up reminders of the BPA process were given a few times via newsletter and at department meetings. Education on the process was targeted toward Hospital Internal Medicine as the majority of ACH patients came from this service. At the time of rollout, an initial email and follow-up email were sent to all hospital providers describing the BPA and illustrating the “acknowledge/decline/cancel/override/consult” process. After the BPA was launched, the multidisciplinary ACH enrollment team continued to manually review charts of potential ACH patients in the EHR based on initial screening criteria of zip code and insurance provider in the registry. However, consults to ACH triggered by the BPA alert were given prioritization for review.
Data collection and statistical analysis
Data on patient demographics and diagnosis were extracted from the EHR. Patients with two or more ACH admissions were counted as distinct cases. Demographic data included age, sex, and self-reported race and ethnicity. Patient diagnosis was defined as the primary diagnosis at the time the notification was triggered. Provider specialty/home department ordering the BPA-linked ACH consultation was collected. Provider type (attending physician, nurse practitioner, physician assistant, RN, or trainee [medical students, residents, and fellows]) ordering the BPA-linked ACH consultation was also collected.
Averages, percentages, and standard deviations were calculated for the demographic characteristics and primary diagnosis frequencies. Pearson's Chi-square tests were performed to compare the demographic characteristics of referred patients by site (Florida and Wisconsin). Subsequently, a multivariable analysis was performed using SPSS software, v.25 (IBM).
RESULTS
During the study period, hospital providers received 8962 BPA notifications triggered on 2847 unique patients (average 3.1 per patient), with 4496 notifications activated in Florida and 4466 in Wisconsin. Of the 8962 BPA notifications, 711 (7.9%) notifications (346 from Florida; 365 from Wisconsin) on 345 (12.1%) unique patients were acknowledged by the hospital teams, while 8251 notifications on 2502 distinct patients were canceled, ignored, or overridden. Of the 345 patients (161 from Florida; 184 from Wisconsin) who were “acknowledged,” hospital providers placed 324 ACH consult orders (149 from Florida; 175 from Wisconsin), while providers declined consultation after acknowledgment on 21 patients. After review by the ACH clinical team, 31 of the 324 referrals (9.6%) were transferred into ACH, while the remaining were declined either by the patient or for not meeting clinical or social criteria (Figure 3).
FIGURE 3.

Results flowchart of all the best practice advisory (BPA) notifications during the study period showing order and patient volumes from BPA trigger to the moment a patient is reviewed by the Advanced Care at Home (ACH) team to determine eligibility. Figure created using BioRender.com.
Demographic characteristics were retrieved for the 324 patients referred to the program. The mean (SD) age was 74.4 (15.7) years, and 155 patients (47.8%) were women. Of the 324 patients, 296 (91.4%) identified as White, 16 (4.9%) as Black or African American, 6 (1.9%) as Asian, and 6 (1.9%) as another race or did not disclose. A total of 314 patients (96.9%) were not Hispanic or Latino; the remaining 10 patients (3.1%) were Hispanic or Latino or chose not to disclose ethnicity (Table 3).
TABLE 3.
Demographic characteristics of referred patients to ACH, based on location.
| | Florida (n = 149) | Wisconsin (n = 175) | Total (n = 324) | p Value |
|---|---|---|---|---|
| Sex | | | | .085 |
| Male | 70 (47.0%) | 99 (56.6%) | 169 (52.2%) | |
| Female | 79 (53.0%) | 76 (43.4%) | 155 (47.8%) | |
| Age (years) | | | | .676 |
| Mean (SD) | 74.5 (±16.3) | 74.3 (±15.3) | 74.4 (±15.7) | |
| Median [range] | 77 [19–101] | 76 [35–102] | 76 [19–102] | |
| Race | | | | .001 |
| White | 127 (85.2%) | 169 (96.6%) | 296 (91.4%) | |
| Black or African American | 14 (9.4%) | 2 (1.1%) | 16 (4.9%) | |
| Asian | 3 (2.0%) | 3 (1.7%) | 6 (1.9%) | |
| Unknown | 5 (3.4%) | 1 (0.6%) | 6 (1.9%) | |
| Ethnicity | | | | .299 |
| Not Hispanic or Latino | 142 (95.3%) | 172 (98.3%) | 314 (96.9%) | |
| Hispanic or Latino | 5 (3.4%) | 2 (1.1%) | 7 (2.2%) | |
| Unknown | 2 (1.3%) | 1 (0.6%) | 3 (0.9%) | |
| Clinician placing BPA | | | | <.001 |
| Consultant | 53 (35.6%) | 147 (84.0%) | 200 (61.7%) | |
| Trainees | 62 (41.7%) | 10 (5.7%) | 72 (22.6%) | |
| Nurse practitioner | 21 (14.1%) | 8 (4.6%) | 29 (9.0%) | |
| Physician associate | 13 (8.7%) | 8 (4.6%) | 21 (6.5%) | |
| Registered nurse | 0 (0%) | 2 (1.1%) | 2 (0.6%) | |
| Department of clinician placing BPA | | | | <.001 |
| Internal medicine | 109 (73.2%) | 154 (88.0%) | 263 (81.2%) | |
| Family medicine | 30 (20.1%) | 8 (4.6%) | 38 (11.7%) | |
| Cardiovascular diseases | 5 (3.4%) | 6 (3.4%) | 11 (3.4%) | |
| Other^a | 5 (3.4%) | 7 (4.0%) | 12 (3.7%) | |
| Diagnosis | | | | .13 |
| COVID-19 | 21 (14.1%) | 15 (8.6%) | 36 (11.1%) | |
| Sepsis | 10 (6.7%) | 21 (12.0%) | 31 (9.6%) | |
| Pneumonia | 12 (8.1%) | 14 (8.0%) | 26 (8.0%) | |
| Cellulitis | 10 (6.7%) | 15 (8.6%) | 25 (7.7%) | |
| Congestive heart failure exacerbation | 5 (3.4%) | 17 (9.7%) | 22 (6.8%) | |
| COPD exacerbation | 6 (4.0%) | 12 (6.9%) | 18 (5.6%) | |
| Arrhythmia | 7 (4.7%) | 7 (4.0%) | 14 (4.3%) | |
| Change in mental status | 4 (2.7%) | 6 (3.4%) | 10 (3.1%) | |
| Syncope | 4 (2.7%) | 2 (1.1%) | 6 (1.9%) | |
| Other^b | 70 (47.0%) | 66 (37.7%) | 136 (42.0%) | |
| ACH enrollment | | | | .298 |
| Not accepted to ACH | 132 (88.6%) | 161 (92.0%) | 293 (90.4%) | |
| Accepted to ACH | 17 (11.4%) | 14 (8.0%) | 31 (9.6%) | |
Note: Data reported as mean (±SD) and median (range) for age and as frequency (%) for others.
Abbreviations: ACH, Advanced Care at Home; COPD, chronic obstructive pulmonary disease.
^a Included anesthesiology (1), case management (1), breast diagnostics department (1), trauma critical care (2), general surgery (1), ACH department (2), nephrology (1), neurology (1), psychiatry (1), and audits and appeals (1).
^b Combination of minor diagnoses, each totaling <6 cases apiece; see the Supporting Information for the extended list.
We found that of the 324 healthcare providers placing the ACH consult orders, 200 (61.7%) were attending physicians, 72 (22.6%) were learners, including medical students, residents, and fellows, 50 (15.5%) were physician associates or nurse practitioners, and 2 (0.6%) were nurses (Table 3).
In Florida, 62 (41.7%) providers were trainees, 53 (35.6%) were attending physicians, and 34 (22.8%) were nurse practitioners, physician associates, or RNs. Comparatively, in Wisconsin, 10 providers (5.7%) were trainees, 147 (84.0%) were attending physicians, and 18 (10.3%) were nurse practitioners, physician associates, or RNs (Table 3).
Of the 324 healthcare providers placing ACH consult orders, 263 (81.2%) were part of the internal medicine department, 38 (11.7%) were in the family medicine department, 11 (3.4%) were in the cardiovascular diseases department, and 12 (3.7%) were grouped into “other” (Table 3).
Of the 324 patients referred to the ACH program, the most common primary diagnosis was COVID-19 infection (36 patients, 11.1%), followed by sepsis (9.6%), community-acquired pneumonia (8.0%), cellulitis (7.7%), congestive heart failure (6.8%), and COPD exacerbation (5.6%) (Table 3; for the full diagnosis list, see Table S1).
The Pearson Chi-square test showed statistical significance when the location site was compared with race (p = .001), provider (p < .001), and department (p < .001) (Table 3).
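As a cross-check of these comparisons, a Pearson chi-square statistic can be recomputed from the published counts using only the standard library (the study itself used SPSS). Using the site-by-sex counts from Table 3 (Florida 70 men/79 women, Wisconsin 99 men/76 women) reproduces the reported p = .085:

```python
import math

def pearson_chi2(table):
    """Pearson chi-square statistic and 1-df p-value for a 2x2 table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    # With one degree of freedom, p = P(Z^2 > chi2) = erfc(sqrt(chi2 / 2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Rows: Florida, Wisconsin; columns: male, female (counts from Table 3)
chi2, p = pearson_chi2([[70, 79], [99, 76]])
print(round(chi2, 2), round(p, 3))  # approximately 2.97 and .085
```

The same function applied to the race, clinician, and department rows reproduces their significance pattern, though sparse cells (e.g., 0 registered nurses in Florida) make the chi-square approximation rough there.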
In multivariable analysis, nurses (odds ratio [OR]: 0.08, 95% confidence interval [CI]: 0.028–0.228, p < .001), physician associates and nurse practitioners (OR: 0.165, 95% CI: 0.054–0.502, p = .001), and trainees (OR: 0.028, 95% CI: 0.008–0.095, p < .001) had lower odds of referring patients to the ACH program in Wisconsin when compared to physicians. Moreover, patients with sepsis (OR: 6.251, 95% CI: 1.698–23.006, p = .006), congestive heart failure (OR: 11.774, 95% CI: 2.49–55.671, p = .002), and change in mental status (OR: 7.33, 95% CI: 1.227–43.77, p = .029) had higher odds of being referred to the ACH program in Wisconsin (Table 4).
TABLE 4.
Multivariable analysis of referred patients to ACH based on location.
| | Odds ratio | 95% CI lower | 95% CI upper | p Value |
|---|---|---|---|---|
| Sex | | | | |
| Female versus male | 0.622 | 0.351 | 1.102 | .622 |
| Age (years) | | | | |
| 18–39 | Ref. | | | |
| 40–59 | 0.374 | 0.067 | 2.097 | .374 |
| 60–79 | 0.288 | 0.061 | 1.366 | .288 |
| 80–101 | 0.195 | 0.041 | 0.923 | .195 |
| Race | | | | |
| White | Ref. | | | |
| Black or African American | 0.122 | 0.02 | 0.75 | .122 |
| Asian | 0.537 | 0.068 | 4.217 | .537 |
| Ethnicity | | | | |
| Not Hispanic or Latino | Ref. | | | |
| Hispanic or Latino | 0.443 | 0.04 | 4.947 | .443 |
| Unknown | 1.506 | 0.035 | 65.51 | .832 |
| Provider | | | | |
| Consultant | Ref. | | | |
| Trainees | 0.028 | 0.008 | 0.095 | <.001 |
| Nurse | 0.08 | 0.028 | 0.228 | <.001 |
| PA or NP | 0.165 | 0.054 | 0.502 | .001 |
| Department | | | | |
| Internal medicine | Ref. | | | |
| Family medicine | 2.45 | 0.575 | 10.432 | .226 |
| Cardiovascular diseases | 2.5 | 0.506 | 12.35 | .261 |
| Diagnosis | | | | |
| COVID-19 | Ref. | | | |
| Sepsis | 6.251 | 1.698 | 23.006 | .006 |
| Pneumonia | 2.8 | 0.82 | 10.008 | .099 |
| Cellulitis | 2.693 | 0.761 | 9.528 | .124 |
| Congestive heart failure | 11.774 | 2.49 | 55.671 | .002 |
| COPD exacerbation | 4.046 | 0.854 | 19.175 | .078 |
| Arrhythmia | 2.815 | 0.628 | 12.622 | .177 |
| Change in mental status | 7.33 | 1.227 | 43.77 | .029 |
| Syncope | 1.671 | 0.176 | 15.895 | .655 |
| ACH | | | | |
| Accepted (vs. not accepted) to ACH | 1.857 | 0.714 | 4.834 | .204 |
Abbreviations: ACH, Advanced Care at Home; COPD, chronic obstructive pulmonary disease; NP, nurse practitioner; PA, physician associate.
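Assuming the intervals in Table 4 are symmetric Wald intervals of the form exp(β ± 1.96·SE), the implied standard error is SE = (ln upper − ln lower)/(2 × 1.96), and the Wald p-value can be approximately recovered from each row. A standard-library sketch, checked against the sepsis row (OR 6.251, CI 1.698–23.006, reported p = .006):

```python
import math

def wald_p_from_or_ci(odds_ratio, ci_lower, ci_upper):
    """Recover the two-sided Wald p-value implied by an OR and its 95% CI."""
    beta = math.log(odds_ratio)                              # log-odds coefficient
    se = (math.log(ci_upper) - math.log(ci_lower)) / (2 * 1.96)
    z = beta / se                                            # Wald statistic
    return math.erfc(abs(z) / math.sqrt(2))                  # two-sided normal p

p = wald_p_from_or_ci(6.251, 1.698, 23.006)
print(round(p, 3))  # close to the reported .006
```

This consistency check only holds for Wald-type output (as SPSS logistic regression reports); it is an illustration, not a reanalysis of the study data.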
DISCUSSION
This study showed the benefits and limitations of automating patient identification through a CDSS using an electronic, registry-linked BPA alert. Over the course of 9 months, 8962 alerts were fired on 2847 patients, resulting in 324 consultations and only 31 transfers into ACH (1.1% of patients). The purpose of the BPA was to rapidly alert the hospital care teams to potentially ACH-eligible patients, with the hope of empowering those care teams to review and refer appropriate patients to ACH through a linked consult order, thus changing the ACH acquisition model from one in which the ACH team “pulled” patients from the care teams to one in which care teams “pushed,” or referred, patients to ACH. Unfortunately, with the execution of this new system, the desired increase in “pushed” ACH patient volumes did not come to fruition. First, many more BPA triggers were fired than there were patients (8962 alerts on 2847 patients), creating alert fatigue for care team members and resulting in redundant acknowledgments (711 acknowledgments on 345 patients) and, more importantly, a large number of cancellations (8251). With the BPA canceled in 92.1% of instances, we do not feel that it was an effective way to engage the care teams. Second, of the 324 ACH consultation requests placed by the hospital teams, only 31 patients qualified for the program. This low recruitment rate was likely due to the hospital teams requesting an ACH consult without either thoroughly reviewing the clinical appropriateness for ACH care or discussing the ACH program in detail with the patient before the consult request. The BPA alert is initiated as soon as a patient’s electronic chart is opened, potentially before a provider has had time to fully understand a patient’s clinical scenario. It became evident that the ACH team neither fully considered the timing of the BPA alert nor accounted for the change management required for this BPA initiative to be successful.
Non-ACH team members who are not routinely engaged in ACH enrollment may not understand what makes a patient a good candidate for ACH. Another learning from this study was that the BPA itself was too broad in its eligibility criteria, resulting in alerts on a large number of patients who did not have clinical criteria appropriate for ACH care. Based on these findings, we believe any future BPA should be structured with more rigorous registry criteria to limit the number of alerts to a much more appropriate patient population rather than intentionally keeping the initial eligibility broad. Furthermore, education of the hospital care teams on a standardized, simplified patient review for HaH appropriateness should be conducted before BPA rollout to improve rates of successfully enrolling patients referred through the BPA.
Automation may be beneficial for one group while simultaneously creating disruption to workflows and hassle for another group such as referring providers. HaH leaders need to be keenly aware of EHR user experience when considering automation tools. Pairing the BPA with provider education for what makes a patient a good HaH candidate may improve BPA adherence. Non-HaH team members may benefit from a BPA identifying “home run” HaH candidates to build momentum with successful referrals and confidence in the BPA.
Our findings also showed differences between the two clinical sites in which personnel used the BPA-linked consult order. In Florida, medical trainees (41.7%) and APPs (22.8%) placed most ACH consults, while in Wisconsin, attending physicians placed most (84.0%). This difference may be due primarily to the mission at each campus, with Florida serving as an academic teaching center and Wisconsin as a clinical care hub for critical access hospitals. This leads to large variability in training programs at each site, with Florida having 58 residency and fellowship programs, whereas Wisconsin has only one.4 Such differences in medical personnel and medical trainee makeup are commonly seen between large academic medical centers, trauma hospitals, community hospitals, and critical access hospitals. Because of these differences, it is essential that programs looking to roll out BPA alert initiatives such as HaH patient eligibility provide the type and level of training necessary to make these efforts successful. Academic centers where trainees enter many of the EHR orders and respond to BPAs should build focused training into medical student, resident, and fellow orientation and didactic lectures. Programs where attending faculty and APPs are the primary drivers of order entry and BPA response should look to department/division meetings and didactics, grand rounds lectures, and online training modules, which are commonly used with these providers. Additionally, there were significant differences in patients referred to ACH by race, with significantly more Black or African American patients being referred in Florida than in Wisconsin (9.4% vs. 1.1%, p = .001).
This difference is likely due to the different racial makeup of the two communities: Duval County, Florida has a White census of 59.9%, whereas Eau Claire County, Wisconsin has a White census of 91.4%.5 Both urban and rural hospitals should be aware of the racial and ethnic makeup of their communities and ensure that any patient selection devices, algorithms, or calculators used to direct patients to alternate care environments or treatment plans show consistency in distribution when compared with their standard hospital care. Further studies on possible biases unintentionally built into BPA and artificial intelligence calculators should be conducted to ensure that these tools are not ruling out patients unfairly.
Our results also showed that medical specialties referred more patients than surgical specialties at all locations. This finding may reflect the fact that ACH was launched during the COVID-19 pandemic, when many hospitals gave bed priority to COVID-19 patients and many elective surgeries were postponed due to bed capacity. During the pandemic, some institutions had to transform their operating rooms into COVID-19 ICUs; for instance, the Florida site modified its postanesthesia care units into COVID-19 beds.6 Furthermore, low-risk elective surgical patients who could have been potential candidates had their procedures postponed. Moreover, elective and nonurgent surgeries carried out in COVID-19-positive patients carry greater morbidity and mortality.7 Determining priorities in the patient population to be enrolled in HaH is crucial, as certain specialties, such as internal medicine, may have a patient population more “ready-made” for HaH than others, such as surgical subspecialties, which may require modifications to an existing HaH program to ensure the needs of the patients are met. At the time of the study, not all ACH sites were offering services common to surgical patients, such as suction.
Limitations
A major limitation of this study was the lack of reliable enrollment data before creation of the BPA. While a comparison of ACH enrollment rates before and after initiation of the BPA would help in fully understanding the effectiveness of the BPA, the quality of data before the genesis of the registry and BPA will remain an unresolvable limitation of our retrospective design. Before the creation of the registry and BPA, enrollment efforts were manual, with data maintained on a shared spreadsheet that was easily altered and frequently changed, making it difficult to track the data that the registry and BPA now capture.
Additionally, conscious or unconscious bias from ACH team members adds variability to the interpretation of social and clinical stability. For example, when shared decision-making is prioritized and a patient prefers hospitalization at home over the traditional hospital setting, one provider may deem the patient clinically stable for home hospitalization even though clinical parameters are borderline, while another provider may find this untenable. Clinicians may also vary in what they consider safe from a social stability perspective in the home; for example, the lack of a caregiver may influence the team's decision to offer home hospitalization.
Lastly, clinical uncertainty at a patient's initial presentation can trigger notifications for patients who were not eligible: a temporary diagnosis assigned on arrival may subsequently change to a definitive diagnosis that is excluded from ACH. When a patient whose clinical picture was uncertain entered the emergency room, healthcare providers often entered a temporary diagnosis until further tests were completed and a definitive diagnosis was established. Temporary diagnoses that were ultimately changed may have increased the number of notifications triggered and lowered the proportion of truly eligible patients.
CONCLUSION
Patient recruitment into a HaH program begins with eligible patient identification. Implementing CDSS through an automated BPA alert based on key clinical and nonclinical eligibility criteria, including geography, payer source, diagnosis, and patient stability, gave providers the ability to rapidly identify potentially ACH-eligible patients; however, these automation efforts yielded a very low number of patients enrolled into ACH. This is likely because the BPA alert broadly included patients even though the majority were not truly eligible for ACH at that time. Future efforts using BPAs for HaH patient identification should include proven and more rigorous selection criteria, thoughtful consideration of the personnel affected by the BPA to avoid unnecessary EHR interaction and the resulting alert fatigue, and pairing of appropriate change management initiatives with the BPA rollout to optimize success.
Supplementary Material
ACKNOWLEDGMENTS
The funder had no role in study design, data analysis, and interpretation; in the writing of the manuscript; and in the decision to submit the manuscript for publication. The findings and conclusions do not necessarily represent the views of the funder.
Footnotes
CONFLICT OF INTEREST STATEMENT
Sagar B. Dugani was supported by the National Institute on Minority Health and Health Disparities (NIH K23 MD016230). Mayo Clinic discloses a financial investment in Medically Home LLC, the company that provides the software platform that enables the virtual care delivery. The remaining authors declare no conflict of interest.
ETHICS STATEMENT
The study was conducted in accordance with the Declaration of Helsinki and approved by the Mayo Clinic Institutional Review Board as a retrospective chart review under protocol number 20–010753. Patients provided both oral and written informed consent to participate in the ACH program.
SUPPORTING INFORMATION
Additional supporting information can be found online in the Supporting Information section at the end of this article.
REFERENCES
- 1. Paulson MR, Shulman EP, Dunn AN, et al. Implementation of a virtual and in-person hybrid hospital-at-home model in two geographically separate regions utilizing a single command center: a descriptive cohort study. BMC Health Serv Res. 2023;23(1):139.
- 2. Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med. 2020;3:17.
- 3. Maniaci MJ, Torres-Guzman RA, Garcia JP, et al. Overall patient experience with a virtual hybrid hospital at home program. SAGE Open Med. 2022;10:205031212210925.
- 4. Mayo Clinic College of Medicine and Science. Residencies and fellowships. 2022. Accessed June 22, 2023. https://college.mayo.edu/academics/residencies-and-fellowships/programs-a-z/
- 5. United States Census Bureau. Population estimates, July 1, 2023 (V2023). Accessed November 3, 2023. https://www.census.gov/quickfacts/fact/table/
- 6. Peters AW, Chawla KS, Turnbull ZA. Transforming ORs into ICUs. N Engl J Med. 2020;382(19):e52.
- 7. Aggarwal R, Bhatia R, Soni K, Trikha A. Fast tracking intensive care units and operation rooms during the COVID-19 pandemic in resource limited settings. J Anaesthesiol Clin Pharmacol. 2020;36(suppl 1):7.