Abstract
Objective
Symptom checkers can help address high demand for SARS-CoV-2 (COVID-19) testing and care by providing patients with self-service access to triage recommendations. However, health systems may be hesitant to invest in these tools, as their associated efficiency gains have not been studied. We aimed to quantify the operational efficiency gains associated with use of an online COVID-19 symptom checker as an alternative to a telephone hotline.
Methods
In our health system, ambulatory patients can either use an online symptom checker or a telephone hotline to be triaged and connected to COVID-19 care. We performed a retrospective analysis of adults who used either method between October 20, 2021 and January 18, 2022, using call logs, electronic health record data, and local wages to calculate labor costs.
Results
Of the 15 549 total COVID-19 triage encounters, 1820 (11.7%) used only the telephone hotline and 13 729 (88.3%) used the symptom checker. In only 271 (2.0%) of the symptom checker encounters did the patient also call the hotline. Hotline encounters required more clinician time than those involving the symptom checker (17.8 vs 0.4 min/encounter), resulting in higher average labor costs ($24.99 vs $0.57 per encounter). The symptom checker resulted in over 4200 clinician labor hours saved.
Conclusion
When given the option, most patients completed COVID-19 triage and visit scheduling online, resulting in substantial efficiency gains. These benefits may encourage health system investment in such tools.
Keywords: symptom checker, consumer health informatics, COVID-19, clinical decision support, health care delivery
INTRODUCTION
During the COVID-19 pandemic, symptom checkers and self-triage tools have become crucial for providing patients with on-demand access to triage recommendations.1–5 These tools ask patients questions about demographics, symptoms, exposures, and past medical history and suggest a diagnosis and/or recommend a disposition. Symptom checkers used for acute conditions are generally popular with users, with 70%–80% reporting high satisfaction and 90% perceiving them as useful.6,7 During the pandemic, these tools have been particularly important to patients, providing them with 24/7 access to health information, risk assessment, and access to care.5,8–10
Online self-triage tools may benefit health system efficiency by providing a scalable, widely accessible means of triaging patients and connecting them to care, all while maintaining social distancing and preserving the healthcare workforce to address critical clinical tasks.11–13 By automating the triage and scheduling process, symptom checkers may reduce the time that staff spend doing these tasks manually. This can decrease the operational cost of care delivery, reduce the time it takes patients to access care, and improve patient satisfaction. This is especially true for symptom checkers that are integrated within the electronic health record (EHR) of a health system, facilitating patient self-scheduling and creating a record of triage recommendations.
Literature from Switzerland and France has demonstrated that the majority of users of online COVID-19 self-service tools would have otherwise contacted healthcare systems, and that use of these tools was associated with decreased call center volumes.9,14 In addition, previous research on self-triage tools for other conditions, such as influenza, has postulated that they may lead to efficiency gains.15–17 However, to our knowledge, no studies have quantified efficiency gains or cost savings resulting from use of these tools, and no study has described the operational impact of a tool capable of self-triage and self-scheduling of tests and appointments.
The lack of concrete data on efficiency gains may make health systems hesitant to invest in designing and implementing online self-triage solutions for COVID-19 due to uncertain return on investment. In March 2020, we implemented one of the first COVID-19 symptom checkers in the United States—an EHR-tethered COVID-19 self-triage and self-scheduling tool.18 In this analysis, our primary aim is to summarize utilization patterns and estimate the efficiency gains resulting from providing patients access to this online tool as an alternative to calling a telephone hotline.
MATERIALS AND METHODS
Setting
University of California, San Francisco (UCSF) Health is a large academic health system consisting of 3 campuses, with nearly 1000 inpatient beds and >250 outpatient clinics. UCSF Health serves approximately 45 000 hospital admissions and 1.7 million outpatient visits annually. The UCSF primary care practices serve approximately 90 000 empaneled patients. As of January 2022, approximately 95% of adult primary care patients were enrolled in UCSF’s EHR-tethered patient portal.
In early March 2020, UCSF established a COVID-19 telephone hotline, which became the primary intake point for all UCSF patient and employee inquiries regarding COVID-19, including general questions, exposures, symptom assessments, and scheduling requests. This hotline was staffed by 3 levels of personnel. Health navigators—nonlicensed staff trained in health coaching techniques—fielded all incoming calls and answered general COVID-19 related questions. For symptomatic patients or those requiring a COVID-19 test, health navigators generated a work-queue order or directly transferred the call to the clinical triage line, staffed by registered nurses. After triage, if the patient required a test or appointment, they were transferred to a scheduler, who would book the appropriate visit type (Figure 1). All patient COVID-19 tests at our institution required an appointment. The telephone hotline staffing was increased or decreased throughout the pandemic in response to incoming call volume. Hotline staff could use telephone interpreters as needed to assist patients with limited English proficiency.
Figure 1.
Patient flow diagram for hotline and symptom checker. Patients may choose to use any of these pathways to receive care.
Symptom checker tool
UCSF uses a commercially available EHR from Epic Systems (Verona, WI). In early March 2020, we used built-in Epic tools to design and deploy our UCSF COVID-19 Symptom Checker, which launched on March 12, 2020.18 The UCSF COVID-19 Symptom Checker was developed as a self-service option for patients with symptoms of or exposure to COVID-19, or those requesting a COVID-19 test. After answering a series of branched logic questions about exposures, symptoms, and comorbidities, patients were directed to the appropriate disposition based on their predicted risk level. Patients who were asymptomatic bypassed many of these questions to be more quickly directed to testing. The triage algorithm used in this tool is identical to the one used on the telephone hotline (Figure 2). The questions, dispositions, and logic were continually adapted over the course of the pandemic, and incorporated components of Centers for Disease Control and Prevention guidelines for COVID-19 triage19 and nursing telephone triage protocols.20 The algorithm was adjusted prior to influenza season to ensure that patients who may be appropriate for oseltamivir were offered a provider visit.
Figure 2.
Patient COVID-19 self-triage algorithm used by online self-triage and self-scheduling tool, and telephone hotline (last updated December 2021). The following algorithm has been simplified to remove institution-specific protocols.
Because this tool is built into our EHR, it fully integrates with the health system’s documentation and scheduling systems. It is available in English and Spanish—the 2 languages currently supported by our patient portal. The tool is available to all adult UCSF patients with active patient portal accounts. Our EHR vendor has now adopted this tool as “Foundation System Content”, making it available in its basic form to other health systems that share the same EHR vendor.
We employed a rapid build-test-learn process to iterate and improve functionality over the course of the pandemic. Self-scheduling of video visits through the Symptom Checker was first offered on March 18, 2020, followed by self-scheduling for in-person urgent care appointments on October 26, 2020, and finally automatic ordering and self-scheduling of SARS-CoV-2 RNA testing appointments for primary care patients on December 10, 2020. We then expanded access to self-scheduling of all COVID-19 testing and visit types to all UCSF patients (regardless of empanelment to UCSF primary care) on October 20, 2021. Stepwise implementation of these features reflected both operational needs and technical challenges. Due to the high demand and limited supply of in-person urgent care visits and testing at the beginning of the pandemic, the decision was made to centralize scheduling at the level of the hotline and limit testing to certain groups until demand eased. COVID-19 test self-scheduling functionality required a prolonged discovery period during which technical solutions were designed and tested.
During the study period (October 2021 to January 2022), all users had access to self-triage functionality and self-scheduling of SARS-CoV-2 RNA testing visits, video visits, and in-person visits.
Study population
For the purposes of this analysis, we included hotline or Symptom Checker encounters between October 20, 2021 and January 18, 2022, the first 90 days during which self-scheduling of all appointment types was available to all patients. We included only encounters for adults with COVID-19 symptoms or exposures—encounters for general questions or requests for asymptomatic, unexposed testing were not included, since the workflow for these requests is less consistent across the institution. We excluded UCSF employees, since they have different scheduling options and workflows. This study was approved by the UCSF Institutional Review Board.
Evaluation and statistical analysis
We used a combination of telephone call logs and EHR data to determine the type of calls required to resolve each encounter. For example, if a patient used the symptom checker but never called the hotline during the 3 months (as is depicted in the bottom row of Figure 1), their triage encounter(s) required no navigator, clinical triage, or scheduling time. If a patient called the hotline for clinical triage, they were counted as having completed 1 navigator and 1 triage call. If a patient also scheduled a visit over the phone within 24 h of their initial triage, they were counted as having also completed 1 scheduling call (as is depicted in the top row of Figure 1). We did not count multiple call attempts for the same triage encounter (eg, if the patient did not pick up on the first attempt), as these attempts were not consistently documented. If the same patient underwent multiple triages during the 3-month study period, each triage encounter was counted separately.
Call duration was based on an internal time study conducted from September to October 2021, during which call length and subsequent documentation time were manually recorded by each employee for each call type (navigation, clinical triage, scheduling) in a sample of over 2400 calls. During that study, navigation calls averaged 8 min, clinical triage calls averaged 17 min, and scheduling calls averaged 9 min. The estimated total clinician time per encounter was calculated by multiplying the likelihood of using the service (eg, telephone scheduling) for that encounter type by the average amount of time spent on call and subsequent documentation. Labor costs were calculated by making the following assumptions: (1) employees were productive for 95% of their total paid time per day, (2) wages for each type of employee were based on mean salaries for the job type in the geographic area, and (3) a 31% benefits multiplier was used to calculate total employee cost, based on the most recent estimates from the U.S. Bureau of Labor Statistics.21
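The cost model described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual calculation: the wage values below are hypothetical placeholders (the study used local mean salaries per job type), and the scheduling-call probability is an invented example.

```python
# Sketch of the labor-cost estimate described above. The 31% benefits
# multiplier and 95% productivity assumption come from the text; the wage
# values and probabilities are hypothetical placeholders.

BENEFITS_MULTIPLIER = 1.31   # benefits as a fraction of wages (BLS estimate)
PRODUCTIVE_FRACTION = 0.95   # assumed productive share of total paid time

def call_cost(minutes, hourly_wage):
    """Fully loaded labor cost of one call, including documentation time."""
    base = (minutes / 60.0) * hourly_wage
    return base * BENEFITS_MULTIPLIER / PRODUCTIVE_FRACTION

def expected_encounter_cost(services):
    """services: iterable of (probability_of_use, minutes, hourly_wage)."""
    return sum(p * call_cost(m, w) for p, m, w in services)

# Example: an encounter that always requires a navigation call (8 min) and a
# clinical triage call (17 min), with a 50% chance of a scheduling call (9 min).
example = expected_encounter_cost([
    (1.0, 8, 35.00),    # health navigator (placeholder wage)
    (1.0, 17, 90.00),   # registered nurse (placeholder wage)
    (0.5, 9, 30.00),    # scheduler (placeholder wage and probability)
])
```

The key design point is that the benefits multiplier and productivity adjustment are linear, so they can be applied per call or to the encounter total with the same result.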
RESULTS
During the 90-day study period, there were 15 549 total encounters for 11 211 unique patients (Figure 3). Of these, 1820 (11.7%) encounters for 1712 patients were completed exclusively on the telephone hotline. The other 13 729 (88.3%) encounters for 9970 patients involved the online symptom checker, with or without the telephone hotline. Telephone hotline encounter volumes were relatively stable throughout the study period, but online symptom checker encounters increased more than 10-fold during the surge of Omicron-variant cases (Figure 4).
Figure 3.
Patient flow through the hotline and symptom checker for encounters of symptomatic or exposed patients. *Forty-eight patients in this group self-scheduled an appointment using the online tool. **Eighty-five patients in this group self-scheduled an appointment using the online tool.
Figure 4.
Weekly encounter volumes for telephone hotline and online symptom checker. *The peak of the Omicron COVID-19 variant occurred in California from December 28 to February 8, defined as the time during which the 7-day test positivity rate was above 10%.32
Patients who used only the hotline were more likely than patients who used the online symptom checker to have limited English proficiency (5.5% vs 2.3%, P < .001), to be Black/African American (8.1% vs 4.9%, P < .001), to live outside of the county of San Francisco (34.8% vs 27.1%, P < .001), and to be older (52.9 vs 42.6 years, P < .001) (Table 1). Patients who used the symptom checker were more likely to be commercially insured than patients who used only the hotline (69.3% vs 49.5%, P < .001).
Table 1.
Demographics by encounter type
| | Called hotline only (n = 1681) | Used symptom checker (± hotline) (n = 9969) | P value^a |
|---|---|---|---|
| Age, years (median, IQR) | 52.9 (36.4–67.4) | 42.6 (33.6–56.6) | <.001 |
| Sex | | | .073 |
| Female | 1058 (62.9%) | 6336 (63.6%) | |
| Male | 625 (37.2%) | 3620 (36.3%) | |
| Unknown/not reported/nonbinary | 3 (0.2%) | 13 (0.1%) | |
| Race and ethnicity | | | <.001 |
| American Indian or Alaska Native | 2 (0.1%) | 16 (0.2%) | |
| Asian | 343 (20.4%) | 2260 (22.7%) | |
| Black or African American | 136 (8.1%) | 491 (4.9%) | |
| LatinX | 211 (12.6%) | 1212 (12.2%) | |
| Multirace/ethnicity | 37 (2.2%) | 270 (2.7%) | |
| Native Hawaiian or Other Pacific Islander | 15 (0.9%) | 71 (0.7%) | |
| Other | 75 (4.5%) | 377 (3.8%) | |
| Unknown/declined | 40 (2.4%) | 190 (1.9%) | |
| White or Caucasian | 828 (49.3%) | 5082 (51.0%) | |
| Limited English proficiency | 92 (5.5%) | 229 (2.3%) | <.001 |
| County of residence | | | <.001 |
| San Francisco | 1096 (65.2%) | 7271 (72.9%) | |
| Other | 585 (34.8%) | 2698 (27.1%) | |
| Insurance | | | <.001 |
| Commercial | 832 (49.5%) | 6905 (69.3%) | |
| Medicare | 482 (17.8%) | 1323 (13.3%) | |
| Medicaid | 289 (17.2%) | 1233 (12.4%) | |
| Other | 13 (0.8%) | 71 (0.7%) | |
| No insurance listed | 71 (4.2%) | 437 (4.4%) | |
| Empaneled primary care patient | 1052 (62.6%) | 6351 (63.7%) | .287 |

^a Chi-squared test was used for categorical variables; t test was used for continuous variables.
Of the 13 729 encounters involving the symptom checker, the patient also called the hotline (eg, to schedule an appointment or ask additional questions) in 271 cases (2.0%). Patients using the symptom checker self-scheduled a test or visit in 7329 encounters (53.3%), scheduled through the hotline in 29 encounters (0.2%), and did not schedule an appointment in 6371 encounters (46.4%) (Figure 3). Of the symptom checker encounters, 5500 (40.1%) were for patients who were asymptomatic but exposed to someone with COVID-19 (Table 2). The most common disposition for symptomatic patients was low risk (n = 5041, 36.7%), meaning that they had symptoms without risk factors for severe disease. A total of 2246 encounters (16.4%) received the nonurgent disposition (symptoms with risk factors for severe disease), while 589 (4.3%) were categorized as urgent (potentially urgent symptoms like shortness of breath) and 353 (2.5%) as emergent. More than half of encounters (n = 7370, 53.7%) led to a subsequent visit. The most common visit type was a testing visit (n = 7137, 52%).
Table 2.
Online symptom checker dispositions and visits
| Disposition | No visit | Testing visit | Video visit | In-person visit | Total (%) |
|---|---|---|---|---|---|
| Emergent | 248 | 88 | 5 | 12 | 353 (2.5) |
| Urgent | 302 | 199 | 8 | 80 | 589 (4.3) |
| Nonurgent | 1073 | 1097 | 70 | 6 | 2246 (16.4) |
| Low risk | 2285 | 2721 | 25 | 10 | 5041 (36.7) |
| Asymptomatic exposed | 2451 | 3032 | 9 | 8 | 5500 (40.1) |
| Total (%) | 6359 (46.3) | 7137 (52.0) | 117 (0.9) | 116 (0.8) | 13 729 |

Associated visits were defined as those occurring within 7 days of the self-triage encounter. If both a test and a provider visit were scheduled after the same encounter, only the provider visit (video or in-person) was counted.
Triage and scheduling encounters completed exclusively using the hotline required an average of 17.8 triage staff minutes per encounter, while encounters that involved the online symptom checker took an average of 0.4 triage staff minutes per encounter (Table 3). Patients calling the hotline experienced an average hold time of 15 min, 4 s before speaking to a navigator. Encounters involving the online symptom checker had lower average per-encounter triage and scheduling costs than encounters using only the hotline ($0.57 vs $24.99). Each use of the online symptom checker instead of the telephone hotline therefore saved the health system an average of $24.42 in labor costs for triage and scheduling. Cost differences were driven mostly by the frequency of clinical triage calls, which can be lengthy (the average call and documentation time was 17 min) and are done by clinical staff such as registered nurses or advanced nurse specialists, increasing personnel costs (Table 3).
Table 3.
Estimated cost per encounter for hotline only and symptom checker assisted encounters
| | Hourly wage (USD) | Hotline-only encounter (n = 1820) | Symptom checker-assisted encounter (n = 13 729) |
|---|---|---|---|
| Navigator | $32.00 | 8.0 | 0.2 |
| Clinical triage nurse | $92.00 | 7.8 | 0.2 |
| Scheduler | $28.57 | 2.0 | 0.04 |
| Total clinician time (min) | | 17.8 | 0.4 |
| Average cost per encounter (USD) | | $24.99 | $0.57 |
| Total 90-day cost (USD) | | $45 473 | $7752 |

The two encounter columns show the average minutes spent on that encounter type (including documentation) by each staff member.
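As an informal consistency check (not part of the published analysis), Table 3's per-encounter averages can be recomputed from its own rows under the 31% benefits and 95% productivity assumptions stated in the Methods. The small residual gap versus the reported $24.99 and $0.57 presumably reflects rounding of the published minutes and wages and internal details of the original calculation not shown here.

```python
# Recompute Table 3's average cost per encounter from its printed rows,
# applying the benefits multiplier and productivity assumption from Methods.

BENEFITS_MULTIPLIER = 1.31
PRODUCTIVE_FRACTION = 0.95

def encounter_cost(rows):
    """rows: iterable of (avg_minutes_per_encounter, hourly_wage_usd)."""
    raw = sum((m / 60.0) * w for m, w in rows)
    return raw * BENEFITS_MULTIPLIER / PRODUCTIVE_FRACTION

hotline = encounter_cost([(8.0, 32.00), (7.8, 92.00), (2.0, 28.57)])
checker = encounter_cost([(0.2, 32.00), (0.2, 92.00), (0.04, 28.57)])
print(f"hotline ~ ${hotline:.2f}, symptom checker ~ ${checker:.2f}")
```

Either way, the recomputation reproduces the headline finding: the per-encounter labor cost of a hotline-only encounter is roughly 40 times that of a symptom checker-assisted encounter.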
The health system saved an estimated 4279 labor hours (1927 health navigator hours, 1879 registered nurse hours, and 473 scheduler hours) during this period as a result of diverting encounters to the symptom checker.
DISCUSSION
In this study, we estimated the efficiency gains attributable to allowing patients to self-triage and self-schedule tests and appointments for COVID-19 using an online symptom checker instead of using a telephone hotline. During the study period, the symptom checker was used more frequently than the telephone hotline for COVID-19-related symptoms or exposures. Since the symptom checker employs the same triage algorithm as the hotline, patient outcomes would be expected to be similar between the 2 methods.18 Encounters that involved the symptom checker resulted in lower use of labor resources than encounters using only the telephone hotline, resulting in over 4200 labor hours saved during the 90-day study period.
Notably, while telephone hotline encounter volumes remained relatively stable throughout the study period, online symptom checker encounters increased more than 10-fold during the surge of Omicron-variant cases. The telephone hotline quickly reached capacity during the surge and hold times increased, so patients may have increasingly turned to the online self-service tool. This finding reinforces the value of online self-service tools at times of high demand.
Symptom checkers and online self-triage tools are known to have a variety of benefits, including increased patient satisfaction,6,7 quicker connection of patients to care,18 and the potential to use the discrete data captured with this tool for a host of other purposes, such as to train machine learning models. Although experts have predicted operational benefits of self-triage tools to health systems,12,13 these have rarely been quantified. After implementation of a COVID-19 self-triage tool in France, emergency calls increased less than would be expected based on COVID-19-related hospitalizations.9 In Switzerland, more than two-thirds of self-triage tool users indicated that they would have contacted the healthcare system if the online tool had not been available.14 In a survey of users of the UCSF tool described here, >90% would have called, messaged, or in other ways contacted their healthcare providers if they did not have access to this tool.22 Descriptive studies of non-COVID online self-triage tools, such as for influenza, have also suggested the ability to improve operational efficiency and decrease utilization, but to our knowledge, none have quantified these effects.15–17
The findings from this study provide evidence of a clear operational and financial benefit to health systems of offering an online COVID-19 self-triage and self-scheduling tool, particularly in labor hours saved. During the COVID-19 pandemic, personnel such as registered nurses have been in short supply.23 For that reason, there may be additional operational benefits to being able to reassign telephone hotline staff to other clinical duties, such as cost savings on hiring, training, and supervising additional personnel. These labor savings must also be weighed against investment of informatics time to develop a COVID-19 self-triage and self-scheduling tool. We estimate that our tool required an initial combined 260 h of build time from a clinical engineer, project manager, and clinical informaticist, as well as 15 h a month for ongoing maintenance and enhancement. This investment came at a time of high demand for clinical informatics time, requiring careful consideration of institutional priorities.24,25
Despite the enthusiasm around symptom checkers, there has also been skepticism regarding whether they can truly shift triage demand away from frontline staff. If a patient uses a symptom checker but still calls clinic staff—for example, due to lack of trust in or discomfort using technology—they use more health system resources than they would have if the symptom checker did not exist. Our study suggests that, in the context of an EHR-integrated COVID-19 self-triage and self-scheduling tool, this concern is unfounded. When given the option, patients preferentially used the online tool, and 98% of the time did not subsequently call the hotline. Furthermore, patients who used the online tool were more than 200 times more likely to self-schedule an appointment online than to call the hotline to schedule.
Certain patient groups were more likely to use the hotline than the symptom checker, including older patients and those with limited English proficiency. These findings are consistent with literature on differential use of symptom checkers and other digital health tools by those with lower access to or comfort using technology,26,27 and reinforce the need to design digital tools that are accessible to users with limited English proficiency. Despite this inequality, digital health tools like this may still lead to improved care for these patients. By diverting English-speaking patients away from multilingual telephone hotlines, wait times may be reduced for patients with limited English proficiency. We launched a Spanish-language version of the symptom checker in February 2021 but have observed low use rates. More research is needed on specific methods for improving equitable access to and use of these tools, such as by minimizing technical barriers to use (eg, logins, app downloads, high bandwidth requirements) and making tools available in multiple languages.
To our knowledge, this is the first analysis of the impact of an EHR-integrated symptom checker on telephone triage calls and labor costs. This is important because the cost avoidance attributable to a symptom checker is partially dependent on efficiently connecting patients to the right appointment types. Patients with acute symptoms may not know where or how to get care, so they will continue to seek triage advice. In the absence of that advice, patients may seek higher acuity care such as urgent care centers and emergency rooms.28 Online self-triage and self-scheduling allows patients to be connected to the right level of care quickly and efficiently, without using triage staff resources. Fully integrated symptom checkers have distinct advantages in that patient responses can be documented in the medical record, and patients can be connected directly to appropriate care and scheduled for necessary appointments.18 Health systems may be hesitant to devote resources to developing or integrating digital tools like symptom checkers—despite their known popularity with patients—because of uncertain return on investment; analyses like this one are therefore crucial to fuel innovation.
This study has several limitations. First, we based calculations on the average number and duration of telephone interactions per encounter rather than the exact number or duration of calls for each encounter, because calls were not consistently documented or timed. By assuming only one call of each type per encounter, we may have underestimated the actual amount of time and effort expended by hotline staff. Second, we did not quantify patient time saved in this analysis. However, our prior work demonstrated substantial time savings for patients, with a median of 2 h and 15 min saved from initiation of triage to scheduling a visit.18 Third, this study was conducted at one health system with a specific workflow for managing COVID-related calls, and where COVID-19 testing was available only with an appointment. The calculated labor costs will therefore be only an estimate of what could be expected at other institutions and may be affected by different workflows and hotline staff wages. There were several workflow innovations at our institution (eg, a triage module within the EHR) that decreased call duration for telephone triagers, without which call durations and costs (and therefore cost savings from the online tool) would have been significantly higher. Fourth, it is possible that there are differences in illness severity between patient populations leading to different utilization patterns, or that patients had different clinical outcomes between the groups. However, internally collected disposition data have not shown any such differences, and the use of an identical triage algorithm for telephone or online self-triage makes major differences unlikely. Fifth, the study period coincided with what is typically influenza season.
However, based on California Department of Public Health data, influenza activity was sporadic during this time, with a test positivity rate of only 1.2%, so it is unlikely that it significantly affected the number of symptom checker users.29 Finally, operational efficiency calculations assume that most people who use the symptom checker would have otherwise called the hotline or in some other way contacted their healthcare providers. This is well supported in the literature.14,22 To further account for this assumption, we excluded asymptomatic, unexposed patients from our analysis. Based on prior studies, these patients are less likely to receive labor-intensive telephonic or in-person care than symptomatic or exposed patients.18,30,31 It is also possible that operational efficiency gains will be attenuated if there continues to be a shift toward self-testing at home, rather than appointment-based polymerase chain reaction testing.
It is also worth noting that, for efficiency gains and decreased labor requirements to translate into actual cost savings, organizations must have flexible staffing models that allow clinical staff to be reassigned to other mission-critical assignments if call volumes decrease. In the longer term, symptom checkers may be factored into staffing planning for clinics and hotlines, especially once their effects on resource utilization are well established.
CONCLUSION
As demonstrated by this study, use of a COVID-19 symptom checker with self-triage and self-scheduling capabilities led to significant operational efficiency gains and cost savings on labor. Faced with high demand for COVID-19 testing and care amid staffing shortages, health systems urgently need automated, self-service tools. Symptom checkers can help address health system inefficiencies and improve the resilience of health systems during pandemics.
FUNDING
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
AUTHOR CONTRIBUTIONS
Study conception and design: TJJ, AYO, LP, MM, ABN, GS, RG. Data collection: TJJ. Analysis and interpretation of results: TJJ, LP, AT. Draft manuscript preparation: TJJ, LP, AT, MM, ABN, GS, RG, AYO. All authors reviewed the results and approved the final version of the manuscript.
ACKNOWLEDGMENTS
The authors recognize the contributions of Chris Miller and Aimee Williams, who were instrumental in building and launching the COVID-19 Symptom Checker, and Michael Helle, who led the creation of the COVID-19 hotline. The authors also thank the many others from the UCSF Clinical Innovation Center, Clinical Systems, Center for Digital Health Innovation, and Office of Population Health who helped to design and implement the COVID-19 triage algorithm, hotline workflow, and/or symptom checker.
CONFLICT OF INTEREST STATEMENT
TJJ has received consulting fees and equity from Assure Health. ABN has received grant support from Cisco Systems Inc., Royal Philips and Eli Lilly and has received consulting fees from Medtronic, Sanofi, Intuity Medical, Steady Health, Eli Lilly and Roche. RG serves as an advisor to Phreesia, Inc and Healthwise, Inc. AYO has received grant support from Microsoft Research, Pfizer Inc, and Hatchleave.ai and has received consulting fees from Vsee, LLC.
Contributor Information
Timothy J Judson, Department of Medicine, University of California San Francisco, San Francisco, California, USA; Center for Digital Health Innovation, University of California San Francisco, San Francisco, California, USA; Office of Population Health, University of California San Francisco, San Francisco, California, USA.
Logan Pierce, Department of Medicine, University of California San Francisco, San Francisco, California, USA; Center for Digital Health Innovation, University of California San Francisco, San Francisco, California, USA.
Avi Tutman, Office of Population Health, University of California San Francisco, San Francisco, California, USA.
Michelle Mourad, Department of Medicine, University of California San Francisco, San Francisco, California, USA; Center for Digital Health Innovation, University of California San Francisco, San Francisco, California, USA.
Aaron B Neinstein, Department of Medicine, University of California San Francisco, San Francisco, California, USA; Center for Digital Health Innovation, University of California San Francisco, San Francisco, California, USA.
Gina Shuler, Office of Population Health, University of California San Francisco, San Francisco, California, USA.
Ralph Gonzales, Department of Medicine, University of California San Francisco, San Francisco, California, USA; Clinical Innovation Center, University of California San Francisco, San Francisco, California, USA.
Anobel Y Odisho, Center for Digital Health Innovation, University of California San Francisco, San Francisco, California, USA; Department of Urology, University of California San Francisco, San Francisco, California, USA.
Data Availability
The data underlying this article cannot be shared publicly to protect user privacy and internal operations data. The data may be shared on reasonable request to the corresponding author.
REFERENCES
- 1. Munsch N, Martin A, Gruarin S, et al. Diagnostic accuracy of web-based COVID-19 symptom checkers: comparison study. J Med Internet Res 2020; 22 (10): e21299.
- 2. Mansab F, Bhatti S, Goyal D. Performance of national COVID-19 “symptom checkers”: a comparative case simulation study. BMJ Health Care Inform 2021; 28 (1): e100187.
- 3. Winn AN, Somai M, Fergestrom N, et al. Association of use of online symptom checkers with patients’ plans for seeking care. JAMA Netw Open 2019; 2 (12): e1918561.
- 4. Semigran HL, Linder JA, Gidengil C, et al. Evaluation of symptom checkers for self diagnosis and triage: audit study. BMJ 2015; 351: h3480.
- 5. Miner AS, Laranjo L, Kocaballi AB. Chatbots in the fight against the COVID-19 pandemic. NPJ Digit Med 2020; 3: 65.
- 6. Chambers D, Cantrell AJ, Johnson M, et al. Digital and online symptom checkers and health assessment/triage services for urgent health problems: systematic review. BMJ Open 2019; 9 (8): e027743.
- 7. Meyer AND, Giardina TD, Spitzmueller C, et al. Patient perspectives on the usefulness of an artificial intelligence-assisted symptom checker: cross-sectional survey study. J Med Internet Res 2020; 22 (1): e14679.
- 8. Lai L, Wittbold KA, Dadabhoy FZ, et al. Digital triage: novel strategies for population health management in response to the COVID-19 pandemic. Healthc (Amst) 2020; 8 (4): 100493.
- 9. Galmiche S, Rahbe E, Fontanet A, et al. Implementation of a self-triage web application for suspected COVID-19 and its impact on emergency call centers: observational study. J Med Internet Res 2020; 22 (11): e22924.
- 10. Schrager J, Schuler K, Wright D, et al. Development and usability testing of a web-based COVID-19 self-triage platform. West J Emerg Med 2020; 21 (5): 1054–8.
- 11. Herriman M, Meer E, Rosin R, et al. Asked and answered: building a chatbot to address COVID-19-related concerns. NEJM Catal Innov Care Deliv. June 18, 2020. https://catalyst.nejm.org/doi/full/10.1056/CAT.20.0230.
- 12. Wyatt JC. Fifty million people use computerised self triage. BMJ 2015; 351: h3727.
- 13. Frank SR. Digital health care—the convergence of health care and the internet. J Ambul Care Manage 2000; 23 (2): 8–17.
- 14. Hautz WE, Exadaktylos A, Sauter TC. Online forward triage during the COVID-19 outbreak. Emerg Med J 2021; 38 (2): 106–8.
- 15. Nagykaldi Z, Calmbach W, DeAlleaume L, et al. Facilitating patient self-management through telephony and web technologies in seasonal influenza. Inform Prim Care 2010; 18 (1): 9–16.
- 16. Rosenbloom ST, Daniels TL, Talbot TR, et al. Triaging patients at risk of influenza using a patient portal. J Am Med Inform Assoc 2012; 19 (4): 549–54.
- 17. Kellermann AL, Isakov AP, Parker R, et al. Web-based self-triage of influenza-like illness during the 2009 H1N1 influenza pandemic. Ann Emerg Med 2010; 56 (3): 288–94.e6.
- 18. Judson TJ, Odisho AY, Neinstein AB, et al. Rapid design and implementation of an integrated patient self-triage and self-scheduling tool for COVID-19. J Am Med Inform Assoc 2020; 27 (6): 860–6.
- 19. Centers for Disease Control and Prevention. Evaluating and testing persons for coronavirus disease 2019 (COVID-19). 2020. https://stacks.cdc.gov/view/cdc/85933. Accessed August 29, 2022.
- 20. Briggs J. Telephone Triage Protocols for Nurses. 5th ed. Philadelphia, PA: Lippincott Williams and Wilkins; 2015.
- 21. U.S. Bureau of Labor Statistics. Employer costs for employee compensation summary. 2022. https://www.bls.gov/news.release/ecec.nr0.htm. Accessed August 29, 2022.
- 22. Liu A, Odisho A, Brown W, et al. Patient experience and feedback after use of an EHR-integrated COVID-19 symptom checker [published online ahead of print August 6, 2022]. JMIR Hum Factors 2022. doi: 10.2196/40064.
- 23. COVID-19’s Impact on Nursing Shortages, the Rise of Travel Nurses, and Price Gouging. Health Affairs Forefront. January 28, 2022. doi: 10.1377/forefront.20220125.695159.
- 24. Hsu H, Greenwald PW, Laghezza MR, et al. Clinical informatics during the COVID-19 pandemic: lessons learned and implications for emergency department and inpatient operations. J Am Med Inform Assoc 2021; 28 (4): 879–89.
- 25. Lin C-T, Bookman K, Sieja A, et al. Clinical informatics accelerates health system adaptation to the COVID-19 pandemic: examples from Colorado. J Am Med Inform Assoc 2020; 27 (12): 1955–63.
- 26. Morse KE, Ostberg NP, Jones VG, et al. Use characteristics and triage acuity of a digital symptom checker in a large integrated health system: population-based descriptive study. J Med Internet Res 2020; 22 (11): e20549.
- 27. Brewer LC, Fortuna KL, Jones C, et al. Back to the future: achieving health equity through health informatics and digital health. JMIR Mhealth Uhealth 2020; 8 (1): e14512.
- 28. Coster JE, Turner JK, Bradbury D, et al. Why do people choose emergency and urgent care services? A rapid review utilizing a systematic literature search and narrative synthesis. Acad Emerg Med 2017; 24 (9): 1137–49.
- 29. California Influenza Surveillance Program. Influenza and other respiratory viruses weekly report. 2022. https://www.cdph.ca.gov/Programs/CID/DCDC/CDPH%20Document%20Library/Immunization/Week2021-2203_FINALReport.pdf. Accessed August 16, 2022.
- 30. Margolius D, Hennekes M, Yao J, et al. On the front (phone) lines: results of a COVID-19 hotline. J Am Board Fam Med 2021; 34 (Suppl): S95–102.
- 31. Cheng A, Angier H, Huguet N, et al. Launching a statewide COVID-19 primary care hotline and telemedicine service. J Am Board Fam Med 2021; 34 (Suppl): S170–8.
- 32. Eby K. Coronavirus timeline: tracking major moments of COVID-19 pandemic in San Francisco Bay Area. ABC7 News. 2022. https://abc7news.com/timeline-of-coronavirus-us-covid-19-bay-area-sf/6047519/. Accessed August 29, 2022.