Abstract
Triage errors are a major concern in health care because they can result in harmful delays in treatment or inappropriate allocation of resources. With the increasing popularity of digital symptom checkers in pre–primary care settings, and amid claims that artificial intelligence outperforms doctors, the accuracy of triage by digital symptom checkers is under ever closer scrutiny. This paper examines the context and challenges of triage in primary care, pre–primary care, and emergency care, and reviews existing evidence on the prevalence of triage errors in all three settings. Implications for development, research, and practice are highlighted, and recommendations are made on how digital symptom checkers can best be positioned.
Keywords: triage errors, pre-primary care, digital symptom checker, primary care, viewpoint, triage, symptom checker, emergency care
Introduction
Across health care settings globally, the inability of supply (health care resources) to meet demand (the need of individuals for health care advice) means significant limitations exist on access to medical assessments and treatments. Safe, effective, and fair distribution of health care resources therefore requires some form of filtering and direction, or triage, of individuals within health care services based on type or severity of symptoms and/or initial likely diagnoses.
Emerging health technologies have the potential to provide answers to this problem, in supporting the initial assessment of individuals presenting with symptoms to ensure that they access the right area of the health system with the appropriate degree of urgency. Digital symptom checkers represent one approach, providing users with triage recommendations based on their presenting symptoms and responses to screening questions. However, the extent to which digital symptom checkers can safely be used alongside or in place of existing forms of initial medical assessment is currently unclear, with the potential significance of error in triage recommendation being substantial.
In this article, we discuss existing evidence on triage errors in pre–primary care (using digital symptom checkers), in comparison with primary care and emergency care, and provide recommendations on how digital symptom checkers might be best positioned to support users and existing health systems.
What Are Triage Errors?
The Oxford English Dictionary defines triage as “the assignment of degrees of urgency of need in order to decide the order of treatment of a large number of injured or ill patients.” The sorting of patients into emergency, urgent, nonurgent, and self-care categories becomes essential in all health care settings where there is a need to manage allocation of limited health care resources [1].
Triage errors can be described as either undertriage or overtriage. Undertriage occurs when the level of urgency of an individual’s condition is underestimated [2] and they are allocated to less urgent health services or treatments than they need, potentially resulting in worsening of their condition. Overtriage refers to inappropriate allocation of health care resources to individuals whose health care needs are less significant [2]. This may lead to unnecessary use of scarce health resources and may also have a direct detrimental impact on affected individuals through unneeded (and potentially harmful) investigations or treatments [3,4].
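For illustration, the sketch below (using hypothetical data and urgency levels, not drawn from any of the cited studies) shows how undertriage and overtriage rates can be computed when each triage decision is compared against a reference urgency level:

```python
# Illustrative sketch (hypothetical data): undertriage and overtriage rates are
# computed by comparing the urgency level assigned at triage with a reference
# urgency level, where a higher number means greater urgency.
from dataclasses import dataclass

@dataclass
class TriageRecord:
    assigned_urgency: int   # level given at triage (eg, 1 = self-care ... 4 = emergency)
    reference_urgency: int  # level judged correct on expert review (the reference standard)

def triage_error_rates(records):
    """Return the proportions of cases undertriaged, overtriaged, and triaged correctly."""
    n = len(records)
    under = sum(r.assigned_urgency < r.reference_urgency for r in records)  # urgency underestimated
    over = sum(r.assigned_urgency > r.reference_urgency for r in records)   # urgency overestimated
    return {"undertriage": under / n, "overtriage": over / n, "correct": (n - under - over) / n}

# Made-up example: two correct decisions, one undertriage, one overtriage.
cases = [
    TriageRecord(assigned_urgency=4, reference_urgency=4),
    TriageRecord(assigned_urgency=1, reference_urgency=1),
    TriageRecord(assigned_urgency=2, reference_urgency=4),  # undertriage
    TriageRecord(assigned_urgency=3, reference_urgency=1),  # overtriage
]
print(triage_error_rates(cases))  # {'undertriage': 0.25, 'overtriage': 0.25, 'correct': 0.5}
```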
Triage in Pre–Primary Care: Context and Challenges
Triage is likely to take place at many stages of a patient’s symptomatic and diagnostic journey, from initial awareness of symptoms through to final established diagnosis and definitive management or resolution of symptoms. Experiencing symptoms is common and frequently does not require medical assessment or treatment [5]. Most individuals will filter and prioritize symptoms that they experience based on factors including personal health beliefs, previous experiences, and informal sources of health information, and seek health care based on the perceived severity of their symptoms/condition, as well as local health system rules, access, and availability.
It has been suggested that the “pre–primary care” health sector, where individuals have reached the stage of considering seeking formal advice on their symptoms but have not yet seen a physician, should be the target of new technological approaches to triage [6]. Building on contemporary interest in self-care, the use of digital technologies to provide detailed and accurate advice and triage provision to support individuals in “self-triage” could enable them to manage their medical problems themselves where possible, or direct them to services of a type and urgency appropriate to their symptoms or condition [7,8].
Digital forms of consultation and triage lack any opportunity for physical examination or for other human interaction, where subtle cues may be picked up. Fully digital consultation systems often lack access to users’ medical histories and are entirely dependent on the data entered by users at the time of consultation. These limitations mean that errors are inevitable. Although face-to-face consultation is often viewed as the gold standard of primary care, it is not free from limitations either; these might arise from biases and cultural differences between clinicians and patients (for instance, some patients may be reluctant to have blood drawn due to their religious beliefs) [9,10]. To consider the acceptability or otherwise of such errors, it is necessary to understand the extent of error in existing health care triage, through both face-to-face and telephone consultations.
Existing Evidence on Triage Errors
Triage Errors in Primary Care
Studies that investigate triage errors in primary care are scarce. A systematic review assessing the safety of telephone triage in out-of-hours care compared with standard face-to-face doctor assessment suggested that triage was safe in 97% (95% CI 96.5%-97.4%) of all patients contacting out-of-hours care and in 89% (95% CI 86.7%-90.2%) of patients with high urgency [11]. This reduced to 46% (95% CI 42.7%-49.8%) when high-risk groups were examined [11]. A triage system in Belgium reported a comparable level of accuracy (98%) when a new French-language algorithm was used [12]. This seems to be consistent with reported rates of triage errors since the 1970s [13]. However, a more recent study in Belgium that compared the triage decisions made by telephone operators with those made by physicians showed a lower level of accuracy [14]. According to the physicians, the operators’ advice was correct in 71% of cases, with urgency underestimated in 12% and overestimated in 17% [14].
Although some primary care telephone triage is done by doctors, much is done by nurses, sometimes using computer-based clinical decision support systems [15]. A study assessing the safety of telephone triage in general practitioner cooperatives found that triage nurses estimated the level of urgency correctly in 69% of patients and underestimated it in 19% [16]. A similar study in the Netherlands reported a comparable rate of triage errors (ie, the level of care was underestimated in 17% of patients and overestimated in 19%). In Belgium, both the undertriage and overtriage rates were slightly lower, at 10% and 13%, respectively, of all patients who contacted the out-of-hours telephone service [17]. In the same study, general practitioners and nurses were found to agree on the level of urgency in 77% of all contacts [17].
Triage Errors in Emergency Care
In emergency department settings, triage error rates appear to be markedly higher. Tam et al [18] found that triage accuracy in a number of multicenter and single-center studies was only around 60%, with about 23% of cases undertriaged. A similar rate of triage errors was indicated in a US study, in which emergency nurse triage was accurate for only 54% of patients with acute myocardial infarction [19]. Better triage accuracy was recorded in a study in South Korea, where a retrospective comparison of the records of patients admitted to two emergency departments against a gold standard method (based on a 5-level triage scale reviewed by medical experts) found disagreement in 14.7% of cases (10% overtriage and 5% undertriage) [20]. A comparable 17% triage error rate was reported in a study in Brazil using similar methods [21]. Although triage accuracy varied across studies and there is no standardized acceptable triage rate for all patients, the American College of Surgeons has suggested an acceptable undertriage rate of 5% and an overtriage rate of 25%-35% for trauma patients [18,22]. It is worth noting that relatively high overtriage rates may be seen in emergency care settings where access to rapid imaging or other investigations allows for subsequent “downgrading” of triage.
Triage Errors in Digital Symptom Checkers
The accuracy of digital symptom checkers in providing triage has been met with skepticism. Evidence in this area is limited, but vignette studies have suggested that triage error rates for digital symptom checkers are high [23,24]. One study compared 12 publicly available symptom checkers and reported that only 51% of triage decisions for the top 5 diagnoses were correct [23]. However, this mean figure may be skewed by the wide range of triage accuracy between the least and most accurate symptom checkers (22%-72%) [23]. Rates of triage errors increase with condition urgency [23,25]; for example, the level of urgency was appropriately assessed in only a small proportion of emergency cases with ophthalmic diagnoses (39%, 95% CI 14%-64%) [26]. When applied in emergency department settings, symptom checkers were reported to be inadequately sensitive to emergency cases, with triage accuracy of 45%-75% [27-30]. However, in a recent study of digital patient self-triage in a hospital emergency department, the digital tool showed higher sensitivity to high-acuity conditions and similar specificity for low-acuity conditions compared with standard nurse triage using the Manchester Triage System, although it also tended to overtriage patients [31].
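For readers less familiar with these measures, the sketch below (illustrative only, with made-up numbers rather than data from the cited studies) shows how sensitivity to high-acuity cases and specificity for low-acuity cases are derived from paired triage outcomes:

```python
# Illustrative sketch: sensitivity to high-acuity cases and specificity for
# low-acuity cases, derived from paired triage outcomes. "Truly high acuity" here
# stands for whatever reference standard a study uses; the figures are made up.
def sensitivity_specificity(pairs):
    """pairs: iterable of (flagged_high_acuity: bool, truly_high_acuity: bool)."""
    tp = sum(flag and truth for flag, truth in pairs)              # high acuity, correctly flagged
    fn = sum((not flag) and truth for flag, truth in pairs)        # high acuity, missed (undertriage)
    tn = sum((not flag) and (not truth) for flag, truth in pairs)  # low acuity, correctly not flagged
    fp = sum(flag and (not truth) for flag, truth in pairs)        # low acuity, flagged (overtriage)
    sensitivity = tp / (tp + fn)  # share of high-acuity cases the tool catches
    specificity = tn / (tn + fp)  # share of low-acuity cases it leaves alone
    return sensitivity, specificity

# Hypothetical example: 8 of 10 high-acuity cases flagged; 15 of 20 low-acuity cases not flagged.
pairs = [(True, True)] * 8 + [(False, True)] * 2 + [(False, False)] * 15 + [(True, False)] * 5
print(sensitivity_specificity(pairs))  # (0.8, 0.75)
```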
Triage advice provided by symptom checkers has been found to be more risk averse than that provided by health care professionals [30,32], with 85% of users advised to see their doctor in one study [33]. However, a 5-year follow-up evaluation found that symptom checkers were less risk averse in 2020 (odds of overtriage to undertriage errors of 1.11:1) than in 2015 (2.82:1) [24]. Nevertheless, triage errors in emergencies remain high, with 40% of emergency cases missed by symptom checkers [24].
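To put these odds in context, the sketch below (a back-of-envelope conversion of the odds cited above, not values reported by the study itself) translates the odds of overtriage to undertriage errors into the approximate share of all triage errors that are overtriage:

```python
# Convert odds of overtriage:undertriage errors into the share of all triage
# errors that are overtriage, assuming every error is either over- or undertriage.
def overtriage_share(odds):
    return odds / (1 + odds)

print(f"2015: {overtriage_share(2.82):.0%} of errors were overtriage")  # ~74%
print(f"2020: {overtriage_share(1.11):.0%} of errors were overtriage")  # ~53%
```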
Although most studies regarding the accuracy of symptom checkers were carried out through clinical vignettes [23,24], some clinical trials have been conducted to compare the rates of triage error of face-to-face consultation with a physician and digital symptom assessment technologies [34-37]. Results from these clinical trials show that while symptom checkers did not perform as well as face-to-face consultation, correct triage for certain health conditions was still achieved in a higher proportion of patients than expected [37]. Some symptom checkers were reported to attain a sensitivity level of over 50% [36], consistent with previous findings [23].
Evidence of triage error rates in primary care, in emergency care, and by symptom checkers is summarized in Table 1.
Table 1. Triage error rates in primary care, emergency care, and digital symptom checkers.

| Setting | Overtriage | Undertriage |
| --- | --- | --- |
| Primary care | 13%-19% | 10%-19% |
| Emergency care | 10%-35% | 5%-23% |
| Symptom checker | No specific rate of overtriage reported; mean triage accuracy of around 50%, with a range of 22%-72% | No specific rate of undertriage reported; mean triage accuracy of around 50%, with a range of 22%-72% |
Discussion
Summary
Triage error rates in primary and emergency care vary widely across the literature [38], and differing settings and definitions of triage make comparison difficult. The overall accuracy of out-of-hours telephone triage was between 69% and 98%. Undertriage rates ranged from 10% to 19% in primary care settings and from 5% to 23% in emergency settings; overtriage rates ranged from 13% to 19% in primary care settings and from 10% to 35% in emergency settings. Based on limited evidence, digital symptom checkers have relatively low triage accuracy, with a mean error rate of around 50% [25,30]. However, this mean is likely skewed by the most and least accurate tools, with accuracy ranging from 22% to 72% [23]. Although the errors tend to be over- rather than undertriage, with users advised to visit a doctor in 85% of cases in one study even when symptoms were appropriate for self-care [33], symptom checkers are becoming less risk averse over time [24].
Limitations
It is worth noting that this article is not a formal systematic review; thus, no specific search strategy or selection criteria were applied to our literature review. This might result in potentially relevant studies being missed, despite our best efforts to include appropriate studies on triage accuracy. However, from our consideration of the literature, we observed a high level of heterogeneity in the rates of triage errors across studies. The heterogeneity of triage error rates in primary care, pre–primary care, and emergency care is attributable to a number of factors. Most importantly, case mix and the approach to and purpose of triage differ substantially across these settings. The number and type of conditions considered in each study also differed. Although the majority of studies included a mix of acute and chronic conditions, some only considered one type of disease (eg, chronic mental health disorders). Studies that assessed triage accuracy across more conditions were more likely to report higher error rates. In addition, the methods used to identify triage errors were heterogeneous. Eight methods are commonly employed in assessing triage accuracy, namely autopsies, patient and provider surveys, standardized patients, second reviews, diagnostic testing audits, malpractice claims, case reviews, and voluntary reports [3]. Studies that used different methods were found to report significantly different rates of errors [3]. Finally, there appeared to be a lack of clarity in the definition and comparison of triage errors; some studies did not specify whether the triage errors were overtriage or undertriage. This lack of clarity and consistency means it is not possible to draw conclusions or make clear recommendations as to an acceptable error rate for symptom checkers.
Implications for Development and Practice
Consideration of triage error in primary care is particularly timely in the current unprecedented public health context. The recent COVID-19 pandemic has challenged the ability of health systems worldwide to meet demand, with services in some countries completely overwhelmed. A pressing need to avoid all but the most urgent and essential health service use, and to limit face-to-face interaction between health care professionals and members of the public to an absolute minimum, has led to the adoption of a “remote total triage” system in primary care using telephone and online consulting in many countries [39].
Whether digital symptom checkers’ triage performance is acceptable depends on the purposes for which they are used [25]. If symptom checkers are seen as a replacement for seeing physicians, they would currently be an inferior alternative [25]. However, if used by individuals to gather quick and accessible information about particular conditions, they are likely to be superior to self-directed internet searches using online search engines [25]. This is especially true when only the best-performing symptom checkers with low triage error rates are used. It is also worth noting that artificial intelligence technology is constantly improving, potentially allowing triage by digital symptom checkers to become more accurate and thus a safe and useful addition to traditional face-to-face consultations.
Although remote assessment systems seek to avoid unnecessary burden on health services, the lack of available background information and the inability to draw on physical examination or nonverbal cues mean that any such system will likely need to take a risk averse approach to triage. It is therefore arguably appropriate that digital triage tools adopt this approach.
Implications for Research and Development
Most studies assessing the triage error rates among symptom checkers are conducted through clinical vignettes. The preparation and evaluation of vignettes need to be standardized to allow for external validity and comparability. Furthermore, clinical trials where symptom checkers’ rates of triage error are compared with those of face-to-face consultation should be encouraged. This method not only enables the assessment of triage accuracy but also allows the examination of users’ compliance with triage advice and possible benefits for the health care system.
There is little evidence on users’ compliance with triage advice, whether from traditional forms of triage or from digital symptom checkers, nor are there data on the consequences of symptom checker errors. Additionally, little is currently known about patient expectations and health beliefs in relation to digital diagnostic and triage tools. It seems likely that most individuals would place lower weighting on the advice of a symptom checker than on that of a human clinician, and would use the information provided by these tools as part of their decision-making process. However, in times of increasing reliance on digital technology, it is possible that some individuals may have greater trust in these tools than might be expected. Research is clearly needed to clarify these questions, but developers should assume a relatively high degree of user reliance on the recommendations that symptom checkers provide.
In addition, well-conducted research to understand the clinical effectiveness of digital symptom checkers in effectively triaging individuals (ie, offering appropriate self-care advice or assigning to appropriate services) is urgently needed. To inform decisions of users and policy makers adequately, this must incorporate formal comparison with existing provision, with a focus on primary care telephone and online triage.
Recommendations
Digital symptom checkers should largely position themselves in the pre–primary care triage/self-care area, as evidence does not currently support the ability of artificial intelligence to provide effective consultations at the level of those that would normally take place in traditional face-to-face primary care or emergency department settings.
Digital symptom checkers can be appropriately promoted as a safer alternative/effective addition to existing sources of information (such as self-directed online searches) for individuals prior to seeking formal health advice.
Providers of digital symptom checkers should seek to ensure that triage error rates fall within the lower range of those demonstrated in existing evidence on primary care telephone triage (acknowledging the substantial limitations of this literature).
Developers of symptom checkers should make efforts to expand the evidence base in this area, including establishing systems to gain user feedback on triage accuracy/appropriateness, as well as engaging with academic partners to carry out formal research. Findings in terms of limitations and error rates should be clearly publicized and highlighted to users.
Current methods employed to study symptom checkers’ triage accuracy, such as case vignette studies, should be standardized to allow for external validity and comparability. Clinical trials whose key outcome measures include the accuracy of both the suggested conditions and the triage advice should also be conducted. A clear distinction between over- and undertriage should be made to provide data for safety monitoring and economic evaluation.
Conclusion
There is very limited evidence and no clear gold standard comparison for triage errors in digital symptom checkers, meaning that it is not possible to make recommendations on an acceptable error rate. Positioning symptom checkers in the self-care/pre–primary care triage setting therefore seems to be most appropriate and where they can likely add value for individuals experiencing symptoms. Industry and academics should work together to develop the necessary evidence, and efforts should be made to collect user feedback and outcomes data. Until clearer comparisons with existing care are available, digital symptom checkers and triage tools should appropriately continue to take a risk averse approach in the recommendations they give to users.
Acknowledgments
BH is grateful for the support of the National Institute for Health and Care Research (NIHR) under the Applied Research Collaborations programme for North West London. The views expressed in this publication are those of the authors and not necessarily those of the National Health Service (NHS), the NIHR, or the Department of Health.
Abbreviations
- NHS: National Health Service
- NIHR: National Institute for Health and Care Research
Footnotes
Authors' Contributions: HN and BH conceptualized and wrote the paper. AM provided insights into the accuracy of digital symptom checkers and contributed to writing the paper. KBD provided further insights in relation to emergency department triage. All authors read and approved the final manuscript.
Conflicts of Interest: HN works as an epidemiologist at Your.MD Ltd. AM works as VP Clinical Operations at Your.MD Ltd. BH is a general practitioner working in the National Health Service (NHS), and works for eConsult Health Ltd, a provider of asynchronous consultations for NHS primary, secondary, and urgent/emergency care. KBD is a registered general nurse working in the NHS and also works for eConsult Health Ltd as a Clinical Director for Urgent and Emergency Care.
References
- 1. Iserson KV, Moskop JC. Triage in medicine, part I: Concept, history, and types. Ann Emerg Med. 2007 Mar;49(3):275–81. doi: 10.1016/j.annemergmed.2006.05.019.
- 2. Cook CH, Muscarella P, Praba AC, Melvin WS, Martin LC. Reducing overtriage without compromising outcomes in trauma patients. Arch Surg. 2001 Jul 01;136(7):752–6. doi: 10.1001/archsurg.136.7.752.
- 3. Graber ML. The incidence of diagnostic error in medicine. BMJ Qual Saf. 2013 Oct 15;22 Suppl 2(Suppl 2):ii21–ii27. doi: 10.1136/bmjqs-2012-001615.
- 4. Hildebrandt DE, Westfall JM, Fernald DH, Pace WD. Harm resulting from inappropriate telephone triage in primary care. J Am Board Fam Med. 2006 Sep 01;19(5):437–42. doi: 10.3122/jabfm.19.5.437.
- 5. Elnegaard S, Andersen RS, Pedersen AF, Larsen PV, Søndergaard J, Rasmussen S, Balasubramaniam K, Svendsen RP, Vedsted P, Jarbøl DE. Self-reported symptoms and healthcare seeking in the general population--exploring "The Symptom Iceberg". BMC Public Health. 2015 Jul 21;15(1):685. doi: 10.1186/s12889-015-2034-5.
- 6. Carr-Brown J, Berlucchi M. Pre-Primary Care: An Untapped Global Health Opportunity. Your.MD. 2016. [2022-05-27]. https://assets.ctfassets.net/iqo3fk8od6t9/3NZzZ8F7tUi9QPCljAd1YH/43f0e8e262a35f7715ec4b96d1257312/report.pdf
- 7. Oliver D. David Oliver: Why force GP streaming on NHS emergency departments? BMJ. 2020 Mar 18;368:m992. doi: 10.1136/bmj.m992.
- 8. El-Osta A, Webber D, Gnani S, Banarsee R, Mummery D, Majeed A, Smith P. The Self-Care Matrix: A unifying framework for self-care. Self Care. 2019;10(3):38–56. https://selfcarejournal.com/article/the-self-care-matrix-a-unifying-framework-for-self-care/
- 9. FitzGerald C, Hurst S. Implicit bias in healthcare professionals: a systematic review. BMC Med Ethics. 2017 Mar 01;18(1):19. doi: 10.1186/s12910-017-0179-8.
- 10. Johnson RL, Saha S, Arbelaez JJ, Beach MC, Cooper LA. Racial and ethnic differences in patient perceptions of bias and cultural competence in health care. J Gen Intern Med. 2004 Feb;19(2):101–10. doi: 10.1111/j.1525-1497.2004.30262.x.
- 11. Huibers L, Smits M, Renaud V, Giesen P, Wensing M. Safety of telephone triage in out-of-hours care: a systematic review. Scand J Prim Health Care. 2011 Dec 29;29(4):198–209. doi: 10.3109/02813432.2011.629150.
- 12. Brasseur E, Servotte J, Donneau A, Stipulante S, d'Orio V, Ghuysen A. Triage for out-of-hours primary care calls: a reliability study of a new French-language algorithm, the SALOMON rule. Scand J Prim Health Care. 2019 Jun 29;37(2):227–232. doi: 10.1080/02813432.2019.1608057.
- 13. Albin SL, Wassertheil-Smoller S, Jacobson S, Bell B. Evaluation of emergency room triage performed by nurses. Am J Public Health. 1975 Oct;65(10):1063–8. doi: 10.2105/ajph.65.10.1063.
- 14. Morreel S, Colliers A, Remmen R, Verhoeven V, Philips H. How accurate is telephone triage in out-of-hours care? An observational trial in real patients. Acta Clin Belg. 2022 Apr 30;77(2):301–306. doi: 10.1080/17843286.2020.1839719.
- 15. Bunn F, Byrne G, Kendall S. The effects of telephone consultation and triage on healthcare use and patient satisfaction: a systematic review. Br J Gen Pract. 2005 Dec;55(521):956–61. https://bjgp.org/cgi/pmidlookup?view=long&pmid=16378566
- 16. Graversen DS, Christensen MB, Pedersen AF, Carlsen AH, Bro F, Christensen HC, Vestergaard CH, Huibers L. Safety, efficiency and health-related quality of telephone triage conducted by general practitioners, nurses, or physicians in out-of-hours primary care: a quasi-experimental study using the Assessment of Quality in Telephone Triage (AQTT) to assess audio-recorded telephone calls. BMC Fam Pract. 2020 May 09;21(1):84. doi: 10.1186/s12875-020-01122-z.
- 17. Philips H, Van Bergen J, Huibers L, Colliers A, Bartholomeeusen S, Coenen S, Remmen R. Agreement on urgency assessment between secretaries and general practitioners: an observational study in out-of-hours general practice service in Belgium. Acta Clin Belg. 2015 Oct;70(5):309–14. doi: 10.1179/2295333715Y.0000000017.
- 18. Tam HL, Chung SF, Lou CK. A review of triage accuracy and future direction. BMC Emerg Med. 2018 Dec 20;18(1):58. doi: 10.1186/s12873-018-0215-0.
- 19. Sanders SF, DeVon HA. Accuracy in ED triage for symptoms of acute myocardial infarction. J Emerg Nurs. 2016 Jul;42(4):331–7. doi: 10.1016/j.jen.2015.12.011.
- 20. Moon S, Shim JL, Park K, Park C. Triage accuracy and causes of mistriage using the Korean Triage and Acuity Scale. PLoS One. 2019 Sep 6;14(9):e0216972. doi: 10.1371/journal.pone.0216972.
- 21. Hinson JS, Martinez DA, Schmitz PSK, Toerper M, Radu D, Scheulen J, Stewart de Ramirez SA, Levin S. Accuracy of emergency department triage using the Emergency Severity Index and independent predictors of under-triage and over-triage in Brazil: a retrospective cohort analysis. Int J Emerg Med. 2018 Jan 15;11(1):3. doi: 10.1186/s12245-017-0161-8.
- 22. Zachariasse JM, van der Hagen V, Seiger N, Mackway-Jones K, van Veen M, Moll HA. Performance of triage systems in emergency care: a systematic review and meta-analysis. BMJ Open. 2019 May 28;9(5):e026471. doi: 10.1136/bmjopen-2018-026471.
- 23. Ceney A, Tolond S, Glowinski A, Marks B, Swift S, Palser T. Accuracy of online symptom checkers and the potential impact on service utilisation. PLoS One. 2021 Jul 15;16(7):e0254088. doi: 10.1371/journal.pone.0254088.
- 24. Schmieding ML, Kopka M, Schmidt K, Schulz-Niethammer S, Balzer F, Feufel MA. Triage accuracy of symptom checker apps: 5-year follow-up evaluation. J Med Internet Res. 2022 May 10;24(5):e31810. doi: 10.2196/31810.
- 25. Semigran HL, Linder JA, Gidengil C, Mehrotra A. Evaluation of symptom checkers for self diagnosis and triage: audit study. BMJ. 2015 Jul 08;351:h3480. doi: 10.1136/bmj.h3480.
- 26. Shen C, Nguyen M, Gregor A, Isaza G, Beattie A. Accuracy of a popular online symptom checker for ophthalmic diagnoses. JAMA Ophthalmol. 2019 Jun 01;137(6):690–692. doi: 10.1001/jamaophthalmol.2019.0571.
- 27. Yu SWY, Ma A, Tsang VHM, Chung LSW, Leung S, Leung L. Triage accuracy of online symptom checkers for Accident and Emergency Department patients. Hong Kong Journal of Emergency Medicine. 2019 Apr 16;27(4):217–222. doi: 10.1177/1024907919842486.
- 28. Luger TM, Houston TK, Suls J. Older adult experience of online diagnosis: results from a scenario-based think-aloud protocol. J Med Internet Res. 2014 Jan 16;16(1):e16. doi: 10.2196/jmir.2924.
- 29. Bisson LJ, Komm JT, Bernas GA, Fineberg MS, Marzo JM, Rauh MA, Smolinski RJ, Wind WM. Accuracy of a computer-based diagnostic program for ambulatory patients with knee pain. Am J Sports Med. 2014 Oct 29;42(10):2371–6. doi: 10.1177/0363546514541654.
- 30. Chambers D, Cantrell AJ, Johnson M, Preston L, Baxter SK, Booth A, Turner J. Digital and online symptom checkers and health assessment/triage services for urgent health problems: systematic review. BMJ Open. 2019 Aug 01;9(8):e027743. doi: 10.1136/bmjopen-2018-027743.
- 31. Dickson SJ, Dewar C, Richardson A, Hunter A, Searle S, Hodgson LE. Agreement and validity of electronic patient self-triage (eTriage) with nurse triage in two UK emergency departments: a retrospective study. Eur J Emerg Med. 2022 Feb 01;29(1):49–55. doi: 10.1097/MEJ.0000000000000863.
- 32. Gilbert S, Mehl A, Baluch A, Cawley C, Challiner J, Fraser H, Millen E, Montazeri M, Multmeier J, Pick F, Richter C, Türk E, Upadhyay S, Virani V, Vona N, Wicks P, Novorol C. How accurate are digital symptom assessment apps for suggesting conditions and urgency advice? A clinical vignettes comparison to GPs. BMJ Open. 2020 Dec 16;10(12):e040269. doi: 10.1136/bmjopen-2020-040269.
- 33. Nijland N, Cranen K, Boer H, van Gemert-Pijnen JEWC, Seydel ER. Patient use and compliance with medical advice delivered by a web-based triage system in primary care. J Telemed Telecare. 2010 Jan 19;16(1):8–11. doi: 10.1258/jtt.2009.001004.
- 34. Martin SS, Quaye E, Schultz S, Fashanu OE, Wang J, Saheed MO, Ramaswami P, de Freitas H, Ribeiro-Neto B, Parakh K. A randomized controlled trial of online symptom searching to inform patient generated differential diagnoses. NPJ Digit Med. 2019 Nov 11;2(1):110. doi: 10.1038/s41746-019-0183-0.
- 35. Knitza J, Muehlensiepen F, Ignatyev Y, Fuchs F, Mohn J, Simon D, Kleyer A, Fagni F, Boeltz S, Morf H, Bergmann C, Labinsky H, Vorbrüggen W, Ramming A, Distler JHW, Bartz-Bazzanella P, Vuillerme N, Schett G, Welcker M, Hueber AJ. Patient's perception of digital symptom assessment technologies in rheumatology: results from a multicentre study. Front Public Health. 2022 Feb 22;10:844669. doi: 10.3389/fpubh.2022.844669.
- 36. Knitza J, Mohn J, Bergmann C, Kampylafka E, Hagen M, Bohr D, Morf H, Araujo E, Englbrecht M, Simon D, Kleyer A, Meinderink T, Vorbrüggen W, von der Decken CB, Kleinert S, Ramming A, Distler JHW, Vuillerme N, Fricker A, Bartz-Bazzanella P, Schett G, Hueber AJ, Welcker M. Accuracy, patient-perceived usability, and acceptance of two symptom checkers (Ada and Rheport) in rheumatology: interim results from a randomized controlled crossover trial. Arthritis Res Ther. 2021 Apr 13;23(1):112. doi: 10.1186/s13075-021-02498-8.
- 37. Proft F, Spiller L, Redeker I, Protopopov M, Rodriguez VR, Muche B, Rademacher J, Weber A, Lüders S, Torgutalp M, Sieper J, Poddubnyy D. Comparison of an online self-referral tool with a physician-based referral strategy for early recognition of patients with a high probability of axial SpA. Semin Arthritis Rheum. 2020 Oct;50(5):1015–1021. doi: 10.1016/j.semarthrit.2020.07.018.
- 38. Schiff GD, Kim S, Abrams R, Cosby K, Lambert B, Elstein AS, Hasler S. Diagnosing diagnosis errors: lessons from a multi-institutional collaborative project. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville, MD: Agency for Healthcare Research and Quality (US); 2005.
- 39. Advice on how to establish a remote 'total triage' model in general practice using online consultations. National Health Service. 2020. [2022-05-27]. https://www.england.nhs.uk/coronavirus/documents/advice-on-how-to-establish-a-remote-total-triage-model-in-general-practice-using-online-consultations/