Journal of the American College of Emergency Physicians Open. 2021 Aug 9;2(4):e12534. doi: 10.1002/emp2.12534

Artificial intelligence MacHIne learning for the detection and treatment of atrial fibrillation guidelines in the emergency department setting (AIM HIGHER): Assessing a machine learning clinical decision support tool to detect and treat non‐valvular atrial fibrillation in the emergency department

Kim Schwab 1,2, Dacloc Nguyen 1, GilAnthony Ungab 2, Gregory Feld 3, Alan S Maisel 4, Martin Than 5, Laura Joyce 5, W Frank Peacock 6
PMCID: PMC8353018  PMID: 34401870

Abstract

Objective

Advanced machine learning technology provides an opportunity to improve clinical electrocardiogram (ECG) interpretation, allowing non‐cardiology clinicians to initiate care for atrial fibrillation (AF). The Lucia Atrial Fibrillation Application (Lucia App) analyzes a photograph of the ECG to detect the rhythm, calculates CHA2DS2‐VASc and HAS‐BLED scores, and then provides guideline‐recommended anticoagulation advice. Our purpose was to determine the rate of accurate AF identification and appropriate anticoagulation recommendations in emergency department (ED) patients ultimately diagnosed with AF.

Methods

We performed a single‐center, observational, retrospective chart review in an urban California ED with an annual census of 70,000 patients. A convenience sample of ED patients with AF as a primary or secondary discharge diagnosis was evaluated for accurate ED AF diagnosis and ED anticoagulation rates. This was done by comparing the Lucia App against a gold standard board‐certified cardiologist diagnosis and by applying the American College of Emergency Physicians AF anticoagulation guidelines.

Results

Two hundred ninety‐seven patients were enrolled from January 2016 through December 2019. The median age was 79 years and 44.1% were female. Compared to the gold standard diagnosis, the Lucia App detected AF in 98.3% of cases. Physicians recommended guideline‐consistent anticoagulation therapy in 78.5% of cases versus 98.3% for the Lucia App. Of the patients with indications for anticoagulation who were discharged from the ED, only 25.0% were started on anticoagulation at discharge.

Conclusion

Use of a cloud‐based ECG identification tool can allow non‐cardiologists to achieve similar rates of AF identification as board‐certified cardiologists and achieve higher rates of guideline‐recommended anticoagulation therapy in the ED.

Keywords: artificial intelligence, atrial fibrillation, clinical decision support, emergency department, guidelines, machine learning, oral anticoagulation

1. INTRODUCTION

1.1. Background

It is estimated that 5.2 million Americans are living with atrial fibrillation (AF), accounting for 4.6 million annual emergency department (ED) visits.1, 2 Each year an estimated 600,000 newly diagnosed AF patients visit the ED, and this number is increasing.2 The total number of patients with AF is expected to reach 12.1 million by the year 2030.1

Anticoagulation therapy is one of the most effective strategies to reduce cardiovascular morbidity and mortality in patients with non‐valvular atrial fibrillation (NVAF) and is recommended in both the American Heart Association/American College of Cardiology/Heart Rhythm Society (2014 and 2019) and American College of Emergency Physicians (ACEP) guidelines.3, 4, 5 Although these guidelines support the initiation of anticoagulation in the ED, real‐world studies have demonstrated that as many as 53% of high‐risk patients are discharged from the ED without anticoagulation.6

The decision to anticoagulate requires individual assessment of stroke risk and risk of bleeding. Use of risk tools, such as CHA2DS2‐VASc (congestive heart failure, hypertension, age ≥ 75, diabetes mellitus, prior stroke or TIA or thromboembolism, vascular disease, age 65 to 74, sex category; score range 0 to 9) and HAS‐BLED (hypertension, abnormal renal and/or liver function, previous stroke, bleeding history or predisposition, labile international normalized ratios, elderly, and concomitant drugs and/or alcohol excess; score range 0 to 9), can help inform the choice of oral anticoagulation (OAC) agent and management strategy.7 The CHA2DS2‐VASc score is a validated tool to assess the risk of stroke and systemic emboli in patients with NVAF. OAC is recommended for patients with a score of 2 or greater, whereas treatment with OAC or aspirin may be considered for a score of 1. A HAS‐BLED score of 3 or greater indicates a high risk of bleeding; however, HAS‐BLED should not be used on its own to exclude patients from OAC therapy. Clinicians should evaluate each identifiable bleeding risk factor and determine whether OAC is appropriate.
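
To make the scoring concrete, the following minimal Python sketch tallies the two scores from the components listed above. The data structure and field names are our own illustration (the Lucia App's internal representation is not public), and some HAS‐BLED criteria (eg, uncontrolled hypertension, supratherapeutic INR) are simplified to yes/no flags.

```python
from dataclasses import dataclass

@dataclass
class AFPatient:
    # Hypothetical patient record; field names are illustrative only.
    age: int
    female: bool
    chf: bool                  # congestive heart failure
    hypertension: bool
    diabetes: bool
    prior_stroke_tia_te: bool  # prior stroke, TIA, or thromboembolism
    vascular_disease: bool
    abnormal_renal: bool
    abnormal_liver: bool
    bleeding_history: bool     # prior bleeding or predisposition
    labile_inr: bool
    bleeding_drugs: bool       # concomitant antiplatelets/NSAIDs
    alcohol_excess: bool

def cha2ds2_vasc(p: AFPatient) -> int:
    """CHA2DS2-VASc score (range 0-9)."""
    score = 0
    score += 1 if p.chf else 0
    score += 1 if p.hypertension else 0
    score += 2 if p.age >= 75 else (1 if 65 <= p.age <= 74 else 0)
    score += 1 if p.diabetes else 0
    score += 2 if p.prior_stroke_tia_te else 0
    score += 1 if p.vascular_disease else 0
    score += 1 if p.female else 0
    return score

def has_bled(p: AFPatient) -> int:
    """HAS-BLED score (range 0-9), with criteria simplified to flags."""
    score = 0
    score += 1 if p.hypertension else 0
    score += 1 if p.abnormal_renal else 0
    score += 1 if p.abnormal_liver else 0
    score += 1 if p.prior_stroke_tia_te else 0
    score += 1 if p.bleeding_history else 0
    score += 1 if p.labile_inr else 0
    score += 1 if p.age > 65 else 0          # "elderly"
    score += 1 if p.bleeding_drugs else 0
    score += 1 if p.alcohol_excess else 0
    return score
```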

The consequences of misdiagnosis and treatment failure contribute to morbidity and mortality. It has been reported that AF patients discharged from the ED without anticoagulation are 2.7x more likely to die, suffer a stroke, or be readmitted over the next year, compared to those receiving anticoagulation.8 A large multicenter study in patients previously diagnosed with AF found 84% of 94,474 patients were not on guideline‐recommended anticoagulation before their stroke.9 Finally, even among cardiologists, guideline compliance rates are inadequate. The National Cardiovascular Data Registry Practice Innovation and Clinical Excellence (PINNACLE) Registry found that 40% of moderate to high‐risk AF patients seen by a cardiologist receive inadequate anticoagulation.10

The advent of machine‐assisted diagnosis and anticoagulation recommendation prompts has held the promise of improved diagnostic accuracy, with the goal of higher rates of appropriate anticoagulation. Unfortunately, the results to date have been inconsistent. In one 2019 European study, computer‐based AF diagnoses were incorrect in at least 9% of cases, with ED and primary care physicians failing to correct the missed diagnoses in 47% of cases.11 Higher quality clinical decision support tools that leverage existing ED staff can be used to further optimize patient care.

The Lucia (Lucia Health Guidelines, San Francisco, CA) Atrial Fibrillation Application (Lucia App) leverages machine learning and cloud computing to interpret an ECG photograph and provide a rhythm determination. Once the clinician inputs the data points used to calculate the CHA2DS2‐VASc and HAS‐BLED scores, it then presents guideline‐compliant treatment recommendations.

The Lucia AF detection algorithm has been previously internally validated, though these data have not been published. The algorithm was tested against a set of randomly chosen 12‐lead resting ECGs from 1 hospital, including 335 patients adjudicated as having AF and 76 patients adjudicated as having normal sinus rhythm (NSR). The algorithm correctly identified the NSR patients 75 times, for a specificity of 96.8% (95% confidence interval [CI] = 92.9%, 99.8%), and correctly identified the AF patients 331 times, for a sensitivity of 98.8% (95% CI = 97.0%, 99.5%).12 Given the strong clinical accuracy of this test set and the retrospective nature of the current study, the study was designed as a proof of concept to test the entire clinical decision support tool, including AF guideline recommendations.
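
As a reproducibility aid, the short sketch below recomputes the validation metrics from the reported counts. The published intervals appear consistent with Wilson score intervals, so that method is assumed here.

```python
from statsmodels.stats.proportion import proportion_confint

def rate_with_ci(correct: int, total: int):
    """Point estimate plus a 95% Wilson score interval."""
    lo, hi = proportion_confint(correct, total, alpha=0.05, method="wilson")
    return correct / total, lo, hi

# Internal validation counts reported for the Lucia AF algorithm
sens, sens_lo, sens_hi = rate_with_ci(331, 335)  # AF ECGs correctly called AF
spec, spec_lo, spec_hi = rate_with_ci(75, 76)    # NSR ECGs correctly called NSR

print(f"Sensitivity {sens:.1%} (95% CI {sens_lo:.1%}-{sens_hi:.1%})")
print(f"Specificity {spec:.1%} (95% CI {spec_lo:.1%}-{spec_hi:.1%})")
```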

1.2. Importance

A clinical decision support tool that streamlines the detection and treatment of AF can improve guideline adherence, when appropriate, and reduce treatment times in the ED, especially in settings with limited access to cardiologists, with the ultimate goal of reducing the incidence of stroke.

1.3. Goals of this investigation

Our primary objective was to confirm the accuracy of the Lucia App to detect the presence of AF on 12‐lead resting ECG. Secondary objectives included determining the potential impact of app‐derived guideline‐consistent anticoagulation recommendations versus discharge therapy patterns and determining the accuracy of calculation and documentation of the CHA2DS2‐VASc and HAS‐BLED scores in the medical record.

2. METHODS

2.1. Study design and setting

We performed an observational cohort study consistent with the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines, using a convenience sample obtained by retrospective chart review from January 2016 to December 2019 and applying standard methodology.13, 14 The study was institutional review board approved and did not require informed consent, as all data were de‐identified when extracted from the chart and entered into the Lucia App. Patients were assigned a numerical study identifier that was entered into the app in place of usual patient identifiers. Data were collected by trained chart abstractors using a standardized collection form. Disagreements among abstractors were resolved by an independent panel of cardiology experts, blinded to the app results.

2.2. Selection of participants

Eligible patients were ≥ 18 years of age, presented to the ED, and were ultimately discharged directly from the ED or hospitalized and discharged from the hospital, with either a primary or secondary diagnosis of AF (all patients included in this analysis had a gold standard ECG diagnosis of AF). Patients were not eligible for inclusion if they were receiving anticoagulation for an indication other than AF, were already enrolled in another clinical study, were admitted with an acute stroke, had known valvular AF, were hemodynamically unstable at ED admission, or were transferred to another facility. Missing data were managed by complete case analysis (ie, cases with missing data were excluded).

2.3. Interventions

The gold standard AF diagnosis was determined by a board‐certified cardiologist, reading only the ED ECG and blinded to the app result. At this institution, all ECGs initially reviewed by the ED physician are sent to cardiology for a second read. To determine the presence of AF, the app requires that a photo is taken of the ECG, with only the patient's sex and birthdate documented. Data are entered into the app for calculation of CHA2DS2‐VASc and HAS‐BLED scores. These data are submitted securely to the Microsoft Azure Cloud (Microsoft, Inc, Redmond, WA) for detection of AF and treatment recommendations. Based on these calculations, guideline‐recommended therapy is presented to the clinician for each subject.
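
To illustrate the final step of this workflow, the sketch below maps the two risk scores to a simplified anticoagulation recommendation using the thresholds summarized in the introduction (CHA2DS2‐VASc ≥ 2 favors OAC, a score of 1 is a flexible indication, HAS‐BLED ≥ 3 flags high bleeding risk). This is our own simplification for illustration, not the Lucia App's actual decision logic or wording.

```python
def anticoagulation_recommendation(cha2ds2_vasc: int, has_bled: int) -> str:
    """Simplified, illustrative mapping from risk scores to a recommendation.

    Thresholds follow the guideline summary in the introduction; the real
    Lucia App logic and wording are not reproduced here.
    """
    if cha2ds2_vasc >= 2:
        rec = "Oral anticoagulation recommended"
    elif cha2ds2_vasc == 1:
        rec = "Oral anticoagulation or aspirin may be considered (flexible indication)"
    else:
        rec = "No anticoagulation indicated for stroke prevention"

    if has_bled >= 3:
        rec += "; HAS-BLED >= 3 indicates high bleeding risk - review modifiable factors"
    return rec

# Example: a patient with CHA2DS2-VASc of 4 and HAS-BLED of 2
print(anticoagulation_recommendation(cha2ds2_vasc=4, has_bled=2))
```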

The Bottom Line.

A machine learning clinical decision support tool was compared to physician detection and treatment of non‐valvular atrial fibrillation (AF) in the emergency department (ED). In a convenience sample of 297 AF cases, the Lucia App's recommendations were consistent with national guideline‐based anticoagulation more often than physicians' treatment decisions (98.3% vs 78.5%).

2.4. Measurements

Collected data included age, gender, ethnicity, photograph of ECG, admission diagnosis from the ED, duration of AF (< 48 hours, ≥ 48 hours, or unknown), antithrombotic medication at admission (eg, warfarin, factor Xa inhibitors, or direct thrombin inhibitors), concomitant antiplatelet agents at admission (eg, P2Y12 inhibitors, aspirin), discharge antithrombotic medications, concomitant discharge medications that may increase bleeding risk, arrhythmia management, and calculation of CHA2DS2‐VASc and HAS‐BLED scores.

2.5. Outcomes

The primary outcome variable was the detection of AF by the Lucia App. The app is designed to have 95% accuracy for the detection of AF at a certainty level of 70%. All ECG results returned by the app are classified as AF (certainty level ≥ 70%), NSR (certainty level ≥ 70%), or undetermined (certainty level < 70%). Secondary outcome variables included guideline‐compliant discharge rates of anticoagulation therapy, both as documented by the discharging physician and as recommended by the app, and documentation of the CHA2DS2‐VASc and/or HAS‐BLED score in the medical record. A CHA2DS2‐VASc score of 1, regardless of whether the point was assigned for female gender, was defined as a flexible treatment indication and was considered guideline‐appropriate whether or not the provider chose to initiate anticoagulation therapy.
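
The rhythm classification rule described above can be expressed as a simple threshold check; the sketch below is a hypothetical rendering of it, assuming the model returns a per‐rhythm certainty between 0 and 1.

```python
def classify_ecg(af_certainty: float, nsr_certainty: float,
                 threshold: float = 0.70) -> str:
    """Classify an ECG reading as AF, NSR, or undetermined.

    Mirrors the outcome definitions used in this study: a result is labeled
    AF or NSR only when the model's certainty for that rhythm is at least
    70%; otherwise it is reported as undetermined. (Illustrative only; the
    Lucia App's internal outputs are not public.)
    """
    if af_certainty >= threshold:
        return "AF"
    if nsr_certainty >= threshold:
        return "NSR"
    return "undetermined"

print(classify_ecg(af_certainty=0.93, nsr_certainty=0.04))  # -> "AF"
print(classify_ecg(af_certainty=0.55, nsr_certainty=0.40))  # -> "undetermined"
```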

2.6. Statistical analysis

Validation of the Lucia App to detect AF in an ED setting was performed and presented with 95% CIs versus the gold standard diagnosis. This provides a direct measure of diagnostic (clinical) sensitivity. Guideline‐compliant anticoagulation scores and discharge treatment reporting rates are presented with 95% CIs for both physicians and the app. The interrater reliability of the chart abstractors, assessed with the intraclass correlation coefficient, was 0.86.

3. RESULTS

3.1. Characteristics of study subjects

Overall, 297 patients met the entry criteria and had no exclusion criteria. The median age of all subjects was 79 years, with 183 (61.6%) non‐Hispanic, 113 (38.0%) Hispanic, and 1 (0.3%) not documented. A total of 166 (55.9%) were male and 131 (44.1%) were female. Of the total patients, 239 (80.5%) had a prior diagnosis of AF and 171 (57.6%) were on anticoagulation at presentation (Appendix Table 1). Of the total patients, 212 (71.4%) were hospitalized, with the remaining 85 (28.6%) discharged directly home from the ED (Appendix Figure 1).

3.2. Main results

The app accurately detected AF in 292 (98.3%; 95% CI = 96.4%, 99.4%) enrolled patients as compared to the gold standard. Three (1.0%) of the AF ECGs were read by the app as NSR. Of these, 1 was sinus rhythm that converted to AF midway through the ECG. Two (0.7%) ECGs fell below the app threshold of 70% and were read as undetermined (Appendix Table 2).

Additionally, correct diagnosis was evaluated in patients discharged directly home from the ED. Of the 85 (28.6%) ED discharged patients, only 71 (83.5%) were correctly diagnosed as AF by the ED physician. Fourteen (16.5%) ECGs that were initially read by the ED physician as NSR were subsequently confirmed as AF by cardiology. In the ED discharged cohort the Lucia App detected 83 (97.6%) as AF, 1 (1.2%) as NSR, and 1 (1.2%) was undetermined.

Anticoagulation recommendations by physicians at hospital discharge were consistent with ACEP guidelines in 233 (78.5%; 95% CI = 73.5%, 82.9%). This compared with 292 (98.3%; 95% CI = 96.4%, 99.4%) for the Lucia App. Because all the ECGs were known AF as a criterion for inclusion in this analysis, the 5 (1.7%) that the app did not detect as AF would have resulted in non‐compliant treatment recommendations.

Of enrolled patients, 171 (57.6%) were already on anticoagulation upon admission and 126 (42.4%) had never been on anticoagulation therapy (Appendix Table 1). The cohort with no prior anticoagulation thus represents an opportunity for therapeutic interventions. Of the 85 ED discharged patients, 65 (76.5%) were already on anticoagulation therapy and 20 (23.5%) were not. Of the 20 ED patients not on anticoagulation, only 5 (25.0%) were discharged on anticoagulation, leaving 15 (75.0%) discharged without.

A total of 212 (71.4%) AF patients were hospitalized. Of these, 115 (54.2%) were already anticoagulated. No hospitalized AF patient had anticoagulation started by the ED physician. At subsequent discharge, the admitting team continued anticoagulation therapy on 115 (54.2%), leaving 97 (45.8%) discharged without anticoagulation.

The app calculated CHA2DS2‐VASc on all 297 (100%) patients with a mean (SD) score of 4.3 (1.76). Documented physician‐calculated CHA2DS2‐VASc scores were reported in only 40 (13.5%) patients, with a mean (SD) score of 3.4 (1.58). Intraclass correlation (ICC) results showed that, when documented, physician CHA2DS2‐VASc scores were in significant agreement with app‐derived scores (ICC = 0.883, P < 0.01). The app calculated HAS‐BLED on all 297 (100%) patients with a mean (SD) score of 2.8 (1.31). Documented physician‐calculated HAS‐BLED scores were reported in only 7 (2.4%) patients, with a mean (SD) score of 3.3 (0.76) (Appendix Table 3). ICC results showed that, when documented, physician HAS‐BLED scores were not in significant agreement with app‐derived scores (ICC = 0.585, P = 0.154). There was very little variation in antithrombotic therapy, or lack thereof, across mean CHA2DS2‐VASc scores (Appendix Figure 2).
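
For readers who wish to reproduce this type of agreement analysis, the sketch below computes an intraclass correlation coefficient for paired physician and app scores. The two‐way random‐effects, absolute‐agreement, single‐measure form (ICC[2,1]) is used here as an assumption; the manuscript does not specify which ICC variant was applied, and the example data are illustrative only.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is an (n_subjects, n_raters) array, eg, column 0 holding the
    physician-documented score and column 1 the app-derived score.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)

    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                         # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Toy paired CHA2DS2-VASc scores (physician, app); illustrative data only
scores = np.array([[3, 4], [5, 5], [2, 2], [4, 5], [3, 3], [6, 6]])
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
```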

4. LIMITATIONS

Our study has several limitations. This study was designed as a retrospective chart review and thus our findings are hypothesis generating. The study design also limited the ability to determine the algorithm's full range of true negatives or false positives, as inclusion criteria looked at only 1 rhythm, AF. Of note, the lack of non‐AF cases is a significant limitation that prevents defining the predictive value of this algorithm in all presentations. However, when the algorithm's results are positive for AF, the ability to objectively confirm the detection has the potential to improve clinical management. Additionally, the relatively small sample size and single‐center investigation may provide challenges in generalizability. Further, lower rates of emergency physician implementation of indicated anticoagulation may have been a function of intentionally delayed therapy in hospitalized patients (with the intent of allowing the admitting team to select among the number of reasonable therapeutic options) or because the non‐valvular etiology of the AF could not be determined in the ED. Because the inability to determine non‐valvular etiology of the atrial fibrillation restricts the initiation of anticoagulation, this may be a significant limitation in the study. Finally, if the risk scores (CHA2DS2‐VASc and HAS‐BLED) were not recorded, they were assumed to have not been performed, which may explain the low level of documentation by physicians seen in the study.

5. DISCUSSION

Based on the comparison with the board‐certified cardiologist diagnosis of AF, we found that the Lucia App had a high ability to accurately detect AF; only 5 of 297 ECGs were not detected as AF. The app's 97.6% versus the ED physicians' 83.5% accuracy rate for ED discharged patients suggests an opportunity for improvement in AF detection if the app were routinely applied. Furthermore, treatment recommendations by the Lucia App (98.3%) were more consistent with ACEP guidelines than the treatment decisions of discharging physicians (78.5%). This is the first cloud‐based app intervention to demonstrate the potential for increased AF detection and guideline‐consistent anticoagulation treatment recommendations. Prior studies have demonstrated low rates of guideline‐consistent anticoagulation in patients presenting with AF but have not provided clear solutions to improve compliance.6, 15 The accessibility of a mobile (tablet or phone) app suggests new compliance opportunities could be implemented at low cost and without significant disruption to operational flow.

Appropriate anticoagulation is an important facet of care for patients with NVAF to prevent stroke and other thromboembolic events. However, based on data from the PINNACLE Registry, anticoagulation rates have fallen short of guideline‐based expectations, as less than half of high‐risk patients receive OAC.10 Anticoagulation prescribing rates do seem to be improving, however: a recent review of the PINNACLE Registry found that OAC rates increased from 52.7% in 2013 to 65.2% in 2017.16 This gap in care is even more apparent in EDs in the United States, with some studies reporting anticoagulation in only 18.9% of eligible high‐risk patients.17

Numerous barriers to prescribing OAC in the ED and at hospital discharge need to be overcome, including deferment to outpatient colleagues, difficulty establishing follow‐up with primary care physicians or cardiologists, and lack of knowledge of existing guidelines for ED or acute management.15, 18 Other factors may include lack of cardiologist access, especially in settings where staffing is limited by logistical and financial constraints. However, one recent study found that AF patients discharged from the ED who were eligible for OAC were much more likely to be receiving it 1 year later if they were provided with a prescription in the ED (67.8%), compared to those for whom initiation was left to the primary care provider (37.2%).17 The Lucia App provides an opportunity to serve as a clinical decision support tool that helps guide providers toward guideline‐recommended therapy, even in rural communities that may not have access to cardiologists or electrophysiologists. As Lucia's algorithms were trained by cardiac electrophysiologists, it is promising that its detection capabilities are consistent with those of board‐certified cardiologists.

It is also important to consider the app's advantages in recommending guideline‐based anticoagulation. Because the Lucia App facilitates calculation and documentation of a patient's CHA2DS2‐VASc and HAS‐BLED scores, it has the potential to improve guideline adherence by streamlining the steps physicians must take to assess whether anticoagulation is appropriate for the patient.

In this retrospective chart review, we found that the Lucia App had comparable rates of accurate AF detection as cardiologists and higher rates compared to emergency physicians. The high degree of guideline‐consistent recommendations provided by the Lucia App suggests the potential for an app‐based solution to guide providers toward an appropriate starting point upon which to make more personal and nuanced decisions.

6. DISCLOSURES

Kim Schwab: Nothing to disclose; Dacloc Nguyen: Nothing to disclose; GilAnthony Ungab: Co‐founder Lucia Health Guidelines and speaking payments from Zoll Medical; Gregory Feld: Fellowship stipend support from Boston Scientific, Biotronik, Biosense Webster, Medtronic, and Abbott Medical, is co‐founder and co‐owner of Perminova, has received consulting fees from Acutus Medical, Vektor Medical, and Irysis Pharmaceuticals, and has equity interest in Medwaves, Acutus Medical, and Lucia Health Guidelines; Alan Maisel: Co‐founder Brainstorm Medical; Martin Than: Clinical trial funding and speaking payments from Alere, Abbott, Beckman, and Roche; Laura Joyce: Nothing to disclose; Frank Peacock: Research Grants: Abbott, Becton Dickenson, Brainbox, Calcimedica, CSL Behring, Cue, Ortho Clinical Diagnostics, Relypsa, Roche, Salix, Siemens. Consultant: Abbott, Astra‐Zeneca, Beckman, Bosch, Fast Biomedical, Forrest Devices, Ischemia Care, Dx, Instrument Labs, Janssen, Nabriva, Ortho Clinical Diagnostics, Osler, Relypsa, Roche, Quidel, Salix, Siemens, Upstream. Stock/Ownership Interests: AseptiScope Inc, Brainbox Inc, Braincheck Inc, Coagulo Inc, Comprehensive Research Associates LLC, Comprehensive Research Management Inc, Emergencies in Medicine LLC, Fast Inc, Forrest Devices, Ischemia DX LLC, Lucia Health Guidelines LLC, Prevencio Inc, ScPharma, Trivirum Inc, Upstream Inc.

AUTHOR CONTRIBUTIONS

KS and GU conceived the study. KS and GU designed the trial. KS and DN supervised the conduct of the trial and data collection. KS, FP, and DN provided statistical advice on study design and analyzed the data. KS, FP, and DN drafted the manuscript, and all authors contributed substantially to its revision. KS takes responsibility for the paper as a whole.

APPENDIX

TABLE 1.

Patient demographics

Characteristic n %
Gender
Female 131 44.1
Male 166 55.9
Hispanic
No 183 61.6
Yes 113 38.0
No answer 1 0.3
AF diagnosis
Preexisting 239 80.5
New 56 18.9
Not addressed 2 0.7
Medications at admission
Apixaban 72 24.2
Rivaroxaban 41 13.8
Edoxaban 3 1.0
Dabigatran 14 4.7
Warfarin 41 13.8
Aspirin 86 29.0
Clopidogrel, ticagrelor, or prasugrel 15 5.1
Dual antiplatelet therapy 6 2.0
Medications at discharge
Apixaban 85 28.6
Rivaroxaban 53 17.8
Edoxaban 5 1.7
Dabigatran 11 3.7
Aspirin 91 30.6
Clopidogrel 23 7.7
Prior anticoagulant discontinued 4 1.3
No antiplatelet at discharge 192 64.6
N/A 139 46.8
AF rhythm detection
Cardiology panel 297 100.0
Lucia 292 98.3
ACEP guideline compliance
Physician 233 78.5
Lucia 292 98.3
Choice of discharge treatment
Anticoagulation (including warfarin) 175 58.9
Warfarin only 13 4.4
Aspirin 52 17.5
LAA 1 0.3
Appropriate anticoagulation dose at discharge
No 13 4.4
Yes 130 43.8
N/A 154 51.9

Abbreviations: ACEP, American College of Emergency Physicians; AF, atrial fibrillation; LAA, left atrial appendage.

TABLE 2.

Confirmed versus predicted rhythm classification for Lucia App

                               Confirmed AF    Confirmed NSR or undetermined
Predicted AF                        292                      0
Predicted NSR or undetermined         5                      0

Abbreviations: AF, atrial fibrillation; NSR, normal sinus rhythm.

TABLE 3.

Descriptive statistics for numerical scale variables

Variable N Mean SD
Age 297 76.55 13.18
Days Hospital Stay 296 3.72 4.70
MD CHA2DS2‐VASc 40 3.4 1.58
LUCIA CHA2DS2‐VASc 297 4.32 1.76
MD HAS‐BLED 7 3.29 0.76
LUCIA HAS‐BLED 297 2.8 1.31

FIGURE 1. Patient disposition

FIGURE 2. CHA2DS2‐VASc score and antithrombotic variables

Schwab K, Nguyen D, Ungab GA, et al. Artificial intelligence MacHIne learning for the detection and treatment of atrial fibrillation guidelines in the emergency department setting (AIM HIGHER): Assessing a machine learning clinical decision support tool to detect and treat non‐valvular atrial fibrillation in the emergency department. JACEP Open. 2021;2:e12534. 10.1002/emp2.12534

Supervising Editor: Christian Tomaszewski, MD, MS

Funding and support: By JACEP Open policy, all authors are required to disclose any and all commercial, financial, and other relationships in any way related to the subject of this article as per ICMJE conflict of interest guidelines (see www.icmje.org). The authors have stated that no such relationships exist.

REFERENCES

1. Colilla S, Crow A, Petkun W, Singer DE, Simon T, Liu X. Estimates of current and future incidence and prevalence of atrial fibrillation in the U.S. adult population. Am J Cardiol. 2013;112(8):1142‐1147.
2. Jackson SL, Tong X, Yin X, George MG, Ritchey MD. Emergency department, hospital inpatient, and mortality burden of atrial fibrillation in the United States, 2006 to 2014. Am J Cardiol. 2017;120(11):1966‐1973.
3. January CT, Wann LS, Alpert JS, et al. AHA/ACC/HRS guideline for the management of patients with atrial fibrillation: a report of the American College of Cardiology/American Heart Association Task Force on practice guidelines and the Heart Rhythm Society. Circulation. 2014;130:e199‐e267.
4. January CT, Wann LS, Alpert JS, et al. AHA/ACC/HRS focused update of the 2014 AHA/ACC/HRS guideline for the management of patients with atrial fibrillation: a report of the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines and the Heart Rhythm Society. Circulation. 2019;140:e125‐e151.
5. Baugh CW, Clark CL, Wilson JW, et al. Creation and implementation of an outpatient pathway for atrial fibrillation in the emergency department setting: results of an expert panel. Acad Emerg Med. 2018;25(9):1065‐1075.
6. Scheuermeyer FX, Innes G, Pourvali R, et al. Missed opportunities for appropriate anticoagulation among emergency department patients with uncomplicated atrial fibrillation or flutter. Ann Emerg Med. 2013;62(6):557‐565.
7. Lane DA, Lip GYH. Use of the CHA(2)DS(2)‐VASc and HAS‐BLED scores to aid decision making for thromboprophylaxis in nonvalvular atrial fibrillation. Circulation. 2012;126:860‐865.
8. Wei M, Do D, Tang PT, et al. Optimal disposition for atrial fibrillation patients presenting to the emergency departments. J Am Coll Cardiol. 2018;71(11):A509.
9. Xian Y, O'Brien EC, Liang L, et al. Association of preceding antithrombotic treatment with acute ischemic stroke severity and in‐hospital outcomes among patients with atrial fibrillation. JAMA. 2017;317(10):1057‐1067.
10. Hsu JC, Maddox TM, Kennedy K, et al. Aspirin instead of oral anticoagulant prescription in atrial fibrillation patients at risk for stroke. J Am Coll Cardiol. 2016;67(25):2913‐2923.
11. Lindow T, Kron J, Thulesius H, Ljungström E, Pahlm O. Erroneous computer‐based interpretations of atrial fibrillation and atrial flutter in a Swedish primary health care setting. Scand J Prim Health Care. 2019;37(4):426‐433.
12. Korelc J, Rothway M. Machine learning‐based algorithm validation test on 12‐lead resting ECGs consisting of normal sinus rhythm and atrial fibrillation. 2020. Unpublished raw data.
13. Gilbert EH, Lowenstein SR, Koziol‐McLain J, Barta DC, Steiner J. Chart reviews in emergency medicine research: where are the methods? Ann Emerg Med. 1996;27(3):305‐308.
14. Worster A, Bledsoe RD, Cleve P, Fernandes CM, Upadhye S, Eva K. Reassessing the methods of medical record review studies in emergency medicine research. Ann Emerg Med. 2005;45(4):448‐451.
15. Kea B, Waites BT, Lin A, et al. Practice gap in atrial fibrillation oral anticoagulation prescribing at emergency department home discharge. West J Emerg Med. 2020;21(4):924‐934.
16. Maddox TM, Song Y, Allen J, et al. Trends in U.S. ambulatory cardiovascular care 2013 to 2017: JACC review topic of the week. J Am Coll Cardiol. 2020;75(1):93‐112.
17. Atzema CL, Jackevicius CA, Chong A, et al. Prescribing of oral anticoagulants in the emergency department and subsequent long‐term use by older adults with atrial fibrillation. CMAJ. 2019;191(49):E1345‐E1354.
18. Kea B, Alligood T, Robinson C, Livingston J, Sun BC. Stroke prophylaxis for atrial fibrillation? To prescribe or not to prescribe: a qualitative study on the decision‐making process of emergency department providers. Ann Emerg Med. 2019;74(6):759‐771.
