Abstract
OBJECTIVES:
Previous research demonstrated that the numerical Cincinnati Prehospital Stroke Scale (CPSS) identifies large vessel occlusion (LVO) at rates similar to those of dedicated LVO screening tools. We aimed to compare the numerical CPSS to additional stroke scales using a national Emergency Medical Services (EMS) database.
METHODS:
Using the ESO Data Collaborative, the largest EMS database with linked hospital data, we retrospectively analyzed prehospital patient records from 2022. Each EMS record was linked to corresponding emergency department (ED) and inpatient records through a data exchange platform. Prehospital CPSS was compared to the Cincinnati Stroke Triage Assessment Tool (C-STAT), the Field Assessment Stroke Triage for Emergency Destination (FAST-ED), and the Balance Eyes Face Arm Speech Time (BE-FAST) scale. The optimal prediction cut points for LVO screening were determined by intersecting the sensitivity and specificity curves for each scale. To compare the discriminative abilities of each scale among those diagnosed with LVO, we used the area under the receiver operating characteristic curve (AUROC).
RESULTS:
We identified 17,442 prehospital records from 754 EMS agencies with ≥ 1 documented stroke scale of interest: 30.3% (n=5,278) had a hospital diagnosis of stroke, of which 71.6% (n=3,781) were ischemic; of those, 21.6% (n=817) were diagnosed with LVO. CPSS score ≥ 2 was found to be predictive of LVO with 76.9% sensitivity, 68.0% specificity, and AUROC 0.787 (95% CI 0.772-0.801). All other tools had similar predictive abilities, with sensitivity / specificity / AUROC of: C-STAT 62.5% / 76.5% / 0.727 (0.555-0.899); FAST-ED 61.4% / 76.1% / 0.780 (0.725-0.836); BE-FAST 70.4% / 67.1% / 0.739 (0.697-0.788).
CONCLUSIONS:
The less complex CPSS exhibited performance comparable to three frequently employed LVO detection tools. EMS leadership, medical directors, and stroke system directors should weigh the complexity of stroke severity instruments and the challenges of ensuring consistent and accurate use when choosing which tool to implement. The straightforward and widely adopted CPSS may improve compliance while maintaining accuracy in LVO detection.
Keywords: prehospital, emergency medical services, stroke, large vessel occlusion, stroke scale
INTRODUCTION
Early prehospital identification of stroke, coupled with transport to a hospital with appropriate therapeutic capabilities, is essential for optimal stroke care (1). For patients with large vessel occlusion (LVO), endovascular treatment reduces morbidity and mortality but is available only at specialized centers (2-5). Emergency Medical Services (EMS) clinicians are often the first to encounter stroke patients (6), making it crucial they quickly recognize stroke and LVO and transport patients to thrombectomy-capable centers when needed (7,8).
The Cincinnati Prehospital Stroke Scale (CPSS), which predates thrombectomy, is widely used for stroke screening due to its high inter-rater reliability and acceptability (9). It has a pooled sensitivity of 82.8% and specificity of 56.3% (10). The CPSS was first proposed as a tool for LVO detection in 2018 (11). In comparative studies, the CPSS demonstrated performance similar to that of other scales, such as the Rapid Arterial oCclusion Evaluation (RACE), the Los Angeles Motor Scale (LAMS), and the Vision Aphasia Neglect (VAN) scales (12). However, stroke severity tools including the Cincinnati Stroke Triage Assessment Tool (C-STAT) (13), the Field Assessment Stroke Triage for Emergency Destination (FAST-ED) (14), and the Balance Eyes Face Arm Speech Time (BE-FAST) scale (15) were not included in these evaluations. The CPSS and BE-FAST screen broadly for stroke, while FAST-ED and C-STAT target LVO specifically.
The C-STAT’s inclusion as an LVO screening tool in this study is supported by its derivation: it was developed by the same research group, using the same methodology, as the CPSS (9,13,16). Additionally, it was not included in prior studies of the numerical CPSS (12,17). The C-STAT was conceived as part of a two-tiered screening strategy and incorporates eye deviation, a clinical marker recognized for its predictive value for LVO (18). Similarly, FAST-ED assesses not only eye deviation but also neglect, a cortical symptom in which patients are unaware of or ignore one side of their body that is strongly correlated with LVO (14,19).
While CPSS and other stroke tools are widely used, their comparative performance for LVO detection in real-world prehospital settings remains underexplored. Variability in training, application, and patient populations can affect effectiveness. Addressing this gap is essential, as selecting the right tool could improve stroke identification and ensure timely transport to appropriate facilities. This study aims to compare the predictive characteristics of CPSS with those of C-STAT, BE-FAST, and FAST-ED.
METHODS
Study design and data source
We conducted a retrospective, observational cohort study using prehospital electronic patient care records (EPCRs) from January 1, 2022, to December 31, 2022. These records were sourced from the ESO Data Collaborative, a national repository of de-identified data contributed by over 2,000 EMS agencies across the United States (U.S.), which are aggregated annually into datasets designated for research purposes. The electronic health records were aligned with the National EMS Information System (NEMSIS) version 3.4 standard, allowing comprehensive collection of dispatch, demographic, and clinical data. Through a health data exchange platform, emergency department (ED) and inpatient data, including International Classification of Diseases, 10th Revision (ICD-10) diagnoses, were seamlessly linked with the EPCRs. The ICD-10-CM classification, modified by the National Center for Health Statistics, was used. No manual data abstraction was needed, as all information was extracted directly from the database.
Access to these datasets is granted after proposal review, Institutional Review Board (IRB) approval, and a data use agreement. This study was approved by the University of Utah’s IRB (IRB_00161488) and was deemed exempt from human subjects review. It adheres to the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines (20).
Inclusion of records for analysis
We included all emergency (9-1-1) EMS calls that involved the use of one or more stroke scales of interest (CPSS, C-STAT, BE-FAST, or FAST-ED) and resulted in transportation to a hospital. The use of a stroke scale was identified through a separate worksheet within the EPCR system, which EMS clinicians complete based on their local protocols. We excluded encounters that did not originate from a 9-1-1 call (such as interfacility transports) and those that did not lead to EMS transport to a hospital. We also excluded records that were not linked to ED and/or inpatient diagnosis codes.
Classification of stroke and LVO
We classified patients based on diagnoses using the ED and hospital ICD-10 codes linked to the EMS records through a bi-directional data exchange. The ED codes were used when available as the most proximal codes to the prehospital event. When ED codes were not available, inpatient diagnoses were used. Stroke types (ischemic or hemorrhagic) and transient ischemic attack (TIA) were identified from these codes (Supplementary File Table 1). The TIA type was included because it cannot be distinguished from ischemic stroke in prehospital settings due to limited time and lack of imaging. To ensure accuracy, only primary and secondary ICD-10 codes (first and second listed) related to the acute event were included, excluding prior stroke diagnoses unrelated to the current encounter. Codes starting with I60, I61, or I62 indicated intracerebral hemorrhage (ICH), while I63 identified acute ischemic stroke (AIS), and G45 identified TIA. While terminological variations exist, LVO stroke is commonly defined as blockages of the middle cerebral arteries (MCA), internal carotid artery (ICA), and/or basilar artery (BA) (21). We used ICD-10 codes indicating thrombosis or embolism of one or more of these vessels to identify LVO stroke (Supplementary File Table 2).
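The classification logic above can be sketched as a short rule-based function. This is an illustrative sketch only: the prefix checks mirror the text, but the exact LVO code set lives in Supplementary File Table 2, so the `LVO_PREFIXES` list below is a hypothetical stand-in, not the study's actual code list.

```python
# Illustrative sketch of the ICD-10 classification rules described above.
# LVO_PREFIXES is a hypothetical stand-in; the authoritative code set is in
# Supplementary File Table 2.
LVO_PREFIXES = ("I6331", "I6341")  # e.g., MCA thrombosis/embolism (assumed)


def classify_encounter(icd10_codes):
    """Classify an encounter from its ICD-10 codes.

    Per the methods, only the primary and secondary (first two listed)
    codes for the acute event are considered.
    """
    codes = [c.upper().replace(".", "") for c in icd10_codes[:2]]
    if any(c.startswith(("I60", "I61", "I62")) for c in codes):
        return "ICH"
    if any(c.startswith("I63") for c in codes):
        if any(c.startswith(LVO_PREFIXES) for c in codes):
            return "AIS-LVO"
        return "AIS"
    if any(c.startswith("G45") for c in codes):
        return "TIA"
    return "no stroke"
```

In practice the study preferred ED codes and fell back to inpatient codes when ED codes were unavailable; that selection step is upstream of this function.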
Stroke scales
This study analyzed four stroke screening tools: CPSS, C-STAT, BE-FAST, and FAST-ED. The CPSS, originally unscored, now assigns 1 point each for face, arm, and speech symptoms (0-3) (11,12). The BE-FAST builds on CPSS by adding balance and eye components, scoring five criteria from 0-1 with a total range of 0-5, though it lacks an LVO threshold (15). The C-STAT assesses conjugate gaze (0-2), consciousness (0-1), and arm weakness (0-1), with scores ≥2 suggesting LVO and recommending transport to endovascular centers (13). The FAST-ED evaluates facial palsy, arm weakness, speech, eye deviation, and neglect, scoring 0-9, with ≥4 indicating LVO and directing patients to specialized centers (14) (Table 3).
Table 3.
Comparison of Stroke Screening Tools and Their Characteristics
| Stroke Scale | Components | Score Range | LVO Threshold | Purpose |
|---|---|---|---|---|
| CPSS | Face, Arm, Speech | 0-3 | None | General stroke screening |
| BE-FAST | Balance, Eyes, Face, Arm, Speech | 0-5 | None | Expanded stroke screening, includes posterior circulation stroke |
| C-STAT | Conjugate Gaze, Consciousness Level, Arm Weakness | 0-4 | ≥ 2 | LVO detection and transport decision |
| FAST-ED | Facial Palsy, Arm Weakness, Speech, Eye Deviation, Neglect | 0-9 | ≥ 4 | LVO detection and transport decision |
CPSS: Cincinnati Prehospital Stroke Scale; C-STAT: Cincinnati Stroke Triage Assessment Tool; FAST-ED: Field Assessment Stroke Triage for Emergency Destination; BE-FAST: Balance Eyes Face Arm Speech Time.
Additional covariates
Stroke or TIA impression by EMS was a binary indicator of whether the EMS clinician noted TIA or stroke as part of their primary or secondary impression of the patient in the EPCR. Urban, rural, and super-rural designations come from the Centers for Medicare & Medicaid Services designations assigned to the responding agencies. An Advanced Life Support (ALS) responder was defined as a paramedic, whereas Basic Life Support (BLS) could include Emergency Medical Responder (EMR), Emergency Medical Technician (EMT)-basic, EMT-advanced, and EMT-intermediate. For race and ethnicity, we created a single variable in accordance with standard practices with the following categories: White non-Hispanic, Black non-Hispanic, Hispanic, Asian non-Hispanic, and other (22).
Statistical analysis
We assessed the predictive performance of the CPSS, C-STAT, FAST-ED, and BE-FAST stroke screens by computing the sensitivity and specificity for patients with known LVO, along with 95% confidence intervals. Statistically optimal cut points for LVO detection on each scale were determined by the point at which the sensitivity and specificity curves intersected. The overall discrimination of each scale was assessed using the area under the receiver operating characteristic curve (AUROC). A C-statistic (AUROC) between 0.7 and 0.8 is generally considered acceptable discrimination, while a C-statistic greater than 0.8 is considered excellent discrimination. To determine whether the discriminative abilities of the stroke severity assessment tools differed significantly, the DeLong method was used to compare the AUROC of CPSS with that of each other tool. Additionally, within subpopulations where two stroke scales were applied concurrently to the same patient, McNemar’s Chi-squared test with continuity correction was used. All analyses were conducted using R software (version 4.3.0, 2023-04-21). A power calculation for sample size was omitted given the retrospective nature of the analysis. Because completion of EMS charts is compulsory, the issue of missing data is mitigated in these analyses.
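The cut-point and discrimination calculations described above can be illustrated in a few lines. The study itself used R; this is a minimal pure-Python sketch on hypothetical score/outcome pairs, with AUROC computed via its Mann-Whitney (rank) formulation.

```python
# Minimal sketch of the cut-point and AUROC logic; the data are hypothetical.

def sens_spec(scores, labels, cut):
    """Sensitivity and specificity of the rule `score >= cut` for LVO."""
    tp = sum(1 for s, y in zip(scores, labels) if y and s >= cut)
    fn = sum(1 for s, y in zip(scores, labels) if y and s < cut)
    tn = sum(1 for s, y in zip(scores, labels) if not y and s < cut)
    fp = sum(1 for s, y in zip(scores, labels) if not y and s >= cut)
    return tp / (tp + fn), tn / (tn + fp)


def optimal_cut(scores, labels, cuts):
    """Cut point where the sensitivity and specificity curves intersect,
    i.e., where the two values are closest."""
    return min(cuts, key=lambda c: abs(sens_spec(scores, labels, c)[0]
                                       - sens_spec(scores, labels, c)[1]))


def auroc(scores, labels):
    """Mann-Whitney formulation: the probability that a random LVO case
    outscores a random non-LVO case (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


# Hypothetical CPSS scores (0-3) and LVO outcomes:
scores = [0, 1, 2, 3, 2, 0, 1, 3]
lvo = [0, 0, 1, 1, 0, 0, 0, 1]
```

With these toy data, `optimal_cut(scores, lvo, [1, 2, 3])` returns 2, mirroring the CPSS ≥ 2 threshold reported in the paper; the DeLong comparison of correlated AUROCs used in the study has no short stdlib equivalent and is omitted here.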
RESULTS
Epidemiology
During the study period, there were a total of 359,064 emergency encounters by 754 EMS agencies with linked hospital data. Of these, 17,442 encounters involving 437 EMS agencies included documentation of one or more of the stroke scales of interest. Among these patients, 5,278 (30.3%) had a linked ICD-10 hospital code indicating a stroke diagnosis. Of the patients diagnosed with stroke, 3,781 (71.6%) were identified with an acute ischemic stroke. Within this group, 817 (21.6%) were diagnosed with an LVO (Figure 1).
Figure 1: Inclusion of patients for analysis and hospital/emergency department stroke diagnoses.

CPSS: Cincinnati Prehospital Stroke Scale; C-STAT: Cincinnati Stroke Triage Assessment Tool; FAST-ED: Field Assessment Stroke Triage for Emergency Destination; BE-FAST: Balance Eyes Face Arm Speech Time; AIS: Acute Ischemic Stroke. *CPSS was used alongside another stroke severity tool (77 with C-STAT, 245 with FAST-ED, and 263 with BE-FAST; n = 585)
Patient and encounter characteristics
The median age of the patients with a documented stroke scale was 74 years (interquartile range: 64-83 years), and 53.6% were female. Most patients (93.6%) were assessed by EMS in urban areas, with 5.3% in rural areas and 1.1% in areas designated as super-rural. The highest certification level of EMS personnel recorded for these cases was EMT in 5.4% of cases and Paramedic in 92.8%. The CPSS was the most frequently used instrument, utilized in 82.5% (n=14,384) of patients, followed by BE-FAST at 10.6% (n=1,845), FAST-ED at 9.8% (n=1,709), and C-STAT at 0.5% (n=89) (Table 1). Details on race and ethnicity are also provided. The EPCRs with linked hospital data showed a higher median patient age (74 vs. 69 years), a greater proportion of stroke/TIA impressions by EMS (58.3% vs. 33.2%), and a slightly higher representation of white non-Hispanic patients (67.2% vs. 63.1%) compared to EPCRs without linked data (Supplementary File Table 3).
Table 1.
Patient and prehospital clinician characteristics stratified by prehospital stroke scale(s) documented
| | Total N=17442 | CPSS* 82.5% (N=14384) | BE-FAST* 10.6% (N=1845) | C-STAT* 0.5% (N=89) | FAST-ED* 9.8% (N=1709) |
|---|---|---|---|---|---|
| Patient characteristics | |||||
| Age, years Median (IQR) | 74 (64-83) | 74 (63-83) | 74 (64-83) | 69 (56-77) | 74 (65-83) |
| Sex, F | 53.6% (9344) | 53.9% (7752) | 52.0% (959) | 56.2% (50) | 53.3% (911) |
| Stroke diagnosis, Y | 30.3% (5278) | 30.0% (4313) | 34.3% (633) | 29.2% (26) | 31.2% (533) |
| Hemorrhagic Stroke, Y | 4.4% (764) | 4.4% (635) | 3.9% (72) | 9.0% (8) | 5.0% (86) |
| AIS Diagnosis, Y | 21.7% (3781) | 21.3% (3060) | 26.2% (484) | 15.7% (14) | 23.5% (402) |
| TIA Diagnosis, Y | 5.0% (878) | 5.2% (752) | 4.7% (86) | 4.5% (4) | 4.3% (73) |
| LVO Diagnosis, Y | 4.7% (817) | 5.0% (722) | 4.4% (81) | 9.0% (8) | 3.3% (57) |
| Race/Ethnicity: White, non-Hispanic | 67.2% (11715) | 68.8% (9901) | 62.1% (1146) | 64.0% (57) | 62.7% (1072) |
| Race/Ethnicity: Black, non-Hispanic | 15.3% (2669) | 16.2% (2326) | 4.2% (77) | 29.2% (26) | 17.6% (300) |
| Race/Ethnicity: Asian, non-Hispanic | 1.4% (247) | 1.4% (203) | 1.4% (26) | 2.2% (2) | 1.2% (21) |
| Race/Ethnicity: Hispanic | 5.5% (956) | 5.2% (752) | 10.0% (185) | 1.1% (1) | 3.1% (53) |
| Race/Ethnicity: Other, non-Hispanic | 10.6% (1855) | 8.3% (1202) | 22.3% (411) | 4.5% (4) | 15.3% (263) |
| EMS clinician characteristics | |||||
| Stroke or TIA Impression by EMS, Y | 58.3% (10164) | 57.7% (8306) | 68.2% (1259) | 70.8% (63) | 60.0% (1026) |
| Urban | 93.6% (16328) | 93.2% (13412) | 93.1% (1718) | 98.9% (88) | 96.0% (1641) |
| Rural | 5.3% (916) | 5.4% (777) | 6.3% (116) | 1.1% (1) | 4.0% (68) |
| Super Rural | 1.1% (193) | 1.3% (190) | 0.6% (11) | 0 | 0 |
| BLS, highest pre-hospital provider | 5.4% (935) | 6.3% (911) | 1.1% (21) | 23.6% (21) | 0.5% (8) |
| ALS, highest pre-hospital provider | 92.8% (16187) | 91.5% (13165) | 98.8% (1823) | 75.3% (67) | 98.3% (1680) |
*Multiple types of stroke scales could be documented per encounter.
Predictive performance of stroke screening instruments
Table 2 breaks down the performance of each stroke scale across different cut points, and Figure 2 displays the ROC curves for each stroke severity tool. The performance of CPSS was maximized at a cut point of ≥2, achieving a sensitivity of 76.9%, a specificity of 68.0%, and an AUROC of 0.787 (95% CI 0.772-0.801). The C-STAT demonstrated maximum performance at a cut point of ≥2, with a sensitivity of 62.5%, a specificity of 76.5%, and an AUROC of 0.727 (95% CI 0.555-0.899). The BE-FAST showed its best performance at a cut point of ≥3, with a sensitivity of 70.4%, a specificity of 67.1%, and an AUROC of 0.739 (95% CI 0.697-0.788). The FAST-ED achieved its optimal performance at a cut point of ≥4, with a sensitivity of 61.4%, a specificity of 76.1%, and an AUROC of 0.780 (95% CI 0.725-0.836). Although CPSS had the highest AUROC, its discrimination did not differ significantly from that of C-STAT (p=0.689), BE-FAST (p=0.727), or FAST-ED (p=0.269) by DeLong’s test for two correlated ROC curves. In subpopulations where CPSS was used alongside another stroke severity tool (77 with C-STAT, 245 with FAST-ED, and 263 with BE-FAST; n = 585), we compared sensitivity and specificity using McNemar’s Chi-squared test. CPSS outperformed C-STAT (p < 0.001) and FAST-ED (p < 0.001), while no significant difference was found between CPSS and BE-FAST (p = 0.212).
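The paired comparisons above rely on McNemar's test, which uses only the discordant pairs (patients classified correctly by one tool but not the other). A minimal sketch of the continuity-corrected statistic follows; the counts are invented for illustration and are not taken from the study.

```python
# McNemar's chi-squared statistic with continuity correction.
# b = pairs where tool A was correct and tool B was not; c = the reverse.

def mcnemar_statistic(b, c):
    if b + c == 0:
        return 0.0  # no discordant pairs, no evidence of a difference
    return (abs(b - c) - 1) ** 2 / (b + c)


# Hypothetical discordant counts for two scales applied to the same patients:
stat = mcnemar_statistic(25, 5)
# Compare against the chi-squared (1 df) critical value 3.841 for p < 0.05.
significant = stat > 3.841
```

Note that the concordant pairs (both tools right or both wrong) drop out of the statistic entirely, which is why a paired design on the same patients is required.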
Table 2.
Predictive performance of the CPSS, C-STAT, FAST-ED, and BE-FAST screening instruments for detecting LVO stroke
| | Sensitivity | Specificity | PLR | NLR | Area under ROC curve, C-Statistic (95% CI) |
|---|---|---|---|---|---|
| CPSS (N=13993) | | | | | 0.787 (0.772-0.801) |
| ≥1 | 0.943 | 0.463 | 1.758 | 0.123 | |
| ≥2*,† | 0.769 | 0.68 | 2.402 | 0.34 | |
| 3 | 0.56 | 0.84 | 3.497 | 0.524 | |
| C-STAT (N=89) | | | | | 0.727 (0.555-0.899) |
| ≥1 | 0.875 | 0.469 | 1.648 | 0.266 | |
| ≥2*,† | 0.625 | 0.765 | 2.664 | 0.49 | |
| ≥3 | 0.5 | 0.864 | 3.682 | 0.579 | |
| 4 | 0 | 0.901 | 0 | 1.11 | |
| FAST-ED (N=1604) | | | | | 0.780 (0.725-0.836) |
| ≥1 | 0.965 | 0.335 | 1.452 | 0.105 | |
| ≥2 | 0.912 | 0.492 | 1.794 | 0.178 | |
| ≥3 | 0.737 | 0.652 | 2.117 | 0.404 | |
| ≥4*,† | 0.614 | 0.761 | 2.568 | 0.507 | |
| ≥5 | 0.526 | 0.85 | 3.506 | 0.557 | |
| ≥6 | 0.386 | 0.909 | 4.251 | 0.675 | |
| ≥7 | 0.263 | 0.953 | 5.574 | 0.773 | |
| ≥8 | 0.14 | 0.977 | 6.102 | 0.88 | |
| 9 | 0.018 | 0.996 | 4.83 | 0.986 | |
| BE-FAST (N=1768) | | | | | 0.739 (0.691-0.788) |
| ≥1 | 0.951 | 0.293 | 1.345 | 0.168 | |
| ≥2 | 0.889 | 0.483 | 1.719 | 0.23 | |
| ≥3* | 0.704 | 0.671 | 2.14 | 0.441 | |
| ≥4 | 0.444 | 0.804 | 2.266 | 0.691 | |
| 5 | 0.222 | 0.942 | 3.843 | 0.826 | |
*Optimal cut point identified using sensitivity and specificity curves from the present study.
†Recommended cut point from the initial research related to the screening instrument.
PLR: positive likelihood ratio; NLR: negative likelihood ratio.
Figure 2: ROC curves for prehospital stroke severity scales.

CPSS: Cincinnati Prehospital Stroke Scale; C-STAT: Cincinnati Stroke Triage Assessment Tool; FAST-ED: Field Assessment Stroke Triage for Emergency Destination; BE-FAST: Balance Eyes Face Arm Speech Time.
DISCUSSION
Our retrospective analysis, drawing upon a robust nationwide EMS database comprising over 17,400 patient records with linked hospital outcomes, demonstrated that the simpler, numerical CPSS achieved LVO identification performance comparable to that of the more complex C-STAT, FAST-ED, and BE-FAST stroke severity screening tools. Despite lower AUROC values for C-STAT, FAST-ED, and BE-FAST compared to CPSS, these differences were not statistically significant in terms of their capacity for prehospital LVO detection. However, in subpopulations where CPSS and either C-STAT or FAST-ED were employed, CPSS exhibited superior performance.
The C-STAT’s initial development, which used regression analysis on two large alteplase trial datasets, focused on identifying key predictors of severe stroke (NIHSS ≥15) (13). However, only a small proportion of these patients presented with severe stroke, potentially limiting the tool’s applicability across broader stroke populations. Subsequent validation studies demonstrated C-STAT’s ability to detect LVO, but its prospective validation in prehospital settings was limited to fewer than 60 patients, and clinicians lacked formal training during the study (23). These limitations may affect its generalizability to real-world prehospital environments. In our study, the sensitivity, specificity, and AUROC of C-STAT were lower than in the original validation. The relatively small number of C-STAT uses in our dataset may have contributed to nonsignificant p-values, indicating limited power to detect a true difference in AUROC. The apparent lack of difference between tools may therefore reflect insufficient sample size rather than truly equivalent performance, warranting further research with larger datasets to confirm these findings.
The FAST-ED was validated using the STOPStroke dataset, which comprised patients suspected of having an ischemic stroke and who underwent imaging within 24 hours of potential stroke onset (14). The creators of FAST-ED chose five NIHSS components most closely associated with LVO for the scale. Research associates retrospectively utilized the physical exam results and NIHSS scores recorded by admitting neurologists to determine the operating characteristics of FAST-ED on the detection of LVO in this cohort. In practical prehospital applications, however, the interrater reliability among clinicians was recorded at 0.66, indicating challenges with recall and real-world utilization of the test (24). In contrast, CPSS demonstrated higher interrater reliability, scoring 0.83 between EMS clinicians and neurology attendings, which suggests better recall and application by prehospital clinicians (25).
The BE-FAST was developed to be more comprehensive than the CPSS, incorporating balance and eye assessments in addition to the core components of CPSS, with the intention of increasing sensitivity, particularly at a ≥1 cutoff (15). However, as shown in Table 2, the sensitivity at this cut point was 94.3% for CPSS and 95.1% for BE-FAST, a negligible difference. This highlights that, in a real-world prehospital setting, a more complex and nuanced screening tool like BE-FAST does not necessarily capture more pathology than the simpler CPSS, and suggests that adding complexity to stroke screens may not always translate to better diagnostic performance in practice.
Our findings align with previous research demonstrating that the numerical CPSS effectively identifies LVO, with a cut point of 2 providing optimal performance (11,12). The strongest evidence comes from Richards et al. (11), who first proposed CPSS for LVO detection using retrospective data, and Crowe et al. (12), who validated this approach with real-world U.S. data. The largest prospective trial, the PRESTO study (19), found CPSS comparable to eight other stroke severity tools, though the RACE tool showed a slight performance advantage. A key limitation affecting external validity is the difference in EMS training between the U.S. and the Netherlands, where PRESTO was conducted: Dutch paramedics begin as experienced nurses and undergo extensive training, while U.S. paramedics advance from EMT certification with fewer required training hours. Additionally, a recent systematic review and meta-analysis concluded that the numerical CPSS is a useful tool for LVO identification but rated the overall quality of evidence as very low (26).
In many EMS systems, a positive CPSS or similar stroke screen triggers a secondary evaluation for LVO, following AHA/ASA Mission Lifeline guidelines (27). This step is critical for determining the optimal transport destination. However, stroke, including LVO, remains a relatively rare event for individual clinicians. Consistent with previous research, our study did not identify a superior tool for prehospital LVO detection. Given the similar effectiveness of these tools, EMS agencies should consider factors like training burden, reliability, and costs when selecting one. Clinician ability to recall and apply tools accurately, especially in high-stress environments, is also essential. CPSS, being widely used and requiring less training, may offer a practical option when paired with ongoing education.
LIMITATIONS
Our retrospective analysis relied on EPCRs, in which entering required data elements is mandatory for record completion; however, inaccurate data entry remains a risk. Additionally, narrative data were not extracted; therefore, if a stroke scale was documented only in the narrative section without the structured worksheet, the record was not included in this analysis. Using ICD-10 codes for stroke and LVO diagnosis introduces potential errors due to varying institutional coding practices, coder proficiency, and evolving criteria. LVO diagnosis, based on major cerebral artery occlusion codes, likely included some medium vessel occlusions (MeVO), potentially causing misclassification. This may explain the 16% LVO prevalence among stroke diagnoses, higher than typically seen in EMS-screened strokes, possibly overestimating prevalence and influencing the comparative accuracy of the screening tools. Despite these limitations, using primary and secondary ICD-10 codes remains a standard, reliable research method (28).
A limitation of this study is variability in the patient populations where different screening tools were applied, which could affect accuracy comparisons. As shown in Table 1, C-STAT was used in populations with higher prevalence of ICH and LVO, while CPSS and BE-FAST were more commonly used in rural EMS settings. Such variability introduces potential bias, as differences in demographics and stroke types may influence results. Ideally, these tools would be applied concurrently to the same population for direct comparison. Additionally, tool selection may reflect agency or clinician preferences, further complicating comparisons. These differences are possibly a function of the small sample size. Future research should standardize the use of stroke tools across diverse settings to ensure more reliable assessments.
Our study, limited to 2022, may not capture recent EMS practices, technological advances, or updated guidelines that could affect tool performance. It also did not evaluate interrater reliability or how EMS proficiency impacts screening effectiveness. Further research is needed to explore whether visual and balance assessments enhance detection of posterior circulation strokes, a critical subset of LVOs, and how they influence patient routing in prehospital care (29).
CONCLUSIONS
The findings of our study, and the totality of evidence thus far, suggest that the numerical CPSS performs as well as more complex stroke severity tools. When determining which screening instrument to implement, EMS agency leadership, medical directors, stroke system directors, and other health care decision-makers should consider the training requirements needed to maintain proficiency and consistency in application.
Supplementary Material
ACKNOWLEDGEMENTS:
We express our gratitude to all participating EMS agencies and their personnel who contributed data to the ESO Data Collaborative, making this study possible. Thanks to the ESO Data Collaborative team for their assistance in data management and provision, ensuring that we had a robust dataset for our analysis. We extend our gratitude to our colleagues who support and inspire us in innumerable ways.
SOURCES OF FUNDING:
This work was supported by grant U10NS086606 from the National Institutes of Health (NIH)/National Institute of Neurological Disorders and Stroke (NINDS) Utah StrokeNet.
DECLARATION OF INTEREST STATEMENT:
Dr. Majersik reports other grants from the NIH. Additionally, Dr. Majersik reports personal fees from the American Heart Association (AHA) as a Stroke Associate Editor, outside the submitted work. Dr. Youngquist reports consulting fees from Colabs Medical and grant funding from the ZOLL Foundation, the US Department of Defense, NINDS (1U01NS099046-01A1 and 7U01NS114042-03), and NHLBI (UH3HL145269). The other authors report no conflicts.
Footnotes
GENERATIVE AI DECLARATION STATEMENT: During the preparation of this work, the author(s) utilized ChatGPT version 4.0 to ensure the language was clear, concise, and grammatically correct. After using this tool/service, the author(s) reviewed and edited the content as needed. The author(s) take full responsibility for the content of this publication.
DATA SHARING STATEMENT:
The data used in this study were sourced from the ESO Data Collaborative. Due to data privacy regulations and the de-identified nature of the dataset, the raw data are not publicly available. Further details about accessing the ESO Data Collaborative can be found at https://www.eso.com/data-and-research/.
REFERENCES:
- 1. Oostema JA, Nickles A, Allen J, Ibrahim G, Luo Z, Reeves MJ. Emergency Medical Services Compliance With Prehospital Stroke Quality Metrics Is Associated With Faster Stroke Evaluation and Treatment. Stroke. 2024 Jan;55(1):101–109. doi: 10.1161/STROKEAHA.123.043846.
- 2. Fransen PS, Beumer D, Berkhemer OA, van den Berg LA, Lingsma H, van der Lugt A, van Zwam WH, van Oostenbrugge RJ, Roos YB, Majoie CB, Dippel DW; MR CLEAN Investigators. MR CLEAN, a multicenter randomized clinical trial of endovascular treatment for acute ischemic stroke in the Netherlands: study protocol for a randomized controlled trial. Trials. 2014 Sep 1;15:343. doi: 10.1186/1745-6215-15-343.
- 3. Campbell BC, Mitchell PJ, Yan B, Parsons MW, Christensen S, Churilov L, Dowling RJ, Dewey H, Brooks M, Miteff F, Levi C, Krause M, Harrington TJ, Faulder KC, Steinfort BS, Kleinig T, Scroop R, Chryssidis S, Barber A, Hope A, Moriarty M, McGuinness B, Wong AA, Coulthard A, Wijeratne T, Lee A, Jannes J, Leyden J, Phan TG, Chong W, Holt ME, Chandra RV, Bladin CF, Badve M, Rice H, de Villiers L, Ma H, Desmond PM, Donnan GA, Davis SM; EXTEND-IA investigators. A multicenter, randomized, controlled study to investigate EXtending the time for Thrombolysis in Emergency Neurological Deficits with Intra-Arterial therapy (EXTEND-IA). Int J Stroke. 2014 Jan;9(1):126–32. doi: 10.1111/ijs.12206.
- 4. Goyal M, Jadhav AP, Bonafe A, Diener H, Mendes Pereira V, Levy E, Baxter B, Jovin T, Jahan R, Menon BK, Saver JL; SWIFT PRIME investigators. Analysis of Workflow and Time to Treatment and the Effects on Outcome in Endovascular Treatment of Acute Ischemic Stroke: Results from the SWIFT PRIME Randomized Controlled Trial. Radiology. 2016 Jun;279(3):888–97. doi: 10.1148/radiol.2016160204.
- 5. Urra X, Abilleira S, Dorado L, Ribó M, Cardona P, Millán M, Chamorro A, Molina C, Cobo E, Dávalos A, Jovin TG, Gallofré M; Catalan Stroke Code and Reperfusion Consortium. Mechanical Thrombectomy in and Outside the REVASCAT Trial: Insights From a Concurrent Population-Based Stroke Registry. Stroke. 2015 Dec;46(12):3437–42. doi: 10.1161/STROKEAHA.115.011050.
- 6. Barsan WG, Brott TG, Broderick JP, Haley EC, Levy DE, Marler JR. Time of hospital presentation in patients with acute stroke. Arch Intern Med. 1993;153:2558–2561. doi: 10.1001/archinte.1993.00410220058006.
- 7. Patel MD, Rose KM, O'Brien EC, Rosamond WD. Prehospital notification by emergency medical services reduces delays in stroke evaluation: findings from the North Carolina stroke care collaborative. Stroke. 2011;42:2263–2268. doi: 10.1161/STROKEAHA.110.605857.
- 8. Lin CB, Peterson ED, Smith EE, Saver JL, Liang L, Xian Y, Olson DM, Shah BR, Hernandez AF, Schwamm LH, et al. Emergency medical service hospital prenotification is associated with improved evaluation and treatment of acute ischemic stroke. Circ Cardiovasc Qual Outcomes. 2012;5:514–522. doi: 10.1161/CIRCOUTCOMES.112.965210.
- 9. Kothari RU, Pancioli A, Liu T, Brott T, Broderick J. Cincinnati Prehospital Stroke Scale: reproducibility and validity. Ann Emerg Med. 1999 Apr;33(4):373–8. doi: 10.1016/s0196-0644(99)70299-4.
- 10. Chaudhary D, Diaz J, Lu Y, Li J, Abedi V, Zand R. An updated review and meta-analysis of screening tools for stroke in the emergency room and prehospital setting. J Neurol Sci. 2022 Nov 15;442:120423. doi: 10.1016/j.jns.2022.120423.
- 11. Richards CT, Huebinger R, Tataris KL, et al. Cincinnati Prehospital Stroke Scale Can Identify Large Vessel Occlusion Stroke. Prehosp Emerg Care. 2018;22(3):312–318. doi: 10.1080/10903127.2017.1387629.
- 12. Crowe RP, Myers JB, Fernandez AR, Bourn S, McMullan JT. The Cincinnati Prehospital Stroke Scale Compared to Stroke Severity Tools for Large Vessel Occlusion Stroke Prediction. Prehosp Emerg Care. 2021 Jan-Feb;25(1):67–75. doi: 10.1080/10903127.2020.1725198.
- 13. Katz BS, McMullan JT, Sucharew H, Adeoye O, Broderick JP. Design and validation of a prehospital scale to predict stroke severity: Cincinnati Prehospital Stroke Severity Scale. Stroke. 2015 Jun;46(6):1508–12. doi: 10.1161/STROKEAHA.115.008804.
- 14. Lima FO, Silva GS, Furie KL, Frankel MR, Lev MH, Camargo ÉC, Haussen DC, Singhal AB, Koroshetz WJ, Smith WS, Nogueira RG. Field Assessment Stroke Triage for Emergency Destination: A Simple and Accurate Prehospital Scale to Detect Large Vessel Occlusion Strokes. Stroke. 2016 Aug;47(8):1997–2002. doi: 10.1161/STROKEAHA.116.013301.
- 15. Aroor S, Singh R, Goldstein LB. BE-FAST (Balance, Eyes, Face, Arm, Speech, Time): Reducing the Proportion of Strokes Missed Using the FAST Mnemonic. Stroke. 2017 Feb;48(2):479–481. doi: 10.1161/STROKEAHA.116.015169.
- 16.Von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Journal of Clinical Epidemiology. 2008;61(4):344–349. doi: 10.1016/j.jclinepi.2007.11.008 [DOI] [PubMed] [Google Scholar]
- 17.Saver JL, Chapot R, Agid R, et al. Thrombectomy for Distal, Medium Vessel Occlusions: A Consensus Statement on Present Knowledge and Promising Directions. Stroke. 2020;51(9):2872–2884. doi: 10.1161/STROKEAHA.120.028956 [DOI] [PubMed] [Google Scholar]
- 18.Kothari R, Hall K, Brott T, Broderick J. Early stroke recognition: Developing an out-of-hospital nih stroke scale. Academic emergency medicine: official journal of the Society for Academic Emergency Medicine. 1997;4:986–990. [DOI] [PubMed] [Google Scholar]
- 19.Duvekot MHC, Venema E, Rozeman AD, Moudrous W, Vermeij FH, Biekart M, Lingsma HF, Maasland L, Wijnhoud AD, Mulder LJMM, Alblas KCL, van Eijkelenburg RPJ, Buijck BI, Bakker J, Plaisier AS, Hensen JH, Lycklama À Nijeholt GJ, van Doormaal PJ, van Es ACGM, van der Lugt A, Kerkhoff H, Dippel DWJ, Roozenbeek B; PRESTO investigators. Comparison of eight prehospital stroke scales to detect intracranial large-vessel occlusion in suspected stroke (PRESTO): a prospective observational study. Lancet Neurol. 2021. Mar;20(3):213–221. doi: 10.1016/S1474-4422(20)30439-7. [DOI] [PubMed] [Google Scholar]
- 20.McCluskey G, Hunter A, Best E, McKee J, McCarron MO, McVerry F. Radiological eye deviation as a predictor of large vessel occlusion in acute ischaemic stroke. J Stroke Cerebrovasc Dis. 2019;28:2318–2323. doi: 10.1016/j.jstrokecerebrovasdis.2019.05.029. 29. [DOI] [PubMed] [Google Scholar]
- 21.Beume LA, Hieber M, Kaller CP, et al. Large Vessel Occlusion in Acute Stroke: Cortical Symptoms Are More Sensitive Prehospital Indicators Than Motor Deficits. Stroke. 2018;49(10):2323–2329. doi: 10.1161/STROKEAHA.118.022253 [DOI] [PubMed] [Google Scholar]
- 22.Frey T. Updated Guidance on the Reporting of Race and Ethnicity in Medical and Science Journals. AMWA. 2023;38(1). doi: 10.55752/amwa.2023.195 [DOI] [PubMed] [Google Scholar]
- 23.McMullan JT, Katz B, Broderick J, Schmit P, Sucharew H, Adeoye O. Prospective Prehospital Evaluation of the Cincinnati Stroke Triage Assessment Tool. Prehospital Emergency Care. 2017;21(4):481–488. doi: 10.1080/10903127.2016.1274349 [DOI] [PubMed] [Google Scholar]
- 24.Dowbiggin PL, Infinger AI, Purick G, Swanson DR, Studnek JR. Inter-Rater Reliability of the FAST-ED in the Out-of-Hospital Setting. Prehospital Emergency Care. Published online January 12, 2021:1–8. doi: 10.1080/10903127.2020.1852350 [DOI] [PubMed] [Google Scholar]
- 25.Gude M, Kirkegaard H, Blauenfeldt R, et al. Inter-Rater Agreement on Cincinnati Prehospital Stroke Scale (CPSS) and Prehospital Acute Stroke Severity Scale (PASS) Between EMS Providers, Neurology Residents and Neurology Consultants. CLEP. 2023;Volume 15:957–968. doi: 10.2147/CLEP.S418253 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Baser Y, Zarei H, Gharin P, et al. Cincinnati Prehospital Stroke Scale (CPSS) as a Screening Tool for Early Identification of Cerebral Large Vessel Occlusions; a Systematic Review and Meta-analysis. Archives of Academic Emergency Medicine. 2024;12(1):e38. doi: 10.22037/aaem.v12i1.2242 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Pagagos P, Schwamm L. Mission: Lifeline Stroke. Severity-Based Stroke Triage Algorithm for EMS. Dallas (TX): American Heart Association; 2016. Jun 5 [accessed 2023 Dec 16]. https://www.heart.org/en/professional/quality-improvement/mission-lifeline/mission-lifeline-stroke [Google Scholar]
- 28.Kerr AJ, Wang TKM, Jiang Y, Grey C, Wells S, Poppe KK. The importance of considering both primary and secondary diagnostic codes when using administrative health data to study acute coronary syndrome epidemiology (ANZACS-QI 47). European Heart Journal - Quality of Care and Clinical Outcomes. 2021;7(6):548–555. doi: 10.1093/ehjqcco/qcaa056 [DOI] [PubMed] [Google Scholar]
- 29.Mattle HP, Arnold M, Lindsberg PJ, Schonewille WJ, Schroth G. Basilar artery occlusion. Lancet Neurol. 2011;10:1002–1014. doi: 10.1016/S1474-4422(11)70229-0. [DOI] [PubMed] [Google Scholar]
Data Availability Statement
The data used in this study were sourced from the ESO Data Collaborative. Due to data privacy regulations and the de-identified nature of the dataset, the raw data are not publicly available. Further details about accessing the ESO Data Collaborative can be found at https://www.eso.com/data-and-research/.
