Abstract
To facilitate high-quality inpatient care for stroke patients, we built a system within our electronic health record (EHR) to identify stroke patients while they are in the hospital; capture necessary data in the EHR to minimize the burden of manual abstraction for stroke performance measures, decreasing daily time requirement from 2 hours to 15 minutes; generate reports using an automated process; and electronically transmit data to third parties. Provider champions and support from the EHR development team ensured that we balanced the needs of the hospital with those of frontline providers. This work summarizes the development and implementation of our stroke quality system.
Keywords: electronic health records, stroke, documentation
Hospitals are required to report a large and growing number of quality measures. There are currently 8 stroke measures that the Centers for Medicare and Medicaid Services (CMS) uses to evaluate stroke care. These measures align with The Joint Commission’s (TJC’s) Primary Stroke Center (PSC) certification requirements.1 Comprehensive Stroke Centers (CSCs), which must meet standards to treat the most complex stroke patients (who often require surgical intervention), must track even more measures, and many hospitals participate in additional quality improvement programs, such as the Get With The Guidelines (GWTG)-Stroke registry.2,3 In addition to reporting to payers and accreditation agencies, having easily accessible quality data enables hospitals to monitor performance in real time, track the impact of process improvement efforts, and adapt accordingly.4
It takes significant effort and cost to abstract these quality data from the medical record and generate reports.5,6 We identified a key opportunity to improve this process when the University of Michigan Hospital transitioned from an internally developed inpatient electronic health record (EHR) to a commercial Epic EHR system. Simultaneously, we were transitioning from a PSC to a CSC. Our goal was to build a stroke quality system within our EHR so that we could: 1) identify stroke patients while they are in the hospital; 2) automatically capture necessary data to minimize the burden of manual abstraction; 3) generate reports using an automated process; and 4) electronically transmit data to third parties. We aimed to leverage our EHR in a way that increased value for all stakeholders, including frontline clinical staff, the CSC team, the hospital at large, and ultimately patients and families, in the form of better care.7
IDENTIFYING STROKE PATIENTS IN THE HOSPITAL
Patients enter the stroke population for public quality measure reporting purposes based on retrospective identification of a stroke-related discharge diagnosis code. However, many stroke quality metrics, such as dysphagia screening or administration of antithrombotic therapy, are time sensitive. It is not possible to correct deficiencies in such areas if they are discovered only after discharge. Thus, one of the primary goals of our stroke quality system is to identify stroke patients while they are in the hospital. This enables generation of reports and real-time monitoring of inpatient care. When deficiencies are identified, prompt communication with providers helps them implement countermeasures and optimize care prior to discharge.
Our multi-step process covers patients evaluated in the Emergency Department (ED), patients transferred from an outside hospital, and patients already admitted with a non-stroke diagnosis. The first component is an automated search of the patient’s electronic chart for diagnosis codes and terms of interest (Table 1). Initial search criteria are intentionally broad, prioritizing sensitivity over specificity in capturing stroke patients, and include provider use of the Stroke Navigator (described below), Stroke Code activations, and presenting symptoms at ED triage consistent with stroke. Patients meeting inclusion criteria populate a system list accessible to anyone at the institution with EHR access. The number of new patients on this list varies with the mix of patients admitted the day before but averages about 10 per day. The second component is daily manual verification of this auto-populated list to exclude false positives, typically 3 to 5 patients daily. The third component is learning through iteration: information gleaned from manual verification is used, in collaboration with our report writers, to refine the search algorithm and improve its specificity. Additional details regarding the stroke patient list are available in the Supplementary Material.
Table 1.
Electronic medical record data fields and criteria employed in the automated identification of stroke patients. Diagnosis codes and terms of interest are captured from these fields to automatically populate a list of stroke patients currently hospitalized
| Data field | Criteria |
| --- | --- |
| **ED stroke data** | |
| Transfer accepted | Acceptance of a patient from an external institution to the ED for evaluation/management of a stroke |
| Rapid acute stroke screen | “Positive” selection when completing this electronic screening form |
| Stroke team activation | Time of pre-hospital or ED activation of stroke team via page from ED clerk |
| Primary clinical impression | Selected by ED provider from a structured list of options. If multiple diagnoses are selected, only the first is queried. |
| Reason for admission | Selected by the inpatient admitting provider upon patient admission, from a structured list. If multiple reasons are selected, only the first is queried. |
| Stroke Unit admission | Patient admission destination of Stroke Unit or Neurological Intensive Care Unit |
| **Inpatient stroke data** | |
| Inpatient diagnoses | Selected by inpatient provider from a structured list of options. If multiple diagnoses are selected, only the first is queried. |
| Use of the Stroke Navigator | Used by the inpatient teams to document required facets of stroke care |
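To make the inclusion logic concrete, the sketch below expresses the Table 1 criteria as a single sensitivity-first predicate. The `Patient` record, its field names, and the term lists are hypothetical stand-ins for the actual EHR report fields; the production search runs within the EHR’s reporting tools rather than in application code.

```python
# Minimal sketch of the broad inclusion screen behind the automated stroke
# patient list. All field names and terms are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Patient:
    ed_transfer_accepted: bool = False          # accepted from outside hospital
    rapid_stroke_screen_positive: bool = False  # "Positive" on screening form
    stroke_team_activated: bool = False         # pre-hospital or ED page
    primary_clinical_impression: Optional[str] = None  # first diagnosis only
    reason_for_admission: Optional[str] = None         # first reason only
    admission_unit: Optional[str] = None
    inpatient_diagnosis: Optional[str] = None          # first diagnosis only
    used_stroke_navigator: bool = False

STROKE_TERMS = {"ischemic stroke", "intracerebral hemorrhage",
                "subarachnoid hemorrhage", "transient ischemic attack"}
STROKE_UNITS = {"Stroke Unit", "Neurological Intensive Care Unit"}

def matches_stroke_criteria(p: Patient) -> bool:
    """Sensitivity-first screen: any single criterion adds the patient to
    the system list; false positives are removed by daily manual review."""
    def is_stroke_term(value: Optional[str]) -> bool:
        return value is not None and value.lower() in STROKE_TERMS
    return (
        p.ed_transfer_accepted
        or p.rapid_stroke_screen_positive
        or p.stroke_team_activated
        or is_stroke_term(p.primary_clinical_impression)
        or is_stroke_term(p.reason_for_admission)
        or p.admission_unit in STROKE_UNITS
        or is_stroke_term(p.inpatient_diagnosis)
        or p.used_stroke_navigator
    )
```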
AUTOMATICALLY CAPTURING DATA FROM THE EHR
We designed structured data elements (SDEs) to facilitate electronic abstraction of data relevant to quality measures. The use of SDEs to improve aggregation of clinical data has been described for other specialties and disease processes.8–10 In the Epic EHR, related SDE entry fields and workflow tasks can be grouped together into tools called navigators. Using the concept of a Stroke Navigator, first developed at Stanford University Hospital, we built separate electronic data entry forms and flowsheets that neurology and neurosurgery providers (physicians and advanced practice providers) use to document discrete data for patients with ischemic stroke, intracerebral hemorrhage (ICH), or subarachnoid hemorrhage. Each flowsheet captures data elements specific to that pathology. For example, we capture the ICH score only in the ICH form, while the National Institutes of Health Stroke Scale score is exclusive to the ischemic stroke form. Figure 1 illustrates the fields in the admission ischemic Stroke Navigator.
Figure 1.
Sample Stroke Navigator data entry fields, Ischemic Stroke Navigator Initial Assessment.
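As an illustration of how etiology-specific SDEs can be modeled, the sketch below separates shared fields from pathology-exclusive ones. Class and field names are hypothetical; in the Epic build these correspond to flowsheet rows grouped within each navigator.

```python
# Illustrative model of pathology-specific structured data elements (SDEs).
# Names are assumptions; actual elements live as Epic flowsheet rows.
from dataclasses import dataclass
from typing import Optional

@dataclass
class StrokeSDEBase:
    # Elements shared across stroke etiologies
    last_known_well: Optional[str] = None
    dysphagia_screen_completed: Optional[bool] = None

@dataclass
class IschemicStrokeSDE(StrokeSDEBase):
    nihss_score: Optional[int] = None  # NIH Stroke Scale: ischemic form only
    tpa_given: Optional[bool] = None

@dataclass
class IntracerebralHemorrhageSDE(StrokeSDEBase):
    ich_score: Optional[int] = None    # ICH score: captured only in ICH form
```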
In addition, logic built into the various fields of the Stroke Navigator prompts for documentation of the reason for deviation from a particular compliance measure. For example, if an ischemic stroke patient receives the clot-dissolving medication tPA outside of our goal of 30 minutes after arrival, providers are prompted to select the reason from a structured list (eg, hypertension, diagnostic uncertainty). Similarly, when tPA is not given, there is a field to document the clinical rationale or contraindication. The verbiage of the pre-specified options aligns with reporting requirements to GWTG-Stroke. For all fields, there is also an option for providers to enter a free-text response if none of the standard selections is appropriate. Additional information regarding the SDEs in the Stroke Navigator is available in the Supplementary Material.
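A minimal sketch of this deviation-prompting logic follows, assuming the 30-minute door-to-needle goal stated above. The reason lists are abridged placeholders, not the production GWTG-aligned option sets, and the free-text fallback is represented by a single entry.

```python
# Sketch of conditional deviation prompts; reason lists are abridged
# placeholders rather than the actual structured options.
from typing import List, Optional

DOOR_TO_NEEDLE_GOAL_MIN = 30  # institutional goal stated in the text
DELAY_REASONS = ["hypertension", "diagnostic uncertainty", "other (free text)"]
NO_TPA_REASONS = ["contraindication", "outside treatment window", "other (free text)"]

def required_prompts(tpa_given: bool, door_to_needle_min: Optional[float]) -> List[str]:
    """Return the follow-up documentation prompts a provider must complete."""
    if not tpa_given:
        return ["Reason tPA not given: " + ", ".join(NO_TPA_REASONS)]
    if door_to_needle_min is not None and door_to_needle_min > DOOR_TO_NEEDLE_GOAL_MIN:
        return ["Reason for delay: " + ", ".join(DELAY_REASONS)]
    return []  # treated within goal: no deviation documentation required
```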
After completing the Stroke Navigator, providers can incorporate this information into clinical documentation via an EHR tool known in Epic by the proprietary name SmartPhrase (Figure 2). In this way, key information is readily available to other EHR users during their note review, without requiring the user to be aware of the Stroke Navigator or to view it directly. This is particularly important for providers outside of the neurology and neurosurgery services, who do not use the Stroke Navigator in their clinical practice. At the time of discharge, an etiology-specific discharge Stroke Navigator captures essential discharge metrics, and a corresponding SmartPhrase populates these into the patient’s discharge summary. Pop-up alerts remind all providers to use the Stroke Navigator for required documentation during both admission and discharge workflows for stroke patients.
Figure 2.
Corresponding documentation in a note, Ischemic Stroke Navigator Initial Assessment structured data in note.
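Conceptually, the SmartPhrase step is a template fill: structured navigator values are rendered into note text so that any reader sees them without opening the navigator. The sketch below is a simplified, hypothetical analogue of that proprietary mechanism, with placeholder field names.

```python
# Simplified analogue of pulling structured navigator data into a note.
# Template and field names are hypothetical, not Epic's SmartPhrase syntax.
NOTE_TEMPLATE = (
    "STROKE QUALITY DATA\n"
    "NIHSS on arrival: {nihss_score}\n"
    "tPA given: {tpa_given}\n"
    "Dysphagia screen completed: {dysphagia_screen_completed}\n"
)

def render_stroke_summary(sde_values: dict) -> str:
    """Fill the note template from navigator entries, flagging gaps."""
    fields = ("nihss_score", "tpa_given", "dysphagia_screen_completed")
    defaults = {k: "***NOT DOCUMENTED***" for k in fields}
    return NOTE_TEMPLATE.format(**{**defaults, **sde_values})

print(render_stroke_summary({"nihss_score": 4, "tpa_given": True}))
```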
To facilitate provider engagement with this workflow, we identified champions in the various stroke service lines who could serve as a bridge between the CSC team and frontline providers. We incorporated feedback from the champions and frontline providers to implement improvements in the navigator, which enhanced participation. Providers have faced significant challenges in the EHR era with burnout, stemming from strict documentation requirements that may not directly relate to clinical care yet add clerical workload.11,12 Cognizant of this potential burden, our goal was to ensure that data entry into the Stroke Navigator did not adversely impact clinical workflow. From the order of the input fields to their auto-populated content, frontline providers told us what they wanted, and we built the tool to meet their needs. Others have reported lack of provider buy-in for templates that were difficult to navigate or inefficient to complete.8
While there is an abundance of information that could be captured in a Stroke Navigator, we purposely limited the information collected to those elements essential for quality improvement and review. An important consideration in reducing the risk of physician burnout is not mandating documentation solely for its own sake.11 Although frontline providers were willing to use the navigators, many would, in an ideal state, rather dictate or type their notes as unstructured narrative. This dual, and sometimes conflicting, goal of documentation serving both the patient and organizational requirements is not unique to our process.13 Despite our efforts to minimize burden, an unavoidable amount of additional workload remains to document adherence to performance measures, as certain data elements are collected exclusively for compliance purposes and are not necessarily documented explicitly in routine clinical care. Physician champions played an important role in encouraging initially reluctant providers to nonetheless document these factors, maintaining credibility by modeling compliance.14 Support from senior department leadership was also instrumental.
One of the most critical drivers of provider buy-in was a close partnership with clinical documentation specialists on our EHR development team. Once Stroke Navigators were built, we were able to make rapid changes, based on provider feedback. For minor updates, turnaround times were as fast as 1 business day, in marked contrast to other institutional EHR changes, which may take months. The ability for clinicians to act as effective change agents dramatically improved their investment in and adoption of the Stroke Navigators. The clinical documentation specialists also found the collaboration fruitful, as they felt supported and valued by the end-users.
GENERATING AUTOMATED REPORTS
One of the major goals of our project was to develop a reporting infrastructure to monitor real-time performance on stroke quality measures. Other groups have described the superiority of concurrent review over an audit-and-feedback approach in helping providers meet stroke quality benchmarks and deliver consistently high-level care.15 Any EHR user can access a Stroke Summary report (Figure 3), developed in collaboration with colleagues at The Ohio State University, that displays quality data in a user-friendly way. Our stroke coordinator can, for example, review the list of stroke patients in the hospital, identify patients lacking an element, such as screening for depression, and communicate that potential deficiency to the appropriate teams, enabling intervention. Beyond the CSC staff, universal access to this report empowers individual nursing units to review their own quality performance, fostering bottom-up ownership that motivates adherence.
Figure 3.
Stroke summary report.
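The core of such a report is a deficiency scan over the current inpatient stroke list. A sketch follows; the element names are illustrative examples drawn from the text, and the real report is built within the EHR’s reporting framework rather than in standalone code.

```python
# Sketch of the deficiency check behind a Stroke Summary-style report:
# scan the current stroke census for patients missing a required element.
# Element names are illustrative assumptions.
from typing import Dict, List

REQUIRED_ELEMENTS = ["dysphagia_screen", "depression_screen", "antithrombotic_by_day_2"]

def find_deficiencies(patients: List[dict]) -> Dict[str, List[str]]:
    """Map patient ID -> list of required elements not yet documented."""
    gaps = {}
    for p in patients:
        missing = [e for e in REQUIRED_ELEMENTS if not p.get(e)]
        if missing:
            gaps[p["patient_id"]] = missing
    return gaps

census = [
    {"patient_id": "A", "dysphagia_screen": True, "depression_screen": False},
    {"patient_id": "B", "dysphagia_screen": True, "depression_screen": True,
     "antithrombotic_by_day_2": True},
]
print(find_deficiencies(census))  # {'A': ['depression_screen', 'antithrombotic_by_day_2']}
```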
Prior to the development of our stroke quality system, abstraction of stroke quality data and patient list creation was performed manually by CSC staff and took an average of 2 hours per day. Currently, abstraction takes approximately 15 minutes daily Tuesday through Friday, and 45 minutes on Mondays, which includes review of weekend data. This affords the CSC data analytics team significantly more time for specific quality improvement initiatives.
ELECTRONICALLY TRANSMITTING DATA TO THIRD PARTIES
As all of our quality data are in an electronic format, they can be readily shared with third parties without the need for manual entry. The accuracy of electronic Clinical Quality Measures derived from EHR abstraction has been demonstrated by other groups, particularly when SDEs are employed.16 We currently transmit core stroke measures electronically to an external vendor that supports compliance with TJC guidelines. Furthermore, 95% of relevant elements are auto-extracted to the American Heart Association’s GWTG Patient Management Tool. As third-party reporting requirements evolve, so too can our data capture and reporting framework, enabling agility with minimal incremental investment, thanks to significant up-front effort to build the platform and critical collaborations.
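Because the measures already exist as structured fields, exporting them is largely mechanical. The sketch below writes a flat extract to CSV purely as an illustration; the actual feeds to the vendor and the GWTG Patient Management Tool use their own specified formats and interfaces.

```python
# Illustrative flat-file export of structured quality measures. Field names
# are placeholders; real third-party feeds follow vendor-specified formats.
import csv
from typing import List

FIELDS = ["patient_id", "nihss_score", "tpa_given", "dysphagia_screen"]

def export_measures(patients: List[dict], path: str) -> None:
    """Write one row per patient; undocumented fields are left blank."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS,
                                extrasaction="ignore", restval="")
        writer.writeheader()
        writer.writerows(patients)
```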
LIMITATIONS
While SDEs are very useful for abstracting clinical data for both primary clinical care and secondary quality improvement or research purposes, SDE entry can slow providers at the initial data capture stage.8 Providers are accustomed to natural language, and SDE input lacks its expressiveness and clarity.8,10 Others have raised concern that pre-populated data in note templates may be incorrect or outdated yet retained anyway, with widespread use of note templates and SDEs driven by a transformation of the note from a clinical document into a billing document.13 Ideal EHR documentation, then, will strike a balance between structured and unstructured fields, determined by providers and context.9
While we built our specific EHR stroke quality system using an Epic platform, the underlying principles are vendor agnostic, and could be applied in institutions using other EHRs. Ultimately, success of such a program is dependent more on institutional culture and partnerships than specific technology.
CONCLUSIONS
We built a system within our EHR to track measures we find meaningful in the care of stroke patients. Through an iterative, collaborative process, provider buy-in was enhanced by key feedback from frontline providers and rapid turnaround on enhancements from backend EHR development staff. The result is a set of dynamic and agile Stroke Navigators and reporting tools that, rather than duplicating workload across the conflicting processes of patient care and quality monitoring and reporting, leverage real-time documentation to support high-quality care for stroke patients.
FUNDING
No funding was obtained for this specific project or report. This work was performed as a quality improvement initiative by employees of Michigan Medicine as a part of their normal job duties and within their salary support.
CONTRIBUTORS
Study concept and design: Nathan, Foley, Adelman
Acquisition of data: Nathan, Foley, Hoang, Hiner, Brooks, Gendreau, Adelman
Interpretation of data: Nathan, Foley, Hoang, Meurer, Pandey, Adelman
Drafting and revising the manuscript for content: all authors
Conflict of interest statement. None declared.
SUPPLEMENTARY MATERIAL
Supplementary material is available at Journal of the American Medical Informatics Association online.
Acknowledgments
The authors would like to thank all of the team members from the University of Michigan and other collaborating institutions who helped build stroke quality tools for the EHR.
REFERENCES
- 1. Certification for Primary Stroke Centers. https://www.jointcommission.org/certification/primary_stroke_centers.aspx. Accessed May 7, 2018.
- 2. Fonarow GC, Reeves MJ, Smith EE, et al. Characteristics, performance measures, and in-hospital outcomes of the first one million stroke and transient ischemic attack admissions in Get With The Guidelines-Stroke. Circ Cardiovasc Qual Outcomes 2010;3(3):291–302.
- 3. Get With The Guidelines—Stroke Overview. 2017. http://www.heart.org/HEARTORG/Professional/GetWithTheGuidelines/GetWithTheGuidelines-Stroke/Get-With-The-Guidelines-Stroke-Overview_UCM_308021_Article.jsp#.Wc6RuLl1qpo. Accessed May 7, 2018.
- 4. Safran C, Bloomrosen M, Hammond WE, et al. Toward a national framework for the secondary use of health data: an American Medical Informatics Association white paper. J Am Med Inform Assoc 2007;14(1):1–9.
- 5. Schuster MA, Onorato SE, Meltzer DO. Measuring the cost of quality measurement: a missing link in quality strategy. JAMA 2017;318(13):1219–20.
- 6. Casalino LP, Gans D, Weber R, et al. US physician practices spend more than $15.4 billion annually to report quality measures. Health Aff (Millwood) 2016;35(3):401–6.
- 7. Baron RJ. Meaningful use of health information technology is managing information. JAMA 2010;304(1):89–90.
- 8. Gronkiewicz C, Diamond EJ, French KD, et al. Capturing structured, pulmonary disease-specific data elements in electronic health records. Chest 2015;147(4):1152–60.
- 9. Rosenbloom ST, Denny JC, Xu H, et al. Data from clinical notes: a perspective on the tension between structure and flexible documentation. J Am Med Inform Assoc 2011;18(2):181–6.
- 10. Manion FJ, Harris MR, Buyuktur AG, et al. Leveraging EHR data for outcomes and comparative effectiveness research in oncology. Curr Oncol Rep 2012;14(6):494–501.
- 11. Sigsbee B, Bernat JL. Physician burnout: a neurologic crisis. Neurology 2014;83(24):2302–6.
- 12. Shanafelt TD, Dyrbye LN, Sinsky C, et al. Relationship between clerical burden and characteristics of the electronic environment with physician burnout and professional satisfaction. Mayo Clin Proc 2016;91(7):836–48.
- 13. Bernat JL. Ethical and quality pitfalls in electronic health records. Neurology 2013;80(11):1057–61.
- 14. Yackanicz L, Kerr R, Levick D. Physician buy-in for EMRs. J Healthc Inf Manag 2010;24(2):41–4.
- 15. McGillivray CG, Silver B. Evaluating the effectiveness of concurrent review: does it improve stroke measure results? Qual Manag Health Care 2017;26(2):97–102.
- 16. Phipps MS, Fahner J, Sager D, et al. Validation of stroke meaningful use measures in a national electronic health record system. J Gen Intern Med 2016;31(Suppl 1):46–52.