Journal of the American Medical Informatics Association: JAMIA. 2018 Dec 5;25(12):1651–1656. doi: 10.1093/jamia/ocy133

Integrity of clinical information in computerized order requisitions for diagnostic imaging

Ronilda Lacson 1,2, Romeo Laroya 1,2, Aijia Wang 1, Neena Kapoor 1,2, Daniel I Glazer 2,3, Atul Shinagare 1,2, Ivan K Ip 1,2, Sameer Malhotra 4, Keith Hentel 5, Ramin Khorasani 1,2
PMCID: PMC7647161  PMID: 30517649

Abstract

Objective

To assess information integrity (concordance and completeness of documented exam indications in electronic health record [EHR] imaging order requisitions, compared with EHR provider notes) and the potential impact of indication inaccuracies on exam planning and interpretation.

Methods

This retrospective study, approved by the Institutional Review Board, was conducted at a tertiary academic medical center. A total of 139 MRI lumbar spine (LS-MRI) and 176 CT abdomen/pelvis orders performed 4/1/2016-5/31/2016 were randomly selected and reviewed by 4 radiologists for concordance and completeness of relevant exam indications in order requisitions compared with provider notes, and for potential impact of indication inaccuracies on exam planning and interpretation. Forty LS-MRI and 40 CT abdomen/pelvis orders were re-reviewed to assess kappa agreement.

Results

Requisition indications were more likely to be incomplete (256/315, 81%) than discordant (133/315, 42%) compared to provider notes (p < 0.0001). Potential impact of discrepancy between clinical information in requisitions and provider notes was higher for radiologist’s interpretation than for exam planning (135/315, 43%, vs 25/315, 8%, p < 0.0001). Agreement among radiologists for concordance, completeness, and potential impact was moderate to strong (Kappa 0.66-0.89).

Discussion

Indications in EHR order requisitions are frequently incomplete or discordant compared to physician notes, potentially impacting imaging exam planning, interpretation, and accurate diagnosis. Such inaccuracies could also diminish the relevance of clinical decision support alerts if based on information in order requisitions.

Conclusions

Improved availability of relevant documented clinical information within EHR imaging requisition is necessary for optimal exam planning and interpretation.

Keywords: computerized physician order entry, patient safety, health information technology, diagnostic imaging

INTRODUCTION

The Health Information Technology for Economic and Clinical Health (HITECH) Act of 20091 promotes meaningful use of electronic health records (EHRs) and encourages the use of computerized provider order entry (CPOE) systems and the electronic transmission of diagnostic test results.2 The Protecting Access to Medicare Act of 2014 (PAMA)3 aims to promote evidence-based care by including provisions that will require ambulatory providers to consult computerized clinical decision support (CDS) systems when ordering certain advanced diagnostic imaging exams (eg CT, MRI). Computerized systems for ordering imaging exams can improve physician workflow while enhancing quality of patient care,4–8 but it is critical to understand the facilitators and limitations of these systems as they become more frequently utilized in the healthcare setting. Order entry for diagnostic imaging has evolved from paper order entry, which historically suffered from illegibility and inefficiency.9 The use of CPOE has been shown to decrease duplicate image orders for diagnostic imaging examinations,7 as well as improve the quality of indications documented in imaging order requisitions, compared to paper-based order entry.4,10,11

A recent human factors analysis of computerized consultation orders from primary care providers revealed that major usability problems resulted from the inability of computerized templates to support clinicians’ information needs, ranging from the referring clinicians’ contact information to the reasons/indications for consultation.12 Effective communication between providers is essential to enhancing patient safety; providing indications for imaging examination requests has been shown to be valuable to radiologists in interpreting imaging examinations and making recommendations.13,14 In a small descriptive study,13 radiologists who were provided more clinical information produced more accurate diagnoses. The National Academy of Medicine, in its report Improving Diagnosis in Health Care,15 emphasizes the importance of health information technology in supporting the diagnostic process. A major contributing factor to diagnostic process error is non-availability of information.16

There has been no analytical study assessing information integrity, defined as concordance and completeness of documented exam indications in electronic order requisitions when compared to provider notes within the same EHR, and its potential impact on exam planning, also referred to as protocolling. Protocolling is the step of translating a diagnostic imaging request from a referring provider (eg abdominal MRI) into the specific imaging protocol best suited to answer the relevant clinical question, as determined by the radiologist.17 There is likewise no study assessing potential impact on exam interpretation, a key component of the diagnostic process.

Therefore, the purpose of this study was to assess the proportion of imaging order requisitions with inaccurate or incomplete examination indications compared with provider notes, and evaluate the potential impact on imaging exam protocol selection and interpretation.

METHODS

Setting

This retrospective, observational, cohort study was conducted at an urban, academic, quaternary care hospital and ambulatory center, and included inpatient and outpatient imaging examinations. The requirement to obtain informed consent was waived by the Institutional Review Board for this Health Insurance Portability and Accountability Act-compliant study. The study institution utilizes a CPOE system for imaging (Epic Systems Corporation, Madison, WI), integrated into the EHR. After logging in with a password-protected account, a physician or proxy creates orders for a specific study from predetermined structured menus and can optionally enter free text for any additional clinical context they would like to include. All relevant information necessary to specify an imaging examination and its clinical indications is captured in the system (eg examination modality, reason for exam, associated diagnosis). The CPOE system is integrated into an enterprise scheduling module, which enables ordering providers or their proxies to schedule radiologic examinations online without the need to contact a radiology central scheduling office.

Data collection

We reviewed a sample of 139 MRI lumbar spine and 176 CT abdomen/pelvis order requisitions randomly selected from among the 13 064 such imaging examinations completed between April and May 2016. A sample size calculation to detect a 16% effect size at 80% power and alpha of 0.05, with an estimated baseline proportion of 69% impact of data integrity on clinical interpretation,18 yielded 284 examinations. We included 315 order requisitions (2.4% of 13 064), divided between MRI lumbar spine and CT abdomen/pelvis based on the proportion of order requisitions accounted for by these examinations at the study institution. We specifically chose these examinations because they are included in the list of advanced diagnostic examinations for which, per regulations promulgated under PAMA, physicians must consult government-approved, evidence-based appropriate-use criteria through a certified CDS system prior to ordering, beginning January 2020.3 We also accessed provider notes (including history and physical examination, progress notes, and telephone encounter notes) immediately preceding the order requisition, and up to one year prior to the requisition date, from the institutional research patient data repository.
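For illustration, a minimal sketch of a two-proportion power calculation in Python with statsmodels. The paper does not state the software or exact parameterization used, so we assume here that the 16% effect size is an absolute difference from the 69% baseline; the result therefore need not reproduce the reported 284 exactly.

    # Minimal sketch of the sample-size calculation (assumptions noted above).
    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    p_baseline = 0.69                   # estimated baseline proportion (reference 18)
    p_alternative = p_baseline - 0.16   # assumed 16% absolute effect size

    effect = proportion_effectsize(p_baseline, p_alternative)  # Cohen's h
    n_per_group = NormalIndPower().solve_power(
        effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided"
    )
    print(f"required sample size per group: {n_per_group:.0f}")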

Manual review

The order requisitions and associated patient records were reviewed by a radiology fellow and three radiology attending physicians for concordance and completeness of the examination indications provided within the requisition as compared to the patient record. Each physician was instructed to review only the order requisition provided, and asked to determine the corresponding imaging protocol they would select (eg routine abdomen/pelvis CT without contrast). Subsequently, they were asked to review the provider notes, and to rate concordance and completeness of the order requisition indications, compared to the provider notes as binary (ie yes/no) variables. A case was deemed concordant if the indication in the requisition agreed with what was documented in the provider notes. A case was determined to be complete if all of the indications pertinent to the ordered imaging study from the provider notes were reflected in the requisition. Individual cases could be both discordant and incomplete.

When an order requisition was discordant, reviewers were asked to assess potential impact on protocol and interpretation. Thus, for a case with a discordant indication (eg requisition indication states “past kidney stone,” whereas provider note states “low urinary stream”), the reviewer was asked whether the discordant indication potentially impacted protocol selection as a binary (ie yes/no) variable. In this instance, impact on protocol selection means that a request for an abdominal CT scan for staging would have been changed to a different CT scan protocol or a different imaging modality based on the discordant indication. Reviewers were also asked to assess whether the discordant indication potentially impacted exam interpretation as a binary (ie yes/no) variable, specifically assessing whether the discordance would have potentially changed interpretation if they had information from the provider notes. In a case with an incomplete requisition (eg requisition indication states “abdominal pain,” whereas provider notes document “right-sided abdominal pain that is worse with bending and is likely muscular”), they were asked to assess potential impact on both protocol selection and exam interpretation separately.

Each order requisition was assigned to one physician. A subset of 40 MRI lumbar spine and 40 CT abdomen/pelvis examinations was reviewed by 2 physicians to assess inter-reviewer agreement. Disagreements were subsequently reconciled. For MRI lumbar spine, this was performed by consensus between the 2 reviewers. For CT abdomen/pelvis, a third reviewer was empirically selected to break the tie.

Outcomes

As the primary outcomes, we measured the proportion of imaging order requisitions with discordant or incomplete examination indications when compared to provider notes (as the gold standard). We also assessed the proportion that reviewers perceived could have impacted protocol selection or exam interpretation. The denominator for both measures includes all examinations for which data in the order requisition were discordant with the provider notes plus those for which data in the requisition were incomplete when compared to the provider notes. The numerators are: (1) the number of cases from the denominator for which protocol selection could have been impacted, and (2) the number of cases from the denominator for which exam interpretation could have been impacted.

Secondarily, we compared the proportion of (1) and (2) in examinations for which data in the requisition were discordant with the provider notes and those for which data in the requisition were incomplete when compared to the provider notes.
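A minimal sketch of these outcome definitions, assuming one record per reviewed requisition with boolean flags captured during manual review; the record structure and field names here are hypothetical.

    # Outcome computation sketch; the two example records are made up.
    cases = [
        {"discordant": True, "incomplete": True,
         "protocol_impact": False, "interpretation_impact": True},
        {"discordant": False, "incomplete": True,
         "protocol_impact": False, "interpretation_impact": False},
        # ... one record per reviewed requisition
    ]

    # Denominator: requisitions that were discordant, incomplete, or both.
    denominator = [c for c in cases if c["discordant"] or c["incomplete"]]

    # Numerators: cases in the denominator with perceived potential impact.
    p_protocol = sum(c["protocol_impact"] for c in denominator) / len(denominator)
    p_interp = sum(c["interpretation_impact"] for c in denominator) / len(denominator)
    print(f"protocol: {p_protocol:.0%}; interpretation: {p_interp:.0%}")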

Statistical analysis

All statistical analyses were performed using commercially available software (SAS Institute, Cary, NC). The chi-square test was used to assess the potential impact of discordant or incomplete requisition indications on exam interpretation compared to protocol selection. A two-tailed p value of <0.05 was considered significant.

Unweighted kappa analysis was used to calculate inter-reviewer agreement.
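As an illustration, equivalent analyses can be run in Python; scipy and scikit-learn are used here as stand-ins for the SAS procedures the authors report. The counts are the published totals from Table 3, and the reviewer ratings are made-up examples.

    # Chi-square comparison and unweighted kappa (illustrative only).
    from scipy.stats import chi2_contingency
    from sklearn.metrics import cohen_kappa_score

    # 2x2 table: impacted vs not impacted, for interpretation vs protocol (n = 315).
    table = [[135, 315 - 135],   # interpretation: impacted / not impacted
             [25, 315 - 25]]     # protocol selection: impacted / not impacted
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.1f}, p = {p_value:.1e}")

    # Unweighted kappa for two reviewers' binary ratings on the same cases.
    reviewer_1 = [1, 1, 0, 1, 0, 0, 1, 1]
    reviewer_2 = [1, 0, 0, 1, 0, 0, 1, 1]
    print(f"kappa = {cohen_kappa_score(reviewer_1, reviewer_2):.2f}")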

RESULTS

Table 1 describes demographic information for the analyzed study requisitions, including the most common indications reported in the requisitions for the requested studies.

Table 1.

Demographic information

Type of imaging exam (n = 315)                     CT abdomen/pelvis       MRI lumbar spine
Number of requisitions                             176                     139
Indications reported as free text                  69 (39%)                63 (45%)
 (in addition to structured data)
Most common indications                            Abdominal pain (36%)    Low back pain (77%)
                                                   Cancer follow-up (20%)  Leg pain (8%)
                                                   Abnormal prior (7%)     Leg weakness (4%)
Patient characteristics
 Patient mean age (years)                          60                      59
 Patient sex                                       Female (62%)            Female (65%)
                                                   Male (38%)              Male (35%)

Potential impact of incomplete and discordant indications on exam interpretation and protocol selection

Table 2 shows the number and percent of examinations with discordant or incomplete order indications when compared to data in the provider notes. Significantly more order requisitions contained incomplete than discordant indications (81% vs. 42%; p < 0.0001).

Table 2.

Imaging order requisitions with discordant and incomplete indications compared to clinical notes

                        Discordant               Incomplete
                        No (%)       Yes (%)     No (%)      Yes (%)
Abdomen CT (n=176)      108 (61%)    68 (39%)    39 (22%)    137 (78%)
Lumbar MRI (n=139)      74 (53%)     65 (47%)    20 (14%)    119 (86%)
Combined (n=315)        182 (58%)    133 (42%)   59 (19%)    256 (81%)

Among all of the examinations with incomplete or discordant indications, in 43% (135/315) exam interpretation was potentially impacted, and in 8% (25/315) protocol selection was potentially impacted (Table 3). This difference was statistically significant (p < 0.0001).

Table 3.

Potential impact of incomplete or inaccurate order indications on protocol selection and exam interpretation

                                    Impact on protocol (%)    Impact on interpretation (%)    p-value
Discordant requisitions (n=133)     7 (5%)                    19 (14%)                        0.01a
Incomplete requisitions (n=256)     25 (10%)                  134 (52%)                       <0.0001a
All requisitions (n=315)            25 (8%)                   135 (43%)                       <0.0001a

a Statistically significant.

Inter-reviewer agreement

Table 4 shows kappa agreement scores for concordance, completeness, and potential impact on protocolling and interpretation, all indicating moderate to strong agreement between the 4 independent reviewers.

Table 4.

Kappa agreement between manual reviewers

Variables Kappa
Concordance 0.83
Discordance – impacted protocol 0.79
Discordance – impacted interpretation 0.89
Completeness 0.66
Incomplete – impacted protocol 0.83
Incomplete – impacted interpretation 0.72

Table 5 illustrates examples of order requisitions and corresponding data in the clinical notes that were discordant or incomplete. In the first example, the Order Requisition indicates that the Reason for Exam is Kidney Stone. In the provider notes, the provider indicates that there are no symptoms (eg hematuria or flank pain) to suggest kidney stone. This illustrates clearly discordant data. In the second example, the Order Requisition indicates that the Reason for Exam is Adrenal Cancer. The provider note indicates a new gall bladder mass, not noted in the Order Requisition and leading to an assessment of Incomplete Data.

Table 5.

Examples illustrating incomplete and discordant data

Example 1: Discordant data
 Order requisition: CT Abdomen/Pelvis; Associated Diagnosis: Calculus of left ureter; Reason for Exam: + Kidney Stone(S)/Renal Calculi [Known Dx]
 Provider notes: Low urine stream; currently no hematuria or flank pain to suggest kidney stone; plan to get KUB today.
 Impact on interpretation/protocol: Urinary symptoms: low urine stream. The indication for CT abdomen/pelvis is for a previous diagnosis.

Example 2: Incomplete data
 Order requisition: CT Abdomen/Pelvis; Associated Diagnosis: Adrenal cancer, unspecified laterality (C74.90); Reason for Exam: [Malignancy] Adrenal (Known Act Malig Under Tx; Adrenal Cancer)
 Provider notes: Left pheochromocytoma. S/P retroperitoneoscopic left adrenalectomy. Tumor invasion into periadrenal adipose tissue. She has a new gall bladder mass detected elsewhere, which could represent a pheochromocytoma metastasis.
 Impact on interpretation/protocol: Knowing the presence of a gall bladder mass may potentially impact selection of a test protocol or modality, as well as interpretation.

Example 3: Incomplete data
 Order requisition: CT Abdomen/Pelvis; Associated Diagnosis: Epigastric pain (R10.13); Reason for Exam: Abdominal pain
 Provider notes: Patient has weight change, nausea, abdominal distention, and long-standing constipation; prior workup yielded no diagnosis.
 Impact on interpretation/protocol: The extensive clinical history may potentially impact interpretation.

DISCUSSION

Our overall goal was to assess indication inaccuracies in EHR order requisitions and evaluate their potential impact on imaging exam protocoling and interpretation. We found that requisition data often lack integrity, being both incomplete and discordant with documented data from the provider notes: 42% of order requisition indications were not concordant with data from provider notes, and 81% of order requisitions lacked relevant clinical information already documented in the provider notes. In our EHR, an order requisition is created by a physician or proxy by selecting an imaging examination from a list of those available to order. A pre-determined menu for each order allows the collection of structured indications for an ordered imaging examination as well as relevant free text clinical information, similar to what has been reported in the literature.19 Thus, there is no verbal communication with a radiologist, who subsequently selects an imaging protocol and interprets the images based on the requisition. Radiologists have the option of searching the EHR for additional clinical information, including the provider notes. However, time pressures20,21 and the additional workflow burden of navigating the EHR in search of potentially relevant additional information mean that such efforts may not be practical or routine. Thus, radiologists may rely primarily on data from the order requisitions rather than incorporating other data from the electronic medical record in clinical decision making.

Previous studies have shown that accurate clinical information improves the accuracy of radiology reports.13,14 Radiologists, however, have no influence on the quality of clinical information provided. Multiple studies have demonstrated the impact CPOE has had on both the availability and quality of clinical history provided to radiologists by other providers.4,11 The improvements compared to paper requisitions include increased availability of information on prior diagnoses11 and better indication quality for imaging tests.4 In a separate study comparing clinical history in paper request slips to that in an electronic system, 62% of exams ordered had incomplete or discrepant data.18 The authors noted that this finding was a consequence of manual entry of clinical information copied from paper request slips. In our study using electronic order requisitions, we demonstrate that order requisitions frequently do not contain indications that are complete or concordant with data in provider notes. This discrepancy potentially impacts both optimal selection of imaging protocols and exam interpretation by radiologists and could thus contribute to diagnostic errors. Decision support based on incomplete or discordant information in electronic imaging requisitions could likewise lose clinical relevance, contributing to decision-making errors, alert fatigue, and physician burnout.22,23

The potential impact of inaccurate information on imaging exam interpretation is significant because interpretation forms the basis for diagnosis and further patient management. Radiologists form their assessment of patients’ diagnoses (eg pneumonia) based on imaging findings as well as clinical data.13 Without accurate and complete information, their assessment and recommended management may not be as precise. The Institute of Medicine (now the National Academy of Medicine), in its report To Err is Human: Building a Safer Health System,24 estimated that between 44 000 and 98 000 people die every year from preventable medical errors; diagnostic errors are a common, yet underreported, contributor to these errors.25 It is, therefore, necessary to address this important factor that impacts patient safety.

In addition, inaccurate information potentially impacts selection of the imaging exam protocol. Although the impact on protocol selection (8%) was smaller than on exam interpretation (43%), it remains significant because an incorrect protocol can decrease quality across the imaging value chain and can lead to inaccurate diagnosis, vague clinical correlations, increased costs, and unnecessary radiation exposure.26–28

There is a need to provide accurate and relevant clinical information to radiologists when ordering imaging exams to ensure optimal communication between ordering providers and radiologists. However, ordering providers are also pressed for time when generating orders; they already spend too much time on “paperwork” and documentation.29,30 Thus, electronic systems have a role to play in improving physician workflow by making relevant data available to radiologists (and other providers) whenever an imaging exam is ordered or a specialist is consulted. The data are often in the electronic health record, but frequently unstructured and not optimally accessible.31–33 Fortunately, multiple studies have shown that these data can be harvested and presented with a more relevant structure and improved content (more accurate clinical information) using various information technology solutions.34–36 Information processing techniques, such as natural language processing and information retrieval, are essential to ensuring optimal data communication in an efficient manner.34–39 More efforts are needed to provide structured, accurate, and timely data to radiologists and other providers who play a role in patient care.

Limitations

Our study focused on radiologists’ perceived impact on exam interpretation and protocol selection. We did not measure actual impact on exam interpretation or protocol selection, because we could not ascertain whether radiologists did or did not consult provider notes. Thus, an exam protocol may not have been based on the imaging order requisition and may have been decided by radiologists after consulting notes in the EHR. We also did not have access to verbal communications, unless documented in the medical records. In addition, we assumed that provider notes in the EHR are a reliable source of data and could not verify their accuracy or the accuracy of clinical information entered in imaging order requisitions. Finally, this study was conducted in a single academic institution with a CPOE system. Thus, our findings may not generalize to other clinical settings.

Further studies should prospectively evaluate whether incomplete or discordant clinical indications in order requisitions actually lead to modifications in exam interpretation and protocol selection across multiple radiology subspecialties and institutions. Future interventions, such as pre-populating ordering requests with relevant clinical data already documented in the EHR using natural language processing or other data extraction technologies (a minimal sketch follows below), and interventions using machine learning and artificial intelligence to enhance or automate exam protocoling in the short term and the more complex interpretation process in the long term, may help improve the quality and efficiency of care.
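As a purely illustrative sketch of the pre-population idea: a production intervention would use a clinical NLP pipeline, whereas this keyword matcher only demonstrates the concept. The lexicon, codes, and note text below are hypothetical.

    # Illustrative-only sketch: suggest structured indications from note text.
    import re

    NOTE = ("Right-sided abdominal pain that is worse with bending; "
            "long standing constipation; prior workup yielded no diagnosis.")

    # Hypothetical mapping of indication phrases to structured codes.
    LEXICON = {
        "abdominal pain": "R10.9",
        "constipation": "K59.00",
    }

    def suggest_indications(note_text):
        """Return (phrase, code) pairs whose phrase appears in the note."""
        return [(phrase, code) for phrase, code in LEXICON.items()
                if re.search(re.escape(phrase), note_text, re.IGNORECASE)]

    print(suggest_indications(NOTE))
    # -> [('abdominal pain', 'R10.9'), ('constipation', 'K59.00')]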

CONCLUSION

Indications in EHR imaging order requisitions are frequently incomplete or discordant compared to physician notes within the same EHR. While computerized systems should facilitate ordering physician workflow, improved availability of relevant clinical data within EHR requisitions is necessary for appropriate exam planning and interpretation, which may help minimize diagnostic errors and promote patient safety. Given current EHR ordering workflows, documentation errors in order requisitions could also diminish the clinical relevance of clinical decision support alerts, potentially leading to errors in diagnostic testing.

ACKNOWLEDGMENTS

The authors would like to thank Ms Laura Peterson for reviewing the manuscript.

FUNDING

This work was supported by Agency for Healthcare Research and Quality grant number R01HS02722.

CONTRIBUTORS

All authors contributed to the study design and data acquisition; RL, RL, AW, NK, DG, AS and RK are responsible for data acquisition and analysis. All authors contributed significant intellectual content during the manuscript preparation and revisions, approved the final version, and accept accountability for the overall integrity of the research process and the manuscript.

COMPETING INTERESTS

None.

REFERENCES

1. HHS. The Health Information Technology for Economic and Clinical Health (HITECH) Act. Federal Register 2009; 74 (209): 56123–31.
2. Kohane IS, Churchill SE, Murphy SN. A translational engine at the national scale: informatics for integrating biology and the bedside. J Am Med Inform Assoc 2012; 19 (2): 181–5.
3. Protecting Access to Medicare Act of 2014, Pub. L. No. 113-93 (2014).
4. Pevnick JM, Herzik AJ, Li X, et al. Effect of computerized physician order entry on imaging study indication. J Am Coll Radiol 2015; 12 (1): 70–4.
5. Kim M, Kaplan SJ, Mitchell SH, et al. The effect of computerized physician order entry template modifications on the administration of high-risk medications in older adults in the emergency department. Drugs Aging 2017; 34: 793–801.
6. Payne TH, Hoey PJ, Nichol P, Lovis C. Preparation and use of preconstructed orders, order sets, and order menus in a computerized provider order entry system. J Am Med Inform Assoc 2003; 10 (4): 322–9.
7. Wasser EJ, Prevedello LM, Sodickson A, Mar W, Khorasani R. Impact of a real-time computerized duplicate alert system on the utilization of computed tomography. JAMA Intern Med 2013; 173 (11): 1024–6.
8. Ballard DW, Kim AS, Huang J, et al. Implementation of computerized physician order entry is associated with increased thrombolytic administration for emergency department patients with acute ischemic stroke. Ann Emerg Med 2015; 66 (6): 601–10.
9. Stavem K, Foss T, Botnmark O, Andersen OK, Erikssen J. Inter-observer agreement in audit of quality of radiology requests and reports. Clin Radiol 2004; 59 (11): 1018–24.
10. Waite S, Scott JM, Legasto A, Kolla S, Gale B, Krupinski EA. Systemic error in radiology. AJR Am J Roentgenol 2017; 209 (3): 629–39.
11. Alkasab TK, Alkasab JR, Abujudeh HH. Effects of a computerized provider order entry system on clinical histories provided in emergency department radiology requisitions. J Am Coll Radiol 2009; 6 (3): 194–200.
12. Savoy A, Patel H, Flanagan ME, Weiner M, Russ AL. Systematic heuristic evaluation of computerized consultation order templates: clinicians’ and human factors engineers’ perspectives. J Med Syst 2017; 41 (8): 129.
13. Leslie A, Jones AJ, Goddard PR. The influence of clinical information on the reporting of CT by radiologists. Br J Radiol 2000; 73 (874): 1052–5.
14. Doubilet P, Herman PG. Interpretation of radiographs: effect of clinical history. AJR Am J Roentgenol 1981; 137 (5): 1055–8.
15. Institute of Medicine. Improving Diagnosis in Health Care. 2015. http://iom.nationalacademies.org/~/media/Files/Report%20Files/2015/Improving-Diagnosis/DiagnosticError_ReportBrief.pdf. Accessed August 16, 2018.
16. Rogith D, Iyengar MS, Singh H. Using fault trees to advance understanding of diagnostic errors. Jt Comm J Qual Patient Saf 2017; 43 (11): 598–605.
17. Khorasani R. How IT tools can help improve current protocolling performance gaps. J Am Coll Radiol 2011; 8 (10): 675–6.
18. Agarwal R, Bleshman MH, Langlotz CP. Comparison of two methods to transmit clinical history information from referring providers to radiologists. J Am Coll Radiol 2009; 6 (11): 795–9.
19. Ip IK, Schneider LI, Hanson R, et al. Adoption and meaningful use of computerized physician order entry with an integrated clinical decision support system for radiology: ten-year analysis in an urban teaching hospital. J Am Coll Radiol 2012; 9 (2): 129–36.
20. Andriole KP, Prevedello LM, Dufault A, et al. Augmenting the impact of technology adoption with financial incentive to improve radiology report signature times. J Am Coll Radiol 2010; 7 (3): 198–204.
21. Heitkamp DE, Kamer AP, Koontz NA. Institutional pressure to reduce report turnaround time is damaging the educational mission. J Am Coll Radiol 2017; 14 (4): 537–40.
22. Ash JS, Sittig DF, Campbell EM, Guappone KP, Dykstra RH. Some unintended consequences of clinical decision support systems. AMIA Annu Symp Proc 2007: 26–30.
23. Phansalkar S, van der Sijs H, Tucker AD, et al. Drug-drug interactions that should be non-interruptive in order to reduce alert fatigue in electronic health records. J Am Med Inform Assoc 2013; 20 (3): 489–93.
24. Institute of Medicine. To Err is Human: Building a Safer Health System. Washington, DC: National Academies Press (US); 2000. http://www.ncbi.nlm.nih.gov/books/NBK225182/.
25. Schiff GD, Hasan O, Kim S, et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med 2009; 169 (20): 1881–7.
26. Brown AD, Marotta TR. Using machine learning for sequence-level automated MRI protocol selection in neuroradiology. J Am Med Inform Assoc 2018; 25: 568–71.
27. Magnacca M, Poddighe R, Del Meglio J, et al. [Teamwork for cardiac imaging: coronary computed tomography angiography and low-dose radiation exposure: a cardiology center experience]. G Ital Cardiol (Rome) 2017; 18 (4): 313–21.
28. Boland GW, Duszak R Jr, Kalra M. Protocol design and optimization. J Am Coll Radiol 2014; 11 (5): 440–1.
29. Sinsky C, Colligan L, Li L, et al. Allocation of physician time in ambulatory practice: a time and motion study in 4 specialties. Ann Intern Med 2016; 165 (11): 753–60.
30. Miller RH, Sim I. Physicians’ use of electronic medical records: barriers and solutions. Health Aff (Millwood) 2004; 23 (2): 116–26.
31. Vest JR, Grannis SJ, Haut DP, Halverson PK, Menachemi N. Using structured and unstructured data to identify patients’ need for services that address the social determinants of health. Int J Med Inform 2017; 107: 101–6.
32. Liu S, Wang L, Ihrke D, et al. Correlating lab test results in clinical notes with structured lab data: a case study in HbA1c and glucose. AMIA Jt Summits Transl Sci Proc 2017; 2017: 221–8.
33. Dreyer KJ, Kalra MK, Maher MM, et al. Application of recently developed computer algorithm for automatic classification of unstructured radiology reports: validation study. Radiology 2005; 234 (2): 323–9.
34. Chen PH, Zafar H, Galperin-Aizenberg M, Cook T. Integrating natural language processing and machine learning algorithms to categorize oncologic response in radiology reports. J Digit Imaging 2018; 31: 178–84.
35. Lacson R, Harris K, Brawarsky P, et al. Evaluation of an automated information extraction tool for imaging data elements to populate a breast cancer screening registry. J Digit Imaging 2015; 28: 567–75.
36. Kreuzthaler M, Martinez-Costa C, Kaiser P, Schulz S. Semantic technologies for re-use of clinical routine data. Stud Health Technol Inform 2017; 236: 24–31.
37. Chen MC, Ball RL, Yang L, et al. Deep learning to classify radiology free-text reports. Radiology 2018; 286: 845–52.
38. Savova GK, Fan J, Ye Z, et al. Discovering peripheral arterial disease cases from radiology notes using natural language processing. AMIA Annu Symp Proc 2010; 2010: 722–6.
39. Savova GK, Tseytlin E, Finan S, et al. DeepPhe: a natural language processing system for extracting cancer phenotypes from clinical records. Cancer Res 2017; 77 (21): e115–8.
