Author manuscript; available in PMC: 2024 May 12.
Published in final edited form as: J Am Coll Radiol. 2023 Jun 26;20(12):1267–1268. doi: 10.1016/j.jacr.2023.04.015

Improving Medical Imaging Order Entry With Artificial Intelligence Tools: Insights and Action Items

Melina Hosseiny 1, Christoph I Lee 2
PMCID: PMC11088912  NIHMSID: NIHMS1968666  PMID: 37379889

Inappropriate image ordering continues to be a cause of medical waste, leading to potential morbidity for patients, including unnecessary anxiety and radiation exposure, as well as unnecessary costs to the health care system [1]. For interpreting radiologists, inappropriate image ordering means more time spent combing through patient medical records to properly protocol examinations and more time spent interpreting examinations that may not improve patient care management or outcomes. Clinical decision support (CDS) tools are designed to aid clinicians in selecting the most appropriate imaging study for a patient's specific clinical scenario by presenting appropriate use criteria (AUC) information to the user at the time of image ordering [2,3]. Although the implementation of imaging CDS has been effective in reducing some proportion of inappropriate orders [4,5], it requires the input of a structured indication, which demands extra effort from busy ordering clinicians facing time constraints and other competing challenges [6]. More recently, promising artificial intelligence (AI) and natural language processing technologies have emerged as potential solutions to address the need for better support tools for appropriate image ordering practices.

In this issue of JACR, Shreve et al [7] present their assessment of image appropriateness scoring after implementing a novel AI tool that suggests structured order entry options from free-text indications to ordering clinicians at the time of image ordering within their multicenter health care system. They compared 115,079 advanced outpatient imaging orders placed in the 7 months before implementation with 150,950 orders placed in the 7 months after. The authors found that the percentage of scored orders increased significantly from 30% to 52% after AI implementation, and orders with structured indications also increased significantly, from 34.6% to 67.3%. However, nearly half (48%) of orders remained unscored even after the AI tool was implemented, suggesting that AI assistance at the time of order entry is not sufficient on its own to ensure broader adoption of appropriate imaging ordering practices.
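The proprietary tool studied by Shreve et al is not publicly described, but the general approach of mapping a free-text indication to structured order entry options can be illustrated with a minimal sketch. Everything here is hypothetical: the structured indication catalog, the function name, and the token-overlap (Jaccard) scoring are illustrative stand-ins, not the study's actual method, which likely relies on far more sophisticated natural language processing.

```python
# Hypothetical sketch: rank a small catalog of structured indications by
# token overlap (Jaccard similarity) with a clinician's free-text entry.
# The catalog entries and keywords below are invented for illustration.

STRUCTURED_INDICATIONS = {
    "CT head: acute headache, rule out hemorrhage": {"headache", "acute", "hemorrhage"},
    "MRI lumbar spine: low back pain with radiculopathy": {"back", "pain", "radiculopathy", "lumbar"},
    "CT chest: pulmonary nodule follow-up": {"pulmonary", "nodule", "lung"},
}

def suggest_indications(free_text: str, top_n: int = 2) -> list[str]:
    """Return up to top_n structured indications ranked by word overlap."""
    tokens = set(free_text.lower().replace(",", " ").split())
    scored = []
    for label, keywords in STRUCTURED_INDICATIONS.items():
        overlap = len(tokens & keywords)
        if overlap:  # suggest only indications sharing at least one term
            scored.append((overlap / len(tokens | keywords), label))
    scored.sort(reverse=True)
    return [label for _, label in scored[:top_n]]

print(suggest_indications("new acute headache, concern for hemorrhage"))
```

In a real deployment, the suggested structured options would be surfaced in the order entry interface so the clinician can confirm one, which then allows the CDS engine to assign an AUC-based appropriateness score.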

In multivariate analysis, the authors showed that imaging orders were more likely to be scored after the AI tool deployment but that orders placed by nonphysician medical providers were less likely to be scored than those placed by physicians. As more advanced practice providers with less training and exposure to advanced imaging take on image ordering roles in outpatient settings, more education is needed on imaging appropriateness. In addition, the authors found that orders for MR and PET were less likely to be scored than those for the more common CT indications, suggesting gaps in existing AUC for less common MR and PET procedures. Together, these findings suggest that imaging appropriateness efforts are limited both by provider knowledge and behavior and by inherent limitations in the AUC information base used by current AI tools.

Although the study's limitations include the limited information provided by the proprietary software and reliance on reconciliation between the electronic health record and that software, some general action items are apparent from this AI clinical implementation study. First, greater educational efforts are needed during the clinical training of ordering clinicians on the downstream consequences of inappropriate advanced imaging ordering, especially for advanced practice providers. Ordering clinicians should be aware of the ACR's Appropriateness Criteria and its incorporation into existing CDS tools. A broader educational effort is especially important in light of the fact that the Protecting Access to Medicare Act will require the use of CDS for ordering of outpatient advanced imaging studies [8]. Second, the currently available AUC are not sufficient to cover the expansive range of clinical scenarios and advanced imaging options. Expanding AUC guidelines and surrogate clinical diagnostic pathways will be critical for future versions of AI-based CDS tools to reach their full potential. Progress on both fronts will require increased collaboration between the radiology community and other stakeholders: nonphysician provider societies, other medical specialties engaged in developing additional AUC, and AI vendors developing CDS solutions.

In summary, the study by Shreve et al [7] provides valuable insights for radiologists and health care systems considering the implementation of AI tools in imaging CDS. It emphasizes the potential benefits of AI tools while highlighting the need for ongoing efforts to improve provider behavior and expand the evidence base for appropriate image ordering. Ultimately, the implementation of AI tools for the analysis of free-text indications has the potential to increase the accuracy and efficiency of medical imaging order entry, which in turn will improve the performance and experience of ordering clinicians, radiologists, and, most important, patients.

Footnotes

Dr Lee has received personal fees from GRAIL for service on a data and safety monitoring board; textbook royalties from McGraw Hill, Oxford University Press, and UpToDate; and personal fees from the ACR for journal editorial board work, all outside the submitted work. Dr Hosseiny states that she has no conflict of interest related to the material discussed in this article. The authors are non-partner/non-partnership track/employees.

Follow these authors via Twitter: Melina Hosseiny @MelinaHosseiny, Christoph I. Lee @christophleemd

Contributor Information

Melina Hosseiny, Department of Radiology, University of California, San Diego, San Diego, California.

Christoph I. Lee, Director of the Northwest Screening and Cancer Outcomes Research Enterprise, Department of Radiology, University of Washington School of Medicine, Seattle, Washington; Deputy Editor of JACR.

REFERENCES

1. Hofmann B, Andersen ER, Kjelle E. Visualizing the invisible: invisible waste in diagnostic imaging. Healthcare (Basel) 2021;9:1693.
2. Chan SS, Francavilla ML, Iyer RS, Rigsby CK, Kurth D, Karmazyn BK. Clinical decision support: the role of ACR Appropriateness Criteria. Pediatr Radiol 2019;49:479-85.
3. McGinty GB. Clinical decision support: moving forward together. J Am Coll Radiol 2019;16:661-2.
4. Doyle J, Abraham S, Feeney L, Reimer S, Finkelstein A. Clinical decision support for high-cost imaging: a randomized clinical trial. PLoS ONE 2019;14:e0213373.
5. Wintermark M, Bruno MA. Clinical decision support: curse or blessing? J Am Coll Radiol 2020;17:566-7.
6. Wintermark M, Willis MH, Hom J, et al. Everything every radiologist always wanted (and needs) to know about clinical decision support. J Am Coll Radiol 2020;17:568-73.
7. Shreve LA, Fried JG, Liu F, et al. Impact of artificial intelligence-assisted indication selection on appropriateness order scoring for imaging clinical decision support. J Am Coll Radiol 2023;20:1258-66.
8. Golding LP, Nicola GN. Clinical decision support: the law, the future, and the role for radiologists. Curr Probl Diagn Radiol 2020;49:337-9.