Take-home points
• For referring physicians, preliminary reports generated by artificial intelligence-based computer-aided diagnosis (AI-CAD) for chest radiography may provide immediate help for both outpatient and inpatient care.
• For radiologists, AI-CAD can enhance workflow by prioritizing the worklist according to the predicted imaging findings, shortening the reporting time, and increasing the overall diagnostic performance.
• For the successful implementation of AI-CAD, the following are needed: accurate and immediately available AI results, software platforms that provide an enhanced worklist and user-modifiable configurations, and engagement with AI by radiologists when reading images in practice.
Despite the development of advanced imaging techniques such as CT and MRI, chest radiography remains an important diagnostic tool because of its advantages, including low cost, short scan time, and broad indications [1]. In daily practice, chest radiographs are commonly used for health check-ups, preoperative risk assessments, routine screening before hospitalization, and evaluation of patients with symptomatic cardiopulmonary diseases [2]. As chest radiographs often do not contain remarkable abnormalities, yet their analysis requires careful examination of complex structures, there is a considerable risk of readers neglecting abnormalities [1,3]. The heavy workload posed by the large number of examinations presents further difficulties for radiologists. Therefore, artificial intelligence-based computer-aided diagnosis (AI-CAD) can improve the efficiency and accuracy of radiologists' diagnoses by serving as a second opinion [1,4,5].
In addition, AI may help referring clinicians with chest radiography in daily practice [6,7]. An official radiology report may not be available when a clinician observes a chest radiograph. In such situations, which occur especially in outpatient clinics and emergency rooms, the medical decision made with radiography is based on the referring clinician's reading instead of a radiologist's. Depending on their experience level, referring clinicians may sometimes lack confidence in their interpretations of chest radiographs and may unintentionally fail to consult specialists in pulmonology or thoracic surgery [8]. They may also request unnecessary CT scans or follow-up imaging examinations out of concern about overlooking a patient's problems. Currently, expectations about AI have grown, and referring clinicians realize that AI might be able to support their decision-making process [9,10]. Many non-radiologist clinicians expect that AI may enable more effective disease screening and timely targeted consultation or referral of patients to relevant specialists, and they believe that their practice will improve with the introduction of AI systems [8]. In the same context, a recent consensus statement pointed out that AI for chest radiography could function as an assistant tool for radiologists and a decision-support tool for non-radiologist clinicians [4].
Various commercially available AI-CAD tools have demonstrated successful transitions from research to clinical practice through technical advances in deep learning for medical imaging [4,11,12]. The performance of AI systems approved by regulatory agencies has been validated in multiple reports, and further improvement in efficiency is expected in the real world [13,14,15]. However, the actual adoption and widespread use of AI-CAD in clinical practice have been very slow, contrary to earlier expectations [2]. A recent study discussed the importance of an appropriate picture archiving and communication system (PACS) for the successful integration and survival of AI-CAD in the daily practice of referring clinicians and radiologists [2]. As a real-time, widely accepted, and easily accessible platform, PACS enables AI-CAD to move beyond the test bench.
The authors’ institution is a general hospital that opened in March 2020. It has 595 inpatient beds, and approximately 54000 patients visit outpatient clinics per month. As a digitally innovative hospital, we introduced multiple AI-based tools for the analysis of chest radiography and mammography, with speech-to-text conversion also made possible [16]. In addition to the use of several AI solutions in routine practice, our institution serves as a research site for validating newly developed AI programs. Here, we present the experience gained from setting up and running AI-CAD on chest radiography in terms of the benefits that can be gained in daily practice and the factors needed for successful implementation.
Deep-learning-based AI software for analyzing chest radiographs was first approved by the Korean government in August 2018 (Lunit Insight CXR nodule, version 1, Lunit Inc.). In March 2020, our hospital adopted Lunit Insight CXR MCA (version 2 of the same line of AI software, based on ResNet34 and approved in October 2019) and applied it to all chest radiographs. From then until February 2021, 56192 chest radiographs of adults taken in the anteroposterior and posteroanterior views were analyzed by the AI. The software can detect three types of lesions (nodule, consolidation, and pneumothorax); lesions with abnormality scores above 15%, the preset operating point at which the AI determines that a lesion is present, are visualized on a heatmap (Fig. 1). Heatmaps were automatically merged into separate images or manually displayed over the original images by applying a grayscale softcopy presentation state. The highest abnormality score above the operating point was selected as the total abnormality score for each patient and reported on the reading worklist of a DICOM viewer (Zetta PACS, Taeyoung Soft Co. Ltd.). Image analysis by the AI and upload of the results are completed within one minute after an image is verified by a radiographer. Lunit Insight CXR, version 3, which detects eight lesion types (nodule, consolidation, pneumothorax, pneumoperitoneum, fibrosis, atelectasis, cardiomegaly, and pleural effusion) (Fig. 2), was then approved in Korea in October 2020 and installed in our hospital in March 2021. We analyzed 106230 chest radiographs with this version until February 2022. Grayscale contour maps became available as an additional visualization method; other differences from the previous version include abbreviations of lesion types and abnormality scores provided separately for each lesion.
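The thresholding and worklist-score logic described above can be sketched as follows. This is a minimal illustration with hypothetical names and data structures; the vendor software does not expose such an interface, and the 15% operating point is the only value taken from the text.

```python
# Illustrative sketch of the score-handling logic; function and field names
# are hypothetical, not the vendor's actual API.
OPERATING_POINT = 15.0  # preset operating point (%) for determining lesion presence

def summarize_ai_result(lesion_scores):
    """Keep lesions scoring above the operating point and derive the total score."""
    detected = {name: s for name, s in lesion_scores.items() if s > OPERATING_POINT}
    # The highest score above the operating point is listed on the worklist
    # as the patient's total abnormality score; None if nothing is detected.
    total = max(detected.values()) if detected else None
    return {"detected": detected, "total_abnormality_score": total}

result = summarize_ai_result({"nodule": 43.0, "consolidation": 8.2, "pneumothorax": 2.1})
```

Here only the nodule exceeds the operating point, so its score (43.0) becomes the total abnormality score shown on the worklist.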
The DICOM viewer was updated so that more options are now available when selecting lesions to visualize, and a shortcut can be chosen for applying the grayscale softcopy presentation state. On average, one faculty member worked in the thoracic radiology section of the radiology department during these two years.
Fig. 1. Chest radiographs analyzed with AI-CAD version 2.
A. Chest radiograph of a 52-year-old male who visited the outpatient clinic of the cardiology department for follow-up of arrhythmia. B. AI-CAD presenting a focal abnormal lesion in the right lung with a total abnormality score of 43%. Note that the color heatmap does not have any annotation or separate abnormality score in version 2. C. On chest CT, an irregularly shaped part-solid nodule was noted in the right lower lobe. Pathology confirmed adenocarcinoma, stage IA3, after surgical resection. The patient is now routinely followed for resectable lung cancer. AI-CAD = artificial intelligence-based computer-aided diagnosis
Fig. 2. Chest radiographs analyzed with AI-CAD version 3.
A. Chest radiograph of an 88-year-old male admitted to the intensive care unit for traumatic brain injury. B. AI-CAD showing a left Ptx with an abnormality score of 98%, depicted with a contour map. Emergency thoracotomy was performed immediately after the image was taken. Note the separate contour maps with abbreviations (Csn, Ndl, and right PEf) in version 3. AI-CAD = artificial intelligence-based computer-aided diagnosis, Csn = consolidation, Ndl = nodule, PEf = pleural effusion, Ptx = pneumothorax
Recently, several studies have reported excellent diagnostic performance of AI-CAD systems applied to chest radiography [2,11,17,18,19,20]. The consensus of thoracic radiologists regarding these CADs is that AI can enhance diagnostic accuracy, which concurs with our experience [4]. However, several points should be noted. As current CADs implemented for chest radiography target multiple abnormalities that present with different imaging findings, such as nodules, pneumothorax, pleural effusion, and atelectasis, the reliability of these systems differs between lesions [21]. Separate user guidelines for the abnormality scores of the various lesions, which indicate the confidence level of the results, are necessary for radiologists to verify the accuracy of the findings. Radiologists should also continue their efforts to prevent additional errors that could occur due to the heterogeneity of radiography equipment and patient demographics, both of which are inevitable factors in the real world.
An important function of AI-CADs, in addition to diagnosis, is workflow optimization [22]. Nam et al. [21] reported that AI-CAD improved workflow by shortening the mean time-to-report for critical and urgent radiographs (640.5 ± 466.3 vs. 3371.0 ± 1352.5 seconds and 1840.3 ± 1141.1 vs. 2127.1 ± 1468.2 seconds, respectively) and reducing the mean interpretation time (20.5 ± 22.8 vs. 23.5 ± 23.7 seconds) in a simulated reading test. However, prioritization might be more important than the mean interpretation time. Conventionally, cases shown on the worklist can be sorted by the time of order, time of examination, and the referring clinician or department that the patient is visiting. Because these conventional factors do not reflect abnormalities on radiography, a method that can calculate the probability of new or emergent abnormalities is required. In our hospital, because the total abnormality score is integrated into the patient worklist and displayed in one column, patients can be sorted according to the change from their previous score in addition to the referring department. This feature enables radiologists to conduct official readings from predictably more abnormal or urgent chest radiographs down to less abnormal or urgent cases. It is even more helpful when the worklist is sorted according to the score of a specific urgent finding, as we have experienced in our hospital since April 2022 (Fig. 3).
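The triage idea above can be illustrated with a short sketch. This is not the PACS vendor's implementation; the records, field names, and scores are hypothetical, and it only shows the principle of sorting a worklist by an urgent-finding score first and the total abnormality score second.

```python
# Hypothetical worklist triage: sort by an urgent finding (pneumothorax
# score), then by the total abnormality score, so that predictably more
# urgent radiographs are read first.
from operator import itemgetter

worklist = [
    {"patient": "A", "department": "psychiatry",  "pneumothorax": 1.0,  "total": 97.4},
    {"patient": "B", "department": "pulmonology", "pneumothorax": 98.0, "total": 98.0},
    {"patient": "C", "department": "cardiology",  "pneumothorax": 0.5,  "total": 43.0},
]

# reverse=True puts the highest (most urgent) scores at the top of the list.
triaged = sorted(worklist, key=itemgetter("pneumothorax", "total"), reverse=True)
```

With these example scores, patient B (high pneumothorax score) is read first, while patient C (low scores) falls to the bottom, mirroring how a score column on the worklist reorders reading priority.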
Fig. 3. Examples of triage with the advanced search function on the worklist and an actual related clinical case.
A. Radiologists can arrange the worklist to show chest radiographs in order of high abnormality scores and, with this triage, can read cases of higher priority first. Note the scores of abnormal findings (CSN, ATL, NDL, PTX, PPM) following the highest abnormality score. B. Based on the abnormality score, the radiologist could immediately report an unexpected PPM on the chest radiograph of a 74-year-old male admitted to the psychiatric ward for obsessive-compulsive disorder. The radiologist sorted the worklist to show radiographs not ordered by the department of pulmonology or thoracic surgery and found an image ordered by the psychiatrist high on the worklist with an abnormality score of 97.4%, which was due to the PPM detected by AI-CAD. A critical value report was sent to the clinician, who had not detected this abnormality, and the patient underwent emergency surgery after an additional CT scan revealed an unexpected gastric ulcer perforation. AI-CAD = artificial intelligence-based computer-aided diagnosis, ATL = atelectasis, CSN = consolidation, NDL = nodule, PPM = pneumoperitoneum, PTX = pneumothorax
AI-CAD for chest radiography was the first AI-based radiology tool in our hospital that is also used daily by referring clinicians. Checking preliminary reports generated by AI-CAD has become routine, and these preliminary reports guide referring clinicians before formal readings by a radiologist become available [8,15]. Reports that include the abnormality score, heatmap, and contour annotation are intuitive and do not require much knowledge of AI to interpret. The AI analysis is completed immediately after image verification, and the results are easily accessible on PACS simply by scrolling through the images. With chest radiography being the frontline imaging modality for further medical decisions, urgent diseases can first be depicted on chest radiographs. For urgent cases, AI-CAD helps referring clinicians order subsequent imaging studies, provide treatment, and refer patients to the appropriate department with more confidence, even before they see the official radiologist reports. One limitation of AI-CAD is that false-positive results can lead to unnecessary further examinations [23]. To avoid this, comparison with prior images and clinical correlation are required. False-positive results can be further reduced by training the system with error cases, which would improve the AI-CAD system. Conversely, for false-negative results in which clinicians overlook abnormalities undepicted by AI-CAD, critical value reports from radiologists can warn them of possible urgent findings. Thus, radiologists and AI-CAD complement each other, and AI-CAD has become an essential component of medical assessment in our hospital.
In summary, AI-CAD for chest radiography can provide genuine assistance to referring clinicians in daily medical practice and benefits radiologists through improved diagnostic performance and workflow. The same experience and demand are expected for AI-CADs for radiological examinations other than chest radiography. Based on our experience with chest radiography, the successful implementation of AI-CADs for other modalities will require accurate and immediately available AI results, explainable visualization of disease-specific results, and improved medical software platforms with user-configurable data presentation.
Acknowledgments
The authors would like to thank Jun Tae Kim for his dedicated help for researchers.
Footnotes
Conflicts of Interest: The authors have no potential conflicts of interest to disclose.
Funding Statement: None
Availability of Data and Material
Additional documents related to this study are available on request to the corresponding author.
References
- 1. Tandon YK, Bartholmai BJ, Koo CW. Putting artificial intelligence (AI) on the spot: machine learning evaluation of pulmonary nodules. J Thorac Dis. 2020;12:6954–6965. doi: 10.21037/jtd-2019-cptn-03
- 2. Hwang EJ, Park CM. Clinical implementation of deep learning in thoracic radiology: potential applications and challenges. Korean J Radiol. 2020;21:511–525. doi: 10.3348/kjr.2019.0821
- 3. Eisen LA, Berger JS, Hegde A, Schneider RF. Competency in chest radiography. A comparison of medical students, residents, and fellows. J Gen Intern Med. 2006;21:460–465. doi: 10.1111/j.1525-1497.2006.00427.x
- 4. Hwang EJ, Goo JM, Yoon SH, Beck KS, Seo JB, Choi BW, et al. Use of artificial intelligence-based software as medical devices for chest radiography: a position paper from the Korean Society of Thoracic Radiology. Korean J Radiol. 2021;22:1743–1748. doi: 10.3348/kjr.2021.0544
- 5. van Ginneken B, Hogeweg L, Prokop M. Computer-aided diagnosis in chest radiography: beyond nodules. Eur J Radiol. 2009;72:226–230. doi: 10.1016/j.ejrad.2009.05.061
- 6. Edwards M, Lawson Z, Morris S, Evans A, Harrison S, Isaac R, et al. The presence of radiological features on chest radiographs: how well do clinicians agree? Clin Radiol. 2012;67:664–668. doi: 10.1016/j.crad.2011.12.003
- 7. Mehrotra P, Bosemani V, Cox J. Do radiologists still need to report chest x rays? Postgrad Med J. 2009;85:339–341. doi: 10.1136/pgmj.2007.066712
- 8. Scheetz J, Rothschild P, McGuinness M, Hadoux X, Soyer HP, Janda M, et al. A survey of clinicians on the use of artificial intelligence in ophthalmology, dermatology, radiology and radiation oncology. Sci Rep. 2021;11:5193. doi: 10.1038/s41598-021-84698-5
- 9. Coppola F, Faggioni L, Regge D, Giovagnoni A, Golfieri R, Bibbolino C, et al. Artificial intelligence: radiologists' expectations and opinions gleaned from a nationwide online survey. Radiol Med. 2021;126:63–71. doi: 10.1007/s11547-020-01205-y
- 10. Kulkarni S, Seneviratne N, Baig MS, Khan AHA. Artificial intelligence in medicine: where are we now? Acad Radiol. 2020;27:62–70. doi: 10.1016/j.acra.2019.10.001
- 11. Chassagnon G, Vakalopoulou M, Paragios N, Revel MP. Artificial intelligence applications for thoracic imaging. Eur J Radiol. 2020;123:108774. doi: 10.1016/j.ejrad.2019.108774
- 12. van Leeuwen KG, Schalekamp S, Rutten MJCM, van Ginneken B, de Rooij M. Artificial intelligence in radiology: 100 commercially available products and their scientific evidence. Eur Radiol. 2021;31:3797–3804. doi: 10.1007/s00330-021-07892-z
- 13. Jin KN, Kim EY, Kim YJ, Lee GP, Kim H, Oh S, et al. Diagnostic effect of artificial intelligence solution for referable thoracic abnormalities on chest radiography: a multicenter respiratory outpatient diagnostic cohort study. Eur Radiol. 2022;32:3469–3479. doi: 10.1007/s00330-021-08397-5
- 14. Hwang EJ, Kim H, Yoon SH, Goo JM, Park CM. Implementation of a deep learning-based computer-aided detection system for the interpretation of chest radiographs in patients suspected for COVID-19. Korean J Radiol. 2020;21:1150–1160. doi: 10.3348/kjr.2020.0536
- 15. van Leeuwen KG, de Rooij M, Schalekamp S, van Ginneken B, Rutten MJCM. How does artificial intelligence in radiology improve efficiency and health outcomes? Pediatr Radiol. 2021 Jun. doi: 10.1007/s00247-021-05114-8. [Epub]
- 16. Kim SJ, Roh JW, Kim S, Park JY, Choi D. Current state and strategy for establishing a digitally innovative hospital: memorial review article for opening of Yongin Severance Hospital. Yonsei Med J. 2020;61:647–651. doi: 10.3349/ymj.2020.61.8.647
- 17. Lee JH, Sun HY, Park S, Kim H, Hwang EJ, Goo JM, et al. Performance of a deep learning algorithm compared with radiologic interpretation for lung cancer detection on chest radiographs in a health screening population. Radiology. 2020;297:687–696. doi: 10.1148/radiol.2020201240
- 18. Sim Y, Chung MJ, Kotter E, Yune S, Kim M, Do S, et al. Deep convolutional neural network–based software improves radiologist detection of malignant lung nodules on chest radiographs. Radiology. 2020;294:199–209. doi: 10.1148/radiol.2019182465
- 19. Hwang EJ, Park S, Jin KN, Kim JI, Choi SY, Lee JH, et al. Development and validation of a deep learning–based automated detection algorithm for major thoracic diseases on chest radiographs. JAMA Netw Open. 2019;2:e191095. doi: 10.1001/jamanetworkopen.2019.1095
- 20. Callı E, Sogancioglu E, van Ginneken B, van Leeuwen KG, Murphy K. Deep learning for chest X-ray analysis: a survey. Med Image Anal. 2021;72:102125. doi: 10.1016/j.media.2021.102125
- 21. Nam JG, Kim M, Park J, Hwang EJ, Lee JH, Hong JH, et al. Development and validation of a deep learning algorithm detecting 10 common abnormalities on chest radiographs. Eur Respir J. 2021;57:2003061. doi: 10.1183/13993003.03061-2020
- 22. van Ginneken B, Schaefer-Prokop CM, Prokop M. Computer-aided diagnosis: how to move from the laboratory to the clinic. Radiology. 2011;261:719–732. doi: 10.1148/radiol.11091710
- 23. Fazal MI, Patel ME, Tye J, Gupta Y. The past, present and future role of artificial intelligence in imaging. Eur J Radiol. 2018;105:246–250. doi: 10.1016/j.ejrad.2018.06.020