BMC Medical Education. 2025 Mar 13;25:380. doi: 10.1186/s12909-025-06859-8

Integration of artificial intelligence in radiology education: a requirements survey and recommendations from faculty radiologists, residents, and medical students

Ruili Li 1,#, Guangxue Liu 2,#, Miao Zhang 1, Dongdong Rong 1, Zhuangzhi Su 1, Yi Shan 1, Jie Lu 1
PMCID: PMC11908051  PMID: 40082889

Abstract

Background

A survey was conducted to investigate the perspectives, expectations, and attitudes of faculty radiologists, residents, and medical students regarding the integration of artificial intelligence (AI) into radiology education.

Methods

An online questionnaire was used for this survey, and participant anonymity was ensured. In total, 41 faculty radiologists, 38 residents, and 120 medical students from the authors’ institution completed the questionnaire.

Results

Most residents and students experienced varying levels of psychological stress during the initial stage of clinical practice, stemming mainly from tight schedules, heavy workloads, apprehension about making mistakes in diagnostic report writing, and academic or employment pressures. Although most respondents were not familiar with how AI is applied in radiology education, a substantial proportion expressed eagerness and enthusiasm for its integration. Faculty radiologists and residents in particular expressed a desire to use an AI-driven online platform to practice radiology skills, including reading medical images and writing diagnostic reports, before engaging in clinical practice. Furthermore, faculty radiologists demonstrated strong enthusiasm for the notion that AI training platforms can enhance training efficiency and boost learners’ confidence. Notably, only approximately half of the residents and medical students shared the instructors’ optimism, with the remainder expressing neutrality or concern, emphasizing the need for robust AI feedback systems and user-centered designs. Moreover, the authors’ team has developed a preliminary framework for an AI-driven radiology education training platform consisting of four key components: imaging case sets, intelligent interactive learning, self-quiz, and online exam.

Conclusions

The integration of AI technology in radiology education has the potential to revolutionize the field by providing innovative solutions for enhancing competency levels and optimizing learning outcomes.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12909-025-06859-8.

Keywords: Artificial intelligence, Radiology education, Online questionnaire, AI education platform

Background

Medical imaging technologies and equipment, such as X-ray, CT, MRI, ultrasound, and PET, are diverse and rapidly evolving, with new advancements constantly emerging. Because radiology is a highly practical discipline, extensive and in-depth clinical practice is indispensable for developing the meticulous logical thinking and diagnostic skills that radiology education aims to instill. The pressing issue is how to effectively bridge the gap between theoretical foundations and clinical practice and accumulate diagnostic experience, thereby fostering high-quality radiologists.

In recent years, artificial intelligence (AI) has achieved remarkable progress in the medical field, particularly in radiology, generating considerable enthusiasm and anticipation among healthcare professionals and the general public [1]. Initially, there was apprehension that the emergence of AI in radiology could jeopardize the profession, culminating in a decline in the number of radiologists [2]. Even the necessity of training future generations of radiologists was questioned [3]. However, as AI technology has developed, a consensus has emerged that AI serves as a supplementary tool rather than a replacement for radiologists in medical practice, and it is now regarded as an innovative way to significantly streamline workflows and assist in disease detection, diagnosis, and treatment [4]. Radiologists need to adopt AI technology in clinical practice to augment work efficiency, quality, and patient care. Furthermore, AI raises high expectations in radiology education: it has the potential to advance radiology education and improve residency training by facilitating competency-based training and evaluation. However, the application of AI technologies in radiology education remains a nascent field with limited exploration.

To date, there have been few documented surveys exploring the perceptions of Chinese faculty radiologists, residents, and medical students regarding AI in radiology education, including its influence on teaching and the availability of opportunities within training. Gaining insight into individual attitudes and requirements could improve the integration of AI into radiology teaching and training programs, potentially exerting a profound influence on the successful implementation of AI in radiology education. This survey aimed to evaluate the requirements, experiences, and opinions of faculty radiologists, residents, and medical students regarding the integration of AI into radiology education, as well as the impact of AI on this field. Based on the findings, a framework for an AI-driven radiology education training platform is proposed. The results of this study may provide valuable data for the implementation of AI in radiology education and offer guidance for future initiatives.

Materials and methods

Subjects

Faculty radiologists, residents, and medical students at Xuanwu Hospital, Capital Medical University, were invited to take part in this online survey, conducted in April 2023. We targeted different groups of participants, aiming to include a representative mix with varying levels of experience and exposure to AI in radiology, in order to enhance the generalizability and validity of our findings. Based on recent personnel records of the radiology department and its learners, 218 individuals (58 faculty radiologists, 40 imaging residents, and 120 undergraduate medical students) were identified as the targets of the survey. All subjects participated voluntarily. Since this study does not fall into the category of life science and medical research involving human subjects, the protocol was reviewed by the Ethics Committee of Xuanwu Hospital Capital Medical University and a waiver was granted. The study was conducted in accordance with the Declaration of Helsinki. Informed consent was acquired from all participants. Data collection was entirely anonymized, with no identifying information.

Methods

The survey questionnaire was initially designed based on input from three radiology consultant mentors, each with over 30 years of teaching experience in the Department of Radiology. Subsequently, the survey was pretested with four independent senior radiologists to assess completion time, sentence clarity, validity, and reliability. Their feedback led to the rephrasing of specific statements and questions to enhance readability. Separate versions of the questionnaire were designed for instructors, residents, and medical students to capture their diverse perspectives. The final instructor version comprised 12 questions: five general demographic questions (gender, age, highest academic degree, years of service, qualifications) and seven questions related to the integration of AI into radiology education (five single-choice questions and two open-ended questions). The final resident and student version also consisted of 12 questions: three demographic questions (gender, age, and stage of training or grade), two psychology-related questions, and seven questions regarding the integration of AI into radiology education (four single-choice questions, one multiple-choice question, and two open-ended questions). Completing the questionnaire took approximately five minutes. Respondents were not obliged to provide any contact information. The survey questionnaire was created on the WJX online questionnaire platform (powered by https://www.wjx.cn, Changsha Ranxing IT Ltd., Changsha, China) and subsequently distributed to the participants by the research assistant. Surveys with incomplete answers were excluded from the results.

Statistical analysis

Data were collected and categorized and then analyzed using IBM SPSS Statistics software, version 22.0 (IBM Corp., Armonk, NY, USA). The outcomes of the choice questions were presented as frequencies and percentages [n (%)], and responses to the open-ended questions were summarized for all participants. To assess the reliability of the questionnaire, Cronbach’s alpha coefficient for internal consistency was calculated. The comparison of survey results among the faculty radiologists, residents, and medical students was conducted using Fisher’s exact test or the χ2 test. Subgroup analyses were further conducted within each group, for example comparing junior/intermediate radiologists with senior radiologists, junior with senior residents, and preclinical with clinical students. Statistical significance was set at P < 0.05, two-tailed. For free-text responses, we used thematic analysis to identify key themes emerging from participants’ answers. Two independent reviewers analyzed the open-ended responses using an iterative process of classifying and refining categories based on consensus. To ensure reliability, we conducted inter-rater reliability checks and reported the Cohen’s kappa values achieved during this process.
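
For illustration, the following minimal Python sketch (not the authors’ SPSS workflow) shows how the two reliability statistics described above can be computed with numpy, pandas, and scikit-learn. The item scores and rater codings are hypothetical placeholders, not study data.

```python
# A minimal sketch, under assumed placeholder data, of the reliability
# statistics described above: Cronbach's alpha for internal consistency
# and Cohen's kappa for inter-rater agreement on thematic codings.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents x items matrix of scores."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-type responses: 6 respondents x 4 items
items = pd.DataFrame({
    "q1": [3, 4, 4, 5, 2, 3],
    "q2": [3, 5, 4, 4, 2, 3],
    "q3": [2, 4, 5, 5, 3, 3],
    "q4": [3, 4, 4, 5, 2, 4],
})
print(f"Cronbach's alpha: {cronbach_alpha(items):.3f}")

# Hypothetical thematic codes assigned by the two independent reviewers
rater1 = ["feedback", "cases", "feedback", "interactivity", "cases"]
rater2 = ["feedback", "cases", "feedback", "cases", "cases"]
print(f"Cohen's kappa: {cohen_kappa_score(rater1, rater2):.3f}")
```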

Results

Demographic statistics of participants

A total of 199 valid responses were gathered within the two-week survey window, consisting of 41 from faculty radiologists, 38 from imaging residents, and 120 from undergraduate medical students. The total response rate was 91.7%. The sex distribution among the participating radiologist instructors was nearly balanced, with 51.2% (n = 21) male and 48.8% (n = 20) female. The age distribution was 34.1% for the 25–35 years group, 34.2% for the 36–45 years group, and 31.7% for those above 45 years. All the radiologists had completed a postgraduate degree, with 48.8% (n = 20) holding a master’s degree and 51.2% (n = 21) holding a doctoral degree. Of the faculty radiologists, 41.5% (n = 17) had less than 10 years of experience, 26.8% (n = 11) had more than 10 but less than 20 years, and 31.7% (n = 13) had over 20 years. There were 10 junior radiologists (24.4%), 14 intermediate radiologists (34.1%), and 17 senior radiologists (41.5%).

Among 38 residents, most of the participants were female, accounting for 65.8% (n = 25), while 34.2% (n = 13) were male. The average age of the residents was 26 years. Regarding the stage of training: 19 residents (50%) were in their first year (ST1), 10 residents (26.3%) were in their second year (ST2), and 9 residents (23.7%) were in their third year (ST3).

A total of 120 medical students at the trainee stage responded, with an average age of 23 years. There was a predominance of females (n = 83, 69.2%) over males (n = 37, 30.8%).

Psychological pressure in the early stage of clinical practice

During the initial stage of clinical practice, 58% of the residents stated that they experienced some psychological pressure, while 29% considered it extremely high; among students, these proportions were 57% and 32%, respectively. Reasons for high pressure among residents included a tight schedule and heavy workload (84%), concerns about errors in diagnostic report writing (79%), academic and employment pressure (66%), and research pressure (42%). The respective proportions among students were 80%, 62%, 81%, and 59% (Fig. 1; Table 1).

Fig. 1. Analysis of questionnaire. a, Fisher’s Exact Test; b, χ2 test; *, P < 0.05; **, P < 0.01; ***, P < 0.001

Table 1. Comparison of the attitudes toward integrating AI in radiology education among the faculty radiologists, residents, and medical students

Q4. Is there any psychological pressure during the initial stage of clinical practice? (asked of students and residents)
  Medical students (n = 120): Extremely 39 (32%); A bit 68 (57%); No 13 (11%)
  Residents (n = 38): Extremely 11 (29%); A bit 22 (58%); No 5 (13%)
  S vs. R: P = 0.860 (a); Cramér’s V = 0.041

Q5. Reasons for the psychological pressure: A1, high academic and employment pressure; A2, tight time, heavy tasks; A3, diagnostic report writing pressure; A4, high research pressure (asked of students and residents; multiple responses allowed)
  Medical students: A1 97 (81%); A2 96 (80%); A3 74 (62%); A4 71 (59%)
  Residents: A1 25 (66%); A2 32 (84%); A3 30 (79%); A4 16 (42%)
  S vs. R: P = 0.296 (b); Cramér’s V = 0.092

Q6. Are you familiar with the implementation of AI in radiology education?
  Medical students: Extremely 8 (7%); A bit 33 (27%); No 79 (66%)
  Residents: Extremely 1 (3%); A bit 24 (63%); No 13 (34%)
  Faculty radiologists (n = 41): Extremely 5 (12%); A bit 22 (54%); No 14 (34%)
  Overall: P < 0.001 (a)***; Cramér’s V = 0.242; S vs. R: P < 0.001 (a)***; S vs. F: P = 0.001 (a)**; R vs. F: P = 0.281 (a)

Q7. Do you have interest in the application of AI in radiology education?
  Medical students: Yes 68 (57%); Don’t know 15 (12%); No 37 (31%)
  Residents: Yes 32 (84%); Don’t know 5 (13%); No 1 (3%)
  Faculty radiologists: Yes 33 (81%); Don’t know 3 (7%); No 5 (12%)
  Overall: P < 0.001 (a)***; Cramér’s V = 0.214; S vs. R: P < 0.001 (a)***; S vs. F: P = 0.021 (a)*; R vs. F: P = 0.235 (a)

Q8. How should AI be integrated into radiology education, such as via online platforms or other means?
  Medical students: Platform 78 (65%); No idea 42 (35%)
  Residents: Platform 32 (84%); No idea 6 (16%)
  Faculty radiologists: Platform 34 (83%); No idea 7 (17%)
  Overall: P = 0.017 (b)*; Cramér’s V = 0.203; S vs. R: P = 0.025 (b)*; S vs. F: P = 0.031 (b)*; R vs. F: P = 0.878 (b)

Q9. AI will play an important role in improving learning efficiency.
  Medical students: Agree 54 (45%); Neutral 55 (46%); Disagree 11 (9%)
  Residents: Agree 23 (60%); Neutral 14 (37%); Disagree 1 (3%)
  Faculty radiologists: Agree 34 (83%); Neutral 7 (17%); Disagree 0
  Overall: P < 0.001 (a)***; Cramér’s V = 0.223; S vs. R: P = 0.192 (a); S vs. F: P < 0.001 (a)***; R vs. F: P = 0.043 (a)*

Q10. AI can help to enhance learners’ confidence.
  Medical students: Agree 58 (48%); Neutral 43 (36%); Disagree 19 (16%)
  Residents: Agree 20 (53%); Neutral 15 (39%); Disagree 3 (8%)
  Faculty radiologists: Agree 31 (76%); Neutral 10 (24%); Disagree 0
  Overall: P = 0.006 (a)**; Cramér’s V = 0.181; S vs. R: P = 0.468 (b); S vs. F: P = 0.001 (a)**; R vs. F: P = 0.042 (a)*

Note: a, Fisher’s exact test; b, χ2 test; *, P < 0.05; **, P < 0.01; ***, P < 0.001. S, medical students; R, residents; F, faculty radiologists
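
As a consistency check, the following minimal SciPy sketch (an illustration, not the authors’ SPSS workflow) recomputes the Q8 comparison directly from the counts in Table 1: a 3×2 contingency table (group × “Platform”/“No idea”), a χ2 test, and Cramér’s V.

```python
# A minimal sketch recomputing the Q8 row of Table 1 from its raw counts.
import numpy as np
from scipy.stats import chi2_contingency
from scipy.stats.contingency import association

q8 = np.array([
    [78, 42],  # medical students: Platform / No idea
    [32, 6],   # residents
    [34, 7],   # faculty radiologists
])
chi2, p, dof, expected = chi2_contingency(q8)
v = association(q8, method="cramer")
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p:.3f}, Cramér's V = {v:.3f}")
# -> P ≈ 0.017 and V ≈ 0.203, matching the values reported in Table 1
```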

Interest and experience in AI

A significant proportion of respondents reported having no prior experience with integrating AI into their existing teaching, learning, or training procedures. Only 12% of the faculty radiologists, 3% of the residents, and 7% of the students were well acquainted with the implementation of AI in radiology education; most had limited knowledge or lacked a fundamental understanding of it, a gap particularly evident among students (P < 0.05). Nevertheless, most respondents were interested in integrating AI into radiology education and expressed a wish to first practice reading images and writing reports on an AI training platform before entering the clinical practice stage. This desire was more pronounced among faculty radiologists and residents (P < 0.05). The faculty radiologists expressed a high level of enthusiasm, believing that an AI training platform could enhance training efficiency (83%) and boost learners’ confidence (76%). In contrast, only approximately half of the residents and medical students were similarly enthusiastic about the potential of an AI platform to improve training efficiency and confidence, while the others were neutral or concerned (P < 0.05) (Fig. 1; Table 1). The reliability of the questionnaire was acceptable (Cronbach’s alpha = 0.734).

To provide a deeper understanding of how perceptions differ by experience or stage of training, further subgroup analyses were conducted within each group. These revealed that, regarding the question “Do you have interest in the application of AI in radiology education?”, some preclinical students showed more neutral or negative attitudes than clinical students (P < 0.05). There were no statistically significant differences between junior/intermediate radiologists and senior radiologists, between junior and senior residents, or between preclinical and clinical students for the remaining items (P > 0.05) (Tables S1-S3).

Suggestions and concerns for integrating AI into radiology education

Free-text responses dealing with suggestions and concerns related to the AI training platform were also gathered. Two independent reviewers evaluated the text responses to Q11 and Q12, and Cohen’s kappa values were calculated separately for each group (faculty, residents, and students). All Cohen’s kappa values were higher than 0.786 (all P < 0.001), indicating substantial to almost perfect agreement and consistent classification between the two reviewers. Most instructors and learners put forward several proposals for the development of AI platforms in radiology education; approximately 32.5% of students offered no suggestions, citing unfamiliarity with AI. The top three suggestions were enabling learners to search for and select cases according to their requirements, allowing them to read images and write radiological reports as in real clinical work, and providing timely feedback to address their weaknesses. Other suggestions included providing a wide range of knowledge for learners: trainees could acquire the imaging features and differential diagnoses of diseases, access the latest research progress in imaging, and obtain clinical knowledge spanning from basic symptoms and diagnostic criteria to cutting-edge treatment regimens.

Instructors and learners also presented their viewpoints regarding the disadvantages. Their main concerns lie in the software’s performance, including AI reliability, flexibility, feedback accuracy, and interactivity. They wonder whether the system is intelligent enough to flexibly understand the diagnostic reports written by learners, ensuring that different words conveying the same meaning do not result in misjudgment; whether it can precisely provide feedback on learners’ effectiveness and progress; and whether it can deliver personalized, precise services. They are also worried about poor interactivity and the absence of communication between learners and instructors. Other concerns include potential deficiencies in integration with clinical practice: the platform may lack clinical thinking and sufficient knowledge expansion, may not be as vivid as the explanations given by teaching instructors, and cannot convey the sense of achievement and responsibility found in clinical work. Learners may also rely too heavily on it and become reluctant to think actively, and it may fail to achieve truly personalized and precise education.

Discussion

The exponential growth of medical imaging knowledge in recent years poses new challenges for both learners and educators responsible for training the next generation of radiologists. Once trainees enter clinical practice, the focus of education often shifts due to practical considerations. Currently, there exists a disconnect between theoretical learning and clinical practice in radiology education [5], manifested in the poor quality of radiological reports and low training efficiency. Incorporating AI into teaching and training is a prospect that might provide exciting new opportunities for knowledge acquisition and for overcoming educational deficiencies. Moreover, as AI technologies advance, how to integrate AI into radiology education and attain better teaching and training outcomes is a matter that deserves careful consideration. This study surveyed faculty radiologists and learners at various stages of radiology education, from medical school through residency training, to assess attitudes and requirements regarding the integration of AI in radiology education from diverse perspectives. The findings revealed that despite the participants’ limited objective knowledge about the implementation of AI in radiology education, most viewed AI applications positively, considering AI a valuable training tool. Furthermore, the most favored AI applications in education were portable, interactive training platforms, which could be introduced in medical school and radiology residency training as a primary didactic tool for pre-clinical training. This survey serves as an initial subjective assessment of educators’ and learners’ specific requirements related to AI in radiology education, potentially laying the groundwork for future in-depth research and guiding the implementation of AI in clinical radiology teaching and training.

Residents and students commonly experience a certain level of psychological stress during the early stage of clinical practice. This stress primarily arises from an inability to promptly and effectively integrate theoretical knowledge with clinical practice, resulting in inadequate adaptation to clinical work. The application of AI in radiology education is still very restricted, although the notion of using AI to enhance education is not novel. Most faculty radiologists, residents, and students reported limited familiarity with AI education and a lack of practical experience in using AI for teaching or training purposes. Despite this limited experience, most participants expressed enthusiasm for incorporating AI in radiology education, particularly faculty radiologists and residents who have already witnessed the impact of AI in clinical practice. This highlights a mismatch between the existing interest in leveraging AI for educational purposes and the insufficient availability of relevant resources for both educators and learners. Other studies have likewise found few instances of formal integration of AI into radiology teaching and training [6–8]. Positive attitudes could, thus, promote the adoption of AI in radiology education. AI in radiology education could mainly be employed for foundational training, such as for medical students or junior residents, to personalize learning experiences and tailor them to individual trainees. For advanced training, senior residents will still have the chance to interact directly with cases at a picture archiving and communication system (PACS) reading station or to consult and communicate directly with educators. Hence, radiology medical students and junior residents might represent a considerable untapped resource for those currently engaged in AI education development and research. The development of AI educational resources offers both an opportunity and a challenge.

Radiologists overwhelmingly expressed enthusiasm and optimism regarding the potential benefits of AI in radiology education, including stress reduction, confidence enhancement, and improved training efficacy. Faculty radiologists, owing to their extensive clinical experience, are likely to have had more exposure to AI applications, and this familiarity may lead to a more positive view of AI. In contrast, the residents and medical students held more cautious expectations, characterized by a mix of optimism and skepticism. The difference in expectations between educators and learners regarding the role of AI highlights some worries among learners, who may be more concerned about the potential threats AI poses to their future careers and about the reliability of AI-generated feedback. The top two concerns regarding AI in radiology education in this study revolve around feedback accuracy and interactivity limitations, which may affect the overall effectiveness of the learning experience: the system might be overly rigid and incapable of understanding the content of radiological reports written by learners, potentially misinterpreting different words that express the same meaning, and there may be a deficiency of real-time interaction and discussion between learners and educators. Addressing these concerns is essential to ensure that AI-driven radiology education is effective, efficient, and learner-centered. Although attitudes towards AI varied among the broad categories (faculty, residents, students), subgroup comparisons within the same category across age groups or training stages showed few significant differences. This suggests that, because the integration of AI into radiology education is a relatively new application, each category tends to hold relatively consistent views; faculty radiologists, who had more exposure to AI in clinical practice, tended toward positive attitudes to integrating AI into education.

In the future, longitudinal studies with regular surveys at predefined intervals will be carried out to track changes in learners’ perceptions as they are exposed to the AI platform. These surveys will cover various aspects of their experiences with AI, including initial impressions, changes in confidence in using AI-assisted learning, and how their views on the role of AI in their future careers develop over time. It is also a challenge for developers and researchers to establish more valid and reliable measures of learners’ performance. To determine and verify the efficacy of AI in education, a series of controlled experiments is essential to precisely evaluate the impact of AI platforms on learning outcomes and stress reduction: trainees using AI-integrated learning platforms would form the experimental group, while those using traditional learning methods would form the control group; the final theoretical exam scores and image-reading skills test scores of the two groups would then be compared, and learners’ satisfaction with the AI platform and its capacity to relieve psychological stress would be surveyed by questionnaire. If AI demonstrates effectiveness in enhancing trainee performance in these evaluations, further longitudinal studies can be conducted to track trainees’ long-term performance.

The utilization of online resources such as imaging case sets, question banks, and tests has gained popularity among residents, offering a means to enhance learning, retain knowledge, and assess competence [9]. Residents prefer these online tools for their convenience, flexibility, and personalized learning experience. A well-designed, dynamic radiology training platform integrated with AI could furnish residents and medical students with tools to optimize their learning outcomes and improve knowledge retention. Based on the current educational barriers and challenges, as well as the concerns and needs of instructors and learners regarding the integration of AI in education, our team has tentatively proposed an AI-integrated platform framework comprising a development flowchart, function menu, and deployment methods, as shown in Fig. 2A-C. The development process includes product design, program development, art design, testing support, and technical support (Fig. 2A). The main function menus of the platform include standardized imaging case sets, intelligent interactive learning, self-quizzes, and online exams (Fig. 2B). The components and functionalities of the AI education platform framework are explained in the subsequent sections. The system structure primarily consists of information acquisition, the dataset, a data server, an AI server, teacher-side terminals, and student-side terminals (Fig. 2C).

Fig. 2. Framework for AI-assisted radiology education platform, including the development flowchart (A), function menu (B) and deployment methods (C)
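
The deployment structure sketched in Fig. 2C can be summarized as typed configuration. The sketch below is purely illustrative: the component names follow the figure, but the class names, hosts, and ports are hypothetical placeholders, not values from the paper.

```python
# A hypothetical sketch of the Fig. 2C deployment as typed configuration:
# a data server (anonymized case sets), an AI server (report analysis and
# feedback models), and teacher-/student-side terminals.
from dataclasses import dataclass, field

@dataclass
class ServerConfig:
    host: str   # placeholder hostname
    port: int   # placeholder port

@dataclass
class PlatformDeployment:
    data_server: ServerConfig
    ai_server: ServerConfig
    teacher_terminals: list[str] = field(default_factory=list)
    student_terminals: list[str] = field(default_factory=list)

deployment = PlatformDeployment(
    data_server=ServerConfig("data.example.internal", 5432),
    ai_server=ServerConfig("ai.example.internal", 8080),
    teacher_terminals=["teacher-01"],
    student_terminals=["student-01", "student-02"],
)
```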

The imaging case sets can supply abundant educational resources to learners. In contrast to the conventional top-down teaching approach, in which students and residents first acquire diagnostic skills from experienced radiologists and then apply them in practice, case-based learning offers an alternative paradigm favored by radiology students and educators for its effectiveness in allowing learners to gain first-hand experience in diagnosing and interpreting cases themselves [10]. This approach is regarded as more effective than the traditional one [11–13]. The case-based AI teaching platform can support this bottom-up learning approach, and the curated case sets with high educational value can be organized by disease system, subspecialty, and clinical diagnosis, forming a tree-structured classification map of medical imaging. Cases cover both common and rare diagnoses, with 5–10 examples for each disease, and can be searched through semi-automated selection criteria. The case selection process is guided by four key criteria, all of which must be met simultaneously: (1) the case should be a disease type stipulated in the teaching syllabus for undergraduate students majoring in medical imaging or in the training rules for residents, and should align with the seniority and ability of the teaching and training targets; (2) the case should be a disease type for which trainees tend to make common mistakes when writing radiological reports; (3) the case should provide a comprehensive set of clinical, laboratory, and imaging examination data, moving beyond mere image description to facilitate a deeper understanding of the disease process (with all basic information and images anonymized); and (4) the case should feature high-quality images with significant teaching value, enabling learners to develop clinical thinking and decision-making skills through their interpretation. In addition, typical cases of teaching significance that newly arise in clinical work will be uploaded regularly to continuously replenish and update the case database, keeping it useful over time.
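
A hypothetical sketch of this four-criteria selection logic follows. The record fields (in_syllabus, error_prone, has_full_workup, image_quality) and the quality threshold are assumptions for illustration, not the platform’s actual schema.

```python
# A minimal sketch, under an assumed case schema, of applying the four
# simultaneous case-selection criteria described above.
from dataclasses import dataclass

@dataclass
class TeachingCase:
    diagnosis: str
    trainee_level: str       # "undergraduate" | "resident"
    in_syllabus: bool        # criterion 1: covered by syllabus/training rules
    error_prone: bool        # criterion 2: common report-writing mistakes
    has_full_workup: bool    # criterion 3: clinical + lab + imaging data
    image_quality: int       # criterion 4: hypothetical 1-5 faculty rating

def eligible(case: TeachingCase, level: str) -> bool:
    """All four criteria must be met simultaneously."""
    return (case.in_syllabus and case.trainee_level == level
            and case.error_prone
            and case.has_full_workup
            and case.image_quality >= 4)

cases = [
    TeachingCase("meningioma", "resident", True, True, True, 5),
    TeachingCase("rare sarcoma", "resident", False, True, True, 4),
]
print([c.diagnosis for c in cases if eligible(c, "resident")])  # ['meningioma']
```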

The primary products of medical radiology are radiological reports. Traditional teaching methods largely depend on the interpretation of live clinical cases at the PACS workstation [14, 15]. However, in busy clinical settings, it is challenging for faculty radiologists to provide detailed feedback on every report due to time constraints; typically, they make several alterations to a report but choose only the key points to emphasize and explain. Junior residents often prefer to receive more feedback, both major and minor, and seek to understand the changes made to their reports. Moreover, due to heavy clinical workloads, resident educational training is often scheduled based on work demands rather than individual needs. Case-based simulations of reading images and writing reports might help foster a subjective sense of preparedness and confidence in trainees [16]. The intelligent interactive learning module provides situational learning with real-time formative feedback tailored to specific learning skills, covering undergraduate education, specialized training, and continuing education. It encompasses various features, including image reading, radiological report writing, report rating and scoring, AI analysis of reports, feedback, pushing of relevant teaching points and related cases, growth curves, and a communication section. Students and residents can search for and select cases according to their individual circumstances and then engage in image reading, identification of imaging signs, and report writing. The module includes common small tools for measuring size and density, marking lesions, zooming, inverting, adjusting image settings, and so forth. Because these image-reading tools and the report-writing interface resemble the clinical working mode of the radiology department, students can acquire a genuine, immersive learning experience. Automated error detection in radiological reports and immediate feedback analysis offer learners the chance to review the cases they missed or misinterpreted [17]. Furthermore, when learners read similar cases, the system can supply additional cases from the teaching case set, akin to “suggested readings”, and can emphasize and spotlight relevant teaching points to enhance retention and comprehension. By tracking and analyzing large datasets on learners’ performance and progress [18, 19], it is feasible to provide customized, personalized education for residents and students, optimizing their skill levels [20]. For example, the platform can record learning activities; track progress and proficiency; identify weaknesses, learning gaps, and frequent mistakes; adjust the training content according to performance; and design customized complementary curricula or learning plans to fill these gaps and meet learners’ requirements [21–23]. The necessary workload is considerably less than the current manual effort, so these AI applications offer hope for alleviating the growing burden on faculty radiologists and residents. Within the intelligent interactive learning module, there is a communication zone where learners can pose questions about cases or express different viewpoints and opinions; learners can leave a message in this section, and teachers or instructors will provide timely responses.
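
To make the report-feedback step concrete, here is a deliberately naive sketch that compares a trainee’s report against reference key findings using a toy synonym map. The synonym table, function names, and example report are all hypothetical; a production system would need a medical NLP model to judge that different wordings carry the same meaning, which is precisely the flexibility respondents were concerned about.

```python
# A minimal, illustrative sketch of automated report checking: flag
# reference findings whose (synonym-normalized) terms are absent from
# the trainee's report. This is a toy baseline, not a real NLP system.
SYNONYMS = {"hyperdense": "high-density", "hypodense": "low-density"}

def normalize(text: str) -> set[str]:
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    return {SYNONYMS.get(t, t) for t in tokens}

def missing_findings(trainee_report: str, key_findings: list[str]) -> list[str]:
    """Return reference findings not covered by the trainee's report."""
    tokens = normalize(trainee_report)
    return [f for f in key_findings if not normalize(f) <= tokens]

report = "High-density lesion in the right basal ganglia."
findings = ["hyperdense lesion", "midline shift"]
print(missing_findings(report, findings))  # ['midline shift']
```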

The AI platform’s self-quiz module randomly assigns learners a set number of cases to review and diagnose, providing a grade upon completion. The platform can offer immediate, objective, specific, and precise feedback and evaluations and can emphasize relevant teaching points. It can also intersperse cases that users previously answered incorrectly, helping them address and consolidate knowledge deficiencies, which has a positive effect on the learning experience [23]. More significantly, the platform will analyze extensive historical data, including learners’ learning duration, learning trajectory, and test results, to customize individualized training for each learner. This approach facilitates personalized education and greatly advances precision education in radiology [20].
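
One way to implement the quiz-assembly behavior described above is sketched below: draw a fixed number of cases at random, but reserve a share of slots for cases the learner previously missed. The function name, the 30% missed-case share, and the data structures are assumptions for illustration only.

```python
# A minimal sketch, under assumed data structures, of self-quiz assignment
# that intersperses previously missed cases (a spaced-testing pattern).
import random

def build_quiz(case_pool: list[str], missed: list[str], n: int,
               missed_share: float = 0.3, seed: int | None = None) -> list[str]:
    rng = random.Random(seed)
    n_missed = min(len(missed), int(n * missed_share))
    quiz = rng.sample(missed, n_missed)              # revisit weak spots
    remaining = [c for c in case_pool if c not in quiz]
    quiz += rng.sample(remaining, n - n_missed)      # fill with fresh cases
    rng.shuffle(quiz)
    return quiz

pool = [f"case-{i:03d}" for i in range(50)]
previously_missed = ["case-007", "case-023"]
print(build_quiz(pool, previously_missed, n=10, seed=42))
```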

The human-computer dialogue examination mode is the prevailing method for medical imaging exams. Compared with traditional paper-based exams, online exams can present case images more faithfully, which closely conforms to the clinical practice and examination requirements of the imaging profession, thus standardizing medical imaging exams. The platform supports various kinds of exam questions, such as single-choice, multiple-choice, fill-in-the-blank, true-or-false, short-answer, and essay-style questions. An exam duration can be set, other learning modules can be locked during the exam, and operations unrelated to the exam are not permitted. Moreover, the platform offers options for both automatic and manual scoring; for objective questions, automatic scoring saves teachers’ time and enhances exam efficiency. The platform also supports reviewing test papers and analyzing, summarizing, printing, and exporting exam results. Teachers can use the platform to simulate various types of medical imaging exams tailored to specific needs, including question types, difficulty levels, and assessment content. This helps learners become familiar with online exam formats, such as graduation and professional title exams, at an earlier stage.
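
For the automatic-scoring path, a minimal sketch of grading objective questions follows. The all-or-nothing scoring rule and flat per-question points are assumptions for illustration, not the platform’s actual policy.

```python
# A hypothetical sketch of automatic scoring for objective question types:
# single-choice, multiple-choice, and true-or-false answers modeled as sets.
def score_exam(key: dict[str, set[str]], answers: dict[str, set[str]],
               points_per_question: int = 2) -> int:
    total = 0
    for qid, correct in key.items():
        if answers.get(qid, set()) == correct:   # exact match required
            total += points_per_question
    return total

answer_key = {"q1": {"B"}, "q2": {"A", "C"}, "q3": {"True"}}
submission = {"q1": {"B"}, "q2": {"A"}, "q3": {"True"}}
print(score_exam(answer_key, submission))  # 4 (incomplete q2 scores 0)
```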

In teaching practice and existing teaching platform frameworks, separate standardized imaging case sets and online exams are available, but intelligent interactive learning and intelligent self-quizzes are rare. According to the current survey, faculty radiologists, residents, and medical students consider that, in AI teaching and training, the development of an intelligent training platform should be given priority. We therefore proposed an AI-driven teaching platform framework integrating standardized imaging case sets, intelligent interactive learning, self-quizzes, and online exams. Such a framework can adapt to various learning phases and styles: it extends undergraduate teaching beyond the classroom and acts as a valuable supplement to the standardized curriculum; students can use the AI platform to study and practice independently in extracurricular time; and it can also be employed in junior residents’ training programs. AI-driven training should be regarded as a complement to traditional training methods rather than a substitute. Applying AI to a radiology online training platform could provide students and residents with a transitional adaptation period before clinical practice. It could also help identify individual knowledge gaps and learning needs that learners might otherwise overlook, offer educational content tailored to their needs and learning styles, and enhance clinical skills achievement and positive participation [18, 24–26]. Combining AI platform training and automated feedback with traditional learning approaches would improve learning and training efficiency and promote innovation in radiology education. Additionally, since a large part of our assessment of learners is subjective, it is challenging to identify trends in training effectiveness; AI, however, can recognize and manage patterns by annotating learners’ performance data, allowing their competence to be evaluated continuously and objectively [18, 25]. Assigning routine tasks, such as imparting theoretical knowledge and clinical skills and evaluating exams, to an AI platform will reduce the teaching pressure on the radiologists in charge of these tasks [3, 27, 28], leaving them more time to focus on clinical work and scientific research. Furthermore, this platform would not only aid local development of teaching and training but also offer access on a national scale, unrestricted by time and space. This is particularly significant in areas where high-quality teaching resources and a variety of radiology cases are scarce: it can enable cross-regional and inter-school resource sharing and help address the problem of unevenly distributed educational resources. In summary, just as AI has been rapidly adopted into clinical practice, the time has arrived for educators to embrace AI. The potential to create value and advance radiology education is undeniable.

AI has the potential to advance precision education in radiology, but several obstacles must still be overcome. First, there are content-construction challenges. Producing high-quality medical imaging teaching cases requires a significant amount of professionals’ time, and it is challenging to combine medical imaging knowledge organically with AI technology. Teaching cases should cover not only the interpretation of traditional medical images but also the use of AI tools and the principles of the underlying algorithms, and teaching content must be continuously updated in line with the rapid development of medical imaging technology and AI to ensure its timeliness. Second, there are technological challenges. AI algorithms for medical imaging education must balance accuracy and interpretability: complex deep learning algorithms boast high precision, but their decision-making processes are difficult to explain, which hinders teaching and understanding. Different medical imaging tasks (such as identifying anatomical structures, analyzing imaging features, and diagnosing diseases) also require different algorithms. Third, there are personnel and financial challenges. Building the platform requires cross-disciplinary talent, including medical experts, AI engineers, and education experts, yet such versatile professionals are currently in short supply. Construction involves multiple expenses, such as software development, server purchases, data storage, and personnel salaries, and continuous financial support is needed for subsequent maintenance, updates, and iterations. Fourth, there are challenges in application and promotion. Some teachers and students may question the accuracy of AI diagnoses or worry about its impact on their professional judgment. AI lacks empathy and common sense and is unable to improvise or understand nuance, and learners are concerned about the accuracy and interactivity of AI feedback. It must be acknowledged that the use of AI in education requires appropriate supervision and research before its introduction, to prevent AI systems from providing trainees with inaccurate or substandard teaching. Appropriately addressing these issues will be important for the successful establishment and implementation of an AI-driven radiology education training platform.

There are two limitations to the current survey: (1) The respondents were self-selected and confined to Xuanwu Hospital, which restricts generalizability. Future multi-institutional research will encompass a more diverse, multi-regional sample to validate and expand our findings; this will not only enhance the robustness of the research but also contribute to a more comprehensive understanding of the topic in diverse geographical, cultural, and educational settings. (2) Although the questionnaire incorporated some open-ended, free-text questions, the scope of response options in the questionnaire design remained constrained. The questionnaire was intended to offer a comprehensive overview of AI implementation in radiology education, but this approach restricted the capacity to capture more specific viewpoints and requirements; it could be improved through group discussions and greater use of open-ended questions.

Conclusions

In conclusion, the faculty radiologists, residents, and medical students involved in this survey all demonstrated a positive attitude towards the implementation of AI in radiology education, with faculty radiologists expressing higher satisfaction and greater confidence in AI. Furthermore, a framework for an AI-assisted radiology education training platform was tentatively proposed by our group and could be employed in the clinical skills training of medical students and residents. While AI integration holds promise for enhancing radiology education, addressing concerns related to implementation, feedback accuracy, and user readiness is essential for its successful adoption. At present, AI education is still in the early stages of exploration; it demands dynamic domestic and international collaboration from clinical, research, and educational perspectives, along with extensive research, development, and real-world validation. It is hoped that in the future, AI will not only transform the way we practice our professions and achieve ‘precision medical diagnosis and treatment’, but also innovate educational models, change the way we teach and learn, and ultimately achieve ‘personalized precision medical education’.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1 (26.9KB, docx)

Acknowledgements

The authors thank the Xuanwu Hospital Teaching Affairs office for their support, and all the respondents for their participation.

Author contributions

R.L. and G.L. contributed to study design, data collection, data analysis, and wrote the first draft of the manuscript; M.Z. and D.R. contributed to data collection and manuscript editing; Z.S. and Y.S. contributed to data collection; J.L. contributed to study design, data collection, and manuscript editing. All authors reviewed the manuscript.

Funding

This study was supported by Beijing Education Science 13th Five-Year Plan Project under Grant [No.CHDB2020174]; Beijing Residential Training Quality Improvement Project under Grant [No.2023007, 2023033, 2024011]; and Research Project on Education and Teaching Reform of Capital Medical University under Grant [No.2023JYY136, 2024JYY112].

Data availability

The datasets used and/or analysed during the current study are available from the corresponding author upon reasonable request.

Declarations

Ethics approval and consent to participate

In this study, data collection and processing were completely anonymized, without any identifying information. Since this study does not fall into the category of life science and medical research involving human subjects, the Ethics Committee of Xuanwu Hospital Capital Medical University granted a general waiver for the use of purely anonymized data, and a formal application for approval was not required, in accordance with the second paragraph of Article 32 of the ‘Measures for Ethical Review of Life Sciences and Medical Research Involving Human Beings’, published by the National Health Commission, Ministry of Education, Ministry of Science and Technology of China, and National Administration of Traditional Chinese Medicine under document number [2023] 4 (URL: https://www.gov.cn/zhengce/zhengceku/2023-02/28/content_5743658.htm).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Ruili Li and Guangxue Liu contributed equally to this work.

References

1. Gore JC. Artificial intelligence in medical imaging. Magn Reson Imaging. 2020;68:A1–4. 10.1016/j.mri.2019.12.006.
2. European Society of Radiology (ESR). What the radiologist should know about artificial intelligence - an ESR white paper. Insights Imaging. 2019;10(1):44. 10.1186/s13244-019-0738-2.
3. Pesapane F, Codari M, Sardanelli F. Artificial intelligence in medical imaging: threat or opportunity? Radiologists again at the forefront of innovation in medicine. Eur Radiol Exp. 2018;2(1):35. 10.1186/s41747-018-0061-6.
4. Slanetz PJ, Daye D, Chen PH, Salkowski LR. Artificial intelligence and machine learning in radiology education is ready for prime time. J Am Coll Radiol. 2020;17(12):1705–7. 10.1016/j.jacr.2020.04.022.
5. Simelane T, Ryan DJ, Stoyanov S, et al. Bridging the divide between medical school and clinical practice: identification of six key learning outcomes for an undergraduate preparatory course in radiology. Insights Imaging. 2021;12(1):17. 10.1186/s13244-021-00971-1.
6. Hashmi OU, Chan N, de Vries CF, et al. Artificial intelligence in radiology: trainees want more. Clin Radiol. 2023;78(4):e336–41. 10.1016/j.crad.2022.12.017.
7. Khafaji MA, Safhi MA, Albadawi RH, et al. Artificial intelligence in radiology: are Saudi residents ready, prepared, and knowledgeable? Saudi Med J. 2022;43(1):53–60. 10.15537/smj.2022.43.1.20210337.
8. Lindqwister AL, Hassanpour S, Lewis PJ, Sin JM. AI-RADS: an artificial intelligence curriculum for residents. Acad Radiol. 2021;28(12):1810–6. 10.1016/j.acra.2020.09.017.
9. Jordan SG, Nyberg EM, Catanzano TM, Davis LP, Lewis PJ. RadExam turns 1: offering solutions to radiology residencies. J Am Coll Radiol. 2019;16(9 Pt A):1206–10. 10.1016/j.jacr.2019.02.035.
10. Maleck M, Fischer MR, Kammer B, et al. Do computers teach better? A media comparison study for case-based teaching in radiology. Radiographics. 2001;21(4):1025–32. 10.1148/radiographics.21.4.g01jl091025.
11. Kumar V, Gadbury-Amyot CC. A case-based and team-based learning model in oral and maxillofacial radiology. J Dent Educ. 2012;76(3):330–7. PMID: 22383601.
12. Terashita T, Tamura N, Kisa K, Kawabata H, Ogasawara K. Problem-based learning for radiological technologists: a comparison of student attitudes toward plain radiography. BMC Med Educ. 2016;16(1):236. 10.1186/s12909-016-0753-7.
13. Welter P, Deserno TM, Fischer B, Gunther RW, Spreckelsen C. Towards case-based medical learning in radiological decision making using content-based image retrieval. BMC Med Inf Decis Mak. 2011;11:68. 10.1186/1472-6947-11-68.
14. Sabir SH, Aran S, Abujudeh H. Simulation-based training in radiology. J Am Coll Radiol. 2014;11(5):512–7. 10.1016/j.jacr.2013.02.008.
15. Chetlen AL, Mendiratta-Lala M, Probyn L, et al. Conventional medical education and the history of simulation in radiology. Acad Radiol. 2015;22(10):1252–67. 10.1016/j.acra.2015.07.003.
16. Towbin AJ, Paterson BE, Chang PJ. Computer-based simulator for radiology: an educational tool. Radiographics. 2008;28(1):309–16. 10.1148/rg.281075051.
17. Waite S, Farooq Z, Grigorian A, et al. A review of perceptual expertise in radiology - how it develops, how we can test it, and why humans still matter in the era of artificial intelligence. Acad Radiol. 2020;27(1):26–38. 10.1016/j.acra.2019.08.018.
18. Duong MT, Rauschecker AM, Rudie JD, et al. Artificial intelligence for precision education in radiology. Br J Radiol. 2019;92(1103):20190389. 10.1259/bjr.20190389.
19. Lillehaug SI, Lajoie SP. AI in medical education - another grand challenge for medical informatics. Artif Intell Med. 1998;12(3):197–225. 10.1016/s0933-3657(97)00054-7.
20. Simpson SA, Cook TS. Artificial intelligence and the trainee experience in radiology. J Am Coll Radiol. 2020;17(11):1388–93. 10.1016/j.jacr.2020.09.028.
21. Khumrin P, Ryan A, Judd T, Verspoor K. Diagnostic machine learning models for acute abdominal pain: towards an e-learning tool for medical students. Stud Health Technol Inf. 2017;245:447–51. PMID: 29295134.
22. Awan O, Dey C, Salts H, et al. Making learning fun: gaming in radiology education. Acad Radiol. 2019;26(8):1127–36. 10.1016/j.acra.2019.02.020.
23. Morin CE, Hostetter JM, Jeudy J, et al. Spaced radiology: encouraging durable memory using spaced testing in pediatric radiology. Pediatr Radiol. 2019;49(8):990–9. 10.1007/s00247-019-04415-3.
24. Mazurowski MA. Artificial intelligence may cause a significant disruption to the radiology workforce. J Am Coll Radiol. 2019;16(8):1077–82. 10.1016/j.jacr.2019.01.026.
25. Chan KS, Zary N. Applications and challenges of implementing artificial intelligence in medical education: integrative review. JMIR Med Educ. 2019;5(1):e13930. 10.2196/13930.
26. Tajmir SH, Alkasab TK. Toward augmented radiologists: changes in radiology education in the era of machine learning and artificial intelligence. Acad Radiol. 2018;25(6):747–50. 10.1016/j.acra.2018.03.007.
27. Thrall JH, Li X, Li Q, et al. Artificial intelligence and machine learning in radiology: opportunities, challenges, pitfalls, and criteria for success. J Am Coll Radiol. 2018;15(3 Pt B):504–8. 10.1016/j.jacr.2017.12.026.
28. Masters K. Artificial intelligence in medical education. Med Teach. 2019;41(9):976–80. 10.1080/0142159X.2019.1595557.


