Abstract
Artificial intelligence (AI) is transforming radiology, with nearly 80% of approved AI as medical devices (AIaMDs) being imaging-related. As AI adoption accelerates, radiology training programs must evolve to equip future radiologists with the skills to critically evaluate, implement, and integrate AI into clinical practice. However, despite AI's growing role, its inclusion in medical curricula remains inconsistent, and assessment of AI competency is lacking. This review explores the current state of AI in UK medical training curricula, with a more in-depth focus on radiology. We discuss the potential impact of AI on competency evaluations, including the Fellowship of the Royal College of Radiologists (FRCR) examinations, Annual Review of Competence Progression (ARCP), and on-call assessments. Additionally, we examine how AI-driven educational resources, such as AI-assisted training platforms, could enhance radiology education. To future-proof radiology training and careers, we propose strategies to build and evaluate AI literacy, including nationally structured AI teaching and AI-focused assessments. Addressing these challenges will be crucial in ensuring that radiologists remain at the forefront of digital healthcare transformation while maintaining their core diagnostic expertise.
Keywords: radiology, artificial intelligence, ethics, education, assessment
Introduction
At present, the vast majority (almost 80%) of FDA and CE approved artificial intelligence (AI) as medical devices (AIaMDs) are related to radiology, with significant annual growth and nearly 200 additional devices added to the list in the last year alone.1 Nevertheless, without a clear understanding of how these digital innovations work, when to use them, and how to critically appraise their value, it will be challenging to adopt and trust these tools and translate their benefit to patient outcomes. The EU Artificial Intelligence Act underscores the importance of AI literacy, stating that providers and deployers of AI systems must ensure their staff achieve a sufficient understanding of AI.2 The UK government also acknowledges the importance of education and the need for AI-ready healthcare staff, having already funded several Topol Digital Fellowships (an NHS Digital Academy programme designed to equip fellows with new skills and knowledge to lead digital transformation),3 with Health Education England (HEE) part-funding several Clinical AI Fellowships for trainees4 (focussed on AI implementation) and also publishing frameworks on how to improve staff confidence and trust in digital innovations.5 These directives support the argument that integrating AI into the curricula across all specialties is essential to prepare clinicians for the evolving demands of modern healthcare.2
Nevertheless, only a few medical specialties explicitly include AI in their curricula.6 In the United Kingdom, the resident doctors' "Foundation programme" and the specialist clinical radiology and clinical oncology syllabi are notable exceptions.7–9 These curricula state that trainees should understand the principles of data analytics and AI, but these skills are not currently assessed in any postgraduate medical examinations or appraisals. In contrast, other specialties such as cardiology and general practice do not explicitly require knowledge of AI but do encourage familiarity with "emerging technologies".10,11 At present, other specialties such as general surgery and internal medicine omit AI entirely from their training frameworks12,13 (Table 1).6–17
Table 1.
Incorporation of AI/digital technologies into training curricula for different specialties.6–17
| Specialty | AI included? | Notes/details |
|---|---|---|
| Clinical radiology | Yes | Includes AI and digital health in imaging and diagnostics |
| Clinical oncology | Yes | Includes AI and digital health in imaging and diagnostics |
| Foundation program | Yes | Includes understanding the principles of data analytics and AI |
| Cardiology | No | No explicit mention of AI; digital health tools are more broadly discussed |
| General practice | No | Lacks explicit AI; does touch on digital health tools |
| Psychiatry/general surgery/paediatrics/emergency medicine/internal medicine | No | No explicit reference to AI in the training curriculum |
| Dermatology | No | AI not included despite commercial autonomous AI diagnostic tools available |
As AI continues to evolve rapidly, the role of radiologists (and of other healthcare specialties) will inevitably change, with the expectation that they have the skills and awareness to integrate and use AI safely and responsibly in daily practice. Given that commercial AI offerings are concentrated in the radiology space, how we evaluate and train future radiologists for the demands of a digitally enhanced consultant career will be critical, not only so they can train the next generation of doctors but also so that radiology can lead the way for other medical specialties.
This thought leadership piece, written by radiology trainees and fellows with a keen interest in AI, explores the potential challenges of, and solutions for, evaluating AI knowledge among current radiology trainees. Although the article focuses on UK radiology training, AI in education is a global concern and the discussion is generalizable to other countries.
Current role of AI in medical education
Following medical school and a 2-year foundation programme as a resident doctor (with optional additional years in some cases), medical specialty training in the United Kingdom ranges from a further 3 years for general practice to up to 8 years for other specialties, such as paediatrics.18
Radiology specialty training typically spans 5 years if undertaken full-time, during which trainees must meet curriculum requirements, pass Annual Review of Competence Progression (ARCP) assessments each year, and pass the clinical radiology fellowship examinations to obtain a Certificate of Completion of Training (CCT). The curriculum ensures radiologists are competent in general and emergency radiology and in up to 2 subspecialty areas, and are equipped with multidisciplinary team working, leadership, education, and management skills.8
In 2019, the Topol Review was published, emphasizing the need to increase digital literacy across the NHS and the general public in preparation for an imminent digital health era. A key emphasis was the need to formally educate the medical workforce on AI and to develop relevant technology-related educational resources.19 In response, Health Education England (HEE) introduced "Knowledge for Healthcare", an initiative to ensure that all NHS staff and learners have equitable access to high-quality knowledge services.20 Similarly, the 2021 clinical radiology curriculum was updated to include the expectation that trainees be able to critically appraise new technologies, including AI and machine learning applications.8 This encompasses understanding the data curation process, privacy and anonymization considerations, the limitations of AI based on its training data, and the basic concepts of radiomics (the quantitative analysis of radiological images).21
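To make the radiomics concept in the curriculum concrete, the sketch below computes a handful of first-order radiomic features from the pixels inside a region of interest. Real pipelines rely on dedicated libraries (eg, PyRadiomics) and many more feature classes; everything here, including the toy image, is purely illustrative.

```python
import numpy as np

def first_order_features(image: np.ndarray, mask: np.ndarray) -> dict:
    """Compute a few first-order radiomic features from the
    pixels inside a region-of-interest mask (illustrative only)."""
    voxels = image[mask.astype(bool)].astype(np.float64)
    mean = voxels.mean()
    std = voxels.std()
    return {
        "mean": float(mean),                                   # average intensity
        "std": float(std),                                     # intensity spread
        "skewness": float(((voxels - mean) ** 3).mean() / std**3),
        "energy": float((voxels**2).sum()),                    # sum of squared intensities
    }

# Toy example: a 4x4 "image" with a 2x2 region of interest
img = np.arange(16, dtype=float).reshape(4, 4)
roi = np.zeros((4, 4), dtype=bool)
roi[1:3, 1:3] = True
features = first_order_features(img, roi)
```

Features like these, extracted at scale, become the inputs that AI models train on, which is why the curriculum ties radiomics to understanding data curation and model limitations.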
However, assessing these competencies remains challenging. Many trainees face an increasing burden simply staying on top of their clinical knowledge, and it is easy to neglect this facet of the curriculum. Despite this, there is a great willingness to learn, as highlighted in a Canadian study in which 73% of oncology residents wished to learn more about AI22 and another study in which 98.7% of UK radiology trainees felt that AI should be taught during their training.23 Providing training and assessment is further complicated on the supervisor side: supervisors are often still learning about AI themselves and may feel unable to teach it or to judge a trainee's competency.
Although the Royal College of Radiologists (RCR) is adapting to this new era by introducing AI-focused training courses such as the "Clinical Radiology Artificial Intelligence: A Blended Learning Program",24 structured and mandatory education on AI is still lacking, and those who enrol in the course must pay for it out of their own pocket.
Radiology specialty training assessments
The current methods of assessing radiology trainees within the United Kingdom are outlined below, together with ideas on how AI might be evaluated through them, and on how AI may complicate usual radiology competency evaluation in the near future.
Annual review of competence progression
Plain films remain a significant part of the radiology curriculum, and trainees are expected to become proficient in reporting them. Although the exact number varies, many radiology training schemes expect trainees to report a certain number of plain films per year as part of the ARCP requirements. With the advent of AI for chest and musculoskeletal radiograph reporting, several challenges arise:
Firstly, if AI triages abnormal radiographs for urgent review by a consultant or reporting radiographer (and potentially reports high-confidence normal radiographs autonomously in the future), will there be enough examinations left for trainees to learn pattern recognition skills on these routine images? Furthermore, if enough radiographs do remain, will trainees be disproportionately exposed to particular types of cases, potentially biasing their sense of what is normal and abnormal?
Secondly, if AI reports still require human sign-off, how should this be implemented while still allowing effective training of residents? Should trainees have access to AI output reports during their training, even though this might introduce bias? Or should a certain number of "no-AI" plain films be reserved exclusively for trainee teaching, free from AI assistance? The latter approach introduces the ethical issue of having plain films that are only human-reported, and safeguards would be needed to ensure that patient care is not compromised. Another option may be to allow only fully independent reporters to see the AI output, so that residents would have to report independently of the AI. However, this would need to be feasible on a technical level and would require intent and planning.
These questions will need to be addressed at both local and national levels to establish a standardized approach. At present, radiologists do still need to be proficient in reporting plain films, as current UK regulations require trained human readers to verify AI-generated reports.25 However, as the legal framework evolves, and AI becomes more reliable, radiographic reporting may become a less critical component of ARCP requirements or of the radiologist’s typical task or job role.
Using AI to train radiologists
To address these challenges, we could use AI to teach trainees how to report, rather than letting it take away learning opportunities.26 This would be particularly relevant in an age of workforce shortages, when healthcare professionals may lack adequate time for teaching and training.27
An example of an AI-driven radiology education platform is RadBytes, created by a radiologist whilst in training, which integrates AI to enhance radiology training and assessment. RadBytes has pioneered a Radiology AI Tutor, tailored to plain film interpretation. This interactive tutor delivers real-time, personalized feedback on trainee reports, meeting the needs of the individual. By simulating the guidance of a mentor, the system enables trainees to refine their diagnostic accuracy in a clinically simulated, self-directed learning environment (Figure 1).
Figure 1.
Screenshot of the RadBytes radiology AI tutor. The AI responds to the user’s comments about the image in a manner similar to a human tutor.
Another application developed is the use of AI to mark radiology examinations, streamlining the assessment process while improving accuracy and consistency. Marking imaging reports can be time-consuming and subject to inter-rater variability. By leveraging AI-driven scoring algorithms, RadBytes created a framework for a more objective evaluation while increasing self-marking efficiency (Figure 2).
Figure 2.
Screenshot of the RadBytes radiology AI tutor automatically evaluating the user’s response as correct or incorrect. The AI interprets nuanced answers, allowing for correct assessments even when the wording does not exactly match the expected response.
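The kind of automated marking described above can be illustrated in a far simpler form than an LLM-based system such as RadBytes uses. The sketch below (the function name, findings, and marking scheme are all hypothetical) scores a free-text report by checking it against a list of expected findings, each with a set of acceptable synonyms:

```python
import re

def score_report(report: str, expected_findings: list[set[str]]) -> float:
    """Mark a free-text report against expected findings.

    Each expected finding is a set of acceptable synonyms; the finding
    counts as present if any synonym appears in the report. Returns the
    fraction of expected findings mentioned (0.0-1.0)."""
    words = set(re.findall(r"[a-z]+", report.lower()))
    hits = sum(1 for synonyms in expected_findings
               if words & {s.lower() for s in synonyms})
    return hits / len(expected_findings) if expected_findings else 0.0

# Hypothetical marking scheme for a chest radiograph case
expected = [
    {"pneumothorax"},                     # key diagnosis
    {"right"},                            # correct side
    {"drain", "aspiration", "referral"},  # any sensible next step
]
score = score_report(
    "Large right-sided pneumothorax. Recommend urgent chest drain.",
    expected,
)
```

A keyword matcher like this cannot handle negation ("no pneumothorax" would score as a hit) or nuanced phrasing, which is precisely why systems such as RadBytes lean on an LLM to interpret answers whose wording does not exactly match the expected response.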
While AI tutors such as RadBytes offer scalable, consistent feedback for trainees, they are not without limitations. RadBytes does not interpret images directly; instead, it uses a large language model (LLM), named Cubey, to analyse trainee-submitted reports or questions and provides feedback based on expert verified interpretations and a structured repository of radiological knowledge. Clinical history and supplementary information are provided to both the trainee and the AI, supporting context-aware feedback. However, the AI cannot directly point to or highlight pathology on the image itself, limiting its ability to visually guide trainees. Its feedback is also less adaptable than that of a human mentor, as it cannot interpret tone, assess confidence, or tailor its teaching to a trainee’s learning style or level of experience. Furthermore, disparities in access to such tools across deaneries could lead to inconsistent development of AI literacy among trainees. Such tools are best positioned to complement rather than replace critical thinking, supervision, and clinical judgement.
“On-call” work
At a local level, some schemes require trainees to pass an on-call assessment before being allowed to provisionally authorize reports out of hours, ensuring they can identify urgent pathologies. However, with AI algorithms now capable of detecting stroke and intracranial haemorrhage, trainees may not encounter a sufficient caseload free from AI influence, potentially affecting their independent learning. This is particularly relevant as radiologists who use AI must also be aware of the limitations of the software they use: a UK study evaluating the accuracy of AI software in detecting abnormalities on CT angiograms for stroke patients found an error rate of 9%, which could clearly have significant impacts on patients.28
Again, in these situations—should trainees be prohibited from using AI until they prove their competence, or should training evolve to reflect the integrated AI workflows of the future?
Alternatively, the assistance of AI as a double reader may enable trainees to take on on-call duties earlier in their training, alleviating the out-of-hours burden on more senior staff in the department and easing rota issues. It may also introduce trainees to the demands of prioritizing urgent cases and dealing with referrals sooner in their careers, building their communication and interdisciplinary skills. AI would therefore play a powerful role in early education and confidence building.
Looking at translatable sectors where succeeding in a short space of time is crucial, similar models of work have been adopted. One such concept is "Lean Methodology", in which commercial ventures follow an iterative process of failing fast, learning fast, and subsequently succeeding fast. In healthcare, the opportunity cost has often been deemed too high. However, with the advent of AI imaging reads, this may be something we adapt to out of necessity, once risk assessments and mitigation strategies show the route to be sufficiently robust.
Fellowship examinations
Finally, the most comprehensive assessments in radiology training are the FRCR examinations, comprising three components (part 1, part 2A, and part 2B).
FRCR part 1
Part 1 is an online multiple-choice and image-viewing examination undertaken at ST1 level (the first year of training), consisting of anatomy and physics modules that evaluate understanding of the basic principles of medical physics and anatomy. Currently, it does not evaluate understanding of AI or informatics, but it could be adapted in the future to cover this part of the curriculum, encouraging trainees to understand basic AI principles from the very start of their radiology careers.29
FRCR part 2A
The second part, 2A, is an online single best answer examination of 240 multiple-choice questions. It is predominantly theory-based, covering diseases, imaging findings, clinical management, and next steps across all radiology subspecialty fields.30
FRCR part 2B
The final part of the FRCR, 2B, consists of 3 components: firstly, an oral viva in which candidates are assessed on their interpretation of 12 cases; secondly, a long cases examination in which candidates write reports for 6 cases; and thirdly, as of July 2025, a short cases examination. The short cases component has replaced the previous rapid reporting module and consists of 25 plain radiographs for which candidates are expected to write reports as well as recommend further management.
In comparison, the previous rapid reporting component involved identifying normal and abnormal radiographs with only brief descriptions of the abnormality. This change was made to more accurately reflect current radiological practice, as the candidates will now be provided with relevant clinical information, and it will also allow for more complex radiographs to be included. Although there is no mention of AI as a driving force for the change in the rapid reporting component, the new short cases will now require candidates to recommend further management plans and MDT referrals, which is in line with what would be the role of an AI-assisted radiologist.31 However, the question still remains—if training is in an environment where AI is used on a daily basis, should the AI be available in the exam itself?
Sub-specialist training
Most UK radiologists will be trained in a specific subspecialty, such as musculoskeletal or neuroradiology. Each subspecialty brings its own hurdles to AI integration in training, as some are more disrupted by AI than others. Paediatric radiology, for example, poses a unique challenge for AI training given the relative lack of regulated, available AI tools (owing to the heterogeneity of the population and the lack of large, curated training datasets), and may therefore benefit from tailored, human-focussed education approaches. Breast radiology, particularly breast screening, may be more heavily impacted by AI, given multiple large, multi-centre trials suggesting that double reporting of screening mammograms could become a thing of the past.32,33 This could mean fewer breast radiologists need to be trained to report screening studies, but also that those who do undertake this work will need to be more vigilant about potential AI pitfalls and know when to override them.
Challenges with FRCR for AI literacy assessment
The current FRCR curriculum is designed around structured exam preparation and local teaching programmes, ensuring a standardized approach to radiology training. However, this structured framework leaves limited room for self-directed learning or exploration beyond the syllabus, particularly in emerging areas such as AI, which is increasingly shaping modern radiology workflows.
The Topol Review19 has highlighted the urgent need to upskill the radiology workforce to effectively engage with AI-enabled imaging infrastructure. To bridge this gap, NHS organizations should embrace a multi-professional learning approach, equipping radiologists with digital competencies necessary for AI integration into clinical practice. This could be achieved through a combination of formative and summative assessments, including workplace-based evaluations during clinical placements.
Moreover, revisiting AI concepts at multiple stages of training, using techniques such as spaced repetition,34 could reinforce understanding and retention. This iterative approach would ensure that radiologists are not only exam-ready but also prepared for the evolving demands of AI-assisted clinical practice. These proposed strategies for AI integration into radiology training are summarized in Table 2.8,29–31
Table 2.

| Assessment | Discussion |
|---|---|
| FRCR part 1 | |
| FRCR part 2A | Single best answer questions that examine all aspects of clinical radiology, physics, anatomy, and techniques. In the future, this may change to include questions on critically appraising AI |
| FRCR 2B | |
| On-call assessment | Locally organized examination to assess if trainees can spot common on-call pathologies. At this stage, trainees will be junior so it may be reasonable to ensure they can interpret imaging without AI assistance to build that capability |
| Independent reporting assessment | Local assessment for post-FRCR trainees to decide if they can authorize reports without consultant checking. Potential integration of AI into assessments may be considered |
| Yearly ARCP assessments | |
| Final ARCP assessment | |
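The spaced repetition principle cited earlier can be sketched algorithmically. The snippet below is a deliberately simplified, SM-2-inspired scheduler (the function, the ease factor, and the flashcard scenario are illustrative, not a proposed curriculum tool): a correct answer stretches the review interval, while an incorrect answer resets it to one day.

```python
from datetime import date, timedelta

def next_review(interval_days: int, ease: float, correct: bool):
    """Very simplified spaced-repetition step (loosely SM-2-inspired):
    correct answers stretch the interval by the ease factor,
    incorrect answers reset it to one day."""
    if correct:
        interval_days = max(1, round(interval_days * ease))
    else:
        interval_days = 1
    return interval_days, date.today() + timedelta(days=interval_days)

# A hypothetical AI-concepts flashcard answered correctly three times in a row:
# the review interval grows from 1 day to 2, then 5, then 12 days.
interval = 1
for _ in range(3):
    interval, due = next_review(interval, ease=2.5, correct=True)
```

Growing intervals of this kind are what allow AI concepts to be revisited at multiple stages of training without crowding out clinical content.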
AI teaching and training
Before introducing AI-related assessments into training, it is essential to ensure that trainees have access to sufficient learning resources. The RCR should take the lead in setting standards for education, and along with individual radiology academies and deaneries, provide teaching, guidance, resources, and mentorship to support AI education. This is particularly important given the limited number of AI-experienced radiologists and the uneven access to AI tools across different training regions.
Furthermore, the importance of interdisciplinary collaboration should be emphasized, including involving data scientists, informatics experts, and engineers in curriculum development and identifying joint training opportunities. The RCR could also collaborate with other institutions, such as the European Society of Radiology (ESR) and the Radiological Society of North America (RSNA), to bolster and inform the curriculum, as they too have been developing AI courses. Suggested resources are outlined in Table 3.35,36
Table 3.

| Resource | Discussion | Summary |
|---|---|---|
| Teaching | | |
| Teaching day | The RCR could organize a "travelling professor in AI" training day for each deanery. The basics of AI, machine learning/deep learning can thus be delivered in one day by experienced AI-enabled radiologists | |
| Training days | | |
| Simulation | | |
| Conferences | | |
| Mentoring | | |
| Discrepancy meetings | | |
Training AI-literate consultants
As AI becomes more embedded in radiology, consultants must adapt by integrating validated AI tools into practice. This requires new competencies beyond traditional imaging skills, including AI fundamentals, data curation, algorithm training, radiomics, and statistical literacy to evaluate AI-driven research. Consultants must also understand AI biases, limitations, and data privacy implications.
Beyond diagnostic AI, emerging workflow tools are streamlining vetting, reporting efficiency, and MDT follow-ups. With the rapid evolution of AI, a structured educational approach is essential. One solution is establishing AI fellowships, training radiologists as “AI champions” who guide peers in clinical AI applications. As AI reshapes hospital workflows, a dedicated AI multidisciplinary team may emerge, as outlined in the 2021 NHS AI Lab and Health Education England (HEE) report.5
At a national level, the RCR is developing AI resources, such as an AI registry, and working with policymakers to shape its integration into radiology.37 However, is it realistic for the RCR to spearhead AI integration alone, given competing priorities like workforce planning and education? This raises the possibility of creating a united space for AI advisory boards of all Royal Colleges to gather (potentially enabled by the Academy of Medical Sciences) and share developments and ideas across their subspecialty areas.
Additionally, it is important to consider the training needs of mid- to late-career consultants, which may differ from those of early-career consultants entering the workforce. Having been trained without AI, they may be more hesitant to adopt such technologies. However, failing to upskill this group risks creating a divide within the profession—between AI-assisted radiologists and those relying solely on traditional, human-only methods.
Management
Radiologists routinely engage in clinical governance, audit, and quality improvement (QI), fulfilling ARCP requirements. AI and LLMs can enhance these processes by analysing vast datasets for audits,8 but their use presents transparency, bias, and data confidentiality challenges. AI systems often function as “black boxes”, raising concerns about fairness and accuracy in decision-making.38
To ensure safe and ethical AI use, governance frameworks are emerging.39 For radiology trainees, understanding these frameworks is becoming as essential as traditional QI training. Incorporating AI-related audits and projects into the curriculum would familiarize trainees with AI governance, while consultant revalidation could require AI competency assessments, mirroring existing QI expectations.
AI in pre-specialist training
As AI awareness grows, future radiologists will enter specialty training with greater AI exposure. Medical students today view AI positively and are not widely concerned about job displacement—a study of 293 individuals (211 medical students) found only 15% believed AI would significantly impact the physician job market.40 However, their AI knowledge and exposure remain inconsistent, as there is no national undergraduate AI curriculum.
Some medical schools have introduced AI education, often within ethics modules or self-directed digital health projects. Topics may focus on the challenges of data usage, privacy, and bias when training an AI model and the ethical implications of digital health equity. These examples are rare, however, with 91.4% of students in a study of 325 stating they had not received any training on AI in their medical curriculum.41
While most students have limited real-world AI exposure, many have used LLMs (eg, ChatGPT) for academic work. A small subset may have learned AI fundamentals through A-level computer science or physics, but they remain a minority.
To bridge this gap, integrating basic AI principles into medical education would help future doctors develop a realistic understanding of AI’s capabilities and limitations. However, this should not come at the expense of core clinical skills, ensuring that doctors rely on sound clinical judgement rather than overdependence on AI.
Summary
The unanswered questions can be briefly summarized as follows:
The evolving future role of a radiologist:
What will the future role of a radiologist look like?
How do we build inherent confidence and self-trust to know when to override erroneous AI results in practice and how do we reassure radiologists that they will not be (more severely) punished for errors when working with AI?
What ongoing training and evaluation should be introduced for radiologists (both in training and after training) to account for automation bias and potential deskilling?
Redesigning training pathways:
Should radiology residents in training have access to AI tools from the beginning of their training?
Is there a role for AI in medical education and training? And to what depth?
Should fellowship examinations and assessments be changed to account for the presence of AI tools in routine clinical practice?
Speaking more broadly, should AI training (and even more basic computational sciences like healthcare informatics) be mandated/introduced into the medical undergraduate and pre-specialist training curricula?
Governance, equity, and standards in training:
How can we upskill clinicians to become AI-literate, and what is our definition of AI literacy?
Who should take responsibility and have oversight over training radiologists about AI? This is particularly pertinent where there may be a senior—junior divide in understanding of the digital landscape.
How do we ensure a standardized and consistent level of AI training and experience and skills nationally when the integration and deployment of AI tools is not yet widespread across the NHS?
Conclusion
AI is already having a significant impact on radiology, meaning training programmes and their directors must adapt to prepare future radiologists for this shift. Trainees need to learn how to integrate AI tools into their practice while maintaining core radiology skills: they must be able to identify abnormalities without AI, critically appraise the tools, and have the confidence to challenge the AI when they believe it is wrong. This will require updates to educational resources and assessments, and possibly the introduction of AI-focused fellowships or specialized roles. This thought leadership piece has highlighted some of the challenges, but also the many opportunities, in assessing AI literacy and integrating AI to augment the radiologists of the future.
Glossary of AI-related terms
Artificial intelligence (AI): A field of computer science focused on creating systems capable of tasks that normally require human intelligence, such as image interpretation or decision-making. This can be a broad terminology that covers many different actions and aspects, and therefore specific “subtypes of AI” may sometimes be quoted to be more specific and aid understanding (eg, LLMs, as below).
AI as a medical device (AIaMD): This terminology refers to AI systems that are regulated and approved for clinical use. These must meet safety and performance criteria by regulators and require conformity assessment and certification, such as CE marking (EU) or FDA clearance (USA). Tools that are not regulated for clinical use, should strictly not be deployed for this purpose.
Radiomics: A process of extracting quantitative data from medical images, such as shape or texture features, to support AI model training or clinical decisions.
Automation bias: The tendency to over-rely on AI system outputs, even when they may be incorrect, potentially undermining critical clinical judgement.
AI literacy: The knowledge and skills required to understand, evaluate, and appropriately use AI in clinical practice, including awareness of its limitations and risks.
Large language model (LLM): A type of AI model trained on large text datasets to generate human-like language. Examples include ChatGPT, used increasingly in research writing and communication.
Black box AI: A term used to describe AI systems where the internal logic or reasoning behind outputs is not transparent or easily understood, raising concerns around accountability and explainability particularly when erroneous or unexpected results are produced.
AI registry: A database tracking AI tools used in clinical settings, and potentially capturing their performance data. The purpose of this is to allow other users (or potential users) to know who to reach out to in order to get impartial and unbiased advice from (outside of the vendor), and potentially pool outcome data and comparisons across different populations.
Contributor Information
Girija Agarwal, Department of Clinical Radiology, St Mary's Hospital, Imperial College Healthcare Trust, London W2 1NY, United Kingdom.
Kavish Maroo, University College London Medical School, London, WC1E 6DE, United Kingdom.
Paymon Zomorodian, Department of Clinical Radiology, Mersey and West Lancashire Teaching Hospitals, Prescot, L35 5DR, United Kingdom.
Naman Bhatt, Department of Clinical Radiology, West Hertfordshire Teaching Hospital Trust, Hertfordshire, WD18 0HB, United Kingdom.
Dilan Sanli, Department of Clinical Radiology, University Hospital Southampton NHS Foundation Trust, Southampton, SO16 6YD, United Kingdom.
Akash Sharma, Department of Clinical Radiology, St Mary's Hospital, Imperial College Healthcare Trust, London W2 1NY, United Kingdom.
Susan C Shelmerdine, Department of Clinical Radiology, Great Ormond Street Hospital, London, WC1N 3BH, United Kingdom; UCL Great Ormond Street Institute of Child Health, Great Ormond Street Hospital for Children, London, WC1N 1EH, United Kingdom; NIHR Great Ormond Street Hospital Biomedical Research Centre, London, WC1N 1EH, United Kingdom.
Funding
S.C.S. was funded by a National Institute for Health Research (NIHR) Advanced Fellowship Award (NIHR-301332).
Conflicts of interest
P.Z. is a co-founder of RadBytes.
References
- 1.FDA Adds More than 120 New AI-Enabled Medical Devices Focused on Radiology to List of Approvals. May 14, 2024. Accessed March 8, 2025. https://healthimaging.com/topics/artificial-intelligence/fda-adds-more-120-new-ai-enabled-medical-devices-focused-radiology-list-approvals
- 2.Article 4: AI Literacy | EU Artificial Intelligence Act. Accessed March 4, 2025. https://artificialintelligenceact.eu/article/4/
- 3. NHS Health Education England. Topol Digital Fellowships. The Topol Review - NHS Health Education England. Accessed March 4, 2025. https://topol.hee.nhs.uk/digital-fellowships/
- 4.Fellowships in Clinical Artificial Intelligence. AI Centre. Accessed March 4, 2025. https://www.aicentre.co.uk/fellowship
- 5. NHS AI Lab & Health Education England. Developing Healthcare Workers’ Confidence in Artificial Intelligence (AI) (Part 2). 2022. Accessed March 4, 2025. https://digital-transformation.hee.nhs.uk/binaries/content/assets/digital-transformation/dart-ed/developingconfidenceinai-oct2022.pdf
- 6.GMC Approved Postgraduate Curricula. Accessed March 4, 2025. https://www.gmc-uk.org/education/standards-guidance-and-curricula/curricula
- 7.Clinical Oncology Specialty Training Curriculum. 2021. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/clinical-oncology-curriculum-update-310523-admin-change.pdf
- 8.Clinical Radiology Specialty Training Curriculum. 2021. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/clinical-radiology-curriculum-2021-updated-150523_pdf-101319661.pdf
- 9.UK Foundation Programme Curriculum 2021. 2021. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/fp-curr-oct22-v7_pdf-101343583.pdf
- 10.The RCGP Curriculum Being a General Practitioner. 2025. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/being-a-gp-2025_pdf-110256639.pdf
- 11.Curriculum for Cardiology Training. 2022. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/cardiology-2022-curriculum-final-v1_0_pdf-92049190.pdf
- 12.Curriculum for General Internal Medicine (Internal Medicine Stage 2) Training. 2022. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/gim—internal-medicine—stage-2–2022-curriculum-final-july-2022_pdf-91723907.pdf
- 13.General Surgery Curriculum. 2021. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/general-surgery-curriculum-august-2021-version-2—july-2023_pdf-102057645.pdf
- 14.Dermatology Training Curriculum. 2021. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/dermatology-2021-curriculum-updated-june-2024_pdf-107941933.pdf
- 15.The Royal College of Emergency Medicine Curriculum. 2021. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/rcem-curriculum-2021-master-version-1_5-2023_pdf-102245994.pdf
- 16.General Psychiatry Curriculum. 2022. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/general-psychiatry-curriculum-final-16-june-22_pdf-91452636.pdf
- 17.Curriculum Paediatric Specialty Postgraduate Training. 2023. Accessed March 4, 2025. https://www.gmc-uk.org/-/media/documents/main-curriculum-2023-4_pdf-103290089.pdf
- 18.Training as a Doctor. Health Careers. April 7, 2015. Accessed March 4, 2025. https://www.healthcareers.nhs.uk/explore-roles/doctors/training-doctor
- 19.The Topol Review. 2019. Accessed August 7, 2024. https://topol.hee.nhs.uk/wp-content/uploads/HEE-Topol-Review-2019.pdf
- 20.Knowledge for Healthcare. NHS England | Workforce, training and education. January 20, 2021. Accessed March 4, 2025. https://www.hee.nhs.uk/our-work/knowledge-for-healthcare
- 21. van Timmeren JE, Cester D, Tanadini-Lang S, Alkadhi H, Baessler B. Radiomics in medical imaging—“how-to” guide and critical reflection. Insights Imaging. 2020;11:91. 10.1186/s13244-020-00887-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22. Favorito FM, Collie L, Kennedy T, et al. A survey of perspectives and educational needs of Canadian oncology residents on artificial intelligence. J Cancer Educ. 2025;40:273-279. 10.1007/s13187-024-02509-7 [DOI] [PubMed] [Google Scholar]
- 23. Hashmi OU, Chan N, de Vries CF, Gangi A, Jehanli L, Lip G. Artificial intelligence in radiology: trainees want more. Clin Radiol. 2023;78:e336-e341. 10.1016/j.crad.2022.12.017 [DOI] [PubMed] [Google Scholar]
- 24.Clinical Radiology Artificial Intelligence (AI): Blended learning course | The Royal College of Radiologists. Accessed March 8, 2025. https://www.rcr.ac.uk/cpd-and-events/events-webinars/events/clinical-radiology-artificial-intelligence-ai-blended-learning-course-jan-25/
- 25. Royal College of Radiologists. Integrating Artificial Intelligence with the Radiology Reporting Workflows (RIS and PACS). 2021. Accessed March 4, 2025. https://www.rcr.ac.uk/media/hu2f3pn5/rcr-publications_integrating-artificial-intelligence-with-the-radiology-reporting-workflows-ris-and-pacs-_march-2021.pdf
- 26. Knopp MI, Warm EJ, Weber D, et al. AI-enabled medical education: threads of change, promising futures, and risky realities across four potential future worlds. JMIR Med Educ. 2023;9:e50373. 10.2196/50373 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27. Royal College of Radiologists. Clinical Radiology Workforce Census 2023. 2023. Accessed March 4, 2025. https://www.rcr.ac.uk/media/5befglss/rcr-census-clinical-radiology-workforce-census-2023.pdf
- 28. Merchant F, Choulerton J, James R, Pang CL. Real world clinical experience of using Brainomix e-CTA software in a medium size acute National Health Service Trust. Br J Radiol. 2025;98:592-599. 10.1093/bjr/tqaf019 [DOI] [PubMed] [Google Scholar]
- 29.FRCR Part 1 (Radiology)—CR1—Guidance Notes for Candidates | The Royal College of Radiologists. Accessed March 4, 2025. https://www.rcr.ac.uk/exams-training/rcr-exams/clinical-radiology-exams/frcr-part-1-radiology-cr1/frcr-part-1-radiology-cr1-guidance-notes-for-candidates/
- 30.FRCR Part 2A (Radiology)—CR2A—Guidance Notes for Candidates | The Royal College of Radiologists. Accessed March 4, 2025. https://www.rcr.ac.uk/exams-training/rcr-exams/clinical-radiology-exams/frcr-part-2a-radiology-cr2a/frcr-part-2a-radiology-cr2a-guidance-notes-for-candidates/
- 31.Upcoming Changes to the CR2B | The Royal College of Radiologists. Accessed March 4, 2025. https://www.rcr.ac.uk/exams-training/rcr-exams/clinical-radiology-exams/frcr-part-2b-radiology-cr2b/upcoming-changes-to-the-cr2b/
- 32.World-Leading AI Trial to Tackle Breast Cancer Launched | NIHR. August 4, 2025. Accessed August 6, 2025. https://www.nihr.ac.uk/news/world-leading-ai-trial-tackle-breast-cancer-launched
- 33. Lång K, Josefsson V, Larsson AM, et al. Artificial intelligence-supported screen reading versus standard double reading in the Mammography Screening with Artificial Intelligence trial (MASAI): a clinical safety analysis of a randomised, controlled, non-inferiority, single-blinded, screening accuracy study. Lancet Oncol. 2023;24:936-944. 10.1016/S1470-2045(23)00298-X [DOI] [PubMed] [Google Scholar]
- 34. Brauer DG, Ferguson KJ. The integrated curriculum in medical education: AMEE Guide No. 96. Med Teach. 2015;37:312-322. 10.3109/0142159X.2014.970998 [DOI] [PubMed] [Google Scholar]
- 35. Singla R, Pupic N, Ghaffarizadeh SA, et al. Developing a Canadian artificial intelligence medical curriculum using a Delphi study. NPJ Digit Med. 2024;7:323. 10.1038/s41746-024-01307-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.BiteLabs | HealthTech & Pharma Leadership Fellowships | Global & Hybrid Training Programs. Accessed March 8, 2025. https://www.bitelabs.io/
- 37.Embracing AI to Support the NHS in Delivering Early Diagnoses. 2024. Accessed March 4, 2025. https://www.rcr.ac.uk/media/4lhlimdh/rcr-policy-briefing-embracing-ai-to-support-the-nhs-in-delivering-early-diagnoses.pdf
- 38.A Guide to Good Practice for Digital and Data-Driven Health Technologies. GOV.UK. Accessed August 7, 2024. https://www.gov.uk/government/publications/code-of-conduct-for-data-driven-health-and-care-technology/initial-code-of-conduct-for-data-driven-health-and-care-technology
- 39. NHSx. Artificial Intelligence: How to Get It Right. 2019. Accessed March 4, 2025. https://transform.england.nhs.uk/media/documents/NHSX_AI_report.pdf
- 40. AlZaabi A, AlMaskari S, AalAbdulsalam A. Are physicians and medical students ready for artificial intelligence applications in healthcare? Digit Health. 2023;9:20552076231152167. 10.1177/20552076231152167 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41. Jackson P, Ponath Sukumaran G, Babu C, et al. Artificial intelligence in medical education—perception among medical students. BMC Med Educ. 2024;24:804. 10.1186/s12909-024-05760-0 [DOI] [PMC free article] [PubMed] [Google Scholar]