Abstract
Purpose
Artificial intelligence (AI) is about to touch every aspect of radiation therapy, from consultation to treatment planning, quality assurance, therapy delivery, and outcomes modeling. There is an urgent need to train radiation oncologists and medical physicists in data science to help shepherd AI solutions into clinical practice. Poorly trained personnel may do more harm than good when attempting to apply rapidly developing and complex technologies. As the amount of AI research expands in our field, the radiation oncology community needs to discuss how to educate future generations in this area.
Methods and Materials
The National Cancer Institute (NCI) Workshop on AI in Radiation Oncology (Shady Grove, MD, April 4-5, 2019) was the first of 2 data science workshops in radiation oncology hosted by the NCI in 2019. During this workshop, the Training and Education Working Group was formed by volunteers among the invited attendees. Its members represent radiation oncology, medical physics, radiology, computer science, industry, and the NCI.
Results
In this perspective article written by members of the Training and Education Working Group, we provide and discuss action points relevant for future trainees interested in radiation oncology AI: (1) creating AI awareness and responsible conduct; (2) implementing a practical didactic curriculum; (3) creating a publicly available database of training resources; and (4) accelerating learning and funding opportunities.
Conclusion
Together, these action points can facilitate the translation of AI into clinical practice.
Introduction
Artificial intelligence (AI) is a longstanding field of study that has attempted to emulate and augment human intelligence. In the last several years, AI has been reinvigorated by advances in computer technology and machine learning (ML) algorithms, which aim to teach computers to learn patterns and rules from previous examples. ML builds on experiences from computer science, statistics, neuroscience, and control theory, among many other disciplines. ML has benefited from the recent availability of large data sets and from developments in computer hardware and software for solving large-scale optimization problems. Most notably, deep-learning (DL) techniques have demonstrated significant successes in computer vision and language processing. These advances are most visible in consumer quality-of-life improvements such as self-driving cars and voice-activated virtual assistants. The umbrella term “informatics” covers practical applications of any of the aforementioned areas of study; for example, bioinformatics for biology and clinical informatics (or biomedical informatics) for clinical practice. The term “data science” refers to the general study of data analysis, which has recently focused on ML methods. A schematic of the relationships between common terminologies is shown in Figure 1.
Many fields, such as finance, manufacturing, and advertising, have already incorporated AI into their workflows to improve efficiency and perform suprahuman tasks. Although AI has been adopted more slowly in the clinic owing to multiple competing factors—including a lack of training—perceptions of and engagement with AI in medicine have been improving. The American Medical Association adopted a policy in June 2019 to integrate training in augmented intelligence into physician education.1 The National Institutes of Health (NIH) Big Data to Knowledge Initiative has established several Centers of Excellence in Data Science and is focused on enhancing nationwide training infrastructure in biomedical data science and data sharing.2 Radiation oncology holds significant promise for AI-powered tasks—described in several perspectives and reviews3-8—not only for optimizing workflows or diagnosis, but also for more rewarding tasks, such as prognostic prediction and personalized treatment recommendations.
AI applications in radiation oncology span the domains of both medical physicists and radiation oncologists. Some applications, such as autosegmentation and automated treatment planning, will be human-verifiable; in other words, a human can check the work of a computer before deployment. Other applications—survival prognostication, decision support, and genomics-based treatment planning—are not human-verifiable at an individual scale and will thus require careful model development and validation. As applied research in these areas grows in radiation oncology, a commensurate growth in education is necessary to build and validate trustworthy AI models that can be applied in the clinic.
Separate surveys of trainees in radiation oncology and radiology reveal that the majority are interested in additional training in AI or informatics.9,10
In the radiation oncology survey, the American Society for Radiation Oncology (ASTRO) queried chairs and trainees in 2017 to assess their perception of training and research opportunities in genomics, bioinformatics, and immunology.10 Among the 3 areas, bioinformatics received the most enthusiasm: 76% believed that bioinformatics training would “definitely or probably” advance their career. Sixty-seven percent expressed interest in a formal bioinformatics training course, and 88% of chairs reported they would “probably or definitely” send faculty or trainees to such a course, reflecting an unmet need in training opportunities. Although the ASTRO survey did not specifically ask about AI/ML, we believe the high interest in bioinformatics accurately reflects interest in quantitative analysis in line with AI/ML methods.11 In recognition of the need for radiation oncologists with specialized informatics training, the National Cancer Institute (NCI), Oregon Health & Science University, and MD Anderson Cancer Center have each created training programs for radiation oncology fellows and residents aimed at careers as medical directors in informatics or at formal board certification in clinical informatics.12
The radiology survey supports a sentiment toward AI similar to that in radiation oncology.13 A single-institution survey of a radiology department revealed concerns about job security but also enthusiasm to learn about AI/ML.9 In this survey, 97% of trainees (residents and fellows) were planning to learn AI/ML as relevant to their job (vs 77% of attending radiologists). In fact, 74% of trainees (vs 60% of attendings) were willing to help create or train an ML algorithm to do some of the tasks of a radiologist. National radiology societies have been responsive to these sentiments. The American College of Radiology (ACR) Data Science Institute (https://www.acrdsi.org/) recently launched the ACR AI-LAB to allow radiologists to create, validate, and use models for their specific local clinical needs. The Radiological Society of North America (RSNA) and the Society for Imaging Informatics in Medicine (SIIM) cosponsor the National Imaging Informatics Course, now in its third year and held twice a year; the majority of residency programs have participated.14 The RSNA annual meeting hosts several AI refresher courses and coding challenges in the new “AI Pavilion,” with residents encouraged to participate. SIIM hosts monthly journal clubs and promotes mentoring opportunities for trainees.15
In this perspective article written by the Training and Education Working Group of the NCI Workshop on AI in Radiation Oncology (Shady Grove, MD, April 4-5, 2019),16 we propose an overall action plan for radiation therapy–specific AI training that is composed of the action points outlined in Table 1. We cover each action point (AP) in detail in this article.
Table 1. Action points (APs) for AI training in radiation oncology proposed by the Training and Education Working Group

AP | Description
---|---
AP1 | Create awareness and responsible conduct of AI
AP2 | Implement practical didactic curriculum
AP3 | Create publicly accessible resources
AP4 | Accelerate learning and funding opportunities

Abbreviations: AACR = American Association for Cancer Research; AI = artificial intelligence; ESTRO = European Society for Radiotherapy and Oncology.
AP1: Create Awareness and Responsible Conduct of AI
In the last decade or so, we have seen several examples of ethical concerns and biases magnified by AI. When there are biases in the training data (eg, certain populations or scenarios are overrepresented), an algorithm that models correlations could propagate or even amplify these biases, leading to undesirable outcomes in deployment.17 This is particularly problematic because AI is sometimes viewed as being “objective” without consideration for the data generation process, which is often unknown.
The European Union has recently released a 7-point action plan toward so-called “trustworthy” AI. This plan focuses on the ethical aspects of AI and includes human agency and oversight; robustness and safety; privacy and data governance; transparency; diversity, nondiscrimination, and fairness; societal and environmental well-being; and accountability.18 The Food and Drug Administration (FDA) has taken similar steps toward regulation of AI applications in medicine.19 A key component of improving awareness is to be transparent and to clearly document where and when an AI algorithm is used in any part of the clinical workflow. In cases where AI is applied, researchers and physicians should also clarify whether the AI is an ML system—the more recent type of AI, trained on large data sets and often less interpretable—or an older rule-based system, because the two behave differently. For example, neural networks—a type of ML system—are vulnerable to adversarial attacks, in which tiny, deliberately crafted input perturbations change a model’s output.20,21
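To make this concrete, the sketch below demonstrates the fast gradient sign method described by Goodfellow et al20 on a toy logistic model; the weights and inputs are synthetic, and this is a minimal illustration rather than an attack on any real clinical system:

```python
import numpy as np

# Minimal illustration of the fast gradient sign method (FGSM; ref 20)
# on a logistic model: a small, targeted input perturbation can change
# the predicted class. All weights and inputs here are synthetic.

rng = np.random.default_rng(0)
w = rng.normal(size=16)           # fixed "trained" weights (illustrative)
b = 0.1
x = rng.normal(size=16)           # a single input feature vector

def predict_proba(x):
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

# For logistic regression with log loss, the gradient of the loss
# with respect to the input x is (p - y) * w.
y = 1.0
grad_x = (predict_proba(x) - y) * w

eps = 0.25                         # perturbation budget
x_adv = x + eps * np.sign(grad_x)  # FGSM step: move along the gradient sign

print(f"clean prediction:       {predict_proba(x):.3f}")
print(f"adversarial prediction: {predict_proba(x_adv):.3f}")
```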
AP2: Implement Practical Didactic Curriculum
There are currently no educational guidelines for AI training for radiation oncology or medical physics residents. Serendipitously, there is an active discussion within the field about revising the radiation oncology resident training curriculum. Although an in-depth discussion of all the factors at play is beyond the scope of this article, we refer readers to a pair of editorials by Lee and Amdur22 and Wallner et al.23 In July 2020, the Accreditation Council for Graduate Medical Education made several changes to the radiation oncology residency curriculum.24 The revisions are notable for mandating education in several new areas, including clinical informatics. We are pleased that the Accreditation Council for Graduate Medical Education had the foresight to update the training curriculum to include informatics, and we hope that this paper can provide high-level guidance.
In action point 2, we propose a high-level draft curriculum to help trainees in medical physics and radiation oncology adequately grasp the basic principles of AI. These principles are generalizable to medicine as a whole and have particular significance for interventional and informatics-heavy specialties such as radiation oncology. The draft curriculum comprises 5 components:
1. Responsible conduct of AI, bias, and disparities
2. Methodology: Data science basics
3. Interpreting data and models
4. Practical experience and applications
5. Data sharing: Logistics and culture
Responsible conduct of AI, bias, and disparities
There is increasing concern that AI models influenced by bias will further perpetuate health care disparities for patients. The underlying reason for bias retained in AI models is often training data that fail to represent the entire population equally. Because AI algorithms have no inherent concept of “fairness,” surveillance for bias is typically left to those who designed the system. As noted by the European Union and FDA in action point 1, proper application of AI should promote positive social change and enhance sustainability and ecological responsibility. Particularly in medicine, rules and regulations should be put in place to ensure responsibility and accountability of AI systems, their users, and their appropriate use. In the computer science and ML communities, there have been increasing efforts to improve the teaching of ethics and human-centered AI in coursework (https://stanfordcs181.github.io/).25 A complementary area of work is developing methods to audit AI systems for potential systematic or cultural biases; a minimal sketch of one such audit follows. Trainees must develop an appreciation for these critical complexities and potential limitations of AI.
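As an illustration of such an audit (our sketch, on synthetic data, not a method prescribed by the source), the snippet below compares a hypothetical model’s sensitivity across two patient subgroups; a large gap between groups is one signal of the kind of bias described above:

```python
import numpy as np

# Sketch of a simple fairness audit: compare a model's true-positive
# rate (sensitivity) across patient subgroups. A large gap can flag
# bias inherited from unrepresentative training data. Data are synthetic.

rng = np.random.default_rng(42)
n = 1000
group = rng.integers(0, 2, size=n)       # 0/1 subgroup label
y_true = rng.integers(0, 2, size=n)      # true outcome
# Hypothetical model that is deliberately less accurate for group 1:
p_correct = np.where(group == 0, 0.9, 0.7)
y_pred = np.where(rng.random(n) < p_correct, y_true, 1 - y_true)

for g in (0, 1):
    mask = (group == g) & (y_true == 1)
    tpr = (y_pred[mask] == 1).mean()
    print(f"group {g}: sensitivity = {tpr:.2f} (n positives = {mask.sum()})")
```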
Methodology: Data science basics
Data features, structures, and algorithms form the foundation of AI applications. Unfortunately, quantitative analysis and critical data appraisal are not universally emphasized in medical or postgraduate education, particularly for physicians. As more ML techniques are published in general medical or oncology journals, it is incumbent upon editors and readers alike to have some basic facility with the techniques. A working knowledge of basic statistical concepts such as hypothesis testing, confidence intervals, and basic performance metrics will need to be established before more advanced, model-agnostic techniques can be taught: data cleaning, cross validation, model fitting, the bias–variance tradeoff, and advanced performance metrics, including the widely used but poorly understood receiver operating characteristic (ROC) curve.26 To demystify many of these topics, there are existing high-quality online courses that are broadly available, discussed further in action points 3 and 4. Two of these staples, cross validation and the area under the ROC curve, are illustrated in the sketch below.
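This is a minimal sketch, assuming Python with the common scikit-learn library (the curriculum itself does not prescribe a toolkit), that estimates a cross-validated area under the ROC curve for a simple classifier on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Two curriculum staples: k-fold cross validation and the area under
# the ROC curve (AUC) as a performance metric. Synthetic data stand in
# for, eg, pretreatment features vs a binary toxicity outcome.

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
model = LogisticRegression(max_iter=1000)

# 5-fold cross validation estimates out-of-sample AUC and guards
# against the optimism of evaluating on the training data itself.
aucs = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {aucs.mean():.3f} +/- {aucs.std():.3f}")
```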
Interpreting data and models
For proper clinical application of AI tools, physicians should be able to assess the validity of the data and the model-generation process. So-called “black-box” models have such internal complexity that they are conceptualized as inputs mapped to outputs without any attempt to understand how the mapping occurs. Several ML methods, including DL and most ensemble methods, fall into this category. Although black-box AI models can perform excellently during training and internal validation, they often encounter problems generalizing when widely deployed. Understanding why a problem occurred can be difficult with black-box models and is currently a very active area of AI research.21,27
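One model-agnostic probe that can partially open a black box is permutation importance: shuffle one input feature at a time and measure the resulting drop in performance. The sketch below is our illustration (assuming Python with scikit-learn), with a random forest standing in for the black-box model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Permutation importance: shuffling one feature at a time and measuring
# the performance drop indicates how much the model depends on it,
# without requiring access to the model's internals.

X, y = make_classification(n_samples=500, n_features=8,
                           n_informative=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

black_box = RandomForestClassifier(n_estimators=200, random_state=0)
black_box.fit(X_tr, y_tr)

result = permutation_importance(black_box, X_te, y_te,
                                n_repeats=10, random_state=0)
for i in np.argsort(result.importances_mean)[::-1][:3]:
    print(f"feature {i}: importance = {result.importances_mean[i]:.3f}")
```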
One way to demonstrate data and model interpretability is through “use cases.” In medical research, there are well-known examples of the potential dangers of black-box models related to confounding, such as a DL model for detecting pneumonia on chest radiographs that keyed on hospital-specific image features rather than on markers of disease.28 Fortunately, researchers were able to catch these issues before deploying their models, which may not always happen for complex data sets with nonobvious confounders.
There is ongoing discussion of the necessity of AI interpretability by the FDA19,29 and the informatics community.30 Regardless of the final regulatory plans, we agree that elevating the knowledge base of clinicians and physicists will enable more innovation.
Practical experience and applications
For trainees interested in applying data science to clinical practice, opportunities for hands-on experience should be encouraged and promoted.
Although medical physics and radiation oncology AI curricula could have significant overlap, each will necessarily focus on separate domains. In medical physics, instruction may cover methods for autosegmentation, automated and adaptive treatment planning, and quality assurance. Radiation oncology trainees may be more interested in prognostic prediction and clinical decision support. In the future, as AI takes on more of an augmented intelligence role, physicians should receive instruction on how to decide whether to accept, interrogate, or reject recommendations. For example, physicians may need to determine whether there is sufficient rationale to accept an automatically generated plan or treatment recommendation on the basis of clinical and dosimetric information, as in the sketch below.
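This is a minimal sketch of what such a rationale check might look like; the structure names, dose metrics, and limits are hypothetical illustrations, not clinical guidance:

```python
# Sketch of an automated sanity check before accepting an auto-generated
# plan: compare achieved dose metrics against departmental constraints.
# The structures, metrics, and limits below are illustrative only and
# are NOT clinical guidance.

constraints = {                  # hypothetical departmental limits (Gy)
    "SpinalCord_Dmax": 45.0,
    "Parotid_L_Dmean": 26.0,
}
achieved = {                     # metrics reported by a hypothetical TPS
    "SpinalCord_Dmax": 43.2,
    "Parotid_L_Dmean": 28.9,
}

def review_plan(achieved, constraints):
    """Return the list of violated constraints; empty means no violations."""
    return [name for name, limit in constraints.items()
            if achieved.get(name, float("inf")) > limit]

violations = review_plan(achieved, constraints)
if violations:
    print("Interrogate or reject plan; violated:", violations)
else:
    print("Constraints met; plan may be accepted after clinical review.")
```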
Several radiation oncology departments have AI and ML researchers who could contribute to a training curriculum. These courses should be taught jointly to physicists and physicians. We anticipate that common courses and collaboration between trainees in medical physics and radiation oncology will improve translation of AI methods into the clinic. Given that medical physicists already have quantitative training in methods that overlap substantially with ML, we anticipate close collaboration between physicists and physicians. Indeed, this is the current status quo in most radiation oncology departments performing AI and ML research.
For departments without access to sufficient resources, online education using so-called MOOCs (“massive open online courses,” a misnomer, as they are not necessarily massive or open) and workshop models (see action point 4) may be more educational to trainees than co-opting faculty without training in AI and ML.
For advanced practitioners, we discuss data science hackathons and crowdsourcing in action point 3.
Data sharing: Logistics and culture
One of the key aspects of creating robust predictive models is being able to show generalization to novel data sets through a process called external validation, which requires institutions to share data among themselves. The data-sharing culture in medicine has been historically tribalistic but has gradually become more collaborative. This dynamic was well exemplified by the backlash to an infamous 2016 editorial (coauthored by the then editor in chief of The New England Journal of Medicine) that was viewed as anti–data sharing.31,32 Unlike academic medicine, academic AI research has a strong open-access culture in which preprint archiving of publications is the norm and data sets are published simultaneously with papers to invite validation. Notably, patients are generally supportive of the sharing of their data and would likely embrace scientific reuse of their data to improve the lives of future patients,33 although we recognize that there are many regulatory limitations to widespread data sharing of this sort. Finding a path for controlled data sharing among trusted parties, or more broadly with deidentification schemes, could be an important first step in improving the accuracy of AI algorithms.
In this curriculum, we hope to emphasize the efforts in medicine and oncology to promote data sharing (Table 2). The NCI is keen on improving data-sharing protocols and resources. In 2018, the NCI Office of Data Sharing was created, and the NIH has recently asked for open comments on a draft policy for data management and sharing.34 The Cancer Imaging Archive is funded by the NCI and allows the sharing of anonymized imaging data sets and corresponding clinical and genomic data. Radiology initiatives include the ACR Imaging Network, which supports clinical trial protocol and data set sharing through TRIAD (Transfer of Images and Data). The American Society of Clinical Oncology (ASCO) provides another strong example of centralized data sharing in the CancerLinQ project (https://cancerlinq.org/). Radiation oncology currently lacks a specialty-specific centralized platform for data request and sharing. A notable effort toward this goal is American Association of Physicists in Medicine (AAPM) Task Group 263, which standardizes nomenclature using a structured ontology for data pooling35; a toy illustration of such nomenclature mapping follows Table 2.
Table 2. Examples of efforts in medicine and oncology to promote data sharing

Entity | Established | Area
---|---|---
The Cancer Genome Atlas | 2005 | Tumor genomics
ACR Imaging Network/TRIAD | 2009 | Clinical trial protocols, data sets, cloud-based data transfer
Radiogenomics Consortium | 2009 | Radiation therapy genomics and genetics
The Cancer Imaging Archive | 2010 | DICOM, radiomics; select data sets with genomics, histopathology
ASCO CancerLinQ | 2014 | Quality improvement; plan for decision support
Project Data Sphere | 2014 | Phase 3 cancer clinical trial patient-level data
ACR Data Science Institute | 2017 | Use cases for development of medical imaging AI
NCI Office of Data Sharing | 2018 | Advocacy, establishing standards, defining incentives

Abbreviations: ACR = American College of Radiology; AI = artificial intelligence; ASCO = American Society of Clinical Oncology; DICOM = digital imaging and communications in medicine; NCI = National Cancer Institute; TRIAD = Transfer of Images and Data.
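To illustrate why standardized nomenclature matters for pooling, the sketch below maps institution-specific structure names onto a shared vocabulary; the synonym table here is a hypothetical illustration, not the actual TG-263 specification:

```python
# Why standardized nomenclature (eg, AAPM TG-263) matters for data
# pooling: institution-specific structure names must be mapped to a
# shared vocabulary before data sets can be combined. The synonym
# table below is hypothetical, not the TG-263 standard itself.

SYNONYMS = {
    "cord": "SpinalCord",
    "spinal cord": "SpinalCord",
    "lt parotid": "Parotid_L",
    "parotid left": "Parotid_L",
}

def normalize_structure(name: str) -> str:
    """Map a local structure name to a standard one (identity if unknown)."""
    return SYNONYMS.get(name.strip().lower(), name)

local_names = ["Cord", "LT PAROTID", "GTV_primary"]
print([normalize_structure(n) for n in local_names])
# ['SpinalCord', 'Parotid_L', 'GTV_primary']
```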
One approach to overcoming the medicolegal and protected health information barriers to data transfer is distributed or federated learning. In this approach, analysis is performed locally and model parameters (eg, feature weights) are transferred instead of data; this decentralized approach has shown performance equivalent to that of central pooling of data.36,37 Such innovative privacy-preserving approaches could be part of a training curriculum to help overcome barriers to data sharing; a minimal sketch of the underlying idea follows.
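The sketch below is our illustration of one round of parameter averaging for a simple linear model on synthetic data; it conveys the core idea but is not the specific method of references 36 and 37:

```python
import numpy as np

# Minimal sketch of federated/distributed learning: each institution
# fits a model on its own data, and only parameter vectors (never
# patient records) are sent to a central server for averaging.

rng = np.random.default_rng(1)
w_true = np.array([0.5, -2.0, 1.0])

def local_least_squares(n):
    """Each site fits a linear model on its private synthetic data."""
    X = rng.normal(size=(n, 3))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    w_local, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w_local, n

site_fits = [local_least_squares(n) for n in (120, 300, 80)]

# The server averages parameters, weighted by each site's sample size;
# no raw data ever leave the sites.
total_n = sum(n for _, n in site_fits)
w_global = sum(w * n for w, n in site_fits) / total_n
print("federated estimate:", np.round(w_global, 3))
```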
Recent years have seen signs of a cultural shift in radiation oncology toward open collaboration and data sharing, along with formalization of key principles in data sharing, namely that data should be FAIR: findable, accessible, interoperable, and reusable. These FAIR guiding principles for scientific data management and stewardship38 are of utmost importance and should be discussed with, and endorsed by, all trainees. In line with FAIR, several radiation oncology academic centers and cooperative groups have contributed data sets to the Cancer Imaging Archive.39-41 Open-access journals with a focus on radiation oncology include BMC Radiation Oncology, the radiation oncology section of Frontiers in Oncology, and Advances in Radiation Oncology, which was launched by ASTRO in 2015.
Several coordinating efforts present opportunities to pool ideas and data to promote collaboration, increase power for discovery, and avoid redundancy. Within imaging, these efforts include the aforementioned ACR Data Science Institute for AI in medical imaging, which aims to identify clinically impactful use cases in radiation oncology, such as autosegmentation and magnetic resonance imaging–derived synthetic computed tomography scans.42 Within genomics, the Radiogenomics Consortium is a transatlantic cooperative effort pooling American and European cohorts to find genomic markers of toxicity after radiation therapy.43 Several groups within the consortium are interested in creating ML models to predict toxicity response in radiation therapy.44-47
Through the proposed draft curriculum of action point 2, we hope to build a core of next-generation trainees who can understand and apply data science fundamentals while also appreciating ethical considerations and data-sharing principles.
AP3: Create Publicly Accessible Resources
Building directly on the curriculum discussed in action point 2, the third AP relates to the creation of a centralized, publicly accessible repository of resources to guide trainees. These resources could include seminal white papers, video lectures, code samples, and contacts for potential collaborations. To reach the widest potential audience, we favor storage on open-access websites such as GitHub and YouTube.
As discussed in action point 2, a formal curriculum can be facilitated and standardized using MOOCs, which would consist of video lectures and interactive coding exercises. MOOCs have become very influential in online education, as they can be tailored to various experience levels and are self-paced. Owing to economies of scale, they can be widely disseminated at reasonable cost. For example, Coursera MOOCs are on the higher end, charging around $40/mo for classes that last around a month, with about 10 to 12 hours of coursework per week. MOOCs could be adapted from existing courses or centrally created in collaboration with organizations such as the Association of Residents in Radiation Oncology Education Committee and the Radiation Oncology Education Collaborative Study Group.
As interest grows, trainees will likely want to be involved in practical research projects. Given that AI expertise is not evenly distributed, both intra- and interinstitutional collaborations can be fostered. In this respect, trainees can provide a valuable service by annotating data for research. At the same time, they would also benefit from the service of others by receiving annotated data for model and skills development. A model for this can be seen in eContour (https://www.econtour.org), a free web-based contouring atlas. In a randomized trial, eContour improved nasopharynx contours and anatomy knowledge compared with traditional resources.48 A next phase of eContour may involve enabling user-generated contours and segmentations for technology development and research, with a prototype initially pilot-tested at the American College of Radiation Oncology Annual Meeting in 2017.49 Researchers are invited to use the platform to collect contours from large numbers of users from diverse practice locations, though they must provide funding to support website programming and content administration. For residents, funding opportunities are available through professional organizations, as discussed in action point 4.
Another promising venue for trainees interested in skills development is public competitions (Table 3). Past challenges in radiation oncology have leveraged collaborations between academic centers; international societies, such as MICCAI (the Medical Image Computing and Computer Assisted Intervention Society); and commercial sites, such as TopCoder.com (Wipro, Bengaluru, India) and Kaggle.com (Google, San Francisco, CA). These public crowd-sourcing challenges have included 2 radiomics challenges to predict human papillomavirus status and local control in oropharyngeal cancer after radiation therapy50 and a challenge to segment lung tumors.51 Within the medical computer vision domain, MICCAI hosts several challenges every year to help validate and benchmark image processing algorithms52; regular challenges have included melanoma diagnosis, brain tumor segmentation, and prostate cancer Gleason grading. In the data science competition space as a whole, there has been enthusiasm for health care challenges, with the last 3 Kaggle Data Science Bowls (https://datasciencebowl.com/) focused on heart ejection fraction determination (2016), lung cancer screening (2017), and cellular nuclei detection (2018).
Table 3. Examples of public data science competitions in medicine

Platform | Year | Prediction goal
---|---|---
MICCAI Brain Tumor Segmentation (BraTS) benchmark | 2012-2020 | Segment heterogeneous brain tumors (gliomas)
Prostate Cancer DREAM Challenge | 2015 | Predict overall survival and docetaxel discontinuation in mCRPC
Kaggle Data Science Bowl | 2016 | Predict heart ejection fraction
MICCAI radiomics challenges (2) | 2016 | (1) HPV, (2) local control in oropharyngeal cancer
Kaggle Data Science Bowl | 2017 | Detect lung cancer via National Lung Screening Trial DICOMs
TopCoder Lung Cancer Challenge | 2017 | Segment lung cancer
Kaggle Data Science Bowl | 2018 | Detect cellular nuclei

Abbreviations: DICOM = digital imaging and communications in medicine; DREAM = Dialogue for Reverse Engineering Assessments and Methods; HPV = human papillomavirus; MICCAI = Medical Image Computing and Computer Assisted Intervention Society; mCRPC = metastatic castrate resistant prostate cancer.
AP4: Accelerate Learning and Funding Opportunities
Developing and maintaining the resources described in action point 3 will require accelerated learning by particularly motivated trainees, who will need institutional infrastructure and funding mechanisms to be successful.
Although MOOCs provide consistency and quality of education, for accelerated training the radiation oncology community could adopt the intensive weeklong workshop model that is widely used by oncology organizations.11 Examples include the clinical trial development workshops run by ASCO and the American Association for Cancer Research (AACR) (https://vailworkshop.org/) and by the European Cancer Organization, AACR, European Organization for Research and Treatment of Cancer, and European Society for Medical Oncology,53 in addition to the AACR Molecular Biology in Clinical Oncology Workshop, which focuses on molecular biology methods and grantsmanship.54 These intensive workshops are run by established faculty and are aimed at the research career development of senior trainees and junior faculty, with a strong focus on mentorship; this model could be adopted in radiation oncology to mentor trainees in data science. The aforementioned 2017 ASTRO survey10 suggests this workshop model could be well received, with 88% of radiation oncology chairs “probably or definitely” willing to send faculty or trainees to such a course to learn bioinformatics. Radiation oncology–specific workshops (Table 4), such as the Practical Big Data Workshop55 (hosted annually by the University of Michigan, 2017-2019) and 2 ad hoc NIH/NCI workshops in 2019 (the NCI Workshop on AI in Radiation Oncology in April16 and the NCI State of Data Science in Radiation Oncology in July56), have provided forums for data science practitioners to discuss results and ideas but have not yet had a primary focus on education or research mentorship. A joint ASTRO/AAPM AI research workshop planned for June 2020 was canceled owing to the COVID-19 pandemic (https://www.astro.org/Meetings-and-Education/Live-Meetings/2020/Research-Workshop). Other promising avenues for future AI education and workshops include annual society meetings (ASTRO, AAPM, etc) and cooperative group meetings. For example, recent European Organization for Research and Treatment of Cancer57 and NRG Oncology58 meetings have included workshops on AI and digital health and may be able to incorporate educational content.
Table 4. Radiation oncology–specific data science workshops

Workshop | Date | Host
---|---|---
AAPM Practical Big Data Workshop | May 19-20, 2017 | University of Michigan
AAPM Practical Big Data Workshop | May 31-June 2, 2018 | University of Michigan
EORTC State of Science meeting | September 26-27, 2018 | EORTC
NCI Workshop on AI in Radiation Oncology | April 4-5, 2019 | Radiation Research Program, NCI
AAPM Practical Big Data Workshop | June 6-8, 2019 | University of Michigan
NCI State of Data Science in Radiation Oncology | July 25, 2019 | Radiation Oncology Branch, Center for Cancer Research, NCI
Second Annual NRG Radiation Oncology Mini-Symposium: “AI and Machine Learning in Radiation Oncology” | January 10, 2020 | NRG Radiation Oncology Committee, Center for Innovation in Radiation Oncology
NRG Oncology Digital Health Workshop | January 10, 2020 | NRG Oncology

Abbreviations: AAPM = American Association of Physicists in Medicine; AI = artificial intelligence; EORTC = European Organization for Research and Treatment of Cancer; NCI = National Cancer Institute.
Another avenue to gain expertise during radiation oncology residency could be the American Board of Radiology’s B. Leonard Holman Research Pathway. This pathway is an established research fellowship during residency that provides 18 to 21 protected months of research without lengthening clinical training. This protected time could be used to gain expertise in data science through, for example, MOOCs or AI fellowships in collaboration with data science departments or industry. Several companies, including Google, Microsoft, NVIDIA, and Facebook, offer 1-year AI “residencies” for specific areas such as deep learning. In late 2019, ASTRO and Varian Medical Systems announced a joint 1-year research fellowship, starting in July 2020, for eligible residents59; research topics include AI, information systems, and related areas.
There are several grant opportunities for residents and fellows that could be used toward AI and ML research or education. These include 1-year grants of $25,000 to $50,000 from ASTRO (for physicians and physicists), RSNA, and ASCO. Additional funding opportunities not specific to trainees are available from the Radiation Oncology Institute and radiation therapy companies. For new faculty, NIH K08/K23 awards can provide mentored research time and salary support. NIH R25 grants for developing informatics tools for cancer are another promising avenue for multiyear funding. RSNA offers several grants for education60 and had a specific focus on developing AI education tools in its 2020 Education Innovation Grant.
Conclusions
AI is becoming a transformative force in medicine, but there are dangers in blindly trusting trained models and raw data without understanding their governance. Just as radiation oncology trainees must understand the requisite radiobiology and physics to treat patients, we believe that some level of competency in AI is necessary to use it safely and effectively in the clinical setting. In this perspective article from the 2019 NCI Workshop on AI in Radiation Oncology Training and Education Working Group, we have discussed AI awareness and proper conduct (AP1), what an AI curriculum might include (AP2), how to create and contribute to educational resources (AP3), and what support from institutions, societies, and funding agencies is required (AP4). We hope that this paper will spark further discussion on the future of trainee education in radiation oncology. One concrete path forward could be for radiation oncologists and medical physicists to collaboratively apply for fellowships and funding and to develop workshops (AP4) for the creation of data science educational curricula (AP2) and resources (AP3), while remaining mindful of the ethical concerns in AI implementation (AP1).
Acknowledgments
The authors thank Dr Erin Gillespie for information on eContour.org.
Footnotes
Sources of support: Dr Kang is supported by the Radiological Society of North America Research & Education Foundation Resident Research Grant #RR1843. Dr Zou is supported by grants from the Chan Zuckerberg Initiative and the National Institute on Aging (P30AG059307). Dr El Naqa is supported by National Institutes of Health grants R37-CA222215 and R01-CA233487. Dr Aneja is supported by the American Cancer Society, the National Science Foundation, and the Agency for Healthcare Research and Quality.
Disclosures: Dr El Naqa is a scientific advisor for Endectra LLC and Resero AI LLC. All other authors have no disclosures to declare.
The contents do not represent the views of the US Department of Veterans Affairs or the United States government.
References
- 1. American Medical Association. AMA adopt policy, integrate augmented intelligence in physician training. Available at: https://www.ama-assn.org/press-center/press-releases/ama-adopt-policy-integrate-augmented-intelligence-physician-training
- 2. Margolis R., Derr L., Dunn M. The National Institutes of Health's Big Data to Knowledge (BD2K) initiative: Capitalizing on biomedical big data. J Am Med Inform Assoc. 2014;21:957-958. doi: 10.1136/amiajnl-2014-002974.
- 3. Lambin P., van Stiphout R.G., Starmans M.H. Predicting outcomes in radiation oncology—multifactorial decision support systems. Nat Rev Clin Oncol. 2013;10:27-40. doi: 10.1038/nrclinonc.2012.196.
- 4. Kang J., Schwartz R., Flickinger J., Beriwal S. Machine learning approaches for predicting radiation therapy outcomes: A clinician's perspective. Int J Radiat Oncol Biol Phys. 2015;93:1127-1135. doi: 10.1016/j.ijrobp.2015.07.2286.
- 5. Bibault J.E., Giraud P., Burgun A. Big data and machine learning in radiation oncology: State of the art and future prospects. Cancer Lett. 2016;382:110-117. doi: 10.1016/j.canlet.2016.05.033.
- 6. Coates J., Souhami L., El Naqa I. Big data analytics for prostate radiotherapy. Front Oncol. 2016;6:149. doi: 10.3389/fonc.2016.00149.
- 7. Feng M., Valdes G., Dixit N., Solberg T.D. Machine learning in radiation oncology: Opportunities, requirements, and needs. Front Oncol. 2018;8. doi: 10.3389/fonc.2018.00110.
- 8. Thompson R.F., Valdes G., Fuller C.D. Artificial intelligence in radiation oncology: A specialty-wide disruptive transformation? Radiother Oncol. 2018;129:421-426. doi: 10.1016/j.radonc.2018.05.030.
- 9. Collado-Mesa F., Alvarez E., Arheart K. The role of artificial intelligence in diagnostic radiology: A survey at a single radiology residency training program. J Am Coll Radiol. 2018;15:1753-1757. doi: 10.1016/j.jacr.2017.12.021.
- 10. Mouw K.W., Beck T.F., Keen J.C., Dicker A.P. Assessing the training and research environment for genomics, bioinformatics, and immunology in radiation oncology. JCO Clin Cancer Inform. 2018;2:1-9. doi: 10.1200/CCI.18.00045.
- 11. Thompson R.F., Fuller C.D., Berman A.T., Aneja S., Thomas C.R. Career enrichment opportunities at the scientific frontier in radiation oncology. JCO Clin Cancer Inform. 2019;3:1-4. doi: 10.1200/CCI.18.00126.
- 12. Kang J., Thompson R.F., Fuller C.D., Camphausen K.A., Gabriel P.E., Thomas C.R. In regard to Wallner et al. Int J Radiat Oncol Biol Phys. 2020;106:217-218. doi: 10.1016/j.ijrobp.2019.10.044.
- 13. Pakdemirli E. Artificial intelligence in radiology: Friend or foe? Where are we now and where are we heading? Acta Radiol Open. 2019;8:2058460119830222. doi: 10.1177/2058460119830222.
- 14. RSNA/SIIM National Imaging Informatics Curriculum and Course. Available at: https://sites.google.com/view/imaging-informatics-course/
- 15. SIIM Members in Training Community. Available at: https://siim.org/page/mit_community_inter
- 16. National Cancer Institute. NCI convenes workshop on artificial intelligence (AI) in radiation oncology. Available at: https://dctd.cancer.gov/NewsEvents/20190523_ai_in_radiation_oncology.htm
- 17. Zou J., Schiebinger L. AI can be sexist and racist — it's time to make it fair. Nature. 2018;559:324. doi: 10.1038/d41586-018-05707-8.
- 18. Building trust in human-centric AI | FUTURIUM - European Commission. Available at: https://ec.europa.eu/futurium/en/ai-alliance-consultation/guidelines
- 19. Clinical and Patient Decision Support Software Draft Guidance. US Food and Drug Administration; 2017.
- 20. Goodfellow I.J., Shlens J., Szegedy C. Explaining and harnessing adversarial examples. Available at: http://arxiv.org/abs/1412.6572
- 21. Ghorbani A., Abid A., Zou J. Interpretation of neural networks is fragile. Available at: http://arxiv.org/abs/1710.10547
- 22. Lee W.R., Amdur R.J. A call for change in the ABR initial certification examination in radiation oncology. Int J Radiat Oncol Biol Phys. 2019;104:17-20. doi: 10.1016/j.ijrobp.2018.12.046.
- 23. Wallner P.E., Kachnic L.A., Alektiar K.M., Davis B.J., Hardenbergh P.H., Ng A.K. The American Board of Radiology initial certification in radiation oncology: Moving forward through collaboration. Int J Radiat Oncol Biol Phys. 2019;104:21-23. doi: 10.1016/j.ijrobp.2019.01.090.
- 24. ACGME. Radiation oncology program requirements and FAQs. Available at: https://acgme.org/Specialties/Program-Requirements-and-FAQs-and-Applications/pfcatid/22/Radiation%20Oncology
- 25. Stanford News. Stanford University launches the Institute for Human-Centered Artificial Intelligence. Available at: https://news.stanford.edu/2019/03/18/stanford_university_launches_human-centered_ai/
- 26. Hanley J.A., McNeil B.J. The meaning and use of the area under a receiver operating characteristic (ROC) curve. Radiology. 1982;143:29-36. doi: 10.1148/radiology.143.1.7063747.
- 27. Caruana R., Lou Y., Gehrke J., Koch P., Sturm M., Elhadad N. Intelligible models for healthcare: Predicting pneumonia risk and hospital 30-day readmission. In: Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: KDD; 2015. pp. 1721-1730.
- 28. Zech J.R., Badgeley M.A., Liu M., Costa A.B., Titano J.J., Oermann E.K. Variable generalization performance of a deep learning model to detect pneumonia in chest radiographs: A cross-sectional study. PLoS Med. 2018;15. doi: 10.1371/journal.pmed.1002683.
- 29. Evans B., Ossorio P. The challenge of regulating clinical decision support software after 21st Century Cures. Am J Law Med. 2018;44:237-251. doi: 10.1177/0098858818789418.
- 30. AMIA. AMIA urges more work on FDA's decision support guidance. Available at: https://www.amia.org/news-and-publications/press-release/amia-urges-more-work-fda%E2%80%99s-decision-support-guidance
- 31. Frellick M. NEJM editor backtracks on data-sharing "parasites" editorial. Available at: http://www.medscape.com/viewarticle/857756
- 32. Husten L. NEJM editor flip flops on data sharing after social media firestorm. Available at: http://www.cardiobrief.org/2016/01/25/nejm-editor-flip-flops-on-data-sharing-after-social-media-firestorm/
- 33. Patient-led data sharing — a new paradigm for electronic health data. Available at: https://catalyst.nejm.org/patient-led-health-data-paradigm/
- 34. Wolinetz C.D. NIH's DRAFT data management and sharing policy: We need to hear from you! Available at: https://osp.od.nih.gov/2019/11/06/draft-data-management-and-sharing-policy-we-need-to-hear-from-you/
- 35. Mayo C.S., Moran J.M., Bosch W. American Association of Physicists in Medicine Task Group 263: Standardizing nomenclatures in radiation oncology. Int J Radiat Oncol Biol Phys. 2018;100:1057-1066. doi: 10.1016/j.ijrobp.2017.12.013.
- 36. Jochems A., Deist T.M., El Naqa I. Developing and validating a survival prediction model for NSCLC patients through distributed learning across 3 countries. Int J Radiat Oncol Biol Phys. 2017;99:344-352. doi: 10.1016/j.ijrobp.2017.04.021.
- 37. Chang K., Balachandar N., Lam C. Distributed deep learning networks among institutions for medical imaging. J Am Med Inform Assoc. 2018;25:945-954. doi: 10.1093/jamia/ocy017.
- 38. Wilkinson M.D., Dumontier M., Aalbersberg I.J.J. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016;3. doi: 10.1038/sdata.2016.18.
- 39. Aerts H.J.W.L., Rios Velazquez E., Leijenaar R.T.H. Data from NSCLC-Radiomics. 2015.
- 40. Bradley J., Forster K. Data from NSCLC-Cetuximab. 2018.
- 41. Grossberg A.J., Mohamed A.S.R., Elhalawani H. Imaging and clinical data archive for head and neck squamous cell carcinoma patients treated with radiotherapy. Sci Data. 2018;5:180173. doi: 10.1038/sdata.2018.173.
- 42. Thompson R.F., Valdes G., Fuller C.D. Artificial intelligence in radiation oncology imaging. Int J Radiat Oncol Biol Phys. 2018;102:1159-1161. doi: 10.1016/j.ijrobp.2018.05.070.
- 43. West C., Rosenstein B.S. Establishment of a Radiogenomics Consortium. Int J Radiat Oncol Biol Phys. 2010;76:1295-1296. doi: 10.1016/j.ijrobp.2009.12.017.
- 44. Oh J.H., Kerns S., Ostrer H., Powell S.N., Rosenstein B., Deasy J.O. Computational methods using genome-wide association studies to predict radiotherapy complications and to identify correlative molecular processes. Sci Rep. 2017;7:43381. doi: 10.1038/srep43381.
- 45. Kang J., Rancati T., Lee S. Machine learning and radiogenomics: Lessons learned and future directions. Front Oncol. 2018;8:228. doi: 10.3389/fonc.2018.00228.
- 46. Lee S., Kerns S., Ostrer H., Rosenstein B., Deasy J.O., Oh J.H. Machine learning on a genome-wide association study to predict late genitourinary toxicity after prostate radiation therapy. Int J Radiat Oncol Biol Phys. 2018;101:128-135. doi: 10.1016/j.ijrobp.2018.01.054.
- 47. Kang J., Coates J.T., Strawderman R.L., Rosenstein B.S., Kerns S.L. Genomics models in radiotherapy: From mechanistic to machine learning. Available at: https://aapm.onlinelibrary.wiley.com/doi/full/10.1002/mp.13751
- 48. Gillespie E.F., Panjwani N., Golden D.W. Multi-institutional randomized trial testing the utility of an interactive three-dimensional contouring atlas among radiation oncology residents. Int J Radiat Oncol Biol Phys. 2017;98:547-554. doi: 10.1016/j.ijrobp.2016.11.050.
- 49. eContour. ACRO interactive cases. Available at: http://www.econtour.org/acro
- 50. Elhalawani H., Lin T.A., Volpe S. Machine learning applications in head and neck radiation oncology: Lessons from open-source radiomics challenges. Front Oncol. 2018;8:294. doi: 10.3389/fonc.2018.00294.
- 51. Mak R.H., Endres M.G., Paik J.H. Use of crowd innovation to develop an artificial intelligence–based solution for radiation therapy targeting. JAMA Oncol. 2019;5:654-661. doi: 10.1001/jamaoncol.2019.0159.
- 52. MICCAI challenges. Available at: http://www.miccai.org/events/challenges
- 53. Methods in Clinical Cancer Research Workshop. Available at: https://www.esmo.org/meetings/workshops-courses/methods-in-clinical-cancer-research-mccr
- 54. AACR Molecular Biology in Clinical Oncology Workshop. Available at: https://www.aacr.org/meeting/mbco-2020/
- 55. Mackie T.R., Jackson E.F., Giger M. Opportunities and challenges to utilization of quantitative imaging: Report of the AAPM Practical Big Data Workshop. Med Phys. 2018;45:e820-e828. doi: 10.1002/mp.13135.
- 56. Jain A.K., Berny-Lang M. State of Data Science in Radiation Oncology Workshop. Available at: https://events.cancer.gov/ccr/row
- 57. Thomas G., Eisenhauer E., Bristow R.G. The European Organisation for Research and Treatment of Cancer, State of Science in Radiation Oncology and Priorities for Clinical Trials meeting report. Eur J Cancer. 2020;131:76-88. doi: 10.1016/j.ejca.2020.02.050.
- 58. NRG Oncology Semiannual Meeting, January 9-11, 2020; Houston, TX. Available at: https://www.nrgoncology.org/Resources/Meetings/Past-NRG-Oncology-Meetings/January-2020-Semiannual-Meeting-Resources
- 59. ASTRO-Varian Research Training Fellowship. Available at: https://www.astro.org/Patient-Care-and-Research/Research/Funding-Opportunities/ASTRO-Varian-Award
- 60. RSNA Education Grants. Available at: https://www.rsna.org/research/funding-opportunities/education-grants
- 61. Watson Health Perspectives - The 5 V's of big data. Available at: https://www.ibm.com/blogs/watson-health/the-5-vs-of-big-data/