Author manuscript; available in PMC: 2022 Oct 31.
Published in final edited form as: Radiother Oncol. 2018 Jun 12;129(3):421–426. doi: 10.1016/j.radonc.2018.05.030

Artificial intelligence in radiation oncology: A specialty-wide disruptive transformation?

Reid F Thompson a,b,*, Gilmer Valdes c, Clifton D Fuller d, Colin M Carpenter e, Olivier Morin c, Sanjay Aneja f, William D Lindsay g, Hugo JWL Aerts h,i, Barbara Agrimson a, Curtiland Deville Jr j, Seth A Rosenthal k, James B Yu f, Charles R Thomas Jr a
PMCID: PMC9620952  NIHMSID: NIHMS1843540  PMID: 29907338

Abstract

Artificial intelligence (AI) is emerging as a technology with the power to transform established industries, with applications ranging from automated manufacturing and advertising to facial recognition and fully autonomous transportation. Advances in each of these domains have led some to call AI the “fourth” industrial revolution [1]. In healthcare, AI is emerging as both a productive and disruptive force across many disciplines. This is perhaps most evident in Diagnostic Radiology and Pathology, specialties largely built around the processing and complex interpretation of medical images, where the role of AI is increasingly seen as both a boon and a threat. In Radiation Oncology as well, AI seems poised to reshape the specialty in significant ways, though its impact has been relatively limited at present, and may rightly seem more distant to many, given the predominantly interpersonal and complex interventional nature of the specialty. In this overview, we explore in detail the current state and anticipated future impact of AI on Radiation Oncology, focusing on key topics from multiple stakeholder perspectives, as well as the role our specialty may play in helping to shape the future of AI within the larger spectrum of medicine.

Introduction

Artificial Intelligence (AI) and machine learning are terms used to describe a computerized approach to identifying complex mathematical relationships within observational data, one that can evolve as new data are generated. While AI is not a new concept, recent advances in computing power, algorithms, and automated data collection have enabled an explosion in AI capabilities and utilization [1]. This has been facilitated by growth in parallel computing capability through the adoption of architectures such as the Graphics Processing Unit (GPU) and frameworks such as cloud-based computing. These advances in computing have supported widespread adoption of more sophisticated modeling approaches that enable more accurate predictions. In Radiation Oncology, numerous data sources (e.g. electronic medical records and outcomes data; imaging, laboratory, and pathology data; radiotherapy planning data; record-and-verify systems; other instrument data) all provide opportunities for the application of AI methodologies to improve technical capabilities and the overall quality and safety of cancer care delivery. The strong quality assurance and data-driven frameworks already in place in Radiation Oncology provide a strategic and compelling foundation for the ongoing and future development of AI and its integration into patient care and other workflows.

The current state of AI in radiation oncology

The processes of target and normal tissue image segmentation, inverse planning and dose optimization, decision support, and quality assurance are all AI-relevant areas that have attracted recent interest. There have been focused conferences dedicated to the topic, and growing academic and commercial AI research interests and applications (e.g. Google’s DeepMind [2], Siris Medical’s QuickMatch [3], Mirada’s DLCExpert [4], Oncora Medical’s Precision Radiation Oncology Platform [5], and others). There are numerous additional areas of ongoing AI-related research in the field at present, with broad applications [6].

However, there is a significant mismatch between the perceived capabilities of and inflated expectations for machine learning approaches and their actual capabilities and limitations at present [7,8]. Because machine learning algorithms predominantly search for correlations between myriad input and outcomes data, the quality and utility of even the best algorithms are constrained by the well-established limitations of any study relying upon observational data [9]. Furthermore, available Radiation Oncology datasets have generally been smaller and more limited than the datasets other professions may use to tune their predictive algorithms, and the quality and completeness of existing outcomes data remain a key barrier to AI development. While the non-linear modeling used in machine learning algorithms may be well suited to non-linear phenomena in Radiation Oncology (e.g. dose–response curve prediction), inherent confounders, biases, or high levels of noise in any dataset can invisibly degrade the utility of such approaches (i.e. the model may train non-linearly on the confounders or biases themselves). This remains a particularly dangerous issue given the opacity of and difficulty in interpreting many current “black box” machine learning models (e.g. a neural network incorrectly predicted that asthmatic patients have a lower risk of dying from pneumonia) [10].
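
To make the confounding hazard concrete, the following minimal sketch (fully synthetic data; the variable names and effect sizes are illustrative assumptions, not drawn from any study) shows how a model given only a risk factor and an outcome can learn a spuriously protective association when a hidden care-pathway variable drives both, echoing the asthma/pneumonia example of [10].

```python
# Synthetic illustration of confounding: patients with a risk factor receive
# more intensive care, so a model trained only on (risk factor, outcome) pairs
# learns a spuriously *protective* association for the risk factor.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
risk_factor = rng.integers(0, 2, n)                           # e.g. a comorbidity
intensive_care = (risk_factor == 1) | (rng.random(n) < 0.2)   # confounder: triage policy
# True mortality risk rises with the risk factor but falls with intensive care.
p_death = 0.05 + 0.10 * risk_factor - 0.12 * intensive_care
death = rng.random(n) < np.clip(p_death, 0.01, 1.0)

# The model sees only the risk factor, not the care pathway that confounds it.
model = LogisticRegression().fit(risk_factor.reshape(-1, 1), death)
print("Learned coefficient for risk factor:", model.coef_[0][0])  # typically negative
```

Because the triage policy (the confounder) is invisible to the model, the learned coefficient is typically negative even though the true direct effect of the risk factor is harmful.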

Image segmentation

While knowledge-based approaches to automated image segmentation (auto-contouring) have been in development for a number of years [11], emerging approaches are integrating AI methodology such as deep learning (e.g. Mirada’s DLCExpert [4]) and other techniques like decision forests and adversarial neural networks (e.g. Microsoft’s Project “InnerEye” [12,13]). Automated segmentation tools have been adopted by an increasing number of practitioners and have been shown to improve efficiency of structure set generation, particularly for organs at risk (OARs) such as those in the head and neck [14]. However, autocontouring methods differ in performance [15] and have greater difficulty identifying soft tissue structures due to lower tissue contrast [16], and frequently generate inaccurate, incomplete, or unusable information, requiring time-consuming manual post-processing and oversight. These inaccuracies are especially prominent in clinical scenarios where a patient’s anatomical variation (e.g. hypertrophic prostate median lobe) may be absent from or represented sparsely within the training cohort upon which a given model is based.
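
As a rough illustration of the deep learning approaches referenced above (and not a description of any specific commercial tool), the sketch below trains a small fully convolutional network with a Dice loss to label organ-at-risk voxels on 2D slices; the network architecture, tensor sizes, and synthetic images are placeholder assumptions standing in for real, expert-contoured CT data.

```python
# Minimal CNN-based auto-contouring sketch: a tiny encoder producing a
# per-pixel organ-at-risk probability map, trained with a Dice loss.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Small fully convolutional network mapping a CT slice to OAR logits."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 1),  # logits for "inside contour"
        )

    def forward(self, x):
        return self.net(x)

def dice_loss(logits, target, eps=1e-6):
    """1 - Dice overlap between predicted and expert contours."""
    prob = torch.sigmoid(logits)
    inter = (prob * target).sum()
    return 1 - (2 * inter + eps) / (prob.sum() + target.sum() + eps)

# Synthetic stand-ins for CT slices (N, 1, H, W) and binary expert contours.
images = torch.randn(8, 1, 64, 64)
masks = (torch.rand(8, 1, 64, 64) > 0.9).float()

model = TinySegNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(5):
    opt.zero_grad()
    loss = dice_loss(model(images), masks)
    loss.backward()
    opt.step()
```

In practice, performance depends heavily on the size and diversity of the expert-contoured training cohort, which is precisely why rare anatomical variants are handled poorly.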

Radiotherapy dose optimization

Early work in radiotherapy dose optimization leveraged historical data to quantify relationships between dose and proximity of anatomical structures to guide radiotherapy treatment planning [17–19]. More recently, such knowledge-based treatment planning approaches have leveraged tens to hundreds of prior treatment plans to reproducibly improve planning efficiency across multiple disease sites (~80% time savings on average). Plans generated in this manner often meet or exceed adherence to dose constraints compared to manually generated plans in many clinical scenarios (e.g. prostate cancer [20,21], cervical cancer [22], gliomas and meningiomas [23], head & neck cancer [24], spine stereotactic body radiation therapy (SBRT) [25]). This approach has also been applied to post-planning quality assurance of dose volume histogram data [26–28]. Ultimately, data-driven planning is not fully automated at present, as it requires expert oversight and/or intervention to ensure safely deliverable treatment plans. Automated planning may also result in plans with clinically significant suboptimal dose constraint prioritization (e.g. excess dose to spinal cord in nasopharynx cancer plans [29]). Moreover, certain types of treatment plans (e.g. complex 3D and electron plans) are not readily amenable to such approaches at the present time.
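
The following sketch illustrates the general flavor of knowledge-based planning described above (it is not any vendor's algorithm): a regression model is fit to geometric features of prior plans to predict an achievable organ-at-risk dose metric for a new patient. The feature set, model choice, and data are illustrative assumptions only.

```python
# Knowledge-based planning sketch: learn achievable OAR dose from prior plans'
# geometry, then predict a target for a new patient's anatomy.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n_prior_plans = 200
# Features per prior plan: [OAR-PTV overlap fraction, min distance to PTV (cm), OAR volume (cc)]
X = rng.uniform([0.0, 0.0, 10.0], [0.5, 5.0, 300.0], size=(n_prior_plans, 3))
# Achieved OAR mean dose (Gy): grows with overlap, shrinks with distance (+ noise).
y = 40 * X[:, 0] - 2.0 * X[:, 1] + 20 + rng.normal(0, 2, n_prior_plans)

model = GradientBoostingRegressor().fit(X, y)

# Predict an achievable mean dose for a new patient's geometry.
new_patient = np.array([[0.12, 1.5, 80.0]])
print(f"Predicted achievable OAR mean dose: {model.predict(new_patient)[0]:.1f} Gy")
```

Such a prediction can then seed optimization objectives or flag a manually generated plan whose OAR dose is far worse than the model suggests is achievable.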

Clinical decision support and outcomes prediction

Clinical decision support, powered by treatment and patient outcomes data, has emerged as another focus area for AI efforts in Radiation Oncology. The key principle is to leverage existing data sources and models to provide pertinent point-of-care recommendations. In the field of Radiation Oncology, clinical decision support has at least three main applications: (1) The pre-planning prediction of dosimetric tradeoffs to assist physicians, patients and payers alike to make better informed decisions about treatment modality and dose prescription [30–32]. (2) The integration of dosimetric information with orthogonal data (e.g. genomics, diagnostic imaging, electronic medical records) to build accurate outcomes models of Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). Although early research has shown this to be a promising approach, this area of decision support is not yet ready for routine clinical use [3,5,33–37]. (3) Radiomics, a branch of medical imaging analytics that relies upon primary extraction of quantitative imaging features (e.g. texture) to predict various clinical phenomena. For instance, analysis of pre-treatment CT planning data from early stage non-small cell lung cancer patients treated with SBRT can be used to predict clinical outcomes [38], and there are multiple ongoing studies using radiomics to predict risk of pneumonitis prior to initiation of therapy. Despite these early predictive successes, radiomics-based techniques remain largely confined to research domains at present.
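
As a simplified illustration of the radiomics workflow (not the pipeline used in [38]), the sketch below computes a few first-order intensity features from synthetic "tumor" voxel samples and cross-validates a classifier against a binary outcome; the features, cohort, and labels are all hypothetical.

```python
# Simplified radiomics sketch: first-order intensity features + outcome classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def first_order_features(roi_voxels):
    """Mean, spread, median, and histogram entropy of tumor voxel intensities (HU)."""
    hist, _ = np.histogram(roi_voxels, bins=32, density=True)
    hist = hist[hist > 0]
    entropy = -np.sum(hist * np.log2(hist))
    return [roi_voxels.mean(), roi_voxels.std(), np.median(roi_voxels), entropy]

# Synthetic cohort: each "patient" is a bag of tumor voxel intensities plus an outcome label.
outcomes = rng.integers(0, 2, 120)
X = np.array([first_order_features(rng.normal(40 + 10 * o, 15 + 5 * o, 500))
              for o in outcomes])
y = outcomes

clf = LogisticRegression(max_iter=1000)
print("Cross-validated AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```

Real pipelines rely on standardized texture and shape feature sets and carefully validated outcome labels; this toy version only conveys the overall structure of the approach.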

In all cases, the scope and accuracy of AI-based computer-aided decision support remain principally limited by the scarce availability of high-quality curated clinical and outcomes data, as well as the lack of standardized processes and data structures among institutions. To address this limitation, several cooperative-scale research approaches (e.g. EuroCAT [39], LAMBDA-RAD, OncoSpace [40]) and commercial efforts to curate and/or clean medical data (e.g. Flatiron Health [41]) are gaining acceptance. The rapid learning health system paradigm championed by MAASTRO seems to be a particularly viable solution to the many practical challenges of information sharing across institutional and international borders [42], and the emergence of common data models (e.g. OMOP [43]), ontologies (e.g. ROO [44]), and interoperability standards (e.g. HL7-FHIR [45]) has begun to enable such approaches on a limited basis [46].

Quality assurance (QA)

Machine learning is poised to solve multiple long-standing problems in quality assurance and to improve workflow efficiency. For instance, current IMRT QA workflows require detailed assessment of every patient plan without any individualized expectation of the result. AI has been successfully used to predict individualized IMRT QA passing rates [47,48], potentially enabling intelligent resource allocation in favor of plans more likely to fail. Additionally, AI has enabled error prediction in image guidance systems [49] and Linac performance [50], in some cases exceeding the detection rate of current clinical metrics. As machine-level data are increasingly applied toward preventative maintenance [51], these and other AI algorithms may also serve to decrease machine downtime and other technical failures. Importantly, this could help to improve departmental efficiency and staff and patient satisfaction.
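
A minimal sketch of the virtual IMRT QA idea in [47,48] follows, under the assumption of hypothetical plan-complexity features and synthetic passing rates: a regression model predicts each plan's gamma passing rate so that measurement-based QA effort can be triaged toward plans predicted to fail.

```python
# Virtual IMRT QA sketch: regress gamma passing rate on plan complexity metrics.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
n_plans = 300
# Illustrative complexity features: [total MU, mean aperture area (cm^2), modulation index]
X = rng.uniform([300, 2, 0.1], [2500, 40, 1.0], size=(n_plans, 3))
# Synthetic passing rates: more MU and smaller apertures tend to lower them.
y = np.clip(100 - 0.004 * X[:, 0] - 30 / X[:, 1] - 3 * X[:, 2]
            + rng.normal(0, 1.0, n_plans), 85, 100)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Flag new plans whose predicted passing rate falls below an action threshold,
# so they receive measurement-based QA first.
new_plans = rng.uniform([300, 2, 0.1], [2500, 40, 1.0], size=(5, 3))
for i, p in enumerate(model.predict(new_plans)):
    status = "prioritize for measurement" if p < 95.0 else "routine"
    print(f"Plan {i}: predicted gamma passing rate {p:.1f}% -> {status}")
```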

However, these improvements in QA processes (as well as the incorporation of AI into other Radiation Oncology workflows) carry implicit maintenance costs in the form of additional QA demands for the algorithms themselves. At present, the accuracy of population-based machine learning algorithms in medicine tends to decay over time (with an approximately 4-month “half-life”) as the underlying populations and paradigms evolve and render existing models less accurate [52]. The performance of all deployed AI algorithms will therefore need to be verified periodically using an evolving series of tests. The medical physics community has not yet faced this burden; however, the need is already emerging for QA of AI-based processes and other clinically deployed algorithms.
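
Operationally, such algorithm QA could resemble routine machine QA: re-score the deployed model on recent cases at a fixed interval and flag it for recommissioning when performance drifts below an agreed tolerance. The sketch below simulates this with synthetic data; the 12-month horizon, drift rate, and AUC tolerance are illustrative assumptions.

```python
# Periodic performance surveillance sketch for a deployed predictive model.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def monthly_performance(month, drift_per_month=0.08):
    """Simulate a deployed model whose predictions slowly decouple from outcomes."""
    y_true = rng.integers(0, 2, 200)
    scores = y_true + rng.normal(0, 0.8 + drift_per_month * month, 200)
    return roc_auc_score(y_true, scores)

AUC_TOLERANCE = 0.70  # illustrative action level agreed at commissioning
for month in range(12):
    auc = monthly_performance(month)
    flag = "RECALIBRATE / RETRAIN" if auc < AUC_TOLERANCE else "ok"
    print(f"Month {month:2d}: AUC = {auc:.2f}  [{flag}]")
```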

The potential future impact of AI on radiation oncology

The radiation treatment planning process is perhaps the largest and nearest-term beneficiary of advances in AI, chiefly through gains in efficiency. One could envision a system that accurately identifies both normal and target volumes, estimates the optimal modality and beam arrangement from among a range of feasible and clinically appropriate options, and achieves deliverable plans that maximize tumor control probability and minimize risk of toxicity, while approaching the upper limit of technological possibility. Moreover, AI could enable integration of clinically relevant data from multiple sources (e.g. EHR, imaging data) to further tailor the treatment approach. Overall, an ideal system could substantially speed the process, reduce the time burden of human intervention, allow for a shorter interval from simulation to initiation of treatment, and facilitate paradigm shifts such as online adaptive planning [53].

In the future, it is likely that advanced computational techniques will enable such rapid and reliable automation as to reshape resource utilization and staffing levels, training requirements, reimbursements, and patterns of care, as well as other aspects of Radiation Oncology. However, it is also likely that the most successful implementations will require human engagement to utilize AI to maximally augment human skills. Indeed, technological advances are often accompanied by loss of some clinical or technical skills [54] with gain of others. Future trainees and current practitioners alike will need to understand the applications and limitations of multiple AI tools of the future. Current critical skills such as contouring will see their importance fade over time, to be replaced by a fluency with AI and novel human–machine interfaces.

Dosimetry and medical physics

The emergence of computational tools to perform medical dosimetry workflow tasks is likely to dramatically reduce planning time and extend the capabilities of a single dosimetrist. Depending upon growth in patient volumes, the field may see a contraction in overall staffing requirements. Although these factors should reduce costs, enhance plan quality and throughput, and enable frequent adaptive replanning to the benefit of patients [55,56], patient care workflows will become increasingly dependent upon multiple additional computer-based points of failure.

Medical physicists are similarly debating what their role will be as AI reaches maturity [57]. For instance, additional QA responsibilities are likely to include the supervision, deployment and maintenance of AI algorithms that are sure to emerge. These changes potentially loom larger for the Medical Physics profession than the changes that accompanied the introduction of IMRT. The role of the physicist is likely to migrate away from quality assurance of equipment (e.g. simulators and linear accelerators), and more toward quality assurance of patient treatments and the overall treatment process and environment [58]. It is also possible that dosimetrist duties may shift toward treatment plan QA, depending upon departmental logistics and competing responsibilities of medical physics staff.

Clinical patient care

The widely anticipated adoption of AI in healthcare decision-making [59] could broadly improve the quality of care by improving decision-making capacity and reducing the impact of knowledge gaps between domain-specific experts and non-experts, ensuring that any provider can leverage insights from a collective experience larger than any individual could acquire. This is likely to reduce the emphasis on individual providers acquiring such quantifiable knowledge themselves. From a radiation treatment planning perspective, it is likely that clinicians, akin to dosimetrists, will have reduced direct involvement in the planning process, with an analogous shift toward oversight responsibility.

The future adoption of integrative analytics and bona fide rapid learning health systems [42] is likely to increase the demands on clinicians to interface with computers. This could be of great benefit if human computer interfaces allow providers to reduce duplicative or manual efforts. However, cumbersome interfaces could reduce efficiency and stymie progress.

Ultimately, a provider’s primary responsibilities to put the patient first – to support, educate, and treat as relevant – will remain unchanged. However, it is likely that, as AI and computerized processes play a larger role in the Radiation Oncology ecosystem, providers will increasingly be required to have basic AI literacy and to support the computerized systems upon which all depend.

Practical considerations

As AI evolves to surpass human performance in specific tasks, it may become limited by the traditional metrics we have employed, which may in and of themselves be poor surrogates of the medical ground truth. Should we reach an era where AI can statistically outperform any human expert at image segmentation, would it be ethical to have a human make “corrections” to a volume? Would it be medically defensible to select a plan disfavored by an AI-based plan comparison, particularly when the number of constraints could outstrip a human’s ability to assess? Full maturity of AI systems will therefore depend upon earned trust and a level of interpretability that enables a human to understand and accept superior performance.

The growth of AI will also pose novel data security challenges [60,61] as data are increasingly shared across governance structures and stakeholders [62,63]. Foremost among these is the increasing technical difficulty of maintaining privacy in a highly interconnected environment [64]. The increased data security requirements of the European Union’s General Data Protection Regulation (GDPR) [65,66] represent efforts to evaluate and balance these concerns; however, there are significant privacy implications of unintended third-party data reuse that may become more common with AI adoption. Google DeepMind’s embattled partnership with the British National Health Service (NHS) provides a cautionary example of unfettered access to privacy-noncompliant data [67].

The haphazard adoption of IMRT provides a compelling harbinger of how AI could similarly be integrated into clinical practice in a piecemeal fashion, driven in part by skepticism, bias, or early challenges [68]. Early adopters are more likely to face the burden of unwieldy AI implementation, as well as the resistance of later adopters who may be unwilling to invest even a modicum of time or effort to assist with the optimization of AI.

Just as some early IMRT systems had flaws that resulted in patient harm [69,70], early AI algorithms could also cause negative outcomes as they are clinically adopted. Therefore, we will likely see serial prospective technology assessment trials of AI approaches [14]. Nevertheless, AI is sure to undergo several “false starts” on the way to becoming a fully effective, widespread technology.

From vision to reality: How do we get from here to there?

Recognition of the magnitude of the task, the need for collaboration with other disciplines and specialties, and the potentially revolutionary, paradigm-changing nature of the changes to come should all serve as a “call to arms” for Radiation Oncology to actively engage in harnessing the potential advantages of AI for our specialty and for our patients. The question remains: “How?”

The American College of Radiology’s Data Science Institute

Anticipating the growth of AI and its current and future implications, the American College of Radiology (ACR) has launched the Data Science Institute (DSI) to coordinate effort among key stakeholders including clinicians and patients, researchers, industry and regulators [71]. In cooperation with global partners, the DSI is scoped to define clinically meaningful AI use cases, set standards for medical imaging AI interoperability, benchmark and certify algorithms, and broadly anticipate and address the many relevant regulatory, legal, and ethical issues that are likely to arise.

Data availability

Radiation Oncology datasets represent a relatively untapped, but highly valuable source of annotated images and treatment plans for machine learning approaches. In light of the numerous barriers to data sharing (e.g. strict privacy requirements) and the growing necessity to integrate disparate datasets, we must promote and adopt data solutions that enable shared contributions and learning across institutional and international borders [42]. We must avoid data hoarding by embracing a culture committed to data sharing, possibly by incentivizing release of richly curated datasets as a promotable and rewardable academic and scientific activity [72–74]. Just as open-access publication is mandated for publicly funded research, perhaps FAIR-compliant DICOM-RT and matched clinical data publication could be a requirement of any publicly funded radiotherapy trials [75,76]. Akin to the use of digital and physical phantoms as objective references, standardized datasets will also need to be created for evaluation of AI algorithms.

We should continue to embrace and expand data standards and promote seamless interoperability of treatment planning system data with both the electronic medical record and radiology PACS systems. We should leverage and support open-access codebases related to Radiation Oncology data sharing and analysis [77–81], and facilitate wide dissemination of imaging data for novel algorithm development and refinement of best practices [82–84]. To enable the broadest and most impactful dissemination of AI tools, it is important that we facilitate equitable access and distribution, and that algorithms resulting in part or in whole from data generated using public funds be made publicly available to all. Both funding agencies and publishers should explicitly stipulate such policies. All of these efforts will require commitment and infrastructure development [77] by multiple parties.

Data equity

It is critical that algorithms and AI models explicitly account for diverse clinical populations, both to reduce implicit biases, inequities, and disparities and to ensure that these are not codified or exacerbated. Highly effective algorithms developed in one population (e.g. patients at tertiary academic medical centers) may be optimized on demographic groups that are not represented in other countries, regions, or cohorts. This can have severe consequences, as was recently shown for a predictor of hypertrophic cardiomyopathy that misdiagnosed a pathogenic genetic variant in African American patients [85]. We must therefore endeavor to augment training and validation datasets with diverse data that reflect the population(s) in which they could ultimately be implemented. We must also diversify engagement with and development of AI tools in general.
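
One basic, concrete safeguard is to report model performance separately for each relevant subgroup rather than only in aggregate. The sketch below does this on a synthetic cohort in which the minority group's outcome depends on different features; the group labels, prevalences, and in-sample evaluation are illustrative simplifications.

```python
# Subgroup performance audit sketch: per-group AUC reveals gaps hidden by pooled metrics.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 2000
group = rng.choice(["A", "B"], size=n, p=[0.85, 0.15])   # under-represented group "B"
x = rng.normal(size=(n, 3))
# Outcome depends on the features differently in the minority group.
coef = np.where(group[:, None] == "A", [1.0, 0.5, 0.0], [0.0, 0.5, 1.0])
y = (rng.random(n) < 1 / (1 + np.exp(-(x * coef).sum(axis=1)))).astype(int)

model = LogisticRegression().fit(x, y)          # trained on the pooled cohort
scores = model.predict_proba(x)[:, 1]           # in-sample scores, for brevity only
for g in ["A", "B"]:
    mask = group == g
    print(f"Group {g}: n={mask.sum():4d}, AUC={roc_auc_score(y[mask], scores[mask]):.2f}")
```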

Education and training

We should provide opportunities for trainee enrichment in machine learning techniques, data science, and the use and evaluation of AI tools. Residency program curricula and in-service exams will require revision to include structured informatics (clinical and computational) educational modules, just as programs have previously incorporated specialized instruction in radiobiology and physics.

It is also important that we begin developing pathways for interested and qualified individuals to engage more deeply in AI domains. The NCI-FDA Information Exchange and Data Transformation (INFORMED) Fellowship in Oncology Data Science, an innovative oncology fellowship program for qualified radiation oncologists, could be used as a model across institutions [86,87].

Residency organizations could further champion AI training through journal clubs, interest groups, and other extracurricular enrichment (e.g. AI boot camp). Future continuing medical education (CME), webinars, ESTRO School courses, workshops, conferences, and other distributed materials will also need to be developed to facilitate the integration of AI tools into routine practice.

Funding and staffing

We should advocate for expansion of AI-relevant research funding and continue to support and promote dissemination of AI-related research to an international audience (e.g. the ESTRO annual meeting). The success of the NCI/ASTRO/AAPM workshops on Big Data (2015) and Precision Medicine (2016) highlights the power of a community-scale focus on data science, and we should provide such opportunities on an ongoing and international basis.

Leaders of academic radiation oncology programs should be strategic in garnering shared resources from other informatics stakeholders at their home institutions, in order to develop dedicated efforts in computational biology, informatics, and data science within the classic department structure. The scale and scope of AI integration, however, could easily dwarf the capabilities of single institutions, and large-scale cooperative groups should be convened to address this through collective effort. While the pool of radiation oncology-trained physicians with relevant informatics experience and interest remains small, the evolving phenotype of trainees matriculating into our field could meet this demand if cultivated appropriately.

Appropriate strategic collaborative partnerships with the technology industry could help not only to provide resources for critical research, but to maximize the synergy of often mutually exclusive clinical and technical expertise. Any such partnerships should embrace the principles of equity to the greatest extent possible.

It is also imperative that we proactively study the potential ramifications of emerging technologies on patient care, staffing, and financial models so that we may better anticipate workforce supply and demand. Such projections are notoriously inaccurate, and differing observations and assumptions may yield opposing conclusions over time [88,89]. Nonetheless, this quantitative approach is critical for long-term planning efforts, with potentially significant repercussions for workforce balance, patient-provider ratios, and health disparities and outcomes.

Summary/conclusions

While we acknowledge that the future is challenging to predict, there is clear and compelling evidence that AI will change the entire field of medicine over the coming years. Radiation Oncology will be no exception. Just as other technologies (e.g. IMRT) have ushered in transformative changes in the past, AI is sure to affect treatment capacity, treatment capabilities, and safety and QA frameworks, with significant repercussions for patients, providers, and the healthcare system as a whole. The international Radiation Oncology community will need to work together to ensure optimal utilization and coordination of talent, training, investment, and resources in order to maximize the potential of AI. Otherwise, future AI tools are likely to be constrained in scope and value and perhaps could be dictated to our specialty by external stakeholders. The time has come for Radiation Oncology to embrace the potential of AI through education and engagement. Even as it may seem part of a more distant horizon, the opportunity cost of ignoring AI at this juncture is steep.

Footnotes

Conflicts of interest

WDL is the founder and CEO of Oncora Medical. CMC is the founder and CEO of Siris Medical, Inc.

Disclaimer

The contents do not represent the views of the U.S. Department of Veterans Affairs or the United States Government.

References

  • [1].Hoagland J. The fourth industrial revolution is upon us. The Washington Post; 2017. [Google Scholar]
  • [2].DeepMind. Applying machine learning to radiotherapy planning for head & neck cancer. (2016). Available at: https://deepmind.com/blog/applying-machine-learning-radiotherapy-planning-head-neck-cancer/. (Accessed: 1st January 2016)
  • [3].Valdes G et al. Clinical decision support of radiotherapy treatment planning: a data-driven machine learning strategy for patient-specific dosimetric decision making. Radiother Oncol 2017;125:392–7. 10.1016/j.radonc.2017.10.014. [DOI] [PubMed] [Google Scholar]
  • [4].Aljabar P, Gooding MJ. The cutting edge: delineating contours with deep learning. 2017.
  • [5].Oncora Medical. Available at: https://oncoramedical.com/. (Accessed: 1st November 2017)
  • [6].Bibault J-E, Giraud P, Burgun A. Big Data and machine learning in radiation oncology: state of the art and future prospects. Cancer Lett 2016;382:110–7. [DOI] [PubMed] [Google Scholar]
  • [7].Obermeyer Z, Lee TH. Lost in thought—The limits of the human mind and the future of medicine. N Engl J Med 2017;377:1209–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [8].Chen JH, Asch SM. Machine learning and prediction in medicine – beyond the peak of inflated expectations. N Engl J Med 2017;376:2507–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [9].Jepsen P, Johnsen SP, Gillman MW, Sørensen HT. Interpretation of observational studies. Heart 2004;90:956–60. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [10].Cabitza F, Rasoini R, Gensini GF. Unintended consequences of machine learning in medicine. JAMA 2017;318:517–8. [DOI] [PubMed] [Google Scholar]
  • [11].Clark MC et al. Automatic tumor segmentation using knowledge-based techniques. IEEE Trans Med Imaging 1998;17:187–201. [DOI] [PubMed] [Google Scholar]
  • [12].Project InnerEye – Medical Imaging AI to Empower Clinicians. Available at: https://www.microsoft.com/en-us/research/project/medical-image-analysis/. (Accessed: 1st November 2017)
  • [13].Kamnitsas K et al. Unsupervised domain adaptation in brain lesion segmentation with adversarial networks. (2016).
  • [14].Walker GV et al. Prospective randomized double-blind study of atlas-based organ-at-risk autosegmentation-assisted radiation planning in head and neck cancer. Radiother Oncol 2014;112:321–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [15].Ibragimov B, Xing L. Segmentation of organs-at-risks in head and neck CT images using convolutional neural networks. Med Phys 2017;44:547–57. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [16].Delpon G et al. Comparison of automated atlas-based segmentation software for postoperative prostate cancer radiotherapy. Front Oncol 2016;6:178. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [17].Good D et al. A knowledge-based approach to improving and homogenizing intensity modulated radiation therapy planning quality among treatment centers: an example application to prostate cancer planning. Int J Radiat Oncol Biol Phys 2013;87:176–81. [DOI] [PubMed] [Google Scholar]
  • [18].Wu B et al. Using overlap volume histogram and IMRT plan data to guide and automate VMAT planning: a head-and-neck case study. Med Phys 2013;40:21714. [DOI] [PubMed] [Google Scholar]
  • [19].Moore KL, Brame RS, Low DA, Mutic S. Experience-based quality control of clinical intensity-modulated radiotherapy planning. Int J Radiat Oncol Biol Phys 2011;81:545–51. [DOI] [PubMed] [Google Scholar]
  • [20].Nwankwo O, Mekdash H, Sihono DSK, Wenz F, Glatting G. Knowledge-based radiation therapy (KBRT) treatment planning versus planning by experts: validation of a KBRT algorithm for prostate cancer treatment planning. Radiat Oncol 2015;10:111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [21].Schubert C et al. Intercenter validation of a knowledge based model for automated planning of volumetric modulated arc therapy for prostate cancer. The experience of the German RapidPlan Consortium. PLoS One 2017;12: e0178034. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [22].Li N et al. Highly efficient training, refinement, and validation of a knowledge-based planning quality-control system for radiation therapy clinical trials. Int J Radiat Oncol Biol Phys 2017;97:164–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [23].Chatterjee A et al. Performance of knowledge-based radiation therapy planning for the glioblastoma disease site. Int J Radiat Oncol 2017. 10.1016/j.ijrobp.2017.07.012. [DOI] [PubMed] [Google Scholar]
  • [24].Tol JP, Delaney AR, Dahele M, Slotman BJ, Verbakel WFAR. Evaluation of a knowledge-based planning solution for head and neck cancer. Int J Radiat Oncol Biol Phys 2015;91:612–20. [DOI] [PubMed] [Google Scholar]
  • [25].Foy JJ et al. An analysis of knowledge-based planning for stereotactic body radiation therapy of the spine. Pract Radiat Oncol 2017. 10.1016/j.prro.2017.02.007. [DOI] [PubMed] [Google Scholar]
  • [26].Zhu X et al. A planning quality evaluation tool for prostate adaptive IMRT based on machine learning. Med Phys 2011;38:719–26. [DOI] [PubMed] [Google Scholar]
  • [27].Appenzoller LM, Michalski JM, Thorstad WL, Mutic S, Moore KL. Predicting dose-volume histograms for organs-at-risk in IMRT planning. Med Phys 2012;39:7446–61. [DOI] [PubMed] [Google Scholar]
  • [28].Moore KL et al. Quantifying unnecessary normal tissue complication risks due to suboptimal planning: a secondary study of RTOG 0126. Int J Radiat Oncol Biol Phys 2015;92:228–35. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [29].Chang ATY et al. Comparison of planning quality and efficiency between conventional and knowledge-based algorithms in nasopharyngeal cancer patients using intensity modulated radiation therapy. Int J Radiat Oncol Biol Phys 2016;95:981–90. [DOI] [PubMed] [Google Scholar]
  • [30].Cheng Q et al. Development and evaluation of an online three-level proton vs photon decision support prototype for head and neck cancer - Comparison of dose, toxicity and cost-effectiveness. Radiother Oncol 2016;118:281–5. [DOI] [PubMed] [Google Scholar]
  • [31].Langendijk JA et al. Selection of patients for radiotherapy with protons aiming at reduction of side effects: the model-based approach. Radiother Oncol 2013;107:267–73. [DOI] [PubMed] [Google Scholar]
  • [32].Hall DC, Trofimov AV, Winey BA, Liebsch NJ, Paganetti H. Predicting patient-specific dosimetric benefits of proton therapy for skull-base tumors using a geometric knowledge-based method. Int J Radiat Oncol Biol Phys 2017;97:1087–94. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [33].Naqa IE et al. Datamining approaches for modeling tumor control probability. Acta Oncol 2010;49:1363–73. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [34].Naqa IE et al. Multivariable modeling of radiotherapy outcomes, including dose-volume and clinical factors. Int J Radiat Oncol Biol Phys 2006;64:1275–86. [DOI] [PubMed] [Google Scholar]
  • [35].Oberije C et al. A prospective study comparing the predictions of doctors versus models for treatment outcome of lung cancer patients: a step toward individualized care and shared decision making. Radiother Oncol 2014;112:37–43. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [36].Bradley J, Deasy JO, Bentzen S, Naqa IE. Dosimetric correlates for acute esophagitis in patients treated with radiotherapy for lung carcinoma. Int J Radiat Oncol Biol Phys 2004;58:1106–13. [DOI] [PubMed] [Google Scholar]
  • [37].Hope AJ et al. Modeling radiation pneumonitis risk with clinical, dosimetric, and spatial parameters. Int J Radiat Oncol Biol Phys 2006;65:112–24. [DOI] [PubMed] [Google Scholar]
  • [38].Huynh E et al. CT-based radiomic analysis of stereotactic body radiation therapy patients with lung cancer. Radiother Oncol 2016;120:258–66. [DOI] [PubMed] [Google Scholar]
  • [39].Deist TM et al. Infrastructure and distributed learning methodology for privacy-preserving multi-centric rapid learning health care: euroCAT. Clin Transl Radiat Oncol 2017;4:24–31. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [40].OncoSpace. Available at: https://oncospace.radonc.jhmi.edu/. (Accessed: 1st December 2017)
  • [41].Flatiron Health. Available at: https://flatiron.com/. (Accessed: 1st November 2017)
  • [42].Lambin P et al. ‘Rapid Learning health care in oncology’ – an approach towards decision support systems enabling customised radiotherapy. Radiother Oncol 2013;109:159–64. [DOI] [PubMed] [Google Scholar]
  • [43].Reich C, Ryan P, Belenkaya R, Natarajan K, Blacketer C. OMOP Common Data Model v5.2 Specifications. (2017). Available at: https://github.com/OHDSI/CommonDataModel/blob/master/OMOP_CDM_v5_2.pdf.
  • [44].Dekker A. Radiation Oncology Ontology. Available at: http://bioportal.bioontology.org/ontologies/ROO.
  • [45].Phillips M, Halasz L. Radiation oncology needs to adopt a comprehensive standard for data transfer: the case for HL7 FHIR. Int J Radiat Oncol Biol Phys 2017;99:1073–5. [DOI] [PubMed] [Google Scholar]
  • [46].Rosenbloom ST, Carroll RJ, Warner JL, Matheny ME, Denny JC. Representing knowledge consistently across health systems. Yearb Med Inform 2017;26:139–47. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [47].Valdes G et al. A mathematical framework for virtual IMRT QA using machine learning. Med Phys 2016;43:4323. [DOI] [PubMed] [Google Scholar]
  • [48].Valdes G et al. IMRT QA using machine learning: a multi-institutional validation. J Appl Clin Med Phys 2017;18:279–84. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [49].Valdes G et al. Use of TrueBeam developer mode for imaging QA. J Appl Clin Med Phys 2015;16:322–33. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [50].Li Q, Chan MF. Predictive time-series modeling using artificial neural networks for Linac beam symmetry: an empirical study. Ann N Y Acad Sci 2017;1387:84–94. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [51].Hoisak JDP, Pawlicki T, Kim G-Y, Fletcher R, Moore KL. Improving linear accelerator service response with a real- time electronic event reporting system. J Appl Clin Med Phys 2014;15:4807. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [52].Chen JH, Alagappan M, Goldstein MK, Asch SM, Altman RB. Decaying relevance of clinical data towards future decisions in data-driven inpatient clinical order sets. Int J Med Inform 2017;102:71–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [53].Lim-Reinders S, Keller BM, Al-Ward S, Sahgal A, Kim A. Online adaptive radiation therapy. Int J Radiat Oncol Biol Phys 2017;99:994–1003. [DOI] [PubMed] [Google Scholar]
  • [54].Hoff T. Deskilling and adaptation among primary care physicians using two work innovations. Health Care Manage Rev 2011;36:338–48. [DOI] [PubMed] [Google Scholar]
  • [55].Dial C, Weiss E, Siebers JV, Hugo GD. Benefits of adaptive radiation therapy in lung cancer as a function of replanning frequency. Med Phys 2016;43:1787. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [56].Chen AM et al. Clinical outcomes among patients with head and neck cancer treated by intensity-modulated radiotherapy with and without adaptive replanning. Head Neck 2014;36:1541–6. [DOI] [PubMed] [Google Scholar]
  • [57].Tang X, Wang B, Rong Y. Artificial intelligence will reduce the need for clinical medical physicists. J. Appl Clin Med Phys 2018;19:6–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [58].Atwood TF et al. Care for patients, not for charts: a future for clinical medical physics. Int J Radiat Oncol Biol Phys 2018;100:21–2. [DOI] [PubMed] [Google Scholar]
  • [59].Sullivan T. Half of hospitals to adopt artificial intelligence within 5 years. Healthcare IT News; 2017. [Google Scholar]
  • [60].Hill SY. Data sharing: guard the privacy of donors. Nature 2017;548:281. [DOI] [PubMed] [Google Scholar]
  • [61].Kayaalp M. Patient privacy in the era of big data. Balkan Med J 2017. 10.4274/balkanmedj.2017.0966. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [62].Skripcak T et al. Creating a data exchange strategy for radiotherapy research: towards federated databases and anonymised public datasets. Radiother Oncol 2014;113:303–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [63].Jochems A et al. Distributed learning: developing a predictive model based on data from multiple hospitals without data leaving the hospital – a real life proof of concept. Radiother Oncol 2016;121:459–67. [DOI] [PubMed] [Google Scholar]
  • [64].Gutmann A. Data re-identification: prioritize privacy. Science 2013;339:1032. [DOI] [PubMed] [Google Scholar]
  • [65].Rumbold JMM, Pierscionek B. The effect of the general data protection regulation on medical research. J Med Internet Res 2017;19:e47. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [66].European Society of Radiology (ESR). The new EU General Data Protection Regulation: what the radiologist should know. Insights Imaging 2017;8:295–299. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [67].McGoon C. NHS illegally handed Google firm 1.6m patient records, UK data watchdog finds. The Telegraph; 2017. [Google Scholar]
  • [68].Bak K, Dobrow MJ, Hodgson D, Whitton A. Factors affecting the implementation of complex and evolving technologies: multiple case study of intensity-modulated radiation therapy (IMRT) in Ontario Canada. BMC Health Serv Res 2011;11:178. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [69].Rassiah-Szegedi P et al. Monte Carlo characterization of target doses in stereotactic body radiation therapy (SBRT). Acta Oncol 2006;45:989–94. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [70].Latifi K et al. Study of 201 non-small cell lung cancer patients given stereotactic ablative radiation therapy shows local control dependence on dose calculation algorithm. Int J Radiat Oncol Biol Phys 2014;88:1108–13. [DOI] [PubMed] [Google Scholar]
  • [71].ACR Data Science Institute to Guide Artificial Intelligence Use in Medical Imaging. (2017). Available at: https://www.acr.org/About-Us/Media-Center/Press-Releases/2017-Press-Releases/20170521-ACR-Data-Science-Institute-to-Guide-Artificial-Intelligence-Use-in-Medical-Imaging.
  • [72].Kim D. Knowledge sharing as a social dilemma in pharmaceutical innovation. Food Drug Law J 2016;71:673–709. [PubMed] [Google Scholar]
  • [73].Bertagnolli MM et al. Advantages of a truly open-access data-sharing model. N Engl J Med 2017;376:1178–81. [DOI] [PubMed] [Google Scholar]
  • [74].Figueiredo AS. Data sharing: convert challenges into opportunities. Front Public Heal 2017;5:327. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [75].Vickers AJ. Whose data set is it anyway? Sharing raw data from randomized trials. Trials 2006;7:15. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [76].Sommer J. The delay in sharing research data is costing lives. Nat Med 2010;16:744. [DOI] [PubMed] [Google Scholar]
  • [77].Roelofs E et al. International data-sharing for radiotherapy research: an open-source based infrastructure for multicentric clinical data mining. Radiother Oncol 2014;110:370–4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [78].Naqa IE et al. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose-volume outcome relationships. Phys Med Biol 2006;51:5719–35. [DOI] [PubMed] [Google Scholar]
  • [79].Kalpathy-Cramer J et al. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison. J Digit Imaging 2014;27:108–19. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [80].Zhang L et al. IBEX: an open infrastructure software platform to facilitate collaborative work in radiomics. Med Phys 2015;42:1341–53. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [81].Munoz L, Ziebell A, Morton J, Bhat M. An open source solution for an in-house built dynamic platform for the validation of stereotactic ablative body radiotherapy for VMAT and IMRT. Australas Phys Eng Sci Med 2016;39:957–64. [DOI] [PubMed] [Google Scholar]
  • [82].MICCAI/M.D. Anderson Cancer Center Head and Neck Quantitative Imaging Working Group. Matched computed tomography segmentation and demographic data for oropharyngeal cancer radiomics challenges. Sci Data 2017;4:170077. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [83].Ger RB et al. A multi-institutional comparison of dynamic contrast-enhanced magnetic resonance imaging parameter calculations. Sci Rep 2017;7:11185. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [84].Kalpathy-Cramer J, Freymann JB, Kirby JS, Kinahan PE, Prior FW. Quantitative imaging network: data sharing and competitive algorithm validation leveraging the cancer imaging archive. Transl Oncol 2014;7:147–52. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [85].Manrai AK et al. Genetic misdiagnoses and the potential for health disparities. N Engl J Med 2016;375:655–65. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [86].Khozin S, Kim G, Pazdur R. Regulatory watch: From big data to smart data: FDA’s INFORMED initiative. Nat Rev Drug Discovery 2017;16:306. [DOI] [PubMed] [Google Scholar]
  • [87].NCI announces oncology data science fellowship. (2017). Available at: https://astroblog.weebly.com/blog/nci-announces-oncology-data-science-fellowship.
  • [88].Smith BD et al. The future of radiation oncology in the United States from 2010 to 2020: will supply keep pace with demand? J Clin Oncol 2010;28:5160–5. [DOI] [PubMed] [Google Scholar]
  • [89].Pan HY et al. Supply and demand for radiation oncology in the United States: updated projections for 2015 to 2025. Int J Radiat Oncol Biol Phys 2016;96:493–500. [DOI] [PubMed] [Google Scholar]
