The British Journal of Radiology. 2020 Mar 19;93(1108):20190840. doi: 10.1259/bjr.20190840

Artificial intelligence in diagnostic imaging: impact on the radiography profession

Maryann Hardy,1 Hugh Harvey2
PMCID: PMC7362930  PMID: 31821024

Abstract

The arrival of artificially intelligent systems into the domain of medical imaging has focused attention and sparked much debate on the role and responsibilities of the radiologist. However, discussion about the impact of such technology on the radiographer role is lacking. This paper discusses the potential impact of artificial intelligence (AI) on the radiography profession by assessing current workflow and cross-mapping potential areas of AI automation such as procedure planning, image acquisition and processing. We also highlight the opportunities that AI brings including enhancing patient-facing care, increased cross-modality education and working, increased technological expertise and expansion of radiographer responsibility into AI-supported image reporting and auditing roles.

Introduction

Within the UK, and across much of the world, diagnostic radiography is a graduate profession requiring practitioners to have high level practical and critical thinking skills to ensure optimization of image acquisition processes and, working in conjunction with medical radiologist colleagues, appropriate management of the patient care pathway. Radiography, as a profession, is reliant on imaging technology. Without the technology to acquire and view images, neither radiography nor radiology would exist and in part, the growth in breadth and complexity of imaging examinations and interventions,1 as well as the associated increase in demand for medical imaging,2 are directly related to advances in imaging technology and computerization. While this technological expansion has benefitted patient diagnosis and treatment, it has also influenced and changed radiographic practice and the role of the radiographer,3–5 but this is not a new phenomenon. Technology changes and advancements have always directly influenced the radiography profession with radiographic practice evolving and adapting in response to the operation of new technologies and the advanced imaging opportunities offered by their adoption. However, recent technological advances have not focused on new imaging technologies per se. Instead, they have focused on the integration of complex machine learning algorithms and artificially intelligent systems within equipment operation and image review processes, and it is the influence and control of these technologies on radiography practice that is yet to be explored.

Artificial intelligence (AI) is a broad umbrella term that encompasses the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and prediction.6 It is a data-reliant paradigm that fits well with the technology-driven practice of modern medical imaging and, in particular, to computer vision tasks. In recent years, there has been a significant academic and industrial surge in proposed AI applications for diagnostic imaging7 and while the vast majority have focused on augmenting and assisting the radiologist, there is a growing niche of applications directly applicable to radiography practice.8,9

Radiographers have accepted automated technologies within their practice for many years, a shift that some regard as having eroded core skills, responsibilities and opportunities for autonomous decision-making. A positive consequence of increased digitization and automation has been an increase in efficiency and throughput within imaging departments.10,11 However, evidence also suggests that increasing patient workloads and examination speeds may have had a negative impact on radiographer morale, role satisfaction and “burn out.”12–15 The responsibility for this lies not with the technology itself, but with professional leaders and employers failing to consider the impact of advancing automation technologies on professional and workplace cultures and role adaptation. To address this gap, we explore how the radiographer role might develop and change in response to the evolving capabilities of intelligent technologies, the opportunities adoption may bring, and the steps required to ensure that the radiography profession remains engaged and involved in the successful delivery and implementation of AI systems.

Impact of AI on radiographic practice

At a high level, it may be argued that AI in some form has been an inherent component of imaging technology for many decades. Perhaps the first example in general radiographic practice was the automatic exposure device developed in the 1980s.16 This allowed the radiographer to select the kV value for X-ray imaging, but the device determined when sufficient quanta had reached the film to achieve a diagnostic image, and therefore the final mAs of each exposure. While this did not diagnose or interpret images, it removed an element of decision-making from the radiographer and transferred it to the machine. The belief was that the machine could make this decision more accurately than the radiographer, benefitting both the organization and the patient by eliminating repeat images due to incorrect exposure and optimizing examination dose. Radiographers readily accepted this technology into their practice as they could see the benefit it provided to image acquisition practice and patient care, particularly where patient body habitus impacted on image quality. However, there was still a need for human oversight due to technical variations and errors.17
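The division of responsibility the automatic exposure device introduced can be caricatured in a few lines of code. The sketch below is purely illustrative, with invented numbers and no relation to any real device: the operator fixes the tube current and kV, while the device integrates detector signal and terminates the exposure once a preset dose threshold is reached, thereby fixing the final mAs itself.

```python
def aec_exposure_mas(tube_current_ma: float, dose_per_mas: float,
                     target_dose: float, max_time_s: float = 1.0,
                     dt_s: float = 0.001) -> float:
    """Integrate detector dose in small time steps; terminate the
    exposure when the preset target is reached and return the mAs
    that the device (not the operator) has effectively selected."""
    accumulated_dose = 0.0
    elapsed_s = 0.0
    while accumulated_dose < target_dose and elapsed_s < max_time_s:
        elapsed_s += dt_s
        accumulated_dose += tube_current_ma * dt_s * dose_per_mas
    return tube_current_ma * elapsed_s

# A larger habitus attenuates more of the beam (less dose per mAs
# reaching the detector), so reaching the same target dose takes a
# longer exposure and therefore a higher mAs:
slim = aec_exposure_mas(200, dose_per_mas=0.05, target_dose=0.4)
large = aec_exposure_mas(200, dose_per_mas=0.02, target_dose=0.4)
assert large > slim
```

The point mirrored from the text is that the mAs is an output of the measurement loop rather than an operator input, which is why exposure variation under fault conditions17 still demanded human oversight.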

AI solutions that provide similar decision-making automation to radiographic tasks are no different from previous devices in so far as they require both clinical evidence and radiographer acceptance and oversight before they can be widely deployed. However, a key differentiator is that new AI systems have the potential to automate a broader range of higher level cognitive tasks and therefore, it can be argued, greater diligence and evidence should be required prior to adoption. Importantly, while published evidence has explored the impact of AI within specialist imaging domains such as mammography,18,19 ultrasound,20,21 and nuclear medicine,22,23 there is little, if any, evidence that directly considers projectional radiography or the impact of AI on image acquisition practices within cross-sectional imaging. It is in the latter where we anticipate AI will have the greatest impact on radiographic practice in the near future, particularly when the increasing demand for these modalities is considered.

Considering the current range of tasks that radiographers perform within cross-sectional imaging (Figure 1), and cross-mapping these with proposed areas for AI automation, it is clear that AI is poised to assist radiographers significantly in their role. However, such a level of automation, if achieved in full, could also significantly reduce current radiographer roles and responsibilities. Consequently, while it is natural for healthcare organizations to want to explore how the implementation of AI technologies might improve radiology department throughput and maximize efficiency, it is also important that the risks and corresponding potential liabilities are fully understood and managed appropriately. Importantly, current regulatory frameworks24 mandate stringent human oversight and audit of clinically deployed AI solutions, so vendors must develop and market their systems as requiring a level of human oversight. This opens up a new challenge for the radiography profession, whose members must now become skilled at interacting with, and overseeing, AI-driven semi-automated processes.

Figure 1.

Areas of anticipated impact of AI in the cross-sectional radiographic workflow. AI, artificial intelligence.

Significantly, the radiographer voice debating the professional issues surrounding the adoption of AI technologies and increasing image acquisition automation has, until recently,25,26 been noticeably quiet within both professional and industrial literature. It is uncertain whether this represents resigned acquiescence to the inevitable march of AI technologies, professional struggles with comprehending the enormity of the potential impact of AI, or professional apathy as a learned defence to change (or fear of it),27 but it contrasts starkly with the debates and arguments of radiologists when the notion of diagnostic AI became popular as an automated solution to radiology reporting backlogs. Indeed, the volume of radiology journal papers discussing, debating and evaluating AI has increased exponentially since 201528 as radiologists quickly tackled initial concerns over role demise and extinction by writing counterarguments extolling the benefits of AI automation as an assistive and augmentative technology, not an existential threat.28–32 One could argue that this upsurge in radiologist interest and debate was driven, in part, by role preservation and protectionism, but equally it has served as the voice that rationalized the value of the human worker within the imaging chain, a factor so easily overlooked in the quest for service and cost efficiencies.

Applications of AI in radiography

Although published evaluations of AI have emphasized the interpretation of medical images, there are several areas beyond this where AI will have a direct impact on the radiographer role and profession.33 Here, we highlight, non-exhaustively, some proposed applications.

Pre-examination assessment

A key role for radiographers is to interact directly with patients before, during and after their imaging procedure. Part of this interaction is to check identification and indications for the examination requested as well as inform the patient about the procedure to be undertaken. While direct human communication between patient and health professional is unlikely to be replaced by AI technologies, there is potential for AI systems to assist in the automated vetting of referrals and sense-checking clinical indications and the corresponding imaging modality and techniques to be employed, as well as verifying patient identification records via interaction with the electronic health record.34 The potential capability of AI to access, assimilate and synthesize knowledge and data from a range of patient data portals simultaneously surpasses that of the human radiographer, and therefore is a natural environment for efficiency savings. However, radiographer oversight and diligence are required to ensure patient electronic health record data are not corrupted and that AI decisions are consistent.
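A rule-based flavour of this vetting support can be sketched as follows. The example is hypothetical (field names invented, no real electronic health record interface assumed): the system cross-checks the referral against record fields and flags discrepancies for the radiographer, rather than deciding autonomously.

```python
def vet_referral(request: dict, ehr_record: dict) -> list:
    """Return a list of flags for radiographer review; an empty
    list means the automated checks found no discrepancy."""
    flags = []
    if request.get("patient_id") != ehr_record.get("patient_id"):
        flags.append("patient identity mismatch")
    if request.get("dob") != ehr_record.get("dob"):
        flags.append("date of birth mismatch")
    if not request.get("clinical_indication"):
        flags.append("no clinical indication supplied")
    return flags

request = {"patient_id": "A123", "dob": "1980-01-01",
           "clinical_indication": "persistent cough, query pneumonia"}
record = {"patient_id": "A123", "dob": "1980-01-01"}
assert vet_referral(request, record) == []           # proceed
assert vet_referral({"patient_id": "B999"}, record)  # flagged for human check
```

Note the design choice, matching the oversight requirement in the text: the sketch only ever raises flags for a human, it never rejects or rewrites a referral on its own.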

Examination planning

Across all modalities, radiographers are responsible for ensuring accurate patient positioning prior to image acquisition and that venous access for contrast injection, if required, is available. During CT and MR studies, patient position within the gantry or bore is analyzed via the scout views (topogram) and slice/volume or sequence planning subsequently performed. Inaccurate patient positioning (non-isocentric) can result in reduced image quality and for CT, increased patient dose. Consequently, this is an important aspect of the radiographer’s role, but research suggests that isocentric positioning and scout image analysis in CT, alongside MR plane and volume calculations, are areas that may be automatable by AI systems.34,35 There is also potential for optimizing contrast volume and injection rates based on patient parameters using intelligent systems,36 as well as the possibility of using no contrast at all via synthetic contrast enhancement,37 thereby embracing the ethos of personalized and individualized healthcare. Within the context of treatment planning and therapeutic radiography, deep learning systems may also be used to plan treatment regions by autosegmenting tumors38 and individualizing radiotherapy dose39 benefitting patient care and potentially reducing unintended treatment outcomes.

Image acquisition

Selecting the correct imaging protocol based on patient presentation, clinical question and region of interest is an important radiographer responsibility, but evidence suggests that protocol choice and application is not consistent within, or across, hospital sites or imaging modalities.40–43 As a result, emerging research suggests that protocol selection may be automatable.44 There is also a plethora of AI-driven dose reduction methods for mammography,45 CT and positron emission tomography/CT,46,47 as well as MR time reduction,48 providing opportunities for faster image acquisition and greater patient throughput. Automated processes to support radiographers in assuring image quality via attenuation correction49 and technical recall via automated image evaluation50 are also poised for implementation. Importantly, the opportunities for AI image acquisition augmentation extend to ultrasound, where examination quality has long been considered operator, rather than technology, dependent. Automated AI ultrasound positioning and measurement tools will further enable sonographers to provide high quality ultrasound assessment reports with lower error rates via AI-driven automated fetal measurements and kidney function assessment,51,52 as well as image quality assessment.53

Image processing

The automation of post-processing of CT, MR and nuclear medicine studies has been a reality for many years, reducing overall examination time and optimizing patient modality throughput. Newer AI systems may perform these tasks at even greater speed and scale, potentially allowing for image super-resolution54 as well as immediate automated segmentation of organs of interest.55–57 Early research also demonstrates promise in synthetic modality transfer, that is the creation of a CT image from an MRI scan or vice versa,55 obviating the need for a second imaging procedure entirely.

Opportunities for radiographers

Without doubt, the next generation of AI-driven systems in diagnostic imaging will impact on radiographic practice across modalities and the roles and responsibilities of radiographers. However, the radiography profession is used to adapting to new technologies and has always embraced change, particularly where the ultimate improvement in patient outcomes derived from technological change can be adequately evidenced and morally aligned with radiographers’ desire to provide high quality care. As yet, qualitative research outcomes evidencing clinical impact at scale are limited, but the speed of change and rate of technical advancements within the field of AI suggests that AI-driven solutions will be adopted extensively, and radiographers must be prepared for the new opportunities these changes will present while simultaneously maintaining the core professional value of patient care.

Leading patient-facing care

The radiography profession must remain patient-facing and patient focused and the importance of this facet of the radiographer role is likely to increase with greater automation as AI systems are not yet capable of fully automated human-level conversation and are far from providing the level of reassurance and care that patients require from trained health professionals. Radiographers may also have a greater responsibility advising, defining and disclosing the radiation risks associated with imaging examinations in line with ionizing radiation (medical exposure) regulations and guidance24,58 and gaining patient informed consent, particularly if greater automation in referral and vetting processes reduces patient journey times and limits patient opportunity to reflect on the examination referral and discuss any concerns. It may be argued by some that radiologists also desire greater presence in this patient facing space but with continued shortages of radiologists reported in the UK,59 it is unlikely that capacity exists within current workforce volumes for greater patient interaction. Importantly, the IR(ME) regulations24 continue to define radiographers as “operators” with the ultimate responsibility for the practical aspects of acquiring and processing medical imaging studies, regardless of whether an automated system is involved, a legal status that is unlikely to change in the near future. Consequently, radiographers have an opportunity to redefine their roles and lead on the development of best practice for working alongside semi-automated systems under these laws, as well as drive forward improvements in patient-facing processes and care-giving time. 
This may include, but is not limited to, developing best practice guidance on explaining the risks of AI-driven systems to patients, ascertaining the influence of AI processes on human interaction and decision-making, and acquiring patient consent for AI research protocols, with further opportunities identified within the NHSX 2019 report AI: how to get it right.60

Increased cross-modality and AI-focused education and training

Greater patient throughput due to service efficiencies achieved through increased automation will impact on staffing requirements. Current modality-specific job plans and limited cross-modality expertise and role flexibility are unlikely to be sustainable. As we move towards even greater patient workloads and continuous increases in demand for imaging to support the diagnosis, treatment and monitoring of disease, we should expect all radiographers to achieve a range of modality and technology-interfacing competencies. If AI is to be an important aspect of all imaging approaches in the future, then it may be reasonable to expect radiography graduates to have the threshold competencies to operate and supervise image acquisition across the range of imaging modalities, thereby increasing workforce flexibility. Alongside this, pre-registration radiography curricula should ensure that graduates are educated in the fundamentals of AI and its subsets of machine learning and deep learning, in order to be able to interact with them confidently and safely and maximize their utility.60 Importantly, the Topol Review (2019)61 advocates that these changes in education must take place by 2024 and that healthcare employers must also provide opportunities for existing staff to upgrade their skill sets, ensuring comparable knowledge to support technological adaptation and the necessary changes in work practices and workplace culture. Higher training will also be required in the statistical underpinnings of AI systems to provide radiographers with adequate critical assessment skills for AI outputs in their practice domain, and leadership roles will emerge to drive change management processes during deployment and ongoing maintenance of vendor-specific systems.
With such changing emphasis envisaged within radiography, and wider healthcare, education programmes must respond to this agenda by including automated technology operation, core computer science skills and technical processes for supervising and assuring automated outputs and actions. However, if the radiography profession does not simultaneously respond to the changing technological needs and look to the future to redefine clinical roles and responsibilities, then there is a real potential for disconnect between academic departments educating IT competent, multimodality professionals for the future, and clinical departments requiring professionals with the skills of today.

Radiographer reporting

The reporting of diagnostic images by appropriately qualified radiographers has been an established role development in the UK for over 20 years. While this activity predominantly relates to projection radiography and mammography, evidence suggests that appropriately trained and supported radiographers can supplement radiology reporting across a wide range of imaging modalities.62–64 The value of reporting radiographers was clearly acknowledged within the CQC Radiology Review (2018)2 and Cancer Workforce Plan (2017),65 which specifically identified expansion of the reporting radiographer workforce as being key to enabling earlier cancer diagnosis, faster cancer screening turn-around times and addressing reporting backlogs.65 Yet, anxieties persist within radiology circles.66 Developing systems for AI and radiographer double reading of imaging examinations may go some way towards alleviating persisting concerns and may prove more cost-effective than a single radiologist or radiographer interpretation for certain high volume modalities such as chest radiography, CT lung screening and screening mammography.67 Accordingly, there is an opportunity to look at areas where there are reporting backlogs or workforce issues and perform research to measure the potential for radiographer-led, AI-supported reporting services and develop implementation plans for easy and consistent adoption. Further, reporting radiographers may take steps to address the issues of autoreporting and non-reporting of some imaging examinations by owning and developing the processes around AI-driven triage68 and review of AI-defined normal images as, under IR(ME)R,24 all imaging examinations will still require human review and report, even if an AI system suggests appearances are “normal.”
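The double-reading arrangement described above reduces to a simple arbitration rule. The sketch below is purely illustrative (labels and workflow invented, not any deployed system): concordant AI and radiographer reads are finalised by the human reader, discordant reads are escalated, and an AI call alone never closes a case, reflecting the IR(ME)R requirement for human review.

```python
def double_read(ai_call: str, radiographer_call: str) -> str:
    """Arbitrate an AI-radiographer double read of one examination.
    The returned report is always grounded in the human read."""
    if ai_call == radiographer_call:
        # Agreement: the radiographer's report stands as the human review.
        return "report issued: " + radiographer_call
    # Disagreement: escalate for radiologist arbitration.
    return "escalate: radiologist arbitration"

assert double_read("normal", "normal") == "report issued: normal"
assert double_read("abnormal", "normal") == "escalate: radiologist arbitration"
```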

Audit of AI systems

All automated systems must have quality control checks according to European medical device regulations (MDR 2017/745),69 and there will be a growing role for reporting radiographers to undertake regular audit and review of the outputs and decisions of an AI image evaluation system. It is likely that a proportion of all AI-automated cases will require some form of post-decision check or “peer review” to establish system sensitivity, specificity and accuracy, and it is radiographers who must seize this opportunity to own the process and perhaps, in the future, establish systems for AI review and audit of human image interpretation to establish parity, or the lack of it. However, in order to lead and create such opportunities, radiographers must have a high level technical understanding of AI system operation and functionality. The need for education programmes and CPD to support the evolution of the radiographer workforce and its adaptation to new, AI-enabled services is therefore imperative and should not be overlooked.
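The audit activity described here rests on routine confusion-matrix arithmetic. As a worked sketch with invented counts, comparing AI calls against radiographer reference reads on an audited sample:

```python
def audit_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity and accuracy from audited case counts."""
    return {
        "sensitivity": tp / (tp + fn),            # abnormal cases the AI caught
        "specificity": tn / (tn + fp),            # normal cases correctly cleared
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical audit of 100 chest radiographs: 18 true positives,
# 2 missed abnormals, 75 true negatives, 5 false alarms.
m = audit_metrics(tp=18, fp=5, tn=75, fn=2)
assert abs(m["sensitivity"] - 0.90) < 1e-9    # 18 / 20
assert abs(m["specificity"] - 0.9375) < 1e-9  # 75 / 80
assert abs(m["accuracy"] - 0.93) < 1e-9       # 93 / 100
```

The same arithmetic run in the opposite direction, auditing human reads against an AI reference, is what would establish the parity comparison the text anticipates.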

Summary

While at first glance AI appears to threaten the role of the radiographer, its widespread adoption and implementation also offers significant opportunities for greater autonomy and self-definition if the profession successfully prepares for, and adapts to, the inevitable changes to role and culture. By embracing change, and preparing the profession with the skills required to interact with, and own processes around, new technology, the role of the radiographer could expand into one that drives improvements in the delivery of imaging services, not only in relation to direct patient care, which should remain core to professional identity, but also in relation to greater cross-modality expertise and the clinical flexibility this affords. The expansion of AI-assisted radiographer reporting opportunities to fulfill regulatory reporting requirements and address reporting backlogs will continue if AI-specific training to support management, supervision and quality assurance of AI-enabled systems is encouraged.

The opportunities for greater professional autonomy, decision-making and professional influence are substantial, but only if radiographers take the first step to define how they wish to work in an AI-enabled environment. The future is there to be created today and it is our professional responsibility to ensure the opportunities of tomorrow do not pass us by.

Contributor Information

Maryann Hardy, Email: M.L.Hardy1@bradford.ac.uk.

Hugh Harvey, Email: drhharvey@doctors.net.uk.

REFERENCES

  • 1. European Society of radiology. The future role of radiology in healthcare. Insights Imaging 2010; 1: 2–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2. Care Quality Commission Radiology Review: A national review of radiology reporting within the NHS in England. Newcastle-upon-Tyne, UK: CQC; 2018. [Google Scholar]
  • 3. Price R, Miller L, Payne G. Re-Engineering the soft machine: the impact of developing technology and changing practice on diagnostic radiographer skill requirements. Health Serv Manage Res 2000; 13: 27–39. doi: 10.1177/095148480001300104 [DOI] [PubMed] [Google Scholar]
  • 4. Reiner BI, Siegel EL. Technologists' productivity when using PACS: comparison of film-based versus filmless radiography. AJR Am J Roentgenol 2002; 179: 33–7. doi: 10.2214/ajr.179.1.1790033 [DOI] [PubMed] [Google Scholar]
  • 5. Hayre CM, Eyden A, Blackman S, Carlton K. Image acquisition in general radiography: the utilisation of DDR. Radiography 2017; 23: 147–52. doi: 10.1016/j.radi.2016.12.010 [DOI] [PubMed] [Google Scholar]
  • 6. Oxford Reference [cited 18th September 2019].. Available from: https://www.oxfordreference.com/view/10.1093/oi/authority.20110803095426960.
  • 7. Hosny A, Parmar C, Quackenbush J, Schwartz LH, Aerts HJWL, Hugo JW. Artificial intelligence in radiology. Nat Rev Cancer 2018; 18: 500–10. doi: 10.1038/s41568-018-0016-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Syed AB, Zoga AC. Artificial intelligence in radiology: current technology and future directions. Seminars in Musculskeletal Radiology 2018; 22: 540–5. [DOI] [PubMed] [Google Scholar]
  • 9. Ahn SY, Chae KJ, Goo JM. The potential role of grid-like software in bedside chest radiography in improving image quality and dose reduction: an observer preference study. Korean J Radiol 2018; 19: 526–33. doi: 10.3348/kjr.2018.19.3.526 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10. van Lent WAM, Deetman JW, Teertstra HJ, Muller SH, Hans EW, van Harten WH. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation. Eur J Radiol 2012; 81: 3131–40. doi: 10.1016/j.ejrad.2012.03.012 [DOI] [PubMed] [Google Scholar]
  • 11. Hawnaur J. Recent advances: diagnostic radiology. BMJ 1999; 319: 168–71. doi: 10.1136/bmj.319.7203.168 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12. Sheth S. The working radiological technologist: on the path for burnout? 2010 HealtheCareers [cited 18th September 2019].. Available from: https://www.healthecareers.com/article/career/the-working-radiological-technologist-on-the-path-for-burnout.
  • 13. Hutton D, Beardmore C, Patel I, Massey J, Wong H, Probst H. Audit of the job satisfaction levels of the UK radiography and physics workforce in UK radiotherapy centres 2012. Br J Radiol 2014; 87: 20130742. doi: 10.1259/bjr.20130742 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14. Singh N, Wright C, Knight K, Baird M, Akroyd D, Adams RD, et al. Occupational burnout among radiation therapists in Australia: findings from a mixed methods study. Radiography 2017; 23: 216–21. doi: 10.1016/j.radi.2017.03.016 [DOI] [PubMed] [Google Scholar]
  • 15. Lohikoski K, Roos M, Suominen T;in press Workplace culture assessed by radiographers in Finland. Radiography 2019; 25: e113–8. doi: 10.1016/j.radi.2019.05.003 [DOI] [PubMed] [Google Scholar]
  • 16. Sterling S. Automatic exposure control: a primer. Radiol Technol 1988; 59: 421–7. [PubMed] [Google Scholar]
  • 17. Walsh C, Larkin A, Dennan S, O'Reilly G. Exposure variations under error conditions in automatic exposure controlled film-screen projection radiography. Br J Radiol 2004; 77: 931–3. doi: 10.1259/bjr/62185486 [DOI] [PubMed] [Google Scholar]
  • 18. Geras KJ, Mann RM, Moy L. Artificial intelligence for mammography and digital breast Tomosynthesis: current concepts and future perspectives. Radiology 2019; 293: 246–59. doi: 10.1148/radiol.2019182627 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19. Mendelson EB. Artificial intelligence in breast imaging: potentials and limitations. AJR Am J Roentgenol 2019; 212: 293–9. doi: 10.2214/AJR.18.20532 [DOI] [PubMed] [Google Scholar]
  • 20. Wu G-G, Zhou L-Q, Xu J-W, Wang J-Y, Wei Q, Deng Y-B, et al. Artificial intelligence in breast ultrasound. World J Radiol 2019; 11: 19–26. doi: 10.4329/wjr.v11.i2.19 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21. Liu S, Wang Y, Yang X, Lei B, Liu L, Li SX, et al. Deep learning in medical ultrasound analysis: a review. Engineering 2019; 5: 261–75. doi: 10.1016/j.eng.2018.11.020 [DOI] [Google Scholar]
  • 22. Nensa F, Demircioglu A, Rischpler C. Artificial intelligence in nuclear medicine. J Nucl Med 2019; 60(Supplement 2): 29S–37. doi: 10.2967/jnumed.118.220590 [DOI] [PubMed] [Google Scholar]
  • 23. Hall M. Artificial intelligence and nuclear medicine. Nucl Med Commun 2019; 40: 1–2. doi: 10.1097/MNM.0000000000000937 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24. Department of Health The Ionising Radiation (Medical Exposure) Regulations 2017.. Available from: http://www.legislation.gov.uk/uksi/2017/1322/contents/made [[cited 20th September 2019]].
  • 25. Murphy A, Liszewski B. Artifical intelligence and the medical radiation profession: how our advocacy must inform future practice.. Journal of Medical Imaging and Radiation Sciences [article]. [DOI] [PubMed] [Google Scholar]
  • 26. French J, Chen L. Preparing for artificial intelligence: systems-level implications for the medical imaging and radiation therapy professions. J Med Imaging Radiat Sci 2019. doi: 10.1016/j.jmir.2019.09.002 [DOI] [PubMed] [Google Scholar]
  • 27. Yielder J, Davis M. Where radiographers fear to tread: resistance and apathy in radiography practice. Radiography 2009; 15: 345–50. doi: 10.1016/j.radi.2009.07.002 [DOI] [Google Scholar]
  • 28. Bluemke DA. Radiology in 2018: are you working with AI or being replaced by AI? Radiology 2018; 287: 365–6. doi: 10.1148/radiol.2018184007 [DOI] [PubMed] [Google Scholar]
  • 29. Thrall JH, Li X, Li Q, Cruz C, Do S, Dreyer K, et al. Artificial intelligence and machine learning in radiology: opportunities, challenges, pitfalls, and criteria for success. J Am Coll Radiol 2018; 15(3 Pt B): 504–8. doi: 10.1016/j.jacr.2017.12.026 [DOI] [PubMed] [Google Scholar]
  • 30. Langlotz CP. Will artificial intelligence replace radiologists? Radiology 2019; 1: e190058. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Ranschaert ER, Duerinckx AJ, Algra P, Kotter E, Kortman H, Morozov S. Advantages, Challenges, and Risks of Artificial Intelligence for Radiologists : Ranschaert E. R, Morozov S, Algra P. R, Artificial Intelligence in Medical Imaging: Opportunities, Applications and Risks. Switzerland: Springer; 2019. 329–48. [Google Scholar]
  • 32. Pakdemirli E. Artificial intelligence in radiology: friend or foe? where are we now and where are we heading? Acta Radiologica Open 2019; 8: 205846011983022. doi: 10.1177/2058460119830222 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33. Lakhani P, Prater AB, Hutson RK, Andriole KP, Dreyer KJ, Morey J, et al. Machine learning in radiology: applications beyond image interpretation. J Am Coll Radiol 2018; 15: 350–9. doi: 10.1016/j.jacr.2017.09.044 [DOI] [PubMed] [Google Scholar]
  • 34. GE Healthcare No matter how you slice it, this AI tech is changing MR neuro imaging [Internet]. 2019. Available from: http://newsroom.gehealthcare.com/this-ai-tech-is-changing-mr-neuro-imaging/ [cited 9 Aug 2019].
  • 35. Sun Y, Zhu Z, Pang S. Learning models for acquisition planning of CT projections : Anomaly Detection and Imaging with X-Rays. Baltimore, USA; 2019. [Google Scholar]
  • 36. Feng S-T, Zhu H, Peng Z, Huang L, Dong Z, Xu L, et al. An individually optimized protocol of contrast medium injection in enhanced CT scan for liver imaging. Contrast Media Mol Imaging 2017; 2017: 1–8. doi: 10.1155/2017/7350429 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. Santini G, Zumbo LM, Martini N, Valvano G, Leo A, Ripoli A, et al. Synthetic contrast enhancement in cardiac CT with deep learning. 2018.
  • 38. Tong N, Gou S, Yang S, Ruan D, Sheng K. Fully automatic multi-organ segmentation for head and neck cancer radiotherapy using shape representation model constrained fully convolutional neural networks. Med Phys 2018; 45: 4558–67. doi: 10.1002/mp.13147 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39. Lou B, Doken S, Zhuang T, Wingerter D, Gidwani M, Mistry N, et al. An image-based deep learning framework for individualising radiotherapy dose: a retrospective analysis of outcome prediction. The Lancet Digital Health 2019; 1: e136–47. doi: 10.1016/S2589-7500(19)30058-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40. Teeuwisse W, Geleijns J, Veldkamp W. An inter-hospital comparison of patient dose based on clinical indications. Eur Radiol 2007; 17: 1795–805. doi: 10.1007/s00330-006-0473-1 [DOI] [PubMed] [Google Scholar]
  • 41. Foley SJ, McEntee MF, Rainford LA. Establishment of CT diagnostic reference levels in Ireland. Br J Radiol 2012; 85: 1390–7. doi: 10.1259/bjr/15839549 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42. McFadden SL, Hughes CM, Winder RJ. Variation in radiographic protocols in paediatric interventional cardiology. J Radiol Prot 2013; 33: 313–9. doi: 10.1088/0952-4746/33/2/313 [DOI] [PubMed] [Google Scholar]
  • 43. Sammy IA, Chatha H, Bouamra O, Fragoso-Iñiguez M, Lecky F, Edwards A. The use of whole-body computed tomography in major trauma: variations in practice in UK trauma hospitals. Emerg Med J 2017; 34: 647–52. doi: 10.1136/emermed-2016-206167 [DOI] [PubMed] [Google Scholar]
  • 44. Brown AD, Marotta TR. Using machine learning for sequence-level automated MRI protocol selection in neuroradiology. J Am Med Inform Assoc 2018; 25: 568–71. doi: 10.1093/jamia/ocx125 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45. Liu J, Zarshenas A, Wei Z, Yang L, Fajardo L, Suzuki K, et al. Radiation dose reduction in digital breast tomosynthesis (DBT) by means of deep-learning-based supervised image processing. In: Proceedings of Medical Imaging 2018: Image Processing, Vol. 10574. Houston, USA; 2018. [Google Scholar]
  • 46. Humphries T, Coulter S, Si D, Simms M, Xing R. Comparison of deep learning approaches to low dose CT using low intensity and sparse view data. In: Proceedings of Medical Imaging 2019: Physics of Medical Imaging; 2019. [Google Scholar]
  • 47. Jin H, Heo C, Ahn CK, Kim JH. Combined low-dose simulation and deep learning for CT denoising: application of ultra-low-dose cardiac CTA. In: Medical Imaging 2019: Physics of Medical Imaging, Vol. 10948. San Diego, USA; 2019. [Google Scholar]
  • 48. Wang S, Su Z, Ying L, Peng X, Zhu S, Liang F, et al. Accelerating magnetic resonance imaging via deep learning. In: IEEE 13th International Symposium on Biomedical Imaging (ISBI). Prague, Czech Republic; 2016. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49. Liu F, Jang H, Kijowski R, Bradshaw T, McMillan AB. Deep learning MR imaging-based attenuation correction for PET/MR imaging. Radiology 2018; 286: 676–84. doi: 10.1148/radiol.2017170700 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50. Esses SJ, Lu X, Zhao T, Shanbhogue K, Dane B, Bruno M, et al. Automated image quality evaluation of T2-weighted liver MRI utilizing deep learning architecture. J Magn Reson Imaging 2018; 47: 723–8. doi: 10.1002/jmri.25779 [DOI] [PubMed] [Google Scholar]
  • 51. Looney P, Stevenson GN, Nicolaides KH, Plasencia W, Molloholli M, Natsis S, et al. Fully automated, real-time 3D ultrasound segmentation to estimate first trimester placental volume using deep learning. JCI Insight 2018; 3: e120178. doi: 10.1172/jci.insight.120178 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52. Kuo C-C, Chang C-M, Liu K-T, Lin W-K, Chiang H-Y, Chung C-W, et al. Automation of the kidney function prediction and classification through ultrasound-based kidney imaging using deep learning. NPJ Digit Med 2019; 2: 29. doi: 10.1038/s41746-019-0104-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53. Wu L, Cheng J-Z, Li S, Lei B, Wang T, Ni D. FUIQA: fetal ultrasound image quality assessment with deep convolutional networks. IEEE Trans Cybern 2017; 47: 1336–49. doi: 10.1109/TCYB.2017.2671898 [DOI] [PubMed] [Google Scholar]
  • 54. Yoon Y, Jeon H-G, Yoo D, Lee J-Y, Kweon IS. Learning a deep convolutional network for light-field image super-resolution. In: IEEE International Conference on Computer Vision Workshop (ICCVW). Santiago, Chile; 2015. [Google Scholar]
  • 55. Nie D, Cao X, Gao Y, Wang L, Shen D. Estimating CT image from MRI data using 3D fully convolutional networks. Deep Learn Data Label Med Appl 2016; 2016: 170–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56. Akkus Z, Kostandy P, Philbrick KA, Erickson BJ. Extraction of brain tissue from CT head images using fully convolutional neural networks. In: Proceedings of Medical Imaging 2018: Image Processing. Houston, USA; 2018. [Google Scholar]
  • 57. Wang S, He K, Nie D, Zhou S, Gao Y, Shen D. CT male pelvic organ segmentation using fully convolutional networks with boundary sensitive representation. Med Image Anal 2019; 54: 168–78. doi: 10.1016/j.media.2019.03.003 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58. The Society and College of Radiographers. Communicating Radiation Benefit and Risk Information to Individuals Under the Ionising Radiation (Medical Exposure) Regulations (IR(ME)R). 2019.
  • 59. Royal College of Radiologists. Clinical radiology: UK workforce census 2018 report. 2019. Available from: https://www.rcr.ac.uk/publication/clinical-radiology-uk-workforce-census-report-2018 [cited 29 Nov 2019].
  • 60. NHSX. Artificial Intelligence: How to get it right. Putting policy into practice for safe data-driven innovation in health and care. 2019. Available from: https://www.nhsx.nhs.uk/assets/NHSX_AI_report.pdf [cited 29 Nov 2019].
  • 61. Health Education England. The Topol Review: Preparing the healthcare workforce to deliver the digital future. 2019. Available from: https://topol.hee.nhs.uk/ [cited 29 Nov 2019].
  • 62. Spencer N. Re: can radiographers read screening mammograms? Clin Radiol 2003; 58: 902. doi: 10.1016/j.crad.2003.08.002 [DOI] [PubMed] [Google Scholar]
  • 63. Culpan G, Culpan A-M, Docherty P, Denton E. Radiographer reporting: a literature review to support cancer workforce planning in England. Radiography 2019; 25: 155–63. doi: 10.1016/j.radi.2019.02.010 [DOI] [PubMed] [Google Scholar]
  • 64. Nair A, Screaton NJ, Holemans JA, Jones D, Clements L, Barton B, et al. The impact of trained radiographers as concurrent readers on performance and reading time of experienced radiologists in the UK lung cancer screening (UKLS) trial. Eur Radiol 2018; 28: 226–34. doi: 10.1007/s00330-017-4903-z [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65. Health Education England. The Cancer Workforce Plan: Phase 1: Delivering the cancer strategy to 2021. 2017. Available from: https://www.hee.nhs.uk/sites/default/files/documents/Cancer%20Workforce%20Plan%20phase%201%20-%20Delivering%20the%20cancer%20strategy%20to%202021.pdf [cited 20 Sep 2019].
  • 66. The Royal College of Radiologists. Standards for interpretation and reporting of imaging investigations. 2nd ed. 2018. Available from: https://www.rcr.ac.uk/system/files/publication/field_publication_files/bfcr181_standards_for_interpretation_reporting.pdf [cited 20 Sep 2019].
  • 67. Ritchie AJ, Sanghera C, Jacobs C, Zhang W, Mayo J, Schmidt H, et al. Computer vision tool and technician as first reader of lung cancer screening CT scans. J Thorac Oncol 2016; 11: 709–17. doi: 10.1016/j.jtho.2016.01.021 [DOI] [PubMed] [Google Scholar]
  • 68. Annarumma M, Withey SJ, Bakewell RJ, Pesce E, Goh V, Montana G. Automated triaging of adult chest radiographs with deep artificial neural networks. Radiology 2019; 291: 272. doi: 10.1148/radiol.2019194005 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69. Council of the European Union. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices. Available from: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32017R0745.

Articles from The British Journal of Radiology are provided here courtesy of Oxford University Press