Abstract
Purpose
Although quality assurance (QA) is crucial in radiation oncology departments, substantial effort is required to monitor and ensure compliance with high standards. This work aims to analyze the impact of implementing a centralized and automated tracking dashboard on compliance and variance observed in radiation therapy QA results for linear accelerators.
Methods and Materials
The study was performed in a large academic center with 7 linear accelerators. An in-house QA Dashboard was implemented in 2019 with design specifications including automated monitoring and visualization of QA progress, ease of use and accessibility, ease of integration, adaptability to new technologies, and facilitation of automated reminders. Monthly QA data were collected between 2016 and 2022 to analyze compliance pre- and post-dashboard implementation. Compliance was characterized as the percentage of tests completed on time. In addition, variance trends for linear accelerator dosimetry measurements and imaging were analyzed over 7 years.
Results
In total, 76,066 records were analyzed. Of these records, 73,187 QA measurements were completed on time. Overall QA compliance increased from 75% in 2016 to >99% in 2022, after successful implementation of the QA Dashboard in 2019. The main improvement was observed for tests that were implemented more recently, eg, imaging QA and formal recording of daily QA review by physicists. The coefficient of variation for imaging QA was reduced by approximately a factor of 2 after the implementation of the QA Dashboard. For recorded dosimetry measurements, no substantial change in variance was observed.
Conclusions
Implementation of the QA Dashboard resulted in a distinct increase in QA compliance. A reduction in the variance of QA measurements was observed for all imaging modalities. These findings demonstrate the high impact of automated QA tracking tools on the compliance and accuracy of QA.
Introduction
In radiation medicine, it is well accepted that quality assurance (QA) and quality control are central to safe and effective treatment delivery. Clinical trials show the tangible value of QA, both patient-specific and programmatic,1-3 with nonadherence to radiation therapy requirements associated with reduced survival and local control and increased toxicity.4 Lower rates of major protocol deviations are associated with studies that require radiation therapy credentialing before trial participation.5
To standardize QA requirements, national professional societies and regulators6-11 have developed many guidelines and documents with a substantial focus on linear accelerator (LINAC) QA.7,10-12 These documents describe recommended QA procedures and associated tolerances and action levels. However, there is no standard requirement for QA record-keeping or documentation, and each institution implements this aspect differently. QA documentation may rely on a single checklist or spreadsheet13,14 or on more sophisticated record-keeping and compliance tracking systems.15,16 Automated tools have been demonstrated to improve efficiency and effectiveness in many aspects of radiation therapy program QA,17,18 including streamlining LINAC QA processes, measurements, and analysis.19-21
Although automated approaches increase efficiency, they do not guarantee compliance with QA program standards. To demonstrate adherence to high-quality radiation therapy requirements, QA compliance reporting is now standard practice within radiation oncology, including regular accreditation and reaccreditation.22,23 The accreditation process involves peer review by an external organization, such as the American College of Radiology or the American Society for Radiation Oncology Accreditation Program for Excellence (APEx). Although infrequent, accreditation has been demonstrated to lead to improvements in patient care24 and helps facilities identify deficiencies in their practices. Recent APEx data show that although 76% of facilities seeking APEx accreditation receive full accreditation after their initial survey, substantial deficiencies relative to APEx standards were identified in the remaining 24% of surveyed facilities. Among those facilities, 54% of the APEx evidence indicators requiring a corrective action plan were related to LINAC QA.25
These reported deficiencies reflect the substantial effort required to maintain QA compliance with local and national guidelines.12,26-28 QA completeness and compliance can be especially challenging for institutions with LINACs of different configurations and auxiliary devices, each requiring specific QA documentation management. To maintain a high level of compliance, institutions may benefit from tools that monitor reporting and documentation of all aspects of QA between accreditations.
The current work aims to demonstrate the impact of introducing software tools for automated tracking of QA compliance and completeness. A key feature of this software is its independence from vendors and from any specific QA domain. Its main role is to provide a clear and concise visualization of QA progress for all components of LINAC QA. The study examines whether introducing automation in compliance tracking impacts the completeness, timeliness, and quality of reported QA results.
Methods and Materials
Department infrastructure and characteristics
In the current study, compliance of monthly QA results is reported for a large academic center. The department comprises 2 institutions with a total of 7 LINACs, 3 computed tomography (CT) simulators, an MR-LINAC (ViewRay), an MR simulator (Siemens), a dedicated Co-60 total body irradiation unit, and an active high-dose-rate brachytherapy program. In 2022, this department treated 4000 patients, and the faculty and staff comprised 23 full-time equivalent qualified medical physicists (QMPs), 40 full-time equivalent radiation oncologists, 17 dosimetrists, 4 medical physics assistants (MPAs), and 6 medical physics (MP) residents.
The QA reporting compliance analysis covered the 8 external beam LINACs in service over the study period: 3 multi-energy Varian Medical Systems (Palo Alto, CA) C-series LINACs, 2 multi-energy Varian TrueBeams, 1 Varian TrueBeam STx, and 1 Varian Edge that was decommissioned in 2021 and replaced with a Varian Ethos in 2022. The characteristics of these LINACs are presented in Table E1. All LINACs are equipped with on-board imaging with kV planar and cone beam CT (CBCT) capabilities. Additionally, 6 of the 8 LINACs have surface monitoring capability with Align RT (Vision RT, London, United Kingdom), and one (the Varian TrueBeam STx) has the ExacTrac (Brainlab AG, Munich, Germany) imaging system. Six degree-of-freedom robotic couches are implemented on 4 LINACs.
QA program design
External beam treatment device QA is performed by a team consisting mostly of QMPs and MPAs. MP residents and, occasionally, postdoctoral fellows join this team on a rotating basis. The outline of the monthly QA assignments is presented in Table E2. This diverse team composition helps minimize the effects of the experience or behavior of individuals, although such effects cannot be completely removed or controlled.
Monthly QA procedures for all LINACs were developed in accordance with published recommendations,11,12 vendor guidelines, and specialty task group documents.8,10,29-31 A document management system was developed to facilitate access to institutional policies, procedures, and QA result records. Monthly peer review is routinely performed by the Chief of Dosimetry, with feedback provided to the involved team members.
QA data sources
The data were collected and analyzed for all monthly QA between 2016 and 2022. For this study, the monthly QA program results for external beam radiation therapy are reported in 5 main categories:
a. Dosimetry, including output and ionization ratio consistency checks for all energies.
b. Mechanical QA, including tests of the full range of motion of the gantry, collimator, and couch; cross-hair centering; laser and accessory accuracy verification; and safety checks outlined in the American Association of Physicists in Medicine Task Group report 142 (TG-142),12 Table E2, and summarized in Table E3. The number of tests varies slightly between machines due to differences in LINAC configuration (ie, availability of robotic couch, wedges, electron cones, and lasers).
c. Imaging QA, including imaging and treatment coordinate coincidence; image quality (contrast, uniformity, and noise); spatial resolution and geometric accuracy; positioning/repositioning; and collision interlocks for each imaging modality (kV, MV, and CBCT imaging). The tests were designed to follow TG-142,12 Table E6, and are summarized in Table E4.
d. Review of daily QA records, including review of beam flatness and symmetry.
e. Review of surface and motion monitoring system records.
QA measurement analysis is completed with a combination of commercial software packages and manual in-house procedures, as illustrated in Fig. E1. To consolidate the QA measurements, the results are also recorded in designated Excel spreadsheets (Microsoft, Redmond, WA) that store the QA data for each category on each LINAC.
Tracking and compliance QA Dashboard: development and design requirements
In 2019, an in-house dashboard was designed to facilitate peer review and monitor QA completion compliance. The initial motivation for transitioning to a new system was to replace the commercial software program Argus (Varian Medical Systems), which had reached its end of life. An additional motivation was to improve the efficiency and accuracy of QA peer review by adding automated monitoring of QA completion. The design requirements of the software included the following (requirement 2 is illustrated in a code sketch after the list):
1. Centralized dashboard for QA record access and visualization of QA compliance progress.
2. Adaptable and extendable design: able to accommodate different types of LINACs (various numbers of energies, with and without robotic couch, with and without stereotactic components), to allow the addition of new and nonstandard tests, to adapt to new and different technology, and to be formatted for both LINACs and CT simulators.
3. User-friendly and efficient platform for peer review.
4. Automated reminders of upcoming and overdue QA tasks.
5. Centralized access to policies and procedures through a dedicated interface.
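Requirement 2 is essentially a data-modeling choice: if a catalog of tests, rather than the code itself, encodes machine differences, new machines and tests can be added without redesign. The Java sketch below illustrates one such model; it is a hypothetical illustration with invented class and field names, not the authors' implementation.

```java
// Hypothetical sketch of an adaptable QA test configuration (requirement 2).
// All names are illustrative and not taken from the authors' software.
import java.util.List;
import java.util.Set;

public class QaConfigSketch {

    // The 5 QA categories tracked by the dashboard (see Methods).
    enum Category { DOSIMETRY, MECHANICAL, IMAGING, DAILY_QA_REVIEW, SURFACE_MONITORING_REVIEW }

    // A single monthly test; per-machine applicability lets one catalog cover
    // LINACs with different configurations (robotic couch, wedges, lasers, etc.).
    record QaTest(String name, Category category, Set<String> applicableMachines) {}

    // A machine profile drives which tests appear on its dashboard page.
    record Machine(String name, boolean hasRoboticCouch, List<String> photonEnergies) {}

    public static void main(String[] args) {
        Machine trueBeam = new Machine("TrueBeam-1", true, List.of("6X", "6FFF", "10X", "15X"));
        QaTest couchTest = new QaTest("6DoF couch motion range", Category.MECHANICAL,
                Set.of("TrueBeam-1", "TrueBeam-2", "TrueBeamSTx", "Edge"));

        // New machines or tests are added by extending the catalog, not the code.
        boolean applies = couchTest.applicableMachines().contains(trueBeam.name());
        System.out.println(couchTest.name() + " applies to " + trueBeam.name() + ": " + applies);
    }
}
```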
Data collection, evaluation metrics, and analysis
For this study, the data were extracted for each QA category. For all categories, the date of the measurement, the initials of the person who performed it, and the number of tasks completed on that date were collected. The total number of analyzed measurements was recorded for each category. Only completed measurements were counted toward the statistics; the number of missed records was calculated from the calendar assignments and the number of completed measurements.
Two metrics were used to evaluate the impact of the QA Dashboard implementation:
1. Compliance: In each of the 5 categories, the percentage of tests completed on time (recorded within the calendar month) was calculated. This was collected for each LINAC per month, and the average over all months was reported as the annual value.
2. Variance trends: For dosimetry and imaging QA, deviations from baseline were evaluated for each monthly measurement, on each LINAC, across each year. For mechanical QA, the percentage of records reported as “Failed” was calculated. These data were not collected for the Vision RT and Daily QA reviews because these QA tasks are recorded as “done/not done,” and therefore the magnitude of deviations from baseline cannot be quantified.
Variance trend analysis of the dosimetry measurements included both routine monthly pre- and post-adjustment measurements. Outputs following monitor chamber replacements were excluded due to the expected initial deviations of up to 20%.32 The mean (M) and standard deviation (SD) of each parameter for each year were calculated, and the maximum deviations were reported. Variance trend analysis of imaging QA considered each imaging modality individually: CBCT, planar kV, and MV imaging. The coefficient of variation (CV), equal to the standard deviation divided by the mean of all measurements, CV = SD/M, was used to compare imaging results. The CV for each test was analyzed, and the average over all tests for each imaging modality was reported.
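A minimal sketch of these 2 evaluation metrics in Java (the language used for the in-house dashboard) is given below. The method and variable names are illustrative rather than taken from the authors' code, and the CV here uses the population SD.

```java
// Minimal sketch of the 2 evaluation metrics defined above; names are
// illustrative only, not from the authors' implementation.
import java.util.List;

public class QaMetrics {

    // Compliance: percentage of assigned tests recorded within the calendar month.
    static double compliancePercent(int completedOnTime, int assigned) {
        return 100.0 * completedOnTime / assigned;
    }

    // Coefficient of variation: standard deviation divided by the mean (CV = SD/M).
    static double coefficientOfVariation(List<Double> values) {
        double mean = values.stream().mapToDouble(Double::doubleValue).average().orElse(Double.NaN);
        double variance = values.stream()
                .mapToDouble(v -> (v - mean) * (v - mean))
                .average().orElse(Double.NaN);
        return Math.sqrt(variance) / mean;
    }

    public static void main(String[] args) {
        // Example: 19 of 20 imaging tests recorded on time in a month.
        System.out.printf("Compliance: %.1f%%%n", compliancePercent(19, 20));
        // Example: monthly deviations from baseline (%) for one imaging test.
        System.out.printf("CV: %.2f%n", coefficientOfVariation(List.of(0.8, 1.1, 0.9, 1.2, 1.0)));
    }
}
```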
Results
QA program infrastructure and characteristics
Figure 1 shows how the configuration of the QA program increased in complexity as the number of personnel and the diversity of experience increased. Figure 2 shows the relative participation of different groups in the QA program. Over a decade, the QA responsibilities transitioned from a few QMPs and 1 to 2 postdoctoral fellows to a group of 8 to 10 QMPs, 4 MPAs, and 4 MP residents. QMPs provide continued service for many years, although their individual assignments change over time. In total, 41 individuals participated in the QA program between 2016 and 2022.
Figure 1.
Timeline of QA program changes, including personnel and the transition from Argus and spreadsheet record-keeping to the QA Dashboard. Abbreviation: QA = quality assurance. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Figure 2.
Number of staff from different groups participating in linear accelerator QA program. Abbreviations: MPA = medical physics assistants; PDF = postdoctoral fellow; QA = quality assurance; QMP = qualified medical physicist. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Tracking and compliance (QA Dashboard)
The QA compliance dashboard was developed in-house and launched in September 2019. The software was written in the Java programming language and designed to access the Excel files in designated folders and to report completion of QA tasks based on the date of each record, as well as the percentage of tests recorded for each machine. Reminders were designed to be sent 1 week before the end of the month, again 3 days before the end of the month, and each day thereafter if QA was overdue. Additional critical features of the QA process included different data access levels for each role group, including enhanced access for review and approval.
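This scan-and-remind logic can be sketched as follows. The folder layout, file-name pattern, parsing library, and function names here are assumptions for illustration only, not the authors' code; in particular, workbook parsing is only stubbed (a library such as Apache POI could be used).

```java
// Hedged sketch of the completion scan and reminder schedule described above.
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDate;
import java.time.YearMonth;
import java.time.temporal.ChronoUnit;

public class DashboardScanSketch {

    // Reminder schedule from the text: 1 week before month end, again 3 days
    // before month end, and each day once the assigned month has passed.
    static boolean reminderDueToday(LocalDate today, YearMonth assignedMonth, boolean qaComplete) {
        if (qaComplete) return false;
        long daysLeft = ChronoUnit.DAYS.between(today, assignedMonth.atEndOfMonth());
        return daysLeft == 7 || daysLeft == 3 || daysLeft < 0;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical designated folder holding one Excel workbook per machine.
        Path qaRoot = Paths.get("/qa-records");
        if (Files.isDirectory(qaRoot)) {
            try (DirectoryStream<Path> workbooks = Files.newDirectoryStream(qaRoot, "*.xlsx")) {
                for (Path workbook : workbooks) {
                    // Parsing each workbook would yield the record dates and the
                    // percentage of tests recorded, which the dashboard turns into
                    // the green/orange summary shown in Figure 3.
                    System.out.println("Found QA workbook: " + workbook.getFileName());
                }
            }
        }
        System.out.println("Reminder due today: "
                + reminderDueToday(LocalDate.now(), YearMonth.now(), false));
    }
}
```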
Tracking and compliance QA Dashboard: development and design requirements
Figure 3 shows a screen capture of the QA Dashboard summary landing page for one LINAC. The in-house dashboard met the stated design requirements. It provides flexibility for future adaptation to new and different technologies and the ability to access decentralized data sources. The in-house design allowed dashboard implementation without completely redesigning the existing, extensive system of QA records, which is spread across various technologies and databases.
Figure 3.
A screen capture of the summary landing page of the dashboard for one of the linear accelerators. Green corresponds to 100% completion of the associated QA tests; orange corresponds to >80% and <100% completion of the QA tests. Abbreviation: QA = quality assurance. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Evaluation metrics
The current monthly QA program includes dosimetry tests for 17 individual photon energies and 20 electron energies, 32 to 36 individual mechanical tests per TrueBeam or C-series LINAC, and 13 mechanical tests for the Ethos machine. In the imaging category, 95 tests are completed on each LINAC. In addition, review of Vision RT and Daily QA by the QMP must be recorded monthly.
Compliance
In total, from 2016 to 2022, 76,066 QA records were analyzed. Of these records, 73,187 measurements were completed on time. These records include 7,189 dosimetry measurements, 18,313 mechanical QA tests, 47,071 imaging QA tests, and 614 reviews of Vision RT and Daily QA results. Figure 4A shows overall compliance over time, averaged over all QA tests and all LINACs for each year, and Figure 4B shows the breakdown of compliance by each of the 5 categories. An improvement in overall compliance from 75% in 2016 to >99% in 2022 can be observed, with a transition in 2019 after the implementation of the tracking software. Compliance for the Vision RT and Daily QA reviews increased from close to 50% in 2018 to 99% in 2022. Because these tests were not recorded before 2018, their compliance is only reported from 2018 onward. The imaging QA was implemented before 2016, and the procedures for output measurements and mechanical QA have not changed for more than 10 years. For these QA categories, compliance increased from 90% to 95% in the years 2016 to 2019 to >99% in the years 2020 to 2022.
Figure 4.
(A) The overall compliance of completed tasks for all linear accelerators is shown for each year. Green corresponds to 100% completion of the associated QA tests; orange corresponds to >80% and <100% completion of the QA tests; red corresponds to <80% completion of QA tests on time. (B) Percentage of tasks completed on time in the 5 categories of QA measurements and results: linear accelerator output (blue), mechanical QA and safety checks (orange), all modality imaging (gray), Vision RT review (yellow), and daily QA review (green). Abbreviations: OSMS = optical surface monitoring system; VRT = Vision RT; QA = quality assurance. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Variance trends
Imaging QA
The imaging QA results for the LINACs were analyzed for each individual imaging modality: CBCT, planar kV imaging, and planar MV imaging. Figure 5 shows the average CV for the 3 imaging modalities individually and the average CV over all imaging modalities and all LINACs. For all imaging modalities, CVs were greater in 2016 to 2019 than in 2020 to 2022. The CV averaged over all imaging modalities was greater than 0.4 in the years 2016 to 2019 and decreased to less than 0.3 in the years 2020 to 2022.
Figure 5.
Average coefficient of variation for each imaging modality across all linear accelerators. The average for kV is shown in teal, MV in green, and CBCT in purple, and the dashed line represents the average trend across all imaging modalities and all linear accelerators. Abbreviation: CBCT = cone beam computed tomography. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Dosimetry
Figure 6 and Table E5 show details of the LINAC beam dosimetry trends over the study period. In total, 7,189 dosimetry measurements were performed and analyzed, including 3,783 output and 3,406 ionization ratio measurements. The average deviations (M) of the outputs and ionization ratios from baseline are less than 0.65%, within the required 2% tolerance. The average SD is less than 0.54%. No noticeable trends in either M or SD were observed over the 7 years. The maximum deviation was greatest in 2019 (3.65%), although the number of outliers, representing occasional deviations above the tolerance values, was greatest in 2022 (24).
Figure 6.
Absolute deviations of monthly outputs and ionization ratios averaged over all energies on all linear accelerators per year. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Mechanical QA
Figure 7 shows the percentage of missed and failed mechanical QA results. Although the number of failed results is low in all years, there is a noticeable growth in reported failures from 0.3% in 2017 to 1.9% in 2021, followed by a lower value of 1.2% in 2022. The trend for missed measurements is nearly the reverse, with a higher percentage (>5%) of missed results in 2016 to 2018 and far fewer (0.1%) after 2020. These results may indicate either true trends in machine failures or a more careful approach to QA; at this point, there is not enough information to distinguish between these possibilities.
Figure 7.
Percentage of missed (blue) or reported as failed (red) mechanical QA measurements, averaged over all linear accelerators per year. Abbreviation: QA = quality assurance. (A color version of this figure is available at 10.1016/j.adro.2024.101469.)
Discussion
The current study demonstrates the impact of implementing a formal automated QA Dashboard to track QA completeness and compliance in a large, complex radiation oncology department. This dashboard provides a centralized resource for visualizing and monitoring QA progress. Specific features of this software include the flexibility to accommodate data collection from multiple data sources for various technologies and ease of access for efficient peer review. The software also provides access to policies and procedures and sends email reminders to the individuals responsible for performing QA. We demonstrate an increase in compliance after implementing the automated QA Dashboard. QA measurement quality also showed improving trends, with more measurements closer to baseline (reduced variance) after the QA Dashboard implementation.
The value of software for tracking post-service QA and logging service events during accreditation has been demonstrated with commercially available QA tracking software.15 Similar studies have investigated the value of a centralized database.16 In addition, both in-house and commercial QA automation solutions are available to streamline the preparation, measurement, and analysis of QA tasks.19-21 Although powerful tools have been developed to improve the efficiency and accuracy of QA measurements and to streamline QA procedures, LINAC QA compliance remains challenging, as indicated by recent APEx reports.25 The current study extends previously demonstrated approaches by providing an all-inclusive platform for QA reporting across a diverse radiation therapy infrastructure. The QA Dashboard provides a simple visualization to assess QA progress and drastically improved compliance without modifying existing QA processes and procedures. Although not demonstrated in this study, improvements in routine QA may be expected to lead to better machine performance and improved clinical care, similar to the demonstrated improvements in clinical care during accreditation visits.24
Compliance and completeness are only one aspect of a high-quality QA program. The quality of the measurements was assessed through a variation analysis to determine whether the implementation of the QA Dashboard had an impact. After the QA Dashboard was implemented, results were closer to baseline and showed smaller variance across many QA categories, especially for newly implemented procedures. For the well-established monthly dosimetry output measurements, there was no noticeable change in the variation of dosimetric beam characteristics after the implementation of the QA Dashboard. Imaging QA was a relatively new addition to the overall program, and the QA Dashboard helped reduce the variance that is often observed after the introduction of new measurements for new technology. Although it is difficult to identify the exact reasons for this effect, the data suggest that as compliance increased and the procedures were performed more regularly and on time, staff became more proficient with the QA and less likely to make mistakes. It is also possible that the formal peer review facilitated by the QA Dashboard created a more critical approach to the results among the QA team members. Because we were able to implement this software for 7 LINACs simultaneously, our study is minimally affected by the performance of individual LINACs. In addition, these results are largely independent of the behaviors or habits of specific personnel because QA was performed at our institution over the period of 2016 to 2022 by 41 participants, including QMPs, residents, and MPAs, on a rotating basis.
Although it is reasonable to expect that reminders and monitoring could improve compliance, before implementing the QA Dashboard, it was unclear how large an increase to expect. Understanding the extent of the improvement is important when allocating resources and dedicated personnel. The results of our study demonstrate that with comprehensive implementation of automated monitoring tools, the compliance may reach >99% even in a large and diverse center.
The decision to implement commercial software or to develop software in-house will depend on the design requirements and on the specific needs and available resources of each institution. Commercial software may require redesign of existing processes and data entry, whereas in-house software can use the existing organization of records, with resources instead dedicated to software design and maintenance. Commercial software has the benefit of not requiring software development expertise and of a lower maintenance overhead; however, it may lack the flexibility and adaptability that in-house solutions provide.
The current study investigated the cumulative effect of the QA compliance dashboard, which includes several important components: a centralized interface for visualization of QA completion, centralized access to QA results and analysis, centralized access to policies and procedures, user-friendly tools for peer review, and automated reminders. Although the dashboard was designed to meet institutionally important requirements, this design imposes some limitations on the analysis and, potentially, on extension to other clinics. The contribution of individual components of the dashboard design to the increased compliance cannot be estimated. The skills and experience of each individual participating in the QA program were not evaluated or quantified. In addition, the optimal pattern of reminders was not determined, nor was user satisfaction (among the QMPs and MPAs conducting QA) quantified. We also did not address the approaches taken to minimize QA delays caused by broken or missing equipment, unavailable staff, and other logistical issues.
As technologies continue to develop, new QA procedures will need to be implemented and monitored to maintain high-quality and safe radiation oncology practices. Implementing information technology and software solutions is identified in many hospital-based practices as a mechanism for improving and enabling a safety culture.33 This study has demonstrated how automated monitoring of QA completion can both increase compliance and decrease the variation of reported QA results. For institutions currently monitoring monthly QA compliance manually, this study provides evidence that the resources required to implement an automated monitoring system are likely justified by the resulting increase in compliance to nearly 100%.
Conclusions
This study demonstrates the high impact of an automated QA Dashboard on QA compliance and on reducing variance in QA measurements. These findings highlight the importance of broad oversight of all components of the QA program and demonstrate that improving access to a QA completion summary may benefit centers struggling to maintain high QA compliance. These results may help departments justify the resources required to develop or implement QA compliance tracking software and ultimately provide better clinical care.
Footnotes
Sources of support: This work had no specific funding.
Disclosures: Dr Lyatskaya reports honoraria and travel expenses from American Society for Radiation Oncology APEx Radiation Oncology Practice Accreditation program. No other disclosures were reported.
Research data are available in an institutional repository and will be shared upon request to the corresponding author.
Supplementary material associated with this article can be found in the online version at doi:10.1016/j.adro.2024.101469.
Appendix. Supplementary materials
References
1. Bekelman JE, Deye JA, Vikram B, et al. Redesigning radiotherapy quality assurance: Opportunities to develop an efficient, evidence-based system to support clinical trials—report of the National Cancer Institute Work Group on Radiotherapy Quality Assurance. Int J Radiat Oncol Biol Phys. 2012;83:782-790. doi: 10.1016/j.ijrobp.2011.12.080.
2. Weber DC, Poortmans PM, Hurkmans CW, Aird E, Gulyban A, Fairchild A. Quality assurance for prospective EORTC radiation oncology trials: The challenges of advanced technology in a multicenter international setting. Radiother Oncol. 2011;100:150-156. doi: 10.1016/j.radonc.2011.05.073.
3. Fairchild A, Bar-Deroma R, Collette L, et al. Development of clinical trial protocols involving advanced radiation therapy techniques: The European Organisation for Research and Treatment of Cancer Radiation Oncology Group approach. Eur J Cancer. 2012;48:1048-1054. doi: 10.1016/j.ejca.2012.02.008.
4. Weber DC, Tomsej M, Melidis C, Hurkmans CW. QA makes a clinical trial stronger: Evidence-based medicine in radiation therapy. Radiother Oncol. 2012;105:4-8. doi: 10.1016/j.radonc.2012.08.008.
5. Ibbott GS. The need for, and implementation of, image guidance in radiation therapy. Ann ICRP. 2018;47:160-176. doi: 10.1177/0146645318764092.
6. Kutcher GJ, Coia L, Gillin M, et al. Comprehensive QA for radiation oncology: Report of AAPM Radiation Therapy Committee Task Group 40. Med Phys. 1994;21:581-618. doi: 10.1118/1.597316.
7. Nath R, Biggs PJ, Bova FJ, et al. AAPM code of practice for radiotherapy accelerators: Report of AAPM Radiation Therapy Task Group No. 45. Med Phys. 1994;21:1093-1121. doi: 10.1118/1.597398.
8. Bissonnette JP, Balter PA, Dong L, et al. Quality assurance for image-guided radiation therapy utilizing CT-based technologies: A report of the AAPM TG-179. Med Phys. 2012;39:1946-1963. doi: 10.1118/1.3690466.
9. McCullough SP, Alkhatib H, Antes KJ, et al. AAPM Medical Physics Practice Guideline 2.b.: Commissioning and quality assurance of X-ray-based image-guided radiotherapy systems. J Appl Clin Med Phys. 2021;22:73-81. doi: 10.1002/acm2.13346.
10. Smith K, Balter P, Duhon J, et al. AAPM Medical Physics Practice Guideline 8.a.: Linear accelerator performance tests. J Appl Clin Med Phys. 2017;18:23-39. doi: 10.1002/acm2.12080.
11. Hanley J, Dresser S, Simon W, et al. AAPM Task Group 198 Report: An implementation guide for TG 142 quality assurance of medical accelerators. Med Phys. 2021;48:e830-e885. doi: 10.1002/mp.14992.
12. Klein EE, Hanley J, Bayouth J, et al. Task Group 142 report: Quality assurance of medical accelerators. Med Phys. 2009;36:4197-4212. doi: 10.1118/1.3190392.
13. Gawande A. The Checklist Manifesto: How to Get Things Right. 1st ed. New York: Metropolitan Books; 2010.
14. Leuenberger R, Kocak R, Jordan DW, George T. Medical physics: Quality and safety in the cloud. Health Phys. 2018;115:512-522. doi: 10.1097/HP.0000000000000894.
15. Angers C, Bottema R, Buckley L, et al. Streamlining regulatory activities within radiation therapy departments using QATrack. Health Phys. 2019;117:306-312. doi: 10.1097/HP.0000000000001119.
16. Tang G, LoSasso T, Chan M, Hunt M. Impact of a centralized database system on radiation therapy quality assurance management at a large health care network: 5 years' experience. Pract Radiat Oncol. 2022;12:e434-e441. doi: 10.1016/j.prro.2022.03.003.
17. Holdsworth C, Kukluk J, Molodowitch C, et al. Computerized system for safety verification of external beam radiation therapy planning. Int J Radiat Oncol Biol Phys. 2017;98:691-698. doi: 10.1016/j.ijrobp.2017.03.001.
18. Damato AL, Devlin PM, Bhagwat MS, et al. Independent brachytherapy plan verification software: Improving efficacy and efficiency. Radiother Oncol. 2014;113:420-424. doi: 10.1016/j.radonc.2014.09.015.
19. Eckhause T, Al-Hallaq H, Ritter T, et al. Automating linear accelerator quality assurance. Med Phys. 2015;42:6074-6083. doi: 10.1118/1.4931415.
20. Skinner LB, Yang Y, Hsu A, Xing L, Yu AS, Niedermayr T. Factor 10 expedience of monthly linac quality assurance via an ion chamber array and automation scripts. Technol Cancer Res Treat. 2019;18. doi: 10.1177/1533033819876897.
21. Schmidt MC, Raman CA, Wu Y, et al. Application programming interface guided QA plan generation and analysis automation. J Appl Clin Med Phys. 2021;22:26-34. doi: 10.1002/acm2.13288.
22. American College of Radiology (ACR). Radiation Oncology Accreditation Program Requirements. Reston, VA: ACR; 2012. Available at: http://www.acr.org/~/media/ACR/Documents/Accreditation/RO/Requirements.pdf. Accessed March 13, 2023.
23. American Society for Radiation Oncology (ASTRO), Accreditation Program for Excellence (APEx). The APEx Accreditation Program Standards. Available at: https://www.astro.org/ASTRO/media/ASTRO/Daily%20Practice/PDFs/APExStandards.pdf. Accessed March 13, 2023.
24. Bogh SB, Falstie-Jensen AM, Hollnagel E, Holst R, Braithwaite J, Johnsen SP. Improvement in quality of hospital care during accreditation: A nationwide stepped-wedge study. Int J Qual Health Care. 2016;28:715-720. doi: 10.1093/intqhc/mzw099.
25. Dawes SL. ASTRO Quality Improvement Report. Available at: https://issuu.com/aapmdocs/docs/4802?mode=embed&viewMode=doublePage&backgroundColor=353535. Accessed October 29, 2023.
26. American College of Radiology (ACR). ACR–AAPM Technical Standard for the Performance of Radiation Oncology Physics for External-Beam Therapy. Reston, VA: ACR; 2020. Available at: https://www.acr.org/-/media/ACR/Files/Practice-Parameters/Ext-Beam-TS.pdf. Accessed March 13, 2023.
27. Malkoske KE, Nielsen MK, Tantot L, et al. COMP report: CPQR technical quality control guidelines for radiation treatment centers. J Appl Clin Med Phys. 2018;19:44-47. doi: 10.1002/acm2.12295.
28. Kirkby C, Ghasroddashti E, Angers CP, Zeng G, Barnett E. COMP report: CPQR technical quality control guideline for medical linear accelerators and multileaf collimators. J Appl Clin Med Phys. 2018;19:22-28. doi: 10.1002/acm2.12236.
29. Benedict SH, Yenice KM, Followill D, et al. Stereotactic body radiation therapy: The report of AAPM Task Group 101. Med Phys. 2010;37:4078-4101. doi: 10.1118/1.3438081.
30. Keall PJ, Mageras GS, Balter JM, et al. The management of respiratory motion in radiation oncology report of AAPM Task Group 76. Med Phys. 2006;33:3874-3900. doi: 10.1118/1.2349696.
31. Al-Hallaq HA, Cervino L, Gutierrez AN, et al. AAPM task group report 302: Surface-guided radiotherapy. Med Phys. 2022;49:e82-e112. doi: 10.1002/mp.15532.
32. Grattan MW, Hounsell AR. Analysis of output trends from Varian 2100C/D and 600C/D accelerators. Phys Med Biol. 2011;56:N11-N19. doi: 10.1088/0031-9155/56/1/N02.
33. Singer SJ, Vogus TJ. Reducing hospital errors: Interventions that build safety culture. Annu Rev Public Health. 2013;34:373-396. doi: 10.1146/annurev-publhealth-031912-114439.