Abstract
Purpose:
To report on the impact of using a centralized database system for major equipment QA at a large institution.
Methods:
A centralized database system was implemented for radiotherapy machine QA across the 6 campuses of our institution, covering 11 CTs and 22 linacs. The database system was customized to manage monthly and annual CT and linac QA. This includes providing the same set of QA procedures across the enterprise, digitally storing all measurement records, and generating trend analyses. The effectiveness of the database system relative to conventional methods (i.e., paper forms) was quantified by changes in QA test compliance and by staff perceptions of the efficiency of data retrieval and analysis. An anonymized questionnaire was provided to physicists enterprise-wide to assess workflow changes.
Results:
With the implementation of the database system, the compliance of QA test completion improved from 80% to >99% for the entire institution. This is consistent with the 56% of physicists who found the database system helpful in guiding them through QA, while 25% found the contrary and 19% reported no difference (n=16). Meanwhile, 40% of physicists reported longer times to record data using the database system compared to conventional methods, while another 40% reported the opposite. 87% and 80% of physicists found the database more efficient for analyzing and retrieving previous data, respectively. This was also reflected by the shorter time taken to generate year-end QA statistics using the software (5 vs. 30 min per linac). Overall, 94% of physicists preferred the centralized database system over conventional methods and endorsed continued use of the system.
Conclusions:
A centralized database system is useful and can improve the effectiveness and efficiency of QA management in a large institution. With consistent data collection and proper data storage using a database, high-quality data can be obtained for FMEA/TG 100.
Introduction
With the rapid advance of technology, the need for and development of radiation therapy QA has grown vastly in the past two decades. The complexity of localization imaging and radiation treatments, especially for stereotactic radiosurgery (SRS) and stereotactic body radiation therapy (SBRT), necessitates stringent requirements on mechanical and dosimetric accuracy and thorough documentation that these requirements are met. Quality assurance (QA) guidelines such as AAPM Task Group (TG) Reports 142/198 and Medical Physics Practice Guidelines (MPPG) 2a/2b/8a provide comprehensive sets of QA tests for the different components of linacs, including onboard kV and MV imaging systems, while TG 66 provides a set of recommended QA tests for CT simulation units1-6. The extent of the QA tests can result in a large number of data records. For example, there are 20–25 tests for monthly linac QA and about 35 tests for annual linac QA in our institution. Within each test, there can be multiple records, such as output checks for the different beam energies. Consequently, a large amount of data accumulates over time.
AAPM TG 100 provides guidance on efficiently managing the increasing QA effort for complex radiation treatments using risk analysis methods7. However, to achieve effective and accurate risk analysis, data should be collected systematically according to the same standards. In recent years, there has been an increasing trend of hospital mergers, acquisitions, and expansions, resulting in large healthcare networks8,9 that often exhibit heterogeneous QA practices. Managing and maintaining a good standard of quality of care for a large institution or network is not a simple task.
QA management can be aggravated by paper forms used for data records. While paperless QA is not a new concept and is adopted in most clinics, it is worthwhile to note that it can improve workflow and management efficiency in addition to being environmentally friendly10. The simplest form of paperless QA is a spreadsheet, but technically it is not a database and can become difficult to manage especially with complex calculations and commands.
Dedicated QA database software resolves most of these issues. Data comparison to reference values and data trending are also more streamlined with such software, leading to more efficient QA management. At the time of our search for database management software, only four commercial products were available. All of these products provide a centralized database server for QA data storage, basic image analysis (such as MLC tests and MV/kV image quality tests), as well as data trending. In this report, we describe the planning and implementation of a centralized QA database system for a large multi-center institution, primarily focusing on linac and CT QA. As the objective of this report is not to promote a specific software product, but rather to provide readers with a fair assessment of the general attributes and functioning of a QA database, the vendor of the QA database system implemented at our institution is not revealed here, and we do not focus on features of our specific system.
Methods
Equipment
At the time of implementing our first centralized QA database, there were six radiation treatment campuses with a total of 22 linacs, 11 simulation units (CT and PET/CT), and a single enterprise-wide commercial treatment planning system (TPS) utilizing a single database at our institution. All linacs were grouped into four machine families, where each machine family was associated with a machine and MLC type (Table 1). All machines within a single machine family were tuned and verified to be dosimetrically equivalent (i.e., machines interchangeable for treatment without plan modifications/adjustments) based on an inter-comparison of water tank measurements (PDD, profiles, and output factors) for all photon and electron beams11. The MLCs of all machines within each machine family were calibrated to achieve the same dosimetric leaf gap (DLG). This approach allowed us to build a single beam model for each machine family. For each linac family, the baseline data within the QA system was referenced to the output of the TPS. Similarly, there is only one CT calibration curve (i.e., Hounsfield unit to electron density conversion table) in the TPS for all 11 simulation units in the entire enterprise. The use of a single CT calibration table was validated during the commissioning phase of the TPS, where the same CT calibration phantom was scanned in all 11 CT and PET/CT units.
Table 1.
Linac and CT family categories.
Machine type | MLC type | Total number of machines |
---|---|---|
Varian 6EX | Millennium 120 | 3 |
Varian C-series/Trilogy | Millennium 120 | 9 |
Varian TrueBeam | Millennium 120 | 9 |
Varian TrueBeam STx | HD MLC | 1 |
Philips Big Bore CT | - | 7 |
GE PET/CT | - | 4 |
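Because a single CT calibration curve is shared by all simulation units, the annual check at each campus reduces to comparing measured HU values of a calibration phantom against the enterprise table. The sketch below illustrates this idea under stated assumptions: the HU/electron-density points, insert names, and 2% tolerance are placeholders of our own, not the institution's actual TPS calibration table or clinical tolerance.

```python
import numpy as np

# Illustrative only: placeholder HU-to-relative-electron-density (RED) points,
# not the institution's actual TPS calibration table.
HU_POINTS = np.array([-1000, -700, -90, 0, 60, 230, 900, 1500])
RED_POINTS = np.array([0.00, 0.29, 0.95, 1.00, 1.05, 1.14, 1.51, 1.86])

def hu_to_red(hu: float) -> float:
    """Linearly interpolate relative electron density from a measured HU value."""
    return float(np.interp(hu, HU_POINTS, RED_POINTS))

def check_scanner_against_table(measured_hu: dict[str, float],
                                nominal_red: dict[str, float],
                                tolerance: float = 0.02) -> dict[str, bool]:
    """Flag phantom inserts whose derived RED deviates from nominal beyond tolerance."""
    return {insert: abs(hu_to_red(hu) - nominal_red[insert]) <= tolerance
            for insert, hu in measured_hu.items()}

# Example: one campus CT checked against the single enterprise calibration curve.
nominal = {"water": 1.00, "lung": 0.29, "bone": 1.51}
measured = {"water": 2, "lung": -705, "bone": 905}
print(check_scanner_against_table(measured, nominal))  # all True here
```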
Standardization of QA procedures
The CT and linac QA program was originally developed at Campus 1 following AAPM recommendations1-3,6,12 and departmental standards. Prior to the enterprise-wide centralized QA database implementation, the monthly and annual QA tests for CT and linacs were recorded using paper forms. Although the same forms were distributed across the institution, the methods used to perform some QA tests varied among campuses, leading to different definitions and values for baseline data. This had an impact on managing the QA data, complicated the data review when tracking and comparing machines, and in limited cases led to differences in triggers for corrective action. To standardize the QA program prior to the database implementation, a committee was formed with at least one Qualified Medical Physicist (QMP)13 (i.e., a physicist with a valid state license and board certification) actively involved in equipment QA from each of the six campuses to develop consensus QA procedures.
Software selection
Four commercial radiotherapy QA software packages were evaluated prior to purchase. An evaluation license was acquired for each of the four products for testing. The testing involved simulating routine operations by setting up monthly linac QA forms based on the consensus procedures for multiple machines and entering data for analysis and trending. Each QA software package was evaluated based on several factors at that time: user friendliness, versatility, flexibility, robustness, anticipated service, future development, and budget. The evaluation of each package was presented to all QA physicists (QMPs only), identifying the pros and cons of each software package. The selection was narrowed to two systems, and the final selection was based on the highest number of votes from the committee and all QA physicists.
System setup
Following the policy of our institution, a thorough risk assessment of the software was performed by the information security department at the time of purchase. Documentation of the system structure and a disaster recovery plan were created. The software was installed at all linac and CT/PET/CT console areas, with additional installations in the common physics areas.
The software allows different categories of user rights, and each category can be customized accordingly. While all user rights categories allow full read access to the software (i.e., viewing data trends), write access (i.e., modifying and adding data) is limited. To maintain a well-controlled environment, only two physicists were assigned “administrative rights”, which enable full access to the software including system setup. QMPs were assigned “physicist rights”, which allow all access to the software except for system setup. Residents and physicists without board certification or a state license (i.e., not QMPs) were assigned “resident rights”, allowing only data entry for temporary storage. A QMP with “physicist rights” must subsequently review such data before approving and saving it permanently to the database. This also acts as a virtual co-sign of a non-QMP’s work by a QMP, as per state regulations.
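A minimal sketch of this access model is given below. The role names and permission sets mirror the description above, but the classes and functions are our own illustrative constructs, not the vendor's implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Role(Enum):
    ADMIN = auto()      # full access, including system setup (two physicists)
    PHYSICIST = auto()  # QMP: all access except system setup
    RESIDENT = auto()   # non-QMP: data entry for temporary storage only

PERMISSIONS = {
    Role.ADMIN:     {"read", "write", "approve", "setup"},
    Role.PHYSICIST: {"read", "write", "approve"},
    Role.RESIDENT:  {"read", "write_pending"},
}

@dataclass
class QARecord:
    machine: str
    test: str
    value: float
    entered_by: str
    approved: bool = False   # only approved records are stored permanently

def enter_record(role: Role, record: QARecord) -> QARecord:
    """QMP entries are saved directly; resident entries are staged for review."""
    if "write" in PERMISSIONS[role]:
        record.approved = True
    elif "write_pending" in PERMISSIONS[role]:
        record.approved = False          # held until a QMP co-signs
    else:
        raise PermissionError("role has no data-entry rights")
    return record

def approve_record(role: Role, record: QARecord) -> QARecord:
    """The QMP review acts as a virtual co-sign of a non-QMP's work."""
    if "approve" not in PERMISSIONS[role]:
        raise PermissionError("only a QMP or administrator may approve records")
    record.approved = True
    return record

# Example: a resident stages an output measurement, then a QMP co-signs it.
staged = enter_record(Role.RESIDENT, QARecord("TB1", "Output 6X", 1.002, "resident_a"))
signed = approve_record(Role.PHYSICIST, staged)
```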
The six campuses of our institution were set up separately in the software, such that each campus was its own “clinic”. QA physicists at the individual campuses were assigned to their respective “clinic” thereby limiting access to the machines within that clinic only. QA physicists working in or overseeing multiple campuses were assigned to multiple clinics.
The roll-out of the software was planned in progressive phases. The first phase involved linac monthly QA only, followed by linac annual QA and CT monthly/annual QA over the course of 3 years, and brachytherapy daily QA and TPS QA in the 4th year. A master template, essentially a QA form, was generated for each QA procedure. Procedures based on the committee’s consensus work were added to the templates as instructions. The QA tests in the master templates were finalized based on the standardization process described above. For linac QA, the reference data (i.e., baselines) and tolerance levels used in the master templates were primarily machine family-based (further described in the Results section). The master templates were reviewed and approved by the committee before being applied to all the machines. Because the master templates were machine family-based, only four master templates were needed for monthly QA and four for annual QA for all 22 linacs. For the CT and PET/CT units, only a single master template was generated for monthly QA and one for annual QA, as all simulation units follow the same set of CT tests (the PET QA component for PET/CT was not included). The same applies to brachytherapy daily QA and TPS QA. It is worth mentioning that the software allows users to store images in a variety of formats (e.g., jpg, tiff, pdf). This provides the flexibility to attach images to the QA data record for additional reference. This may be useful, for example, in high dose rate brachytherapy daily QA, where source position checks are measured with film.
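The machine-family-based template concept can be sketched as below. The data structures, test names, and tolerance values are illustrative placeholders of our own, not the software's internal representation or our actual clinical tolerances.

```python
from dataclasses import dataclass

@dataclass
class QATest:
    name: str
    baseline: float      # reference value (e.g., from the family TPS beam model)
    tolerance: float     # allowed deviation from baseline
    unit: str = "%"

# One master template per machine family; only four are needed for monthly
# linac QA because all machines in a family share baselines and tolerances.
MONTHLY_TEMPLATES = {
    "TrueBeam": [
        QATest("Output constancy 6X", baseline=0.0, tolerance=2.0),
        QATest("Output constancy 6E", baseline=0.0, tolerance=2.0),
        QATest("Laser localization", baseline=0.0, tolerance=1.5, unit="mm"),
    ],
}

def instantiate_for_machine(family: str, machine_id: str) -> dict:
    """Propagate the family master template to an individual machine's QA form."""
    return {"machine": machine_id,
            "tests": [QATest(t.name, t.baseline, t.tolerance, t.unit)
                      for t in MONTHLY_TEMPLATES[family]]}

# Example: all nine TrueBeams receive identical test definitions from one master.
forms = [instantiate_for_machine("TrueBeam", f"TB{i}") for i in range(1, 10)]
```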
1-year and 3-year look-back evaluations
Evaluations of the efficiency and effectiveness of the centralized database system were performed at one year and three years after its implementation, at which points monthly and annual QA for linacs and for CT/PET/CT had been available in the system since their respective roll-out phases. Effectiveness was quantified one year post-implementation by the change in QA compliance, and efficiency by the time needed to compile year-end QA statistics. Qualitative evaluations were obtained three years post-implementation using an anonymized questionnaire sent to all QA physicists (QMPs only). Considering that the highest usage of the database system is linac monthly QA, the questionnaire focused on linac monthly QA only.
Results
Standardizing QA procedures and the effect on QA compliance
In-depth discussions comparing details of how QA tests were performed at each campus formed the basis of the committee’s consensus work. Table 2 shows the relative uniformity of QA procedures performed across the entire enterprise for a 1-year period prior to the development of uniform standards and the implementation of the database. The relative uniformity was defined as the completion rate at each campus of all QA procedures in place at the time at Campus 1, where Campus 1 was selected as the standard for the uniformity determination (a short computational sketch of this metric follows Table 2). Note that the absolute QA completion rate at Campus 1 was >98%, where the incomplete tests were due to machine error or human error (i.e., data was not recorded although the measurement was done). Deviations among procedures were mainly due to variations in the testing equipment available at different campuses, although the development of consensus recommendations was further complicated by the fact that the geographic locations of the campuses span three different regulatory agencies. Note that the zero compliance in CT monthly QA for Campuses 2-5 was due not only to different state regulations but also to differences in test interpretation. Monthly QA was done for all CT/PET/CT units at these campuses, but the QA procedures were entirely different from those at Campus 1, which was regarded as the enterprise standard.
Table 2.
Relative uniformity of QA program compliance across campuses 12 months prior to implementation of the centralized database†. Differences in QA procedures, equipment, and baseline data for different campuses all contribute to non-uniformity compared to Campus 1.
 | | Campus 1 | Campus 2 | Campus 3 | Campus 4 | Campus 5 | Campus 6 |
---|---|---|---|---|---|---|---|
Linac | Monthly QA | 100% | 93% | 93% | 93% | 89% | 86% |
 | Annual QA | 100% | 94% | 96% | 98% | 94% | 83% |
CT, PET/CT | Monthly QA | 100% | 0%* | 0% | 0% | 0% | 100% |
 | Annual QA | 100% | 80% | 100% | 100% | 100% | 100% |
† >99% uniformity was achieved 12 months after centralized database implementation.
* Different state regulation requirements.
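The relative uniformity metric referenced above is simply the fraction of the Campus 1 reference procedure set completed at another campus. A minimal sketch follows; the procedure names and counts are hypothetical and are used only to illustrate the calculation.

```python
# Illustrative computation of "relative uniformity": the completion rate of the
# Campus 1 (reference) QA procedure set at another campus.
def relative_uniformity(reference_procedures: set[str], campus_completed: set[str]) -> float:
    done = reference_procedures & campus_completed
    return len(done) / len(reference_procedures)

reference = {f"monthly_test_{i}" for i in range(1, 29)}        # hypothetical 28 reference procedures
campus_6 = reference - {"monthly_test_3", "monthly_test_7",
                        "monthly_test_12", "monthly_test_20"}  # 4 procedures not performed
print(f"{relative_uniformity(reference, campus_6):.0%}")       # -> 86%
```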
The committee identified QA equipment that was mutually available across the entire enterprise and budgeted for future purchases of equipment that was previously shared or not available at every campus. Furthermore, in anticipation that most modern QA database management software integrates quantitative image analysis, the committee decided to retire qualitative image quality test objects such as the Varian Las Vegas phantom (Varian Medical Systems, Palo Alto, CA) and the Leeds phantom (Leeds Test Objects Ltd, North Yorkshire, UK) in favor of an image quality test phantom that can be processed in the database software for quantitative analysis. All of the commercial database software systems evaluated prior to purchase were compatible with nearly all commercially available imaging test phantoms from different vendors. This gave the committee sufficient flexibility to choose an image test phantom best suited for the department. All equipment was assessed based on ease of use, precision, efficiency, and robustness prior to final purchase decisions.
For linac QA, the consensus program developed by the group included a mix of AAPM TG 142, AAPM MPPG 2a, and MPPG 8a tests1,3,6. Where ambiguities were apparent in some of the recommendations, there was extensive discussion leading to consensus recommendations. For example, in TG 142, the definition of “baseline performance” for some QA procedures had been interpreted differently among campuses, with some using data from the planning system and others using commissioning and/or annual QA data. These different interpretations contributed to non-uniformity in our QA programs and complicated cross-comparison of QA data. Furthermore, in some cases it was not appropriate to use the same baseline data for all machines. For example, although all of our linacs within each machine type were deemed dosimetrically equivalent based on data collected using identical measurement techniques, the TG 142 tolerance for annual flatness constancy was found to be too stringent when TPS data were used as the baseline for all machines. Note that all of our photon beams satisfy the criteria in TG 142, but a 1% flatness constancy tolerance is rather stringent for electrons, especially with Monte Carlo dose calculation, where there is an intrinsic statistical uncertainty. Therefore, the commissioning data from each individual machine were deemed more appropriate, and this approach became part of the consensus program. For CT and PET/CT, the creation of a unified QA program was somewhat simpler once the differences in state regulations, vendor QA contributions, and the availability of equipment were resolved.
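The baseline choice only matters insofar as the constancy check is computed against it. The sketch below assumes a simple percent-deviation definition of constancy; the measured and baseline flatness values are placeholders, and the 1% tolerance is quoted from the discussion above for illustration.

```python
# Hedged sketch of a baseline-constancy check: photon flatness is compared against
# the family (TPS/commissioning) baseline, electrons against each machine's own
# commissioning data. Values shown are placeholders.
def constancy_deviation(measured: float, baseline: float) -> float:
    """Percent deviation of a measured flatness value from its baseline."""
    return 100.0 * (measured - baseline) / baseline

def within_tolerance(measured: float, baseline: float, tolerance_pct: float) -> bool:
    return abs(constancy_deviation(measured, baseline)) <= tolerance_pct

# 6 MV photon flatness against the family baseline, 1% constancy tolerance.
print(within_tolerance(measured=103.2, baseline=103.0, tolerance_pct=1.0))  # True

# 9 MeV electron flatness against that machine's own commissioning baseline;
# Monte Carlo statistical noise can push the deviation past a strict 1% tolerance.
print(within_tolerance(measured=104.8, baseline=103.5, tolerance_pct=1.0))  # False -> investigate
```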
Upon completion of the committee process, the centralized database system was configured following the consensus QA procedures for monthly and annual linac and CT QA. Subsequently, all physicists used the system to record QA measurements, perform analyses, and retrieve data. After implementing the centralized QA database, this QA program was followed strictly by all campuses and rectified the non-uniformity of QA practices shown in Table 2. All campuses subsequently achieved >99% QA test compliance for the first year period after centralized database implementation. Note that daily linac and CT QA are not part of our database software implementation at this time. Doing so would require extensive changes to processes currently in place for almost 200 therapists enterprise-wide. Therefore, our intention is to first evaluate workflow and logistics using the database software with QA physicists before a wider implementation.
Post-implementation of a centralized QA database system
To assess users’ perception of the overall utility of the database system (i.e., the efficiency of QA workflow, analyses, and data retrieval), an anonymized questionnaire was provided to QMPs (n=19) enterprise-wide to compare the database system to conventional QA data management methods (i.e., paper forms, manual analyses, and manual data retrieval). Sixteen of the 19 physicists (84%) completed the questionnaire. 56% of physicists thought the database system helped to guide them through the QA procedures (Figure 1). This was reflected in an improvement in QA compliance from an average of 80% to >99% for the entire institution after the implementation of the database system. However, 40% of physicists reported longer times to record measurement data (Figure 2). This may be related to the fact that 56% of physicists felt that the user interface could be improved. For data analysis (Figure 3) and retrieval (Figure 4), 87% and 80% of physicists, respectively, agreed that the database system was more efficient than manual methods. This was consistent with the finding that the time needed to generate annual year-end QA analyses was significantly shortened from 30 minutes to 5 minutes per machine with the database software. When considering all aspects of the database system, 94% of physicists felt the centralized database represented an improvement and would prefer not to return to conventional methods (Figure 5).
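As an illustration of the kind of year-end statistics the database generates automatically, a minimal sketch is given below; the record structure, field names, and deviation values are assumptions made for this example, not the software's actual data model.

```python
import statistics
from collections import defaultdict

# Hypothetical one-year record set for a single machine and test.
records = [
    {"machine": "TB1", "month": m, "test": "Output 6X", "deviation_pct": d}
    for m, d in zip(range(1, 13),
                    [0.3, 0.5, -0.2, 0.4, 0.1, 0.6, -0.1, 0.2, 0.8, 0.4, 0.3, 0.5])
]

def year_end_summary(records: list[dict]) -> dict:
    """Per-machine, per-test completion count, mean, spread, and worst-case deviation."""
    grouped = defaultdict(list)
    for r in records:
        grouped[(r["machine"], r["test"])].append(r["deviation_pct"])
    return {key: {"n": len(vals),
                  "mean": round(statistics.mean(vals), 2),
                  "stdev": round(statistics.stdev(vals), 2),
                  "max_abs": max(abs(v) for v in vals)}
            for key, vals in grouped.items()}

print(year_end_summary(records))
```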
Figure 1.
Evaluation of the database system in terms of its role in guiding physicists through the QA process.
Figure 2.
Evaluation of the database system in terms of its role in the efficiency of recording data.
Figure 3.
Evaluation of the database system in terms of its role in the efficiency of data analysis.
Figure 4.
Evaluation of the database system in terms of its role in the efficiency of data retrieval.
Figure 5.
Overall comparison of the database system and conventional paper forms.
Discussion
With the increasing complexity of radiation treatments, the overall QA workload of physicists increases accordingly. We found that a centralized QA database system generally improves the efficiency of QA management and offers particular advantages to large multi-center operations. To maximize efficiency, a centralized QA database works best with a standardized QA program established at the outset through a collective physics effort. The initial review of our existing QA procedures and the development of a consensus program underscored the importance of equipment selection, interpretation of test procedures and data, staff expertise, and communication with the entire team for the consistency and ultimate effectiveness of a QA program in a large, distributed center. The QA database system provides easy access to QA data with trending plots, which helps physicists resolve certain machine problems more efficiently. For example, drifts from baselines over time can be easily observed. The data trending plots also facilitate a more efficient data review process by managers.
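A minimal sketch of such a trending plot is shown below, assuming matplotlib is available; the monthly deviation values and the ±2% tolerance lines are fabricated placeholders used only to show how a drift toward tolerance becomes visible.

```python
import matplotlib.pyplot as plt

# Fabricated monthly output-constancy data for one machine (not real measurements).
months = list(range(1, 13))
output_dev = [0.1, 0.2, 0.2, 0.4, 0.5, 0.6, 0.8, 0.9, 1.1, 1.2, 1.4, 1.6]  # % from baseline

fig, ax = plt.subplots()
ax.plot(months, output_dev, marker="o", label="TB1 6X output deviation")
ax.axhline(2.0, color="red", linestyle="--", label="±2% tolerance")
ax.axhline(-2.0, color="red", linestyle="--")
ax.set_xlabel("Month")
ax.set_ylabel("Deviation from baseline (%)")
ax.legend()
plt.show()  # a steady upward drift toward the tolerance line is easy to spot
```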
We believe the significant change in the overall QA compliance rate from 80% to >99% was not due solely to the consensus procedures but also to the implementation of the centralized database system. Prior to the implementation of the database system, identical QA forms were used across all campuses, but, since they were paper forms, unwarranted modifications or inaccurate data collection due to misinterpretation or inadvertent error were easy to make. These departures from standard procedures were not always fully apparent without a highly detailed review of the QA data. With database software, it became difficult for routine users to make any such changes, as the system environment was controlled by just two administrators. Meanwhile, skipped QA tests became more apparent, as missing data could be easily spotted by managers. Overall, the database system supported a more consistent effort by all team members to comply with and analyze QA tests. Of course, clear QA procedures and the availability of the standardized equipment needed to perform these tests are also important aspects of a strong QA program. Nonetheless, we have shown that, as medical physics QA practices evolve in quantity and complexity, a centralized database system is another important tool that facilitates the systematic collection of QA data, which is a crucial step for both routine regulatory and accreditation reviews and for failure modes and effects analyses (FMEA) as recommended by TG 100.
While the focus of this report is not on a specific commercially available database software, it is worthwhile to note that certain responses to our questionnaire may reflect deficiencies of our software itself rather than the working principle of a centralized database system. For example, 40% of physicists expressed that the database system was not as efficient as paper forms for recording data. The main reason was specific features of the software. Since some QA tests required more complex calculations with multiple calibration factors, raw data was first entered into a spreadsheet to calculate the relevant QA data point, which was then imported into the database system (an example of such a multi-factor calculation is sketched below). This particular step might not be necessary with other software; however, since QA database software is still somewhat immature, it is likely that no system will fully meet all the needs of a large multi-center operation. The QA database software used in this study was a first-generation system. Since then, multiple new software systems have become available on the market, and their functionalities are continuously evolving.

The selection of software is not to be taken lightly, as it is a substantial undertaking to determine the system that best meets the needs of the enterprise and to commission that system. For guidance, a list of desired features is strongly encouraged, as illustrated in Table 3. It can be helpful to assign relative rankings to the items on the list while being mindful that no software will completely fulfill the entire list to the extent expected or desired. When evaluating database software, the ease of use, versatility and configurability (such as test customization), and the capability of image analysis and the range of tests available (e.g., MLC tests and image quality tests with various phantoms) should be considered.

The ease of test building and adding/editing tests may also be important. Generally, test building is straightforward, with most (if not all) of the QA systems supplying a library of standard/generic tests based on national guidelines such as TG 142. Once a test template is built, it can be easily propagated to the different machines. However, adding or editing tests once the templates have been attached to the individual machines can be tedious, as the templates may no longer be linked to the master; i.e., the templates assigned to the individual machines become separate entities. Therefore, changes must be made on a machine-by-machine basis. This appears to be a database infrastructure limitation and, to the best of our knowledge, many but not all of the currently available commercial systems follow this scheme. Vendors of systems with this limitation should place a high priority on a near-term resolution, as this issue will become more problematic as the user base grows and facilities become larger.
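As a concrete illustration of the kind of multi-factor calculation mentioned above, a minimal sketch of a TG-51-style output check is given below. The correction-factor names follow TG-51, but all numerical values are placeholders and the helper function is ours, not part of any QA software.

```python
# Sketch of a multi-factor output calculation of the type pre-computed in a
# spreadsheet before entry into the database. All values are placeholders.
def output_per_mu(m_raw_nC: float, mu: float, p_tp: float, p_ion: float,
                  p_pol: float, p_elec: float, n_dw_cGy_per_nC: float,
                  k_q: float) -> float:
    """Dose per MU from a raw chamber reading and its chain of correction factors."""
    m_corr = m_raw_nC * p_tp * p_ion * p_pol * p_elec   # fully corrected charge (nC)
    return m_corr * n_dw_cGy_per_nC * k_q / mu          # cGy per MU

dose_per_mu = output_per_mu(m_raw_nC=18.40, mu=100.0, p_tp=1.012, p_ion=1.003,
                            p_pol=1.000, p_elec=1.000, n_dw_cGy_per_nC=5.4, k_q=0.992)
deviation_pct = 100.0 * (dose_per_mu - 1.000) / 1.000   # vs. a 1 cGy/MU reference output
print(f"{dose_per_mu:.4f} cGy/MU ({deviation_pct:+.2f}%)")  # ~1.0005 cGy/MU (+0.05%)
```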
Table 3.
Example list of desired features in database software.
Database desired features list |
---|
Ease of use |
Versatility and configurability (e.g. test customization) |
Capability and range of image analysis |
Flexibility of data trending |
Accessibility |
Multi-vendor device integration |
Ease of on-going maintenance and management |
API/scripting support |
Data export |
Customer support |
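One simple way to use such a feature list during software selection is a weighted scoring exercise; a minimal sketch follows. The weights, candidate names, and scores are entirely hypothetical and serve only to illustrate the ranking approach, not our actual evaluation results.

```python
# Hypothetical weighted scoring of candidate systems against the Table 3 feature list.
WEIGHTS = {
    "ease_of_use": 5, "configurability": 5, "image_analysis": 4, "trending": 4,
    "accessibility": 3, "multi_vendor": 3, "maintenance": 3, "api_scripting": 2,
    "data_export": 2, "customer_support": 3,
}

def weighted_score(scores: dict[str, int]) -> int:
    """Weighted sum of 1-5 feature scores; no product is expected to score full marks."""
    return sum(WEIGHTS[f] * scores.get(f, 0) for f in WEIGHTS)

candidates = {
    "System A": {"ease_of_use": 4, "configurability": 5, "image_analysis": 3,
                 "trending": 4, "accessibility": 3, "multi_vendor": 2,
                 "maintenance": 4, "api_scripting": 2, "data_export": 3,
                 "customer_support": 4},
    "System B": {"ease_of_use": 3, "configurability": 4, "image_analysis": 5,
                 "trending": 4, "accessibility": 4, "multi_vendor": 3,
                 "maintenance": 3, "api_scripting": 4, "data_export": 4,
                 "customer_support": 3},
}
for name, scores in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
    print(name, weighted_score(scores))
```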
For large healthcare networks, the flexibility of multi-machine data trending (i.e., trending multiple machines on a single plot) and the accessibility of the database software (i.e., cloud-based vs. client-based) are also crucial considerations. The ability to accommodate QA/measurement devices from as many vendors as possible is also desirable, as it is not unusual for institutions to be equipped with an array of different devices from multiple manufacturers. However, vendors currently often store their measurement data in proprietary formats, thereby limiting data transfer between platforms. Some database software systems attempt to resolve this by emulating each vendor’s file format and recreating the data, but this can obviously be problematic when the QA device vendor changes their file format.
Other important considerations in selecting a QA database software system include ensuring appropriate institutional resources for ongoing maintenance such as software upgrades and system upkeep including building and changing QA test procedures and regulations. Depending on the department’s needs, data export, API/scripting support, and future expansion capabilities may also be important considerations. Lastly, strong customer support from vendors is extremely helpful, as the development of comprehensive centralized database management software is still evolving.
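Where data export or scripting support is available, compiling records for a regulatory review or FMEA can be automated. The sketch below assumes the QA records can be reached through a generic SQL store; the file name, table, and column names are hypothetical and do not correspond to any vendor's actual schema or export API.

```python
import csv
import sqlite3

# Illustrative export of one year of QA results to CSV from a hypothetical local store.
conn = sqlite3.connect("qa_records.db")   # placeholder database file
rows = conn.execute(
    "SELECT machine, test_name, date, value, baseline, tolerance "
    "FROM qa_results WHERE date BETWEEN '2023-01-01' AND '2023-12-31'"
).fetchall()

with open("qa_export_2023.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["machine", "test_name", "date", "value", "baseline", "tolerance"])
    writer.writerows(rows)
conn.close()
```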
Conclusion
A centralized database system is useful and improves the effectiveness and efficiency of radiotherapy equipment QA management in our institution. With a database system, consistent data collection and documentation can be obtained to facilitate regulatory reviews and perform FMEA as per TG 100.
Acknowledgment:
The authors would like to thank all the QA physicists for their participation in the implementation process of the centralized QA database and the questionnaire.
Funding:
This work was partly funded by the NIH/NCI Cancer Center Support Grant P30 CA008748. GT receives funding from NIH/NCI and IBA, but this work was done before and outside of these grants. MC receives funding from Ashland Inc., and that grant is outside the scope of this work. MH is a co-owner of patent US20210252310A1, which is unrelated to this work.
Research data are not available at this time.
Footnotes
Conflict of interest: none
References
- 1. Klein EE, Hanley J, Bayouth J, et al.: Task Group 142 report: quality assurance of medical accelerators. Med Phys 36:4197–4212, 2009
- 2. Mutic S, Palta JR, Butker EK, et al.: Quality assurance for computed-tomography simulators and the computed-tomography-simulation process: Report of the AAPM Radiation Therapy Committee Task Group No. 66. Med Phys 30:2762–2792, 2003
- 3. Smith K, Balter P, Duhon J, et al.: AAPM Medical Physics Practice Guideline 8.a.: Linear accelerator performance tests. J Appl Clin Med Phys 18:23–39, 2017
- 4. Hanley J, Dresser S, Simon W, et al.: AAPM Task Group 198 Report: An implementation guide for TG 142 quality assurance of medical accelerators. Med Phys, 2021
- 5. McCullough SP, Alkhatib H, Antes KJ, et al.: AAPM Medical Physics Practice Guideline 2.b.: Commissioning and quality assurance of X-ray-based image-guided radiotherapy systems. J Appl Clin Med Phys 22:73–81, 2021
- 6. Fontenot JD, Alkhatib H, Garrett JA, et al.: AAPM Medical Physics Practice Guideline 2.a.: Commissioning and quality assurance of X-ray-based image-guided radiotherapy systems. J Appl Clin Med Phys 15:3–13, 2014
- 7. Huq MS, Fraass BA, Dunscombe PB, et al.: The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management. Med Phys 43:4209, 2016
- 8. Attebery T, Hearld LR, Carroll N, et al.: Better Together? An Examination of the Relationship Between Acute Care Hospital Mergers and Patient Experience. J Healthc Manag 65:330–343, 2020
- 9. Jiang HJ, Fingar KR, Liang L, et al.: Quality of Care Before and After Mergers and Acquisitions of Rural Hospitals. JAMA Netw Open 4:e2124662, 2021
- 10. Luo J, Yau S, White S, et al.: Paperless medical physics QA in radiation therapy. Australas Phys Eng Sci Med 35:237–243, 2012
- 11. LoSasso T, Lim S, Tang G, et al.: SU-E-T-52: Beam data comparison for 20 linear accelerators in one network. J Med Phys 41:233, 2014
- 12. Almond PR, Biggs PJ, Coursey BM, et al.: AAPM's TG-51 protocol for clinical reference dosimetry of high-energy photon and electron beams. Med Phys 26:1847–1870, 1999
- 13. Practice Guidelines for Medical Physics. New York State Education Department. Available at: http://www.op.nysed.gov/prof/medphys/medphyspracticeguidelines.htm. Accessed April 7, 2022