Abstract
In the UK, physicists and radiographers perform routine quality control (QC) of digital mammography equipment at daily, weekly and monthly intervals. The tests performed and their tolerances are specified by standard protocols. The manual nature of many of the tests introduces variability due to the positioning of regions of interest (ROIs) and can be time-consuming. The tools on manufacturer-provided workstations limit the range of analysis that radiographers can perform and, because they are specific to a given manufacturer, do not allow for a standard set of tools and analysis. Automated software provides a means of reducing the variability in the analysis and also opens up additional, more complex analysis than is currently performed in the daily, weekly and monthly checks by radiographers. To this end, a set of tools has been developed to analyse the routine images taken by radiographers. As well as automatically reproducing the usual measurements made by radiographers, more complex analysis is provided. A QC image collection system has been developed that automatically routes QC data from a clinical site to a centralised server for analysis. A Web-based interface has been created that allows users to view the performance of the mammographic equipment. The pilot system obtained over 3000 QC images from seven X-ray units at a single screening centre over 2 years. The results show that these tools and methods of analysis can highlight changes in a detector over time that may otherwise go unnoticed with conventional analysis.
Keywords: QC, Mammography, NHSBSP, ImageJ, QA
Background
Quality control (QC) can be a manual and subjective process [1]. Increased productivity and image quality can be obtained by automating QC and using Web-based methods [2]. Automated systems for quality control that collect and analyse images have been developed for cone beam CT [3].
In the UK, routine QC of digital mammography equipment is performed regularly by radiographers and physicists [4, 5]. The baselines established in conjunction with the physicists at commissioning or after a system change are used for comparison with the results of tests performed on a daily, weekly and monthly basis by radiographers.
Remedial values and tolerances specified by the National Health Service Breast Screening Programme (NHSBSP) [4] are used to monitor the performance of a mammography unit and help to identify any need for remedial action. Radiographers perform the frequent QC tests manually, which is time-consuming and introduces variability when fixing the sizes and locations of regions of interest in the images. Automated software avoids the variability inherent in the performance of such tests by individuals. In addition, more complicated types of analysis that are not feasible with manual operation of the tests can be made available using automated software.
This paper reports on the experience of developing a remote QC system to perform measurements on the images taken routinely by radiographers in the UK. In Belgium, a similar system has been developed [6] and has been piloted in the Ontario Breast Screening Program [7]. An automated QC image collection system has been implemented that collects the daily, weekly and monthly QC images from a local screening centre. Different types of analysis have been performed on these images and the results have been made available over the Internet to physicists and radiographers. The software used to analyse the images can be run as a standalone package using ImageJ (National Institutes of Health, Bethesda, Maryland, USA) image analysis software.
Routine QC of mammography equipment ensures that it meets the standards set by the NHSBSP and that it is performing as expected [8]. There are two main types of images generated by the routine QC tests performed by radiographers. The first is a uniformity image of a block of polymethyl methacrylate (PMMA) covering the detector. The second type of image is used to measure the contrast-to-noise ratio (CNR) with varying thicknesses of PMMA and incorporates an added or embedded aluminium square or rectangle (0.2 mm thick, 10 × 10 mm or larger). Current systems employed by the screening programme involve the measurement of signal-to-noise ratio (SNR) and tube load (mAs) on a daily basis, CNR with a standard thickness of PMMA on a weekly basis, uniformity on a weekly basis, and CNR and SNR with different thicknesses of PMMA on a monthly basis.
These tests are performed in addition to other tests specified by the manufacturer. For the SNR and CNR measurements, the radiographer measures the signal difference and background noise by drawing regions of interest (ROI) on the image. At the pilot site, the SNR was measured on the same phantom as the CNR, avoiding the aluminium insert. To measure the uniformity, the NHSBSP protocol specifies that the mean pixel value should be measured in ROIs at the four corners of an image of a uniform block and that the percentage difference between each corner value and the central value should be calculated. These measurements are typically recorded locally by radiographers on paper or in a spreadsheet. If the daily QC results are not satisfactory (after re-checking), the radiographers do not proceed with imaging until remedial action has been taken.
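The quantities involved in these routine tests are simple to state. The following is an illustrative sketch (not the software described in this paper) of how the SNR, CNR and corner-uniformity measurements could be computed; the ROI size passed to the uniformity function is a placeholder rather than a protocol value:

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a background ROI: mean over standard deviation."""
    return roi.mean() / roi.std()

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: signal difference divided by background noise."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

def corner_uniformity(image, roi_size):
    """Percentage difference of the mean pixel value in each corner ROI
    from the mean of a central ROI, as in the NHSBSP uniformity test.
    The ROI size is an illustrative parameter, not a protocol value."""
    h, w = image.shape
    s = roi_size
    corners = [image[:s, :s], image[:s, -s:], image[-s:, :s], image[-s:, -s:]]
    centre = image[(h - s) // 2:(h + s) // 2, (w - s) // 2:(w + s) // 2]
    c_mean = centre.mean()
    return [100.0 * (roi.mean() - c_mean) / c_mean for roi in corners]
```

On a perfectly uniform image all four corner percentages are zero; the tolerances in the protocol bound how far from zero they may drift.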
Methods
The decision was made to use Web-based technologies because of the benefits of centralisation and easy access to software and data. A Web-based system was developed to transfer images from the remote screening locations to a centralised server, to analyse the images and to provide access to the results. The server provides several functions, including storage and retrieval of images, automated analysis, storage of results and hosting of the Website. The application that analyses the images stores the results directly in a relational database. This database, implemented using MySQL, also stores the account information used to administer Web access.
Image Collection
The pilot centre was the Jarvis Screening Centre, Guildford, Surrey, UK. QC images were collected from nine X-ray units. Two of these were manufactured by GE (General Electric Inc., Fairfield, Connecticut, USA) and seven were manufactured by Hologic (Hologic Inc., Bedford, Massachusetts, USA). At the time of writing, 3721 QC images had been collected from nine X-ray units: two Hologic Dimensions, two GE Essentials and five Hologic Selenias. The majority of the images were from the Hologic Selenias and the GE Essentials, and the results from these units are discussed in this paper.
Unprocessed, or in DICOM terminology "for processing", images were required for analysis. Many screening centres do not routinely store unprocessed mammograms on the local PACS system. However, with the cooperation of the PACS vendor (Sectra AB, Linköping, Sweden) and the pilot site, the systems were altered to allow unprocessed QC images to be routinely transferred from the X-ray units and stored on the local PACS system. Once established, this procedure required no intervention by the radiographers, beyond following a standard protocol for naming the QC images and folders. This standardisation was extremely important, as some of the X-ray units did not name the QC images in a predictable, reproducible manner, which initially made it difficult to identify them on the local PACS. Once a standard protocol was in place, the QC images could be queried and extracted in an automated manner.
To avoid the need for regular manual intervention, a program was written that regularly queried the PACS and pulled the relevant images for transfer. To allow the program to interact with the local PACS, the PACS provider was asked to add a rule permitting query/retrieve operations. DICOM query/retrieve, implemented with the DICOM toolkit DCM4Che, was used to transfer the QC images from the PACS to a research server on the local network at the pilot site. The networking tool rsync was then used to transfer the images securely across the NHS national network (N3) to the server for analysis. This required a small change to the firewalls, made with the permission of the IT departments responsible for the screening centre and the Royal Surrey County Hospital, where the NCCPM server is located. The whole process is completely automated, requiring no manual effort from the radiographers and only monitoring and maintenance from the system providers.
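The collection step is mostly orchestration: identify new QC images by the agreed naming convention and hand them to rsync. The sketch below illustrates that idea only; the `QC_` prefix, file layout and remote path are hypothetical, and the actual pilot used DCM4Che for the DICOM query/retrieve stage, which is not reproduced here:

```python
import subprocess
from pathlib import Path

# Hypothetical naming convention agreed with the radiographers; the real
# sites used their own standard protocol for naming QC images and folders.
QC_PREFIX = "QC_"

def new_qc_images(incoming_dir, already_sent):
    """Return QC images (identified by the naming convention) that have
    not yet been transferred to the analysis server."""
    return sorted(p for p in Path(incoming_dir).glob("*.dcm")
                  if p.name.startswith(QC_PREFIX) and p.name not in already_sent)

def rsync_command(files, remote):
    """Build the rsync invocation used to push images to the analysis
    server; 'remote' is a placeholder such as 'user@server:/data/qc/'."""
    return ["rsync", "-az", *[str(f) for f in files], remote]

def transfer(incoming_dir, already_sent, remote):
    """One polling cycle: find new QC images and push them with rsync."""
    files = new_qc_images(incoming_dir, already_sent)
    if files:
        subprocess.run(rsync_command(files, remote), check=True)
        already_sent.update(f.name for f in files)
    return files
```

Running `transfer` on a schedule (e.g. from cron) gives the hands-off behaviour described above: radiographers only have to follow the naming protocol.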
To provide flexibility and enable image collection from any site in the UK, we also provide an Image Exchange Portal (IEP) institute node for the collection of QC images. Participating sites would be able to transfer their QC images on a regular basis via a standard IEP transfer. These images would be received on our dedicated PACS, which would be routinely scanned for new images. Any new images found would be extracted and passed through the analysis pipeline. The detection and analysis process would again be automated; however, manual effort would be required by a user at the source site to transfer sets of QC images on a weekly or monthly basis.
Image Analysis
The Java-based image processing toolkit ImageJ is used widely in the scientific community to perform a range of image processing tasks, and its functionality can be extended through its plugin architecture. By exploiting the ImageJ API (Application Programming Interface), software was developed that can run both independently of the ImageJ graphical user interface (GUI) on the server and interactively by a user within ImageJ. A set of tools was developed to allow automatic analysis of CNR images and uniformity images. These tools were designed so they could be incorporated into a Java-based application that runs on the server and can analyse both types of image. Using custom-built edge detection techniques, the CNR square was automatically detected. Once the square was located, the CNR was calculated by measuring the difference in signal inside and outside the square and dividing this by the background noise. The ROIs were positioned as described in technical evaluations by NCCPM [9], and the CNR was computed using the formula in the NHSBSP protocol [4].
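The detection step can be illustrated with a much simpler scheme than the custom edge detection used in the actual software. This sketch assumes the aluminium square appears as a region of elevated signal on a flat background (if it appears darker in "for processing" images, the image can be inverted first); the safety margin is a placeholder value:

```python
import numpy as np

def locate_square(image):
    """Locate a single contrasting square on a flat background by
    thresholding midway between the pixel extremes and taking the
    bounding box of the above-threshold pixels. A simplified stand-in
    for the custom edge detection described in the paper."""
    t = 0.5 * (image.min() + image.max())
    mask = image > t
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    return rows[0], rows[-1], cols[0], cols[-1]

def cnr_from_square(image, box, margin=5):
    """CNR from the detected square: inner ROI inside the square,
    background ROI taken to its left with a safety margin (an
    illustrative placement, not the NCCPM ROI positions)."""
    r0, r1, c0, c1 = box
    inner = image[r0 + margin:r1 - margin, c0 + margin:c1 - margin]
    bg = image[r0 + margin:r1 - margin, max(0, c0 - 3 * margin):c0 - margin]
    return (inner.mean() - bg.mean()) / bg.std()
```

Because the square is located automatically, repeated measurements use identically placed ROIs, which is the source of the reproducibility gain over manual ROI placement.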
Four ROIs were positioned in the four corners of the image, 1 cm from the edge of the PMMA, and the percentage difference in the mean pixel value relative to the central ROI was calculated. In addition, a more detailed measure of the uniformity and variance was obtained using an ROI that was one hundredth of the image width. This ROI scanned the whole of the detector to produce a map of the uniformity and normalised variance. Sample normalised variance images are shown in Fig. 1. This map was 200 pixels in width and the height was such that the horizontal and vertical scale of the original image was preserved. We defined two metrics on the ROIs: the local percentage difference in variance (LPDV) and the global percentage difference in mean (GPDM). The LPDV for a given ROI is the percentage difference of the variance of the ROI from that of its nearest neighbours. The GPDM of a ROI is the percentage difference of the mean pixel value of the ROI from the mean pixel value of the whole image. Using these two measures, two maps were constructed for each flat-field image: the LPDV map and the GPDM map. To condense these results and plot them over time, the maps were then binned into histograms. The values of the histograms were stored in a MySQL database, which served as the backend for the Web portal. A bin width of 1 % was chosen for the uniformity map and 5 % for the variance map. The percentage of the image with values outside the 0–1 % and 0–5 % bins was used for the analysis of the GPDM and LPDV maps, respectively.
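The map construction can be sketched as follows. This is a minimal illustration assuming a square grid of ROIs (the actual software used ROIs one hundredth of the image width, giving a 200-pixel-wide map), and for brevity the neighbour average wraps around at the detector edges rather than clipping:

```python
import numpy as np

def block_stats(image, n_blocks=20):
    """Tile the image into an n_blocks x n_blocks grid of ROIs and
    return the per-ROI mean and variance. The grid size here is an
    illustrative choice, coarser than in the paper."""
    h, w = image.shape
    bh, bw = h // n_blocks, w // n_blocks
    blocks = image[:bh * n_blocks, :bw * n_blocks].reshape(
        n_blocks, bh, n_blocks, bw)
    return blocks.mean(axis=(1, 3)), blocks.var(axis=(1, 3))

def gpdm_map(means, image_mean):
    """Global percentage difference in mean for every ROI."""
    return 100.0 * (means - image_mean) / image_mean

def lpdv_map(variances):
    """Local percentage difference in variance: each ROI's variance
    versus the average of its 4-connected neighbours (neighbours wrap
    at the edges in this sketch, for simplicity)."""
    v = variances
    nbr = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
           np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4.0
    return 100.0 * (v - nbr) / nbr

def fraction_outside(pmap, limit):
    """Fraction of the map whose absolute value exceeds the limit
    (1 % for GPDM and 5 % for LPDV in the paper)."""
    return np.mean(np.abs(pmap) > limit)
```

The `fraction_outside` values are the single numbers plotted over time in the Results section: for a well-behaved detector both fractions stay near zero, while a localised defect raises the LPDV fraction.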
Fig. 1.
The workflow of the LPDV and GPDM analysis. The initial flat-field images had on the order of 5 million pixels and each map had on the order of 50,000 pixels
Website and Tools
A website was developed to provide graphical plots of the data and allow users to view results. Security of access was provided by user accounts associated with specific sites. This ensures that a user only has permission to view results for the appropriate selection of X-ray sets (for their centre, or for their region if they have a regional role). The user can view graphs of the routine QC data and the associated plots of the analysis of the uniformity and CNR phantoms. Java Web Start technology allows users to view and analyse the relevant images directly via the Web. This enables a user to look in more detail at a particular image of interest and perform offline analysis. In addition, the data can be exported by the users into a comma-separated values (CSV) file that can be easily imported into Microsoft Excel. The design approach of the analysis software allows the tools to be used separately as plugins, independent of the website. These tools are available online from the NCCPM website.
Results
A selection of results is presented below for cases where changes in the behaviour of the X-ray units were observed. Shortly after image collection began, the radiographers complained of a general "snowiness" on the clinical images from one of the GE units, GE1. Radiographers remarked that the snowiness could be mistaken for microcalcifications. The local variance analysis showed clearly that the fraction of the flat-field image with LPDV above 5 % increased from 13 to 24 % during this period (Fig. 2). In September 2012, at the request of the screening centre, an engineer modified the X-ray unit to try to rectify the issue. As a result, the CNR increased from 4.5 to 5.5, and the percentage of the image with LPDV exceeding 5 % was reduced to 5 % (Fig. 3). Over two other time periods, around January 2013 and September 2013, there was a similar but smaller increase in the percentage of the image with LPDV over 5 % before it decreased, and this was followed by an increase in the CNR (Fig. 3).
Fig. 2.
The percentage of the detector that has a LPDV higher than 5 % for each of the GE X-ray units
Fig. 3.
The percentage of the detector that has a LPDV over 5 % and the CNR of GE1
The vast majority of CNR values for the five Hologic units were within tolerance (±10 %) of the average value over the 2-year period (Fig. 4). For one of the X-ray units, Hol1, the percentage of the detector with LPDV over 5 % was at one point above 0.8 %, while for the remaining four X-ray units it remained below 0.1 %. The LPDV values of Hol1 fluctuated over time (Fig. 5).
Fig. 4.
A histogram of the combined 2444 CNR values from the Hologic X-ray units. The tolerances of ±10 % are shown by the vertical dashed lines
Fig. 5.
The fraction of the detector that has a LPDV higher than 5 % for each of the Hologic X-ray units
The uniformity values of Hol1 were in general higher than those of most of the other X-ray units (Fig. 6). After excluding outliers for this unit, the uniformity values were seen to be increasing with time, although the values of 4 to 6.5 % were still within tolerance (Fig. 7). An outlier was defined as any point that differed by more than 20 % from the average over the whole time period. The other Hologic X-ray units all had an increase in the GPDM in 2014 compared with 2013.
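The outlier rule used here can be stated compactly. The following sketch applies the 20 % criterion described above to a series of uniformity values before any trend fitting:

```python
import numpy as np

def exclude_outliers(values, tol=20.0):
    """Drop any point more than tol percent from the mean of the whole
    time series, matching the outlier definition used for Hol1."""
    v = np.asarray(values, dtype=float)
    mean = v.mean()
    keep = np.abs(100.0 * (v - mean) / mean) <= tol
    return v[keep]
```

Note that a single extreme value shifts the mean of the series, so the rule is best suited to series with only occasional outliers, as was the case here.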
Fig. 6.
The percentage of the detector of the Hologic X-ray units that have a GPDM value of greater than 1 %
Fig. 7.
The percentage of the detector of Hol1 that has a GPDM value of greater than 1 % with a linear regression
The measurements of uniformity made by the radiographers for Hol1 are shown in Fig. 8. The equivalent measurement made using our automated software for the same X-ray unit is shown in Fig. 9. There is a clear increase in the percentage difference between the corner points and the central ROI over time, but this trend is not apparent in the measurements performed by the radiographers.
Fig. 8.
The uniformity measurements by the radiographers for Hol1
Fig. 9.
The uniformity measurements by our automated software for Hol1
A t test was used to compare the measurements of uniformity made by the radiographers and the measurements made by the automated software. For Hol1, there was a statistically significant difference (p < 0.1). For a different machine operating in a stable manner, no significant difference was found (p = 0.12).
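As an illustration of this comparison (using synthetic data, not the study's measurements), a paired t statistic on per-image pairs of uniformity values can be computed as below; in practice a library routine such as scipy.stats.ttest_rel supplies the p-value directly:

```python
import numpy as np

def paired_t(a, b):
    """Paired t statistic: mean of the per-image differences divided by
    the standard error of that mean (referred to a t distribution with
    n - 1 degrees of freedom for the p-value)."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# Synthetic example: radiographer values modelled as the software values
# plus a systematic offset and extra ROI-placement noise. All numbers
# here are hypothetical, chosen only to illustrate the test.
rng = np.random.default_rng(42)
software = rng.normal(5.0, 0.3, 50)
radiographer = software + 0.5 + rng.normal(0.0, 0.6, 50)
t = paired_t(radiographer, software)
```

A systematic offset between the two measurement methods produces a large |t|, whereas pure ROI-placement noise with no offset does not.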
Further investigation into the behaviour of Hol1 showed that a number of regions in the flat-field images were contributing to the high LPDV values. The LPDV maps averaged over four time periods are shown in Fig. 10. All regions of pixels with high LPDV grew over time (Fig. 11). Regions 1, 2 and 4 were no longer visible in October 2013 (Fig. 10d). After October 2013, one defect (region 3) remained visible in the top left of Fig. 10d. Two strips at the top and bottom borders of the detector increased and decreased over time.
Fig. 10.
The LPDV maps of Hol1 averaged over 4 periods ordered chronologically with each containing 20 images. A window centre of 5 % and window width of 10 % has been used for all images. The point defects are labeled 1, 2, 3 and 4
Fig. 11.
Defect 1 from Fig. 10 a, b and c magnified to show the increase in size. The LPDV maps were windowed to the same level and width
Discussion
The system was designed to provide two types of automated QC analysis: to reproduce the current analysis by radiographers using automatically defined regions of interest, and to provide a more sophisticated automated analysis of the whole detector performance with a view to identifying and tracking subtle changes in the detector. A major limitation of the routine tests in the existing protocols is that they are limited to either a few localised measurements or a subjective visual assessment of uniformity. This makes it difficult to identify changes in detector performance, especially if they are localised. The advantage of the automated procedures described here is that the whole detector is assessed in a quantitative and reproducible manner.
In some countries, notably the USA, QC is performed using the manufacturer's software. Hence, the tests and types of analysis can vary between manufacturers. The adoption of a standardised method would allow the comparison of results and the implementation of universal limits on the deviation of equipment from an acceptable standard.
In the case of one of the GE units, the results of our methods of analysis correlated with the onset of a "snowy" appearance in clinical images that was a hindrance to clinical interpretation. For the same detector, two other points were identified where the local variance values were seen to increase, but no degradation in clinical performance was reported. The local changes in variance allowed small defects in the detector of Hol1 to be tracked over time. Some defects were seen to disappear. This is thought to be due to recalibration of the detector after the mobile unit was moved to a new location. There were some outliers with large levels of non-uniformity for some of the Hologic X-ray units. These occurred infrequently and were excluded.
The uniformity measurements of the radiographers did not display the same increase that our software detected. This is possibly because the manual placement of the ROIs by the radiographers introduces additional variation in the results, hiding the upward trend. The consistent positioning of the ROIs by our software eliminates this source of error.
One fundamental limitation of the system is that there is a delay between acquiring the images and providing feedback, so that the current on-the-spot analysis by radiographers is still needed. This limitation could be eliminated by providing a reliable network connection to mobile X-ray units. However, the system provides additional functionality, remote access to results and more advanced analysis than is possible using the current protocol. The advanced analysis is sensitive to small changes in the detector performance and can provide alerts to the presence of defects in the detector that would otherwise not be noticed using current QC procedures. The review tools can be used by both the radiographers and physicists. This system is ideally placed for tracking long-term effects on detectors that would otherwise be difficult to identify.
One objective of the pilot was to explore how such a system could be extended across the NHSBSP. In principle, multiple screening centres could use the centralised remote QC system at the same time. It would be relatively straightforward to restrict access to particular systems by particular people. In order to expand to other sites, each would be required to arrange to store QC data on their PACS. During this pilot, the process of enabling the storage of QC images proved relatively straightforward. Once this is achieved, depending on the desired image collection method, a local server could be installed or IEP utilised. A local server would require assistance from the local IT department and the cooperation of the PACS providers to allow query/retrieve operations from the server. Experience has shown that hospital IT departments and their PACS providers can be slow to make the necessary changes or reluctant to add firewall exceptions; hence, we have provided the alternative IEP collection route. Alternatively, a software-only solution could be used on existing hardware at the site. However, it is our opinion that, if such a system were rolled out across the NHSBSP, these minor barriers would be surmountable with appropriate pressure and emphasis.
Conclusion
The implementation of a remote QC system is challenging due to logistical and IT issues, but by collecting and analysing routine QC images, the performance of X-ray units can be measured more accurately and efficiently than is currently possible with manual methods. By normalising the variance using nearest neighbours, and by considering the percentage of the total image whose mean or local variance deviates by more than a given percentage, we defined performance measures on flat-field images. These measures were used to track the performance of detectors over time and were shown to correlate with clinical performance.
Acknowledgments
This work has been funded by the NHS Breast Screening Programme. We would like to acknowledge the help of Prof. David Dance, the cooperation of the radiographers at the Jarvis Screening Centre and the staff at the Regional Radiation Protection Service at the Royal Surrey County Hospital.
References
- 1. Reiner BI: Automating quality assurance for digital radiography. J Am Coll Radiol 6(7):486–490, 2009 [DOI] [PubMed]
- 2. Moores B, Charnock P, Ward M: Web-based tools for quality assurance and radiation protection in diagnostic radiology. Radiat Prot Dosim 139(1–3):422–429, 2010 [DOI] [PubMed]
- 3. Grimes J, Leng S, McCollough C: TH-C-18A-07: A software tool for automated analysis of CT quality assurance phantoms. Med Phys 41(6):558–558, 2014
- 4. Baxter G, Jones V, Milnes V, Oduko J, Philips V, Sellars S, Zoe V: Routine quality control tests for full field digital mammography systems: NHSBSP Report (1303), 2013
- 5. Workman A, Castellano IA, Kulama E, Lawinski C, Marshall N, Young KC: Commissioning and routine testing of full field digital mammography systems: NHSBSP Report (0604), 2009
- 6. Jacobs J, Lemmens K, Nens J, Michielsen K, Marchal G, Bosmans H: One year of experience with remote quality assurance of digital mammography systems in the Flemish breast cancer screening program. In: Digital Mammography. Springer, 2008, pp 703–710
- 7. Bloomquist A, Jacobs J, Yaffe M: SU-E-I-87: Pilot testing of software for automated remote quality control of digital mammography equipment for use in the Ontario Breast Screening Program. Med Phys 38(6):3415–3416, 2011
- 8. Young KC, Van Engen R, Bosmans H, Jacobs J, Zanca F: Quality control in digital mammography. In: Digital Mammography. Springer, 2010, pp 33–54
- 9. Young K, Oduko J: Technical evaluation of Hologic Selenia Dimensions 2D Digital Breast Imaging System with software version 1.4.2: NHSBSP Report (1201), 2012