Journal of Digital Imaging. 2006 Jan 25;19(2):159–166. doi:10.1007/s10278-005-8733-1

Digital Repeat Analysis: Setup and Operation

J Nol 1, G Isouard 2, J Mirecki 1
PMCID: PMC3045180  PMID: 16421768

Abstract

Since the emergence of digital imaging, there have been questions about whether imaging departments still need reject analysis programs to evaluate performance and quality. As a marketing strategy, most suppliers of digital technology emphasize the superiority of the technology and its ability to reduce the number of repeats, resulting in lower radiation doses to patients and increased departmental productivity. Quality assurance radiographers and radiologists, on the other hand, believe that repeats are mainly related to positioning skills and that repeat analysis is the main tool for planning the training needed to up-skill radiographers. A comparative study between conventional and digital imaging was undertaken to compare outcomes and evaluate the need for reject analysis. Because digital technology was still at an early stage of development, however, setting up a credible reject analysis program became the major task of the study. It took the department, with the help of the suppliers of the computed radiography reader and the picture archiving and communication system, over 2 years of software enhancement to build a reliable digital repeat analysis system. The results supported both philosophies: the number of repeats caused by exposure factors fell dramatically, whereas the percentage of repeats caused by positioning errors increased slightly, for the simple reason that some rejects in the conventional system that qualified as both exposure and positioning errors had been classified as exposure errors. The ability to digitally adjust dark or light images reclassified some of those images as positioning errors.

Key words: PACS, CR, digital reject analysis, equipment use, education, repeat analysis, exposure selection, positioning and techniques

Introduction

Reject analysis (RA) has been one of the key quality control tools in conventional medical imaging departments using film processing technology for as long as many of us can remember. Quality Control in Diagnostic Imaging1 has served as the standard reference for reject analysis in most Australian imaging departments, although some departments in Australia and other countries refer to Peer et al.2 The results of RA are used to plan training and prepare clinical presentations targeting staff weaknesses. Reject films also provide a valuable tool for calculating the radiation dose delivered to patients for radiobiology purposes; for example, when a patient is found to be pregnant after being x-rayed, physicists need to know the number of exposures taken to assess the situation and make their recommendations. With the change to computed radiography (CR) and picture archiving and communication systems (PACS), film RA was replaced with digital reject analysis (DRA) because of the physical elimination of film.

The original aim of our study was to compare RA outcomes between conventional film processing and CR technology and to assess the need for a DRA program in a digital environment. Data collection for the conventional RA relied simply on manually collecting rejected films in an assigned box.

Our PACS has a built-in electronic reject folder to which radiographers can send unwanted images. Records are available to system administrators, and images cannot be deleted even after they have been rejected by the operator. Our CR system, however, was not well equipped for DRA, and several problems had to be solved to maintain a credible system; in fact, engineers and key marketing personnel were encouraging users to delete unwanted images immediately to clear memory space. A review of other publications made it obvious that, at this early stage of the digital era, RA had been overlooked and given little thought.3 Because of the complexity of collecting data for the DRA, the main aim of our project drifted toward the lengthy process of setting up a credible DRA system.

Setting and Patient Groups

As part of a major project to change over from a conventional film processing system to a computed radiography system in a large metropolitan hospital comprising two campuses in Western Sydney, a reject analysis study was conducted to compare conventional RA and DRA outcomes.

The study was conducted at the smaller campus of the hospital, which performs over 20,000 examinations a year. The project was submitted to the ethics committee, and clearance was granted. Two months prior to the switchover to the new system, reject films were collected according to the recommendations of Gray et al.,1 including all reject films belonging to inpatients and outpatients referred to the hospital for general radiological examinations over a 4-week period. The second part of data collection started 2 months after the PACS system went live and comprised the same categories of patients and examinations as the conventional collection. Ultrasound examinations, and all films or images taken by radiologists during special procedures such as barium studies and venograms, were excluded from the study.

Materials and Methods

The first part of the study, conventional RA, was based on the universal guidelines set by Gray et al.1 Reject films were defined as all scrap films, including green films, black films, and cleanup films. Repeat films were limited to those radiographs that were not accepted and required an additional exposure to the patient. Reject films were collected and controlled by the senior radiographer in charge of conducting the periodic reject analysis study. At the end of the collection period, after the initial sorting and analysis, the collected batch went through another sorting process by the research team. Films were sorted into four categories: exposure errors, positioning errors, good films, and miscellaneous errors. Clear, black, and fresh films were included in the reject analysis but not in the repeat analysis.

For the purpose of this study, exposure errors were sorted into three levels according to the severity of over- or underexposure. A third-level error required a modification of more than 15 kV; a second-level error, a modification by an average of 8–10 kV; and a first-level error, a minor change of up to 6 kV. In addition, an extra category (digitally fixable) was created, in which level 1 and 2 exposure errors from the conventional batch were scanned and reassessed by the radiologist. The second part of the study took place 8 weeks after the changeover to CR technology. Proper collection initially failed because of a conflict of philosophy between the PACS provider and the CR provider. The PACS system was properly set up with a reject analysis folder in the main quality assurance folder.4 The CR system, however, could not comply with DRA requirements. In response to an enhancement request, the CR provider modified and upgraded the software to rectify the problem (see CR System Settings). Before starting the data count and analysis, we had to make sure that all images residing in the CR recycle bin had been sent to PACS. As with the conventional batch, image analysis was conducted by the research team, comprising a radiologist and a senior radiographer. Films were analyzed on image viewing light boxes; PACS images were viewed on a radiologist PACS workstation.
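As an illustration, the three-level grading can be expressed as a short function. This is a sketch only: the function name is ours, and the treatment of corrections falling between the stated kV bands (e.g., 7 kV) is an assumption, since the study defines the bands as up to 6 kV, 8–10 kV on average, and more than 15 kV.

```python
def grade_exposure_error(kv_correction: float) -> int:
    """Return the exposure-error level for a repeat, given the kV change
    (signed or unsigned) that would have corrected the over-/underexposure."""
    kv = abs(kv_correction)
    if kv > 15:
        return 3  # third level: modification by more than 15 kV
    if kv >= 8:
        return 2  # second level: modification by an average of 8-10 kV
    return 1      # first level: minor change of up to 6 kV

# Example: a film needing a 9 kV reduction is a second-level error.
assert grade_exposure_error(-9) == 2
```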

Statistical data on the number of patients and examinations were collected from the radiology information system (RIS) for a whole month for both batches. To assess the number of films and exposures taken in the conventional batch, a film stocktake was carried out at the beginning and at the end of the collection process. For PACS, the task was much easier because the figures were provided electronically by the PACS engineer. All data were entered in an Excel worksheet.

CR System Settings (Sequence of Events)

Conventional RA studies relied on the manual collection of films; it was up to the integrity of radiographers to keep rejects in the dedicated bin. With PACS, however, there are many complexities, mainly because PACS and CR are usually supplied by different providers. CR is the first point of capture for digital images, whereas PACS has the role of modifying and archiving them. At the time of purchasing the PACS system, the prospects of continuing repeat analysis were promising: the ability to maintain a proper record of all rejected images was part of the specifications of the new PACS, rejected images were expected to be readily available whenever administrative radiographers needed them, and it was further promised that the PACS engineer would automatically provide the department with monthly reports of the reject folder by modality.

After 1 month of operation, the first reject analysis report was received from the PACS engineer. The number of rejected images was below what was expected, and on analyzing the reject folder, we found only images of good diagnostic quality. The rejected images comprised identical copies of images saved in the patients' folders that had been modified by the addition of digital shutters (apparently, any image to which digital shutters are applied is saved as a new image, and the old one is automatically sent to the reject folder; this rule does not apply to other digital modifications). The remaining rejects were duplicates of existing images resent to PACS because of interface failures between two or more systems, such as the RIS, the hospital information system, and PACS. Identifying the nature of the images in the reject folder raised a suspicion: is it possible that we had no repeated images at all?
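The sorting task this implied can be sketched as a filter that discards shutter-superseded copies and interface duplicates, leaving only genuine repeat candidates. The data model below (image dictionaries with a uid field and a superseded_by_shutter flag) is hypothetical and is not the PACS vendor's API.

```python
def true_repeats(reject_folder, kept_uids):
    """Filter a reject folder down to genuine repeat candidates."""
    repeats = []
    for image in reject_folder:
        # Skip older copies auto-rejected when digital shutters were applied.
        if image.get("superseded_by_shutter"):
            continue
        # Skip duplicates resent after RIS/HIS/PACS interface failures;
        # these match an image already kept in a patient's folder.
        if image["uid"] in kept_uids:
            continue
        repeats.append(image)
    return repeats

# Example: only the third image is a real repeat candidate.
folder = [
    {"uid": "A1", "superseded_by_shutter": True},
    {"uid": "B2"},  # duplicate of a kept image
    {"uid": "C3"},
]
print([i["uid"] for i in true_repeats(folder, kept_uids={"B2"})])  # ['C3']
```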

Troubleshooting the problem with the PACS engineer, we discovered that the CR system can send images to PACS either manually or automatically.5 Radiographers were able to switch autosend on and off, and to hide mistakes, bad images were never sent to PACS. It was also found that radiographers could delete images permanently from the CR system. As a result, there were no records of any repeated images in the system. On contacting the CR provider, we discovered that there was no functionality to prevent users from deleting images or switching autosend off. The inability of the CR system to maintain a proper reject analysis record has been identified by other users.6 To work around this inadequacy in the CR/PACS relationship, some users decided to use the CR as the means of archiving rejected images.3 We were not able to do the same because the CR has very limited archiving space and is not used as the long-term archiving tool. To comply with the equipment registration guidelines imposed by the New South Wales Environmental Protection Authority,7 which allow retrospective dose assessment if required (e.g., if a patient is found to be pregnant after being exposed to x-rays), we are required to archive all acquired images. After all, it is a moral and professional obligation to comply with the ALARA principle, and quality assurance (QA) is an essential tool for assessing training needs and keeping exposure to radiation as low as possible.

With PACS being the long-term archiving tool, it would be expected that all images, including repeats, are sent from the CR to PACS for future reference. The solution was to prevent users from switching autosend off, and raising an enhancement request to the manufacturer to upgrade the software was our only available action. It took the CR provider a year to come up with an enhancement patch in which the ability of nonadministrative users to switch autosend on and off was removed. Radiographers then found another way to stop images from going to PACS: they realized that if they were quick enough to double-click an image after acquisition (before the image is sent to PACS), they could stop it from being transferred and then delete it.

The only solution was to raise another enhancement request to disable manual deletion from the CR system. It was almost a year before the new software was released; the radiographers' ability to delete images was replaced with the ability to send images to a recycle bin that only a system administrator can empty. The problem of stopping images from going to PACS, however, was not fixed, because some users, especially in other countries, were not happy about sending bad images to PACS.

Another request for software enhancement was placed with the manufacturer, but as an interim measure, we adopted a new policy in which the QA radiographer tags all images in the CR recycle bin and sends them to PACS as a batch. Unfortunately, this causes many complications, the main one being the creation of a new folder for every single image sent to PACS as unspecified; the system administrator has to sort them into their proper folders one by one. Radiographers were informed of the new policy and instructed not to try to stop images from going to PACS. They were also informed that images in the PACS reject folder would be dealt with anonymously, whereas images in the CR recycle bin would be examined closely and every single case discussed with the staff member in question.
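The interim policy can be sketched as a batch send. The pacs.store call and the "unspecified" label are hypothetical stand-ins for the vendor's actual mechanism; the sketch only illustrates why a new folder is created for every image and must then be re-sorted by hand.

```python
def batch_send_recycle_bin(recycle_bin, pacs):
    """Send all CR recycle-bin images to PACS in one batch.

    Each image arrives as 'unspecified', so PACS creates one new folder
    per image; the administrator must later match each folder to the
    proper patient record one by one.
    """
    unsorted_folders = []
    for image in recycle_bin:
        folder = pacs.store(image, patient="unspecified")  # hypothetical call
        unsorted_folders.append(folder)
    return unsorted_folders
```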

Final Setting of the Digital Reject Folder

In the final CR/PACS setup, all images are sent automatically from the CR reader to the PACS-ISU immediately when the image is captured in the reader. Images that fail to go to PACS, as mentioned above, are sent manually on a regular basis by the system administrator, especially before starting the periodic DRA. When an image reaches PACS, radiographers are required to either reject, QA, or accept it. After finishing the examination, the radiographer signs off the images by completing the examination in the RIS. That verifies the examination, moving it to the undictated folder for the radiologist to report.
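The resulting image lifecycle can be summarized as a small state machine. The states and permitted transitions below are inferred from the workflow described in this section; they are an illustration, not vendor software.

```python
from enum import Enum, auto

class ImageState(Enum):
    CAPTURED = auto()    # acquired in the CR reader
    IN_PACS = auto()     # auto-sent to PACS on capture
    REJECTED = auto()    # sent to the reject folder by the radiographer
    ACCEPTED = auto()    # accepted, possibly after QA
    UNDICTATED = auto()  # examination completed in RIS, awaiting report

ALLOWED = {
    ImageState.CAPTURED: {ImageState.IN_PACS},    # autosend cannot be switched off
    ImageState.IN_PACS: {ImageState.REJECTED, ImageState.ACCEPTED},
    ImageState.ACCEPTED: {ImageState.UNDICTATED}, # sign-off verifies the exam
    ImageState.REJECTED: set(),                   # kept for DRA, never deleted
    ImageState.UNDICTATED: set(),
}

def transition(current: ImageState, new: ImageState) -> ImageState:
    """Move an image to a new state, refusing transitions the workflow forbids."""
    if new not in ALLOWED[current]:
        raise ValueError(f"{current.name} -> {new.name} is not permitted")
    return new
```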

The PACS workstation has the same QA capability as the CR; however, users have the option of switching the display of annotated text on or off.4 For the untrained user in the hospital wards, that might be a problem: if annotation is switched off, remarks added to the image can be missed. The CR QA station, on the other hand, can permanently burn text into an image, such as markers identifying the side in question or distinguishing AP from PA, so radiographers prefer to use that functionality to eliminate questions raised by users. The CR QA station is also used to change the image algorithm using the raw data. For example, an abdominal image can be reoutput with a chest algorithm to assess the bases of the lungs, or a chest image can be reoutput with an abdomen algorithm to locate nasogastric tubes.

It is important to note that, to save short-term archiving space, PACS was initially set to remove images from the reject folder automatically after 7 days. After much negotiation, the period was increased to 90 days. From the text data, however, images can still be accessed at any time by the system administrator for as long as the long-term archive media are available on the premises. Another enhancement request has also been put to the PACS provider to store the reject folder on a CD on a monthly basis.
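A minimal sketch of the negotiated retention rule follows, assuming a simple list of reject-folder entries carrying a rejected_at timestamp (a hypothetical data model; real PACS retention is configured by the vendor).

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # negotiated up from the default 7 days

def purge_reject_folder(reject_folder, now=None):
    """Drop reject-folder entries older than the retention window.

    Copies on the long-term archive remain retrievable by the system
    administrator, as described above.
    """
    now = now or datetime.now()
    return [img for img in reject_folder
            if now - img["rejected_at"] <= RETENTION]
```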

Error Classification

In sorting images to identify the causes of repeats, we noted that some users have devised a larger number of categories than the conventional method,6 some going to the extreme of 38 categories.3 After careful analysis, our team decided to keep the original categories set for the conventional RA program, with the addition of a new category related to the digital environment. Repeated films and images were sorted into exposure, positioning, good film/image, and miscellaneous errors. Exposure errors included all films/images repeated for being too dark or too light. Positioning errors included incorrect positioning as defined by Merrill's Atlas:8 cutoff or overconed images, off-center images, a marker obstructing the site in question, or the part in question not being shown.

The good films/images category consists of images that radiographers decided to repeat because they thought they could obtain a better image, but that the radiologist would have been happy to accept (mainly cases where images could be digitally adjusted to an acceptable window and level). For the purpose of this study, the research team decided to review the conventional batch, add a subcategory tagging films as digitally fixable, and scan the images that could be adjusted.

The miscellaneous category included improper patient preparation (jewelry or other metallic elements not removed), double-exposed films/image plates, patient motion, and equipment faults (processor faults; dirty films or image plates).

Clear, black, and fresh films, as well as reprocessed images, were included in the reject analysis but not in the repeat analysis. As recommended by Gray et al.,1 green and QC films were categorized as rejects but were not included in the conventional RA. Reprocessed images included all images resent to PACS after QA, such as images with shutters applied, images reoutput from the CR, or images originally sent to the wrong patient or folder.
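For reference, the full taxonomy used in this study can be captured in a simple mapping. The structure below is illustrative; the categories and subcategories are taken from the definitions above.

```python
# Categories counted as repeats (an additional exposure to the patient).
REPEAT_CATEGORIES = {
    "exposure": ["dark film/image", "light film/image"],
    "positioning": ["cutoff or overconed", "off center",
                    "marker obstructing the site", "part in question not shown"],
    "good": ["repeated although diagnostically acceptable"],
    "miscellaneous": ["improper patient preparation", "double exposure",
                      "patient motion", "equipment fault"],
}

# Counted as rejects but excluded from the repeat analysis.
REJECT_ONLY = ["clear film", "black film", "fresh/green film",
               "QC film", "blank image", "reprocessed image"]
```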

Results

As shown in Table 1, the total number of examinations in the 4 weeks of the conventional batch was 1680. According to the imaging protocol of the department, the number of films taken and forwarded to a radiologist for reporting should have been 2780. The number of films in the reject bin was 407, bringing the total number of used films to 3187; film usage according to the stock record was 3200 films. The number of repeated views was 327, or 10.5% of the total views taken.

Table 1.

Number of repeat and accepted films/images

Batch          Reject films   Scrap/blank films/reprocessed images   Repeats   Repeat (%)   Examinations   Accepted films/images   Total films/images
Conventional   407            80                                     327       10.5         1680           2780                    3187
Digital        189            37                                     152       4.7          1615           3063                    3252

In the digital batch, there were 1615 examinations and, according to PACS records, 3063 images forwarded for reporting. The number of images in the reject folder was 189, bringing the total number of stored images to 3252. The number of repeated views was 152, or 4.7% of the total views taken (Tables 2–4).
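A note on the percentages: the repeat rates in Table 1 are consistent with the number of repeats divided by the sum of accepted and repeated views. This reading is inferred from the reported figures rather than stated explicitly; the check below reproduces both table values.

```python
def repeat_rate(repeats: int, accepted: int) -> float:
    """Repeat rate as a percentage of all views taken (accepted + repeated)."""
    return 100 * repeats / (accepted + repeats)

print(round(repeat_rate(327, 2780), 1))  # 10.5, conventional batch
print(round(repeat_rate(152, 3063), 1))  # 4.7, digital batch
```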

Table 3.

Causes of errors

Technology     Positioning   Exposure   Good   Miscellaneous
Conventional   95            163        40     29
Digital        103           28         7      14

Table 2.

Examinations undertaken with the repeated films/images

Examination       Conventional   Digital
Chest             162            50
Extremities       37             16
Abdomen           32             30
Head              19             16
Pelvic girdle     26             15
Shoulder girdle   9              7
Spine             38             18
Thorax            1              0
Unknown           3              0
Total             327            152

Table 4.

Suspected causes of unnecessary repeated medical imaging examinations

Exposure: In x-ray rooms where only manual exposure selection is available, the error is largely attributable to the selection of incorrect exposure factors, mainly because of incorrect estimation of the patient's size. In x-ray rooms equipped with automatic exposure selection, the error is largely attributable to incorrect positioning of the body part in question in relation to the ion chambers. Radiographers were also found not to routinely consult an exposure chart or measure patient size.

Positioning: Not following the correct positioning technique as recommended by educational references such as Merrill's Atlas.

Good films: The availability of a radiologist on site for consultation and advice reduces the number of rejected good films.

Miscellaneous:
  1. Patient movement or motion: errors resulting from patient movement or difficulty holding breath; the error rate increased with severity of illness.
  2. Equipment mishandling and off-centered images: errors traced to a number of causes, including the Bucky being off center (Bucky not pushed in correctly), the tube being off center (tube not aligned with the Bucky center), tomography not correctly set up, or films placed the wrong way (cassette placed crossways for an AP skull).
  3. Double exposure: errors caused by disorganized practice and loss of concentration by the radiographer.
  4. Overcollimation and overlapped images: errors caused by the radiographer (e.g., trying to fit a number of images on one film, or poor patient preparation leaving jewelry in the way) or by equipment fault (e.g., the lead shutters in the light-beam diaphragm may need adjustment).

Scrap/blank: Several causes were noted, such as a faulty automatic cassette holder, mishandling of films in the darkroom, and disorganized imaging practices.

Reprocessed: Images reprocessed to add markers and apply shutters.

The repeat results, expressed as percentages of the total views taken, are as follows:

  • Exposure errors: 5.2% in the conventional batch and 0.87% in the digital batch.

  • Positioning errors: 3.05% in the conventional batch and 3.2% in the digital batch.

  • Good films: 1.3% in the conventional batch and 0.2% in the digital batch.

  • Miscellaneous errors: 0.9% in the conventional batch and 0.43% in the digital batch.

  • Green, black, and clear films: the reject bin contained 80 blank and clear films, or 20% of the total rejects; as mentioned previously, these were not included in the repeat analysis. Because film is physically eliminated in the digital batch, the only subcategory there is blank images, of which there were 11.

  • Reprocessed images: the digital batch contained 26 reprocessed images.

  • Fixable digitally: the analyzing radiologist selected 124 repeated films from the conventional batch to be scanned, and all 124 were digitally modified into diagnostically acceptable images. That number constitutes 3.9% of the total views in the conventional batch. In the digital environment, only eight images were selected with which the radiologist would have been happy had they received proper QA.

  • Examination-related figures: in the conventional batch, chest examinations made up 40% of the rejects; in the digital batch, that figure was down to 26%. The share of repeated abdominal examinations increased sharply from 7% to 15%, and cranial examinations from 4% to 8%; other examinations decreased by approximately 50%.

Discussion

As a marketing strategy, most CR and PACS providers promote digital imaging as the answer to reducing repeated exposures, a claim based on early reports from when digital radiography emerged as a new technology;6 some more recent publications also carry comments unsupported by proper analysis.9

Unfortunately, advocates of discarding reject analysis programs and of encouraging the deletion of unwanted images fail to understand that keeping all patient records and doses is a moral and legal requirement in case the radiation dose delivered to a patient needs to be determined. They also fail to recognize that conventional systems averaged a 7–14% reject rate; the national reject rate in the UK was 10%.10 If 45% of those rejects were caused by over- or underexposure, that leaves 55% that cannot be improved digitally and that require different approaches to reduce. In a digital department performing 60,000 examinations and producing 100,000 images a year, more than 2700 unnecessary exposures would still be delivered to patients, and that cannot be neglected.
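The "more than 2700" figure appears to assume the recommended 5% repeat rate rather than the historical 7–14%; under that assumption (ours, inferred from the surrounding argument), the arithmetic works out as follows.

```python
images_per_year = 100_000
repeat_rate = 0.05         # recommended target rate, not the 7-14% historical average
non_exposure_share = 0.55  # share of repeats that digital processing cannot fix

remaining = images_per_year * repeat_rate * non_exposure_share
print(remaining)  # 2750.0 -> "more than 2700 unnecessary exposures"
```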

The repeat rate in the conventional batch was 10.5%, well above the 5% recommended by Gray et al.1 On the other hand, with digital technology at an early stage and staff still learning, the reject folder contained 152 images, less than half the conventional figure and less than 5% of the total number of exposed images. Even so, that would still be considered well above the recommended rate once the percentage allocated to exposure errors is deducted from the 5%.

Positioning Errors

The results of the digital batch were slightly higher than those of the conventional batch: positioning errors made up 3% of the total views in the conventional batch and 3.2% in the digital batch. This can be a result of some images previously being categorized as exposure errors; digitally fixing the window and level exposes the poor positioning, so the image is classified as a positioning error. This indicates that technology has no influence on the radiographer's skills and that DRA is still the best method of assessing and tailoring training needs.

In the original planning for acquiring digital technology, it was speculated that, because images would be readily available, radiologists would have greater involvement in checking unverified images from their own workstations, helping radiographers to improve their skills and reduce the number of unnecessary views taken. Instead, the communication gap between radiologists and radiographers seems to have grown, with radiologists locked away in their offices staring at monitors while radiographers were busy ensuring proper data transfer between the RIS and PACS. This problem deserves a separate study of its own.

The major reduction in errors causing unnecessary exposures to the patient was in the exposure factors category: dark and light errors went down from 5.2% of the views in the conventional batch to 0.9% in the digital batch.

Good Films

These films made up 1.3% of the repeated films in the conventional batch. According to the quality assurance seniors in the department, the main reason behind this category is the absence or lack of communication between radiographers and radiologists. Images that one radiographer would pass and another would repeat, based on their experience of different radiologists (some would accept a one- or two-star dark image while others would not), were eliminated with digital technology by adjusting the brightness and contrast of the image. That reduced the unnecessary repetition of good images from 40 to 7.

Miscellaneous Errors

These errors were halved in the digital batch. The main reduction in this category was equipment error, mainly processor errors: with digital technology, errors such as scratched or poorly developed or fixed films were completely eliminated. That has a positive implication for the department, as staff no longer have to worry about equipment errors or spend time monitoring processor performance.

Green, Black, and Clear Films

These films made up 20% of the rejects in the conventional batch. A good number of them could have been exposed films that radiographers decided not to process. Digital technology will not eliminate this specific problem, because radiographers have the option of running primary erasure on a digital plate without processing the image. However, digital radiography did eliminate a few problems, such as static marks and film wastage from exposure to daylight or humidity.

Fixable Digitally

In the conventional environment, the department had a special method of classifying rejected films, and the same categorization was applied to the digital batch. As stated earlier, however, exposure-related repeats decreased by 83% (from 163 to 28), and compared with the conventional batch, the exposure errors remaining in the digital batch were mainly of the highest level.

When the radiologist sorted the repeated images in the conventional batch, 124 were classified as digitally fixable, all of first- and second-level exposure errors. Preventing those repeats would have reduced the repeat rate from 10.5% to 6.5%.

The digital batch had eight images that the radiologist would have accepted had they been properly reprocessed using a different algorithm.

Reprocessed Images

Surprisingly, only 26 images were reprocessed by adding shutters. Adding digital shutters enhances image quality by blocking the glare from the large white border of an image, giving it a professional look and making it user friendly for radiologists. We would have expected the number to be much higher, but it seems that autoshuttering before sending to PACS takes care of most cases. A software enhancement request has been placed for PACS to recognize a shuttered image as a replacement image and not as a new one.

Conclusion

Repeat image analysis is one of the major quality improvement tools used in imaging departments, regardless of the technology. The main challenge is to set up a credible system and work around the deficiencies of the products available on the market. Raising software enhancement requests with the manufacturer is the best approach to tailoring a newly installed digital system to the relationship between the PACS, RIS, and CR systems in place.

It is unfortunate that the companies providing the CR/PACS setup did not give proper consideration to setting up a repeat analysis system catering for the Australian market. This study became a learning experience for the radiographers and the two major companies involved in installing the CR/PACS equipment. A prototype has been developed; however, it is at an early stage, and more detailed work is still required.

The results of the study showed a great advantage of the digital system over the conventional system. A substantial reduction in the number of unnecessary exposures was recorded because of the ability to manipulate the brightness and contrast of an image. The number of repeated good films was also reduced, mainly because of the speed with which images could be passed to radiologists for comment. However, the digital system has not been used to its full capacity and therefore did not have the expected impact on positioning errors. It was expected that radiologists would make random checks on unverified images from their own stations to ensure the production of high-quality images. Unfortunately, things went the other way: radiologists stayed in their offices, in most cases doing their own QA adjustments, and communication between radiologists and radiographers worsened. It was up to radiographers to ask for a radiologist's comment on an image; that helped reduce the repetition of good images but had no positive effect on positioning errors.

Radiographers need to be cautious not to develop a complacent attitude and become overconfident because of the advanced technology. In conventional setups, radiographers used to attribute 50% of repeats to film processing and wrong exposure factors. The DRA in the digital system showed that the reject rate for positioning errors was slightly on the rise, and the number of rejects did not fall as much as expected: the chest examination share of rejects went down from 40% to 26% rather than the expected 20%, and abdominal and cranial examinations almost doubled their share, which shows a serious need for in-house training and clinical meetings. It has been noted, however, that some films rejected in the conventional batch and classified as exposure errors would also qualify as positioning errors, which explains the slight increase in positioning errors in the digital environment. It is also recommended that radiologists take a more proactive approach to commenting on image quality; it would benefit all parties if they took part in the training program.

Acknowledgments

Special thanks are due to Ms. Susan Fisher, Mr. Matthew Lam, and Mr. Mario Carpini for their help with the study.

References

1. Gray J, Winkler N, Stears J, Frank E. Quality Control in Diagnostic Imaging. Maryland, USA: Aspen Publishers; 1983.
2. Peer S, et al. Comparative reject analysis in conventional film-screen and digital storage phosphor radiography. Eur Radiol. 1999;9:1693–1696. doi:10.1007/s003300050911.
3. Honea R, Blado ME, Ma Y. Is reject analysis necessary after converting to computed radiography? J Digit Imaging. 2002;15(Suppl):141–152. doi:10.1007/s10278-002-5028-7.
4. PathSpeed Workstation 8.1, Operating Instructions. GE Medical Systems, P.O. Box 414, Milwaukee, WI 53201, USA; 2000.
5. Fuji Computed Radiography FCR 5000, Operation Manual. 2nd ed. Fuji Photo Film Co., Ltd.; August 1998.
6. Tucker DM, McEachern M. Quality assurance and quality control of an intensive care picture archiving and communication system. J Digit Imaging. 1995;8(4):162–167. doi:10.1007/BF03168715.
7. Registration Requirements and Industry Best Practice for Ionising Radiation Apparatus Used in Diagnostic Imaging. Environment Protection Authority NSW Radiation Guideline No. 6, Part 2: Fluoroscopy and Radiography, Clause 3.6; November 1999.
8. Ballinger P. Merrill's Atlas of Radiographic Positions and Radiologic Procedures. 7th ed. USA: Mosby; 1991.
9. Pilling JR. Picture archiving and communication systems: the users' view. Br J Radiol. 2003;76:519–524. doi:10.1259/bjr/67551353.
10. Weatherburn GC, Bryan S, West M. A comparison of image reject rates when using film, hard copy computed radiography and soft copy images on picture archiving and communication systems (PACS) workstations. Br J Radiol. 1999;72:653–660. doi:10.1259/bjr.72.859.10624322.

