Journal of General Internal Medicine. 2006 May;21(5):460–465. doi: 10.1111/j.1525-1497.2006.00427.x

Competency in Chest Radiography

A Comparison of Medical Students, Residents, and Fellows

Lewis A Eisen 1, Jeffrey S Berger 2, Abhijith Hegde 2, Roslyn F Schneider 2
PMCID: PMC1484801  PMID: 16704388

Abstract

BACKGROUND

Accurate interpretation of chest radiographs (CXR) is essential, as clinical decisions depend on these readings.

OBJECTIVE

We sought to evaluate CXR interpretation ability at different levels of training and to determine factors associated with successful interpretation.

DESIGN

Ten CXR were selected from the teaching file of the internal medicine (IM) department. Participants were asked to record the most important diagnosis, their certainty in that diagnosis, their interest in a pulmonary career, and the adequacy of their CXR training. Two investigators independently scored each CXR on a scale of 0 to 2.

PARTICIPANTS

Participants (n = 145) from a single teaching hospital were third-year medical students (MS) (n = 25), IM interns (n = 44), IM residents (n = 45), fellows from the divisions of cardiology and pulmonary/critical care (n = 16), and radiology residents (n = 15).

RESULTS

The median overall score was 11 of 20. An increased level of training was associated with a higher overall score (MS 8, intern 10, IM resident 13, fellow 15, radiology resident 18; P<.001). Overall certainty was significantly correlated with overall score (r = .613, P<.001). Internal medicine interns and residents interested in a pulmonary career scored 14 of 20, while those not interested scored 11 (P = .027). Pneumothorax, misplaced central line, and pneumoperitoneum were diagnosed correctly 9%, 26%, and 46% of the time, respectively. Only 20 of 131 (15%) participants felt their CXR training was sufficient.

CONCLUSION

We identified factors associated with successful CXR interpretation, including level of training, field of training, interest in a pulmonary career and overall certainty. Although interpretation improved with training, important diagnoses were missed.

Keywords: education, medical, radiography, thoracic, clinical competence, educational measurement


In academic medical centers, accurate interpretation of chest radiographs (CXR) is essential, as house officers make clinical decisions based on their interpretations. Situations arise where action must be taken expeditiously before readings can be verified by an attending radiologist. Clinical decisions based on incorrect interpretations have potential implications for patient care.

Although competency in CXR interpretation is important, formal training is not standardized. This may be partly because of the lack of importance national organizations place on CXR interpretation.1–7 Without national standards, there is wide variability among medical school and residency training programs.

Prior studies have shown inaccurate CXR interpretation by emergency medicine physicians,8,9 medical staff,10 primary care physicians,11 and anesthesiologists.12 Faulty interpretations change management in up to 11% of cases.10 Most studies, but not all,9,13,14 have demonstrated improved CXR interpretation scores with level of training.15–19 One study evaluating CXR interpretation by medical students (MS) showed that they performed poorly on common conditions and lacked confidence.20

We conducted a study to evaluate the competency of MS and house officers from different specialties in CXR interpretation. We also sought to identify factors that affect competence.

METHODS

Study Design

The study took place at Beth Israel Medical Center, University Hospital and Manhattan Campus for The Albert Einstein College of Medicine, a teaching hospital with approximately 700 medical/surgical beds. The Beth Israel Medical Center Institutional Review Board approved this study. The requirement for written informed consent was waived, but participation was voluntary and all subjects gave oral consent.

Three authors selected a convenience sample of 10 CXR from the teaching file of an internal medicine (IM) training program. The CXR were chosen to represent common conditions that subjects should be expected to diagnose, and the images were printed on photography-grade paper. Each CXR was viewed as having only 1 correct diagnosis for the purpose of this study. We specifically included 1 normal CXR and 3 examples of radiographic emergencies: pneumothorax, misplaced central venous catheter (subclavian line with distal segment in internal jugular vein), and pneumoperitoneum (Figs. 1–4).

Figure 1. Normal.

Figure 2. Pneumothorax.

Figure 3. Misplaced central venous catheter.

Figure 4. Pneumoperitoneum.

All CXR were reviewed independently by 2 experts in CXR interpretation (a radiology attending and a pulmonary/critical care attending). They were not involved in the selection of the CXR and were viewing the CXR for the first time. The experts were blinded to the study's purpose as well as demographic or clinical information and did not have any other role in the conduct of the study. They independently recorded the most important finding on each CXR. There was uniform agreement on all diagnoses. The CXR included in the study are shown in Table 1.

Table 1.

Chest Radiograph Diagnoses

Number Diagnosis
1 Normal
2 Mass
3 Pleural effusion
4 Pneumothorax
5 Dextrocardia or right/left mislabeled film
6 Hyperinflation
7 Apical infiltrate
8 Misplaced central line (subclavian line into internal jugular vein)
9 Consolidation
10 Free air under diaphragm (pneumoperitoneum)

The study was designed to enroll all third-year MS rotating through their IM rotation, IM interns and residents, radiology residents, pulmonary/critical care fellows, and cardiology fellows in the hospital. Most subjects were given the CXR survey at a noon conference, which replaced a daily lecture (a 1-hour period). Two authors enrolled a few additional subjects who did not attend the conference, administering the test to them individually over 1 week. Of the eligible subjects, 90% enrolled: 25/25 MS (100%), 44/47 IM interns (94%), 45/54 IM residents (83%), 16/19 fellows (84%), and 15/16 radiology residents (94%). The most common reason for failure to enroll was vacation time.

All participants had undergone standard CXR training.

Medical students at Albert Einstein College of Medicine undergo a mandatory 2-week clerkship in diagnostic radiology, although not all had completed this clerkship before the survey. During the IM clerkship, they receive 2 introductory lectures by a chest radiologist and participate in the faculty-moderated, monthly interactive sessions held for IM interns and residents. The MS were in the final (12th) week of their IM rotation.

IM interns and residents attend a mandatory monthly interactive CXR conference, led by a pulmonary/critical care attending who selects CXR from a teaching file, at which house staff attempt to establish the diagnoses. In addition, there is daily formal CXR review during the 5 critical care rotation months over the course of residency; these rounds are led by a pulmonary/critical care attending in the medical intensive care unit (MICU) and by a cardiology attending in the coronary care unit. Internal medicine house staff had completed at least 6 months of training at the time of the study. The amount of prior CXR training was not exactly equal for subjects at a particular training level; for example, only 50% of IM interns had completed MICU training at the time of the survey.

Pulmonary/critical care fellows receive a weekly conference in CXR interpretation led by an attending radiologist and review cases daily with pulmonary attendings. The cardiology fellowship does not include formal CXR training, but CXR are frequently reviewed in clinical practice. Radiology residents have weekly formal conferences in chest radiography in addition to spending 3 months on the chest radiography service over the 4 years of residency; all had completed at least 1 chest radiography month at the time of the survey.

In daily practice, all radiographs are archived on a digital system and are easily retrievable. High-quality monitors are available on medical wards for CXR review. Internal medicine house officers are responsible for initial CXR interpretations at night. Films identified as difficult may be reviewed with the radiology resident on call. An attending radiologist is also available to read films off-site for cases deemed particularly challenging.

Each participant was given a survey that included questions about their sex, postgraduate year, field of training, career interest in pulmonary medicine, and perceived ability to interpret CXR independently. Subjects recorded “Yes” if they felt able to read CXR independently and “No” if they did not. Each subject was then shown 10 CXR and asked to write the most important finding. They were specifically told that there might be 1 or more normal CXR. After recording their diagnosis on a particular CXR, participants marked their “certainty” on a scale of 0 to 4 (0 = 0%, 1 = 25%, 2 = 50%, 3 = 75%, 4 = 100%).21 A cumulative certainty total was compiled for a maximum “overall certainty” of 40 points. Subjects were not allowed to consult outside sources. Although unlimited time was offered, all subjects finished in less than an hour.

Two blinded, independent graders gave each CXR a “score” on a scale of 0 to 2 (0 = incorrect, 1 = partially correct, 2 = correct).20 A third party adjudicated any disagreements between the 2 graders. Partial credit was given if a less specific term was written; for example, on the CXR demonstrating consolidation, 1 point was awarded if a subject recorded “opacity.” An “overall score” was compiled by summing the scores for each CXR, for a possible total of 20 points.
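For illustration, the scoring arithmetic can be expressed directly. The following is a minimal sketch in Python of how per-film grades and certainty ratings roll up into the two summary measures; the function names are hypothetical and this is not the study's actual procedure.

```python
# Illustrative sketch of the scoring scheme (hypothetical names, not study code).

def grade_film(grader1: int, grader2: int, adjudicated: int | None = None) -> int:
    """Score one CXR on the 0-2 scale: use the graders' common value,
    otherwise fall back to the third-party adjudication."""
    if grader1 == grader2:
        return grader1
    if adjudicated is None:
        raise ValueError("disagreement requires third-party adjudication")
    return adjudicated

def overall_totals(scores: list[int], certainties: list[int]) -> tuple[int, int]:
    """Sum the 10 per-film scores (max 20) and certainty ratings (max 40)."""
    assert len(scores) == len(certainties) == 10
    assert all(0 <= s <= 2 for s in scores)
    assert all(0 <= c <= 4 for c in certainties)
    return sum(scores), sum(certainties)

# A participant who is partially correct (1) with 75% certainty (3) on every
# film would total 10/20 overall score and 30/40 overall certainty.
print(overall_totals([1] * 10, [3] * 10))  # -> (10, 30)
```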

Statistical Analysis

Descriptive data are presented as medians with the 25th and 75th percentiles following in parentheses. The analysis of variance F test was used to investigate differences in score, confidence, and certainty by level of training. When 2 groups were compared, the t test for independent samples was used. Pearson's correlation was used to investigate univariate associations between continuous variables.
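These univariate tests correspond to standard routines in any statistics package. The following is a minimal sketch using Python's SciPy (standing in for the SPSS 11.0 actually used), with entirely made-up toy data rather than study results.

```python
# Hedged sketch of the univariate analyses; SciPy stands in for SPSS 11.0,
# and the numbers below are toy data, not study results.
from scipy import stats

# ANOVA F test: does overall score differ across more than 2 levels of training?
ms, interns, residents = [8, 9, 7], [10, 11, 9], [13, 12, 14]
f_stat, p_anova = stats.f_oneway(ms, interns, residents)

# t test for independent samples: compare exactly 2 groups, e.g. house staff
# interested in a pulmonary career vs. their peers.
t_stat, p_ttest = stats.ttest_ind([14, 15, 13], [11, 10, 12])

# Pearson's correlation: association between 2 continuous variables,
# e.g. overall certainty vs. overall score.
r, p_corr = stats.pearsonr([23, 22, 28, 30, 35], [8, 10, 13, 15, 18])

print(f"ANOVA P={p_anova:.3f}; t test P={p_ttest:.3f}; r={r:.2f} (P={p_corr:.3f})")
```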

Factors independently related to overall CXR score were established through multiple logistic regression analysis. The group was divided into “high scorers” (overall score in the top 25th percentile) and “low scorers” (all other scores). Odds ratios were calculated to determine the strength of the associations between factors and overall score in the top 25th percentile.
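The dichotomization and odds-ratio calculation can likewise be sketched. The fragment below uses Python's statsmodels on synthetic data to show how odds ratios and confidence intervals of this kind arise from exponentiated logistic regression coefficients; the column names and data are illustrative assumptions, not the study dataset.

```python
# Hedged sketch of the logistic regression; synthetic data, hypothetical names.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 145
df = pd.DataFrame({
    "radiology": rng.integers(0, 2, n),      # field of training (1 = radiology)
    "pulm_interest": rng.integers(0, 2, n),  # interest in a pulmonary career
    "level": rng.integers(1, 6, n),          # level of training
    "certainty": rng.integers(0, 41, n),     # overall certainty (0-40)
    "score": rng.integers(0, 21, n),         # overall score (0-20)
})

# "High scorers" have an overall score in the top quartile; all others are "low scorers".
df["high_scorer"] = (df["score"] >= df["score"].quantile(0.75)).astype(int)

predictors = ["radiology", "pulm_interest", "level", "certainty"]
fit = sm.Logit(df["high_scorer"], sm.add_constant(df[predictors])).fit(disp=0)

# Odds ratios and their 95% CIs are the exponentiated coefficients and CI bounds.
print(np.exp(fit.params))
print(np.exp(fit.conf_int(alpha=0.05)))
```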

A P-value of less than .05 was considered statistically significant. Analyses were performed using the SPSS 11.0 statistical analysis program (SPSS, Inc., Chicago, IL).

RESULTS

The participants included 25 third-year MS and 120 house officers (44 interns, 60 residents, and 16 fellows) at 1 teaching hospital. Sixty (41%) participants were women. Of the 104 interns and residents, 89 were from the department of IM (44 PGY-1, 20 PGY-2, 24 PGY-3, 1 PGY-4) and 15 were from the department of radiology (5 PGY-2, 4 PGY-3, 3 PGY-4, 3 PGY-5). Of the 16 fellows, 11 were from the division of cardiology (5 PGY-4, 4 PGY-5, 2 PGY-7) and 5 were from the division of pulmonary/critical care (1 PGY-4, 3 PGY-5, 1 PGY-6).

The median overall score achieved by the entire cohort was 11/20 (8–15, 25th–75th percentile). The median overall score increased with level of training (Table 2). Among all house officers, postgraduate year was significantly correlated with overall score (r = .537, P<.001). There was no significant difference in overall score between men and women.

Table 2.

Overall Score and Overall Certainty by Level of Training

                           Medical Student  IM Intern  IM Resident  Fellow    Radiology Resident  P value
                           (n = 25)         (n = 44)   (n = 45)     (n = 16)  (n = 15)
Overall score (0–20)       8.0              10.0       12.8         15.0      17.5                <.001
Overall certainty (0–40)   23.0             22.0       28.0         30.0      35.1                <.001

The median overall certainty among the entire cohort was 27/40 (20–31). Overall certainty increased with level of training (Table 2). Among all house officers, postgraduate year was significantly correlated with increasing overall certainty (r = .427, P<.001). Among the entire cohort, overall score was significantly correlated with overall certainty (r = .613, P<.001).

Table 3 lists the scores and certainty obtained for the normal and critical CXR. For these CXR, few subjects were absolutely certain of their diagnoses. Subjects recorded absolute certainty on 26/145 (18%) of the normal CXR, 12/145 (9%) of the pneumothorax, 34/145 (24%) of the line misplacement, and 36/145 (25%) of the pneumoperitoneum.

Table 3.

Score and Certainty for the Normal and Critical Chest Radiographs by Level of Training

                           Medical Student  Intern     IM Resident  Fellow    Radiology Resident  P value
                           (n = 25)         (n = 44)   (n = 45)     (n = 16)  (n = 15)
Normal
 Score (0–2)               0.8              1.0        1.2          1.9       1.5                 .250
 Certainty (0–4)           1.8              2.1        2.9          2.9       3.3                 <.001
Pneumothorax
 Score (0–2)               0.0              0.1        0.2          0.4       1.2                 .255
 Certainty (0–4)           2.0              1.9        2.6          2.8       2.9                 <.001
Misplaced central line
 Score (0–2)               0.0              0.1        0.9          0.7       1.3                 <.001
 Certainty (0–4)           2.4              2.0        3.0          3.1       3.5                 <.001
Pneumoperitoneum
 Score (0–2)               0.2              0.5        1.1          1.5       1.9                 .002
 Certainty (0–4)           2.0              2.0        2.7          2.9       3.9                 .005

We investigated the particular CXR where subjects claimed to be absolutely certain of their diagnoses and found that many of them were nonetheless wrong: normal 6/26 (23%), pneumothorax 9/12 (75%), line misplacement 10/34 (29%), and pneumoperitoneum 5/36 (14%).

Among the 89 IM interns and residents, 10 (11%) were interested in pulmonary medicine as a career. Internal medicine interns and residents interested in a career in pulmonary medicine scored 14 (11–16) while their peers scored 11 (9–13.5) (P = .027). Overall certainty did not differ between groups.

Radiology residents scored higher than IM residents. Median overall score was 11 (8–14) for IM residents and 18 (15–18) for radiology residents (P<.001). Overall certainty was significantly higher among radiology residents (P<.001).

Only 21/145 (14%) of the participants felt capable of interpreting CXR independently: 1/25 (4%) MS, 1/44 (2%) interns, 11/45 (25%) IM residents, 4/16 (25%) fellows, and 4/15 (27%) radiology residents. Participants who felt able to interpret CXR independently scored 16 (12–18), while other participants scored 11 (8–14) (P<.001). Overall certainty was significantly higher in the participants who felt able to interpret CXR independently (P<.001).

In a logistic regression analysis, the factors independently associated with overall score were field of training, interest in a pulmonary career, level of training, and overall certainty. After controlling for other variables, perceived ability to interpret CXR independently was not significant (Table 4).

Table 4.

Multiple Logistic Regression Analysis of Factors Associated with Overall Score in the Top 25th Percentile

Factor                               Odds Ratio  95% Confidence Interval  P value
Radiology vs. other fields           34.4        3.65–325.1               .002
Interest in a pulmonary career       6.12        1.63–22.89               .007
Level of training                    5.70        2.29–14.19               <.001
Perceived adequacy of CXR training   2.93        0.70–12.27               .141
Overall certainty                    2.10        1.16–3.81                .002

DISCUSSION

Despite its importance, national organizations have not stressed CXR interpretation. Only 29% of medical schools have a required clerkship in diagnostic radiology.22 The Liaison Committee on Medical Education (LCME) states only that “Educational opportunities must be available in … diagnostic imaging.”1 Thus, specific CXR interpretation training is not mandated for undergraduate medical education.

While the American Board of Internal Medicine has a requirement for competency in electrocardiogram interpretation, there is no similar requirement for CXR interpretation.2 Surprisingly, it is also not a required procedure for board certification in cardiology, pulmonary medicine, or critical care medicine.2

The Accreditation Council for Graduate Medical Education (ACGME) is responsible for accreditation for most residency programs in the United States. The ACGME IM Residency Review Committee's program requirements state that “All residents should develop competency in interpretation of chest roentgenograms.”3 The program requirements for pulmonary medicine, cardiovascular medicine, and diagnostic radiology have similar statements.46 There is no mention of how competency should be achieved or assessed. The ACGME critical care medicine program requirements make no mention of CXR interpretation.7

At many institutions, house officers are expected to interpret CXR and make clinical decisions before a formal reading by a radiologist. This is particularly important for radiographic emergencies such as pneumothorax, pneumomediastinum, pneumoperitoneum, and misplacement of central venous catheters, pulmonary artery catheters, intra-aortic balloon pumps, chest tubes, gastric tubes, and endotracheal tubes.23 Our study included 3 emergencies: pneumothorax (misdiagnosed by 91%), misplaced central venous catheter (misdiagnosed by 74%), and pneumoperitoneum (misdiagnosed by 54%). In addition, a significant percentage of participants who were absolutely certain of their diagnoses on these emergencies were wrong. This is especially worrisome because house staff absolutely certain of their CXR interpretation may not ask a senior colleague for a second opinion.

Subjects also had difficulty interpreting the normal CXR. This occurred even though they were instructed that 1 or more of the CXR in the survey might be normal. Other researchers have noted difficulty in interpreting a study as normal.15,16,18,24 Potentially, interpreting a normal CXR as abnormal could lead to inappropriate decisions.

While the overall score was low, we identified several factors significantly associated with successful interpretation. Overall certainty was correlated with overall score. Radiology residents performed significantly better than IM residents. Internal medicine interns and residents interested in a pulmonary career performed significantly better than their peers. Although we do not have data to confirm this, residents interested in a pulmonary career may have done more self-study, may have enrolled in more rotations where they were exposed to CXR teaching, or may have attended CXR teaching conferences more regularly. Finally, overall score increased with level of training. This was found even though the amount of CXR instruction was not the same for each individual participant; for example, while all of the IM interns had had the opportunity to attend 6 formal CXR lectures, only 50% had had an MICU month at the time of the CXR survey.

One prior study identified certainty on a particular CXR as being associated with successful interpretation of that CXR.25 There is also evidence that when a clinician is certain about an interpretation, it is less likely to be wrong.26 However, as our study demonstrates, verification may be required even when a house officer is 100% confident.

As noted by Pfeifer, 2 possible solutions exist to the problems in CXR interpretation identified by this and prior studies.27 The first approach would be to have all CXR immediately interpreted by a qualified radiologist, either by increasing the number of on-site radiologists or by tele-radiology. The second approach would be to improve the interpretation skills of clinicians at the point of care. Theoretically, there is great value in integrating the radiographic interpretation with findings from the history, physical exam, and laboratory studies. For example, a radiologist may interpret a CXR with bilateral infiltrates as pulmonary edema, whereas a clinician may integrate the patient's history (cough and sputum production), physical exam (hyperthermia), and laboratory findings (increased white blood cell count) and diagnose multilobar pneumonia.

There are several possible approaches to improving CXR interpretation skills. Computer-aided diagnosis of CXR can improve interpretation.23,28,29 One study showed that using a picture archiving and communication system (PACS) rather than standard films improves interpretation;30 however, our study was done in an institution with PACS and important diagnoses were still missed. A program of formal training significantly improves CXR readings,31,32 and computer-based training may be more effective than traditional methods.32 Quality improvement initiatives reduce error rates.33,34 In a prospective study by Espinosa, a program stressing an interdisciplinary approach and review of all misinterpreted films led to significantly fewer errors.34 Other potential methods to improve CXR interpretation would be web-based modules or encouraging IM house staff to enroll in formal radiology electives. Perhaps the simplest method would be ensuring that MS and house staff read all CXR of patients under their care and review the results with a physician with proven competence in CXR interpretation. The utility of these methods warrants further investigation.

Our study has several strengths compared with prior studies in this area. First, it was one of the largest studies of CXR interpretation in terms of number of participants. Second, subjects from multiple fields of medicine and multiple training levels were compared. Third, this is only the second study to directly confirm that confidence in a particular CXR reading is associated with successful interpretation. Fourth, we have shown that interest in a pulmonary career is associated with successful interpretation. Fifth, we have identified particular CXR emergencies where interpretation skills are lacking. Finally, we have shown that even subjects who are 100% sure of their interpretations are wrong a high percentage of the time.

There are several limitations to our study. First, a small and somewhat arbitrary sample of CXR was chosen for the survey. While these were representative of common conditions, results may have been different with other CXR. Second, we did not provide house staff with clinical context for the CXR. Schreiber demonstrated in 1963 that clinical history improves CXR interpretation,35 and a systematic review of 16 articles also demonstrated that test interpretation improves if clinical information is provided.36 We chose not to provide clinical information because this was a study of how well trainees interpret important, common, unambiguous radiographic findings; adding clinical information would make the results less clear, as responses could reflect understanding of the clinical scenario more than the ability to recognize radiographic abnormalities. Additionally, the current training system in the United States requires frequent hand-offs of clinical information, which is variably transmitted to the persons required to check CXR. Third, the gold standard in our study can be questioned; studies have shown that even experienced radiologists may have differing interpretations of a CXR.19,37–39 In our study, however, the blinded experts were in 100% agreement on our series of CXR, probably because the CXR were classic examples of common conditions. Fourth, the CXR in the study were depicted on paper; although the quality of the reproductions was high, subjects were used to interpreting CXR on digital monitors, and this may have affected the results. Finally, this study took place at 1 teaching institution, and other institutions, with different teaching methods, may have different results.

In conclusion, we have identified deficiencies in CXR interpretation with potential implications for MS education, house staff education, and patient care. If house officers are expected to make clinical decisions based on CXR readings, more effective training is needed, particularly in radiographic emergencies. Further research is needed to determine the best methods of achieving and assessing competency in CXR interpretation.

Acknowledgments

The principal investigator had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

REFERENCES

1. Liaison Committee on Medical Education. LCME accreditation standards. Liaison Committee on Medical Education website. Available at http://www.lcme.org. Accessed April 2005.
2. American Board of Internal Medicine. Internal medicine policies: requirements for certification in internal medicine. American Board of Internal Medicine website. Available at http://www.abim.org. Accessed April 2005.
3. Accreditation Council for Graduate Medical Education. Internal medicine program requirements: internal medicine. Accreditation Council for Graduate Medical Education website. July 2003. Available at http://www.acgme.org. Accessed April 2005.
4. Accreditation Council for Graduate Medical Education. Internal medicine program requirements: pulmonary medicine. Accreditation Council for Graduate Medical Education website. July 1999. Available at http://www.acgme.org. Accessed April 2005.
5. Accreditation Council for Graduate Medical Education. Internal medicine program requirements: cardiovascular medicine. Accreditation Council for Graduate Medical Education website. July 1999. Available at http://www.acgme.org. Accessed April 2005.
6. Accreditation Council for Graduate Medical Education. Diagnostic radiology program requirements. Accreditation Council for Graduate Medical Education website. December 2003. Available at http://www.acgme.org. Accessed April 2005.
7. Accreditation Council for Graduate Medical Education. Internal medicine program requirements: critical care medicine. Accreditation Council for Graduate Medical Education website. July 1999. Available at http://www.acgme.org. Accessed April 2005.
8. Benger JR, Lyburn ID. What is the effect of reporting all emergency department radiographs? Emerg Med J. 2003;20:40–3. doi: 10.1136/emj.20.1.40.
9. Gatt ME, Spectre G, Paltiel O, et al. Chest radiographs in the emergency department: is the radiologist really necessary? Postgrad Med J. 2003;79:214–7. doi: 10.1136/pmj.79.930.214.
10. Grosvenor LJ, Verma R, O'Brien R, et al. Does reporting of plain chest radiographs affect the immediate management of patients admitted to a medical assessment unit? Clin Radiol. 2003;58:719–22. doi: 10.1016/s0009-9260(03)00219-8.
11. Kuritzky L, Hardy RI, Curry RW. Interpretation of chest roentgenograms by primary care physicians. South Med J. 1987;80:1347–51. doi: 10.1097/00007611-198711000-00004.
12. Kaufman B, Dhar P, O'Neill D, et al. Chest radiograph interpretation skills of anesthesiologists. J Cardiothorac Vasc Anesth. 2001;15:680–3. doi: 10.1053/jcan.2001.28307.
13. Young M, Marrie TJ. Interobserver variability in the interpretation of chest roentgenograms of patients with possible pneumonia. Arch Intern Med. 1994;154:2729–32. doi: 10.1001/archinte.1994.00420230122014.
14. Herman PG, Hessel SJ. Accuracy and its relationship to experience in the interpretation of chest radiographs. Invest Radiol. 1975;10:63–7. doi: 10.1097/00004424-197501000-00008.
15. Eng J, Mysko WK, Weller GE, et al. Interpretation of emergency department radiographs: a comparison of emergency medicine physicians with radiologists, residents with faculty, and film with digital display. AJR. 2000;175:1233–99. doi: 10.2214/ajr.175.5.1751233.
16. Monnier-Cholley L, Carrat F, Cholley BP, et al. Detection of lung cancer on radiographs: receiver operating characteristic analyses of radiologists', pulmonologists', and anesthesiologists' performance. Radiology. 2004;233:799–805. doi: 10.1148/radiol.2333031478.
17. Walsh-Kelly CM, Melzer-Lange MD, Hennes HM, et al. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. Am J Emerg Med. 1995;13:262–4. doi: 10.1016/0735-6757(95)90196-5.
18. Potchen EJ, Cooper TG, Sierra AE, et al. Measuring performance in chest radiography. Radiology. 2000;217:456–9. doi: 10.1148/radiology.217.2.r00nv14456.
19. Quekel LG, Kessels AG, Goei R, et al. Detection of lung cancer on the chest radiograph: a study on observer performance. Eur J Radiol. 2001;39:111–6. doi: 10.1016/s0720-048x(01)00301-1.
20. Jeffrey DR, Goddard PR, Callaway MP, et al. Chest radiograph interpretation by medical students. Clin Radiol. 2003;58:478–81. doi: 10.1016/s0009-9260(03)00113-2.
21. Lave JR, Bankowitz RA, Hughes-Cromwick P, et al. Diagnostic certainty and hospital resource use. Cost Qual Q J. 1997;3:26–32.
22. Samuel S, Shaffer K. Profile of medical student teaching in radiology: teaching methods, staff participation and rewards. Acad Radiol. 2000;7:868–74. doi: 10.1016/s1076-6332(00)80634-0.
23. Oldham SA. ICU chest radiographs—ICU calamities: evaluation of the portable chest radiograph. Emerg Radiol. 2002;9:43–54. doi: 10.1007/s10140-001-0181-8.
24. Shiraishi J, Abe H, Engelmann R. Computer-aided diagnosis to distinguish benign from malignant solitary pulmonary nodules on radiographs: ROC analysis of radiologists' performance—initial experience. Radiology. 2003;227:469–74. doi: 10.1148/radiol.2272020498.
25. Mayhue FE, Rust DD, Aldag JC, et al. Accuracy of interpretation of emergency department radiographs: effect of confidence levels. Ann Emerg Med. 1989;18:826–30. doi: 10.1016/s0196-0644(89)80205-7.
26. Lufkin KC, Smith SW, Matticks CA, et al. Radiologists' review of radiographs interpreted confidently by emergency physicians infrequently leads to changes in patient management. Ann Emerg Med. 1998;31:202–7. doi: 10.1016/s0196-0644(98)70307-5.
27. Pfeifer M. Nonradiologists reading radiographs: good medicine or stretching the scope of practice? J Cardiothorac Vasc Anesth. 2001;15:675–6. doi: 10.1053/jcan.2001.29021.
28. Monnier-Cholley L, MacMahon H, Katsuragawa S. Computer-aided diagnosis for detection of interstitial opacities on chest radiographs. AJR. 1998;171:1651–6. doi: 10.2214/ajr.171.6.9843307.
29. Kobayashi T, Xu XW, MacMahon H, et al. Effect of a computer-aided diagnosis scheme on radiologists' performance in detection of lung nodules on radiographs. Radiology. 1996;199:843–8. doi: 10.1148/radiology.199.3.8638015.
30. Weatherburn G, Bryan S, Nicholas A, et al. The effect of a picture archiving and communication system (PACS) on diagnostic performance in the accident and emergency department. Emerg Med J. 2000;17:180–4. doi: 10.1136/emj.17.3.180.
31. Dawes TJ, Vowler SL, Allen CM, et al. Training improves medical student performance in image interpretation. Br J Radiol. 2004;77:775–6. doi: 10.1259/bjr/66388556.
32. Maleck M, Fischer MR, Kammer B, et al. Do computers teach better? A media comparison study for case-based teaching in radiology. Radiographics. 2001;21:1025–32. doi: 10.1148/radiographics.21.4.g01jl091025.
33. Preston CA, Marr JJ, Amaraneni KK, et al. Reduction of “call-backs” to the emergency department due to discrepancies in the plain radiograph interpretation. Ann Emerg Med. 1988;17:1019–23. doi: 10.1016/s0735-6757(98)90036-5.
34. Espinosa JA, Nolan TW. Reducing errors made by emergency physicians interpreting radiographs: longitudinal study. BMJ. 2000;320:737–40. doi: 10.1136/bmj.320.7237.737.
35. Schreiber MH. The clinical history as a factor in roentgenogram interpretation. JAMA. 1963;185:399–401. doi: 10.1001/jama.1963.03060050077027.
36. Loy CT, Irwig L. Accuracy of diagnostic tests read with and without clinical information: a systematic review. JAMA. 2004;292:1602–9. doi: 10.1001/jama.292.13.1602.
37. Herman PG, Gerson DE, Hessel SJ, et al. Disagreements in chest roentgen interpretation. Chest. 1975;68:278–82. doi: 10.1378/chest.68.3.278.
38. Robinson PJ, Wilson D, Coral A, et al. Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol. 1999;72:323–30. doi: 10.1259/bjr.72.856.10474490.
39. Albaum MN, Hill LC, Murphy M, et al. Interobserver reliability of the chest radiograph in community-acquired pneumonia. Chest. 1996;110:343–50. doi: 10.1378/chest.110.2.343.

