Abstract
Objective
To identify and describe general practitioners’ (GPs’) views on radiology reports, using plain radiography for back pain as the case.
Design
Qualitative study with three focus-group interviews analysed using Giorgi's method as modified by Malterud.
Setting
Southern Norway.
Subjects
Five female and eight male GPs aged 32–57 years who had practised for 3–15 years and were from 11 different practices.
Main outcome measures
Descriptions of GPs’ views.
Results
GPs wanted radiology reports to indicate more clearly the meaning of radiological terminology, the likelihood of disease, the clinical relevance of the findings, and/or the need for further investigations. GPs stated that good referral information leads to better reports.
Conclusion
These results can help to improve communication between radiologists and GPs. The issues identified in this study could be further investigated in studies that can quantify GPs’ satisfaction with radiology reports in relation to characteristics of the GP, the radiologist, and the referral information.
Keywords: Back pain, family practice, general practitioner, qualitative research, radiography, radiology report
Clinicians wish for clarity in radiology reports and qualitative studies can help to explore exactly which issues need clarification.
General practitioners in this study stated that good referral information leads to better radiology reports.
They wanted clarification of the meaning of radiological terminology, the likelihood of disease, the clinical relevance of findings, and the need for further investigations.
The radiology report is important in communication with clinicians and in the care of patients. Studies of clinicians’ views of radiology reports may therefore help to improve both professional communication and patient care. Most such studies have been surveys of multiple aspects of radiology reporting considered a priori by researchers, not clinicians [1–6]. Response rates varied and were often low (23–52%) [1], [2], [5–8].
Qualitative studies in which clinicians use their own words can help to identify aspects of radiology reports that clinicians themselves find essential [9], [10]. This could facilitate the design of a focused survey with few but essential questions [11]. Qualitative approaches are also valuable to get deeper insight into clinicians’ views, to generate hypotheses, and to select topics for further study when little prior research exists [10], [12].
Unlike hospital clinicians, general practitioners (GPs) seldom meet radiologists to exchange views and usually rely solely on the written radiology report. Despite this, few studies have explored GPs’ views on radiology reports. In a survey [2] in which 36 of 104 respondents were GPs, “clarity” was graded as the most important feature of such reports. Exactly which issues need clarification was not examined. Hence, we performed a qualitative study to identify and describe GPs’ views on radiology reports, using plain radiography for back pain as the case.
Material and methods
We used group interviews [13], [14] to obtain data regarding views on radiology reports from a purposeful sample [12], [15] of Norwegian GPs after receiving their informed consent. The same interviews yielded data published elsewhere on decisions to order radiography [16].
Sample
To obtain a range of views, GPs of different experience and practice type/location were sampled until information redundancy (see Analysis below) [15]. This was achieved after recruiting three separate groups of GPs in 1998 from a research meeting (four academic GPs), a national educational course mandatory for GP certification (five trainees), and a professional meeting (four experienced GPs) (Table I). The meeting/course participants were asked to volunteer for a group interview performed immediately after the meeting/course.
Table I. Characteristics of the 13 participating GPs.

| Characteristic | Value |
| --- | --- |
| Age in years, range (median) | 32–57 (40) |
| Men, n (%) | 8 (62) |
| Years as GP, range (median) | 3–15 (10) |
| Practice type, solo/group¹ | 2/12 |
| Practice location, city/suburban/rural¹ | 4/6/4 |

¹As one GP had two practices, numbers add up to 14.
Interviews
The first author conducted one 90-minute interview with each group of GPs. He informed them that he was a radiologist who sought GPs’ own thoughts and experiences. He then guided a discussion between the GPs concerning radiologists’ reports of plain radiography for back pain. He invited the GPs to describe actual cases and he asked open-ended questions based on a short interview guide to initiate discussion of different aspects of radiology reports (e.g. Is anything missing in the reports? What should be included?). To identify a relevant range of views and aspects, he then encouraged further discussion of aspects brought up by the GPs themselves [13], [14]. The interviews were audio-recorded and fully transcribed.
Analysis
The transcripts were analysed using Giorgi's method as modified by Malterud [17], [18]. This involved: (1) getting an overview of the data; (2) identifying and coding all text elements relevant to our aim (codes concerning aspects of radiology reports, based on the data rather than decided a priori); (3) interpreting similarly coded elements for a common meaning, summarized using expressions close to the GPs’ own words; and (4) describing the GPs’ views in more general terms, labelling each description and validating it, i.e. comparing it with the interview context and the data it was based on, and searching the entire transcripts for disproving data [15], [17], [18].
At each of the four analytic steps, a radiologist (first author, main interpreter) and a GP (second author), both with experience in qualitative research, first analysed the data individually, then contested each other's analysis [12], and reached a mutual basis for further analysis. They had six 40–60-minute discussions of written interpretations in 2004–2005. In preliminary analyses done immediately after each interview, the radiologist found little new information in the last interview. The main analysis also indicated information redundancy, as none of the five issues described below in the Results section emerged from the last interview only.
Results
GPs said they used radiology reports both in clinical decisions and to inform patients and professional colleagues. They would like some reports to indicate more clearly the meaning of radiological terminology, the likelihood of disease, the clinical relevance of the findings, and/or the need for further investigations. They also underlined that good referral information leads to better reports. These views are described below and illustrated with selected quotes.
Meaning of radiological terminology should be summarized
Many GPs said some reports contained terminology that they were uncertain of and had to look up, sometimes without finding it, or that they found to represent incidental or unimportant findings. The use of radiological terminology could be frustrating: “The radiologist has of course put in as many words as he could, and which I don't know and which I think probably means degeneration, that's classic” (GP 4). It could also be a challenge explaining to the patients what the report actually meant: “It's seldom you have such an orgy of foreign words … uncovertebral joints and the whole caboodle” (GP 11). “A good deal of Latin” was nevertheless acceptable if followed by a conclusion, such as “these are normal wear and tear changes, or may mean such and such” (GP 6).
Likelihood of disease should be clarified
GPs wanted interpretation of expressions like “possible” Bechterew's disease, “suggested” osteoporosis, and “looks like” the bone density might be a little low: “What does this actually mean? How large is the probability that this patient has osteoporosis?” (GP 2). The likelihood of the disease might remain unclear even when the findings were described as certain: “Schmorl's impressions and things like that … is it a lot or a little in relation to what Scheuermann is, are we talking about Scheuermann or are we not talking about Scheuermann?” (GP 8).
On making the diagnostic uncertainty concrete, one GP said: “If I write ‘can it be Bechterew’ and get the answer that it can be this or that, then I would like to know how sensitive or specific it is in relation to this diagnosis, and in relation to that diagnosis” (GP 4). One GP appreciated specification of “negative findings” in the radiology report.
Clinical relevance should be indicated
Many GPs seemed uncertain of the clinical relevance of various radiographic findings. They wanted to know whether the findings corresponded to the clinical condition, were normal for age, were dangerous, and whether the findings were actually worth caring about at all in that specific patient: “I don't want only a description of what it is but what it is in relation to the patient who has it” (GP 8). One GP described the need for clinical interpretation in this way: “When you deliver an X-ray report that is certainly radiologically very good, but with many things I as a general practitioner do not know whether are of any significance or not, then it would have been lovely to be told that this here, this doesn't matter for the clinic” (GP 5).
Further investigations might be advised
GPs valued advice on further investigations: “If there is a need for further examination that they [the radiologists] then write this [in the report]” (GP 10). Time could be saved when the advice was given in the report, since the GP did not have to contact the radiologist to ask; furthermore: “If you have a recommendation from a radiologist that we should do something further, then there will not be so much discussion at the next stage about why” (GP 12).
One GP, however, was sceptical of routine advice regarding further investigations: “When the radiologist does not then recommend any new or more extensive examination, then this probably means that it's unnecessary, so that is a trap to fall into” (GP 2). This GP also doubted that the radiologist had enough clinical knowledge and enough information about the individual patient to say what to do further: “This will be an overall judgement” (GP 2).
Good referral information leads to better reports
Several GPs underlined that they themselves had a responsibility to provide information that could contribute to clinically useful radiology reports. Their impression was that good referral information leads to better reports: “The more detailed a way one writes, explains why and argues, the more constructive answers one gets back” (GP 7). One GP explained: “If I've said a little more at the start then they [the radiologists] understand what I have thought” (GP 6).
GPs mentioned symptom description, effect on work, own uncertainty, a “feeling of being stuck” (GP 7), and “that I at least have had some thoughts and preferably concrete questions in the referral letter” (GP 6) as relevant referral information. Such information might make the radiologist more likely to look for specific findings and to provide help.
Discussion
This study adds new information on which kinds of “clarity” GPs might want in radiology reports. Clarity correlates with readability [19] and implies that “meaning is clear” [4]. GPs in our study needed clarification of clinical meaning, i.e. of terminology, disease likelihood, clinical relevance, and further investigations. Neither these four issues nor the fifth, regarding referral information, may be surprising. However, only one of them (further investigations) seems to have been addressed in surveys of clinicians’ views on radiology reports.
The study was strengthened by sampling until information redundancy occurred, transcription by the interviewer, analysis by two researchers, and a search for disproving data [12], [15], [17]. The GPs seemed very willing to share views but may have held back diverging views to conform to the group [13].
Data published during the last two decades support the views given by GPs in our study. Such data indicate that radiological terms and descriptions of disease likelihood are understood differently by different clinicians, or by clinicians and radiologists [20–22], and that the clinical question may not be answered [1]. Our data are from 1998, but clarification of clinical meaning in radiology reports appears relevant today as well.
Restricted use of plain radiography in the diagnosis of back pain is advised [23], [24]. However, reports of plain radiography for back pain remain very common [25] and they are similar to other common reports (e.g. of magnetic resonance imaging for back pain) in that they often contain findings of uncertain clinical meaning [26]. Our results based on the case of plain radiography for back pain may thus be relevant to many radiology reports.
It has been recommended that each radiology report use appropriate terminology, identify factors that may compromise sensitivity and specificity, answer any specific clinical questions, and state factors preventing such an answer. Unless the report is brief, it should contain an “impression” with a precise diagnosis when possible and suggestions for further investigations to clarify or confirm the impression when appropriate [27].
Our data support these recommendations and suggest that GPs would also welcome an impression of the findings’ clinical relevance. Radiologists may lack knowledge, information, or evidence to suggest clinical meaning. GPs and radiologists can nevertheless use the present results to reflect on and improve their referral letters and radiology reports.
The current study further exemplifies how qualitative methods can be used to identify views on radiology reports in other settings or among hospital clinicians, whose needs regarding written radiology reports may differ from those of GPs. For instance, advice on further investigations in the report is valued by GPs [2] but less so by paediatricians [4].
Our results can also serve as a basis for a survey to assess the frequency of the needs we have identified, and to clarify whether unmet needs regarding radiology reports are actually less frequent following good referral information. In other studies, good clinical information improved diagnostic accuracy [28], [29] and advice on further investigations [30].
In conclusion, this qualitative study revealed views among GPs that do not seem to have been addressed in surveys on clinicians’ views on radiology reports. These views can help to improve GP–radiologist communication. The issues identified in this study could be further investigated in studies to quantify GPs’ satisfaction with radiology reports in relation to characteristics of the GP, the radiologist, and the referral information.
References
- 1. Clinger NJ, Hunter TB, Hillman BJ. Radiology reporting: attitudes of referring physicians. Radiology. 1988;169:825–6. doi: 10.1148/radiology.169.3.3187005.
- 2. Lafortune M, Breton G, Baudouin JL. The radiological report: What is useful for the referring physician? J Can Assoc Radiol. 1988;39:140–3.
- 3. McLoughlin RF, So CB, Gray RR, Brandt R. Radiology reports: How much descriptive detail is enough? Am J Roentgenol. 1995;165:803–6. doi: 10.2214/ajr.165.4.7676970.
- 4. Gunderman R, Ambrosius WT, Cohen M. Radiology reporting in an academic children's hospital: What referring physicians think. Pediatr Radiol. 2000;30:307–14. doi: 10.1007/s002470050746.
- 5. Naik SS, Hanbidge A, Wilson SR. Radiology reports: Examining radiologist and clinician preferences regarding style and content. Am J Roentgenol. 2001;176:591–8. doi: 10.2214/ajr.176.3.1760591.
- 6. Koczwara B, Tie M, Esterman A. Are radiologists meeting the needs of Australian medical oncologists? Results of a national survey. Australas Radiol. 2003;47:268–73. doi: 10.1046/j.1440-1673.2003.01179.x.
- 7. Kubik-Huch RA, Rexroth M, Porst R, Durselen L, Otto R, Szucs T. Wie zufrieden sind die klinischen Partner mit der Arbeit eines radiologischen Instituts? Entwicklung und Testung eines Fragebogens (Referrer satisfaction as a quality criterion: Developing a questionnaire for measuring the quality of services provided by a radiology department; English summary). Rofo. 2005;177:429–34. doi: 10.1055/s-2005-858022.
- 8. Johnson AJ, Ying J, Swan JS, Williams LS, Applegate KE, Littenberg B. Improving the quality of radiology reporting: A physician survey to define the target. J Am Coll Radiol. 2004;1:497–505. doi: 10.1016/j.jacr.2004.02.019.
- 9. Dacher JN, Charlin B, Bergeron D, Tardif J. Consultation skills in radiology: A qualitative study. J Can Assoc Radiol. 1998;49:167–71.
- 10. Miller WL, Crabtree BF. Clinical research: A multimethod typology and qualitative roadmap. In: Crabtree BF, Miller WL, editors. Doing qualitative research. Thousand Oaks, CA: Sage Publications; 1999. pp. 3–30.
- 11. Boynton PM, Greenhalgh T. Selecting, designing, and developing your questionnaire. BMJ. 2004;328:1312–5. doi: 10.1136/bmj.328.7451.1312.
- 12. Malterud K. Qualitative research: Standards, challenges, and guidelines. Lancet. 2001;358:483–8. doi: 10.1016/S0140-6736(01)05627-6.
- 13. Morgan DL. Focus groups as qualitative research. London: Sage Publications; 1997.
- 14. Brown JB. The use of focus groups in clinical research. In: Crabtree BF, Miller WL, editors. Doing qualitative research. Thousand Oaks, CA: Sage Publications; 1999. pp. 109–24.
- 15. Kuzel AJ. Sampling in qualitative inquiry. In: Crabtree BF, Miller WL, editors. Doing qualitative research. Thousand Oaks, CA: Sage Publications; 1999. pp. 33–45.
- 16. Espeland A, Baerheim A. Factors affecting general practitioners’ decisions about plain radiography for back pain: Implications for classification of guideline barriers – a qualitative study. BMC Health Serv Res. 2003;3:8. doi: 10.1186/1472-6963-3-8.
- 17. Malterud K. Shared understanding of the qualitative research process: Guidelines for the medical researcher. Fam Pract. 1993;10:201–6. doi: 10.1093/fampra/10.2.201.
- 18. Malterud K. Kvalitative metoder i medisinsk forskning: En innføring (Qualitative methods in medical research: An introduction). Aurskog: Tano Aschehoug; 1996.
- 19. Sierra AE, Bisesi MA, Rosenbaum TL, Potchen EJ. Readability of the radiologic report. Invest Radiol. 1992;27:236–9. doi: 10.1097/00004424-199203000-00012.
- 20. Owen JP, Rutt G, Keir MJ, Spencer H, Richardson D, Richardson A, et al. Survey of general practitioners’ opinions on the role of radiology in patients with low back pain. Br J Gen Pract. 1990;40:98–101.
- 21. Hobby JL, Tom BD, Todd C, Bearcroft PW, Dixon AK. Communication of doubt and certainty in radiological reports. Br J Radiol. 2000;73:999–1001. doi: 10.1259/bjr.73.873.11064655.
- 22. Khorasani R, Bates DW, Teeger S, Rothschild JM, Adams DF, Seltzer SE. Is terminology used effectively to convey diagnostic certainty in radiology reports? Acad Radiol. 2003;10:685–8. doi: 10.1016/s1076-6332(03)80089-2.
- 23. Carragee EJ, Hannibal M. Diagnostic evaluation of low back pain. Orthop Clin North Am. 2004;35:7–16. doi: 10.1016/S0030-5898(03)00099-3.
- 24. Koes BW, van Tulder MW, Ostelo R, Kim BA, Waddell G. Clinical guidelines for the management of low back pain in primary care: An international comparison. Spine. 2001;26:2504–13. doi: 10.1097/00007632-200111150-00022.
- 25. Børretzen I, Lysdahl KB, Olerud HM. Radiologi i Noreg – undersøkingsfrekvens per 2002, tidstrendar, geografisk variasjon og befolkningsdose (Radiology in Norway – examination frequency per 2002, trends in time, geographical variation and population dose; English summary). Strålevern Rapport 6:2006. Østerås: Norwegian Radiation Protection Authority; 2006.
- 26. Roland M, van Tulder M. Should radiologists change the way they report plain radiography of the spine? Lancet. 1998;352:229–30. doi: 10.1016/S0140-6736(97)11499-4.
- 27. American College of Radiology. ACR practice guideline for communication of diagnostic imaging findings. Reston, VA: American College of Radiology; 2005. pp. 5–9.
- 28. Robinson PJ. Radiology's Achilles’ heel: Error and variation in the interpretation of the Röntgen image. Br J Radiol. 1997;70:1085–98. doi: 10.1259/bjr.70.839.9536897.
- 29. Leslie A, Jones AJ, Goddard PR. The influence of clinical information on the reporting of CT by radiologists. Br J Radiol. 2000;73:1052–5. doi: 10.1259/bjr.73.874.11271897.
- 30. Doubilet P, Herman PG. Interpretation of radiographs: Effect of clinical history. Am J Roentgenol. 1981;137:1055–8. doi: 10.2214/ajr.137.5.1055.