Journal of Digital Imaging. 2021 Apr 26;34(3):705–716. doi: 10.1007/s10278-021-00448-z

Designing a Consumer-Friendly Radiology Report using a Patient-Centered Approach

Mohammad Alarifi1,2, Timothy Patrick3, Abdulrahman Jabour4, Min Wu5, Jake Luo1
PMCID: PMC8329124  PMID: 33903982

Abstract

Patient portals have helped accelerate patient engagement in treatment. Patient understanding of radiology reports has become a necessity, and we are working to design a patient-friendly radiology report that can be easily understood. We based the design of this new radiology report on the results of a previous study that examined patient desires and needs by exploring questions posted on online discussion forums. The design was tested by presenting it to two groups: a control group and an intervention group. The evaluation relied on the following five concepts: understanding (quiz), cosmetic appearance, perceived ease of use, acceptance, and preference. The results showed that the new design outperformed the current design on all five concepts (overall P < .001). Based on these results, we determined that the radiology report should include both an image and a notes section, and that the design can be applied to all types of radiological examinations using various imaging devices. We believe this design will be an important building block in facilitating patient understanding of radiology reports.

Keywords: Radiology reporting, Patient understanding, Online discussion forums, Patient-centered approach

Introduction

Radiology reports are one of the tools that enable patients to learn about their health [1]. Patients are able to obtain copies of their reports through an electronic portal or as a paper copy directly from the radiology department [1]. In a previous study, we found that many healthcare providers do not provide a full radiology report to patients [2]. Only 98 out of 110 healthcare providers allow patients to obtain their reports, but not the images [2]. The reasons for these limitations relate to an overall lack of patient understanding of the complete radiology report [2, 3]. This issue could cause increased anxiety in patients and could even have negative health effects. Some studies have shown a lack of preference regarding radiologists providing images to patients [4]. The primary audience of the current radiology report is referring physicians, not the patients themselves [5]. Some studies have shown that patients have a desire to obtain their complete radiology report(s) [6–8].

A patient-friendly radiology report should be designed to increase patient understanding of the report, and patient desires should be recognized and incorporated into the modified report. Our previous study surveyed four online discussion forums to determine patient needs and desires for their radiology reports [9]: Yahoo Answers, Reddit.com, Quora, and Wiki Answers. The results of that study were used to design the radiology report model in the current study. The design presented here is intended to be applicable to various types of radiology reports.

Methods

Study Population and Recruitment

There were 616 participants recruited via a human intelligence task (HIT) posted on Amazon's MTurk. This was an interventional study with two groups: an intervention group (n = 296) and a control group (n = 320). The demographics of the two groups showed no significant differences except for gender (Table 1), and there was no significant difference in health literacy between the two groups (Table 2). Participants who completed the survey received US $0.41 in their Amazon.com account and had an equal chance of being allocated to either the control or intervention group. Random assignment was done by computer using the online survey platform Qualtrics.com. The study was approved by the IRB of the University of Wisconsin-Milwaukee (#20.230).
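The 1:1 computer randomization described above can be sketched as follows. This is an illustrative reimplementation, not the actual Qualtrics mechanism, and the participant IDs are hypothetical:

```python
import random

def allocate(participant_ids, seed=None):
    """Assign each participant to 'control' or 'intervention' with equal probability."""
    rng = random.Random(seed)
    return {pid: rng.choice(["control", "intervention"]) for pid in participant_ids}

# Example: allocate 616 hypothetical participants
groups = allocate(range(616), seed=42)
n_control = sum(1 for g in groups.values() if g == "control")
n_intervention = len(groups) - n_control
```

Simple random allocation like this yields roughly, but not exactly, equal group sizes, which is consistent with the unequal group sizes reported above (320 vs. 296).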

Table 1.

Demographic results of the control group and intervention group. Values are percentage (count); P values are from Pearson chi-square tests.

Characteristic                        Control group     Intervention group     P value
Age                                                                            0.72
  • < 20–29 years                     39.4% (126)       42.6% (126)
  • 30–49 years                       45.9% (147)       43.2% (128)
  • 50+ years                         14.7% (47)        14.2% (42)
Gender                                                                         0.01
  • Male                              63.4% (203)       53.2% (157)
  • Female                            36.6% (117)       46.8% (138)
Occupation                                                                     0.25
  • Radiologists                      13.4% (43)        12.8% (38)
  • Health practitioner               23.4% (76)        18.6% (55)
  • Other                             62.8% (201)       68.6% (203)
Level of education                                                             0.60
  • Some school and high school       8.8% (28)         7.8% (23)
  • Some college                      27.5% (88)        31.1% (92)
  • College degree and above          63.7% (204)       61.1% (181)
Race                                                                           0.19
  • White                             62.8% (201)       64.5% (191)
  • Black or African American, American Indian or Alaska Native, and other
                                      14.4% (46)        9.8% (29)
  • Asian and Native Hawaiian or Pacific Islander
                                      22.8% (73)        25.7% (76)
English is the first language                                                  0.319
  • Yes                               91.3% (292)       88.9% (263)
  • No                                8.8% (28)         9.9% (33)
Income                                                                         0.73
  • < $10,000                         9.1% (29)         10.8% (32)
  • $10,000–$19,999                   12.5% (40)        14.9% (44)
  • $20,000–$39,999                   24.7% (79)        19.9% (59)
  • $40,000–$59,999                   22.2% (71)        22.0% (65)
  • $60,000–$79,999                   18.8% (60)        18.6% (55)
  • ≥ $80,000                         12.8% (41)        13.9% (41)
Seen a radiology report before                                                 0.42
  • Yes                               66.3% (212)       69.3% (205)
  • No                                33.8% (108)       30.7% (91)
Done radiology scanning before                                                 0.83
  • Yes                               75.6% (242)       76.4% (226)
  • No                                24.4% (78)        23.6% (70)
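The Pearson chi-square comparisons in Table 1 can be reproduced directly from the raw counts. As an illustrative check (a pure-Python sketch with no statistics library assumed; the function name is ours), the gender row (203/117 vs. 157/138) yields a P value near the reported 0.01:

```python
import math

def pearson_chi2_2x2(a, b, c, d):
    """Pearson chi-square test (no continuity correction) for the 2x2 table
    [[a, b], [c, d]]. Returns (chi2, p); for 1 degree of freedom, the
    survival function is P(X > x) = erfc(sqrt(x / 2))."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Gender counts from Table 1: control 203 male / 117 female,
# intervention 157 male / 138 female
chi2, p = pearson_chi2_2x2(203, 117, 157, 138)  # p is roughly 0.01
```

In practice a statistics package would be used (the study used SPSS), but the hand computation makes the test transparent.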

Table 2.

Health literacy level based on STOFHLA (Short Test of Functional Health Literacy in Adults) scores: inadequate health literacy (0–16), marginal health literacy (17–22), and adequate health literacy (23–36)

Screening questions (each answered on the scale (1) Always (2) Often (3) Sometimes (4) Occasionally (5) Never):

1. How often are appointment slips written in a way that is easy to read and understand?

2. How often are medical forms difficult to understand and fill out?

3. How often do you have difficulty understanding written information your health care provider gives you?

4. How often do you have problems learning about your medical condition because of difficulty understanding written information?

5. How confident are you filling out medical forms by yourself?

6. How confident do you feel you are able to follow the instructions on the label of a medication bottle?

7. How often do you have someone help you read hospital materials?

Health literacy results (P = 0.94, Pearson chi-square):

Group          Adequate       Marginal       Inadequate
Control        23.8% (76)     58.1% (186)    18.1% (58)
Intervention   23.5% (145)    57.8% (356)    18.7% (115)
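The STOFHLA score bands in the table caption map directly to the three categories. A minimal helper capturing that banding (the function name is ours, not part of the STOFHLA instrument):

```python
def stofhla_category(score):
    """Map a STOFHLA score (0-36) to the health literacy bands used in Table 2:
    0-16 inadequate, 17-22 marginal, 23-36 adequate."""
    if not 0 <= score <= 36:
        raise ValueError("STOFHLA scores range from 0 to 36")
    if score <= 16:
        return "inadequate"
    if score <= 22:
        return "marginal"
    return "adequate"
```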

Survey Design

In the pilot study, we provided participants with the same general design but used the following three cases: a CT brain tumor, an X-ray femur fracture, and an MRI of the lumbar spine. The results of the pilot study showed that the design can be applied to any radiology case or modality. The majority of the patient questions in the previous study involved MRI [9]; for this reason, one of the MRI cases was used in the study. The revised and original MRI lumbar spine reports were provided to participants according to group assignment: the original MRI report was provided to the control group (Figs. 1 and 2), and the revised report was provided to the intervention group (Figs. 3, 4, and 5). The revised report included the following five images: the original image, a modified image, a normal case (standard), a worst case (standard), and an image from an anatomy atlas [10].

Fig. 1. Original design of the MRI lumbar spine image

Fig. 2. Original design of the MRI lumbar spine report

Fig. 3. Revised design of the MRI lumbar spine image

Fig. 4. Revised design of the MRI lumbar spine report

Fig. 5. Revised design of the MRI lumbar spine report

Revised Report (Note Representations)

Based on the literature review, we organized the report to provide what we consider to be the "best" data, defined as data that the reader can read and interpret with minimal cognitive effort [11–13]. Many studies have recommended using different colors to improve understanding [14, 15] and have also confirmed that data must be displayed effectively [13, 16, 17]. To achieve both of these goals, data must be connected and presented in an understandable way. Our new design features the following characteristics:

  1. Definitions were provided for all medical terminology, and the defined terms were underlined.

  2. The report structure was reorganized for clarity.

  3. Different colors were incorporated to improve report navigation.

  4. The report includes a timeframe for treatment as well as instructions and resource information.

  5. A summary table was incorporated with some general recommendations (Fig. 5).
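Characteristic 1, underlining medical terms and attaching definitions, can be sketched as a small templating step. This is a hypothetical illustration: the glossary contents and markup scheme are ours, not the study's actual implementation:

```python
def annotate_terms(report_text, glossary):
    """Wrap each glossary term in an underline tag carrying its definition,
    so a patient-facing renderer can show the definition on hover."""
    for term, definition in glossary.items():
        report_text = report_text.replace(
            term, f'<u title="{definition}">{term}</u>')
    return report_text

# Hypothetical glossary entry and report sentence
glossary = {"herniation": "bulging of tissue out of its usual place"}
html = annotate_terms("Mild disc herniation at L4-L5.", glossary)
```

A production version would need whole-word matching and case handling, but the idea of pairing each underlined term with its definition is the same.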

Revised Report (Image Visualizations)

For the images, study participants were provided with two radiology images: the revised and the original. In the first design, patients could see only the original images (Fig. 1). In the second design, a total of five images were included (Fig. 3). The original image is provided unchanged, and the modified image is intended to make the report more understandable and easier to comprehend (Fig. 3). Providing both images is supported by several studies suggesting that it can be useful [18, 19]. Two of the other images are standard radiology images that show the normal and worst-case scenarios for the patient's particular indication; these images help the patient compare his or her image with other examples (Fig. 3). The "normal" image can be taken from a volunteer patient and saved in the database. The fifth image is in .jpg format and comes from an atlas of images of the target organ(s) (Fig. 3) [10]. The revised design takes advantage of how radiologists learn to read images in medical school and applies these general principles to patients and other non-radiologists. We reviewed radiologist education programs to find ways to create a mental representation of the human body [20]. Radiologists are often asked to look at radiographs of healthy cases and then look at abnormal cases. The modified, standard, and atlas images allow this process to proceed more smoothly, and the task of preparing them could be given to assistants, providing them with valuable training.

Study Setting and Statistical Analysis

The minimum sample size for each group was 257 participants, based on a calculation in G*Power 3.1 with an alpha of 0.05 and desired power of 0.80. Data were downloaded from Qualtrics.com as an SPSS file for organization and analysis. The two groups were compared using the Mann–Whitney–Wilcoxon (MWW) test. The following five concepts were used to evaluate the new design: understanding, cosmetic appearance, perceived ease of use (PEU), acceptance, and preference. Both groups received the same sociodemographic and health literacy questions, and each participant was assigned randomly by computer to one of the groups. Two surveys were used, one based on the original report structure and the other on the revised report structure. The control group had access to the original design of the MRI lumbar spine report, and the intervention group had access to the updated one. The 17 questions fell under the five concepts as follows: understanding (n = 5, as a quiz), cosmetic appearance (n = 2), PEU (n = 4), acceptance (n = 2), and preference (n = 4). An additional 11 questions exploring the new design specifically were provided only to the intervention group.
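The Mann–Whitney–Wilcoxon comparison can be illustrated with a self-contained sketch. This is a simplified reimplementation for exposition (normal approximation, tie correction to sigma omitted); the study's actual analyses were run in SPSS:

```python
import math

def mann_whitney_u(x, y):
    """Mann-Whitney U test with a normal approximation for the p-value.
    Ties receive averaged ranks; the tie correction to sigma is omitted
    for brevity, so p-values are approximate when many ties are present."""
    nx, ny = len(x), len(y)
    all_values = sorted(x + y)
    # Average 1-based rank for each distinct value
    ranks = {}
    i = 0
    while i < len(all_values):
        j = i
        while j < len(all_values) and all_values[j] == all_values[i]:
            j += 1
        ranks[all_values[i]] = (i + j + 1) / 2
        i = j
    rank_sum_x = sum(ranks[v] for v in x)
    u = rank_sum_x - nx * (nx + 1) / 2
    mu = nx * ny / 2
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return u, p

# Clearly separated groups give a small p; identical groups give p = 1
u, p = mann_whitney_u(list(range(1, 11)), list(range(11, 21)))
```

The MWW test is appropriate here because the concept scores are ordinal sums of Likert-style items, for which a rank-based test avoids assuming normality.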

Hypotheses

  1. The proposed radiology design will improve patient understanding of the radiology report.

  2. The proposed radiology design will improve the cosmetic appearance of the radiology report.

  3. The proposed radiology design will improve the PEU of the radiology report.

  4. The proposed radiology design will improve the acceptance of the radiology report by patients.

  5. The proposed radiology design will improve patient preference for the radiology report.

  6. The new design will be easier to read than the previous design.

Results

The design of the radiology report is something that patients can use to better manage their health. Our study showed that the new design is more readable and understandable than the current design. The quiz results in the intervention group were better than those in the control group by 0.30 (P < 0.05). The quiz is designed to reflect patient understanding of the radiology report; as a result, we accept our hypothesis that the new radiology design improves patient understanding of the radiology report. In the second part, participants were more satisfied with the way the new design looked (P < 0.05) (Table 3); the "look" of the new design includes both images and notes. Based on these results, the second hypothesis is accepted. For PEU, the ability of participants to determine locations where issues appear in the images and to understand the content increased by 1.80 with the new design (P < 0.001) (Table 3). Based on these results, we accept the hypothesis that the new design improves the PEU of the radiology report. Figure 6 shows that 46.28% of the intervention group agreed that their ability to understand terminology increased. Additionally, 43.58% of participants agreed that they are comfortable with their ability to access more information about report terminology and their case. Figure 7 shows that 42.57% of participants agreed that adding annotations and additional colors increased understanding. The new design contains additional features such as providing an anatomical portrait from the atlas dictionary for the target organ(s). Figure 8 shows that 43.24% of participants found the atlas image helpful in understanding the radiology image, and 44.93% of participants preferred the use of annotations and color in the atlas image [10].

Table 3.

Comparison of the original design (control group) and revised design (intervention group). Values are average scores; P values are from Mann–Whitney tests.

Understanding (original 2.16, revised 2.51, difference 0.30, P = 0.015):
  • Quiz of 5 questions

Cosmetic appearance (original 6.72, revised 7.11, difference 0.38, P = 0.007):
  • I think that the format and appearance of the notes in the report are easy to look at.
  • I think that the appearance of the image in the report is easy to read.

Perceived ease of use (original 12.64, revised 14.44, difference 1.80, P = 0.000):
  • I am comfortable with my ability to find information in the images and notes.
  • I am comfortable with my ability to figure out the location of issues in the image.
  • In general, I am comfortable with my ability to understand the content in the images and notes.
  • I am comfortable with my ability to understand this report and the severity of the case.

Acceptance (original 6.82, revised 7.41, difference 0.59, P = 0.000):
  • In general, I am satisfied with the way this image is presented.
  • In general, I am satisfied with the way the notes in the radiology report are presented.

Preference (original 14.03, revised 15.10, difference 1.06, P = 0.000):
  • I would like to receive my radiology report in this design.
  • I prefer having my next appointment and instructions in the radiology report.
  • I prefer having definitions of the medical terms in the radiology report.
  • I prefer having an anatomy atlas image of the same area as my case in the radiology report.

Overall: original 42.44, revised 46.57, difference 4.14, P = 0.000

Fig. 6. PEU of patients regarding the terminology

Fig. 7. PEU of adding two scenarios and annotations on image

Fig. 8. PEU of adding anatomy atlas image with annotations

The acceptance of the new radiology report design improved by 0.59 (P < 0.001). This improvement shows that participants were satisfied with how the images and notes were presented. Based on these results, we accept the hypothesis that the new design improves the overall acceptance of the radiology report. Patients preferred to receive radiology reports in the new design, as shown by the improvement of 1.06 (P < 0.001). Figures 9 and 10 both show acceptance of, and desire for, added features in the radiology report, including instructions, barcodes, and a summary table with recommendations. Based on these results, we accept our hypothesis that the new design improves patient preference for the radiology report. The new design was also more readable than the old design, based on the overall improvement of 4.14 (P < 0.001).

Fig. 9. PEU of adding instructions in icons and barcodes

Fig. 10. PEU of adding summary section to the report

Discussion

Currently, radiology reports are intended for physician use, not patient use [5]. For this reason, several studies have attempted to develop a radiology report suitable for patient understanding, and some have focused on improving patient understanding of radiology images [18]. In one study, an explanation feature was added to the images to test patient understanding; the results showed that the modification enhanced patient understanding of the radiology images. Prior work also presented a patient-friendly radiology report for lung cancer screening, using a design based on feedback from patient advisory groups. That design contained information before the scan, results, and recommendations based on the scan [21]. While these studies were useful, their scope was limited. For example, the first study focused on developing the images without examining the radiologist's explanation in the report, and the second study did not provide the original patient image in the report. Because the second study did not include the original report, patients did not have the opportunity to share the report with consultants.

Our study provides a more patient-friendly radiology report that is applicable to all radiological examinations. We focused on making the radiology images and radiologist notes easier to navigate between. The notes were divided into the following sections: the original report, an explanation of medical terms, a brochure related to the patient's particular indication, a report summary, and an upcoming appointment reminder. The original report was reorganized and provided without any changes to its content. We merged the report with the terminology section by placing a line between the medical terminology section and the original report. The medical terminology section gives the patient the opportunity to access links that further explain the terminology. The brochure helps explain the general medical condition through the barcode provided in the report. The shortcut section provides a simple explanation of the patient's condition along with general advice. In the upcoming appointment section, we placed important patient information. The patient was provided with his or her original image along with four additional images to facilitate understanding of the images.

This study showed that patients can reach a high level of understanding of radiology reports if the reports are reorganized and improved. The findings show that the new design is more patient-friendly and understandable than the current report. Additionally, the improvements in our design were not related to age, gender, economic status, ethnic background, or health literacy; the participants had a variety of sociodemographic traits (Table 1). We believe that the improved radiology design could increase patient engagement, and the enhancements could increase patient understanding of the report. If patients understand more about their reports, physicians could spend less time explaining the reports to them. To our knowledge, this is the first study to design a universal radiology report with this level of comprehensiveness.

Conclusion

Our study produced a patient-friendly radiology report using a patient-centered approach. The study relied on exploring patient opinions and desires by collecting questions posted by patients in online forums. The design was approved by patients across all sociodemographic groups. We believe that this design can be used effectively by both patients and radiologists.

Declarations

Ethics Approval

This is an observational study. The University of Wisconsin-Milwaukee Institutional Research Ethics Committee has confirmed that no ethical approval is required.

Conflict of Interest

The authors declare no competing interests.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Mohammad Alarifi, Email: malarifi@uwm.edu, Email: mohalarifi@ksu.edu.sa.

Timothy Patrick, Email: Tp5@uwm.edu.

Abdulrahman Jabour, Email: Ajabour@jazanu.edu.sa.

Min Wu, Email: wu@uwm.edu.

Jake Luo, Email: Jakeluo@uwm.edu.

References

1. Rocha DM, Brasil LM, Lamas JM, Luz GV, Bacelar SS. Evidence of the benefits, advantages and potentialities of the structured radiological report: an integrative review. Artif Intell Med. 2020;102:101770. doi: 10.1016/j.artmed.2019.101770.

2. Alarifi M, Patrick T, Jabour A, Wu M, Luo J. Full radiology report through patient web portal: a literature review. Int J Environ Res Public Health. 2020;17:3673.

3. Lee CI, Langlotz CP, Elmore JG. Implications of direct patient online access to radiology reports through patient web portals. J Am Coll Radiol. 2016;13:1608–1614. doi: 10.1016/j.jacr.2016.09.007.

4. Johnson AJ, Frankel RM, Williams LS, Glover S, Easterling D. Patient access to radiology reports: what do physicians think? J Am Coll Radiol. 2010;7:281–289. doi: 10.1016/j.jacr.2009.10.011.

5. Olthof AW, de Groot JC, Zorgdrager AN, van Ooijen PMA. Perception of radiology reporting efficacy by neurologists in general and university hospitals. Clin Radiol. 2018;73:675.e1–675.e7.

6. Yi PH, Golden SK, Harringa JB, Kliewer MA. Readability of lumbar spine MRI reports: will patients understand? AJR Am J Roentgenol. 2019;212:602–606. doi: 10.2214/AJR.18.20197.

7. Martin-Carreras T, Kahn CE Jr. Coverage and readability of information resources to help patients understand radiology reports. J Am Coll Radiol. 2018;15:1681–1686. doi: 10.1016/j.jacr.2017.11.019.

8. Gutzeit A, Heiland R, Sudarski S, Froehlich JM, Hergan K, Meissnitzer M, Kos S, Bertke P, Kolokythas O, Koh DM. Direct communication between radiologists and patients following imaging examinations. Should radiologists rethink their patient care? Eur Radiol. 2019;29:224–231.

9. Alarifi M, Patrick T, Jabour A, Wu M, Luo J. Understanding patient needs and gaps in radiology reports through online discussion forum analysis. Insights Imaging (forthcoming). doi: 10.1186/s13244-020-00930-2.

10. Netter FH. Atlas of Human Anatomy, 7th edition, Plate 162. Elsevier Health Sciences; 2014.

11. Cleveland WS, McGill R. Graphical perception: theory, experimentation, and application to the development of graphical methods. J Am Stat Assoc. 1984;79:531–554. doi: 10.1080/01621459.1984.10478080.

12. Hegarty M. The cognitive science of visual-spatial displays: implications for design. Top Cogn Sci. 2011;3:446–474. doi: 10.1111/j.1756-8765.2011.01150.x.

13. Khasnabish S, Burns Z, Couch M, Mullin M, Newmark R, Dykes PC. Best practices for data visualization: creating and evaluating a report for an evidence-based fall prevention program. J Am Med Inform Assoc. 2020;27:308–314. doi: 10.1093/jamia/ocz190.

14. Ratwani RM, Trafton JG, Boehm-Davis DA. Thinking graphically: connecting vision and cognition during graph comprehension. J Exp Psychol Appl. 2008;14:36. doi: 10.1037/1076-898X.14.1.36.

15. Davis CR, McNair AG, Brigic A, Clarke MG, Brookes ST, Thomas MG, Blazeby JM. Optimising methods for communicating survival data to patients undergoing cancer surgery. Eur J Cancer. 2010;46:3192–3199. doi: 10.1016/j.ejca.2010.07.030.

16. Alverson CY, Yamamoto SH. Educational decision making with visual data and graphical interpretation: assessing the effects of user preference and accuracy. SAGE Open. 2016;6:2158244016678290. doi: 10.1177/2158244016678290.

17. O'Brien S, Lauer C. Testing the susceptibility of users to deceptive data visualizations when paired with explanatory text. In: Proceedings of the 36th ACM International Conference on the Design of Communication; 2018.

18. Gichoya JW, Alarifi M, Bhaduri R, Tahir B, Purkayastha S. Using cognitive fit theory to evaluate patient understanding of medical images. In: 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE; 2017.

19. Oladiran O, Gichoya J, Purkayastha S. Conversion of JPG image into DICOM image format with one click tagging. In: International Conference on Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. Springer; 2017.

20. He Z. Understanding and bridging the language and terminology gap between health professionals and consumers using social media. In: Social Web and Health Research. Springer; 2019:103–121.

21. Vitzthum von Eckstaedt H, Kitts AB, Swanson C, Hanley M, Krishnaraj A. Patient-centered radiology reporting for lung cancer screening. J Thorac Imaging. 2020;35:85–90. doi: 10.1097/RTI.0000000000000469.
