Skeletal Radiology. 2023 Feb 16:1–7. Online ahead of print. doi: 10.1007/s00256-023-04301-y

Factors influencing patient understanding of information on radiology examinations

Amissa Brewer-Hofmann 1, Sana Sajjad 2, Zane Bekheet 3, Matthew P Moy 4, Tony T Wong 4
PMCID: PMC9933798  PMID: 36795137

Abstract

Purpose

To determine which factors influence patient understanding of information documents on radiology examinations.

Materials and methods

This is a randomized prospective study of 361 consecutive patients. Documents with information on 9 radiology exams were obtained (www.radiologyinfo.org). Three versions of each were written at low (below 7th grade), middle (8th–12th grade), and high (college) reading grade levels. Before their scheduled radiology exam, patients were randomized to read one document. Their subjective and objective understanding of the information was assessed. Statistics, including logistic regression, were used to assess relationships between demographic factors, document grade level, and understanding.

Results

Twenty-eight percent (100/361) of patients completed the study. More females vs. males (85% vs. 66%) read their entire document (p = 0.042). Document grade level was not associated with understanding (p > 0.05). Correlation between college degrees and subjective understanding was positive (r = 0.234, p = 0.019). More females (74% vs. 54%, p = 0.047) and patients with college degrees (72% vs. 48%, p = 0.034) had higher objective understanding. Controlling for document grade level and demographics, patients with college degrees were more likely to have subjective understanding of at least half of the document (OR 7.97, 95% CI [1.24, 51.34], p = 0.029) and females were more likely to have higher objective understanding (OR 2.65, 95% CI [1.06, 6.62], p = 0.037).

Conclusion

Patients with college degrees understood more on information documents. Females read more of the documents than males and had a higher objective understanding. Reading grade level did not affect understanding.

Supplementary Information

The online version contains supplementary material available at 10.1007/s00256-023-04301-y.

Keywords: Health literacy, Readability, Musculoskeletal procedures, Patient education

Introduction

In the age of the Internet, patients are able to quickly search for health-related information at the touch of a button. This convenience has allowed more and more Americans to use online resources to supplement or even replace their acquisition of medical knowledge [1]. In light of this trend, it is necessary to investigate how health information can be best conveyed to patients. Improving the readability of informational materials has the potential to improve health literacy, ultimately leading to better health outcomes [2].

Low health literacy is a major reason why many Americans do not feel that they have agency over their healthcare. Medical jargon is difficult to understand and often alienates patients from feeling involved in medical decision making [3]. This lack of understanding can lead to mistrust of healthcare professionals and decreased utilization of healthcare services [4]. Consequently, patients with low health literacy have poorer health outcomes, even when correcting for sociodemographic factors [5].

The American Medical Association (AMA) has identified that the majority of Americans read at or below an eighth grade level [5]. As such, both the AMA and the National Institutes of Health (NIH) recommend that all resources geared toward patients be written below the seventh grade level [6].

Despite these recommendations, the inappropriately high reading levels of online information about radiology procedures have been well described. For example, Radiologyinfo.org, sponsored by both the American College of Radiology (ACR) and the Radiological Society of North America (RSNA), is a website that compiles easily accessible articles explaining various imaging modalities and procedures. These texts were found to be written between the 10th and 14th grade reading levels, significantly above what the AMA and NIH recommend [7, 8]. Though it is clear that online patient information does not comply with recommendations, it is difficult to say whether lowering the reading level would increase patient understanding or whether other independent factors influence understanding. In this study, we presented documents at various reading levels to patients undergoing radiology exams at Blinded Institution. Our purpose was to determine which factors influence patient understanding of information documents on radiology examinations. We hypothesized that patients would have better understanding after reading materials written at a lower reading level.

Materials and methods

This was a prospective Health Insurance Portability and Accountability Act compliant study approved by the institutional review board with informed consent obtained from all participants.

Document selection and translation

Nine documents, each describing a specific radiology examination, were obtained from Radiologyinfo.org. The nine exams included were bone X-ray, magnetic resonance imaging (MRI) of the spine, MRI of the knee, MRI of the shoulder, MRI of the general musculoskeletal system (MSK), computed tomography (CT) of the spine, MSK ultrasound (US), direct arthrography (DA), and US-guided injection. With the aid of a freely accessible online readability tool (www.webfx.com/tools/read-able), each of these documents was translated from the original to three different reading levels. After adjusting sentence structure and word length, one version was written at a low reading level (below 7th grade), one at a middle reading level (high school: 8th–12th grade), and another at a high reading level (college level: higher than 12th grade). This online readability tool uses the Flesch-Kincaid model, which incorporates average sentence and word length to determine grade level with the following formula: [0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59] [9]. For grade level and word count of each document, see Table 1.

Table 1.

FKGL (Flesch-Kincaid Grade Level)

Exam type            Low (FKGL / words)   Middle (FKGL / words)   High (FKGL / words)
MRI MSK*             5.6 / 1227           9.2 / 1489              11.3 / 1464
MRI spine            5.5 / 1258           8.7 / 1373              11.4 / 1478
MRI shoulder         5.5 / 1290           9.0 / 1578              11.1 / 1626
MRI knee             5.5 / 1262           9.3 / 1586              11.0 / 1568
CT spine             5.4 / 829            9.4 / 1012              12.7 / 1078
X-ray bone           5.4 / 432            10.4 / 479              13.0 / 524
US diagnostic MSK    5.4 / 454            8.2 / 521               10.3 / 620
US injection         5.4 / 656            8.9 / 885               10.7 / 989
DA                   5.9 / 1633           10.3 / 1973             11.9 / 2086

* MRI MSK included all general MRI exams and joints that did not fall under one of the other categories above
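As a rough illustration, the Flesch-Kincaid Grade Level formula described above can be implemented in a few lines of Python. The syllable counter here is a simple vowel-group heuristic (an assumption on our part; the online tool the authors used may count syllables differently), so scores will only approximate those in Table 1.

```python
import re

def count_syllables(word: str) -> int:
    # Heuristic: count groups of consecutive vowels (y included);
    # every word is assumed to have at least one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fkgl(text: str) -> float:
    # Sentences: split on terminal punctuation; words: alphabetic tokens.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch-Kincaid Grade Level (Kincaid et al., 1975):
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    return 0.39 * (len(words) / len(sentences)) \
        + 11.8 * (syllables / len(words)) - 15.59

print(round(fkgl("The scanner takes pictures of your knee. You must lie still."), 2))  # → 2.65
```

Short, plain sentences with few multisyllabic words drive the score down, which is exactly how the low-level document versions were produced.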

Patient recruitment and enrollment

Consecutive patients scheduled for radiology examinations pertaining to the documents above were identified between August 2020 and March 2021. Pediatric patients and non-English speakers were excluded. Three hundred sixty-one consecutive qualifying patients were consented over the phone to participate in the study.

Online survey data collection

A personalized Qualtrics survey for data collection was created (see Appendix). The survey contained questions on demographics, asked patients how they obtained their health information, and included assessments of patient understanding of the presented information. Through an automatic feature in Qualtrics, participants were randomized to read the low-, middle-, or high-level document that corresponded to their scheduled exam. Within 3 days of their consent, patients were emailed their randomly assigned document and survey. Participants were asked to read their information document and complete the survey before the date of their upcoming examination. Participation in this study was entirely remote and asynchronous.

Statistics

To analyze understanding across demographic variables and reading levels, Fisher’s exact/chi-squared tests, Student’s t-test, point-biserial correlation, and logistic regression were performed. Statistical significance was defined as p ≤ 0.05.

Results

Information distribution and survey completion

Participants were randomized to read a document at one of three reading levels: low (level 1), middle (level 2), and high (level 3). The distribution of participants across these levels can be seen in Table 2.

Table 2.

Reading level of assigned documents

Exam type                          Reading level 1   Reading level 2   Reading level 3
MRI knee                           2/10 (20%)        2/10 (20%)        6/10 (60%)
MRI shoulder                       1/10 (10%)        5/10 (50%)        4/10 (40%)
MRI spine                          17/53 (32.1%)     22/53 (41.5%)     14/53 (26.4%)
MRI MSK*                           2/10 (20%)        6/10 (60%)        2/10 (20%)
US injection                       1/1 (100%)        0/1 (0%)          0/1 (0%)
X-ray bone                         2/2 (100%)        0/2 (0%)          0/2 (0%)
CT spine                           2/10 (20%)        4/10 (40%)        4/10 (40%)
US diagnostic MSK                  0/4 (0%)          1/4 (25%)         3/4 (75%)
DA✝                                N/A               N/A               N/A
Total number of participants (%)   27/100 (27%)      40/100 (40%)      33/100 (33%)

* MRI MSK included all general MRI exams and joints that did not fall under one of the other categories above

✝ No patients were scheduled for direct arthrography during the study period

Of the 361 individuals who were consented to participate in the study, 127 responded to the survey and 100 patients (100/361, 28%) successfully completed the study.

Demographics

The average age of participants was 53.7 years old (SD: 15.1) and the ratio of men to women was 35:65. Sixty participants (60%) qualified as being “high income” (making $75,000 or more per year) and 79 participants (79%) completed a bachelor’s degree level of education or higher (Fig. 1).

Fig. 1.

Fig. 1

Demographic results (out of 100 total participants). A Yearly income. B Highest level of education attained

Sources of patient information

Patients reported using various sources to obtain their health information, with the majority (68%, 68/100) utilizing more than one source. Overall, patients most commonly received information from their doctors (86%, 86/100) and from the Internet (69%, 69/100) (Fig. 2). Use of the Internet some, most, or all of the time was reported by 91% (91/100) of patients. Of those who reported rarely or never using the Internet, 50% (5/10) did not think the information was useful, 20% (2/10) found the information too hard to understand, 10% (1/10) were not familiar with its use, and 20% (2/10) did not offer a reason. Prior to reviewing the information documents provided in this study, only 15% (15/100) of patients had looked up information about the radiology examination they were scheduled to have; of those, 80% (12/15) used the Internet to do so.

Fig. 2.

Fig. 2

Health information resources used by patients

Patient understanding

A total of 35% of patients reported reading the entirety of their assigned document. The average amount of text read by patients was 83% for the level 1 document, 69% for level 2, and 73% for level 3. A higher percentage of females (85%) than males (66%) read more than half of the document (p = 0.042). Of the patients who did not read the whole document (n = 65), the reasons cited were as follows: 38% reported having read the information previously, 13% reported the document was too long, 21% cited another reason not listed in the survey, and 28% reported a combination of the aforementioned reasons.

Subjective understanding was assessed by patients estimating a percentage of the text that they understood. Seventy-one percent of patients reported a subjective understanding of the entirety of their assigned document. The remaining patients who understood less than the entire document responded with an assessment of the reasons why (Fig. 3). Only 1 patient offered another explanation beyond what was provided, responding that too much time was required to read the document.

Fig. 3.

Fig. 3

Patient response to reasons why they did not understand the entire information document

We found a positive correlation between subjective understanding and having at least a bachelor’s degree (r = 0.234, p = 0.019). Patients with college degrees reported subjectively understanding 12.2% more of what they read than those who did not finish college (94.3% vs. 82.1%, p = 0.019). No significant associations with subjective understanding were found for age, sex, income, document reading grade level, whether the patient had undergone the exam before, or whether knowledge was supplemented by other sources.

Objective understanding was assessed with two multiple-choice comprehension questions that populated within the survey based on the type of study the patient was to undergo. A total of 6% of patients had two incorrect responses, 27% had one correct and one incorrect response, and 67% had two correct responses. More females than males [74% (48/65) vs. 54% (19/35), p = 0.047] and more patients with college degrees than without [72% (57/79) vs. 48% (10/21), p = 0.034] showed higher objective understanding with two correct responses (Fig. 4). Mean quiz performance was 14.5% higher for patients with college degrees than for those without (83.5% vs. 69.0%, p = 0.049). No significant differences in objective understanding were found across patient age, income, document reading grade level, whether the patient had undergone the exam before, or whether knowledge was supplemented by other sources.
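The unadjusted sex comparison above can be reproduced from the reported counts with a chi-squared test on a 2×2 contingency table. The methods list Fisher’s exact/chi-squared tests without specifying a continuity correction, so this sketch assumes the uncorrected Pearson statistic; for 1 degree of freedom its p-value can be computed in pure Python via the complementary error function. (Note the adjusted OR of 2.65 comes from the multivariable logistic regression, not from this crude table.)

```python
from math import erfc, sqrt

def chi2_2x2(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    """Uncorrected Pearson chi-squared test for the 2x2 table [[a, b], [c, d]].
    Returns (chi2 statistic, two-sided p-value).
    For df = 1, P(X > x) = erfc(sqrt(x / 2))."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    # Expected counts under independence of rows and columns
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return chi2, erfc(sqrt(chi2 / 2))

# Females: 48 of 65 answered both questions correctly; males: 19 of 35
chi2, p = chi2_2x2(48, 17, 19, 16)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # prints chi2 = 3.94, p = 0.047
```

The resulting p-value matches the 0.047 reported for the female vs. male objective-understanding comparison.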

Fig. 4.

Fig. 4

Objective understanding of document text demonstrated by correct responses to both test questions. Patients who were female and who had college degrees had higher levels of understanding. No difference was found between different document reading grade levels

Logistic regression showed a significant association between a college degree and subjective understanding of at least half of the document (OR 7.97, 95% CI [1.24, 51.34], p = 0.029) and between female sex and objective understanding with two correct responses to the test questions (OR 2.65, 95% CI [1.06, 6.62], p = 0.037) when controlling for each other as well as age, income, document grade level, whether the patient had the exam before, and whether knowledge was supplemented by other sources.

Discussion

We found that a majority of patients use a variety of sources to obtain their health information. While most depend on their referring physicians, many also consume written information from other sources including the Internet (69%), hospital websites (21%), and hospital pamphlets (9%). The ubiquity of these sources underscores the importance of determining which factors influence patient understanding of information documents so that communication can be optimized.

Patients with a higher level of education, defined as a college bachelor’s degree or higher, had higher subjective and objective understanding of information documents. This is not surprising as more education likely provides an increased capacity to understand some of the inherent complexity of medical language. It is also possible that patients with higher education levels have a better overall awareness of their own health and are therefore more invested in learning more about the procedures they undergo [10].

We also found that female patients showed higher objective understanding than males. This may be partly explained by existing literacy data: females have been shown to perform better than males on reading comprehension tasks, with differences even seen on functional neuroimaging [11]. Additionally, it is possible that males are more prone to skipping sections while reading [12]. The present study supports this, as a significantly lower percentage of males than females read through at least half of their assigned document. It is also interesting to note that despite females having higher objective understanding than males, there was no sex difference in subjective understanding. This suggests either that male patients tend to overestimate their reading comprehension or that female patients underestimate theirs.

Regarding reasons for decreased subjective understanding, we found that a large proportion of participants did not select any of the options we provided. While the survey included an optional space to write in a response if an appropriate one was not listed, only one participant wrote a reply: “Too much time to read.” We postulate that one reason for lack of understanding could be the low proportion of patients who read their entire document (35%). The average amount of text read varied between 69 and 83% depending on the reading grade level. Consistent with the write-in response, patients may not have read their entire document because they thought the documents were too long and took too much time to read.

Several studies highlight the importance of ensuring that reading levels are not too high for patients. Choudhry et al. found that increasing the readability of discharge summaries by lowering their reading level resulted in fewer follow-up phone calls and readmissions, suggesting that lowering reading level increases patient understanding on average [13]. It has also been shown that decreasing sentence length, using simpler words, and replacing text with images improves health literacy for the most vulnerable patients [14]. In this study, we did not find that lower reading grade level documents were associated with increased understanding. One potential reason for this discrepancy is that the small sample size limited the random distribution of participants across reading levels. Another is that approximately half (48%) of participants had an income of at least $100,000 and 79% had at least a bachelor’s degree, raising the question of how results might have changed with a more economically and educationally diverse population.

While we did not find a difference in understanding based on the reading level of our documents, it is important to acknowledge that information on radiology examinations is not representative of all types of health information. As such, we do not suggest that providers disregard reading level on the basis of our results. Instead, we recommend that providers offer two versions of documents for patients: one at a lower reading level and one at a higher level. Because our results showed that various differences between patients influenced their understanding, there is probably still value in providing a higher, more complex version of materials for patients who are curious to know more and a simpler version for others; in this way, patients can decide which version they prefer. However, when offering two versions is impractical, providing all patients with materials at a lower reading level may be the most appropriate and efficient way to increase overall patient understanding [15].

While much has been written about how to alter the reading level of documents to allow for increased understanding, our results highlight how differences in patient populations also need to be accounted for [16]. Documents written at the appropriate reading level are still not necessarily understood equally by everyone. As it pertains to radiology, department staff should not assume that patients fully understand the examinations they are scheduled to undergo, even if they have been given written information about them. Patients should be given an opportunity to ask questions and be reassured, as appropriate. While our study focused on diagnostic examinations, the stakes are undoubtedly higher for interventional procedures. Further research will be necessary to determine which factors influence patient understanding in those situations.

We note several limitations to our study. First, the sample was small (100 participants) and limited to English-speaking patients within one hospital system. The small and generally well-educated sample limited the random distribution of participants across reading levels. It should also be noted that over half (53%) of participants were scheduled to undergo the same examination, MRI spine, and no patients were scheduled to undergo direct arthrography, leading to an uneven distribution of participants between the 9 procedure types. Because patients completed the study between August 2020 and March 2021, we attributed this to the cancelation and postponement of many elective procedures during the COVID-19 pandemic [17]. Another important limitation was the uneven distribution of FKGL across text levels. Had the reading levels been more standardized and the sample larger, there may have been a stronger association between reading level and understanding. Finally, the objective measure of understanding consisted of only two multiple-choice questions and was limited by the asynchronous, remote nature of the study; more questions may have provided a more accurate assessment. It may also have been beneficial for participants to speak with a provider to demonstrate their understanding, as this would more closely mirror clinical reality. With this study, we have highlighted the need for further research with a larger sample and a more economically and educationally diverse patient population, utilizing a more in-depth assessment of objective understanding.

Conclusion

In conclusion, patients with college degrees had higher levels of understanding of information documents on radiology exams. Females read more of the documents than males and also had higher objective understanding. We did not find that reading grade level affected understanding in our patient population.

Supplementary Information

Below is the link to the electronic supplementary material.

Data Availability

The data that support the findings of this study are available from the corresponding author, ABH, upon reasonable request.

Declarations

Conflict of interest

The authors declare no competing interests.

Footnotes

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Amissa Brewer-Hofmann, Email: abrewerhofmann@gmail.com.

Sana Sajjad, Email: sana.sajjad@downstate.edu.

Zane Bekheet, Email: znb2102@columbia.edu.

Matthew P. Moy, Email: mm5054@cumc.columbia.edu

Tony T. Wong, Email: ttw2105@cumc.columbia.edu

References

  • 1.America's Health Literacy: Why We Need Accessible Health Information. An Issue Brief From the U.S. Department of Health and Human Services. 2008.
  • 2.Weiss BD. Health Literacy and Patient Safety: Help Patients Understand. Manual for Clinicians. 2nd ed. Chicago, IL: Amer Medical Assoc Foundation. 2007.
  • 3.Wittink H, Oosterhaven J. Patient education and health literacy. Musculoskelet Sci Pract. 2018;38:120–127. doi: 10.1016/j.msksp.2018.06.004. [DOI] [PubMed] [Google Scholar]
  • 4.Conard S. Best practices in digital health literacy. Int J Cardiol. 2019;292:277–279. doi: 10.1016/j.ijcard.2019.05.070. [DOI] [PubMed] [Google Scholar]
  • 5.Weiss BD. Health literacy: a manual for clinicians. Chicago, IL: American Medical Association; 2003. [Google Scholar]
  • 6.National Institutes of Health. How to write easy-to-read health materials. National Library of Medicine website. www.nlm.nih.gov/medlineplus/etr.html. Accessed 10 Jan 2022.
  • 7.Hansberry DR, John A, John E, Agarwal N, Gonzales SF, Baker SR. A critical review of the readability of online patient education resources from RadiologyInfo.Org. AJR Am J Roentgenol. 2014;202(3):566–575. doi: 10.2214/AJR.13.11223. [DOI] [PubMed] [Google Scholar]
  • 8.Bange M, Huh E, Novin SA, Hui FK, Yi PH. Readability of patient education materials from RadiologyInfo.org: has there been progress over the past 5 years? Am J Roentgenol. 2019;213(4):875–879. doi: 10.2214/AJR.18.21047. [DOI] [PubMed] [Google Scholar]
  • 9.Kincaid JP, Fishburne RP, Rogers RL, Chissom BS. Derivation of new readability formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy enlisted personnel. Institute for Simulation and Training; 1975.
  • 10.Raghupathi V, Raghupathi W. The influence of education on health: an empirical assessment of OECD countries for the period 1995–2015. Arch Public Health. 2020;78:20. doi: 10.1186/s13690-020-00402-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Rossell SL, Bullmore ET, Williams SC, David AS. Sex differences in functional brain activation during a lexical visual field task. Brain Lang. 2002;80(1):97–105. doi: 10.1006/brln.2000.2449. [DOI] [PubMed] [Google Scholar]
  • 12.Topping KJ. Fiction and non-fiction reading and comprehension in preferred books. Read Psychol. 2015;36(4):350–387. doi: 10.1080/02702711.2013.865692.
  • 13.Choudhry AJ, Baghdadi YM, Wagie AE, et al. Readability of discharge summaries: with what level of information are we dismissing our patients? Am J Surg. 2016;211(3):631–636. doi: 10.1016/j.amjsurg.2015.12.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Sudore RL, Schillinger D. Interventions to improve care for patients with limited health literacy. J Clin Outcomes Manag. 2009;16(1):20–29. [PMC free article] [PubMed] [Google Scholar]
  • 15.Cui Z, Su M, Li L, Shu H, Gong G. Individualized prediction of reading comprehension ability using gray matter volume. Cereb Cortex. 2018;28(5):1656–1672. doi: 10.1093/cercor/bhx061. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Hersh L, Salzman B, Snyderman D. Health literacy in primary care practice. Am Fam Physician. 2015;92(2):118–124. [PubMed] [Google Scholar]
  • 17.Mathews M, Mathews AW. Hospitals push off surgeries to make room for coronavirus patients. Wall Street Journal. Published March 16, 2020. Available at: https://www.wsj.com/articles/hospitals-push-off-surgeries-to-make-room-for-coronavirus-patients-11584298575.



Articles from Skeletal Radiology are provided here courtesy of Nature Publishing Group
