Author manuscript; available in PMC 2018 Jan 1.
Published in final edited form as: Pract Radiat Oncol. 2016 Aug 1;7(1):57–62. doi: 10.1016/j.prro.2016.07.008

Online Patient Information from Radiation Oncology Departments is too Complex for the General Population

Stephen A Rosenberg 1,*, David M Francis 1,*, Craig R Hullet 1, Zachary S Morris 1, Jeffrey V Brower 1, Bethany M Anderson 1,2, Kristin A Bradley 1,2, Michael Bassetti 1,2, Randall J Kimple 1,2
PMCID: PMC5219938  NIHMSID: NIHMS807357  PMID: 27663932

Abstract

Purpose

Nearly two-thirds of cancer patients seek information about their diagnosis online. We assessed the readability of online patient education materials found on academic radiation oncology department websites to determine whether they adhered to guidelines suggesting that information be presented at a 6th grade reading level.

Materials/Methods

The Association of American Medical Colleges (AAMC) website was utilized to identify all academic radiation oncology departments in the United States. One-third of these department websites were selected for analysis using a random number generator. Both general information on radiation therapy and specific information regarding various radiation modalities were collected. To test the hypothesis that these online educational materials were written at the recommended grade level, a panel of ten common readability tests was employed. A composite grade level of readability was constructed from the eight readability measures that provide a single grade-level output.

Results

A mean of 5,605 words (range, 2,058–12,837) was collected from each of the thirty department websites. Using the composite grade-level score, the overall mean readability level was 13.36 (12.83–13.89), corresponding to a collegiate reading level. This was significantly higher than the target 6th grade (middle school) reading level (t(29) = 27.41, p < 0.001).

Conclusions

Online patient educational materials from academic radiation oncology websites are significantly more complex than recommended by the NIH and the Department of Health and Human Services. Our analysis suggests that, to improve patients’ comprehension of radiotherapy and its role in their treatment, the language used in online patient information should be simplified to communicate the information at a more appropriate level.

Introduction

The focus of health care in the United States is on personalized, patient-centered treatment using evidence-based approaches. This is imperative in oncology, where treatments for malignancies are individualized and based on a myriad of factors such as stage, grade, and individual performance status. As such, the numerous options available to address any individual cancer can be staggering. This is especially true in radiation oncology, where the emotional complexity involved with a diagnosis of cancer, combined with the diversity of treatment modalities, can be overwhelming for even the most educated. Indeed, radiation oncologists meet with patients and their families at critical intersections of their lives, when they are at their most vulnerable. Twenty years ago, patients received most medical information from their physicians; today, people increasingly seek health advice online. This information can impact decision-making before patients even see a physician 1,2. The internet has rapidly become a popular means of disseminating medical information. Online health information that is accurate and understandable is essential in helping patients discern which treatment options are available and may even play an integral role in overcoming health care disparities 3,4.

Health literacy refers to the degree to which individuals can obtain, process, and understand basic health information and services needed to make appropriate health decisions 5. In the United States, over 80 million adults demonstrate limited health literacy, and fewer than a quarter of patients exhibit literacy skills that are considered proficient 6,7. One potential reason for this poor health literacy is that the average adult in the US possesses a 7th to 9th grade reading ability 8,9. Decreased health literacy contributes to diminished comprehension of health care information, which directly correlates with inferior overall health outcomes 10. It is crucial that educational literature for patients be written at a level that is suitable to the readership of the general population. In 2010, the United States Department of Health and Human Services (US-DHHS) developed a National Action Plan to improve health literacy. National guidelines from the US-DHHS, the National Institutes of Health (NIH), and the American Medical Association recommend that health materials be written at or below the sixth grade reading level based on the current United States literacy rate 11.

In this study, we sought to determine whether patient-specific materials on the websites of academic radiation oncology departments were written at the appropriate reading level in accordance with the recommended national guidelines. To accomplish this, we collected patient educational materials from randomly selected departmental websites and analyzed the content for overall readability using well-established readability metrics. Our objective was to test the hypothesis that the material aimed toward patients was written at a 6th grade reading level.

Methods

Text extraction

The Association of American Medical Colleges (AAMC) website (https://www.aamc.org) was utilized to identify registered academic radiation oncology departments in the United States (n = 90 in Sept 2015). One-third of these department websites were selected for analysis using a random number generator (n = 30). This study was limited to major academic radiation oncology departments. We reasoned that major academic centers should have the resources to produce some of the most robust patient education materials regarding radiotherapy. Although this represents a small subset of radiation oncology centers in the United States, we believe it likely represents patient information that has been vetted multiple times and therefore reflects some of the best patient information available on radiation oncology treatment, side effects, and other important topics. These randomly selected sites were then screened manually for general introductory information targeted towards patients and their families; extracted text included information describing common radiation treatment modalities such as external beam (EBRT), 3-D conformal (3D CRT), intensity modulated (IMRT), image-guided radiation therapy (IGRT), and proton therapy, as well as side effects related to radiation treatment. Online patient information (OPI) collected from each center included information on simulation, treatment, and staging as available on each website. Importantly, department information directed towards professionals such as individual physician profiles, references, hyperlinks leading to websites outside of each radiation oncology department domain (e.g. National Cancer Institute, American Cancer Society, American Society for Radiation Oncology), and nonmedical information (copyright notices, author information, citations) was excluded from our collection. Every attempt was made to capture online information from radiation oncology websites as patients would have encountered it when reading through each site. Because information on brachytherapy was inconsistently available, it was not included in the text extracted and analyzed.
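As a point of reference for the sampling step, the following is a minimal sketch of how one-third of the identified departments could be drawn with a random number generator. The department names and the seed are placeholders; the paper does not report the specific tool or seed used.

```python
import random

# Placeholder list standing in for the 90 AAMC-registered academic
# radiation oncology departments identified in September 2015.
departments = [f"Department_{i:02d}" for i in range(1, 91)]

random.seed(2015)  # illustrative seed; the actual selection procedure is not reported
sampled = random.sample(departments, k=len(departments) // 3)  # one-third of 90 = 30 sites

print(len(sampled))  # 30
print(sampled[:5])   # first few randomly selected department placeholders
```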

Readability assessment

The collected text from each academic department website was analyzed using Readability Studio version 2012.0 (Oleander Software, Hadapsar, India). Ten readability tests were employed to assess the material: the New Dale-Chall Test, the Flesch Reading Ease Score, the Flesch-Kincaid Grade Level, the FORCAST test, the Fry Score, the Simple Measure of Gobbledygook (SMOG), the Gunning Fog Index, the New Fog Count, the Raygor Readability Estimate, and the Coleman-Liau Index 12,13. The Flesch Reading Ease scale is employed by most government agencies and generates a score ranging from 0 (very difficult) to 100 (very easy). For the majority of readability tests, the overall score is based on the number of words per sentence and the number of syllables per word. Each of the ten tests reported a score or range of scores that was utilized for the overall readability analysis.
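To make the basis of these scores concrete, the snippet below implements the published Flesch Reading Ease and Flesch-Kincaid Grade Level formulas using a simple vowel-group syllable heuristic. This is an illustrative sketch, not the algorithm used by Readability Studio; commercial tools use more careful sentence and syllable detection.

```python
import re

def count_syllables(word: str) -> int:
    """Heuristic syllable count: runs of consecutive vowels, minimum of one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text: str):
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for a block of text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # average words per sentence
    spw = syllables / len(words)   # average syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level

# Example sentence typical of radiation oncology patient information (illustrative text)
sample = ("Intensity modulated radiation therapy shapes the prescribed dose to the "
          "tumor while minimizing exposure of the surrounding normal tissue.")
print(flesch_scores(sample))
```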

Statistical analysis

Statistical analyses were performed using IBM SPSS for Macintosh, version 22.0 (IBM Corporation, Armonk, New York). The readability of online patient information from each radiation oncology department website was compared against the recommended 6th grade reading level using single-sample t-tests.
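The comparison reported in the Results (t(29) = 27.41, p < 0.001) can be reproduced in outline with a one-sample t-test, as sketched below using SciPy in place of SPSS. The grade-level values here are simulated placeholders, not the study data.

```python
import numpy as np
from scipy import stats

# Simulated composite grade-level scores for 30 websites (placeholder values;
# the paper reports a mean composite grade level of roughly 13.4).
composite_grades = np.random.default_rng(0).normal(loc=13.4, scale=1.4, size=30)

# Single-sample t-test against the recommended 6th grade reading level
t_stat, p_value = stats.ttest_1samp(composite_grades, popmean=6.0)
print(f"t({len(composite_grades) - 1}) = {t_stat:.2f}, p = {p_value:.2e}")
```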

Results

Online patient information (OPI) was collected from 30 randomly selected academic radiation oncology department websites (Supplemental Figure). A mean of 5,604 words (range, 2,058–12,837) per website was collected for analysis. Some of the most commonly utilized readability tests are the Raygor Readability Estimate, the Flesch Reading Ease scale, and the Fry Score. Analysis by the Raygor Readability Estimate revealed a mean score of 15 [range 10–17] for OPI, corresponding to a collegiate level. This is well above the target 6th grade (score of 6) reading level (Figure 1). Using the Flesch Reading Ease scale, the average score of the OPI across the websites was 37 [range 16–55], approximately collegiate level (Figure 2). If OPI were written at a level understood by most 13–15 year olds, it would correspond to a score of 60–70 on the Flesch Reading Ease scale. Additionally, the Fry Score yielded a mean of a 16th grade level, again substantially above the recommended reading level. Collectively, these commonly used tests illustrate that OPI from academic radiation oncology department websites is written at a substantially higher level than recommended by guidelines.

Figure 1. Raygor Readability level.


OPI (online patient information) from radiation oncology websites (red) underwent Raygor Readability analysis. The mean score for radiation oncology OPI was 15 (range, 10–17), well above the target 6th grade reading level (green shading) recommended by the NIH/DHHS.

Figure 2. Flesch Reading Ease Score of OPI (Online Patient Information).


The Flesch Reading Ease scale generates a score from 100 (very easy) to 0 (very difficult), with “plain English” scoring 60–70 (understood by most 13–15 year olds). This test is a standard measurement of readability often used by US government agencies. The mean score on this test was 37 (range, 16–55) for radiation oncology OPI, well below the target level (green shading).

These results were consistent across the variety of tests we employed. Regardless of the test, the collected OPI was generally presented at or above a freshman collegiate reading level. Of these assessments, eight (Coleman-Liau, Flesch-Kincaid, FORCAST, Fry, Gunning Fog, New Fog, Raygor, and SMOG) generated a single measure of reading grade level (Table 1). These scales were highly correlated, and in combination created a strongly reliable measure (SI alpha = 0.98); as such, they were combined to yield a comprehensive grade level. Using this combined readability metric, the composite grade level was 13.34 (12.83–13.89), corresponding to the freshman or sophomore year of college. This is significantly higher than the target 6th grade level recommended by guidelines.
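For readers who want to see how such a combined metric can be checked for internal consistency, the sketch below computes Cronbach's alpha over a sites-by-tests matrix of grade levels and then averages the eight tests into a composite score. The values are simulated stand-ins illustrating the general procedure; the paper does not state which reliability software or formula variant was used.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (sites x tests) matrix of grade-level scores."""
    k = scores.shape[1]                           # number of readability tests
    item_vars = scores.var(axis=0, ddof=1)        # variance of each test across sites
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated matrix: 30 sites x 8 grade-level tests (placeholder data; the study
# reports SI alpha = 0.98 and a composite grade level of roughly 13.4).
rng = np.random.default_rng(1)
site_difficulty = rng.normal(13.4, 1.5, size=(30, 1))        # per-site "true" level
grades = site_difficulty + rng.normal(0, 0.4, size=(30, 8))   # eight correlated tests

print(round(cronbach_alpha(grades), 2))      # internal consistency of the panel
print(round(grades.mean(axis=1).mean(), 2))  # composite grade level averaged over sites
```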

Table 1. Readability of Radiation Oncology Departments Online Patient Information.

Eight readability tests provided a single measure of grade level. These tests were strongly correlated and in combination created a highly reliable measure (SI alpha = 0.98). Across the 30 radiation oncology departments sampled, OPI was written on average at the 13.4 grade level (freshman-sophomore collegiate level) according to the combined readability metric.

Reading Test Grade Mean [min, max]
Coleman-Liau 13.5 [10.9, 16.5]
Flesch-Kincaid 12.7 [9.6, 16.2]
FORCAST 11.7 [10.8, 12.5]
Fry 16 [11, 17]
Gunning Fog 13.9 [10.6, 17.1]
New Fog Count (Kincaid) 10 [7.2, 13.6]
Raygor Estimate 15 [10, 17]
SMOG 14.3 [12, 17]

Discussion

The widespread availability of the internet has had a significant impact on how patients seek information regarding cancer diagnoses and treatment options. Patients and their families increasingly seek supplemental information online to help ascertain which treatments are most suitable for their malignancy. The proliferation of internet-connected mobile devices makes accessing this information instantaneous and seamless. These educational health materials can play an important role in describing treatments and side effects to patients so that they approach consultations with a greater knowledge base than would otherwise be expected. Patients naturally gravitate towards information they can comprehend and will trust these materials to guide treatment decisions 5,9. While it is unknown whether providing information at the appropriate level will impact health outcomes, it is a worthwhile endeavor.

Over 60% of cancer patients receive radiation as a component of their treatment. Modern radiation therapy is a highly technical treatment that has evolved to utilize ever more precise modalities such as 3D-CRT, IMRT, and IGRT. The terminology involved in describing radiation treatment, and malignancies in general, is inherently esoteric. Radiation oncology department websites are essential resources for providing additional cancer- and radiation-related information to current and prospective patients. Based on our analyses, academic radiation oncology departments have significant room to improve their OPI. All sites investigated presented material written substantially above both the reading level of the average adult (7th–9th grade) and the consensus guideline level (6th grade).

In order to limit potential biases involved with any individual assessment, we employed a panel of commonly used readability tests. The results were consistently recapitulated across each of these ten readability indices, with a composite grade level of 13.36 (Table 1). These results illustrate the need to refine the language of general radiation oncology information available to patients. The majority of these measures generate their grade output by analyzing the number of syllables per word and/or the number of words per sentence. Simpler word choice and grammar are therefore pivotal to improving OPI, for example by limiting the number of times a complex term or phrase is used. Exchanging the term “brachytherapy” for “local therapy” decreases the reading level from a collegiate to a high school level, as illustrated in the sketch below.
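The single-sentence example below applies the Flesch-Kincaid grade-level formula to hand-counted word and syllable totals to show the direction of this effect. The sentences and counts are illustrative inventions, not text taken from any analyzed website, and the collegiate-to-high-school figure in the text refers to full documents rather than this toy example.

```python
def fk_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid grade level computed from raw counts."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# "You may be treated with brachytherapy."  -> 6 words, 11 syllables (hand counted)
# "You may be treated with local therapy."  -> 7 words, 11 syllables (hand counted)
print(round(fk_grade(6, 1, 11), 1))  # ~8.4: the polysyllabic term raises the grade
print(round(fk_grade(7, 1, 11), 1))  # ~5.7: simpler wording falls below the 6th grade target
```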

It is important to realize that not all patients learn best from reading material. Online videos and other media are available for patients to learn about cancer treatments. Material in these other formats should also be presented at a level patients can comprehend 14–19. Improving OPI will require a multidisciplinary approach involving physicians, nurses, teachers, and patients. It is critical to engage with patients across various media and venues to address differences in learning styles and educational levels.

Information found on departmental websites often is designed to serve multiple purposes: patient education and recruitment; practice building; description of ongoing research and clinical trials; resident and medical student education; and donor outreach. To avoid the confounding effect of material written for different audiences, we manually collected data specifically designated for patients. This included general cancer information, basic radiation therapy concepts, and treatment side effects. We also included information regarding commonly used radiation modalities (EBRT, 3D CRT, IMRT, and IGRT) that was targeted towards patients. We did not include materials likely directed towards clinicians and scientists, such as department research, clinical trials, or resident education. We also excluded references and links that led to websites or pdf files outside of each department’s network domain. As not all departments included information on brachytherapy and radioisotopes, we specifically excluded this information, which may have led to over- or underestimation of reading level.

The information we collected was written at an 11th grade or higher reading level; the results from most tests correspond to the reading level of a college student or professor. Even though the quantity of information collected differed significantly (word range, 2,058–12,837), readability levels remained high and did not differ based on the quantity of text. Similar results have been observed in other, non-oncological disciplines, with large gaps between the complexity of information found on department websites and the nationally recommended standard literacy level 20,21.

Other investigators recently have examined patient materials pertaining to radiation therapy from the American Cancer Society (ACS), the American Society for Radiation Oncology (ASTRO), NCI-Designated Cancer Centers, and the combined American College of Radiology/Radiological Society of North America (ACR/RSNA) websites 22,23. That analysis demonstrated that these texts were written at the 10th grade level. Just over one quarter of documents from the ACS met consensus guidelines, suggesting that it is possible to convey this complex medical information at an appropriate level.

We acknowledge several limitations of our analysis. Text was collected and analyzed at a single point in time, and radiation oncology departments may regularly change and revise information presented on their websites. The effect of this time bias may be minimized, as departments appear to revise faculty web pages, ongoing research, clinical trials, and residency curricula more frequently than patient-directed content. We did not include printed educational materials that may be provided to patients and their families during visits, because some departments provided this information online while others did not. We also excluded hyperlinks that led to websites outside of each department’s domain. Some departments may limit the amount of text intended for patients on their websites but include multiple hyperlinks that direct patients to outside resources written at a clearer reading level. Finally, we assumed that all analyzed text was accurate, but did not employ a formal investigation of accuracy.

Conclusion

Online patient information on academic radiation oncology websites is written at a level that exceeds both the average adult literacy level in the United States and the nationally recommended 6th grade reading level. Our analysis suggests the need for broad simplification and revision of these resources in order to improve comprehension for the average radiation oncology patient. Additional work is also needed to determine whether changes in the communication of health information can impact patient outcomes by improving compliance with recommendations, preparing patients for the anticipated treatments, or enhancing enrollment in clinical trials. As radiation modalities and cancer therapies continue to advance and become more sophisticated, it is imperative that the information explaining these treatments to patients and their families remains both accurate and understandable. Simplifying patient educational materials with the goal of enhancing overall communication and clinical outcomes will require a multidisciplinary approach among members of the oncologic team, along with collaboration with patients and their families.

Supplementary Material

Acknowledgments

Research Support: This work was supported in part by NIH/NCI grant CA160639 (RK).

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Disclaimers (by author): SR (None); DF (None); CH (None); ZM (None); JB (None); BA (Elekta); KB (UptoDate-Intellectual Property Contributor); MB (None); RK (Threshold Pharmaceuticals)

Previously Presented: This work was presented at ASTRO 2015, San Antonio, TX.

References

1. Welch Cline RJ, Penner LA, Harper FW, et al. The roles of patients' internet use for cancer information and socioeconomic status in oncologist-patient communication. J Oncol Pract. 2007;3:167–171. doi: 10.1200/JOP.0737001.
2. Koch-Weser S, Bradshaw YS, Gualtieri L, et al. The Internet as a health information source: findings from the 2007 Health Information National Trends Survey and implications for health communication. J Health Commun. 2010;15(Suppl 3):279–293. doi: 10.1080/10810730.2010.522700.
3. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155:97–107. doi: 10.7326/0003-4819-155-2-201107190-00005.
4. Hirschberg I, Seidel G, Strech D, et al. Evidence-based health information from the users' perspective--a qualitative analysis. BMC Health Serv Res. 2013;13:405. doi: 10.1186/1472-6963-13-405.
5. Baker DW. The meaning and the measure of health literacy. J Gen Intern Med. 2006;21:878–883. doi: 10.1111/j.1525-1497.2006.00540.x.
6. S W. Assessing the nation's health literacy: Key concepts and findings of the National Assessment of Adult Literacy (NAAL). American Medical Association Foundation; 2008.
7. Cutilli CC, Bennett IM. Understanding the health literacy of America: results of the National Assessment of Adult Literacy. Orthop Nurs. 2009;28:27–32; quiz 33–34. doi: 10.1097/01.NOR.0000345852.22122.d6.
8. Kirsch I, Jungeblut A, Jenkins L, et al. Adult Literacy in America. 3rd ed. National Center for Education Statistics; 2002.
9. Safeer RS, Keenan J. Health literacy: the gap between physicians and patients. Am Fam Physician. 2005;72:463–468.
10. Bains SS, Bains SN. Health literacy influences self-management behavior in asthma. Chest. 2012;142:1687; author reply 1687–1688. doi: 10.1378/chest.12-2292.
11. National Institutes of Health. How to Write Easy-to-Read Health Materials. 2013. https://www.nlm.nih.gov/medlineplus/etr.html.
12. Flesch R. A new readability yardstick. J Appl Psychol. 1948;32:221–233. doi: 10.1037/h0057532.
13. McLaughlin G. SMOG grading: a new readability formula. J Reading. 1969.
14. Dunn J, Steginga SK, Rose P, et al. Evaluating patient education materials about radiation therapy. Patient Educ Couns. 2004;52:325–332. doi: 10.1016/S0738-3991(03)00108-3.
15. Hahn CA, Fish LJ, Dunn RH, et al. Prospective trial of a video educational tool for radiation oncology patients. Am J Clin Oncol. 2005;28:609–612. doi: 10.1097/01.coc.0000182417.94669.a0.
16. Jahraus D, Sokolosky S, Thurston N, et al. Evaluation of an education program for patients with breast cancer receiving radiation therapy. Cancer Nurs. 2002;25:266–275. doi: 10.1097/00002820-200208000-00002.
17. Matsuyama RK, Lyckholm LJ, Molisani A, et al. The value of an educational video before consultation with a radiation oncologist. J Cancer Educ. 2013;28:306–313. doi: 10.1007/s13187-013-0473-1.
18. Partin MR, Nelson D, Radosevich D, et al. Randomized trial examining the effect of two prostate cancer screening educational interventions on patient knowledge, preferences, and behaviors. J Gen Intern Med. 2004;19:835–842. doi: 10.1111/j.1525-1497.2004.30047.x.
19. Brock TP, Smith SR. Using digital videos displayed on personal digital assistants (PDAs) to enhance patient education in clinical settings. Int J Med Inform. 2007;76:829–835. doi: 10.1016/j.ijmedinf.2006.09.024.
20. Colaco M, Svider PF, Agarwal N, et al. Readability assessment of online urology patient education materials. J Urol. 2013;189:1048–1052. doi: 10.1016/j.juro.2012.08.255.
21. Svider PF, Agarwal N, Choudhry OJ, et al. Readability assessment of online patient education materials from academic otolaryngology-head and neck surgery departments. Am J Otolaryngol. 2013;34:31–35. doi: 10.1016/j.amjoto.2012.08.001.
22. Byun J, Golden DW. Readability of patient education materials from professional societies in radiation oncology: are we meeting the national standard? Int J Radiat Oncol Biol Phys. 2015;91:1108–1109. doi: 10.1016/j.ijrobp.2014.12.035.
23. Rosenberg SA, Francis D, Hullett CR, et al. Readability of Online Patient Educational Resources Found on NCI-Designated Cancer Center Web Sites. J Natl Compr Canc Netw. 2016;14:735–740. doi: 10.6004/jnccn.2016.0075.
