Abstract
Background
Little is known about the readability and utility of patient education materials for stereotactic radiosurgery (SRS). Therefore, the goal of this investigation was to evaluate such materials from high-performing neurosurgery hospitals and professional societies through an analysis of readability and educational content.
Methods
In this cross-sectional study, 61 websites associated with the top 50 neurosurgery and neurology hospitals according to U.S. News & World Report (USNWR) and 11 predetermined professional medical societies were queried. Identified SRS education materials were analyzed by 6 readability indices. Educational content was assessed by 10 criteria based on surveys of patients’ perspectives about SRS.
Results
Fifty-four materials were identified from the target population (45 from USNWR hospital websites and 9 from professional society websites). Mean readability of materials ranged from 11.7 to 15.3 grade level, far more difficult than national recommendations of sixth and eighth grade. Materials were found to have deficiencies in educational content. Compared with high-performing hospitals, materials from websites of professional societies were longer (P = .002), and more likely to discuss risks and benefits specific to SRS (P = .008), alternative treatment options (P = .05) and expected outcomes or postprocedure descriptions (P = .004). Hospital materials were also more likely to favor brand-specific terminology (eg, GammaKnife) over generic terminology (eg, radiosurgery; P = .019).
Conclusion
Publicly available online patient educational materials for SRS are written at reading levels above national recommendations. Furthermore, many lack information identified as important by patients. Reevaluation and improvement of online SRS educational materials on a national scale are warranted.
Keywords: patient education, stereotactic radiosurgery (SRS)
The internet is increasingly becoming a resource for patients to access health care information about their medical conditions.1,2 A vast expanse of readily accessible online educational materials exists and has been shown to affect how patients make decisions regarding their medical care.3,4 Furthermore, these resources can be important tools to improve patient self-management after receiving care. Optimally, in this setting, availability of easily readable, accurate, and comprehensive online education materials would be valuable for patient education and decision making.5
Poor health literacy is gaining recognition as a predictor of worse clinical outcomes. As a result, there is a growing effort to address gaps in population health literacy to improve large-scale quality of care.6,7 The American Medical Association (AMA) and National Institutes of Health (NIH) recommend that patient education materials be written at the sixth- and eighth-grade reading level,8,9 respectively, consistent with the estimated average reading level of eighth to ninth grade for the American adult.10 It has previously been shown that online patient education materials from hospitals and professional medical societies are often written at reading levels above these recommendations.11–17 However, previous investigations often did not assess the educational content of such materials, which limits the generalizability of conclusions. An easily readable text that does not accurately represent key medical concepts would ultimately be of limited value.
For highly technical procedures and treatments, creating effective educational documents can be especially challenging. Stereotactic radiosurgery (SRS) is an example of a procedure that is hard to comprehensively explain in easily readable text. The difficulty arises in using understandable language without compromising clinically important technical details. Therefore, the goal of this study was to analyze online patient education materials for SRS from high-performing hospitals and professional societies on a national scale, with the hypothesis that many materials would have content deficiencies and fail to meet readability recommendations of sixth and eighth grade. By evaluating materials in terms of readability and educational content alike, we aimed to provide a novel assessment of their utility in patient education. Identification of significant deficiencies would encourage modification of currently available materials and could provide insight to guide the development of future educational documents as well.
Methods
An experimental protocol was defined according to Strengthening the Reporting of Observational studies in Epidemiology guidelines.18 A flowchart of the general experimental design is provided in Supplemental Figure 1. Patient education is defined as “a systematic experience in which a combination of methods is generally used, such as the provision of information and advice and behavior modification techniques, which influence the way the patient experiences his illness and/or his knowledge and health behavior, aimed at improving or maintaining or learning to cope with a condition.”19 For this investigation, an educational material was considered any written document or text designed to deliver patient education according to this definition.
Website Query and Educational Content Analysis
From August to September 2018, all publicly available websites per the following criteria were queried: associated with 1) the top-50 ranked U.S. News & World Report (USNWR)20 hospitals in neurosurgery and neurology or 2) 11 predetermined professional societies (American Society for Radiation Oncology, American Society of Clinical Oncology: cancer.net, Cancer Research UK, Radiological Society of North America: radiologyinfo.org, American Association of Neurological Surgeons, American Brain Tumor Association, International Radiosurgery Association, International Stereotactic Radiosurgery Society, National Health Services Foundation Trust, Radiosurgery Society, and United States National Library of Medicine: MedlinePlus.gov).
For each USNWR hospital, the department of neurosurgery website was searched first for SRS educational materials; if no materials were identified then a subsequent search of the corresponding radiation oncology department was performed. These departments were chosen because they represent the specialties typically involved in the delivery of SRS treatment. The professional societies were selected because they represent well-recognized domestic and international oncology and neurosurgical groups. The USNWR top 50 rankings were used on the basis of being a well-known public list of high-performing hospitals. Neither the individual rankings nor the ranking criteria were considered (or relevant) for this analysis; the list was used solely to identify a group of institutions of purported excellence in the private and academic sectors.
Sixty-one websites in total were queried (50 USNWR hospital websites and 11 professional society websites) and only patient education materials with text that specifically described SRS were included. There were no other exclusion criteria. Educational materials were identified on 45 of the 50 USNWR websites and 9 of the 11 professional society websites. Minimum length of text was not used for exclusion as this was a predetermined parameter of interest.
Readability Analysis
All texts were converted into Microsoft Word documents (Microsoft Corp). Readability analysis was conducted using batched projects in Readability Studio 2012 (Oleander Software Ltd). Readability scores included in this investigation were Flesch-Kincaid,21 FORCAST (FORd/CAylor/STicht),22 Fry score,23 Gunning Fog,24 Raygor estimate,25 and Simple Measure of Gobbledygook,26 all of which are reported in grade-level equivalents. Scores can be interpreted as describing the required education level necessary to understand a particular written material. For example, a score of 8.0 would suggest that an eighth-grade education is required to understand a given text. These indices are widely used and well-validated measures of readability.27 Additional information with a calculation formula or description of each score is available in Supplemental Table 1. Text length was measured using the difficult word analysis function in Readability Studio. One identified material from a USNWR hospital website had insufficient text for analysis with Readability Studio and was thus excluded from this analysis.
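As an illustration of how such grade-level indices work, the Flesch-Kincaid grade can be estimated from average sentence length and average syllables per word. The sketch below is not the Readability Studio implementation; the syllable counter in particular is a crude vowel-group heuristic for demonstration only.

```python
import re

def count_syllables(word):
    # Crude heuristic: count vowel groups, subtracting a trailing silent "e".
    # Commercial tools use pronunciation dictionaries instead.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text):
    # FK grade = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)
```

The formula makes clear why technical vocabulary drives scores up: a single polysyllabic term such as "stereotactic" raises the syllables-per-word term far more than sentence shortening can offset.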
Content Analysis
All 54 identified materials, including the 1 USNWR hospital material with insufficient text for readability analysis, were included for educational content analysis. Based on previous surveys of patient perspectives before and during SRS treatment,28–31 binary criteria were created by the authors to assess inclusion of educational content perceived as important by patients (Supplemental Table 2). This patient-centered evaluation approach has been recommended in the development and assessment of information materials.5,32 To quantify the use of brand-specific terminology, occurrences of general terms (radiosurgery, SRS, etc) and brand-specific terms (eg, CyberKnife [Accuray Incorporated]) were tallied for each material. A list of all included general and brand-specific terms is available in Supplemental Table 3. A ratio (number of brand terms/number of generic terms) was constructed for each material and used to evaluate whether there was preferential use of brand terminology among types of materials. Two hospital website materials used brand name terms exclusively, making their brand/generic ratio undefined (ie, division by zero); in these cases, an artificial maximum ratio of 10 was used for statistical comparisons. A paired bar plot was used to compare term usage ratio frequencies between types of materials. Institutional review board approval was not sought because of the nature of the investigation.
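The brand/generic ratio described above can be sketched as follows. The term lists here are abbreviated placeholders (the full lists are in Supplemental Table 3), and the cap of 10 mirrors the artificial maximum used for materials containing no generic terms.

```python
def brand_generic_ratio(text, brand_terms, generic_terms, cap=10.0):
    """Ratio of brand-specific to generic SRS term occurrences in a text.

    Returns `cap` when brand terms appear but generic terms do not
    (the ratio would otherwise be undefined), and 0.0 when neither appears.
    """
    t = text.lower()
    brand = sum(t.count(term.lower()) for term in brand_terms)
    generic = sum(t.count(term.lower()) for term in generic_terms)
    if generic == 0:
        return cap if brand > 0 else 0.0
    return brand / generic

# Placeholder term lists for illustration only
BRANDS = ["gammaknife", "cyberknife"]
GENERICS = ["radiosurgery", "srs"]
```

A ratio above 1 indicates preferential use of brand terminology; the two all-brand hospital materials would receive the capped value of 10.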
Statistical Analysis
The Wilcoxon rank sum test was used to analyze differences in continuous variables. Fisher exact test was used to evaluate categorical differences between groups based on educational content criteria. In accordance with national recommendations, materials were evaluated to determine whether they were at or below each recommended readability level (sixth and eighth grade), as measured by the most permissive of the 6 included readability indices for each material. Statistical testing was conducted using Microsoft Excel (Microsoft Corp) and R version 3.3.3 (R Foundation for Statistical Computing).
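A minimal sketch of the two statistical tests, using SciPy. The length values below are hypothetical; the 2×2 table uses the risk/benefit discussion counts reported in Table 2 (17/45 hospital vs 8/9 society materials).

```python
from scipy.stats import ranksums, fisher_exact

# Wilcoxon rank sum test on a continuous variable (illustrative word counts)
hospital_lengths = [569, 367, 820, 150, 431]
society_lengths = [1602, 2041, 980, 2300]
stat, p_len = ranksums(hospital_lengths, society_lengths)

# Fisher exact test on a 2x2 table of a binary content criterion:
# rows = group, columns = (meets criterion, does not meet criterion)
odds, p_content = fisher_exact([[17, 45 - 17], [8, 9 - 8]])
```

The rank sum test is appropriate here because readability scores and text lengths are small samples with no guarantee of normality; the Fisher exact test likewise suits the small cell counts of the 9-material society group.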
Results
Readability Analysis
A summary of readability scores is provided in Table 1, and score distributions for each included readability index are displayed in Fig. 1. A Raygor distribution of readability scores for all included materials is provided in Fig. 2. Mean readability grade level for all texts in aggregate ranged from 11.7 to 15.3 for the 6 included readability indices. Materials from USNWR hospital websites had a mean readability level ranging from 11.8 to 15.6, whereas materials from professional society websites ranged from 11.1 to 13.6. There were no significant differences in readability of USNWR center materials and professional society materials as measured by any score. Two of 44 (5%) USNWR hospital SRS materials were written at or below both the sixth- and eighth-grade recommendations. Of professional society materials, 2 of 9 texts (22%) were written at the eighth-grade level with 1 (11%) meeting the sixth-grade recommendation.
Table 1.
Summary of Readability Grade Levels for Online Stereotactic Radiosurgery Patient Education Materials From USNWR Top-Ranked Neurosurgery Hospitals and Professional Society Websites
| Readability Score | Total (n = 53), Mean (SD) | USNWR (n = 44a), Mean (SD) | Professional Society (n = 9b), Mean (SD) | P, USNWR vs Professional Society |
|---|---|---|---|---|
| FK | 12.9 (2.8) | 13.2 (2.7) | 11.5 (2.9) | .12 |
| FORCAST | 11.7 (0.9) | 11.8 (0.8) | 11.1 (0.8) | .068 |
| Fry | 15.3 (2.9) | 15.6 (2.5) | 13.6 (4.0) | .14 |
| GF | 14.1 (2.7) | 14.3 (2.7) | 13.2 (3.1) | .59 |
| Raygor | 13.6 (3.5) | 14.0 (3.2) | 11.5 (4.0) | .06 |
| SMOG | 14.2 (2.1) | 14.3 (2.1) | 13.2 (2.3) | .19 |
Abbreviations: FK, Flesch-Kincaid; FORCAST, FORd/CAylor/STicht; GF, Gunning Fog; SMOG, Simple Measure of Gobbledygook; USNWR, U.S. News & World Report.
aFifty USNWR center websites were queried for SRS educational materials. Forty-five websites had some form of educational material available. One website had insufficient text for analysis with Readability Studio and was excluded from this analysis.
bEleven professional society websites were queried for stereotactic radiosurgery educational materials. Nine websites had some form of educational material available.
Fig. 1.
Readability of Online Patient Education Materials Designed to Explain Stereotactic Radiosurgery as Measured by 6 Readability Indices. FORCAST indicates FORd/CAylor/STicht; SMOG, Simple Measure of Gobbledygook.
Fig. 2.
Distribution of Readability Scores for All Stereotactic Radiosurgery Online Patient Education Materials, as Measured by the Raygor Method
Educational Content Analysis
A comparison of content between educational materials from USNWR and professional society websites is provided in Table 2. Content was evaluated according to predetermined criteria based on surveys of patient perspectives and experiences with SRS, as discussed previously. When comparing types of materials, those from professional societies were on average of longer length than their USNWR-identified high-performing hospital counterparts (P = .002) and were more likely to discuss risks and benefits specific to SRS (89% vs 38%, P = .008), discuss alternative treatments (67% vs 29%, P = .05), and describe expected outcomes or posttreatment evaluation (78% vs 24%, P = .004). No other differences were noted per our predetermined criteria.
Table 2.
Comparison of Educational Contenta for Online SRS Patient Education Materials From USNWR Top Neurosurgery Hospital and Professional Society Websites
| Criterion | USNWR Top Hospitals (n = 45), n (%) | Professional Societies (n = 9), n (%) | Total (n = 54), n (%) | P |
|---|---|---|---|---|
| Length (words), mean | 569 | 1602 | 744 | .002 |
| Length (words), median | 367 | 2041 | 439 | |
| Length (words), range | 2012 (82–2094) | 2455 (135–2590) | 2508 (82–2590) | |
| Includes a separate educational document for SRS (aside from general radiotherapy document) | 39 (87) | 8 (89) | 47 (87) | .99 |
| Includes a discussion of risks and benefits specific to SRS | 17 (38) | 8 (89) | 25 (46) | .008 |
| Discusses alternative treatments other than SRS | 13 (29) | 6 (67) | 19 (35) | .05 |
| Discusses the use of SRS for malignant as well as benign conditions | 35 (78) | 8 (89) | 43 (80) | .67 |
| Makes a clear statement that no incisions will be made | 27 (60) | 6 (67) | 33 (61) | .99 |
| Uses graphics or videos to explain treatment or equipment | 17 (38) | 4 (44) | 21 (39) | .72 |
| Discusses immobilization including a headframe when necessary | 25 (56) | 8 (89) | 33 (61) | .075 |
| Includes a picture or graphic of the headframe/mask | 5 (20) | 3 (38) | 8 (15) | .37 |
| Discusses expected outcomes and plan for follow-up | 11 (24) | 7 (78) | 18 (33) | .004 |
| Uses brand-specific terminologyb | 30 (67) | 7 (78) | 37 (69) | .7 |
Abbreviations: SRS, stereotactic radiosurgery; USNWR, U.S. News & World Report.
aCriteria for important content were determined by authors based on surveys of patient experiences and perspectives on SRS,28–31 with positive findings (eg, meeting criteria) reflecting patient preferences.
bThe use of brand-specific terminology is the only criterion that does not reflect patient preferences.
The majority of identified materials from hospital and professional society websites used at least one brand-specific name when describing SRS (67% and 78%, respectively). On quantitative usage analysis, the aggregate median ratio of brand/generic terms was 1.47. Materials from hospital websites used more brand terms compared with their counterparts from professional societies, with median brand/generic term usage ratios of 2.275 and 0.333, respectively (Fig. 3, P = .019).
Fig. 3.
Quantification and Comparison of Use of Brand-Specific vs General Stereotactic Radiosurgery Terms Between Professional Society and Hospital Websites
Excluding brand term usage, only 4% (2/45) of hospital materials and none of the professional society materials met all the content criteria. However, 78% (7/9) of society materials and only 13% (6/45) of hospital materials met at least 7 criteria.
Discussion
Patients with cancer are thrust into an unfamiliar world of tests, imaging, and procedures. Given this uncertainty, they often desire extensive information about their disease, prognosis, and possible treatments to make informed decisions that align with their values and preferences.5,32 Providers play a central role in facilitating decision making in this process, ideally through open-ended dialogue in a nonjudgmental setting.33,34 In this manner, treatment details including risks, benefits, and alternatives can be fully explored. In the current era, however, patients are increasingly using the internet as a source of supplementary educational materials to help guide them in their own medical care.1,2 If optimal, such materials could be valuable in allowing patients to make informed decisions.35
Designing easily understandable education materials is challenging, especially for procedures/treatments that are more abstract or technically complex such as SRS. Surveys have shown that patients often experience high levels of anxiety prior to and during SRS, partly as a result of “fear[ing] the unknown … and the prospect of [undergoing] a complex high-tech procedure.”28 Immobilization or placement of a headframe is a commonly cited source of anxiety for patients,29 with some suggesting that graphics or actual examples during consultation would aid in understanding.28 Patients also express concerns about uncertainty in expected outcomes after treatment.30,31
The use of the term stereotactic radiosurgery itself can contribute to misconceptions regarding the procedure, as patients may anticipate that it is “surgery” in the traditional sense of the word, requiring incisions and general anesthesia. This may compromise informed early decisions and heighten anxiety for those undergoing treatment. Difficulty with terminology is further compounded by the commonplace use of brand-specific terms (eg, GammaKnife, Novalis, Trilogy), which creates another level of medical jargon for patients to navigate and may add bias in favor of one brand-specific treatment over another.
This study suggests that currently available online patient educational materials for SRS are of limited utility for patients. First, readability of available materials poses a challenge, particularly for those with low health literacy. Estimates place 20% of the United States population at the fifth-grade reading level or below,36 whereas the majority of identified materials were in the range of 12th-16th grades, well above national recommendations. As measured by the most permissive readability metric for each material, only 4 met NIH recommendations of achieving an eighth-grade reading level and, of those, only 3 met the AMA recommendation of a sixth-grade reading level.
Even with increased awareness of readability, designing easily readable education materials can be difficult, particularly for complex procedures such as SRS. The majority of readability metrics are determined by textual parameters such as average sentence or word length and use of uncommon terminology (Supplemental Table 1). Designers of education materials should therefore carefully consider these components when creating written educational resources. For example, intentional use of short words and sentences combined with an avoidance of unnecessary difficult terminology would translate into improved readability scores. Furthermore, designers can use free online readability calculators to quickly estimate the readability level of a text and guide the language selection for a particular material.
Graphics or visual aids in educational documents can aid in communication with patients of low health literacy and mitigate readability limitations of materials.37 This would be especially useful with SRS given the technical nature of the procedure. In the identified sample of educational materials, however, only 39% included some form of visual aid, which represents a significant opportunity for modification and improvement.
Other large-scale investigations of health-related educational materials in various medical fields have also found materials at inappropriately high reading levels.11–15 This study delves deeper to analyze educational content, which optimally should be accurate and comprehensive for patient use. A significant number of online educational materials were missing important points of information from the patient perspective. Fifty-six percent of all identified materials failed to describe risks and benefits of SRS, with most describing only the latter—benefits and advantages. Although SRS is typically considered a safe procedure,38–40 complete omission of a discussion of potential harms or a reference to the possibility may create false perceptions about safety. At a minimum, patients should be encouraged to discuss the risks with their providers. Discussion of alternative treatments or the possibility of such alternatives was also infrequent (35%). For patients with benign lesions, it may be especially important to highlight alternative treatment options, even if such a discussion were superficial (eg, a bulleted list) or references to other resources.
Immobilization or headframe placement (particularly for a procedure whose name ends in “surgery”) is a commonly cited source of apprehension for patients considering SRS.28–31 Educational documents should optimally address this in plain language to help alleviate unnecessary anxiety. The majority of queried resources did make clear statements that incisions are not required for SRS (61%) and that a headframe/mask is needed for immobilization (61%). Only rarely was this accompanied with a graphic or visual (15%), which could lessen the fear of the unknown.28
Information was also lacking on general expected outcomes, follow-up, and posttreatment assessment (33%), another important source of uncertainty for patients undergoing SRS.30,31 Although this can vary depending on histology and needs provider input, it is helpful to note, for instance, that SRS treatment outcome is not instant and generally requires weeks to months. This is an important distinction between surgery and radiosurgery. A minority of documents addressed this issue well, including tables of conditions specifying success rates and expected timelines for patient follow-up. A similar discussion should be encouraged to answer a natural question of “what happens after?”
Medical jargon and complex terminology can create a significant barrier for effective communication between patients and medical providers.41 The use of brand names can amplify this problem. In this study, the majority (69%) of queried materials referenced individual brands at least once. Further, brand-specific terms were more commonly used than generic terminology, with a median usage ratio of 1.47 (number of brand-specific terms/number of generic terms). Hospital materials used this terminology more often than professional societies, which may be reflective of individual institutional interests, as treatment centers may be motivated to market specific brands of technology. This is a significant and unnecessary obstacle for effective patient communication in SRS, for which brand names are often emphasized. By adopting generalized, universal language, providers can more clearly communicate to patients in a way that is consistent with each other and with widely available materials. Providers should recognize that, with a properly maintained, physics-verified radiosurgery unit, target definition, prescribed dose, and fractionation are far more important than the specific technology used for treatment delivery. A nationwide effort to address this deficiency can help diminish important barriers and sources of confusion for patients considering SRS.
In a difference analysis between institution types, we found that materials from professional societies were more likely to address SRS risks and benefits, alternative treatments, and expected outcomes or plans for patient follow-up than USNWR hospital materials. The majority of society materials met at least 7 of 9 content criteria (78%), whereas only a minority of hospital materials met this threshold (13%). Professional society materials were also significantly longer on average, allowing for a more thorough discussion of details and alternatives. This again suggests that these materials may be more effective in providing essential treatment information, especially relating to patient decision making. Although increasing length does not guarantee improved efficacy of educational documents, a sufficient minimum length is necessary for adequate discussion of necessary content and thus should be considered in the design of materials. These findings should encourage reevaluation and modification of hospital/institution-provided materials and promote emphasis on information from exemplary professional societies.
Limitations
This study is the first to evaluate SRS patient education materials on the basis of readability and educational content. Our findings suggest a need for improvement of educational materials pertaining to SRS. However, several limitations must be highlighted. First, USNWR rankings are not necessarily reflective of all high-performing hospitals in neurosurgery and radiation oncology. It is therefore possible that some centers of excellence were not included in this analysis. Further, this method of material identification might also limit the generalizability of our findings, as these centers of excellence might not be representative of all centers with available materials. Nevertheless, given the widespread recognition of these rankings (especially among patients), this approach allowed for a reproducible and objective method with which to assay hospital educational materials. It is also possible that some documents were missed on individual website review and thus were not included. Additionally, some websites include educational videos but these were not evaluated in this study.
Second, although the educational content criteria were developed per patient survey responses and qualitative studies, it is possible that these perspectives may not be generalizable to populations across all geographic and socioeconomic distributions. Furthermore, it can be challenging to determine a practical goal for educational content coverage. We suggest that designers of materials aim to meet as many criteria as possible, but perhaps covering at least 75% is a realistic target. In the brand-term usage analysis, a maximum value of 10 was used for materials that otherwise would have had undefined ratios (ie, division by zero). This number was selected because it represents strong preferential use of brand terms and allows for objective comparison across groups. Use of a different maximum value would have minimal effect on the observed comparisons.
Third, accuracy of content is an essential component of effective patient education material; however, evaluation of accuracy was not attempted in this investigation. Furthermore, because our criteria were binary in nature they may be unable to fully capture depth of content coverage. For example, one material might spend an entire paragraph discussing side effects whereas another describes them only superficially; however, our criteria would be unable to discriminate between them. Lastly, some of the data gathered could have been interpreted differently. For instance, when deciding whether a document makes a clear statement about incisions not being needed in SRS, one reviewer’s interpretation of “clear” may differ from another, limiting some reproducibility.
Conclusion
Online SRS education materials from hospitals and professional societies are written at readability levels above national recommendations. Patients are increasingly using the internet as a resource, and this may limit their ability to fully understand important treatment details regarding SRS. Professional society materials are more comprehensive in content compared with those from hospital websites, but global reevaluation and modification of all such materials are warranted. Individual institutions and professional societies are encouraged to evaluate readability and content when revising and updating patient education materials for SRS.
Funding
No authors received any financial support for this work.
Supplementary Material
Acknowledgment
The authors declare that the enclosed manuscript has not been published elsewhere, it has not been accepted for publication elsewhere, and it is not under editorial review for publication elsewhere. MSL, RVL, and SS are supported by NIH P50CA221747 SPORE for Translational Approaches to Brain Cancer.
Conflict of interest statement. Dr Golden reports having a financial interest in RadOnc Questions LLC and HemOncReview LLC. No other authors have conflicts of interests to declare.
References
- 1. Kummervold PE, Chronaki CE, Lausen B, et al. eHealth trends in Europe 2005-2007: a population-based survey. J Med Internet Res. 2008;10(4):e42.
- 2. Ayantunde AA, Welch NT, Parsons SL. A survey of patient satisfaction and use of the Internet for health information. Int J Clin Pract. 2007;61(3):458–462.
- 3. Kurup V, Considine A, Hersey D, et al. Role of the internet as an information resource for surgical patients: a survey of 877 patients. Br J Anaesth. 2013;110(1):54–58.
- 4. Ilic D. The role of the internet on patient knowledge management, education, and decision-making. Telemed J E Health. 2010;16(6):664–669.
- 5. Coulter A, Entwistle V, Gilbert D. Sharing decisions with patients: is the information good enough? BMJ. 1999;318(7179):318–322.
- 6. Parker R. Health literacy: a challenge for American patients and their health care providers. Health Promot Int. 2000;15(4):277–283.
- 7. Dewalt DA, Berkman ND, Sheridan S, Lohr KN, Pignone MP. Literacy and health outcomes: a systematic review of the literature. J Gen Intern Med. 2004;19(12):1228–1239.
- 8. National Institutes of Health. How to Write Easy-to-Read Health Materials. 2017. https://medlineplus.gov/etr.html. Accessed October 1, 2018.
- 9. Weiss BD. Health Literacy: A Manual for Clinicians. Chicago, IL: American Medical Association Foundation and American Medical Association; 2003.
- 10. Davis TC, Wolf MS. Health literacy: implications for family medicine. Fam Med. 2004;36(8):595–598.
- 11. Byun J, Golden DW. Readability of patient education materials from professional societies in radiation oncology: are we meeting the national standard? Int J Radiat Oncol Biol Phys. 2015;91(5):1108–1109.
- 12. Prabhu AV, Hansberry DR, Agarwal N, Clump DA, Heron DE. Radiation oncology and online patient education materials: deviating from NIH and AMA recommendations. Int J Radiat Oncol Biol Phys. 2016;96(3):521–528.
- 13. Agarwal N, Hansberry DR, Sabourin V, Tomei KL, Prestigiacomo CJ. A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern Med. 2013;173(13):1257–1259.
- 14. Huang G, Fang CH, Agarwal N, Bhagat N, Eloy JA, Langer PD. Assessment of online patient education materials from major ophthalmologic associations. JAMA Ophthalmol. 2015;133(4):449–454.
- 15. Gupta R, Adeeb N, Griessenauer CJ, et al. Evaluating the complexity of online patient education materials about brain aneurysms published by major academic institutions. J Neurosurg. 2017;127(2):278–283.
- 16. Storino A, Castillo-Angeles M, Watkins AA, et al. Assessing the accuracy and readability of online health information for patients with pancreatic cancer. JAMA Surg. 2016;151(9):831–837.
- 17. Miles RC, Baird GL, Choi P, Falomo E, Dibble EH, Garg M. Readability of online patient educational materials related to breast lesions requiring surgery. Radiology. 2019;291(1):112–118.
- 18. von Elm E, Altman DG, Egger M, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet. 2007;370(9596):1453–1457.
- 19. van den Borne HW. The patient from receiver of information to informed decision-maker. Patient Educ Couns. 1998;34(2):89–102.
- 20. U.S. News & World Report. Best Hospitals for Neurology & Neurosurgery. https://health.usnews.com/best-hospitals/rankings/neurology-and-neurosurgery. Accessed August 5, 2018.
- 21. Kincaid J, Fishburne R, Rogers R, Chissom B. Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel. Institute for Simulation and Training; January 1975. http://stars.library.ucf.edu/istlibrary/56. Accessed September 1, 2018.
- 22. Caylor JS, Sticht TG, Fox LC, Ford JP. Methodologies for Determining Reading Requirements of Military Occupational Specialties. March 1973. https://eric.ed.gov/?id=ED074343. Accessed September 1, 2018.
- 23. Fry E. A readability formula that saves time. J Read. 1968;11(7):513–578. [Google Scholar]
- 24. Gunning R. The Technique of Clear Writing. Toronto, ON : McGraw-Hill; 1952. [Google Scholar]
- 25. Raygor AL. The Raygor Readability Estimate: a quick and easy way to determine difficulty. In: Pearson PD, ed. Reading: Theory, Research, and Practice, Twenty-sixth Yearbook of the National Reading Conference. Clemson, SC: National Reading Conference; 1977:259–263. [Google Scholar]
- 26. McLaughlin GH. SMOG grading: a new readability formula. J Read. 1969;12(8):639–646. [Google Scholar]
- 27. Friedman DB, Hoffman-Goetz L. A systematic review of readability and comprehension instruments used for print and web-based cancer information. Health Educ Behav. 2006;33(3):352–373. [DOI] [PubMed] [Google Scholar]
- 28. Clifford W, Sharpe H, Khu KJ, Cusimano M, Knifed E, Bernstein M.. Gamma Knife patients’ experience: lessons learned from a qualitative study. J Neurooncol. 2009;92(3):387–392. [DOI] [PubMed] [Google Scholar]
- 29. Avbovbo UE, Appel SJ. Strategies to alleviate anxiety before the placement of a stereotactic radiosurgery frame. J Neurosci Nurs. 2016;48(4):224–228. [DOI] [PubMed] [Google Scholar]
- 30. Ward-Smith P. Stereotactic radiosurgery for malignant brain tumors: the patient’s perspective. J Neurosci Nurs. 1997;29(2):117–122. [DOI] [PubMed] [Google Scholar]
- 31. Menkes DB, Davison MP, Costello SA, Jaye C.. Stereotactic radiosurgery: the patient’s experience. Soc Sci Med. 2005;60(11):2561–2573. [DOI] [PubMed] [Google Scholar]
- 32. Jefford M, Tattersall MH. Informing and involving cancer patients in their own care. Lancet Oncol. 2002;3(10):629–637. [DOI] [PubMed] [Google Scholar]
- 33. Cuisinier MC, Van Eijk JT, Jonkers R, Dokter HJ.. Psychosocial care and education of the cancer patient: strengthening the physician’s role. Patient Educ Couns. 1986;8(1):5–16. [DOI] [PubMed] [Google Scholar]
- 34. McCann DP, Blossom HJ. The physician as a patient educator. From theory to practice. West J Med. 1990;153(1):44–49. [PMC free article] [PubMed] [Google Scholar]
- 35. Jewitt N, Hope AJ, Milne R, et al. Development and evaluation of patient education materials for elderly lung cancer patients. J Cancer Educ. 2016;31(1):70–74. [DOI] [PubMed] [Google Scholar]
- 36. Doak CC, Doak LG, Friedell GH, Meade CD.. Improving comprehension for cancer patients with low literacy skills: strategies for clinicians. CA Cancer J Clin. 1998;48(3):151–162. [DOI] [PubMed] [Google Scholar]
- 37. Choi J. Literature review: using pictographs in discharge instructions for older adults with low-literacy skills. J Clin Nurs. 2011;20(21-22):2984–2996. [DOI] [PubMed] [Google Scholar]
- 38. Hasegawa T, Kida Y, Kato T, Iizuka H, Kuramitsu S, Yamamoto T.. Long-term safety and efficacy of stereotactic radiosurgery for vestibular schwannomas: evaluation of 440 patients more than 10 years after treatment with Gamma Knife surgery. J Neurosurg. 2013;118(3):557–565. [DOI] [PubMed] [Google Scholar]
- 39. Trifiletti DM, Lee CC, Winardi W, et al. Brainstem metastases treated with stereotactic radiosurgery: safety, efficacy, and dose response. J Neurooncol. 2015;125(2):385–392. [DOI] [PubMed] [Google Scholar]
- 40. Choi CY, Soltys SG, Gibbs IC, et al. Stereotactic radiosurgery of cranial nonvestibular schwannomas: results of single- and multisession radiosurgery. Neurosurgery. 2011;68(5):1200–1208; discussion 1208. [DOI] [PubMed] [Google Scholar]
- 41. Donovan-Kicken E, Mackert M, Guinn TD, Tollison AC, Breckinridge B.. Sources of patient uncertainty when reviewing medical disclosure and consent documentation. Patient Educ Couns. 2013;90(2):254–260. [DOI] [PubMed] [Google Scholar]