ABSTRACT
Most cases of optic neuritis (ON) occur in women and in patients between the ages of 15 and 45 years, a key demographic of individuals who seek health information on the internet. As clinical providers strive to ensure patients have accessible information to understand their condition, assessing the standard of online resources is essential. This study aimed to assess the quality, content, accountability, and readability of freely available online information for optic neuritis. This cross-sectional study analyzed 11 freely available medical websites with information on optic neuritis, using PubMed as a gold standard for comparison. Twelve questions were composed to cover the information most relevant to patients, and each website was independently examined by four neuro-ophthalmologists. Readability was analyzed using an online readability tool, and the Journal of the American Medical Association (JAMA) benchmarks, four criteria designed to assess the quality of health information, were used to evaluate the accountability of each website. On average, websites scored 27.98 (SD ± 9.93, 95% CI 24.96–31.00) of 48 potential points (58.3%) on the twelve questions. There were significant differences in the comprehensiveness and accuracy of content across websites (p < .001). The mean reading grade level of websites was 11.90 (SD ± 2.52, 95% CI 8.83–15.25). No website achieved all four JAMA benchmarks. Interobserver reliability was robust between three of four neuro-ophthalmologist (NO) reviewers (ρ = 0.77 between NO3 and NO2, ρ = 0.91 between NO3 and NO1, ρ = 0.74 between NO2 and NO1; all p < .05). The quality of freely available online information detailing optic neuritis varies by source, with significant room for improvement. The material presented is difficult to interpret and exceeds the recommended reading level for health information.
Most websites reviewed did not provide comprehensive information regarding non-therapeutic aspects of the disease. Ophthalmology organizations should be encouraged to create content that is more accessible to the general public.
KEYWORDS: Optic neuritis, patient education, readability, patient information, online resources
Introduction
Online resources for medical conditions are a natural consequence of an information-accessible era and have the potential to be an effective means of disseminating knowledge and educating patients. This is especially true for conditions, such as optic neuritis, that can have life-changing implications. Optic neuritis (ON) is an inflammatory optic neuropathy that may occur in isolation or in association with other conditions, such as multiple sclerosis.1–3 The disease incidence is variable but greatest among populations at higher latitudes.4–6 In the United States, reports have suggested an incidence and prevalence of 5–6.4 per 100,000 and 115 per 100,000, respectively.7,8 Although most patients eventually recover vision, long-term visual deficits occur.9 Moreover, the loss of visual function is associated with a substantial decrement in quality of life.10
Considering the potential impact on quality of life from optic neuritis, patients may be expected to search for more information about the condition online. Indeed, prior research shows that online resources are increasingly becoming the primary means for people to learn about their condition.11,12 A substantial proportion of patients regard this information as equivalent or superior to that provided by their physicians despite the absence of guidelines regulating available content.11 While the breadth of online medical resources may empower patients,13 it may also be overwhelming, and poorly presented information can promote anxiety.14 Therefore, it is essential that health-related resources are accurate, readable, and ultimately understandable for patients and the public.
Guidelines from the American Medical Association and the United States Department of Health and Human Services recommend that health educational materials not exceed the reading level of a sixth-grade student.15,16 However, numerous studies have demonstrated that ophthalmology materials found online exceed the recommended reading level and are often of poor quality.16–20 As clinical providers strive to ensure patients have accessible and reliable information about their condition, they need a clear understanding of which online sources are trustworthy. Because no data currently exist regarding the quality and readability of patient education materials for ON, this investigation aimed to examine the accuracy, comprehensiveness, and readability of content from major medical websites.
Methods
Based on the institutional standards at UT Southwestern, our study met the criteria for IRB exemption. We did not receive any financial support for this study’s research, authorship, and publication.
Selection and evaluation of websites
The first ten major medical websites that appeared for the term ‘optic neuritis’ were selected for analysis using the Google search engine. These sites included the American Academy of Ophthalmology (https://www.aao.org/eye-health), WebMD (https://www.webmd.com), EyeWiki (https://eyewiki.org/Main_Page), Mayo Clinic (https://www.mayoclinic.org), All About Vision (https://www.allaboutvision.com), American Academy of Pediatric Ophthalmology and Strabismus (https://www.aapos.org), Medical News Today (https://www.medicalnewstoday.com), Johns Hopkins Medicine (https://www.hopkinsmedicine.org), Wikipedia (https://www.wikipedia.org) and MedicineNet (https://www.medicinenet.com). An additional source, the North American Neuro-Ophthalmology Society (https://www.nanosweb.org/i4a/pages/index.cfm?pageid=4191), was employed as it represents a resource designed explicitly by neuro-ophthalmologists for patient consumption. Specific links to each ON webpage are available in Supplemental Table S1. Overall, 11 websites were included for subsequent analysis. PubMed (https://pubmed.ncbi.nlm.nih.gov) was considered the gold standard by which other websites were evaluated for the accuracy and comprehensiveness of content. All online content was collated into a masked document for further review.
An assessment was created to evaluate the 11 resources against 12 key questions commonly encountered by ophthalmologists regarding ON. These were designed to examine the comprehensiveness and quality of each website’s content (Table 1). The evaluation was conducted by 4 fellowship-trained neuro-ophthalmologists (NO1; NO2; NO3; NO4). Resources were independently graded for each question on a scale of 0–4. A score of 0 points indicated the absence of relevant information; 1 point indicated the information was inaccurate, omitted significant details, or otherwise lacked clarity and organization; 2 points indicated the information was partially explicatory with some deficits in organization and salient details; 3 points indicated most necessary information was provided in a cogent manner; 4 points indicated the information was accurate, comprehensive, and organized.
Table 1.
Content analysis of websites.
| Website | Total Points (maximum score 48) | Percentage | Mean | Standard deviation | 95% Confidence Interval |
|---|---|---|---|---|---|
| WebMD | 19.75 | 41% | 1.646 | 0.4702 | 1.35–1.95 |
| AAO | 14.75 | 31% | 1.229 | 1.079 | 0.54–1.92 |
| Medical News Today | 27 | 56% | 2.25 | 0.977 | 1.63–2.87 |
| AAPOS | 24.5 | 51% | 2.042 | 1.278 | 1.23–2.85 |
| Johns Hopkins University | 24.25 | 51% | 2.021 | 1.1 | 1.32–2.72 |
| Mayo Clinic | 36 | 75% | 3 | 1.177 | 2.25–3.75 |
| Wikipedia | 24.5 | 51% | 2.042 | 1.022 | 1.39–2.69 |
| All About Vision | 30.75 | 64% | 2.563 | 0.8266 | 2.04–3.09 |
| MedicineNet | 39.75 | 83% | 3.313 | 0.6495 | 2.90–3.73 |
| EyeWiki | 41.25 | 86% | 3.438 | 0.899 | 2.87–4.01 |
| NANOS | 25.25 | 53% | 2.104 | 1.456 | 1.18–3.03 |
Website accountability
In 1997, the Journal of the American Medical Association (JAMA) developed four benchmarks to ascertain the accountability of websites. These characteristics are attribution (or sources), authorship, currency (or most recent date of update), and disclosure.21
Website readability
The readability of each website was assessed through the Readable tool using multiple validated measures: the Flesch Reading Ease Score, Flesch-Kincaid Grade Level, Coleman-Liau Index, Gunning Fog Index, and Simple Measure of Gobbledygook (SMOG) Index. The Flesch Reading Ease Score is derived from word and sentence length to provide a score ranging between 0 and 100, with higher scores denoting greater readability; for reference, a score of 70 to 80 indicates text equivalent to a 7th grader's reading level. In contrast, the Flesch-Kincaid Grade Level, Coleman-Liau Index, Gunning Fog Index, and SMOG Index provide scores directly representing US grade reading levels.
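The Readable tool itself is a commercial product, but the formulas behind these indices are published. As an illustration only (not the study's actual pipeline), three of the count-based measures can be sketched as follows, assuming word, sentence, syllable, and polysyllable counts have already been extracted from the text; the function names are ours:

```python
import math

def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease: roughly 0-100 scale; higher scores mean easier text."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid Grade Level: approximate US school reading grade."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def smog_index(polysyllables: int, sentences: int) -> float:
    """SMOG Index: grade level estimated from words of three or more syllables."""
    return 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291
```

For example, a 100-word passage with 10 sentences and 130 syllables scores about 86.7 on Flesch Reading Ease (roughly a 6th-grade level), well below the complexity of most websites surveyed here. In practice, the hardest part is syllable counting, which the commercial tools approximate heuristically.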
Statistical analysis
Kruskal-Wallis tests were performed to compare website scores, with individual pairwise comparisons assessed via a post hoc Dunn's test. Spearman correlation coefficients were calculated to evaluate interobserver reproducibility and correlations between websites' content and readability scores. All analyses were performed using GraphPad Prism 9.0 (San Diego, CA), with the threshold for significance set at p < .05.
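The analyses were run in GraphPad Prism, but the two core statistics are simple to state. As a minimal pure-Python sketch (illustrative only, not the study's code; no tie correction is applied to H, and Dunn's post hoc test is omitted), the Kruskal-Wallis H statistic and the Spearman correlation, computed as the Pearson correlation of ranks, might look like this:

```python
def average_ranks(data):
    """Assign 1-based ranks to data, averaging the ranks of tied values."""
    order = sorted(range(len(data)), key=lambda i: data[i])
    ranks = [0.0] * len(data)
    i = 0
    while i < len(data):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(data) and data[order[j + 1]] == data[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_h(groups):
    """Kruskal-Wallis H statistic over a list of groups (no tie correction)."""
    pooled = [x for g in groups for x in g]
    ranks = average_ranks(pooled)
    n = len(pooled)
    h, idx = 0.0, 0
    for g in groups:
        r_sum = sum(ranks[idx:idx + len(g)])  # rank sum for this group
        h += r_sum ** 2 / len(g)
        idx += len(g)
    return 12 / (n * (n + 1)) * h - 3 * (n + 1)

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation applied to the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

For two perfectly separated groups such as [1, 2, 3] and [4, 5, 6], H evaluates to about 3.86; a monotonically decreasing pairing yields ρ = −1.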
Results
Content quality analysis
Overall, 11 websites were included in our investigation. Of these, only the American Academy of Ophthalmology (AAO) website (https://www.aao.org/eye-health) provided comprehensive accessibility options, with the ability to reverse contrast, increase the font size, and enable text reading. The remaining resources did not offer those options. Nine websites (81.8%) offered an accompanying graphic, although the quality of these visual aids was not assessed.
Interobserver reliability was robust between three of four reviewers (ρ = 0.77 between NO3 and NO2, ρ = 0.91 between NO3 and NO1, ρ = 0.74 between NO2 and NO1; all p < .05 and ρ = 0.33 between NO3 and NO4, ρ = 0.53 between NO2 and NO4, ρ = 0.43 between NO4 and NO1; all p > .05).
On average, websites scored 27.98 (SD ±9.93, 95% CI 24.96–31.00) of 48 potential points (58.3%). When including PubMed, there were significant differences between websites with respect to the comprehensiveness and accuracy of content (H = 33.44; p < .001). Excluding PubMed, EyeWiki had the highest average questionnaire score, with 41.25 points (85.9%), whereas AAO had the lowest, with 14.75 points (30.7%), as depicted in Table 1. Significant differences were observed between AAO and both PubMed (H = −40.26; p = .003) and EyeWiki (H = −33.88; p = .04). Among websites, there were significant differences in scores (p < .05) for 8 of 12 questions (66.7%; Table 2).
Table 2.
Questionnaire for content analysis of websites.
| Question | WebMD | AAO | Medical News Today | AAPOS | Johns Hopkins University | Mayo Clinic | Wikipedia | All About Vision | MedicineNet | EyeWiki | NANOS |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1. What is optic neuritis? | 1.5 | 2.25 | 3.25 | 3 | 3 | 3.25 | 2.5 | 3.25 | 3.75 | 3.5 | 3.25 |
| 2. What causes optic neuritis? | 1.5 | 2.5 | 2.75 | 3.25 | 3 | 3.25 | 2.5 | 3.25 | 3.5 | 3.75 | 3.25 |
| 3. How common is optic neuritis? | 0.75 | 0 | 0 | 0 | 0.5 | 0 | 3.75 | 1.5 | 2 | 2.5 | 0 |
| 4. What are the demographics of optic neuritis? | 1.5 | 0 | 1.25 | 0 | 1.25 | 2 | 3.25 | 3 | 3 | 3 | 0 |
| 5. What are risk factors for optic neuritis? | 2 | 0.5 | 1.25 | 1.5 | 3 | 4 | 1.25 | 1.5 | 2.25 | 3.75 | 1.25 |
| 6. How is optic neuritis diagnosed? | 2.25 | 2 | 2.5 | 3.75 | 3 | 3.75 | 0.75 | 2.25 | 3.75 | 4 | 2 |
| 7. What treatment options are available? | 2 | 2.25 | 2.25 | 3.25 | 3 | 3.75 | 2.5 | 3 | 4 | 3.75 | 3.25 |
| 8. What can help my optic neuritis besides medications? | 1.5 | 0.5 | 2.25 | 1 | 0.75 | 1.75 | 0.5 | 1 | 3.5 | 1 | 0 |
| 9. What are the symptoms of optic neuritis? | 2.5 | 3 | 3.5 | 3 | 3.25 | 3.5 | 2.75 | 2.75 | 4 | 4 | 3.25 |
| 10. Can optic neuritis increase my risk for other diseases? | 1.5 | 0.5 | 2.5 | 1.75 | 1.25 | 3.75 | 1.25 | 2.5 | 3 | 4 | 3.25 |
| 11. What is the long-term prognosis? | 1.5 | 0.5 | 2.75 | 2.5 | 1.5 | 3.75 | 1.25 | 3.5 | 3.75 | 4 | 3.75 |
| 12. What other diseases may be associated with or be confused with optic neuritis? | 1.25 | 0.75 | 2.75 | 1.5 | 0.75 | 3.25 | 2.25 | 3.25 | 3.25 | 4 | 2 |
| Total Points (maximum score 48) | 19.75 | 14.75 | 27.00 | 24.50 | 24.25 | 36.00 | 24.50 | 30.75 | 39.75 | 41.25 | 25.25 |
| Percentage | 41% | 31% | 56% | 51% | 51% | 75% | 51% | 64% | 83% | 86% | 53% |
| Mean | 1.646 | 1.229 | 2.250 | 2.042 | 2.021 | 3.000 | 2.042 | 2.563 | 3.313 | 3.438 | 2.104 |
| Standard deviation | 0.4702 | 1.079 | 0.9770 | 1.278 | 1.100 | 1.177 | 1.022 | 0.8266 | 0.6495 | 0.8990 | 1.456 |
| 95% Confidence Interval | 1.35–1.95 | 0.54–1.92 | 1.63–2.87 | 1.23–2.85 | 1.32–2.72 | 2.25–3.75 | 1.39–2.69 | 2.04–3.09 | 2.90–3.73 | 2.87–4.01 | 1.18–3.03 |
Accountability analysis
Of all resources except PubMed, none achieved all four JAMA criteria, and two (18.2%) achieved three (Table 3). The most commonly fulfilled benchmark was currency (10 websites [90.9%]). Website accountability was not related to the quality of published content (ρ = 0.23; p = .48).
Table 3.
Accountability analysis of websites.
| JAMA Criteria | n (%) |
|---|---|
| 4 Benchmarks | 0 (0%) |
| 3 Benchmarks | 2 (18.2%) |
| 2 Benchmarks | 3 (27.3%) |
| 1 Benchmark | 4 (36.4%) |
| 0 Benchmarks | 2 (18.2%) |
| Attribution | 1 (9.1%) |
| Authorship | 4 (36.4%) |
| Currency | 10 (90.9%) |
| Disclosure | 2 (18.2%) |
Readability analysis
Across all websites, the average Flesch Reading Ease score was 46.07 (SD ± 15.36, 95% CI 28.00–64.20), and the average reading grade level was 11.90 (SD ± 2.52, 95% CI 8.83–15.25). These metrics strongly correlated (ρ = −0.99; p < .001).
A significant difference in average reading grade level was observed among resources (H = 37.53; p < .001), as illustrated in Table 4. Wikipedia had the highest average reading grade level (15.55) and the lowest Flesch Reading Ease score (23.60), indicating poor readability. Comparatively, WebMD had the lowest average reading grade level (7.75) and the highest Flesch Reading Ease score (70.5), indicating fair readability. The correlation between website quality and average reading grade level was significant (ρ = 0.61; p = .05).
Table 4.
Readability analysis of websites.
| WebMD | AAO | Medical News Today | AAPOS | Johns Hopkins University | Mayo Clinic | Wikipedia | All About Vision | MedicineNet | EyeWiki | NANOS | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Flesch Reading Ease | 70.50 | 64.20 | 55.80 | 39.30 | 6.00 | 45.80 | 23.60 | 42.40 | 28.00 | 31.20 | 46.00 |
| Average Reading Grade Level | 7.75 | 8.83 | 10.48 | 13.38 | 9.88 | 11.48 | 15.55 | 12.48 | 15.25 | 13.80 | 12.05 |
| Average Reading Grade Level, SD | 1.24 | 1.39 | 1.14 | 0.78 | 1.40 | 1.12 | 0.91 | 0.85 | 1.42 | 0.91 | 1.16 |
| Average Reading Grade Level, 95% CI | 5.78–9.72 | 6.62–11.03 | 8.65–12.30 | 12.13–14.62 | 7.65–12.10 | 9.69–13.26 | 14.10–17.00 | 11.12–13.83 | 13.00–17.50 | 12.36–15.24 | 10.21–13.89 |
Discussion
As patients increasingly rely on the internet for health-related information, access to scientifically validated, comprehensible online resources is essential. This study evaluated the content quality, accountability, accessibility, and readability of optic neuritis information online. Overall, our findings suggest a lack of uniformity in the suitability of online resources for patients with optic neuritis, with significant room for improvement. Although the websites performed well in certain content areas (describing ON, causes and symptoms, and diagnostic tests), information was lacking in other areas (scope of treatment options, demographics, and epidemiologic information). Moreover, the surveyed websites were mostly accurate and well-organized but demonstrated poor accountability and accessibility. Lastly, readability analysis revealed a mean 11th-grade reading level, which is substantially higher than the 6th-grade reading level recommended by the American Medical Association and the US Department of Health and Human Services for health information targeted for patients.15,16
One of the main considerations with the shift toward online patient education is the accuracy and thoroughness of the information provided. Across our surveyed websites, content was generally accurate, but there were significant differences with respect to its comprehensiveness and quality. In particular, AAO was found to have significantly poorer content than PubMed, whereas EyeWiki scored the highest of all websites evaluated, receiving 41.25 points out of 48 (86%). Interestingly, both of these resources are published by the American Academy of Ophthalmology, which serves as a leader in curating patient-centered educational materials for the ophthalmology community. The observed discrepancy in content quality between AAO and EyeWiki highlights the lack of standardization in online informational material, even when distributed by the same organization. This finding may be explained, in part, by differences in the intended audience. EyeWiki, although described as a site where 'ophthalmologists, other physicians, patients, and the public can view an eye encyclopedia', is primarily designed for ophthalmic provider education. Still, it was included in the analysis because EyeWiki is freely available to the public. Notably, the reading level for EyeWiki was high (13.8), which highlights that thoroughness and readability are two distinct dimensions of online content and may come with trade-offs; choosing resources solely on comprehensiveness may not be ideal for patients with lower literacy levels.

It was also surprising that the website of another major ophthalmological organization in our analysis, the North American Neuro-Ophthalmology Society, was found to have less comprehensive information than that of other providers. The NANOS brochure is written by expert neuro-ophthalmologists and intended for patients, yet when assessed against our key questions, it omitted several content areas, and its overall mean questionnaire score ranked in the lower half of websites in our analysis. Nonetheless, this finding does not suggest EyeWiki is a more appropriate resource for patients than the NANOS brochure. As in similar analyses, our content evaluation was predicated on predefined questions crafted to address topics the authors judged important to patients. As our questionnaire was not validated against other measures, conclusions regarding the suitability of websites for patient consumption should be tempered. Additionally, the websites were accessed and analyzed in 2022, and their content may have changed since then.
Etiology, symptoms, and various diagnostic and treatment options were the most commonly discussed topics for optic neuritis. However, there was inconsistent coverage of demographic and epidemiologic information about ON. Given that ON is the most common cause of vision loss in the young adult population,22 inclusion of incidence and demographic information may serve to educate young individuals who are likely seeking information regarding their first vision-threatening disease. Anxieties surrounding the uncertainty of future vision loss may be mitigated by discussing long-term prognosis.23 However, whether this clinical element was discussed varied significantly across websites. Although prognosis is typically excellent in clinically isolated and multiple sclerosis-associated ON, an improved understanding of outlook may facilitate more productive discussions with physicians or health care providers. In addition, most websites mentioned the association of ON with multiple sclerosis but lacked discussion of alternative diagnoses that may be confused with ON. Mention of the differential diagnosis, even briefly, may inform patients of other possibilities and enhance effective shared decision-making and understanding of the physician’s clinical reasoning. Lastly, most resources emphasized the available medical treatments, such as steroids, but discussions of possible alternative therapies, such as intravenous immunoglobulin (IVIg) or plasma exchange (PLEX), were limited.
Accessibility
In the analysis of the websites’ overall accessibility, significant weaknesses were found. Of all 11 websites, only AAO provided comprehensive accessibility options, with the ability to reverse contrast, increase the font size, and enable text reading. Given that patients with optic neuritis typically have worse visual acuity and contrast sensitivity compared to the average population,24 online resources could be encouraged to provide the option to adjust font size and contrast. It is reassuring, however, to note that most websites (81.8%) offered an accompanying graphic, which is a helpful adjunct to written text, especially for people with low health literacy.25 Visual aids may help simplify the technically complex concepts inherent in neurologic and ophthalmic disorders.
Accountability
Accountability of online ON content, as measured using the validated JAMA benchmarks, was overwhelmingly poor across the websites. We found no correlation between these benchmarks and website quality, indicating that these tools cannot be used to identify which websites have reliable information. In our study, no single website attained all four JAMA benchmarks, and the majority (9 of 11) met two or fewer. 'Currency' was the most commonly met (90.9%), while 'attribution' was met by only one website (9.1%). While it is positive that ON websites frequently report and update the date that content was posted, references to the relevant source material should also be provided. Online patient-facing websites are largely unregulated, and some may be biased or purposely misleading for financial gain26; references to the scientific literature can enable patients to seek out validated information independently. Indeed, prior studies show that most ophthalmology patients would like their physicians to provide them with links to webpages with reliable information.27 Overall, the poor accountability of ON websites is not surprising. Studies of online content about diabetic retinopathy, age-related macular degeneration, cataracts, and epiretinal membranes have demonstrated that most resources similarly met two or fewer benchmarks.17,19,20,28 Thus, poor accountability may be a generalized issue across online ophthalmic resources.
Readability
The readability of the websites was found to be of high complexity, far exceeding the average American literacy level. Previous literature has found that the mean readability of websites containing health information ranged between a US reading grade level of 10 and 15.29–31 Websites aimed at the general public should present information at a 6th-grade reading level per the American Medical Association and the US Department of Health and Human Services guidelines.15,16 In our study, the most complex website was Wikipedia, with the highest average reading grade level (15.55). Only one website, WebMD, came close to conforming to the recommended guidelines, demonstrating that most online information patients access about optic neuritis may be difficult to read and comprehend. Our observations are consistent with other studies investigating online content in ophthalmology and other medical specialties, which report reading levels higher than recommended.32 In particular, one study analyzed websites for 16 different ophthalmic conditions and found that ON content was written at a reading level of 12.1, corroborating our observed level of 11.9.16 Interestingly, ON was among the top three most complex ophthalmic topics online based on readability in that study, behind only uveitis and keratoconus, which further suggests that neuro-ophthalmological content may be inherently more difficult to understand. Given the positive correlation we observed between website quality and average reading grade level, more comprehensive, higher-quality materials may come at the cost of comprehensibility. Information presented at a reading level above a patient's capabilities may result in misunderstanding and potential stress.
Numerous studies have demonstrated that patients increasingly interact with online health information to inform and influence their healthcare practices, including their relationship with providers, adherence to treatment plans, and general understanding of disease.33–35 Oftentimes, the internet, rather than a physician, is the first source of information for patients.36 Therefore, online materials must strike a balance between containing a robust level of scientific information and maintaining comprehensibility for the general public. Patients with ON may be particularly inclined to research their condition online, as their demographics (typically aged between 15 and 45 years and female) represent a significant portion of health-information seekers online. Indeed, 58.8% of individuals interacting with online health information are aged between 18 and 49 years, and female gender is well established as among the strongest predictors of online health information-seeking behavior.37,38 Regardless, it is essential that online material is suitable for patient groups of varying health literacy levels, and it is arguably most important that incorrect information is avoided. Misinformation may result in confusion about diagnosis, treatment options, and outcomes, leading to challenging consultations and poor adherence to management.39,40
Future efforts
To enhance future efforts, websites providing information on ON should focus on reducing average sentence length and substituting complex medical terms with simpler ones. Although technical terms are in many cases unavoidable, care should be taken to ensure that the primary message is delivered succinctly. Well-structured content will improve overall readability and increase patient engagement.41 The range of content could also be expanded to address contextual information. On an individual level, ophthalmologists may provide their patients with a list of high-quality online resources to facilitate their education. Ideally, this list would encompass resources appropriate to various levels of health literacy and with differing amounts of complexity and detail. In this manner, patients could select the resource most consistent with their level of interest and health literacy. Future research utilizing input from ophthalmology patients and the public is important to improve website scoring systems and ascertain which aspects of online patient resources are most important. Additional work assessing the effect of high-quality, highly readable patient resources on patient-centered outcomes may also be useful. Online educational materials have the potential to empower patients to work with their physicians to make informed decisions about their healthcare. Online resources can provide knowledge of available treatments and outcomes that may be important in implementing and adopting guideline recommendations.42
Limitations
This study has some important limitations. First, webpages are often created with different goals, such as encouraging newly symptomatic patients to seek medical care or providing a deeper level of information after an initial consultation. Thus, the level of the content may vary based on the target audience, which may have introduced bias in our analysis, where we compared the webpages' quality against our set criteria. Fortunately, this does not affect our other quality metrics (accountability, accessibility, and readability), which are important considerations in the holistic evaluation of any particular resource. Second, despite including five distinct readability scales, these are quantitative, not qualitative, measures. A shorter word is not necessarily more understandable, and a longer word is not necessarily more difficult; short, monosyllabic words may be medical jargon, in which case the tools would underestimate the reading level required to comprehend the text. Third, our study relied on a content questionnaire curated and completed by a panel of internationally recognized clinical experts and representatives of neuro-ophthalmology patient information committees. Expert reviewers were recruited for their clinical experience, familiarity with national guidelines, or current research in the medical condition of interest. Panelists also considered whether it was reasonable to expect to find the information queried in the survey on the internet. Although interrater reliability was robust, reviewers may have tended to rate lengthier text more highly because it was more thorough and inclusive; however, such text is not necessarily the best source for educating patients. Future work in this disease area could include recruiting patients with lived experience of ON, their carers, and the general public. Finally, the determination of content quality relies on the subjective rankings of four individuals. As such, this aspect of the study is not designed to support firm scientific conclusions, and results should be used only as a point of discussion.
In conclusion, there is significant variation in the content, quality, and readability of freely available online ON education material. According to JAMA standards, many websites lack transparency regarding website accountability. Most websites are written at a reading grade level higher than that recommended by established guidelines. As the quality of the online information regarding ON is variable, patients should be encouraged to discuss the information they gather from various resources with their physician. Major ophthalmology organizations could consider concentrating their efforts on creating patient-oriented content that is more suitable for individuals of varying health literacy levels.
Funding Statement
The author(s) reported there is no funding associated with the work featured in this article.
Disclosure statement
No potential conflict of interest was reported by the author(s) except for the following: Professor Mollan reports consultancy fees (Invex Therapeutics), advisory board fees (Invex Therapeutics; GenSight), and speaker fees (Heidelberg Engineering; Chugai-Roche Ltd; Allergan; Santen; Chiesi; and Santhera), all outside the submitted work.
Author contributions
All authors contributed to data acquisition, writing and critical review of the paper.
Supplementary material
Supplemental data for this article can be accessed online at https://doi.org/10.1080/01658107.2024.2301728.
