Journal of Vitreoretinal Diseases. 2022 Jun 9;6(6):437–442. doi: 10.1177/24741264221094683

Quality, Readability, and Accessibility of Online Content From a Google Search of “Macular Degeneration”: Critical Analysis

Hong-Uyen Hua,1 Nadim Rayess,2 Angela S Li,2 Diana Do,2 Ehsan Rahimy3
PMCID: PMC9954772  PMID: 37009540

Abstract

Purpose:

This work aims to assess the quality, accountability, readability, accessibility, and availability of Spanish translation in online material retrieved through a Google search of “macular degeneration”.

Methods:

In this retrospective cross-sectional analysis of website results from a Google search of “macular degeneration”, the quality and accountability of each website were assessed using the DISCERN criteria and the Health on the Net Foundation Code of Conduct (HONcode) principles. All 31 sites were independently graded by 2 ophthalmologists. Readability was evaluated using an online tool. The presence of accessibility features and Spanish translation on each website was recorded. The primary outcome measures were the DISCERN and HONcode quality and accountability scores of each website. Secondary outcome measures included readability, accessibility, and the presence of Spanish translation.

Results:

The mean ± SD score across all 15 DISCERN questions was 2.761 ± 0.666 (scale, 1-5). The mean HONcode score for all websites was 7.355 ± 3.123 (maximum, 14). The mean consensus reading grade level was 10.258 ± 2.490. There were no statistically significant differences in any score between the top 5 websites and the bottom 26 websites evaluated. Accessibility features were available on 10 of 31 websites, and Spanish translation was available on 10 of 31 websites.

Conclusions:

The top 5 websites appearing in a Google search did not have better content quality or readability than lower-ranked websites. Improving quality, accountability, and readability can help improve patients’ health literacy regarding macular degeneration.

Keywords: Google, online, search, macular degeneration, readability, quality, content, online information, retina, patient education

Introduction

Age-related macular degeneration (AMD) is a leading cause of irreversible blindness and severe visual impairment, especially in older individuals. 1 Nearly 200 million people worldwide have AMD, and that number is expected to increase to 288 million by the year 2040. 2

Many patients with AMD will use the internet to supplement information provided to them by their ophthalmologists. Seventy-two percent of all internet users have used it to search for health information, with the vast majority of these patients (77%) using a search engine such as Google or Bing in lieu of more specialized medical websites. 3

At present, online health information is largely unregulated, with sites of variable quality, accuracy, or both. Most major ophthalmology association websites are written well above the fourth- to sixth-grade reading level recommended by the US Department of Health and Human Services.4,5 Lower health literacy can lead to poorer patient outcomes.4,6,7 Thus, there is a need to ensure that websites are not only well informed, up to date, and unbiased but also easily comprehensible to the average patient.

Given the prevalence of AMD and the importance of directly evaluating the information patients are accessing, we sought to quantitatively assess the reliability, content, readability, accessibility, and option of Spanish translation of online health information regarding AMD. These metrics were also used to evaluate whether there were differences between academic institutions and private practice websites and between the top 5 ranked websites on Google and the lower-ranked websites.

Methods

No institutional review board approval was necessary for this study because no protected patient information was accessed. This cross-sectional study was performed between November 2019 and April 2020. The goal was to assess the quality, accountability, readability, and accessibility of websites retrieved from a Google search for the term macular degeneration, as well as the presence of Spanish translation. Incognito mode on Google Chrome was used to identify the top 31 unique websites because this mode prevents cookies from influencing search results.

Websites were classified by institutional association; that is, as academic or private. Institutional association was determined by website suffix. Sites ending in “.org” or “.edu” were deemed academic or reference sources, and sites ending in “.com” were deemed affiliated with private institutions, such as a private practice or a pharmaceutical company. In addition, the first 5 websites to appear on Google were compared with the other 26 websites because studies have shown that patients typically look at the first 2 to 5 sources listed on a search engine. 8
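
For illustration, the suffix-based classification above can be expressed as a short script. The following Python sketch uses hypothetical URLs (not websites from this study) and simply maps “.org” and “.edu” domains to the academic group and “.com” domains to the private group.

from urllib.parse import urlparse

def classify_site(url: str) -> str:
    # Classify a website as academic/reference or private by its domain suffix,
    # following the .org/.edu vs .com convention described above.
    host = urlparse(url).netloc.lower()
    if host.endswith((".org", ".edu")):
        return "academic"
    if host.endswith(".com"):
        return "private"
    return "unclassified"

# Hypothetical example URLs, for illustration only.
print(classify_site("https://www.example.edu/macular-degeneration"))  # academic
print(classify_site("https://www.example-eyecare.com/amd"))           # private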

Quality, Accountability, and Reliability of Online Content

The DISCERN criteria and the Health on the Net Foundation Code of Conduct (HONcode) principles are internationally recognized measures for rating the quality and accountability of online health content.9,10 In this study, both indices were used to evaluate the quality, reliability, and accountability of online content. The DISCERN criteria, developed at the University of Oxford, Oxford, England, consist of 16 questions (Figure 1). The first 8 questions grade the reliability of the site, while questions 9 through 15 evaluate the quality of content regarding the treatment. The last question is an overall grade for the site. This study used the same grading method of 1 to 5 as outlined by Rayess et al,11,12 with a grade of 1 indicating that the website did not meet the criterion and a grade of 5 indicating that the website fully and clearly addressed the criterion.

Figure 1. DISCERN credibility and quality criteria. Questions were graded on a scale of 1 to 5 (grade 1, website did not meet the criterion; grade 5, website fully and clearly addressed the criterion).

The HONcode has 7 criteria that are used to evaluate the quality and accountability of online health content. The 7 criteria used to grade all websites include ownership, purpose, authorship, currency, qualification, attribution (references and sources), and interactivity. Each criterion was given a score on a scale of 0 to 2 based on the amount of resources provided. The maximum score is 14 based on the 7 criteria. This study used the same HONcode grading system described by Rayess et al.11,12 In addition, all 31 sites were reviewed to determine whether they had received HONcode certification, which is a free certification provided by the HONcode committee.
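
To make the scoring arithmetic concrete, the Python sketch below tallies a hypothetical website’s HONcode score; the individual 0-to-2 values are invented for illustration and are not taken from any site graded in this study.

# Hypothetical HONcode scoring for a single website: each of the 7 criteria
# receives a score from 0 to 2, so the maximum possible sum is 14.
honcode_scores = {
    "ownership": 2,
    "purpose": 2,
    "authorship": 1,
    "currency": 0,
    "qualification": 1,
    "attribution": 1,
    "interactivity": 2,
}
assert all(0 <= score <= 2 for score in honcode_scores.values())
print("HONcode sum (maximum 14):", sum(honcode_scores.values()))  # 9 in this example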

The 31 sites along with their associated webpages related to the search terms were independently graded by 2 ophthalmologists (H.H., N.R.). Adjudication was performed for every question with a greater than 1-point difference between the 2 reviewers on the DISCERN criteria and HONcode criteria. For each question in the DISCERN criteria, the grades given by the 2 ophthalmologists were averaged. This number was then used in the downstream analysis.
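
The two-reviewer workflow described above amounts to flagging large disagreements for adjudication and then averaging the grades. A minimal Python sketch, using invented grades for a single website, follows.

# Hypothetical DISCERN grades (questions 1-15, scale 1-5) from the 2 reviewers
# for a single website; the values are illustrative only.
grader_a = [4, 3, 3, 2, 5, 3, 2, 4, 3, 2, 3, 3, 4, 2, 3]
grader_b = [4, 2, 3, 3, 5, 3, 2, 4, 3, 3, 3, 1, 4, 2, 3]

# Flag any question with a greater than 1-point difference for adjudication.
needs_adjudication = [q + 1 for q, (a, b) in enumerate(zip(grader_a, grader_b)) if abs(a - b) > 1]
print("Questions requiring adjudication:", needs_adjudication)  # [12] in this example

# After adjudication, the 2 reviewers' grades for each question are averaged,
# and the per-question averages feed the downstream analysis.
per_question_mean = [(a + b) / 2 for a, b in zip(grader_a, grader_b)]
print("Mean DISCERN score for this website:", round(sum(per_question_mean) / 15, 3))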

Readability

Readability of online information was graded using various scoring formulas provided by an online calculator (readabilityformulas.com). Based on the 7 indices for readability, the website provides a consensus grade level. These indices estimate a grade level using formulas that incorporate the number of letters or syllables per word and the average sentence length.
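
As a point of reference, one widely used grade-level index, the Flesch-Kincaid grade level, combines average sentence length with syllables per word. The Python sketch below uses hypothetical passage statistics and is not the calculator used in this study.

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Hypothetical passage statistics, for illustration only.
grade = flesch_kincaid_grade(words=220, sentences=10, syllables=330)
print(round(grade, 1))  # about 10.7, i.e., roughly an 11th-grade reading level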

Accessibility

Websites were determined to be accessible if they provided an option for changing font size. Furthermore, the ability to adjust contrast as well as voice-generated screen reader capability was recorded.

Translation

The sites were also screened to determine whether they had an option for full translation into Spanish.

Statistical Analysis

Two comparisons were performed: (1) academic sites vs private sites and (2) the top 5 ranked sites on Google vs the 26 lower-ranked websites. For each comparison, the Mann-Whitney test was used because these data did not conform to a normal distribution. The threshold for significance was set at α = .05. To account for multiple comparisons of the same dataset, the Bonferroni correction was used for a new P value cutoff of .01. For each of the 7 principles of the HONcode, a Fisher exact test was performed to evaluate for differences in the proportion of academic or private sites that met each principle. To again account for multiple comparisons of the same dataset, the Bonferroni correction was applied for a new P value cutoff of .007. Finally, the Fisher exact test was performed to compare the proportions of sites with accessibility features and Spanish translation. All statistical analyses were performed using R software (R Project for Statistical Computing).
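
Although the analysis was performed in R, an equivalent sketch using Python’s scipy library illustrates the tests described above. The per-site score lists are hypothetical placeholders; the Spanish-translation counts match those reported in the Results (2 of the 5 top-ranked and 8 of the 26 lower-ranked sites).

from scipy.stats import mannwhitneyu, fisher_exact

# Hypothetical per-site mean DISCERN scores, for illustration only.
top5_scores = [3.1, 3.3, 3.0, 3.2, 3.2]
lower_scores = [2.5, 3.4, 2.1, 2.9, 2.6, 3.0, 1.9, 2.8]  # in practice, all 26 lower-ranked sites

# Rank-based comparison of the 2 groups (the data were not normally distributed).
u_stat, p_value = mannwhitneyu(top5_scores, lower_scores, alternative="two-sided")
print(f"Mann-Whitney P = {p_value:.3f} (Bonferroni-corrected cutoff, .01)")

# Fisher exact test on the proportion of sites offering Spanish translation
# (rows: top 5 vs lower ranked; columns: translation available vs not).
_, p_value = fisher_exact([[2, 3], [8, 18]])
print(f"Fisher exact P = {p_value:.3f}")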

Results

Of the 31 websites analyzed, 18 originated from academic sources and 13 were classified as private organizations. Figure 1 lists the DISCERN criteria questions used to grade each website. Overall, the mean DISCERN score for all question criteria across all websites was 2.761 ± 0.666 (scale, 1-5) (Table 1). The mean overall score for the DISCERN credibility criteria across all websites was 2.872 ± 0.772, and the mean overall score for the DISCERN treatment criteria was 2.635 ± 0.859. Of a possible 14 points on the HONcode, the mean sum score for all websites was 7.355 ± 3.123. In evaluating readability, the mean consensus reading grade level was 10.258 ± 2.490 (Table 1 and Figure 2). Ten of the top 31 searched websites had accessibility features, and 10 had Spanish translation available.

Table 1.

Mean DISCERN Overall Scores, Mean Credibility DISCERN Scores, Mean Quality Criteria Scores, Mean HONcode Scores, and Mean Readability Grade Level Consensus for All Top 31 Websites.

Variable Value
Mean DISCERN ± SD 2.761 ± 0.666
Mean credibility (Q 1-8) ± SD 2.872 ± 0.772
Mean treatment (quality) (Q 9-15) ± SD 2.635 ± 0.859
Mean HONcode criteria (sum) ± SD 7.355 ± 3.123
Mean readability consensus ± SD 10.258 ± 2.490
Low-vision accessibility, n (%) 10 (32.3)
Spanish translation provided, n (%) 10 (32.3)

Abbreviations: HONcode, Health on the Net Foundation Code of Conduct; Q, questions.

Figure 2. Online readability grade level scores for all websites, top 5 websites, and other 26 websites using 6 scoring indices and a consensus grade level.

Abbreviations: AR, Automated Readability; CL, Coleman-Liau; FK, Flesch-Kincaid; GFOG, Gunning Fog; LWF, Linsear Write Formula; SMOG, Simple Measure of Gobbledygook.

A subanalysis comparing the DISCERN scores and HONcode scores between the top 5 search results and the bottom 26 results found a mean DISCERN score of 3.167 ± 0.211 and 2.683 ± 0.698, respectively (Table 2). Although the mean DISCERN score trended higher for the top 5 sites, the difference was not statistically significant (P = .126). The mean score for the DISCERN credibility criteria was 3.394 ± 0.209 for the top 5 websites and 2.772 ± 0.802 for the bottom 26 (P = .056). Although the mean DISCERN credibility score for the top 5 websites trended higher, the difference did not reach statistical significance at the Bonferroni-corrected threshold of P = .01. There was no significant difference in any other quality measure. The mean DISCERN treatment score was 2.907 ± 0.239 for the top 5 websites and 2.582 ± 0.926 for the bottom 26 websites (P = .610). The mean HONcode score sum was 8.800 ± 2.515 and 7.077 ± 3.193, respectively (P = .306). For readability, the mean consensus reading grade level score was 9.400 ± 1.517 and 10.423 ± 2.626, respectively (P = .463) (Table 2 and Figure 2). None of the top 5 websites and 10 of the bottom 26 websites offered low-vision accessibility options (P = .147). Two of the top 5 websites and 8 of the bottom 26 websites provided Spanish translation (P = .686).

Table 2.

Mean DISCERN, HONcode, Readability Grade-Level Scores, and Proportion of Websites Providing Accessibility and Spanish Translation: Top 5 Google Search Results vs Bottom 26 (of Top 31 Websites).

Variable Top 5 Bottom 26 P
DISCERN (mean) 3.167 2.683 .126
 Credibility (Q 1-8) 3.394 2.772 .056
 Treatment (quality) (Q 9-15) 2.907 2.582 .610
HONcode criteria (sum) 8.800 7.077 .306
Readability consensus 9.400 10.423 .463
Low-vision accessibility (%) 0.0 38.5 .147
Spanish translation (%) 40.0 30.8 .686

Abbreviations: HONcode, Health on the Net Foundation Code of Conduct; Q, questions.

A subanalysis comparing academic-associated websites (n = 18) and private organization websites (n = 13) found a mean DISCERN score of 2.828 ± 0.620 and 2.669 ± 0.742, respectively (P = .575) (Table 3). The average HONcode sum score was 6.667 ± 2.195 for academic websites and 8.308 ± 2.869 for private websites (P = .148). The mean consensus reading grade level was 9.667 ± 1.847 and 11.077 ± 3.068, respectively (P = .240). Accessibility options were offered by 39% of academic websites and 23% of private websites. Spanish translation was offered by 8 of 18 academic websites and 2 of 13 private websites. None of these comparisons showed statistically significant differences between academic websites and private websites.

Table 3.

Mean DISCERN, HONcode, Readability Grade-Level Scores, and Proportion of Websites Providing Accessibility and Spanish Translation: Academic vs Private Websites.

Variable Academic (n = 18) Private (n = 13) P
DISCERN (mean) 2.828 2.669 .575
 Credibility (Q 1-8) 2.877 2.865 .826
 Treatment (quality) (Q 9-15) 2.772 2.445 .245
HONcode criteria (sum) 6.667 8.308 .148
Readability consensus 9.667 11.077 .240
Low-vision accessibility (%) 38.9 23.1 .353
Spanish translation (%) 44.4 15.4 .210

Abbreviations: HONcode, Health on the Net Foundation Code of Conduct; Q, questions.

Conclusions

Today, the majority of patients consult the internet in search of answers to their health questions. 13 At present, there is no rigorous standard for posting health information online. Recommended guidelines for upholding the quality and credibility of online health content include the DISCERN criteria, the HONcode principles, and the Journal of the American Medical Association (JAMA) benchmarks, among others.9,14-16 In this study, we used the DISCERN and HONcode criteria to evaluate the content quality, readability, and accessibility of online information retrieved by searching Google for “macular degeneration”.

As a whole, online information about macular degeneration did not perform well on the DISCERN criteria quality indices, with a mean DISCERN criterion score of 2.761 (scale, 1-5) (Table 1). Previous evaluations of online content on macular degeneration were performed before the anti-vascular endothelial growth factor (anti-VEGF) era or at the advent of the anti-VEGF era.17-19

More recently, Kloosterboer et al 20 performed a study similar to ours, analyzing the quality of online content using criteria based on JAMA benchmarks. Whereas their study analyzed content through an expert-curated list of questions, we used the established DISCERN criteria to evaluate content quality and the HONcode criteria to evaluate content accountability rather than the JAMA benchmarks. In both studies, the overall online content quality, accountability, and accessibility had considerable room for improvement. Specifically, we found the overall quality of online content about macular degeneration to be poor, consistent with a previous report of a mean quality score of 58% (out of 100). 18

There is no quality control for online health information; thus, laypeople should exercise an abundance of caution when considering health information obtained through the internet. To improve the quality of online health information, online health content creators should consider the DISCERN reliability and treatment criteria developed by the University of Oxford. 9 Although the DISCERN criteria are established and published, online content creators could consider giving more weight to the more important criteria within DISCERN, such as quality questions 9 and 13 (Figure 1). In addition, this information should be authored by a credible source with references and be updated regularly, as outlined by the HONcode. 15

We found that Google search rankings did not correlate with higher-quality content (Table 2). Data analysis of Google click-through rates shows that the top 3 Google search results receive 75.1% of all clicks. 21 Although the top search results garner the vast majority of clicks, the DISCERN scores, HONcode scores, readability, and accessibility of these 5 websites did not differ from those of the bottom 26 websites. Only the DISCERN credibility score of the top 5 search results trended toward statistical significance (P = .056). Despite Google’s complex, multifactorial ranking algorithm, our analysis found that the top search results did not provide the highest quality content. 22

Our comparison of the quality of online content between academic organizations and private organizations found no difference in DISCERN scores, HONcode scores, or readability. A comparison between academic websites and private websites was not included in the similar study by Kloosterboer et al. 20 This lack of difference contrasts with an analysis of online information about intravitreal injections, which found that academic websites had significantly higher DISCERN quality scores and overall DISCERN scores. 12

The US Department of Health and Human Services recommends that patient educational health materials be written at a fourth- to sixth-grade reading level. 5 In our analysis of readability, none of the 31 websites met the recommended readability consensus grade level. Overall, the mean consensus reading grade level of the 31 websites was 10.258 ± 2.490 (Figure 2), well above the recommended criterion. This is similar to results in a 2007 study by Rennie et al 18 that also evaluated the readability of online content about AMD; the average Simple Measure of Gobbledygook readability rating was also approximately 10, above the recommended reading level. For patients with low health literacy, this likely makes reading, understanding, and consuming online information about macular degeneration or other health issues very difficult. In a systematic review of low health literacy and health outcomes, low health literacy was associated with increased hospitalizations and emergency care use. 7 Simplifying patient educational materials in the office and online might improve patient adherence and outcomes.

Given that Spanish is the second-most spoken language in the United States, we looked at the number of websites that offered Spanish translation. Overall, only 10 of the 31 websites offered Spanish translation. Low health literacy substantially affects the Hispanic community, especially individuals with limited English proficiency.23,24 Online health content can increase language accessibility by providing a clear, conspicuous link for Spanish translation on web pages.

In addition to readability and language barriers, poor vision can negatively affect access to care, especially in a patient population affected by macular degeneration. In our assessment of online content about macular degeneration, we evaluated whether each webpage offered accessibility options for low vision. Of the top 31 websites, only 10 had accessibility features. Notably, none of the top 5 search results offered accessibility features. Decreased vision is associated with a lower quality of life, and use of low-vision services is associated with increased self-reported functional status and quality of life. 25 Thus, providing low-vision accessibility services online, especially to an aging population with macular degeneration, might increase quality of life, health literacy, and access to care.

There are limitations to this study. Online search results were accessed through Google only. No website advertisements were included in the study. Furthermore, the readability formulas we used did not allow us to evaluate photographs, figures, or infographics. In addition, the search terms patients use might differ from ours; we did not search for the acronym AMD or for the terms dry AMD or wet AMD.

We present an updated, rigorous analysis of the quality, accountability, readability, accessibility, and presence of Spanish translation of online information available through a Google search of “macular degeneration”. Overall, both the quality and accountability of online information related to macular degeneration can be improved. Furthermore, the top 5 Google search results did not have higher-quality content. Readability scores for all websites were still much higher than the recommended fourth- to sixth-grade level. Accessibility of these websites was poor; only 10 of the 31 websites provided low-vision accessibility options, and 10 provided Spanish translation. More comprehensive studies are needed to assess patient comprehension, trust, and health literacy from online sources of information regarding macular degeneration and other ophthalmic diseases.

Footnotes

Authors’ Note: This study was accepted as a poster at the American Society of Retina Specialists Virtual Meeting 2020.

Ethical Approval: Ethical approval was not sought for the present study because it did not involve protected health information.

Statement of Informed Consent: Informed consent was not sought for the present study because protected health data and patient information were not accessed.

Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.

References

1. Flaxman SR, Bourne RRA, Resnikoff S, et al; Vision Loss Expert Group of the Global Burden of Disease Study. Global causes of blindness and distance vision impairment 1990-2020: a systematic review and meta-analysis. Lancet Glob Health. 2017;5(12):e1221-e1234. doi: 10.1016/S2214-109X(17)30393-5
2. Jonas JB, Cheung CMG, Panda-Jonas S. Updates on the epidemiology of age-related macular degeneration. Asia Pac J Ophthalmol (Phila). 2017;6(6):493-497. doi: 10.22608/APO.2017251
3. Pew Research Center. Majority of adults look online for health information. 2013. Accessed July 1, 2020. https://www.pewresearch.org/fact-tank/2013/02/01/majority-of-adults-look-online-for-health-information/
4. Edmunds MR, Barry RJ, Denniston AK. Readability assessment of online ophthalmic patient information. JAMA Ophthalmol. 2013;131(12):1610-1616. doi: 10.1001/jamaophthalmol.2013.5521
5. AHRQ. Health literacy. Agency for Healthcare Research and Quality. 2020. Accessed July 1, 2020. https://www.ahrq.gov/health-literacy/research/tools/index.html
6. Huang G, Fang CH, Agarwal N, Bhagat N, Eloy JA, Langer PD. Assessment of online patient education materials from major ophthalmologic associations. JAMA Ophthalmol. 2015;133(4):449-454. doi: 10.1001/jamaophthalmol.2014.6104
7. Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Crotty K. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155(2):97-107. doi: 10.7326/0003-4819-155-2-201107190-00005
8. Fox S, Rainie L. Main report: the search for online medical help. Pew Research Center: Internet & Technology. 2002. Accessed June 28, 2020. https://www.pewresearch.org/internet/2002/05/22/main-report-the-search-for-online-medical-help/
9. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53(2):105-111. doi: 10.1136/jech.53.2.105
10. Boyer C, Selby M, Scherrer JR, Appel RD. The Health On the Net code of conduct for medical and health websites. Comput Biol Med. 1998;28(5):603-610. doi: 10.1016/S0010-4825(98)00037-7
11. Rayess H, Zuliani GF, Gupta A, et al. Critical analysis of the quality, readability, and technical aspects of online information provided for neck-lifts. JAMA Facial Plast Surg. 2017;19(2):115-120. doi: 10.1001/jamafacial.2016.1219
12. Rayess N, Li AS, Do DV, Rahimy E. Assessment of online sites reliability, accountability, readability, accessibility and translation for intravitreal injections. Ophthalmol Retina. 2020;4(12):1188-1195. doi: 10.1016/j.oret.2020.05.019
13. Fox S, Duggan M. Health online 2013. Pew Research Center: Internet & Technology. 2013. Accessed June 18, 2020. https://www.pewresearch.org/internet/2013/01/15/health-online-2013/
14. Silberg WM. Assessing, controlling, and assuring the quality of medical information on the internet: caveant lector et viewor—let the reader and viewer beware. JAMA. 1997;277(15):1244-1245. doi: 10.1001/jama.1997.03540390074039
15. Boyer C, Dolamic L. Automated detection of HONcode website conformity compared to manual detection: an evaluation. J Med Internet Res. 2015;17(6):e135. doi: 10.2196/jmir.3831
16. Berland GK, Elliott MN, Morales LS, et al. Health information on the internet: accessibility, quality, and readability in English and Spanish. JAMA. 2001;285(20):2612-2621. doi: 10.1001/jama.285.20.2612
17. Kahana A, Gottlieb JL. Ophthalmology on the internet: what do our patients find? Arch Ophthalmol. 2004;122(3):380-382. doi: 10.1001/archopht.122.3.380
18. Rennie CA, Hannan S, Maycock N, Kang C. Age-related macular degeneration: what do patients find on the internet? J R Soc Med. 2007;100(10):473-477.
19. Schalnus R, Aulmann G, Hellenbrecht A, Hägele M, Ohrloff C, Lüchtenberg M. Content quality of ophthalmic information on the internet. Ophthalmologica. 2009;224(1):30-37. doi: 10.1159/000233233
20. Kloosterboer A, Yannuzzi NA, Patel NA, Kuriyan AE, Sridhar J. Assessment of the quality, content, and readability of freely available online information for patients regarding diabetic retinopathy. JAMA Ophthalmol. 2019;137(11):1240-1245. doi: 10.1001/jamaophthalmol.2019.3116
21. Dean B. Here’s what we learned about organic click through rate. Backlinko. 2019. Accessed July 1, 2020. https://backlinko.com/google-ctr-stats
22. How search algorithms work. Google Search. Accessed June 14, 2020. https://www.google.com/search/howsearchworks/algorithms/#:~:text=To%20give%20you%20the%20most,and%20your%20location%20and%20settings
23. Edward J, Morris S, Mataoui F, Granberry P, Williams MV, Torres I. The impact of health and health insurance literacy on access to care for Hispanic/Latino communities. Public Health Nurs. 2018;35(3):176-183. doi: 10.1111/phn.12385
24. Becerra BJ, Arias D, Becerra MB. Low health literacy among immigrant Hispanics. J Racial Ethn Health Disparities. 2017;4(3):480-483. doi: 10.1007/s40615-016-0249-5
25. Stelmack J. Quality of life of low-vision patients and outcomes of low-vision rehabilitation. Optom Vis Sci. 2001;78(5):335-342. doi: 10.1097/00006324-200105000-00017
