Abstract
Importance:
Vision loss is one of the top ten causes of disability in the United States, driven primarily by age-related eye diseases such as age-related macular degeneration. With an aging population, the number of people affected by this condition is expected to rise. Patients increasingly turn to the internet for health-related information, yet no quality standard governs the information published across websites.
Objective:
To assess the quality, content, accountability, and readability of information found online for age-related macular degeneration.
Design:
This cross-sectional study analyzed 12 freely available medical sites with information on age-related macular degeneration and used PubMed as a gold standard for comparison. Thirty-four questions were composed to cover the information most relevant to patients, and each website was independently evaluated by one vitreoretinal surgeon, two vitreoretinal fellows, and one ophthalmology resident. Readability was analyzed using an online readability tool. The JAMA benchmarks were used to evaluate the accountability of each site.
Setting:
Freely available online information was used in this study.
Results:
The average questionnaire score for all websites was 90.23 (SD 17.56, CI 95% ±9.55) out of 136 possible points. There was a significant difference in content quality among the websites (P=0.01). The mean reading grade for all websites was 11.44 (SD 1.75, CI 95% ±0.99). No significant correlation was found between content accuracy and the mean reading grade or Google rank (r=0.392, P=0.207 and r=0.133, P=0.732, respectively). Excluding PubMed, only one website achieved all 4 JAMA benchmarks. There was no correlation between the accuracy of a website's content and its JAMA benchmarks (r=0.344, P=0.273). The interobserver reproducibility was strong among 3 of 4 observers (r=0.747 between JS and NT, r=0.643 between JS and NP, r=0.686 between NP and NT, r=0.581 between JS and NY; P≤0.05).
Conclusion and Relevance:
The freely available online information on age-related macular degeneration varies by source but is generally of low quality. The material presented is difficult to interpret and exceeds the recommended reading level for health information. Judged by our grading scheme, most of the websites reviewed did not provide sufficient information to support patients in making medical decisions.
Keywords: Age-related macular degeneration, Retina, Online health information
Introduction
Age-related macular degeneration (AMD) is a chronic degenerative disease of the retina that affects central vision.1 It is a common cause of vision loss worldwide, particularly in individuals older than 55 years of age.2 An estimated 8.7% of the world’s population currently suffers from this disease, and the number is projected to increase.3 Furthermore, AMD is associated with a reduction in quality of life and activities of daily living.4 With an aging population, the number of people affected by AMD is expected to rise to 196 million in the year 2020 and to 288 million by 2040.3
After diagnosis of a chronic condition, patients may search for health-related information online. Studies have shown that the use of online information can reduce anxiety and depression and aid in patient empowerment.5–7 Health-related searches center on information regarding a condition, its symptoms, and treatment.8 Notably, resources found online can influence a patient’s medical decision-making.9,10 Because websites are not regulated, the information displayed can vary widely in quality and accuracy. In general, people assess the credibility of a website based on its source (particularly affiliation with a professional organization), design, and ease of use.11 While important, none of these markers guarantees that the information presented is complete and without bias. Moreover, an increasing number of people are accessing the internet for health content and advice. In the United States (US), about 70% of adults use the internet for this purpose, while in the United Kingdom the proportion has doubled since 2005, from 37% to 69%.12,13 As such, it is crucial to investigate the quality of the health-related material patients access.
The capacity to understand and act on health information is a strong predictor of overall health, and patients who are well informed participate more in their care.14,15 However, although medical content is easily available, this does not mean that those viewing it can read or interpret it appropriately. In the US, many adults struggle with health literacy and may not have the skills needed to process and understand basic health information.16,17 The National Assessment of Adult Literacy (NAAL) recommends that health information be written at the sixth-grade level or below.18 Nevertheless, online medical information has been shown to exceed this recommended reading level, indicating that a substantial number of patients may not actually benefit from these resources.19
To help patients use free and readily available health-related services appropriately, it is important for physicians to be informed of what resources are available. This study aims to evaluate the quality, accountability, and readability of major medical websites regarding AMD.
Methods
Website Selection and Content Analysis
A Google search was conducted by entering the term “age-related macular degeneration” into the search engine. Major medical sites were selected for analysis, including the American Academy of Ophthalmology (https://www.aao.org/), All About Vision (https://www.allaboutvision.com/), American Optometric Association (https://www.aoa.org/), American Society of Retina Specialists (https://www.asrs.org/), EyeWiki (https://eyewiki.org/), American Macular Degeneration Foundation (https://www.macular.org), Mayo Clinic (https://www.mayoclinic.org/), Medical News Today (https://www.medicalnewstoday.com/), MedicineNet (https://www.medicinenet.com/), National Eye Institute (https://nei.nih.gov/), WebMD (https://www.webmd.com/), and Wikipedia (https://www.wikipedia.org/). PubMed (https://www.ncbi.nlm.nih.gov/pubmed/) was used as the gold standard for the content each site should contain. Website selection was cross-checked against prior similar studies.20,21 The Google rank of each website was recorded. Each website was evaluated for the presence of low vision accessibility features, including the ability to increase font size, adjust contrast, and have the text read as audio. If a website consisted of multiple links or tabs, all areas of the website were accessed and evaluated.
The evaluation was based on 34 questions composed by the authors from questions frequently asked by patients (Table 1). They were designed to encompass information generally conveyed during a patient evaluation and were used to assess the accuracy and completeness of the content of each website. Only these questions were used to compare PubMed with the selected websites. One vitreoretinal surgeon (JS), two vitreoretinal surgery fellows (NY and NP), and one ophthalmology resident (NT) independently analyzed each website. A grading scheme was created on a 0–4 scale, with 4 as the maximum, to assess each question. A score of 0 indicates that no information for that question was available; 1 point corresponds to an answer that is unclear, inaccurate, or omits significant information and is poorly organized; 2 points correspond to an answer that is partially complete and somewhat addresses the concept but has gaps in information and organization; 3 points correspond to an answer that provides the essential elements and addresses the most relevant points in a focused and organized way; and 4 points correspond to an answer that is accurate and thorough and explains the information in a clear, focused, and organized manner. The interobserver reproducibility was measured using a Spearman correlation. The average score of the 4 observers was used to compare the quality of each site. The mean score of each site was correlated with its rank on Google.com.
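For illustration only, the scoring and reproducibility calculations described above can be reproduced with a short script. The sketch below is not the authors' code; it assumes a hypothetical long-format file (amd_ratings.csv) with one 0–4 rating per website, question, and observer, and uses SciPy's Spearman correlation.

```python
# Minimal sketch (not the study's actual code): per-site total scores and
# pairwise interobserver Spearman correlations from 0-4 ratings.
# Assumes a hypothetical CSV with columns: website, question, observer, score.
from itertools import combinations

import pandas as pd
from scipy.stats import spearmanr

ratings = pd.read_csv("amd_ratings.csv")  # hypothetical file name

# Total points per website: sum over the 34 questions of the 4-observer mean
site_totals = (ratings.groupby(["website", "question"])["score"]
                      .mean()                 # average of the four observers
                      .groupby("website")
                      .sum()                  # out of a possible 136 points
                      .sort_values(ascending=False))
print(site_totals)

# Pairwise interobserver reproducibility over all website-question pairs
wide = ratings.pivot_table(index=["website", "question"],
                           columns="observer", values="score")
for a, b in combinations(wide.columns, 2):
    rho, p = spearmanr(wide[a], wide[b])
    print(f"{a} vs {b}: rho={rho:.3f}, p={p:.3f}")
```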
Table 1: Mean points per question for each website
Questions | AAO | All About Vision | AOA | ASRS | EyeWiki | Macular.org | Mayo Clinic | Medical News Today | MedicineNet | NEI | WebMD | Wikipedia |
---|---|---|---|---|---|---|---|---|---|---|---|---|
What is AMD? | 3.50 | 3.50 | 3.00 | 3.75 | 4.00 | 3.50 | 3.50 | 3.50 | 3.75 | 3.75 | 3.50 | 3.75 |
What are the symptoms of AMD? | 3.50 | 4.00 | 4.00 | 4.00 | 4.00 | 4.00 | 4.00 | 3.75 | 4.00 | 3.50 | 3.75 | 3.75 |
What is the difference between dry and wet AMD? | 3.25 | 3.50 | 2.75 | 3.75 | 3.75 | 3.25 | 3.25 | 3.25 | 3.50 | 3.50 | 3.50 | 3.50 |
What are the stages of dry AMD? | 1.50 | 1.50 | 0.00 | 2.50 | 4.00 | 3.00 | 1.25 | 1.00 | 3.50 | 3.75 | 2.00 | 3.25 |
What are the stages of wet AMD? | 0.50 | 1.75 | 0.00 | 2.25 | 3.25 | 2.00 | 1.25 | 0.75 | 1.00 | 2.25 | 1.00 | 2.25 |
How is AMD diagnosed? | 3.50 | 1.75 | 1.50 | 3.50 | 3.75 | 2.50 | 3.75 | 3.50 | 3.25 | 3.50 | 3.50 | 3.25 |
What is optical coherence tomography? | 3.25 | 0.00 | 0.00 | 3.25 | 4.00 | 2.50 | 3.25 | 3.25 | 1.75 | 3.25 | 2.75 | 4.00 |
What causes dry AMD? | 2.50 | 3.00 | 1.50 | 2.75 | 4.00 | 3.25 | 3.25 | 1.75 | 3.25 | 3.00 | 2.75 | 3.75 |
What causes wet AMD? | 2.50 | 3.25 | 2.00 | 3.00 | 3.50 | 3.25 | 3.50 | 2.50 | 3.50 | 3.25 | 3.00 | 3.50 |
What is the incidence of AMD? | 1.25 | 3.50 | 3.25 | 0.50 | 3.50 | 2.50 | 0.00 | 3.00 | 1.75 | 0.50 | 1.75 | 3.25 |
Which gender is most commonly affected by AMD? | 0.00 | 3.75 | 4.00 | 3.75 | 4.00 | 3.75 | 1.00 | 1.00 | 3.75 | 0.00 | 3.75 | 0.00 |
Which age group is most commonly affected by AMD? | 3.75 | 4.00 | 3.75 | 3.25 | 4.00 | 3.50 | 3.75 | 4.00 | 3.75 | 3.75 | 3.50 | 4.00 |
What race is most susceptible to AMD? | 3.00 | 4.00 | 3.75 | 3.75 | 3.75 | 4.00 | 3.75 | 3.75 | 3.00 | 3.75 | 4.00 | 3.25 |
What are the risk factors for AMD? | 3.50 | 4.00 | 2.00 | 4.00 | 4.00 | 3.75 | 4.00 | 3.75 | 3.75 | 3.75 | 3.75 | 3.75 |
Can AMD be prevented? | 1.75 | 3.00 | 1.50 | 3.25 | 3.00 | 2.25 | 2.50 | 1.75 | 2.50 | 3.00 | 2.00 | 3.00 |
Is there a genetic component to AMD? | 2.75 | 3.25 | 0.00 | 1.00 | 4.00 | 3.50 | 2.75 | 1.50 | 1.50 | 3.50 | 2.75 | 4.00 |
Should I receive genetic testing for AMD? | 1.00 | 0.75 | 1.00 | 0.00 | 1.50 | 2.25 | 0.00 | 0.00 | 0.00 | 4.00 | 0.00 | 0.25 |
Are there any home monitoring devices to assist in the conversion of dry to wet AMD? | 1.50 | 2.50 | 0.00 | 1.00 | 1.75 | 0.75 | 0.75 | 1.00 | 0.75 | 1.00 | 2.25 | 1.25 |
Is smoking associated with AMD? | 3.75 | 4.00 | 0.00 | 3.75 | 3.75 | 3.75 | 3.75 | 3.75 | 3.75 | 3.75 | 3.75 | 3.75 |
What is the incidence of legal blindness from wet AMD? | 0.50 | 1.00 | 0.50 | 0.50 | 2.75 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2.75 |
What is an Amsler grid? | 4.00 | 4.00 | 0.00 | 1.00 | 3.00 | 3.00 | 3.75 | 3.75 | 3.25 | 3.00 | 3.00 | 4.00 |
What are ocular vitamins and what types of AMD might benefit from them? | 3.50 | 3.50 | 3.50 | 3.75 | 4.00 | 3.50 | 3.50 | 3.75 | 3.25 | 3.75 | 3.00 | 1.50 |
Is vision loss due to AMD reversible? | 1.50 | 3.00 | 3.25 | 2.75 | 2.50 | 2.75 | 2.50 | 2.50 | 2.75 | 2.75 | 2.00 | 3.25 |
What are the treatment options for AMD? | 3.75 | 3.75 | 2.75 | 3.50 | 4.00 | 3.75 | 3.25 | 3.50 | 3.50 | 3.50 | 3.25 | 3.25 |
How often should I be seen if I have AMD? | 0.75 | 0.75 | 0.00 | 1.50 | 0.25 | 0.25 | 0.75 | 1.25 | 0.75 | 2.25 | 0.75 | 0.00 |
What is an anti-VEGF injection and what are complications associated with anti-VEGF therapy? | 3.00 | 3.25 | 1.50 | 2.25 | 2.50 | 3.75 | 2.75 | 3.25 | 3.00 | 2.75 | 2.00 | 2.50 |
What are the complications associated with photodynamic therapy? Is this treatment still used? | 2.00 | 2.25 | 1.25 | 1.75 | 2.50 | 3.00 | 2.75 | 2.00 | 2.50 | 2.50 | 1.75 | 2.50 |
What is a fluorescein angiogram? | 3.75 | 2.50 | 0.00 | 3.00 | 4.00 | 2.75 | 3.25 | 3.25 | 2.25 | 3.25 | 3.00 | 4.00 |
An ICG angiogram? | 0.00 | 0.00 | 0.00 | 2.50 | 0.00 | 0.00 | 3.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
What is focal laser for AMD and is it still used as treatment? | 3.25 | 3.75 | 2.00 | 2.25 | 3.75 | 2.25 | 3.00 | 2.25 | 3.25 | 3.50 | 2.25 | 2.75 |
What is in the AREDS formulation? | 4.00 | 3.75 | 3.75 | 3.25 | 4.00 | 4.00 | 4.00 | 4.00 | 3.50 | 4.00 | 3.50 | 0.00 |
What is an intraocular telescope? | 0.00 | 4.00 | 1.50 | 0.00 | 0.00 | 0.50 | 3.25 | 3.00 | 2.00 | 2.50 | 0.50 | 0.00 |
What are low vision aids? | 4.00 | 4.00 | 2.00 | 1.00 | 0.50 | 2.00 | 3.50 | 1.00 | 0.75 | 3.75 | 2.25 | 2.75 |
Does the source provide pictures of AMD? | 4.00 | 0.00 | 1.25 | 3.75 | 4.00 | 0.75 | 1.50 | 1.25 | 1.50 | 0.00 | 0.25 | 3.00 |
Total Points | 84.25 | 94.50 | 57.25 | 85.75 | 105.25 | 89.50 | 90.00 | 81.50 | 84.25 | 94.25 | 80.75 | 89.75 |
Percentage (Max 136) | 62% | 69% | 42% | 63% | 77% | 66% | 66% | 60% | 62% | 69% | 59% | 66% |
Points, Mean | 2.48 | 2.78 | 1.68 | 2.52 | 3.10 | 2.63 | 2.65 | 2.40 | 2.48 | 2.77 | 2.38 | 2.64 |
SD | 1.33 | 1.30 | 1.40 | 1.23 | 1.25 | 1.20 | 1.25 | 1.28 | 1.24 | 1.27 | 1.21 | 1.37 |
CI 95% | 0.45 | 0.44 | 0.47 | 0.41 | 0.42 | 0.40 | 0.42 | 0.43 | 0.42 | 0.43 | 0.41 | 0.46 |
Accountability Analysis
The Journal of the American Medical Association (JAMA) instituted the use of four benchmarks in 1997 to identify websites that lacked accountability.22 The four key features that every website should list are: authorship, attribution or sources, currency or date of update, and disclosures.
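As a simple illustration of how this checklist can be applied, the sketch below tallies the four benchmarks per site. The EyeWiki entry reflects the result reported in this study (all four benchmarks met); "ExampleSite" is a hypothetical placeholder, not one of the study's records.

```python
# Illustrative tally of the four JAMA benchmarks per website (not the study's data).
BENCHMARKS = ("authorship", "attribution", "currency", "disclosure")

sites = {
    "EyeWiki": {"authorship": True, "attribution": True,
                "currency": True, "disclosure": True},
    # Hypothetical entry shown only to demonstrate the structure
    "ExampleSite": {"authorship": False, "attribution": True,
                    "currency": True, "disclosure": False},
}

for name, checks in sites.items():
    met = sum(checks[b] for b in BENCHMARKS)
    print(f"{name}: {met}/4 JAMA benchmarks met")
```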
Readability Analysis
Each website was evaluated for total words, average number of sentences, average number of syllables per word, and average words per sentence. The Flesch Reading Ease (FRE) score ranges from 0–100, with a higher score indicating easier-to-read text; a score of 70–80 is generally accepted as understandable by a seventh grader. Reading grade level was evaluated with the Flesch-Kincaid Grade Level, Gunning Fog Index, Coleman-Liau Index, and Simple Measure of Gobbledygook (SMOG) Index. The analysis was carried out using an online readability tool (Readable).23 Because PubMed is not a resource intended for the layperson, it was not included in this analysis.
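The study relied on the Readable tool rather than manual computation. As a rough illustration of the standard formulas such tools implement, the sketch below computes the Flesch Reading Ease and Flesch-Kincaid Grade Level; the syllable counter is a crude vowel-group heuristic, so its output will not exactly match Readable's.

```python
# Illustrative only: standard Flesch formulas, not the Readable tool's implementation.
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels (at least one per word).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # average words per sentence
    spw = syllables / len(words)   # average syllables per word
    return {
        "flesch_reading_ease": 206.835 - 1.015 * wps - 84.6 * spw,
        "flesch_kincaid_grade": 0.39 * wps + 11.8 * spw - 15.59,
    }

print(readability("Age-related macular degeneration is a chronic degenerative "
                  "disease of the retina that affects central vision."))
```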
Statistical Analysis
All statistical analyses were carried out with IBM SPSS Statistics for Windows, version 25.0, released 2017 (Armonk, NY: IBM Corp). Interobserver reproducibility and correlation calculations were performed using the Spearman correlation test. Data were analyzed using the Kruskal-Wallis test, with a post-hoc Dunn-Bonferroni test for pairwise comparisons. Statistical significance was set at P≤0.05 for all analyses.
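The analysis was performed in SPSS; the sketch below shows a rough equivalent in Python using SciPy and the third-party scikit-posthocs package. The score lists and Google ranks shown are hypothetical excerpts, not the full 34-question data.

```python
# Sketch of an equivalent analysis in Python (the study itself used SPSS).
from scipy.stats import kruskal, spearmanr
import scikit_posthocs as sp  # third-party package providing Dunn's post-hoc test

# Hypothetical excerpt; the real data have 34 per-question mean scores per website
scores = {
    "EyeWiki": [4.0, 4.0, 3.75, 4.0],
    "AOA":     [3.0, 4.0, 2.75, 0.0],
    "WebMD":   [3.5, 3.75, 3.5, 2.0],
}

# Kruskal-Wallis test for an overall difference in content scores between websites
H, p = kruskal(*scores.values())
print(f"Kruskal-Wallis H={H:.2f}, p={p:.3f}")

# Dunn's post-hoc test with Bonferroni correction for pairwise comparisons
pairwise_p = sp.posthoc_dunn(list(scores.values()), p_adjust="bonferroni")
print(pairwise_p)

# Spearman correlation, e.g. between Google rank and total content score
ranks = [1, 2, 3]                  # hypothetical Google ranks
quality = [105.25, 57.25, 80.75]   # total points from Table 2
rho, p = spearmanr(ranks, quality)
print(f"rho={rho:.3f}, p={p:.3f}")
```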
Results
Website Selection and Content Analysis
Thirteen websites were analyzed in our study. The only website that offered the possibility of changing the font size was ASRS. None of the 13 websites included an option for changing or reversing the contrast or for having the text read as audio. AOA did have a section on website accessibility with instructions on how to modify the internet browser to accomplish these tasks. The interobserver reproducibility was strong among 3 of 4 observers (r=0.747 between JS and NT, r=0.643 between JS and NP, r=0.686 between NP and NT, r=0.581 between JS and NY; P≤0.05) and weaker for the fourth (r=0.494 between NY and NP, r=0.377 between NY and NT; P≥0.05). The average questionnaire score for all websites was 90.23 (SD 17.56, CI 95% ±9.55) out of 136 possible points. There was a statistically significant difference in content accuracy and completeness between websites (H=25.456, P=0.01). After PubMed, EyeWiki was the top-scoring website, with an average of 105.25 points, representing 77% of total possible points. AOA was the lowest-scoring website with an average of 57.25 points, representing 42% of total possible points (Table 2). There was a significant difference between PubMed and the two lowest-scoring websites, AOA and WebMD (H=−44.250, P=0.003 and H=−36.625, P=0.049, respectively). Of the 34 questions used in the analysis, 14 showed a significant difference between websites (highlighted questions in Table 1). There was no significant correlation between the rank on Google.com and the content quality of the website (r=0.133, P=0.732).
Table 2:
Website | Total Points | Percentage (Max 136) | Points, Mean | SD | CI 95% |
---|---|---|---|---|---|
PubMed | 136 | 100% | 4 | 0 | |
EyeWiki | 105.25 | 77% | 3.1 | 1.25 | 0.42 |
All About Vision | 94.5 | 69% | 2.78 | 1.3 | 0.44 |
NEI | 94.25 | 69% | 2.77 | 1.27 | 0.43 |
Mayo Clinic | 90 | 66% | 2.65 | 1.25 | 0.42 |
Wikipedia | 89.75 | 66% | 2.64 | 1.37 | 0.46 |
Macular.org | 89.5 | 66% | 2.63 | 1.2 | 0.4 |
ASRS | 85.75 | 63% | 2.52 | 1.23 | 0.41 |
AAO | 84.25 | 62% | 2.48 | 1.33 | 0.45 |
MedicineNet | 84.25 | 62% | 2.48 | 1.24 | 0.42 |
Medical News Today | 81.5 | 60% | 2.4 | 1.28 | 0.43 |
WebMD | 80.75 | 59% | 2.38 | 1.21 | 0.41
AOA | 57.25 | 42% | 1.68 | 1.4 | 0.47 |
Accountability Analysis
Excluding PubMed from the analysis, only 1 website (9%) achieved all 4 JAMA benchmarks, and 2 websites (18%) achieved 3 of the 4 JAMA benchmarks (Table 3). The most commonly displayed attributes were currency and authorship (9 and 6 of 12, respectively), and only EyeWiki fulfilled the disclosure criterion. There was no correlation between the content quality of a website and its JAMA benchmarks (r=0.344, P=0.273).
Table 3:
JAMA Benchmarks | N (%) |
---|---|
4 Benchmarks | 1 (9%) |
3 Benchmarks | 2 (18%) |
2 Benchmarks | 3 (27%) |
1 Benchmark | 5 (45%) |
0 Benchmarks | 1 (9%) |
Attribution | 5 (45%) |
Authorship | 6 (55%) |
Currency | 9 (82%) |
Disclosure | 1 (9%) |
Readability Analysis
The mean Flesch Reading Ease Score was 46.15 (SD 11.86, CI 95% ±6.71). The mean reading grade for all websites was 11.44 (SD 1.75, CI 95% ±0.99; Table 4). There was a significant correlation between the FRE score and mean reading grade level (r=−0.958, P<0.001). There was a significant difference between the mean reading grade level of the websites analyzed (H=37.18, P<0.001). No significant correlation was found between website quality and the mean reading grade (r=0.392, P=0.207).
Table 4:
Measure | AAO | All About Vision | AOA | ASRS | EyeWiki | Macular.org | Mayo Clinic | Medical News Today | MedicineNet | NEI | WebMD | Wikipedia |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Flesch Reading Ease | 64.2 | 35.9 | 48.9 | 50.2 | 19.6 | 41.3 | 51.7 | 58.7 | 48.5 | 57.2 | 45.2 | 32.40 |
Mean Reading Grade | 8.60 | 13.03 | 11.53 | 9.33 | 14.53 | 12.63 | 10.60 | 9.48 | 11.55 | 10.38 | 12.08 | 13.58 |
Mean Reading Grade SD | 1.14 | 0.66 | 1.02 | 0.91 | 0.98 | 0.82 | 1.05 | 0.99 | 0.93 | 1.11 | 0.91 | 0.79 |
Mean Reading Grade CI (95%) | 1.12 | 0.64 | 1.00 | 0.89 | 0.96 | 0.81 | 1.03 | 0.97 | 0.91 | 1.09 | 0.90 | 0.77 |
Discussion
Accessibility to high-quality educational material is an essential tool for patients to take an active role in the management of their health.24 However, information regarding the quality of the material found online is often scarce. This study aimed to evaluate the accuracy, accountability, and readability of AMD information available online.
AMD is the leading cause of vision loss in the elderly population of the United States.25 The principal goal in the management of patients with AMD is to slow the progression of the disease. Early diagnosis and treatment are fundamental to optimizing vision, especially in patients with wet AMD.26 Treatment with intravitreal injections of anti-vascular endothelial growth factor (anti-VEGF) has been shown not only to slow the progression of the disease but also to improve visual acuity.26 However, this requires a close patient-physician relationship with continuous follow-up and consistent therapy cycles. The use of inaccurate online health information by patients can strain this relationship and lead to treatment non-compliance.27,28 When patients value online information above the physician’s recommendation, the relationship can become conflicted, making it more likely that patients will disregard the physician’s expertise.29 In patients with AMD, one of the most common reasons for discontinuation of anti-VEGF injections is disbelief in the benefits of this treatment.30 A possible cause is the use of online resources that are inaccurate or not written at an appropriate reading level. Misinterpretation of information, especially about treatment options and their risks and benefits, can affect a patient’s medical decision-making and willingness to undergo repeated treatment.
Overall, there was a significant difference in the quality of the information on AMD provided by the websites analyzed. In particular, AOA and WebMD had significantly poorer content compared with PubMed. EyeWiki scored the highest of all websites evaluated, receiving 105.25 of 136 points (77.4%). While this website performed better than the others, it is still not an ideal source. A recent study that analyzed online information on diabetic retinopathy found that this same website scored only 49%, while the highest-scoring website, Wikipedia, earned 74% of the maximum points. The difference in content quality across topics from the same source further demonstrates a lack of standardization in written material.21 Moreover, websites were found to have deficits on specific topics important to patient care, including AMD diagnosis, risk factors, potential treatments, and low vision aids. Poor-quality information can result in requests for unnecessary tests or treatments and negatively impact the patient-physician relationship.31 To prevent this, it is important for physicians to be aware of the limitations of online information so they can appropriately educate their patients.
The Google rank is an important predictor of which information will reach patients.32 It is well known that websites in higher positions on search pages are associated with reputability and, therefore, higher traffic.33 In this study, there was no correlation between the position on Google.com and the quality of the content. Wikipedia, a free encyclopedia that can be edited by its users, generally ranks among the top ten results when health-related keywords are searched on different search engines.34 However, this well-known source of information ranked fifth of the 12 websites in our quality analysis, with an overall score of 89.75 of 136. The misconception that search rank reflects website credibility and quality must be corrected, and patients looking for online health information should be made aware that this metric cannot be used as an indicator of quality.
A validated tool to assess the accountability of each site is the set of JAMA benchmarks. Only EyeWiki, a website of the American Academy of Ophthalmology, met all four benchmarks, while most websites (9 of 12) met two or fewer. This indicates that major medical sites reporting information on AMD have poor accountability. There was no correlation between JAMA benchmarks and website quality (r=0.344, P=0.273) or Google rank (r=0.133, P=0.732), indicating that these benchmarks cannot be used to identify which websites have reliable information. Interestingly, these results are not surprising. A study on online information regarding diabetic retinopathy found that, of the eleven websites analyzed, none met all four JAMA benchmarks.21 That same study also showed no correlation between JAMA benchmarks and website quality or Google rank, indicating that lack of accountability may be a generalized issue in online medical resources.
Patient education materials are often written at a level above the recommended national average. One of the primary findings of this study is that the patient information available online for AMD is difficult for a layperson to comprehend. On average, an eleventh-grade education was needed to digest the information, which exceeds the NAAL recommendation. The most complex website was EyeWiki, with a mean reading grade of 14.53; this is not unexpected, as EyeWiki is designed primarily for consumption by eye care providers. While EyeWiki received the highest quality score, no correlation was found between the average score a website received and its reading grade level. This demonstrates that, regardless of quality and source, ophthalmological websites are generally difficult to read. To encourage patient understanding and education, the material provided must be easy to read, and physicians and professional associations should take this into consideration when creating material to supplement patient education. Interestingly, only one website offered a large-font format of its text, and none of the websites analyzed included any other low-vision accessibility features. Given that patients with AMD can have significantly reduced vision, websites should consider offering these features to improve the legibility of their resources.
As the use of the internet for health-related content continues to increase, action must be taken to ensure that patients are accessing high-quality resources. While certain tools, such as the HONcode, have been developed to audit online information, there is no methodology in place to regulate the material presented online.32 Therefore, both national health organizations and individual physicians should help direct patients to validated resources. Organizations can ensure information on their website is up to date, complete, and importantly, written at a sixth-grade level or below. Meanwhile, physicians can complement patient education by providing written materials as well as a list of validated sources.
This study had some inherent limitations. The keyword “age-related macular degeneration” was used to select websites, which may differ from the terminology used by patients. Additionally, because this study centered on AMD, its findings may not be applicable to other ophthalmic diseases. Lastly, the interobserver reproducibility showed that 3 of the 4 graders had strong correlations with one another, while the fourth had weak to moderate correlations. Taken together, the discrepancy between observers may be related both to the subjective nature of the grading and to the small number of observations (n=13).
An important source of medical education materials for patients is the material provided through electronic medical records (EMRs). Companies such as Healthwise, Inc. (Boise, ID) can be integrated directly into the EMR to provide printouts of medical information at the end of a patient visit or hospitalization. Future studies should analyze the content being provided and compare it to other available resources, such as online material.
Conclusion
This study revealed that, in general, patient education material on AMD is of low quality and accountability. Websites available to patients contain substantially less information than reference sources commonly utilized by physicians, such as PubMed. In addition, the sources analyzed were written at a reading level higher than the nationally recommended level. These results suggest that health professionals should advise patients against using the internet as their primary source of information and should consider characteristics such as readability when providing written material.
Funding/Support:
The Bascom Palmer Eye Institute received funding from NIH Core Grant P30EY014801, Department of Defense Grant #W81XWH-13-1-0048, and a Research to Prevent Blindness Unrestricted Grant. The Flaum Eye Institute received funding from a Research to Prevent Blindness Unrestricted Grant, and the Center for Visual Sciences received funding from NIH Core Grant P30 EY001319.
Footnotes
Conflict of Interest: JS is a consultant for Alcon; Allergan PLC; Regeneron; and Alimera Sciences, Inc. AEK receives grant funding from Second Sight and Genentech and is a consultant for Alimera Sciences, Inc.; Allergan PLC, Bausch Health, Genentech, and Regeneron. None of the other authors report any disclosures.
References
1. Al-Zamil WM, Yassin SA. Recent developments in age-related macular degeneration: a review. Clin Interv Aging. 2017;12:1313–1330. doi:10.2147/CIA.S143508
2. García-Layana A, Cabrera-López F, García-Arumí J, Arias-Barquet L, Ruiz-Moreno JM. Early and intermediate age-related macular degeneration: update and clinical review. Clin Interv Aging. 2017;12:1579–1587. doi:10.2147/CIA.S142685
3. Wong WL, Su X, Li X, et al. Global prevalence of age-related macular degeneration and disease burden projection for 2020 and 2040: a systematic review and meta-analysis. Lancet Glob Health. 2014. doi:10.1016/S2214-109X(13)70145-1
4. Taylor DJ, Hobby AE, Binns AM, Crabb DP. How does age-related macular degeneration affect real-world visual ability and quality of life? A systematic review. BMJ Open. 2016;6(12):e011504. doi:10.1136/bmjopen-2016-011504
5. Powell J, Inglis N, Ronnie J, Large S. The Characteristics and Motivations of Online Health Information Seekers: Cross-Sectional Survey and Qualitative Interview Study. J Med Internet Res. 2011;13(1):e20. doi:10.2196/jmir.1600
6. Sillence E, Briggs P, Harris PR, Fishwick L. How do patients evaluate and make use of online health information? Soc Sci Med. 2007;64(9):1853–1862. doi:10.1016/j.socscimed.2007.01.012
7. Ybarra M, Suman M. Reasons, assessments and actions taken: sex and age differences in uses of Internet health information. Health Educ Res. 2008;23(3):512–521. doi:10.1093/her/cyl062
8. Shuyler KS, Knight KM. What Are Patients Seeking When They Turn to the Internet? Qualitative Content Analysis of Questions Asked by Visitors to an Orthopaedics Web Site. J Med Internet Res. 2003;5(4):e24. doi:10.2196/jmir.5.4.e24
9. Paolino L, Genser L, Fritsch S, de’ Angelis N, Azoulay D, Lazzati A. The Web-Surfing Bariatic Patient: the Role of the Internet in the Decision-Making Process. Obes Surg. 2015;25(4):738–743. doi:10.1007/s11695-015-1578-x
10. Lagan BM, Sinclair M, Kernohan WG. What Is the Impact of the Internet on Decision-Making in Pregnancy? A Global Study. Birth. 2011;38(4):336–345. doi:10.1111/j.1523-536X.2011.00488.x
11. Eysenbach G, Köhler C. How do consumers search for and appraise health information on the world wide web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ. 2002;324(7337):573–577. doi:10.1136/bmj.324.7337.573
12. Health Online 2013. Pew Research Center. http://www.pewinternet.org/2013/01/15/health-online-2013/. Accessed January 19, 2019.
13. Dutton WH, Blank G. Cultures of the Internet: The Internet in Britain. http://oxis.oii.ox.ac.uk/. Accessed August 11, 2019.
14. Ashraf AA, Colakoglu S, Nguyen JT, et al. Patient involvement in the decision-making process improves satisfaction and quality of life in postmastectomy breast reconstruction. J Surg Res. 2013;184(1):665–670. doi:10.1016/j.jss.2013.04.057
15. Nutbeam D. The evolving concept of health literacy. Soc Sci Med. 2008;67(12):2072–2078. doi:10.1016/j.socscimed.2008.09.050
16. The Health Literacy of America’s Adults: Results From the 2003 National Assessment of Adult Literacy. https://files.eric.ed.gov/fulltext/ED493284.pdf. Accessed August 11, 2019.
17. Rikard RV, Thompson MS, McKinney J, Beauchamp A. Examining health literacy disparities in the United States: a third look at the National Assessment of Adult Literacy (NAAL). BMC Public Health. 2016;16(1):975. doi:10.1186/s12889-016-3621-9
18. Kutner M, Greenberg E, Jin Y, Paulsen C. The Health Literacy of America’s Adults: Results From the 2003 National Assessment of Adult Literacy. 2006. https://nces.ed.gov/pubs2006/2006483.pdf. Accessed January 20, 2019.
19. Daraz L, Morrow AS, Ponce OJ, et al. Readability of Online Health Information: A Meta-Narrative Systematic Review. Am J Med Qual. 2018;33(5):487–492. doi:10.1177/1062860617751639
20. Ivastinovic D, Wackernagel W, Wedrich A. Accuracy of Freely Available Information About Rhegmatogenous Retinal Detachment on the Internet. JAMA Ophthalmol. 2019;137(1):113. doi:10.1001/jamaophthalmol.2018.4682
21. Kloosterboer A, Yannuzzi NA, Patel NA, Kuriyan AE, Sridhar J. Assessment of the Quality, Content, and Readability of Freely Available Online Information for Patients Regarding Diabetic Retinopathy. JAMA Ophthalmol. August 2019. doi:10.1001/jamaophthalmol.2019.3116
22. Silberg WM, Lundberg GD, Musacchio RA. Assessing, Controlling, and Assuring the Quality of Medical Information on the Internet. JAMA. 1997;277(15):1244. doi:10.1001/jama.1997.03540390074039
23. Readable. https://readable.io/. Accessed January 29, 2019.
24. Shepperd S, Charnock D, Gann B. Helping patients access high quality health information. BMJ. 1999;319(7212):764–766. doi:10.1136/bmj.319.7212.764
25. Mehta S. Age-Related Macular Degeneration. doi:10.1016/j.pop.2015.05.009
26. Boyer DS, Antoszyk AN, Awh CC, Bhisitkul RB, Shapiro H, Acharya NR. Subgroup Analysis of the MARINA Study of Ranibizumab in Neovascular Age-Related Macular Degeneration. Ophthalmology. 2007;114(2):246–252. doi:10.1016/j.ophtha.2006.10.045
27. Sommerhalder K, Abraham A, Zufferey MC, Barth J, Abel T. Internet information and medical consultations: Experiences from patients’ and physicians’ perspectives. Patient Educ Couns. 2009;77(2):266–271. doi:10.1016/j.pec.2009.03.028
28. Linn AJ, van Weert JCM, Gebeyehu BG, et al. Patients’ Online Information-Seeking Behavior Throughout Treatment: The Impact on Medication Beliefs and Medication Adherence. Health Commun. 2018. doi:10.1080/10410236.2018.1500430
29. Tan SSL, Goonawardene N. Internet health information seeking and the patient-physician relationship: A systematic review. J Med Internet Res. 2017;19(1). doi:10.2196/jmir.5729
30. Polat O, İnan S, Özcan S, et al. Factors affecting compliance to intravitreal anti-vascular endothelial growth factor therapy in patients with age-related macular degeneration. Turk Oftalmoloji Derg. 2017;47(4):205–210. doi:10.4274/tjo.28003
31. Ditzler N, Greenhawt M. Influence of health literacy and trust in online information on food allergy quality of life and self-efficacy. Ann Allergy Asthma Immunol. 2016;117(3):258–263.e1. doi:10.1016/j.anai.2016.07.011
32. Fahy E, Hardikar R, Fox A, Mackay S. Quality of patient health information on the Internet: reviewing a complex and evolving landscape. Australas Med J. 2014;7(1):24. doi:10.4066/AMJ.2014.1900
33. Litsa T. Is it important for SEO to rank first in 2018? Search Engine Watch. https://searchenginewatch.com/2018/08/17/is-it-important-for-seo-to-rank-first-in-2018/. Published 2018. Accessed January 29, 2019.
34. Laurent MR, Vickers TJ. Seeking health information online: does Wikipedia matter? J Am Med Inform Assoc. 2009;16(4):471–479. doi:10.1197/jamia.M3059