Indian Journal of Otolaryngology and Head & Neck Surgery. 2023 Nov 9;76(1):987–991. doi: 10.1007/s12070-023-04341-9

Digital Health Literacy: Evaluating the Readability and Reliability of Cochlear Implant Patient Information on the Web

Vishak MS, Adwaith Krishna Surendran, Nandini B Krishnan, Kalaiarasi Raja
PMCID: PMC10908950  PMID: 38440512

Abstract

Hearing aids and implants are used to treat hearing loss, with cochlear implants being the most successful option for severe sensorineural hearing loss. Patients frequently use the internet as a trusted source of clinical information before committing to any therapeutic procedure, including receiving a cochlear implant. A health resource’s readability and reliability influence its value to patients: readability refers to how easily language can be understood, whereas reliability refers to the correctness and consistency of the information presented. The JAMA benchmarks and the DISCERN tool were used to assess the reliability of the websites retrieved. For readability analysis, the Flesch Reading Ease (FRE), Flesch-Kincaid Grade (FKG) and Gunning Fog Index (GFI) were chosen. The acceptable readability thresholds were set at < 7 for the FKG and ≥ 80.0 for the FRE, with a GF score over 17 regarded as equivalent to college-level education. The readability scores varied across the sources, suggesting that a range of comprehension levels is required to understand the cochlear implant patient information found on Google. A statistically significant difference in DISCERN score was detected between the groups (p = 0.008), and the mean DISCERN score was significantly higher in hospital-generated sources than in industry sources (3.13 ± 0.69 vs. 2.11 ± 0.78, p = 0.03).

Keywords: Cochlear implant, Patient information, Health literacy, Readability and reliability

Introduction

Hearing loss, whether sensorineural or conductive, is a serious health concern: it impairs quality of life and can lead to mutism when present in childhood [1]. Nearly 2.5 billion people are at risk of developing hearing loss by 2050, with 700 million requiring rehabilitation [2]. Hearing aids and implants are used to treat hearing loss, and cochlear implants are the only reliable modality for treating severe sensorineural hearing loss [3]. Patients tend to use the internet as a trusted source of clinical information before committing to any clinical procedure, in this case receiving a cochlear implant. With 60% of the world’s population having internet access, and the figure reaching 90% in western societies, the general public has instant access to large amounts of unregulated data [4, 5].

Google is the most popular search engine, occupying about 92% of the market share, compared with alternatives such as Bing and Yahoo [6, 7]. Physicians hold differing opinions on patients using the internet to understand their symptoms and diagnoses [8]. The validity of internet sources used by patients to learn about their illnesses and associated therapies has been investigated in a variety of studies before [9–12]. Nevertheless, there is little research on the accuracy of online patient health information in otorhinolaryngology [13], and the literature on cochlear implants available on the internet has not been thoroughly examined before.

Readability and reliability of a health resource affect its value to patients. Readability is the ease with which text is understood, while reliability is the accuracy and consistency of the information provided. Because a lack of either element could affect patients’ knowledge and, eventually, their medical care, a number of objective tools and scoring systems have been created. The aim of this study is to objectively assess the readability and reliability of information on cochlear implants available on the internet.

Methodology

In this study, the selected search engine was used to retrieve information available on cochlear implants to the general public.

Using Google Chrome, version 117.0.5938, we conducted the search on Google (http://www.google.com) after erasing cookies and download history; browsing was done in “incognito” mode to avoid biases arising from previous searches.

The search keyword used was “cochlear implants patient information”, and the first 50 websites returned by the search engine were obtained. Duplicates and inaccessible websites were excluded and replaced with the next consecutive results, so that a total of 50 websites with relevant health information were available for analysis. The exclusion criteria were: (1) exclusively audio- or video-based presentation of information; (2) presence of banner advertisements or sponsored links; (3) blocked sites or sites denying direct access (requiring login); (4) social media websites. The remaining websites were included for readability and reliability analysis.
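The screening step described above can be sketched as follows. The URLs and the excluded-domain list here are illustrative placeholders, not the study's actual dataset or rules, and a real pipeline would also need manual checks for the advertisement and login criteria.

```python
# Hypothetical sketch of the website screening step: collect search-result
# URLs, drop duplicates, skip excluded domains, and stop once the target
# number of unique, accessible sites is reached. All URLs are made up.
from urllib.parse import urlparse

EXCLUDED_DOMAINS = {"facebook.com", "youtube.com"}  # e.g. social media / video-only

def screen_results(urls, target=50):
    """Return up to `target` unique URLs that pass the exclusion criteria."""
    seen, included = set(), []
    for url in urls:
        domain = urlparse(url).netloc.removeprefix("www.")
        if url in seen or domain in EXCLUDED_DOMAINS:
            continue  # duplicate or excluded category
        seen.add(url)
        included.append(url)
        if len(included) == target:
            break
    return included

results = screen_results([
    "https://www.example-hospital.org/cochlear-implants",
    "https://www.example-hospital.org/cochlear-implants",   # duplicate, dropped
    "https://www.youtube.com/watch?v=abc123",               # video site, excluded
    "https://www.example-university.edu/ent/ci-patient-info",
])
```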

Quality Assessment Tools

The JAMA benchmarks [14] and the DISCERN tool [15] were used to evaluate the reliability of the included websites.

The JAMA benchmarks tool, published by the Journal of the American Medical Association, evaluates websites on the following criteria: authorship (availability of data on authors, contributors, affiliations, and pertinent credentials); attribution (availability of clear references and sources from which the content was cited); currency (indication of the dates on which the content was posted and updated); and disclosure (availability of data on ownership, sponsorship, advertising, underwriting, commercial funding or support sources, and any potential conflicts of interest). When a requirement was satisfied (with a “yes” response), the website received one point; if not, it received zero points. The aggregate JAMA score therefore ranges from 0 (no criteria met) to 4 points (all 4 criteria met).
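The scoring rule above is simple enough to express directly. A minimal sketch, with illustrative field names for the four criteria:

```python
# JAMA benchmark scoring as described in the text: one point per criterion
# met, for an aggregate of 0-4. The dictionary keys are illustrative.
JAMA_CRITERIA = ("authorship", "attribution", "currency", "disclosure")

def jama_score(website):
    """`website` maps each criterion name to True/False; returns 0-4."""
    return sum(1 for c in JAMA_CRITERIA if website.get(c, False))

site = {"authorship": True, "attribution": False, "currency": True, "disclosure": True}
print(jama_score(site))  # 3
```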

The DISCERN tool consists of 16 questions divided into 3 sections: questions 1–8 investigate the trustworthiness of websites as sources of information about particular therapies, questions 9–15 examine treatment choices, and question 16 assesses the overall quality score. Each question is given a score between 1 and 5, with 5 representing a high-quality website and 1 a poor one.
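Since the study reports mean DISCERN scores on the 1–5 scale, a plausible summary of a rater's 16 responses is their mean; this sketch assumes that convention (the paper does not state the exact aggregation):

```python
# DISCERN summary sketch: validate 16 ratings of 1-5 and summarise them by
# the mean, matching the 1-5 mean scores reported in the Results. Question
# 16 itself is the rater's overall quality judgement.
def discern_summary(ratings):
    """`ratings` is a list of 16 integers in 1..5; returns the mean rating."""
    if len(ratings) != 16 or any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("DISCERN requires 16 ratings, each between 1 and 5")
    return sum(ratings) / 16
```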

Readability of the websites was evaluated using a free and validated readability calculator tool available online [16]. This website evaluates the text using a variety of widely used analytical measures, including the Gunning Fog Index (GFI), Coleman-Liau Index, Flesch Kincaid Grade (FKG), Automated Readability Index (ARI), Simple Measure of Gobbledygook (SMOG), and Flesch Reading Ease (FRE). For readability analysis, the FRE, FKG and GFI were chosen. The acceptable readability level was set at < 7 for the FKG and ≥ 80.0 for the FRE, with a GF score over 17 regarded as equivalent to college-level education [9, 17, 18].
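The three chosen measures are computable from raw text statistics using their standard published formulas. The sketch below uses a crude vowel-group syllable heuristic, so its scores will only approximate those of the dedicated calculator the study used:

```python
# FRE, FKG and GFI from first principles.
#   FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
#   FKG = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
#   GFI = 0.4*[(words/sentences) + 100*(complex_words/words)]
# "Complex" words have three or more syllables.
import re

def count_syllables(word):
    """Crude vowel-group heuristic; real tools use pronunciation dictionaries."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)
    wps = len(words) / sentences   # words per sentence
    spw = syllables / len(words)   # syllables per word
    return {
        "FRE": round(206.835 - 1.015 * wps - 84.6 * spw, 1),
        "FKG": round(0.39 * wps + 11.8 * spw - 15.59, 1),
        "GFI": round(0.4 * (wps + 100 * complex_words / len(words)), 1),
    }
```

Higher FRE means easier text, while higher FKG and GFI mean more years of schooling are needed, which is why the acceptability thresholds point in opposite directions.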

Results

The hospital-generated and discussion categories contained the most websites, with 23 and 12 respectively (Tables 1, 2 and 3).

Table 1.

Website numbers for the article categories (N = 50)

Sl. no. Article categories n (%)
1 Academic 6 (12)
2 Hospital generated 23 (46)
3 Discussion 12 (24)
4 Industry 9 (18)

Table 2.

Mean scores depicting the website quality

S. no. Scores Mean (SD)
1 Flesch reading ease score (FRE) 54.6 (16.6)
2 Flesch-Kincaid Grade level (FKG) 7.46 (2.6)
3 Gunning fog score 7.4 (2.4)
4 Discern score 3 (0.96)
5 JAMA score 2 (1.3)

Table 3.

Mean scores for various categories depicting the website quality

S. no. Categories Mean scores (SD)
FRE FKG GF DISCERN JAMA
1 Hospital generated 47.9 (18.3) 8.5 (2.6) 8.5 (2.3) 3.7 (1.2) 3.2 (1.3)
2 Academic sources 56.6 (17.5) 7.1 (2.4) 7.4 (2.3) 3.1 (0.7) 1.9 (1.1)
3 Discussion 54.8 (26.7) 7.2 (3.6) 6.8 (3.1) 3.1 (1.1) 2.2 (3.4)
4 Industry 49 (10.7) 8 (1.7) 7.6 (2) 2.1 (0.8) 1.4 (1.2)

Assessment of Readability

We first calculated the readability scores for each unique source in the first 50 search results using the FRE and FKG formulas. The FRE score measures ease of comprehension, with higher scores indicating easier readability, while the FKG score estimates the grade level required to understand the content. The statistical test used was one-way ANOVA.
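As an illustration of the test applied here, the one-way ANOVA F statistic can be computed from first principles. This is only a sketch of the method, with made-up numbers; the study's actual statistical software and raw data are not stated.

```python
# One-way ANOVA F statistic: the ratio of between-group to within-group
# mean squares. A large F (small p) indicates the group means differ.
def one_way_anova_f(groups):
    """`groups` is a list of lists of numbers; returns the F statistic."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # between-group sum of squares (k - 1 degrees of freedom)
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # within-group sum of squares (n - k degrees of freedom)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))
```

In practice one would use a library routine (e.g. a standard statistics package) to obtain the p value as well as F.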

Flesch Reading Ease Score (FRE)

There was no statistical difference detected in FRE scores between the categories (p = 0.66) (Table 4).

Table 4.

Mean FRE scores

S. no. Categories Mean FRE scores (SD) P value
1 Hospital generated 47.9 (18.3) 0.66
2 Academic sources 56.6 (17.5)
3 Discussion 54.8 (26.7)
4 Industry 49 (10.7)

Flesch-Kincaid Grade Level (FKG)

There was no statistical difference detected in FKG scores between the categories (p = 0.62) (Table 5).

Table 5.

Mean FKG scores

S. no. Categories Mean FKG scores (SD) P value
1 Hospital generated 8.5 (2.6) 0.62
2 Academic sources 7.1 (2.4)
3 Discussion 7.2 (3.6)
4 Industry 8 (1.7)

Gunning Fog Score (GF)

There was no statistical difference detected in Gunning Fog scores between the categories (p = 0.59).

The readability scores vary across the sources, suggesting that a range of comprehension levels is required to understand the cochlear implant patient information found on Google (Table 6).

Table 6.

Mean GF scores

S. no. Categories Mean gunning fog scores (SD) P value
1 Hospital generated 8.5 (2.3) 0.55
2 Academic sources 7.4 (2.3)
3 Discussion 6.8 (3.1)
4 Industry 7.6 (2)

Assessment of Reliability

We assessed the reliability of the information in the selected sources using two criteria: the JAMA criteria and the DISCERN tool. The JAMA criteria evaluate the presence of certain key attributes such as authorship, references, and disclosure of conflicts of interest. The DISCERN tool, on the other hand, assesses the quality and reliability of health information specifically.

DISCERN Score

A statistically significant difference in DISCERN score was detected between the groups (p = 0.008). The mean DISCERN score was significantly higher in academic sources than in industry sources (3.67 ± 1.21 vs. 2.11 ± 0.78, p = 0.01) (Table 7).

Table 7.

Mean discern scores

S. no. Categories Mean discern scores (SD) P value
1 Hospital generated 3.7 (1.2) 0.008
2 Academic sources 3.1 (0.7)
3 Discussion 3.1 (1.1)
4 Industry 2.1 (0.8)

The mean DISCERN score was significantly higher in hospital-generated sources than in industry sources (3.13 ± 0.69 vs. 2.11 ± 0.78, p = 0.03).

There was no significant difference in reliability between academic and hospital-generated sources.

JAMA Criteria

There was no statistical difference detected in JAMA score between the groups (p = 0.051) (Table 8).

Table 8.

Mean JAMA scores

S. no. Categories Mean JAMA scores (SD) P value
1 Hospital generated 3.2 (1.3) 0.508
2 Academic sources 1.9 (1.1)
3 Discussion 2.2 (3.4)
4 Industry 1.4 (1.2)

Discussion

The internet provides the general public with access to an extensive amount of health-related information, which patients utilise in addition to their regular medical care. Though searching for health information is one of the most common online activities, professional guidance is not often sought [19, 20]. Information obtained from the internet should serve as a resource supporting management rather than a hurdle for practitioners, which is why medical practitioners object to patients cross-questioning them based on data found online [21]. In recent years, the quality and readability of available internet resources in numerous specialities, including otorhinolaryngology, have been found to be subpar [22–24]. Readers could be misled by poor-quality content, and this lack of reliable information raises the potential for cyberchondria, described as an excessive amount of tension or anxiety brought on by a patient’s internet search of health concerns [25]. This led to the creation of tools such as the JAMA benchmarks and DISCERN, which are used to assess the content provided on medical sites.

The results of the study show that the quality of information available on cochlear implants is average, but with poor readability. While the mean DISCERN and JAMA scores were 3/5 and 2/4, pointing toward average reliability, the mean FRE and FKG scores were 53.74 and 7.46 respectively, against the acceptable scores of ≥ 80.0 for the FRE and < 7 for the FKG. The mean GF score was 7.41, which is quite acceptable, as such websites are considered readable for patients with education up to 8th grade (GF score = 8).
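Using the mean scores reported above, the comparison against the Methodology's acceptability thresholds amounts to a simple check:

```python
# The study's reported mean readability scores checked against the
# acceptability thresholds stated in the Methodology (FRE >= 80, FKG < 7).
THRESHOLDS = {"FRE": lambda s: s >= 80.0, "FKG": lambda s: s < 7}
reported = {"FRE": 53.74, "FKG": 7.46}  # mean scores from the text above

acceptable = {measure: THRESHOLDS[measure](score) for measure, score in reported.items()}
print(acceptable)  # {'FRE': False, 'FKG': False} -- both fall short of target
```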

The websites were classified into 4 groups, Academic, Hospital-generated, Discussion and Industry, similar to previous studies on reliability and readability assessment [9, 26]. The fact that academic and hospital-generated websites had higher DISCERN and JAMA scores than the industry sector, which also appeared less frequently, is to be appreciated, as it contrasts with the internet becoming a competitive marketplace in other fields of patient care [27]. Most of the websites in the academic, hospital-generated and discussion groups offered a wide range of medical information on cochlear implants, the advantages and disadvantages of receiving one, and potential alternatives, while some focused on just the device and its alternatives. Websites labelled as “academic” (n = 6) were found to be of the highest quality in our investigation. The creators of academic websites are aware of the importance of the dependability, validity, and transparency of health information. Since these websites are the online representations of well-known journals, they operate at the highest levels of professionalism and openness, and they take great care with the information they provide because their readers matter to them.

Our study had limitations: the search was limited to the search engine “Google”, and only the first 50 websites were used for analysis. However, “Google” remains the most favoured search engine, with many users primarily engaging with the first few search results.

It is vital to improve the quality of online resources for cochlear implant information. These websites must follow strict medical guidelines and present data validated by a range of assessment methods. It is essential for them to actively pursue accreditation through recognized criteria, as this will guarantee dependability. Additionally, we intend to compile a list of the best websites in this field, which we will distribute to experts in otorhinolaryngology; this can be a useful tool to help their patients locate reliable information on cochlear implants. After our study is approved, we also intend to share it with academic and professional organizations.

Author Contributions

All the authors have contributed equally to the study. VMS is the major contributor in writing the manuscript. AKS and KR participated in writing and editing with VMS, and NBK took part in data interpretation along with VMS.

Funding

No funding was required for the study.

Data Availability

The datasets generated and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Declarations

Conflict of interest

The authors declare that they have no competing interest.

Ethical Approval

Ethics approval was not required, as the study was conducted using open-access data on the internet and no actual human/animal data were used.

Consent for Publication

Written informed consent for publication of their clinical details and/or clinical images was obtained from the patient. A copy of the consent form is available for review by the Editor of this journal.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

1. Singh S, Jain S. Factors associated with deaf-mutism in children attending special schools of rural central India: a survey. J Family Med Primary Care. 2020;9(7):3256–3263. doi: 10.4103/jfmpc.jfmpc_222_20.
2. World report on hearing. Geneva: World Health Organization; 2021. Licence: CC BY-NC-SA 3.0 IGO.
3. Roche JP, Hansen MR. On the horizon: cochlear implant technology. Otolaryngol Clin North Am. 2015;48(6):1097–1116. doi: 10.1016/j.otc.2015.07.009.
4. Internet World Stats—Usage and Population Statistics. Available online: https://www.internetworldstats.com/stats.html
5. European Commission (2015) Flash Eurobarometer 404 (European Citizens’ Digital Health Literacy). In: Cologne GDA (ed).
6. State of the Internet. The state of the Internet in New Zealand. Available online: https://internetnz.nz/sites/default/files/SOTI%20FINAL.pdf
7. Search Engine Market Share Worldwide. http://gs.statcounter.com/search-engine-market-share
8. Van Riel N, Auwerx K, Debbaut P, et al. The effect of Dr Google on doctor-patient encounters in primary care: a quantitative, observational, cross-sectional study. BJGP Open. 2017;1:bjgpopen17X100833. doi: 10.3399/bjgpopen17X100833.
9. Wu J, Hunt L, Wood AJ. Readability and reliability of rhinology patient information on Google. Aust J Otolaryngol. 2021. doi: 10.21037/ajo-21-2.
10. Murphy B, Irwin S, Condon F. Readability and quality of online information for patients pertaining to revision knee arthroplasty: an objective analysis. Surgeon. 2022;20(6):e366–e370. doi: 10.1016/j.surge.2021.12.009.
11. Saleh D, Fisher JH, Provencher S, Liang Z, Ryerson CJ. A systematic evaluation of the quality, accuracy, and reliability of internet websites about pulmonary arterial hypertension. Ann Am Thorac Soc. 2022;19(8):1404–1413. doi: 10.1513/AnnalsATS.202103-325OC.
12. Al-Ak’hali MS, Fageeh HN, Halboub E, Alhajj MN, Ariffin Z. Quality and readability of web-based Arabic health information on periodontal disease. BMC Med Inform Decis Mak. 2021;21(1):41. doi: 10.1186/s12911-021-01413-0.
13. Maung JKH, Roshan A, Sood S. P183: FESS on the internet. Otolaryngol Head Neck Surg. 2006;135:P272–P273. doi: 10.1016/j.otohns.2006.06.1218.
14. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the internet: caveant lector et viewor–let the reader and viewer beware. JAMA. 1997;277(15):1244–1245. doi: 10.1001/jama.1997.03540390074039.
15. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Commun Health. 1999;53(2):105–111. doi: 10.1136/jech.53.2.105.
16. Test document readability. Available from: https://www.webfx.com/tools/read-able/?url=https://www.mayo.edu/research/clinical-trials. Accessed 18 Sept 2023.
17. Edmunds MR, Barry RJ, Denniston AK. Readability assessment of online ophthalmic patient information. JAMA Ophthalmol. 2013;131(12):1610–1616. doi: 10.1001/jamaophthalmol.2013.5521.
18. Kher A, Johnson S, Griffith R. Readability assessment of online patient education material on congestive heart failure. Adv Prev Med. 2017;2017:9780317. doi: 10.1155/2017/9780317.
19. Cline RJ, Haynes KM. Consumer health information seeking on the internet: the state of the art. Health Educ Res. 2001;16(6):671–692. doi: 10.1093/her/16.6.671.
20. Fox S. Health Topics. Pew Research Center: Internet, Science & Tech; 2011. Available from: https://www.pewresearch.org/internet/2011/02/01/health-topics-2/
21. Woodward-Kron R, Connor M, Schulz PJ, Elliott K. Educating the patient for health care communication in the age of the world wide web: a qualitative study. Acad Med. 2014;89(2):318–325. doi: 10.1097/ACM.0000000000000101.
22. Ahsanuddin S, Cadwell JB, Povolotskiy R, Paskhover B. Quality, reliability, and readability of online information on rhinoplasty. J Craniofac Surg. 2021;32(6):2019–2023. doi: 10.1097/SCS.0000000000007487.
23. O’Neill SC, Baker JF, Fitzgerald C, et al. Cauda equina syndrome: assessing the readability and quality of patient information on the internet. Spine (Phila Pa 1976). 2014;39(10):E645–E649. doi: 10.1097/BRS.0000000000000282.
24. Arts H, Lemetyinen H, Edge D. Readability and quality of online eating disorder information—are they sufficient? A systematic review evaluating websites on anorexia nervosa using DISCERN and Flesch readability. Int J Eat Disord. 2020;53(1):128–132. doi: 10.1002/eat.23173.
25. Barke A, Bleichhardt G, Rief W, Doering BK. The cyberchondria severity scale (CSS): German validation and development of a short form. Int J Behav Med. 2016;23(5):595–605. doi: 10.1007/s12529-016-9549-8.
26. Devitt BM, Hartwig T, Klemm H, et al. Comparison of the source and quality of information on the internet between anterolateral ligament reconstruction and anterior cruciate ligament reconstruction: an Australian experience. Orthop J Sports Med. 2017;5(12):2325967117741887. doi: 10.1177/2325967117741887.
27. Schaffer JL, Bozic KJ, Dorr LD, Miller DA, Nepola JV. AOA symposium: direct-to-consumer marketing in orthopaedic surgery: Boon or boondoggle? J Bone Jt Surg Am. 2008;90(11):2534–2543. doi: 10.2106/JBJS.G.00309.
