Dear Authors,
We read with great interest the article “Readability of the Most Commonly Accessed Online Patient Education Materials Pertaining to Pathology of the Hand” recently published in Hand.1 Health information is no longer limited to traditional, in-office, physician-patient conversations.2 We are living in an “information technology” era in which an increasing proportion of patients, family members, and friends seek health-related information online. The ready availability of astonishing amounts of information online, from established healthcare organization websites to social media platforms, has changed patient education.
As described, the formula developed by Kincaid et al3 in the 1970s for assessing readability has been widely used to establish a standard minimum level of readability for medical, legal, and insurance documents, with the aim of making information truly accessible to all readers. Unfortunately, several studies evaluating the quality of web-based patient education materials continue to find readability levels higher than recommended.4,5 Most of these studies, however, focus on the Flesch Reading Ease (FRE) and Flesch-Kincaid grade level (FK-level) scales. While we agree that both scales are useful, these two tools alone fail to capture the complete picture of readability.
The FRE and FK-level scales rely entirely on word length and sentence length as indicators of reading difficulty. A favorable readability rating on these scales therefore does not guarantee that a health-related article is suitable for patients. For example, a patient education document could present inaccurate information about the management of carpal tunnel syndrome, or address the topic at an inappropriate level of complexity, yet, simply because it is written with short words and short sentences, receive an FK-level of 5, which is considered the ideal score (FK-level below a sixth-grade reading level).
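To make this dependence explicit, the commonly cited forms of the two indices, reproduced here for reference, are functions of only two surface features of the text: average sentence length (words per sentence) and average word length (syllables per word):

$$\text{FRE} = 206.835 - 1.015\left(\frac{\text{total words}}{\text{total sentences}}\right) - 84.6\left(\frac{\text{total syllables}}{\text{total words}}\right)$$

$$\text{FK-level} = 0.39\left(\frac{\text{total words}}{\text{total sentences}}\right) + 11.8\left(\frac{\text{total syllables}}{\text{total words}}\right) - 15.59$$

No term in either formula reflects the accuracy, completeness, or clinical appropriateness of the content being scored.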
Beyond readability, it is necessary to evaluate the reliability, accuracy, and quality of patient education information. This is particularly true for online material, where content can be freely added by anyone without restriction or verification of credibility. As a result, the information available to patients online is of variable appropriateness, accuracy, and thoroughness. For example, the objectivity of a website may be influenced by advertising sales and sponsorship funding. It is critical that any health-related material about hand pathologies and their preferred methods of treatment contain only accurate and reliable content in order to achieve true health literacy, effective patient education, and our ultimate goal, improved patient outcomes.
The LIDA and DISCERN scores are two assessment tools that are becoming increasingly popular for evaluating the overall quality and accessibility of a webpage.5,6 Both are validated instruments that use comprehensive questions to assess reliability as well as readability. Had the authors utilized these scoring tools, in addition to the FRE and FK-level scales, when assessing the online health information available on pathology of the hand, the result would have been a more complete evaluation.
Footnotes
Ethical Approval: This study was approved by our institutional review board.
Statement of Human and Animal Rights: This article does not contain any studies with human or animal subjects.
Statement of Informed Consent: No informed consent was required for this study because of its nature as a letter to the editor.
Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.
Institution at Which the Work was Performed: Investigation performed at the Massachusetts General Hospital, Partners Health System, Boston, Massachusetts, USA, 02114.
ORCID iD: Ali Parsa https://orcid.org/0000-0002-7374-0814
References
- 1. Akinleye SD, Garofolo-Gonzalez G, Montuori M, et al. Readability of the most commonly accessed online patient education materials pertaining to pathology of the hand. Hand. 2018;13(6):705-714.
- 2. Cline RJ, Haynes KM. Consumer health information seeking on the Internet: the state of the art. Health Educ Res. 2001;16(6):671-692.
- 3. Kincaid JP, Fishburne RP, Rogers RD, et al. Derivation of new readability formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy enlisted personnel. Institute for Simulation and Training, paper 56. Published January 1975. http://stars.library.ucf.edu/istlibrary/56. Accessed December 19, 2018.
- 4. Hadden K, Prince LY, Schnaekel A, et al. Readability of patient education materials in hand surgery and health literacy best practices for improvement. J Hand Surg Am. 2016;41(8):825-832.
- 5. Küçükdurmaz F, Gomez MM, Secrist E, et al. Reliability, readability and quality of online information about femoracetabular impingement. Arch Bone Jt Surg. 2015;3(3):163-168.
- 6. Charnock D, Shepperd S, Needham G, et al. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53(2):105-111.
