Abstract
Cardiovascular diseases (CVD) are a leading cause of morbidity and mortality worldwide. Patient education materials help patients understand their disease and its management. Health literacy is an important challenge that may contribute to health inequities and disparities. The National Institutes of Health and the American Medical Association recommend that patient education materials be written at or below a 6th-grade reading level.
Objective
To evaluate the readability and comprehension of patient education materials related to CVD available on the American Heart Association (AHA) website and the CardioSmart web platform of the American College of Cardiology (ACC).
Method
We examined the readability and comprehension of 63 patient education materials (accessed June 2022) using: (a) Flesch Kincaid Readability Ease (FKRE), which measures readability (0–100, goal > 70), and (b) Flesch Kincaid Grade Level (FKGL) (goal ≤ grade 7). We compared the AHA and ACC scores using descriptive statistics and t-tests. A P-value ≤ 0.05 was considered significant.
Results
Sixty-three web pages of patient education materials (AHA 24, ACC 39) were reviewed in June 2022. Mean ± standard deviation (SD) FKRE across all web pages was 54.9 ± 6.8; an FKRE of 50–60 equates to “fairly difficult to read.” Mean ± SD FKGL was 10.0 ± 1.3. AHA patient education materials were significantly more difficult to read and comprehend, were longer, and contained more complex words than ACC patient education materials.
Conclusions
CVD-related patient education materials available online through leading national organizations are not congruent with the recommendations of national healthcare organizations and are not as user-friendly as they could be. Urgent recognition of these gaps and unmet needs is indicated to optimize patient health literacy.
Keywords: American College of Cardiology, American Heart Association, Cardiovascular conditions, Comprehension, Health literacy, Patient education materials
Highlights
• Patient education materials help patients understand the disease and its management.
• CVD-related online education materials are not patient-centric or conducive to use.
• Urgent recognition of these gaps will improve patient outcomes in the community.
1. Introduction
Cardiovascular diseases (CVD) are a leading cause of morbidity and mortality worldwide. The annual total costs of CVDs in the United States are $378.0 billion, and the estimated direct costs increased significantly from $103.5 billion (1996–1997) to $226.2 billion (2017–2018), reflecting the increased healthcare and economic burden of CVDs nationwide [1].
Health disparities have been well reported in CVDs [2,3]. Health disparities can result from health inequities, including inadequate health literacy, which plays a central role in influencing health-related outcomes in patients and is defined as “the capacity to obtain, interpret, and understand basic health information and services, and the competence to use such information and services to enhance one's health” [4,5]. Health literacy contributes significantly to patient-related outcomes in varied clinical settings [6,7]. Low health literacy is directly linked to worse patient outcomes and increased healthcare burden [8,9], imposing unique challenges for patients and affecting their health outcomes and their elective, urgent, and emergent healthcare utilization. Low health literacy is associated with worse cardiovascular outcomes and with increased all-cause hospitalization and mortality among patients with heart failure [10]. The quality of communication with patients in clinics and hospitals is also influenced by patients' health literacy [11,12]. Organizational health literacy is the degree to which healthcare organizations design and implement strategies to facilitate patients' understanding of their health information, navigation of the healthcare system, engagement in their healthcare process, and management of their health.
In the research setting, the use of a literacy-sensitive CVD prevention decision aid has shown promise toward improving patients' knowledge and health behaviors [13]. In the real-world patient care setting, organizational health literacy efforts can potentially improve patient health literacy, health outcomes, and cost of care. Several national and international organizations (e.g., the American Heart Association (AHA) and the American College of Cardiology (ACC)) have taken the initiative to develop and make accessible patient education materials. However, their utility has not been well established thus far.
The National Assessment of Adult Literacy reported that 36 % of the American population has basic or below-basic health literacy, that the average American reads at a 7th–8th grade level, and that 20 % read at a 5th-grade level or below [[14], [15], [16], [17]]. Prominent organizations such as the National Institutes of Health (NIH), the American Medical Association (AMA), and the US Department of Health and Human Services have therefore recommended that patient education materials be written at or below a 6th-grade reading level, whereas the Centers for Disease Control and Prevention recommends a level below the 8th grade [14,[16], [17], [18]]. Studies show that available online patient education materials are commonly not aligned with these recommendations and thus may not be patient-friendly [19,20]. Meade et al. showed that lowering the reading grade level of patient education materials results in better patient comprehension [21], indicating that the reading grade levels of patient education materials should be reduced to achieve better understanding and higher health literacy among patients. Herein we evaluate the readability and comprehension of patient education materials for CVDs available online on the AHA website and the CardioSmart web platform of the ACC.
2. Methods
We identified CVD-related online patient education materials from leading medical organizations, namely the AHA website and the CardioSmart web platform of the ACC, and reviewed them for readability and comprehension from June 1 to June 30, 2022, using the following measures: 1) Flesch Kincaid Readability Ease (FKRE), which measures readability (0 to 100, where higher scores indicate greater ease of reading; goal > 70); 2) Flesch Kincaid Grade Level (FKGL), the US grade level required to comprehend the text on a page (goal ≤ grade 7); 3) Gunning Fog Score (GFS), which estimates the number of years of formal education required to understand the text on first reading (goal ≤ 7); 4) Simple Measure of Gobbledygook (SMOG) Index, which estimates the years of education required to comprehend the written materials; 5) Coleman Liau Index (CLI), which assesses the US grade level necessary to understand the text; and 6) Automated Readability Index (ARI), which determines the grade level required for a reader to understand a passage.
These tools are frequently used to assess the reading comprehension of patient education materials [5,[22], [23], [24]]. The FKGL instrument is one of the most widely used readability tools; it uses the number of words per sentence and syllables per word to calculate the reading grade level, whereas the FKRE calculates a reading ease score that is interpreted against a predefined scale [25,26]. Data were also expressed as the number of sentences, complex words (words with >3 syllables; long words were defined as those with ≥6 characters), percentage of complex words, average words per sentence, and average syllables per word.
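As an illustration of how the two Flesch Kincaid indices are derived from these text statistics, the sketch below computes FKRE and FKGL from raw counts using the published Flesch formulas. This is a minimal Python sketch rather than the tool used in this study (readability here was calculated with the webfx.com read-able platform); the syllable counter is a rough heuristic, and the sample passage is invented for demonstration.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels, discounting a silent trailing 'e'."""
    word = word.lower()
    groups = re.findall(r"[aeiouy]+", word)
    count = len(groups)
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def readability(text: str) -> dict:
    """Compute FKRE and FKGL from plain text using the published Flesch formulas."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    fkre = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    fkgl = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return {
        "sentences": len(sentences),
        "words": len(words),
        "avg_words_per_sentence": round(words_per_sentence, 1),
        "avg_syllables_per_word": round(syllables_per_word, 2),
        "FKRE": round(fkre, 1),   # goal > 70
        "FKGL": round(fkgl, 1),   # goal <= grade 7
    }

if __name__ == "__main__":
    sample = ("High blood pressure makes your heart work harder. "
              "Over time, it can damage your arteries and raise your risk "
              "of heart attack and stroke.")
    print(readability(sample))
```

Published calculators differ slightly in how they segment sentences and count syllables, so scores from different tools can vary by a point or two for the same text.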
The patient education materials selected for this study included any resources available on these websites in English for patient education about or related to CVDs or their treatment, as the readability and comprehension methodology employed applies only to English-language text. Subtopic materials belonging to the same educational topic were grouped and considered under a common heading. Information intended for caregivers and professionals was excluded, as were supplementary materials such as attachments and patient tools (hyperlinks, infographics, patient voices, worksheets, pictures, charts, videos, references for further reading, advertisements, and checklists). We used the https://www.webfx.com/tools/read-able/ online platform to calculate readability and comprehension, and data were analyzed using IBM SPSS Statistics for Windows. Descriptive statistics and Student's t-tests, chosen based on the observed normal distribution of the data (two-tailed P ≤ 0.05 considered significant), were used to compare the readability and comprehension results from the two websites.
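For the group comparison described above, an independent-samples t-test can also be reproduced outside SPSS. The sketch below is illustrative only: the per-page FKRE values shown are hypothetical placeholders rather than the study's raw data, and SciPy is assumed to be available.

```python
from scipy import stats

# Hypothetical per-page FKRE scores (placeholders, not the study's raw data).
fkre_aha = [51.2, 48.7, 55.0, 53.4, 49.8, 52.1]
fkre_acc = [58.3, 56.1, 60.2, 54.9, 57.5, 55.8]

# Two-tailed independent-samples (Student's) t-test, mirroring the comparison described above.
t_stat, p_value = stats.ttest_ind(fkre_aha, fkre_acc)
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.4f}")  # significant if p <= 0.05
```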
3. Results
In total, 63 main topics of patient education materials related to CVDs were studied and evaluated, comprising 24 topics from the AHA website and 39 topics from the CardioSmart web platform of the ACC (Table 1). The overall mean ± standard deviation (SD) FKRE for the patient education materials was 54.9 ± 6.8; a score between 50.0 and 60.0 equates to “fairly difficult to read, best understood by 10th–12th US grade levels” [25,26]. Mean ± SD FKGL was 10.0 ± 1.3, representing a 10th-grade reading level. Mean ± SD GFS was 12.4 ± 1.4, representing 12 years of formal education required to understand the text on first reading. Mean ± SD SMOG was 9.4 ± 1.0, representing approximately nine years of education required to comprehend the written materials. Mean ± SD CLI and ARI were 12.8 ± 1.0 and 10.3 ± 1.5, respectively (Table 2).
Table 1.
Web-based patient education materials reviewed for cardiovascular diseases and their sources.
| | American Heart Association (AHA) | | CardioSmart Website - American College of Cardiology (ACC) |
|---|---|---|---|
| 1. | Aortic Aneurysm | 1. | Angina (Chest Pain) |
| 2. | Arrhythmia | 2. | Aortic Aneurysm |
| 3. | Atrial Fibrillation | 3. | Aortic Stenosis |
| 4. | Cardiac Arrest | 4. | Atrial Fibrillation |
| 5. | Cardiac Rehabilitation | 5. | Bradycardia |
| 6. | Cardiomyopathy | 6. | Cancer Treatment and Your Heart |
| 7. | Cholesterol | 7. | Cardiac Rehabilitation |
| 8. | Coronavirus (COVID-19) | 8. | Congenital Heart Disease |
| 9. | Congenital Heart Defects | 9. | Coronary Artery Disease |
| 10. | Diabetes | 10. | Coronavirus (COVID-19) |
| 11. | Flu Prevention | 11. | Decisions |
| 12. | Heart Attack | 12. | Diabetes and Your Heart |
| 13. | Heart Failure | 13. | Endocarditis |
| 14. | Heart Murmurs | 14. | Familial Hypercholesterolemia |
| 15. | Heart Valve Problems and Disease | 15. | Flu Shots and Your Heart |
| 16. | High Blood Pressure | 16. | Healthy living |
| 17. | Infective Endocarditis | 17. | Heart Attack |
| 18. | Kawasaki Disease | 18. | Heart Failure |
| 19. | Metabolic Syndrome | 19. | High Blood Pressure |
| 20. | Pericarditis | 20. | High Cholesterol |
| 21. | Peripheral Artery Disease | 21. | Hypertrophic Cardiomyopathy (HCM) |
| 22. | Sleep Disorders | 22. | Heart Rhythm Problems |
| 23. | Stroke | 23. | Manage Your Care |
| 24. | Venous Thromboembolism | 24. | Metabolic Syndrome |
| | | 25. | Mitral Regurgitation |
| | | 26. | Older Adults and Heart Disease |
| | | 27. | Palliative Care |
| | | 28. | Peripheral Artery Disease |
| | | 29. | Renal Artery Disease |
| | | 30. | Sleep Apnoea |
| | | 31. | Stroke |
| | | 32. | Subclavian Artery Disease |
| | | 33. | Sudden Cardiac Arrest |
| | | 34. | Supraventricular Tachycardia |
| | | 35. | Varicose Veins |
| | | 36. | Ventricular Tachycardia |
| | | 37. | Very High Triglycerides |
| | | 38. | Women and Heart Disease |
| | | 39. | Wearable Technology and Your Heart Health |
Table 2.
Readability and comprehension of web-based patient education materials for cardiovascular diseases.
| Readability and comprehension indices | Cardiovascular diseases (mean ± SD) | Recommended score |
|---|---|---|
| N | 63 | |
| Flesch Kincaid Readability Ease (FKRE) | 54.9 ± 6.8 | >70 |
| Flesch Kincaid Grade Level (FKGL) | 10.0 ± 1.3 | ≤7 |
| Gunning Fog Score (GFS) | 12.4 ± 1.4 | ≤7 |
| Simple Measure of Gobbledygook Index (SMOG) | 9.4 ± 1.0 | ≤7 |
| Coleman Liau Index (CLI) | 12.8 ± 1.0 | ≤7 |
| Automated Readability Index (ARI) | 10.3 ± 1.5 | ≤7 |
| Sentences | 315.9 ± 298.4 | N/A |
| Words | 5439.6 ± 4887.6 | N/A |
| Complex Words | 823.0 ± 787.9 | N/A |
| Percentage of Complex Words (%) | 14.7 ± 2.5 | N/A |
| Average Words Per Sentence | 17.7 ± 2.4 | N/A |
| Average Syllables Per Word | 1.6 ± 0.1 | N/A |
Abbreviations: N/A- Not Applicable, SD- Standard deviation.
Legend: This table lists the readability and comprehension indices used and the overall scores across all web pages, alongside the reading grade levels recommended by prominent healthcare organizations.
Comparing the readability and comprehension of the patient education materials from the AHA and ACC websites, the FKRE for AHA was significantly lower than for ACC (51.56 ± 6.03 vs. 56.93 ± 6.54, p = 0.002) (Fig. 1), whereas the FKGL values were similar (10.22 ± 1.19 vs. 9.86 ± 1.42, p = 0.304). The CLI was also significantly higher for the patient education materials on the AHA website than on the ACC website (13.49 ± 0.97 vs. 12.41 ± 0.89, p < 0.0001). No significant differences were observed between the two groups in the GFS, SMOG, and ARI measures. Regarding the volume and difficulty of the information presented, the patient education materials on the AHA website had almost double the number of sentences and contained more difficult text with a higher number of complex words than those on the ACC website. The average syllables per word were significantly higher on the AHA website than on the ACC website, whereas the average words per sentence were higher on the ACC website (Table 3). The AHA website was also deemed complex and challenging for patients to navigate, as it had several informational materials and resources embedded within its pages.
Fig. 1.
Readability and comprehension of web-based patient educational materials for cardiovascular diseases.
Legend: This figure depicts a graphical comparison of reading comprehension between the AHA and ACC websites, showing that on the FKRE, the ACC website obtained a higher score (56.93 ± 6.54) than the AHA website (51.56 ± 6.03).
Abbreviations: AHA- American Heart Association, ACC- American College of Cardiology, FKRE- Flesch Kincaid Reading Ease.
Table 3.
Comparison of readability and comprehension of web-based patient education materials for cardiovascular diseases from different websites and their recommended scores.
| Readability and comprehension indices | AHA (mean ± SD) | CardioSmart-ACC (mean ± SD) | P value | Recommended score |
|---|---|---|---|---|
| N | 24 | 39 | | |
| Flesch Kincaid Readability Ease (FKRE) | 51.56 ± 6.03 | 56.93 ± 6.54 | 0.002 | >70 |
| Flesch Kincaid Grade Level (FKGL) | 10.22 ± 1.19 | 9.86 ± 1.42 | 0.304 | ≤7 |
| Gunning Fog Score (GFS) | 12.69 ± 1.27 | 12.27 ± 1.52 | 0.266 | ≤7 |
| Simple Measure of Gobbledygook Index (SMOG) | 9.64 ± 0.88 | 9.22 ± 1.11 | 0.120 | ≤7 |
| Coleman Liau Index (CLI) | 13.49 ± 0.97 | 12.41 ± 0.89 | <0.0001 | ≤7 |
| Automated Readability Index (ARI) | 10.36 ± 1.44 | 10.29 ± 1.61 | 0.859 | ≤7 |
| Sentences | 495.08 ± 373.10 | 205.66 ± 168.40 | 0.001 | N/A |
| Words | 8157.83 ± 5972.35 | 3766.87 ± 3138.33 | 0.002 | N/A |
| Complex Words | 1317.83 ± 953.53 | 518.61 ± 461.40 | 0.001 | N/A |
| Percentage of Complex Words (%) | 16.39 ± 2.06 | 13.72 ± 2.22 | <0.0001 | N/A |
| Average Words Per Sentence | 16.71 ± 1.93 | 18.30 ± 2.50 | 0.010 | N/A |
| Average Syllables Per Word | 1.63 ± 0.06 | 1.55 ± 0.06 | <0.0001 | N/A |
Abbreviations: SD- Standard deviation, AHA- American Heart Association, ACC- American College of Cardiology, N/A - Not applicable.
Legend: This table compares the scores obtained on the AHA and ACC websites using different reading comprehension indices alongside the recommended scores suggested by prominent healthcare organizations.
4. Discussion
Given the prevalent use of the internet in today's era, web-based patient information can deliver meaningful and valuable information quickly, widely, across geographic borders, and in a cost-efficient manner. The information delivered through patient education materials is a cornerstone that can influence a patient's knowledge, shared decision-making, and participation in their care, driving the subsequent process of disease management [27].
Very few studies address the readability of online patient education materials in cardiology, and some are limited to print media [[28], [29], [30]]. Almost all studies assessing the reading comprehension of web-based and print patient education materials in cardiology suggest higher reading grade levels than recommended [10,11,[28], [29], [30], [31], [32], [33]]. Our study indicates that patient education materials for CVD from these leading national platforms, in general, are not patient-centric or conducive to use by healthcare providers as they are written at significantly higher reading grade levels than the recommended levels. Wide variations in the length and complexity of the sentences in the patient education materials were noted, showing nonconformity with the recommendations.
The FKRE scores for both AHA and ACC equate to “fairly difficult to read, best understood by 10th–12th US grade levels” [25,26]. However, the ACC materials scored statistically significantly better on reading ease than the AHA materials. The significance of this difference is not entirely clear, as both FKRE values fall within the same reading category range. Approximately a 10th-grade level was required to comprehend the text for both the ACC and AHA patient education materials, far exceeding the recommended sixth- or seventh-grade level.
The CLI scores for both websites correspond to a 12th/13th-grade reading level, far exceeding the recommended years of education. Similarly, the GFS, SMOG, and ARI scores indicate that the patient education materials on these websites require about twelve years of formal education to understand the text on first reading (GFS), about nine years of education (SMOG), and about a 10th-grade level (ARI) to comprehend the written materials, all above the recommended grade levels.
The AHA's patient education material content was more difficult to navigate, read, and understand; it was lengthier and contained more complex words than the ACC's patient education materials. The patient education materials on the CardioSmart (ACC) website were brief, concise, and had better reading comprehension than those available on the AHA website.
There are limited studies evaluating online patient education materials relevant to CVDs. Kapoor et al. [31] assessed the readability of 454 articles available online at the ACC/AHA websites and found mean ± SD FKGL, SMOG, CLI, and GFI scores of 9.6 ± 2.1, 11.2 ± 2.1, 11.9 ± 1.6, and 10.8 ± 1.6, respectively, significantly higher than both the NIH-recommended grade level of 6.5 and the national mean grade level of 8 (p < 0.00625). Commonalities between our study and that of Kapoor et al. include: 1) evaluation of the reading comprehension of patient education materials from the same web-based platforms (ACC and AHA), and 2) inclusion of all CVD-relevant patient education materials. In our study, we collated all subheading patient education materials under their main topic, so the total number of materials appears smaller than in the Kapoor et al. study. However, there are major differences between the two studies, specifically the period in which the materials were reviewed and differences in some of the indices and methodology used to evaluate reading comprehension. Kapoor et al. published their study in 2017, and although nearly five years have passed since their publication, the reading comprehension of the patient education materials available on both websites for CVD care has not improved.
Similar findings have been noted in other studies, which differ in methodology [23,31,32]. Ayyaswami et al. [32] analyzed the readability of web-based CVD-related patient education materials retrieved through Google searches of 20 commonly searched cardiovascular terms (based on the World Health Organization's definition of cardiovascular terms). The authors found that the 196 patient education materials were written at a mean reading grade level of 10.9 ± 1.8, with 99.5 % of the articles written above the 5th–6th grade reading level recommendations. Rodriguez et al. [33] evaluated the readability of online patient education materials for coronary artery calcium scans from the top 50 most commonly accessed websites and reported differences in reading grade level by the type of organization hosting the website. Patient education materials from professional societies and news/media/blog websites had the highest average reading grade level of 12.6, whereas health system websites had the lowest average reading grade level of 10.0, suggesting that websites of professional societies and large organizations may focus preferentially on research, advocacy, and the advancement of public health rather than on developing patient education materials that foster patient understanding and organizational health literacy.
Studies from other disciplines indicate a similar discordance between the readability of patient education materials and the recommended levels [23,[34], [35], [36]]. As most of these studies were completed and published before 2020, our study shows that the readability and comprehension of patient education materials has not improved in general, and specifically not in CVD, despite the documented gaps.
An urgent call by the leading organizations to recognize the existing gaps in health literacy and the unmet need for patient-friendly education materials in CVD is indicated. The development of pertinent patient education materials must follow recommended methods and a systematic approach [17,37,38]. Accompanying each material with a readability and comprehension metric, so that it can be used, disseminated, and integrated to facilitate patients' health literacy, would also be helpful.
Our study has several strengths. It provides insight into the current state of patient information materials in CVD and includes the two leading national and international organizations involved in physician and patient education in CVD. Moreover, our study uses standard assessment tools across different indices to evaluate the readability and comprehension of the patient education materials. The Flesch Kincaid Reading Ease is a widely accepted readability tool because it uses a simple formula based on the words used in each sentence. However, no assessment tool can determine the reading grade level with complete accuracy, as these tools do not consider the reader's knowledge, behavior, or interest.
There are several limitations to our study. First, the indices used to evaluate reading comprehension are surrogates rather than “actual” evaluations of a patient's reading comprehension. Second, these reading comprehension indices do not account for pictures and infographics, which are more patient-friendly. Lastly, we could not evaluate the reading comprehension of patient education materials in languages other than English, because the methodology and platform used apply only to English-language text. This limits the generalizability of the results to materials in other languages, and especially to some groups at risk for poor health literacy and health disparities. However, this research paves the way for similar work in other languages.
5. Conclusion
CVD-related patient education materials available online through leading national organizations are not congruent with the recommendations of national healthcare organizations and are not as user-friendly as they could be. Despite previous studies documenting these gaps five years ago, the reading comprehension of patient education materials in CVD has not improved since then. Urgent recognition of these gaps and unmet needs is indicated to optimize patient health literacy, especially in view of the healthcare disparities arising from social determinants of health.
Disclosures
Volgman: Consultant – Sanofi, Merck, Pfizer, Janssen, Novartis, and NIH Clinical Trials; Apple Inc. stock.
Source of funding
Brewer Foundation.
Ethical statement
• Ethical Committee/Institutional Review Board (IRB) approval is not applicable for this study, as no patient data were accessed.
• No human subjects were involved in this research; hence, informed consent is not applicable.
CRediT authorship contribution statement
Amanpreet Singh Wasir: Data curation, Visualization, Investigation, Writing - original draft. Annabelle Santos Volgman: Project administration, Supervision, Resources, Visualization, Writing - original draft, Writing - review & editing. Meenakshi Jolly: Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Formal analysis, Supervision, Validation, Writing - original draft, Writing - review & editing.
Declaration of competing interest
There are no conflicts of interest.
References
- 1. Tsao C.W., Aday A.W., Almarzooq Z.I., Alonso A., Beaton A.Z., Bittencourt M.S., Boehme A.K., Buxton A.E., Carson A.P., Commodore-Mensah Y., et al. Heart disease and stroke statistics-2022 update: a report from the American Heart Association. Circulation. 2022;145:e153–e639. doi: 10.1161/cir.0000000000001052.
- 2. Tertulien T., Broughton S.T., Swabe G., Essien U.R., Magnani J.W. Association of race and ethnicity on the management of acute non-ST-segment elevation myocardial infarction. J. Am. Heart Assoc. 2022;11. doi: 10.1161/jaha.121.025758.
- 3. Biancari F., Teppo K., Jaakkola J., Halminen O., Linna M., Haukka J., Putaala J., Mustonen P., Kinnunen J., Hartikainen J., et al. Income and outcomes of patients with incident atrial fibrillation. J. Epidemiol. Community Health. 2022. doi: 10.1136/jech-2022-219190.
- 4. Institute of Medicine, Committee on Health Literacy. Health Literacy: A Prescription to End Confusion. Nielsen-Bohlman L., Panzer A.M., Kindig D.A., editors. Washington (DC): National Academies Press (US); 2004.
- 5. Friedman D.B., Hoffman-Goetz L. A systematic review of readability and comprehension instruments used for print and web-based cancer information. Health Educ. Behav. 2006;33:352–373. doi: 10.1177/1090198105277329.
- 6. Dewalt D.A., Berkman N.D., Sheridan S., Lohr K.N., Pignone M.P. Literacy and health outcomes: a systematic review of the literature. J. Gen. Intern. Med. 2004;19:1228–1239. doi: 10.1111/j.1525-1497.2004.40153.x.
- 7. Wolf M.S., Gazmararian J.A., Baker D.W. Health literacy and functional health status among older adults. Arch. Intern. Med. 2005;165:1946–1952. doi: 10.1001/archinte.165.17.1946.
- 8. Paasche-Orlow M.K., Wolf M.S. The causal pathways linking health literacy to health outcomes. Am. J. Health Behav. 2007;31(Suppl. 1):S19–S26. doi: 10.5555/ajhb.2007.31.supp.S19.
- 9. Berkman N.D., Sheridan S.L., Donahue K.E., Halpern D.J., Crotty K. Low health literacy and health outcomes: an updated systematic review. Ann. Intern. Med. 2011;155:97–107. doi: 10.7326/0003-4819-155-2-201107190-00005.
- 10. Safeer R.S., Cooke C.E., Keenan J. The impact of health literacy on cardiovascular disease. Vasc. Health Risk Manag. 2006;2:457–464. doi: 10.2147/vhrm.2006.2.4.457.
- 11. Peterson P.N., Shetterly S.M., Clarke C.L., Bekelman D.B., Chan P.S., Allen L.A., Matlock D.D., Magid D.J., Masoudi F.A. Health literacy and outcomes among patients with heart failure. JAMA. 2011;305:1695–1701. doi: 10.1001/jama.2011.512.
- 12. Seurer A.C., Vogt H.B. Low health literacy: a barrier to effective patient care. S D Med. 2013;66(51):53–57.
- 13. Bonner C., Batcup C., Ayre J., Cvejic E., Trevena L., McCaffery K., Doust J. The impact of health literacy-sensitive design and heart age in a cardiovascular disease prevention decision aid: randomized controlled trial and end-user testing. JMIR Cardio. 2022;6. doi: 10.2196/34142.
- 14. Weiss B. Health Literacy and Patient Safety: Help Patients Understand. Manual for Clinicians. American Medical Association Foundation and American Medical Association; 2007.
- 15. Kutner M., Greenberg E., Jin Y., Paulsen C. The Health Literacy of America's Adults: Results From the 2003 National Assessment of Adult Literacy (NCES 2006–483). 2006.
- 16. US Department of Health and Human Services, Centers for Disease Control and Prevention. Simply Put: A Guide for Creating Easy-to-Understand Materials. 2010. www.cdc.gov/healthliteracy/pdf/simply_put.pdf
- 17. Brega A.G., Barnard J., Mabachi N.M. AHRQ Health Literacy Universal Precautions Toolkit. 2nd ed. Rockville, MD: Agency for Healthcare Research and Quality; 2015.
- 18. National Institutes of Health, US Department of Health and Human Services. Clear & Simple. 2018. https://www.nih.gov/institutes-nih/nih-office-director/office-communications-public-liaison/clear-communication/clear-simple
- 19. Kasabwala K., Misra P., Hansberry D.R., Agarwal N., Baredes S., Setzen M., Eloy J.A. Readability assessment of the American Rhinologic Society patient education materials. Int. Forum Allergy Rhinol. 2013;3:325–333. doi: 10.1002/alr.21097.
- 20. Agarwal N., Hansberry D.R., Sabourin V., Tomei K.L., Prestigiacomo C.J. A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern. Med. 2013;173:1257–1259. doi: 10.1001/jamainternmed.2013.6060.
- 21. Meade C.D., Byrd J.C., Lee M. Improving patient comprehension of literature on smoking. Am. J. Public Health. 1989;79:1411–1412. doi: 10.2105/ajph.79.10.1411.
- 22. Wang L.-W., Miller M., Schmitt M., Wen F. Assessing readability formula differences with written health information materials: application, results, and recommendations. Res. Soc. Adm. Pharm. 2013;9:503–516. doi: 10.1016/j.sapharm.2012.05.009.
- 23. Kher A., Johnson S., Griffith R. Readability assessment of online patient education material on congestive heart failure. Adv. Prev. Med. 2017;2017. doi: 10.1155/2017/9780317.
- 24. Badarudeen S., Sabharwal S. Assessing readability of patient education materials: current role in orthopaedics. Clin. Orthop. Relat. Res. 2010;468:2572–2580. doi: 10.1007/s11999-010-1380-y.
- 25. Flesch R. A new readability yardstick. J. Appl. Psychol. 1948;32:221–233. doi: 10.1037/h0057532.
- 26. Flesch R. How to Write Plain English. University of Canterbury; July 12, 2016.
- 27. Fox S., Rainie L. The Online Health Care Revolution: How the Web Helps Americans Take Better Care of Themselves. 1999.
- 28. Conroy R.M., Mulcahy R. Readability of literature written for cardiac patients. Clin. Cardiol. 1985;8:104–106. doi: 10.1002/clc.4960080207.
- 29. Mueller L.A., Sharma A., Ottenberg A.L., Mueller P.S. Readability of “dear patient” device advisory notification letters created by a device manufacturer. Heart Rhythm. 2013;10:501–507. doi: 10.1016/j.hrthm.2012.12.022.
- 30. Terranova G., Ferro M., Carpeggiani C., Recchia V., Braga L., Semelka R.C., Picano E. Low quality and lack of clarity of current informed consent forms in cardiology: how to improve them. JACC Cardiovasc. Imaging. 2012;5:649–655. doi: 10.1016/j.jcmg.2012.03.007.
- 31. Kapoor K., George P., Evans M.C., Miller W.J., Liu S.S. Health literacy: readability of ACC/AHA online patient education material. Cardiology. 2017;138:36–40. doi: 10.1159/000475881.
- 32. Ayyaswami V., Padmanabhan D., Patel M., Prabhu A.V., Hansberry D.R., Agarwal N., Magnani J.W. A readability analysis of online cardiovascular disease-related health education materials. Health Lit. Res. Pract. 2019;3:e74–e80. doi: 10.3928/24748307-20190306-03.
- 33. Rodriguez F., Ngo S., Baird G., Balla S., Miles R., Garg M. Readability of online patient educational materials for coronary artery calcium scans and implications for health disparities. J. Am. Heart Assoc. 2020;9. doi: 10.1161/jaha.120.017372.
- 34. Crihalmeanu T., Prabhu A.V., Hansberry D.R., Agarwal N., Fine M.J. Readability of online allergy and immunology educational resources for patients: implications for physicians. J. Allergy Clin. Immunol. Pract. 2018;6:286–288.e281. doi: 10.1016/j.jaip.2017.07.016.
- 35. Hansberry D.R., D'Angelo M., White M.D., Prabhu A.V., Cox M., Agarwal N., Deshmukh S. Quantitative analysis of the level of readability of online emergency radiology-based patient education resources. Emerg. Radiol. 2018;25:147–152. doi: 10.1007/s10140-017-1566-7.
- 36. Prabhu A.V., Gupta R., Kim C., Kashkoush A., Hansberry D.R., Agarwal N., Koch E. Patient education materials in dermatology: addressing the health literacy needs of patients. JAMA Dermatol. 2016;152:946–947. doi: 10.1001/jamadermatol.2016.1135.
- 37. Simply Put: A Guide for Creating Easy-to-Understand Materials. US Department of Health and Human Services, Centers for Disease Control and Prevention; 2010.
- 38. The Patient Education Materials Assessment Tool (PEMAT) and User's Guide. Agency for Healthcare Research and Quality; 2013.

