Author manuscript; available in PMC: 2019 Apr 30.
Published in final edited form as: Health Lit Res Pract. 2019 Apr 10;3(2):e74–e80. doi: 10.3928/24748307-20190306-03

A Readability Analysis of Online Cardiovascular Disease-Related Health Education Materials

Varun Ayyaswami 1, Divya Padmanabhan 2, Manthan Patel 3, Arpan Vaikunth Prabhu 4, David R Hansberry 5, Nitin Agarwal 6, Jared W Magnani 7
PMCID: PMC6489118  NIHMSID: NIHMS1024656  PMID: 31049489

Abstract

Background:

Online cardiovascular health materials are easily accessible with an Internet connection, but the readability of their content may limit practical use by patients.

Objective:

The goal of our study was to assess the readability of the most commonly searched Internet health education materials for cardiovascular diseases accessed via Google.

Methods:

We selected 20 commonly searched cardiovascular disease terms: aneurysm, angina, atherosclerosis, cardiomyopathy, congenital heart disease, coronary artery disease, deep vein thrombosis, heart attack, heart failure, high blood pressure, pericardial disease, peripheral arterial disease, rheumatic heart disease, stroke, sudden death, valvular heart disease, mini-stroke, lower extremity edema, pulmonary embolism, and exertional dyspnea. We entered each term into Google and, reviewing up to 15 pages of search results per term, selected up to 10 results in order of presentation that specifically provided education directed toward patients, yielding 196 patient education articles in total.

Key Results:

The nine readability measures assessing grade level found that the 196 articles were written at a mean 10.9 (SD = 1.8) grade reading level. Moreover, 99.5% of the articles were written beyond the 5th- to 6th-grade level recommended by the American Medical Association.

Conclusions:

Given the prominent use of online patient education material, we consider readability a quality metric that should be evaluated prior to the online publication of any health education materials. Further study of how to improve the readability of online materials may enhance patient education, engagement, and health outcomes.

Plain Language Summary:

Patients often use Google as a tool for understanding their medical conditions. This study examined the readability of articles accessed via Google for commonly searched cardiovascular diseases and found that nearly all articles were written above reading grade levels appropriate for patients. We hope this study will promote the importance of ensuring that online patient education articles are written at appropriate reading levels.


Approximately 52 million adults in the United States seek health information online, and 70% of them report that the Internet influences their decision-making about treatments (Rainie & Susannah, 2000). Although online health materials are easily accessible with an Internet connection, the readability of their content may limit practical use by patients (Agarwal, Hansberry, & Prabhu, 2017). Health literacy is a widely prevalent challenge, and the mean reading level of adults in the U.S. is estimated as 7th- to 8th-grade level (Kutner, Greenberg, Jin, & Paulsen, 2006; National Assessment of Adult Literacy, 2003). Thus, the American Medical Association (AMA) and National Work Group on Cancer and Health have recommended that educational materials for patients be on a 5th- to 6th-grade reading level (Cotugna, Vickery, & Carpenter-Haefele, 2005; Weiss, 2007). Unfortunately, previous studies performed by our group and others show that Internet-based patient educational materials do not follow these recommendations (Bernard, Cooke, Cole, Hachani, & Bernard, 2018; Hansberry, Kraus, Agarwal, Baker, & Gonzales, 2014; Hansberry, Ramchand, et al., 2014; Kher, Johnson, & Griffith, 2017; Lee, Berg, Jazayeri, Chuang, & Eisig, 2019; Prabhu, Hansberry, Agarwal, Clump, & Heron, 2016).

Cardiovascular diseases have complex mechanisms and etiologies that are difficult for patients to comprehend. Limited health literacy is a barrier to the successful clinical management of patients with cardiovascular disease and is associated with higher cardiovascular disease risk scores (Magnani et al., 2018; Van Schaik et al., 2017) and higher rates of all-cause mortality in patients with heart failure (Peterson et al., 2011). Previous studies have shown that printed educational materials for cardiovascular disease patients are not appropriate for patients with limited health literacy (Eames, McKenna, Worrall, & Read, 2003; Hill-Briggs & Smith, 2008). Furthermore, cardiovascular professional societies have promoted online educational materials that exceed national recommendations (Kapoor, George, Evans, Miller, & Liu, 2017). Thus, the goal of our study was to assess the readability of Internet health education materials for cardiovascular diseases commonly accessed via Google. We hypothesized that patient-specific health education materials available online through Google are written well above nationally recommended reading levels.

METHODS

We selected 20 patient-oriented terms related to cardiovascular disease for our study based on our clinical experience, using the World Health Organization’s broad definition of cardiovascular disease as a guideline for term selection (https://www.who.int/news-room/fact-sheets/detail/cardiovascular-diseases-(cvds)). Terms selected were aneurysm, angina, atherosclerosis, cardiomyopathy, congenital heart disease, coronary artery disease, deep vein thrombosis, heart attack, heart failure, high blood pressure, pericardial disease, peripheral arterial disease, rheumatic heart disease, stroke, sudden death, valvular heart disease, mini-stroke, lower extremity edema, pulmonary embolism, and exertional dyspnea.

We entered each term separately into Google and, reviewing up to 15 pages of search results per term, selected up to 10 results in order of presentation that specifically provided education directed toward patients, yielding 196 patient education articles in total. A search result was considered written for patient education unless the article was explicitly indicated for a medical professional. We reformatted articles to plain text in Microsoft Word and deleted material unrelated to patient education, including figures and their legends, disclaimers, copyright notices, acknowledgments, multimedia, captions, author information, web page navigation text, and references.

We quantitatively evaluated readability according to 10 readability measures (to minimize the influence of any individual scale) with commercially available software (Readability Studio, Professional Edition, version 2012.1; Oleander Software, Ltd., Vandalia, OH). These readability measures included the Flesch Reading Ease (FRE), Coleman-Liau Index (CLI), Flesch-Kincaid Grade Level (FKGL), Gunning Fog Index (GFI), FORCAST Readability Formula, New Dale-Chall formula (NDC), New Fog Count (NFC), Simple Measure of Gobbledygook (SMOG) Index, Fry Readability Formula (FRF), and Raygor Readability Estimate (RRE). The FRE test assesses readability through word, syllable, and sentence counts, with lower scores indicating more difficult text (0-29, very difficult; 30-49, difficult; 50-59, fairly difficult; 60-69, standard; 70-79, fairly easy; 80-89, easy; 90-100, very easy) (Flesch, 1948; Jindal & MacDermid, 2017). The nine other readability measures provide grade-level values, with higher values indicating more complex text. The CLI measures readability by analyzing the number of letters and sentences per 100 words (Coleman & Liau, 1975). The FKGL analyzes the average number of words per sentence and the average number of syllables per word (Kincaid, Fishburne, Rogers, & Chissom, 1975). The GFI uses the number of sentences, the number of words with three or more syllables, and the average sentence length (Gunning, 1952). The FORCAST Readability Formula evaluates readability based on the number of single-syllable words present in a 150-word sample (Caylor, Sticht, Fox, & Ford, 1973). The NDC uses a count of hard words and the number of words per sentence (Chall & Dale, 1995). The NFC assesses readability by counting the number of sentences, complex words, and easy words (Kincaid et al., 1975). The SMOG Index evaluates readability through the number of polysyllabic words and the number of sentences (Caylor et al., 1973). The FRF examines the average number of sentences and syllables in every 100 words (Fry, 1968). Finally, the RRE determines readability grade level based on the mean number of sentences and the number of words with six or more letters (Raygor, 1977).

Institutional review board approval was not required because all data were publicly available online.
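For illustration only, the sketch below shows how two of the grade-level formulas named above, the Flesch-Kincaid Grade Level and the SMOG Index, can be computed from plain text. This is not the Readability Studio software used in the study; the regex tokenizer and the heuristic syllable counter are simplifying assumptions for demonstration, and commercial tools use more careful rules.

```python
import math
import re

def count_syllables(word):
    # Heuristic: count groups of consecutive vowels, subtract a silent final 'e'.
    # Readability software typically uses dictionaries and more refined rules.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def text_statistics(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return len(sentences), len(words), syllables, polysyllables

def flesch_kincaid_grade(text):
    # FKGL = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    n_sentences, n_words, n_syllables, _ = text_statistics(text)
    return 0.39 * (n_words / n_sentences) + 11.8 * (n_syllables / n_words) - 15.59

def smog_index(text):
    # SMOG = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291
    n_sentences, _, _, n_polysyllables = text_statistics(text)
    return 1.0430 * math.sqrt(n_polysyllables * 30 / n_sentences) + 3.1291

if __name__ == "__main__":
    sample = ("Atherosclerosis is the accumulation of cholesterol-containing "
              "plaque within the arterial wall, which can restrict blood flow.")
    print(f"Flesch-Kincaid Grade Level: {flesch_kincaid_grade(sample):.1f}")
    print(f"SMOG Index: {smog_index(sample):.1f}")
```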

RESULTS

The nine readability measures assessing grade level found that the 196 articles were written at a mean 10.9 (standard deviation [SD] = 1.8) grade reading level (Figure 1). Only 1 of the 196 articles was written at the recommended 5th- to 6th-grade level. The FRE test identified the articles as “fairly difficult” to read (M = 52.3, SD = 12.1 on a 0-100 scale) (Figure 2). Articles associated with exertional dyspnea had the highest reading grade level (M = 13.2, SD = 2.2) (Figure 3), and articles on deep vein thrombosis had the lowest, with an average grade-level score of 9.5 (SD = 1.9) (Figure 3). For every search term, the mean grade level of the associated articles exceeded national recommendations on all nine grade-level readability measures.
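To make the aggregation behind these summaries concrete, the sketch below shows how per-article grade-level scores could be averaged across scales and then grouped by search term, as plotted in Figures 1 and 3. The table layout and column names are hypothetical; the study’s actual scores came from Readability Studio.

```python
import pandas as pd

# Hypothetical export of per-article results: one row per article, one column
# per grade-level scale (only two scales and three articles shown here).
scores = pd.DataFrame({
    "term": ["heart attack", "heart attack", "deep vein thrombosis"],
    "FKGL": [11.2, 9.8, 9.1],
    "SMOG": [12.0, 10.4, 9.9],
})
scale_columns = ["FKGL", "SMOG"]

# Mean grade level per article across scales, then the overall mean and SD
# (reported above as 10.9, SD = 1.8 for the full set of 196 articles).
scores["mean_grade"] = scores[scale_columns].mean(axis=1)
print(scores["mean_grade"].mean(), scores["mean_grade"].std())

# Per-term mean and SD, as plotted in Figure 3.
print(scores.groupby("term")["mean_grade"].agg(["mean", "std"]))
```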

Figure 1.

Mean readability scores of the 196 articles collected from Google searches of commonly searched cardiovascular disease terms, as measured by nine established readability scales that assess grade level. Scores correspond to the academic grade level required for reading. Error bars indicate standard deviation. All readability scales found that the mean reading level of the 196 collected articles was above the grade-level recommendations promoted by the American Medical Association.

Figure 2.

The Flesch Reading Ease scores of the 196 articles collected from Google searches of commonly searched cardiovascular disease terms. This test evaluates readability through syllable count and sentence length, producing a score between 0 and 100, with lower scores indicating greater difficulty. The Flesch Reading Ease mean score of the 196 analyzed articles (52.3 ± 12.1) corresponded to a qualitative rating of “fairly difficult” to read. The figure plots the qualitative and quantitative scores of the 196 articles analyzed.

Figure 3.

Mean readability scores for each of the 20 commonly searched cardiovascular disease terms on Google, as measured by nine established readability scales that estimate grade level. Scores correspond to the academic grade level required for reading. Error bars indicate standard deviation. The readability scales found that the mean reading level for each of the 20 terms was above the grade-level recommendations promoted by the American Medical Association.

DISCUSSION

Our study revealed that the top 10 articles on Google for 20 commonly searched cardiovascular disease terms were written at an average 10.9 (SD = 1.8) grade reading level as measured by nine established readability scales that evaluate grade level. Moreover, 99.5% of articles were written beyond the 5th- to 6th-grade level promoted by national organizations for patient materials (Cotugna et al., 2005; Weiss, 2007). Our results demonstrate a disconnect between the reading level of frequently searched cardiovascular disease-related health education materials and the reading level appropriate for patients. The AMA has promoted readability to make health-related educational materials accessible to patients (Weiss, 2007). Numerous studies in the medical field have demonstrated that Internet-based patient educational materials do not follow national readability recommendations (Bernard et al., 2018; Crihalmeanu, Prabhu, Hansberry, Agarwal, & Fine, 2018; Hansberry, Ayyaswami, et al., 2017; Hansberry, D’Angelo, et al., 2018; Kher et al., 2017; Lee et al., 2019; Prabhu, Crihalmeanu, et al., 2017; Prabhu, Gupta, et al., 2016; Prabhu, Hansberry, et al., 2016; Prabhu, Kim, et al., 2017; Prabhu et al., 2018), and cardiovascular professional societies themselves have promoted educational materials that exceed readability recommendations (Kapoor et al., 2017). In this context, the gap that we observed between the readability of online materials and patients’ reading level has important implications for patients and health care delivery.

Specifically, limited health literacy has been associated with nonadherence to treatment plans, increased health care costs, and greater hospitalization rates (King, 2010). Health literacy challenges are common among older adults, who are at increased risk from multiple chronic and cardiovascular diseases (King, 2010). For example, patients with heart failure and limited health literacy had a significantly higher rate of unplanned health care use in the 30 days after hospital discharge (48.3%) compared to those with adequate health literacy (34.9%) (Cox et al., 2016). Furthermore, limited health literacy was associated with an increased risk of death after hospitalization for acute heart failure (McNaughton et al., 2015). Patients with limited health literacy were more than two times as likely to be unaware of their atrial fibrillation diagnosis compared to patients with adequate health literacy (24.6% vs. 11.9%) (Reading et al., 2017). In addition, among adults with hypertension seeking treatment in primary care centers, adequate health literacy was associated with increased medication adherence and lower blood pressure levels (Wannasirikul, Termsirikulchai, Sujirarat, Benjakul, & Tanasugarn, 2016).

We suggest that increased readability of online health education materials may exert a protective effect against the negative health outcomes associated with limited health literacy in patients with cardiac disease. Although not the direct focus of this study, we propose that the link between appropriate online patient education and improved health care delivery and outcomes may be an interesting area of future research. With this in mind, professional societies, academic facilities, and others using the Internet to promote patient education should evaluate the readability of their online materials. Moreover, given the prominent use of such online patient education material (Rainie & Susannah, 2000), we consider readability a quality metric and specifically encourage the use of commercially available readability software prior to publication of any online health education materials to ensure appropriate readability. Improving the readability of patient educational materials is low cost, patient-centered, and makes the information more accessible to patients (Agarwal, Hansberry, & Prabhu, 2017).
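As a minimal sketch of what such a pre-publication check could look like, assuming the open-source Python package textstat (rather than the commercial software used in this study) and a hypothetical draft paragraph, a publisher could gate content on a grade-level threshold reflecting the AMA’s 5th- to 6th-grade recommendation:

```python
import textstat  # open-source readability package: pip install textstat

AMA_MAX_GRADE = 6.0  # 5th- to 6th-grade recommendation discussed above

def check_readability(draft_text):
    # Average several grade-level estimates so no single scale dominates,
    # echoing the multi-measure approach of this study.
    grades = [
        textstat.flesch_kincaid_grade(draft_text),
        textstat.gunning_fog(draft_text),
        textstat.coleman_liau_index(draft_text),
    ]
    mean_grade = sum(grades) / len(grades)
    return mean_grade, mean_grade <= AMA_MAX_GRADE

# Hypothetical draft of a patient education snippet.
draft = ("High blood pressure makes your heart work harder than it should. "
         "Your doctor can help you lower it with diet, exercise, and medicine.")
grade, ok = check_readability(draft)
print(f"Mean grade level {grade:.1f}: {'ready to publish' if ok else 'simplify further'}")
```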

STUDY LIMITATIONS

An important limitation of our study is that readability alone does not imply the factual accuracy of written education materials. The study also did not address whether the patient education materials are comprehensive in their scope or adhere to patient empowerment guidelines to be action oriented. Future studies could evaluate accuracy, understandability, and actionability by employing assessments such as the Patient Education Materials Assessment Tool (Shoemaker, Wolf, & Brach, 2014). Furthermore, including nontext equivalents such as figures and graphics can help compensate for readability concerns (Agarwal, Hansberry, & Prabhu, 2017), but we propose there is still a need for comprehensible written education materials in cases in which nontext equivalents cannot adequately communicate health information alone. It is important to note that there are discrepancies in readability grade-level recommendations, and we chose to focus on the national recommendations presented by the AMA and the National Work Group on Cancer and Health. We also acknowledge that our study provides only a broad overview of the readability of both common and rare cardiovascular disease terms. Thus, future studies should use our results to more narrowly analyze patient education materials for individual cardiac disease terms of interest. In future studies, we intend to explore specific terms in more detail by including comparisons by website (e.g., WebMD vs. Mayo Clinic).

CONCLUSIONS

We identified that cardiac-specific online patient education materials commonly accessed through Google exceed grade-level recommendations promoted by the AMA. Future studies should assess the readability of health materials promoted on newer online platforms, such as mobile applications and social media, and the effect of increased readability on health care delivery and outcomes for patients with limited health literacy. Ultimately, further study of how to improve the readability of online patient materials may enhance patient education, engagement, and health outcomes by allowing patients with limited health literacy to take a more active role in their health.

Acknowledgments

Grant: J.W.M. received a grant (2015084) from the Doris Duke Charitable Foundation.

Footnotes

Disclosure: The authors have no relevant financial relationships to disclose.

Contributor Information

Varun Ayyaswami, University of Maryland School of Medicine.

Divya Padmanabhan, University of New England College of Osteopathic Medicine.

Manthan Patel, Philadelphia College of Osteopathic Medicine.

Arpan Vaikunth Prabhu, University of Pittsburgh Medical Center.

David R. Hansberry, Thomas Jefferson University Hospitals.

Nitin Agarwal, University of Pittsburgh School of Medicine.

Jared W. Magnani, University of Pittsburgh Medical Center Heart and Vascular Institute.

REFERENCES

1. Agarwal N, Hansberry DR, & Prabhu AV (2017). The evolution of health literacy: Empowering patients through improved education. Hauppauge, NY: Nova Science Publishers.
2. Bernard S, Cooke T, Cole T, Hachani L, & Bernard J (2018). Quality and readability of online information about type 2 diabetes and nutrition. Journal of the American Academy of Physician Assistants, 31(11), 41–44. doi: 10.1097/01.JAA.0000546481.02560.4e
3. Caylor JS, Sticht TG, Fox LC, & Ford JP (1973). Methodologies for determining reading requirements of military occupational specialties. Retrieved from Institute of Education Sciences website: https://eric.ed.gov/?id=ED074343
4. Chall J, & Dale E (1995). Readability revisited: The new Dale-Chall readability formula. Cambridge, MA: Brookline Books.
5. Coleman M, & Liau TL (1975). A computer readability formula designed for machine scoring. Journal of Applied Psychology, 60(2), 283–284. doi: 10.1037/h0076540
6. Cotugna N, Vickery CE, & Carpenter-Haefele KM (2005). Evaluation of literacy level of patient education pages in health-related journals. Journal of Community Health, 30(3), 213–219.
7. Cox SR, Liebl MG, McComb MN, Chau JQ, Wilson AA, Achi M,…Wallace D (2016). Association between health literacy and 30-day healthcare use after hospital discharge in the heart failure population. Research in Social and Administrative Pharmacy, 13(4), 754–758. doi: 10.1016/j.sapharm.2016.09.003
8. Crihalmeanu T, Prabhu AV, Hansberry DR, Agarwal N, & Fine MJ (2018). Readability of online allergy and immunology educational resources for patients: Implications for physicians. Journal of Allergy and Clinical Immunology: In Practice, 6(1), 286–288.e1. doi: 10.1016/j.jaip.2017.07.016
9. Eames S, McKenna K, Worrall L, & Read S (2003). The suitability of written education materials for stroke survivors and their carers. Topics in Stroke Rehabilitation, 10(3), 70–83. doi: 10.1310/KQ70-P8UD-QKYT-DMG4
10. Flesch R (1948). A new readability yardstick. Journal of Applied Psychology, 32(3), 221–233. doi: 10.1037/h0057532
11. Fry E (1968). A readability formula that saves time. Journal of Reading, 11(7), 513–578.
12. Gunning R (1952). The technique of clear writing. New York, NY: McGraw-Hill.
13. Hansberry DR, Ayyaswami V, Sood A, Prabhu AV, Agarwal N, & Deshmukh SP (2017). Abdominal imaging and patient education resources: Enhancing the radiologist-patient relationship through improved communication. Abdominal Radiology, 42(4), 1276–1280. doi: 10.1007/s00261-016-0977-3
14. Hansberry DR, D’Angelo M, White MD, Prabhu AV, Cox M, Agarwal N, & Deshmukh S (2018). Quantitative analysis of the level of readability of online emergency radiology-based patient education resources. Emergency Radiology, 25(2), 147–152. doi: 10.1007/s10140-017-1566-7
15. Hansberry DR, Kraus C, Agarwal N, Baker SR, & Gonzales SF (2014). Health literacy in vascular and interventional radiology: A comparative analysis of online patient education resources. Cardiovascular and Interventional Radiology, 37(4), 1034–1040. doi: 10.1007/s00270-013-0752-6
16. Hansberry DR, Ramchand T, Patel S, Kraus C, Jung J, Agarwal N,…Baker SR (2014). Are we failing to communicate? Internet-based patient education materials and radiation safety. European Journal of Radiology, 83(9), 1698–1702. doi: 10.1016/j.ejrad.2014.04.013
17. Hill-Briggs F, & Smith AS (2008). Evaluation of diabetes and cardiovascular disease print patient education materials for use with low-health literate populations. Diabetes Care, 31(4), 667–671. doi: 10.2337/dc07-1365
18. Jindal P, & MacDermid J (2017). Assessing reading levels of health information: Uses and limitations of Flesch formula. Education for Health, 30(1), 84–88. doi: 10.4103/1357-6283.210517
19. Kapoor K, George P, Evans MC, Miller WJ, & Liu SS (2017). Health literacy: Readability of ACC/AHA online patient education material. Cardiology, 138(1), 36–40. doi: 10.1159/000475881
20. Kher A, Johnson S, & Griffith R (2017). Readability assessment of online patient education material on congestive heart failure. Advances in Preventive Medicine, 2017, 1–8. doi: 10.1155/2017/9780317
21. Kincaid JP, Fishburne RP, Rogers RL, & Chissom BS (1975). Derivation of new readability formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy enlisted personnel. Retrieved from University of Central Florida website: https://stars.library.ucf.edu/cgi/viewcontent.cgi?article=1055&context=istlibrary
22. King A (2010). Poor health literacy: A “hidden” risk factor. Nature Reviews Cardiology, 7(9), 473–474. doi: 10.1038/nrcardio.2010.122
23. Kutner M, Greenberg E, Jin Y, & Paulsen C (2006). The health literacy of America’s adults: Results from the 2003 National Assessment of Adult Literacy. Retrieved from National Center for Education Statistics website: https://nces.ed.gov/pubs2006/2006483.pdf
24. Lee KC, Berg ET, Jazayeri HE, Chuang S-K, & Eisig SB (2019). Online patient education materials for orthognathic surgery fail to meet readability and quality standards. Journal of Oral and Maxillofacial Surgery, 77(1), 180.e1–180.e8. doi: 10.1016/j.joms.2018.08.033
25. Magnani JW, Mujahid MS, Aronow HD, Cené CW, Dickson VV, Havranek E,…Willey JZ (2018). Health literacy and cardiovascular disease: Fundamental relevance to primary and secondary prevention: A scientific statement from the American Heart Association. Circulation, 138, 48–74. doi: 10.1161/CIR.0000000000000579
26. McNaughton CD, Cawthon C, Kripalani S, Liu D, Storrow AB, & Roumie CL (2015). Health literacy and mortality: A cohort study of patients hospitalized for acute heart failure. Journal of the American Heart Association, 4(5), e001799. doi: 10.1161/JAHA.115.001799
27. National Assessment of Adult Literacy (NAAL). (2003). What is NAAL? Retrieved from National Center for Education Statistics website: https://nces.ed.gov/naal/index.asp
28. Peterson PN, Shetterly SM, Clarke CL, Bekelman DB, Chan PS, Allen LA,…Masoudi FA (2011). Health literacy and outcomes among patients with heart failure. The Journal of the American Medical Association, 305(16), 1695. doi: 10.1001/jama.2011.512
29. Prabhu AV, Crihalmeanu T, Hansberry DR, Agarwal N, Glaser C, Clump DA,…Beriwal S (2017). Online palliative care and oncology patient education resources through Google: Do they meet national health literacy recommendations? Practical Radiation Oncology, 7(5), 306–310. doi: 10.1016/j.prro.2017.01.013
30. Prabhu AV, Donovan AL, Crihalmeanu T, Hansberry DR, Agarwal N, Beriwal S,…Heller M (2018). Radiology online patient education materials provided by major university hospitals: Do they conform to NIH and AMA guidelines? Current Problems in Diagnostic Radiology, 47(2), 75–79. doi: 10.1067/j.cpradiol.2017.05.007
31. Prabhu AV, Gupta R, Kim C, Kashkoush A, Hansberry DR, Agarwal N, & Koch E (2016). Patient education materials in dermatology: Addressing the health literacy needs of patients. JAMA Dermatology, 152(8), 946–947. doi: 10.1001/jamadermatol.2016.1135
32. Prabhu AV, Hansberry DR, Agarwal N, Clump DA, & Heron DE (2016). Radiation oncology and online patient education materials: Deviating from NIH and AMA recommendations. International Journal of Radiation Oncology Biology Physics, 96(3), 521–528. doi: 10.1016/j.ijrobp.2016.06.2449
33. Prabhu AV, Kim C, Crihalmeanu T, Hansberry DR, Agarwal N, DeFrances MC, & Trejo Bittar HE (2017). An online readability analysis of pathology-related patient education articles: An opportunity for pathologists to educate patients. Human Pathology, 65, 15–20. doi: 10.1016/j.humpath.2017.04.020
34. Rainie L, & Susannah F (2000). The online health care revolution. Retrieved from Pew Research Center website: http://www.pewinternet.org/2000/11/26/the-online-health-care-revolution/
35. Raygor A (1977). The Raygor readability estimate: A quick and easy way to determine difficulty. In Pearson PD (Ed.), Reading: Theory, research, and practice (pp. 259–263). Clemson, SC: National Reading Conference.
36. Reading SR, Go AS, Fang MC, Singer DE, Liu I-LA, Black MH,…Udaltsova N (2017). Health literacy and awareness of atrial fibrillation. Journal of the American Heart Association, 6(4), e005128. doi: 10.1161/JAHA.116.005128
37. Shoemaker SJ, Wolf MS, & Brach C (2014). Development of the Patient Education Materials Assessment Tool (PEMAT): A new measure of understandability and actionability for print and audiovisual patient information. Patient Education and Counseling, 96(3), 395–403. doi: 10.1016/j.pec.2014.05.027
38. Van Schaik TM, Jørstad HT, Twickler TB, Peters RJG, Tijssen JPG, Essink-Bot ML, & Fransen MP (2017). Cardiovascular disease risk and secondary prevention of cardiovascular disease among patients with low health literacy. Netherlands Heart Journal, 25(7-8), 446–454. doi: 10.1007/s12471-017-0963-6
39. Wannasirikul P, Termsirikulchai L, Sujirarat D, Benjakul S, & Tanasugarn C (2016). Health literacy, medication adherence, and blood pressure level among hypertensive older adults treated at primary health care centers. The Southeast Asian Journal of Tropical Medicine and Public Health, 47(1), 109–120.
40. Weiss BD (2007). Health literacy and patient safety: Help patients understand. Retrieved from The Portal of Geriatrics Online Education website: https://www.pogoe.org/sites/default/files/Health%20Literacy%20-%20Reducing%20the%20Risk%20by%20Designing%20a%20Safe,%20Shame-Free%20Health%20Care%20Environment.pdf
