Abstract
Background
Many patient education materials (PEMs) available on the internet are written at high school or college reading levels, rendering them inaccessible to the average US resident, who reads at or below an 8th grade level. Currently, electronic health record (EHR) providers partner with companies that produce PEMs, allowing clinicians to access PEMs at the point of care.
Objective
To assess the readability of PEMs provided by a popular EHR vendor as well as the National Library of Medicine (NLM).
Design
We included PEMs from Micromedex, EBSCO, and MedlinePlus. Micromedex and EBSCO supply PEMs to Meditech, a popular EHR supplier in the US. MedlinePlus supplies the NLM. These PEM databases have high market penetration and accessibility.
Measurements
Grade reading level of the PEMs was calculated using three validated indices: Simple Measure of Gobbledygook (SMOG), Gunning Fog (GFI), and Flesch–Kincaid (FKI). The percentage of documents above target readability and average readability scores from each database were calculated.
Results
We randomly sampled 100 disease-matched PEMs from each of the three databases (n = 300 PEMs). Depending on the readability index used, 30% to 100% of PEMs were written above the 8th grade level. The mean (SD) reading level for MedlinePlus, EBSCO, and Micromedex PEMs, as estimated by the GFI, was 10.2 (1.9), 9.7 (1.3), and 8.6 (0.9), respectively (p < 0.001). Estimates of readability using SMOG and FKI were similar.
Conclusions
The majority of PEMs available through the NLM and a popular EHR were written at reading levels considerably higher than that of the average US adult.
KEY WORDS: readability, health literacy, patient education materials, electronic health records
INTRODUCTION
Patient Education Materials
The 2008 Health Tracking Physician Survey from the Center for Studying Health System Change (HSC) reported that 75% of physicians routinely hand out patient education materials (PEMs).1 PEMs improve patient self-efficacy, thus supporting the growing trend towards disease self-management.2–4 Additionally, they are potentially effective at improving patient comprehension and influencing health behaviors, especially if they are written at appropriate reading levels for patients.5–7 The Joint Commission8 states that PEMs should be written at or below a 5th grade reading level, and encourages hospitals to use readability tests to revise written materials in order to address the health literacy needs of all patients.
Health literacy is a critical issue. The average US resident reads at an 8th grade level,9 and the average Medicare beneficiary reads at a 5th grade level.10 These statistics have implications for patients’ ability to understand common medical terms. In a study of 249 adults at a metropolitan Emergency Department, investigators found that nearly 80% could not correctly state that “hemorrhage” meant “bleeding”, “myocardial infarction” meant “heart attack”, or that “fractured” meant “broken”,11 even though more than 50% of the surveyed patients had a college education.
Studies have demonstrated that more patients comprehend PEMs written at lower grade reading levels. Overland et al.6 compared comprehension of diabetes education materials written at varied grade levels amongst 85 diabetic patients. Patients were randomized to read foot care information written at a 6th, 9th, or 11th grade reading level. Sixty percent of patients assigned to the 6th grade-level information understood it independently, whereas 21% understood the 9th grade-level information and 19% understood the 11th grade-level information (p < 0.001). Baker et al.7 found that reducing the reading level of a patient education pamphlet on allergic contact rashes from an 11th to a 7th grade level improved comprehension by patients in a private dermatology practice.
While there is evidence that PEMs written at 6th to 8th grade levels improve comprehension, few studies demonstrate the effect of readable PEMs on health-related behaviors. Jacobson et al.5 showed that an educational pamphlet on the pneumococcal vaccine, written at a 5th grade level and given to patients in an inner-city primary care waiting room, increased pneumococcal vaccination rates more than five-fold and patient-clinician communication about the issue approximately four-fold (p < 0.001) when compared to a control group who received a pamphlet on nutrition. It should be noted that the nutritional pamphlet was also written at a 5th grade level, and no comparison was made between pamphlets of different grade reading levels in this study.
While there is currently a paucity of data demonstrating the effects of improved readability on health outcomes, it follows logically that, in order for PEMs to be effective, patients must be able to comprehend them. Studies have shown that many PEMs available in primary care waiting rooms are written at levels significantly higher than the reading abilities of most patients.12,13 In 1990, Davis et al.12 found that only 6% of PEMs sampled from primary care waiting rooms were written below a 9th grade reading level. In 2004, Wallace et al.13 found that only 7% of the American Academy of Family Physicians’ PEMs were written below an 8th grade reading level. Various subspecialties, such as cardiology,14,15 infectious disease,16 plastic surgery,17 dermatology,18 orthopedics,19,20 palliative care,21 oncology,22,23 and obstetrics and gynecology,24,25 have studied the readability of the PEMs utilized in their fields, and findings consistently indicate that 9th–12th grade reading levels are necessary to read and comprehend these documents.
Patient Education Materials at the Point of Care
President Obama’s 2009 stimulus package set the goal of a certified electronic health record (EHR) for each person in the U.S. by 2014.26 Currently, EHRs include links to PEMs, allowing clinicians to access and print them during a patient visit without navigating away from the patient’s record.27 Because institutions are tacitly endorsing the PEMs made available through the EHRs they purchase, it is crucial to assess their readability.
Another source of PEMs available to clinicians at the point of care is the National Library of Medicine (NLM). The NLM is free, widely accessible, and accredited by the U.S. government. To date, there are no studies assessing the readability of PEMs supplied through the NLM or EHRs. The goal of our study was to determine the readability of PEMs supplied through the NLM and a popular EHR. We hypothesized that the majority of these PEMs would be written at grade levels above the recommended target readability of a 5th–8th grade reading level.
DESIGN
Sample Selection
We included a convenience sample of EHR-linked PEM databases with significant market penetration and accessibility. Penetration was based on the number of hospitals using each EHR. Per a 2009 report by Modern Healthcare, Meditech is a popular EHR supplier, providing EHRs to 27% of US hospitals; McKesson Provider Technologies is the second most popular, providing EHRs to 14% of US hospitals, and Cerner Corp is the third, providing EHRs to 13% of US hospitals.28 Meditech collaborates with three organizations to provide PEMs: Thomson Reuters’ Micromedex, PatientEDU, and EBSCO (also marketed as Lexicomp).29 These collaborations were based on the widespread use of these PEM databases by Meditech customers, as well as their capacity to integrate with Meditech’s technological platform. Because Micromedex and EBSCO are accessible in most academic hospitals, while PatientEDU requires a subscription, we sampled PEMs from the former two databases. We also sampled PEMs from MedlinePlus, the National Library of Medicine’s PEM supplier. This government-endorsed database has received numerous awards30 and is a commonly used web-based source of health information. It is accessible to anyone with internet access and received over 155 million visits in 2010.31
To determine the percentage of PEMs written above target readability, we randomly selected 100 disease/condition-matched PEMs from each of three databases (n = 300 PEMs) and assessed their readability. The target of 100 PEMs was based on proportions, assuming 20,000 total PEMs per database with 50% of PEMs above target readability, a 10% margin of error, and α = 0.05. PEMs were selected from a web-based alphabetical list via a random number generator.32 If a disease topic was present in one PEM database but not in the others, the next alphabetical disease topic on the list was checked against the other two databases until a match was found. This approach avoided particularly obscure topics and favored the inclusion of common disease topics such as arthritis, cardiac angina, diabetes, gastroesophageal reflux, and low back pain. Titles, citations, glossaries, and “further resources” listed on each handout were excluded from the readability analysis. In MedlinePlus’ database, multiple resources are listed for each disease; “Patient handouts” and “Easy-To-Read” resources were selected whenever possible. If these were not available, handouts were taken from the “Medical Encyclopedia” or the first listing in the “Overview” category for a given disease. The “alternative names” category at the conclusion of these handouts was excluded from the readability analysis, as it often consisted of lists of medical terms without informational content.
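As a consistency check on the stated sample-size assumptions (the exact formula the authors used is not reported), the standard calculation for estimating a proportion, with a finite-population correction, gives approximately the same target:

```latex
n_0 = \frac{z^2\,p(1-p)}{e^2} = \frac{1.96^2 \times 0.5 \times 0.5}{0.10^2} \approx 96,
\qquad
n = \frac{n_0}{1 + (n_0 - 1)/N} = \frac{96}{1 + 95/20{,}000} \approx 96,
```

which, rounded up, is consistent with the target of 100 PEMs per database.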
MEASUREMENTS
Readability indices are used to assess health documents, including PEMs.33 They use mathematical formulas to assign passages of text a grade reading level based on word and sentence length (Table 1). Word length is a proxy for semantic or meaning difficulty, and sentence length is a measure of syntactic complexity.34
Table 1. Readability Indices Used to Assess PEMs

| Readability Index | Formula | Disadvantages | Recommended For PEM Analysis By |
|---|---|---|---|
| Flesch–Kincaid | 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59 | Underestimates actual grade reading level34 | Health Literacy Advisor,34 Meade et al.33 |
| Gunning Fog | 0.4 × [(words/sentences) + 100 × (complex words/words)] | Cannot measure readability of text in boxes or tables | Health Literacy Advisor,34 Meade et al.33 |
| SMOG | 1.043 × √(30 × polysyllabic words/sentences) + 3.1291 | Does not discriminate well below a 6th grade level35 | Health Literacy Advisor,34 Meade et al.,33 National Cancer Institute, Centers for Medicare and Medicaid Services35 |

Formulas shown are the standard published forms of each index.
Based on recommendations from the Health Literacy Advisor (HLA),34 we chose the Simple Measure of Gobbledygook (SMOG), Gunning Fog Index (GFI), and Flesch–Kincaid Index (FKI) for our analysis. SMOG is additionally recommended by the Centers for Medicare and Medicaid Services (CMS).35
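As a hypothetical worked example of how these indices operate (the passage statistics below are invented for illustration, not drawn from the study sample), consider a 60-word passage containing 5 sentences and 6 words of three or more syllables. The Gunning Fog formula yields:

```latex
\mathrm{GFI} = 0.4\left(\frac{\text{words}}{\text{sentences}} + 100\cdot\frac{\text{complex words}}{\text{words}}\right)
             = 0.4\left(\frac{60}{5} + 100\cdot\frac{6}{60}\right) = 0.4\,(12 + 10) = 8.8,
```

placing the passage just above the 8th grade target.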
Text from each PEM was copied and pasted into an online readability calculator36 recommended by the University of Minnesota Biomedical Libraries37 for aiding in the creation of patient education materials. This web-based tool analyzes the grade reading level of English text using a series of readability indices, including the three listed above. It also provides suggestions on how to improve readability by identifying sentences that increase the grade reading level of the text. Before the readability calculation, each PEM was cleaned to allow for accurate calculation; cleaning involved standardizing the punctuation so that a period marked the end of each heading, sentence fragment, or sentence. The output from the online readability calculator was then copied into an Excel spreadsheet for analysis. Data extraction from MedlinePlus and Micromedex PEMs was performed manually. Data extraction from EBSCO PEMs was performed by a computer program designed especially for this task, which replicated the manual method: it ran through EBSCO’s PEM database, copied the text from each PEM, punctuated each line with a period, pasted the text into the readability calculator, and recorded each PEM’s output in an Excel document. Automated data extraction was only possible for EBSCO because of the different formats of each database’s PEMs.
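The sketch below is illustrative only and is not the authors’ program or the online calculator they used: it shows the cleaning step described above and a from-scratch Gunning Fog calculation, assuming a simple vowel-group heuristic for syllable counting. Real calculators apply more elaborate parsing rules, so scores may differ.

```python
import re

def clean_pem_text(raw: str) -> str:
    """End every non-empty line (heading, fragment, or sentence) with a period."""
    lines = []
    for line in raw.splitlines():
        line = line.strip()
        if line:
            if line[-1] not in ".!?":
                line += "."
            lines.append(line)
    return " ".join(lines)

def count_syllables(word: str) -> int:
    """Rough vowel-group heuristic; real calculators use more elaborate rules."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text: str) -> float:
    """GFI = 0.4 * (words/sentences + 100 * complex_words/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    complex_words = [w for w in words if count_syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences) + 100 * len(complex_words) / len(words))

# Very short samples give unstable scores; this run only illustrates the pipeline.
sample = "Diaper rash\nClean your child's diaper area with plain, warm water"
print(round(gunning_fog(clean_pem_text(sample)), 1))
```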
To determine the reliability of the computerized abstractions, two investigators (LS and NS) manually abstracted and cleaned ten randomly selected32 EBSCO PEMs, reanalyzed them using the GFI, and determined the inter-rater reliability between the computer-abstracted and manually abstracted PEMs through kappa analysis. To determine the inter-rater reliability of the manual data extractions for the Micromedex and MedlinePlus PEMs, a second investigator (NS) manually abstracted and cleaned 10 randomly selected PEMs and analyzed their readability, also using the GFI. Additionally, we assessed the reliability of our online calculator by entering the same ten MedlinePlus PEMs into a second online calculator38 and comparing the resulting GFI scores.
There was near perfect agreement between the computer program and manual extraction, manual extraction by authors, and two online readability calculators, with all kappas = 1. The mean errors were 0.17 grade levels between computer and manual extraction, 0.12 grade levels between manual raters, and 0.2 grade levels between online calculators.
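For illustration only, the following is a minimal sketch of one way such an agreement check could be computed, assuming grade-level estimates are dichotomized at the 8th grade target before Cohen’s kappa is calculated and that continuous agreement is summarized as a mean absolute difference. The dichotomization rule and the values below are assumptions for the example, not the study data or the authors’ exact procedure.

```python
def cohen_kappa(labels_a, labels_b):
    """Cohen's kappa for two lists of binary labels."""
    n = len(labels_a)
    agree = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_a1 = sum(labels_a) / n
    p_b1 = sum(labels_b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    if expected == 1:
        return 1.0
    return (agree - expected) / (1 - expected)

# Hypothetical GFI estimates for ten PEMs from the automated and manual workflows.
automated = [9.1, 10.4, 8.2, 11.0, 9.8, 7.6, 12.3, 8.9, 10.1, 9.5]
manual    = [9.3, 10.2, 8.4, 11.1, 9.6, 7.8, 12.1, 9.0, 10.3, 9.4]

above_8_auto = [g > 8 for g in automated]
above_8_manual = [g > 8 for g in manual]

mean_abs_diff = sum(abs(a - m) for a, m in zip(automated, manual)) / len(automated)
print("kappa:", cohen_kappa(above_8_auto, above_8_manual))
print("mean absolute difference (grade levels):", round(mean_abs_diff, 2))
```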
Analysis
We determined the percentage of PEMs in each database written above the 8th grade level. Differences in the mean readability scores for PEMs across the three databases were determined with one-way analyses of variance (ANOVA) using SPSS version 19.0. We recorded sentences that raised the grade reading levels of the least readable PEMs in each database.
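The ANOVA was run in SPSS; purely as an illustrative sketch of the analysis step, an equivalent one-way ANOVA can be computed in Python with SciPy. The scores below are simulated from the group means and SDs reported in Table 2 and are not the actual study data.

```python
import numpy as np
from scipy.stats import f_oneway

# Simulate 100 GFI scores per database from the reported mean (SD) values.
rng = np.random.default_rng(0)
medlineplus = rng.normal(10.2, 1.9, 100)
ebsco = rng.normal(9.7, 1.3, 100)
micromedex = rng.normal(8.6, 0.9, 100)

# One-way ANOVA across the three simulated groups.
f_stat, p_value = f_oneway(medlineplus, ebsco, micromedex)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Percentage of simulated PEMs above the 8th grade target in each group.
for name, scores in [("MedlinePlus", medlineplus), ("EBSCO", ebsco), ("Micromedex", micromedex)]:
    print(name, f"{(scores > 8).mean():.0%} above 8th grade")
```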
RESULTS
There was wide variability in the readability of PEMs from the three sources, but a majority were written above the 8th grade level (Table 2). The mean (SD) reading level for MedlinePlus, EBSCO, and Micromedex PEMs, as estimated by the GFI, was 10.2 (1.9), 9.7 (1.3), and 8.6 (0.9), respectively (p < 0.001). Estimates of readability using SMOG and FKI were similar (Table 2).
Table 2. Readability of PEMs* from Each Database, by Readability Index

| PEM database | Readability Index | Mean Grade Reading Level (SD†) | % PEMs above 8th Grade Level | Average # of Grades Above Target Readability | Grade Level Range |
|---|---|---|---|---|---|
| Micromedex | Flesch–Kincaid | 7.7 (0.8) | 31% | 0.5 | 5–9 |
| Micromedex | Gunning Fog | 8.6 (0.9) | 74% | 1.0 | 7–11 |
| Micromedex | SMOG | 9.4 (0.7) | 96% | 1.5 | 7–11 |
| MedlinePlus | Flesch–Kincaid | 8.5 (1.6) | 63% | 1.4 | 5–14 |
| MedlinePlus | Gunning Fog | 10.2 (1.9) | 90% | 2.5 | 6–17 |
| MedlinePlus | SMOG | 10.4 (1.3) | 98% | 2.4 | 8–15 |
| EBSCO | Flesch–Kincaid | 8.4 (1.0) | 68% | 0.9 | 6–11 |
| EBSCO | Gunning Fog | 9.7 (1.3) | 89% | 1.9 | 7–13 |
| EBSCO | SMOG | 9.7 (0.7) | 100% | 1.7 | 8–12 |
* PEM = patient education materials
† SD = standard deviation
The range of grade reading levels varied across databases. The most readable PEMs in each database were consistently written at a 5th–7th grade level. EBSCO’s least readable PEMs were written at 11th–13th grade levels, MedlinePlus’ at 14th–17th grade levels, and Micromedex’s at an 11th grade level. For each database, the percentage of PEMs above target readability, mean grade reading levels, and ranges are shown in Table 2. In an analysis of average readability scores using the Gunning Fog Index, Micromedex’s PEMs were more readable than MedlinePlus’ and EBSCO’s (p < 0.001), and EBSCO’s PEMs were more readable than MedlinePlus’ PEMs (p = 0.039). PEMs assessed by FKI scored at an average grade reading level of 8.2, significantly lower than PEMs assessed by GFI (9.4) and SMOG (9.8) (p < 0.001).
Eighteen of the 100 sampled disease or condition topics in MedlinePlus’ PEM database were designated as “Easy-To-Read”. However, 72% of these 18 “Easy-To-Read” PEMs were a full grade level above target readability as measured by at least one readability index.
PEMs on bacterial vaginosis, Barrett’s esophagus, generalized anxiety, malaria, pyelonephritis, salmonella, undescended testes, and yeast infections were at least two grade levels above target readability in all databases, as measured by the GFI. Examples of difficult-to-read sentences from each of these PEMs are listed in Table 3. Additionally, some disease topics were written at target readability in all databases. Examples of well-written, readable sentences from these three PEMs are listed in Table 4.
Table 3. Examples of Difficult-to-Read Sentences from PEMs Written Above Target Readability in All Databases

| Disease Topic | Grade Reading Level, GFI (Micromedex) | GFI (EBSCO) | GFI (MedlinePlus) | Example |
|---|---|---|---|---|
| Bacterial Vaginosis | 10.22 | 11.81 | 13.59 | “Normally, the vagina has helpful bacteria (lactobacilli), as well as more harmful bacteria (anaerobes—bacteria that do not need oxygen to live)”.—EBSCO |
| Barrett’s Esophagus | 10.06 | 10.79 | 13.1 | “Removal of most of the esophagus is recommended if a person with Barrett’s esophagus is found to have severe dysplasia or cancer and can tolerate a surgical procedure.”—MedlinePlus |
| Generalized Anxiety | 9.82 | 10.64 | 12.77 | “With GAD, you may have symptoms like those of a serious health problem, such as a heart problem.”—Micromedex |
| Malaria | 9.69 | 10.84 | 13.56 | “The majority of symptoms are caused by the massive release of merozoites into the bloodstream, the anemia resulting from the destruction of the red blood cells, and the problems caused by large amounts of free hemoglobin released into circulation after red blood cells rupture.”—MedlinePlus |
| Pyelonephritis | 10.52 | 10.3 | 12.25 | “Voiding cystourethrography—x-ray of the urinary bladder and urethra made after injection with a contrast medium.”—EBSCO |
| Salmonella | 10.93 | 10.1 | 11.38 | “Salmonellosis is a common gastrointestinal (digestive) infection caused by a bacteria (germ) called Salmonella”—Micromedex |
| Undescended Testes | 9.74 | 12.89 | 10.89 | “Grown men with undescended testes may have low sperm counts resulting in infertility, and are at increased risk for hernia and testicular cancer because of their untreated undescended testes.”—EBSCO |
| Yeast Infection | 10.4 | 11.07 | 12.06 | “Although many women feel cleaner if they douche after menstruation or intercourse, it may actually worsen vaginal discharge because it removes healthy bacteria lining the vagina that protect against infection.”—MedlinePlus |
Table 4. Examples of Readable Sentences from PEMs Written at Target Readability in All Databases

| Disease/Condition Topic | Grade Reading Level, GFI (Micromedex) | GFI (EBSCO) | GFI (MedlinePlus) | Example |
|---|---|---|---|---|
| Diaper Rash | 7.6 | 8 | 7.3 | “Clean your child's diaper area with plain, warm water. Allow the skin to air dry, or gently pat it dry with a clean cloth. Do not use baby wipes or soap during diaper changes. Before closing the new diaper, make sure your child's bottom is completely dry.”—Micromedex |
| Irritable Bowel Syndrome | 7.9 | 8.1 | 7.1 | “For this test the doctor inserts a long, thin tube, called a colonoscope, into your anus and up into your colon. The tube has a light and tiny lens on the end. The doctor can view the inside of your colon on a big television screen.”—MedlinePlus |
| Scabies | 7.2 | 7.9 | 7.9 | “The scabies mite does not suck blood. It does not transmit any disease other than scabies between people.”—EBSCO |
DISCUSSION
Our study found that the majority of PEMs supplied through Meditech and the NLM were written above target readability, and that Micromedex’s PEMs were significantly more readable than MedlinePlus’ or EBSCO’s. Our findings are consistent with results of studies in which investigators found only 6% to 7% of PEMs to be readable below an 8th grade level.12,13,39 While these studies assessed PEMs distributed in primary care waiting rooms12 and downloaded from the American Academy of Family Physicians (AAFP) website,13 our study is unique in that it examines the PEMs provided for clinician use at the point of care in EHRs. Terry Davis, a health literacy expert, has argued that having PEMs available during a physician visit would encourage physicians to use them as teaching tools, highlighting key points related to the topic of the medical encounter, and that this would be more meaningful for patients than simply having materials available in the waiting room.9
Studies have found a high level of correlation between various readability indices. For instance, Meade et al.33 found a 99% correlation between GFI and SMOG when assessing 49 health-based materials. In our analysis, a significant percentage of documents from each database were below target readability when analyzed using FKI, but above target readability when analyzed by SMOG and GFI. This is because FKI uses a lower criterion score than SMOG and GFI. While FKI predicts the level of reading skill required to correctly answer 75% of the questions on a reading test, GFI and SMOG predict the reading ability required to correctly answer 90-100% of the questions on a reading test.40 Both the Health Literacy Advisor34 and CMS35 warn that FKI underestimates grade reading level by approximately two grades. This likely explains the discrepancy.
As EHRs become the norm in practice settings and efficiency is prioritized by busy clinicians, it is essential that the tools clinicians use to educate their patients be effective. Our results indicate that many of the PEMs made available to physicians at the point of care are inadequate: they are written above recommended reading levels.
Limitations
Our study has several limitations. First, the 300 PEMs analyzed in this study were a random sample from three databases: one national source and one EHR vendor with 27% market penetration. Our sample may not be representative of all PEMs, or of the most commonly used PEMs. Second, the readability calculator used in this study only accommodates English text. Therefore, while each database provides non-English PEMs, these PEMs were excluded from our analysis. Third, results from our readability calculator may differ from results of other online readability calculators. Because calculators use different algorithms to count sentences, words, and syllables, there may be discrepancies between results, even when different calculators use the same formulas.40
Fourth, readability formulas are imperfect predictors of comprehensibility. They account for 50–84% of the variance in text difficulty as measured by comprehension tests.40 This is because readability indices do not account for some variables that influence comprehension and recall, such as visual aids, text organization, syntax, and rhetorical structure.41 Additionally, sentence and word length are not perfect measures of complexity. Multisyllabic words are not always unfamiliar, and short words may be difficult jargon. Our study does not assess how well patients understand the content of the PEMs examined.
Lastly, while it is necessary to improve readability of PEMs, this intervention alone may not be sufficient to improve health outcomes. Few published studies have shown a beneficial effect on patients’ health outcomes from using simplified reading materials alone.
CONCLUSIONS
The majority of PEMs available through Meditech and the NLM are written at reading levels higher than those recommended by expert groups. As EHRs become ubiquitous and health care providers become increasingly pressed for time, the availability of accessible and comprehensible PEMs becomes more important. Improving patient comprehension of disease processes, prevention, and treatment is a necessary step towards improving health outcomes. Given health literacy’s enormous impact on our healthcare system, it is crucial that EHR vendors and the US government provide educational materials written at reading levels that patients at all levels of health literacy can comprehend. Further studies should assess the appropriateness of additional strategies for educating low health literacy patients, such as visual, animated, and interactive PEMs.
Acknowledgements
The Doris Duke Charitable Foundation for The Doris Duke Clinical Research Fellowship.
Seth Zimmerman, BA, computer program design.
Alex Federman, MD, manuscript revisions.
Lauren Stossel had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Conflict of Interest
The authors declare that they do not have a conflict of interest.
REFERENCES
- 1. Carrier ERJ. Expectations outpace reality: physicians’ use of care management tools for patients with chronic conditions. Issue Brief Cent Stud Health Syst Chang. 2009;129:1–4.
- 2. Wallace AS, Seligman HK, Davis TC, Schillinger D, Arnold CL, Bryant-Shilliday B, et al. Literacy-appropriate educational materials and brief counseling improve diabetes self-management. Patient Educ Couns. 2009;75(3):328–33. doi: 10.1016/j.pec.2008.12.017.
- 3. Lorig KR, Sobel DS, Ritter PL, Laurent D, Hobbs M. Effect of a self-management program on patients with chronic disease. Eff Clin Pract. 2001;4(6):256–62.
- 4. Taal E, Rasker JJ, Wiegman O. Patient education and self-management in the rheumatic diseases: a self-efficacy approach. Arthritis Care Res. 1996;9(3):229–38. doi: 10.1002/1529-0131(199606)9:3<229::AID-ANR1790090312>3.0.CO;2-U.
- 5. Jacobson TA, Thomas DM, Morton FJ, Offutt G, Shevlin J, Ray S. Use of a low-literacy patient education tool to enhance pneumococcal vaccination rates: a randomized controlled trial. JAMA. 1999;282(7):646–50. doi: 10.1001/jama.282.7.646.
- 6. Overland JE, Hoskins PL, McGill MJ, Yue DK. Low literacy: a problem in diabetes education. Diabet Med. 1993;10(9):847–50. doi: 10.1111/j.1464-5491.1993.tb00178.x.
- 7. Baker GC, Newton DE, Bergstresser PR. Increased readability improves the comprehension of written information for patients with skin disease. J Am Acad Dermatol. 1988;19(6):1135–41. doi: 10.1016/S0190-9622(88)70280-7.
- 8. The Joint Commission. Advancing Effective Communication, Cultural Competence, and Patient- and Family-Centered Care: A Roadmap for Hospitals. Oakbrook Terrace, IL: The Joint Commission; 2010.
- 9. Davis TC, Wolf MS. Health literacy: implications for family medicine. Fam Med. 2004;36(8):595–8.
- 10. United States Government Accountability Office Report to Congressional Requesters. Medicare: Communications to Beneficiaries on the Prescription Drug Benefit Could Be Improved. Available at: http://www.gao.gov/new.items/d06654.pdf. Accessed on March 12th, 2012.
- 11. Lerner EB, Jehle DVK, Janicke DM, Moscati RM. Medical communication: do our patients understand? Am J Emerg Med. 2000;18(7):764–6. doi: 10.1053/ajem.2000.18040.
- 12. Davis TC, Crouch MA, Wills G, Miller S, Abdehou DM. The gap between patient reading comprehension and the readability of patient education materials. J Fam Pract. 1990;31(5):533–8.
- 13. Wallace LS, Lennon ES. American Academy of Family Physicians patient education materials: can patients read them? Fam Med. 2004;36(8):571–4.
- 14. Strachan PH, de Laat S, Carroll SL, Schwartz L, Vaandering K, Toor GK, et al. Readability and content of patient education material related to implantable cardioverter defibrillators. J Cardiovasc Nurs. 2011.
- 15. Taylor-Clarke K, Henry-Okafor Q, Murphy C, Keyes M, Rothman R, Churchwell A, et al. Assessment of commonly available education materials in heart failure clinics. J Cardiovasc Nurs. 2011.
- 16. Downing MA, Omar AH, Sabri E, McCarthy AE. Information on the internet for asplenic patients: a systematic review. Can J Surg. 2011;54(4):232–6. doi: 10.1503/cjs.005510.
- 17. Aliu O, Chung KC. Readability of ASPS and ASAPS educational web sites: an analysis of consumer impact. Plast Reconstr Surg. 2010;125(4):1271–8. doi: 10.1097/PRS.0b013e3181d0ab9e.
- 18. Tulbert BH, Snyder CW, Brodell RT. Readability of patient-oriented online dermatology resources. J Clin Aesthet Dermatol. 2011;4(3):27–33.
- 19. Badarudeen S, Sabharwal S. Assessing readability of patient education materials: current role in orthopaedics. Clin Orthop Relat Res. 2010;468(10):2572–80. doi: 10.1007/s11999-010-1380-y.
- 20. Wang SW, Capo JT, Orillaza N. Readability and comprehensibility of patient education material in hand-related web sites. J Hand Surg Am. 2009;34(7):1308–15. doi: 10.1016/j.jhsa.2009.04.008.
- 21. Ache KA, Wallace LS. Are end-of-life patient education materials readable? Palliat Med. 2009;23(6):545–8. doi: 10.1177/0269216309106313.
- 22. Helitzer D, Hollis C, Cotner J, Oestreicher N. Health literacy demands of written health information materials: an assessment of cervical cancer prevention materials. Cancer Control. 2009;16(1):70–8. doi: 10.1177/107327480901600111.
- 23. Cox N, Bowmer C, Ring A. Health literacy and the provision of information to women with breast cancer. Clin Oncol (R Coll Radiol). 2011;23(3):223–7. doi: 10.1016/j.clon.2010.11.010.
- 24. Johnson LK, Edelman A, Jensen J. Patient satisfaction and the impact of written material about postpartum contraceptive decisions. Am J Obstet Gynecol. 2003;188(5):1202–4. doi: 10.1067/mob.2003.308.
- 25. Homewood VJ, Zite NB, Wallace LS. Over-the-counter ovulation prediction devices: do accompanying instructions adhere to low-literacy guidelines? J Reprod Med. 2009;54(8):473–7.
- 26. Healthcare Finance News. Economic Stimulus Bill Mandates Electronic Health Records For Every American by 2014 - With No Opt-out Provision. Available at: http://www.healthcarefinancenews.com/press-release/economic-stimulus-bill-mandates-electronic-health-records-every-american-2014-no-opt-o. Accessed on March 12th, 2012.
- 27. New York City Department of Health and Mental Hygiene. What Do Electronic Health Records Mean for Our Practice? Available at: http://www.nyc.gov/html/doh/downloads/pdf/csi/ehrkit-brochure.pdf. Accessed on March 12th, 2012.
- 28. Modern Healthcare. Top vendors for enterprise EMR systems: 2010. Available at: http://www.modernhealthcare.com/section/lists?djoPage=product_details&djoPid=17210&djoTry=1299174101. Accessed on March 12th, 2012.
- 29. Medical Information Technology, Inc. Meditech Patient Education Functionality Brief. Available at: http://www.meditech.com/ProductBriefs/pages/productpagepted.htm. Accessed on March 12th, 2012.
- 30. MedlinePlus. MedlinePlus Awards and Recognition. Available at: http://www.nlm.nih.gov/medlineplus/recognition.html. Accessed on March 12th, 2012.
- 31. MedlinePlus. MedlinePlus Statistics. Available at: http://www.nlm.nih.gov/medlineplus/usestatistics.html. Accessed on March 12th, 2012.
- 32. Random.org. Random Integer Generator. Available at: http://www.random.org/integers/. Accessed on March 12th, 2012.
- 33. Meade CD. Readability formulas: cautions and criteria. Patient Educ Couns. 1990;17:153–8. doi: 10.1016/0738-3991(91)90017-Y.
- 34. Health Literacy Innovations. Newsletter, Volume 1, Issue 1. Available at: http://www.healthliteracyinnovations.com/newsletter/. Accessed on March 12th, 2012.
- 35. Centers for Medicare and Medicaid Services. Toolkit for Making Written Material Clear and Effective, Toolkit Part 7: Using Readability Formulas. Available at: http://www.cms.gov/WrittenMaterialsToolkit/09_ToolkitPart07.asp#TopOfPage. Accessed on March 12th, 2012.
- 36. Online Utility. Readability Calculator. Available at: http://www.online-utility.org/english/readability_test_and_improve.jsp. Accessed on March 12th, 2012.
- 37. University of Minnesota Libraries. Creating Patient Education Materials. Available at: http://www.lib.umn.edu/libdata/page_print.phtml?page_id=839. Accessed on March 12th, 2012.
- 38. Edit Central. Readability Calculator. Available at: http://www.editcentral.com/gwt1/EditCentral.html. Accessed on March 12th, 2012.
- 39. Freda MC. The readability of American Academy of Pediatrics patient education brochures. J Pediatr Health Care. 2005;19(3):151–6. doi: 10.1016/j.pedhc.2005.01.013.
- 40. DuBay WH. The Principles of Readability. Costa Mesa, CA: Impact Information; 2004.
- 41. Reid JC, Klachko DM, Kardash CAM, Robinson RD, Scholes R, Howard D. Why people don't learn from diabetes literature: influence of text and reader characteristics. Patient Educ Couns. 1995;25(1):31–8. doi: 10.1016/0738-3991(94)00688-I.