Abstract
Objective
Physicians who more intensively interact with electronic health records (EHRs) through their documentation style may pay greater attention to coded fields and clinical decision support and thus may deliver higher quality care. We measured the quality of care of physicians who used three predominating EHR documentation styles: dictation, structured documentation, and free text.
Methods
We conducted a retrospective analysis of visits by patients with coronary artery disease and diabetes to the Partners Primary Care Practice Based Research Network. The main outcome measures were 15 EHR-based coronary artery disease and diabetes measures assessed 30 days after primary care visits.
Results
During the 9-month study period, 7000 coronary artery disease and diabetes patients made 18 569 visits to 234 primary care physicians, of whom 20 (9%) predominantly dictated their notes, 68 (29%) predominantly used structured documentation, and 146 (62%) predominantly typed free text notes. In multivariable modeling adjusted for clustering by patient and physician, quality of care appeared significantly worse for dictators than for physicians using the other two documentation styles on three of 15 measures (antiplatelet medication, tobacco use documentation, and diabetic eye exam); better for structured documenters on three measures (blood pressure documentation, body mass index documentation, and diabetic foot exam); and better for free text documenters on one measure (influenza vaccination). There was no measure for which dictators had higher quality of care than physicians using the other two documentation styles.
Conclusions
EHR-assessed quality is necessarily documentation-dependent, but physicians who dictated their notes appeared to have worse quality of care than physicians who used structured EHR documentation.
Clinical trial registration number
ClinicalTrials.gov Identifier: NCT00235040.
Keywords: Medical record systems, computerized; quality of healthcare; physicians; primary care; human-computer interaction and human-centered computing; developing/using clinical decision support (other than diagnostic) and guideline systems, other specific EHR applications (results review); medication administration; disease progression; quality improvement; patient safety; clinical decision support; hospital medicine; uncertain reasoning and decision theory; knowledge representations; designing usable (responsive) resources and systems; knowledge acquisition and knowledge management; demonstrating return on IT investment
Introduction
Electronic health records (EHRs) have been touted as a means to improve the quality of medical care in the USA.1 2 Indeed, EHRs and clinical decision support systems have been shown in certain settings and for certain problems to be associated with improved quality of care.3–5 However, on a national scale, quality gains have not emerged.6–8 Part of the explanation for this EHR use–quality gap may be that physicians are using the EHR mainly for documentation purposes, simply as an electronic replacement for the paper chart.
Physician documentation styles run the gamut from not interacting at all with the EHR—dictating and using the EHR only as a notes repository—to intensive focus on the EHR before, during, and after patient visits with attention to structured fields, coded data entry, and clinical decision support.9 One might expect physicians with documentation styles that lead them to more meaningfully interact with the EHR to have better quality of care. To examine if primary care physicians who more intensively interact with the EHR through their documentation style have better quality of care, we compared quality between physicians who predominantly dictated, used structured documentation, or typed free text notes.
Methods
Overview
We performed a cross-sectional analysis of data collected as part of a physician-randomized, controlled trial of a novel documentation-based clinical decision support system for coronary artery disease and diabetes (CAD/DM) called Smart Forms (ClinicalTrials.gov Identifier: NCT00235040).10 Physicians used the clinical decision support system in only 5.6% of CAD/DM visits. There were modest, statistically significant gains in quality associated with Smart Form use in an intention-to-treat analysis. As-used, Smart Forms were associated with marked improvements in clinical documentation and compliance with clinical decision support alerts.11 For the present analysis, the associations of interest were between three physician documentation styles—dictation, structured documentation, and free text—and visit-level performance on 15 CAD/DM quality measures. The Partners Human Research Committee approved the study protocol.
Setting and EHR
The Partners Primary Care Practice Based Research Network is part of an integrated regional healthcare delivery network in eastern Massachusetts that includes over 20 primary care clinics affiliated with Brigham and Women's Hospital and Massachusetts General Hospital. The main EHR used in Partners HealthCare ambulatory clinics is the Longitudinal Medical Record (LMR). The LMR is an internally developed, full-featured, Certification Commission for Healthcare Information Technology-approved EHR including primary care and subspecialty notes; problem lists; medication lists; coded allergies; and laboratory test and radiographic study results.12 The LMR includes e-prescribing and radiology ordering, but does not include computerized electronic laboratory order entry. The LMR has clinical decision support in the form of reminders for preventive services, chronic care management (including for CAD/DM), and medication monitoring; medication dosing alerts; and medication alerts for drug–drug, drug–lab, drug–condition, and drug–allergy interactions. The LMR includes a registry function called Reports Central that provides physician panel views for a range of preventive services and chronic medical conditions.
Data source
There were 10 primary care practices that participated in the CAD/DM Smart Form randomized controlled trial. Practices received the Smart Form on a rolling basis from March 3, 2007 to August 10, 2007. The study duration was 9 months for each practice. For patients to qualify for analysis they had to have CAD or DM on their EHR problem list as of the day prior to the trial start date for that practice. We previously found these definitions of CAD and DM to have a positive predictive value of 94% and 96%, respectively.13 Eligible visits were defined as those made by eligible patients to a primary care physician who belonged to one of the study practices during that practice's 9-month study period.
Data analysis
Documentation style
Physician documentation style was determined by evaluating 188 554 visit notes written by participating physicians for all of their patients (not just CAD/DM patients) from May 1, 2007 to May 10, 2008. In the EHR, physicians could dictate notes, use structured documentation, or type unstructured free text notes.
Dictations were done via telephone and transcribed and uploaded to the EHR by a third-party transcription service. During the study period, physicians were supposed to review and electronically sign their dictations, but dictations ‘auto-finalized’ after 21 days and became a permanent part of the record even if physicians did not actively review and sign the notes.
Structured documentation templates divide the components of the note into separate sections (eg, history of present illness, review of systems, family history, etc) that can be reused in subsequent notes. As part of structured documentation, physicians could, but were not required to, access and manipulate coded fields within the EHR such as problems, allergies, medications, vital signs, and health maintenance items (eg, screening and chronic disease monitoring tests). For example, a physician could choose to have a patient's problem list automatically imported from the EHR into a note each time a template was used.
Free text notes were generated using a single window, similar to a word processing program. Free text notes could be generated from free text templates, discrete pieces of data (eg, problem list, allergies, or medications) could be brought into the note as free text, and old notes could be ‘carried forward’ in their entirety for editing as a new note.
We divided physicians into three mutually exclusive groups by predominating documentation style: those who dictated more than 25% of their notes (dictation), those who used structured templates for more than 25% of their notes (structured), and the remainder, who mostly typed free text notes. For each predominating style, we calculated the proportion of notes generated using all three documentation types.
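The 25% classification rule can be summarized as a short sketch. The sketch below is illustrative only: the function name and inputs are hypothetical, not drawn from the study's analysis code, and it assumes that dictation takes precedence if a physician exceeded the 25% threshold for more than one note type, which the text does not state explicitly.

```python
def classify_documentation_style(n_dictated, n_structured, n_free_text):
    """Assign a physician's predominating documentation style using the 25% rule.

    Precedence when more than one threshold is exceeded (dictation first,
    then structured) is an assumption made for illustration.
    """
    total = n_dictated + n_structured + n_free_text
    if total == 0:
        return None  # no notes, no classification
    if n_dictated / total > 0.25:
        return "dictation"
    if n_structured / total > 0.25:
        return "structured"
    return "free text"


# Example: a physician with 30% dictated notes would be classified as a dictator.
print(classify_documentation_style(n_dictated=30, n_structured=10, n_free_text=60))
```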
Quality measures
We calculated 15 visit-based quality measures: two for CAD patients, seven for both CAD and DM patients, and six for DM patients (see table 1 for definitions). Measures included quality of documentation (eg, smoking status), medication use (eg, antiplatelet prescribing), testing (eg, HbA1c ordering), management measures (eg, blood pressure control), and vaccinations (eg, influenza). We omitted a measure of tobacco treatment because treatment rates were 1% or lower for documented smokers across the three styles of documentation.11 Quality measures were fulfilled through the presence of data in EHR coded fields including vital signs, medications, allergies, problem list entries, lab tests, and vaccinations. None of the methods of documentation, by themselves, fulfilled the quality measures. At the end of the documentation process, all three documentation styles resulted in text notes, which were not used for quality assessment.
Table 1.
| Quality measure | Denominator (determined day prior to visit) | Numerator (determined 30 days after visit) |
| --- | --- | --- |
| Coronary artery disease | | |
| Antiplatelet medication | No allergy or contraindication to antiplatelet medications | Antiplatelet agent on medication list or new contraindication in allergies or problem list |
| β Blocker | Patient with myocardial infarction, angina, or congestive heart failure, and no allergies or contraindication to β blockers | β Blocker on medication list or a new contraindication in allergies or problem list |
| Coronary artery disease or diabetes | | |
| Cholesterol testing | All patients | LDL test result in clinical data repository or health maintenance section within the previous 13 months |
| Cholesterol control | All patients | An LDL test result <100 mg/dl prior to visit, after visit, or subsequent intensification of antihyperlipidemic therapy on medication list |
| Blood pressure documentation | All patients | Blood pressure result in coded vital signs |
| Blood pressure control | All patients | Blood pressure in coded field <140/90 mm Hg (or <130/80 mm Hg for patients with diabetes or renal disease) as of visit, or subsequent intensification of antihypertensive therapy on medication list |
| Tobacco use documentation | All patients | Tobacco use status in problem list or health maintenance section |
| Body mass index | All patients | Weight within 13 months and height within 5 years if patient >23 years old (otherwise within 13 months) in vital signs |
| Influenza vaccine | All patients visiting between October 1 and April 30 | Receipt of influenza vaccine as noted in immunization section of chart |
| Diabetes | | |
| ACE-I/ARB | No allergy or contraindication to either ACE-I or ARB | ACE-I or ARB on medication list or a new contraindication documented |
| HbA1c testing | All patients | HbA1c test in clinical data repository or health maintenance section within the previous 13 months |
| Glucose control | All patients | HbA1c result <7% as of visit or subsequent intensification of antihyperglycemic therapy on medication list |
| Foot exam | All patients | Foot exam in health maintenance section within the previous 13 months |
| Eye exam | All patients | Eye exam in health maintenance section within the previous 13 months |
| Microalbuminuria testing | All patients | Urine microalbumin testing in clinical data repository or health maintenance section within the previous 13 months |
ACE-I, ACE inhibitor; ARB, angiotensin receptor blocker; LDL, low density lipoprotein.
Eligibility for each measure was determined as of the day prior to the visit. Because these data were from a trial of visit-based documentation, and to give physicians and patients an opportunity to fulfill the quality measures, we calculated quality measures by querying the EHR and Partners Clinical Data Repository 30 days after the patient's visit. Individual patients could have fulfilled a quality measure at an earlier visit but failed to fulfill it at a subsequent visit (eg, cholesterol testing that becomes 'out of date' by a later visit).
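A minimal sketch of this visit-level timing logic (denominator assessed as of the day before the visit; numerator assessed 30 days after the visit) is shown below. It is illustrative only; the `is_eligible` and `is_fulfilled` callables are hypothetical stand-ins for queries against the EHR and clinical data repository.

```python
from datetime import date, timedelta

def visit_measure_status(patient_id, visit_date, is_eligible, is_fulfilled):
    """Evaluate one quality measure for one visit.

    Eligibility (denominator) is checked as of the day before the visit;
    fulfillment (numerator) is checked by querying data as of 30 days after
    the visit, mirroring the window described above.
    """
    eligibility_date = visit_date - timedelta(days=1)   # day prior to visit
    assessment_date = visit_date + timedelta(days=30)   # 30 days after visit
    if not is_eligible(patient_id, eligibility_date):
        return None  # visit is not in the denominator for this measure
    return is_fulfilled(patient_id, assessment_date)    # True counts toward the numerator

# Example with trivial stub queries (real queries would hit coded EHR fields).
always_eligible = lambda patient_id, on_date: True
never_fulfilled = lambda patient_id, on_date: False
print(visit_measure_status("patient-001", date(2007, 6, 1), always_eligible, never_fulfilled))
```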
Covariates
We collected information about physicians including age, gender, level of training, the proportion of patients in their panel with CAD or DM, total patient visits per week, visits per year, and whether they were in the intervention group of the Smart Form trial. We also surveyed physicians about their self-reported experience with the EHR. We collected information about patients including sociodemographic variables; the number of visits and hospitalizations in the previous year; total primary care clinic visits and visits with their own primary care physician during the study period; number of problems and medications listed in the EHR; median household income by zip code; whether their physician was in the trial intervention group; and whether the Smart Form was used during one or more of their visits.
Statistical analysis
We used standard descriptive statistics to describe and compare the baseline characteristics of physicians and patients. We compared continuous variables using ANOVA. We compared categorical variables using Fisher's exact test for physician variables and for patient variables with cells containing five or fewer counts, and the χ2 test for the remaining patient variables. We used SAS V.9.2 (SAS Institute Inc.).
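The bivariate comparisons were performed in SAS; purely for illustration, a Python sketch of the same kinds of tests follows. The ages below are synthetic, the gender counts are taken from table 2, and scipy's Fisher's exact test handles only 2×2 tables, so that step is omitted.

```python
import numpy as np
from scipy import stats

# Continuous variable across the three documentation styles: one-way ANOVA.
# (Ages are synthetic, for illustration only.)
age_dictation = np.array([50, 55, 48, 60, 52])
age_structured = np.array([37, 40, 35, 42, 38])
age_free_text = np.array([34, 36, 38, 33, 39])
f_stat, p_anova = stats.f_oneway(age_dictation, age_structured, age_free_text)

# Categorical variable: chi-square test on a 3 x 2 contingency table.
# Female/male counts by style are taken from table 2.
gender_table = np.array([[10, 10],    # dictation: 10 female, 10 male
                         [42, 26],    # structured
                         [71, 75]])   # free text
chi2, p_chi2, dof, expected = stats.chi2_contingency(gender_table)

print(f"ANOVA p = {p_anova:.3f}; chi-square p = {p_chi2:.3f}")
```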
To adjust for patient and physician clustering and potential confounders, we used SAS-callable SUDAAN V.10.0.1 (RTI International, Research Triangle Park, North Carolina, USA) to create multivariable logistic regression models with each quality measure as the dependent variable. To avoid collinearity, the models included a parsimonious list of patient- and provider-level covariates that were clinically or statistically significant predictors of documentation style on bivariate testing. Although trial intervention group was not statistically significantly related to documentation style on bivariate testing, we included it as an independent variable. We modeled continuous variables linearly. We report the adjusted percentage of visits at which the quality measure was fulfilled for each documentation style and the p value from the Wald F type 3 analysis of effects. We used an adaptive step-up Bonferroni procedure to account for multiple comparisons across the 15 quality measures.14
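The regression modeling itself was done in SAS-callable SUDAAN. Purely to illustrate the multiple-comparisons step, the sketch below implements the simpler, non-adaptive Hochberg step-up procedure; the adaptive variant used here (Hochberg and Benjamini, 1990)14 additionally estimates the number of true null hypotheses, and the p values in the example are invented.

```python
def hochberg_step_up(p_values, alpha=0.05):
    """Non-adaptive Hochberg step-up procedure over a family of p values.

    Sort p values in ascending order, find the largest k with
    p_(k) <= alpha / (m - k + 1), and reject the k hypotheses with the
    smallest p values. Returns reject flags in the original order.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # indices by ascending p
    k_max = 0
    for k in range(1, m + 1):
        if p_values[order[k - 1]] <= alpha / (m - k + 1):
            k_max = k
    reject = [False] * m
    for k in range(k_max):
        reject[order[k]] = True
    return reject


# Example: a family of 15 comparisons with invented p values.
example_p = [0.001, 0.003, 0.004, 0.010, 0.020, 0.030, 0.049,
             0.060, 0.120, 0.200, 0.330, 0.450, 0.600, 0.750, 0.900]
print(hochberg_step_up(example_p))
```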
Results
Documentation style
Based on 188 554 notes (mean per physician, 805), 20 (9%) physicians predominantly dictated their notes, 68 (29%) used structured documentation, and 146 (62%) used free text notes. On average, dictators dictated 67% of their notes, used structured documentation for 4%, and used free text for 32%. Structured documenters dictated <1% of their notes, used structured documentation for 54%, and used free text for 46%. Free text documenters dictated <1% of their notes, used structured documentation for 4%, and used free text for 96%.
Practice, physician, and patient characteristics
During the 9-month study period, in the 10 participating primary care practices, there were 234 primary care physicians who saw 7000 patients with CAD/DM who made 18 569 visits. Practices ranged in size from six physicians to 96 physicians. The proportion of dictators in each practice ranged from 0% in three practices (of 8, 16, and 20 physicians) to 33% in two practices (of 9 and 6 physicians). The proportion of structured documenters in each practice ranged from 11% (1 of 9 physicians) to 67% (4 of 6 physicians). The proportion of free text documenters in each practice ranged from 0% (0 of 6 physicians) to 85% (17 of 20 physicians).
Physicians who predominantly dictated their notes were older, had more visits, and were exclusively attending physicians compared to structured documenters and free text documenters (table 2). Eligible patients had CAD (29%), DM (60%), or both (11%). Patients who saw physicians who predominantly dictated their notes were older; more often men and White; more often had private insurance and less often Medicaid; had fewer visits (than patients who saw predominantly structured documenters); had more documented problems; and had higher median household income by zip code (table 3). Patients who saw physicians who predominantly dictated their notes were also less likely to have had the Smart Form used once or more during the course of their care.
Table 2.
| Physician characteristic | Total (N=234) | Dictation (N=20) | Structured (N=68) | Free text (N=146) | p Value |
| --- | --- | --- | --- | --- | --- |
| Age in years, mean (SD) | 38 (11) | 52 (8) | 38 (10) | 36 (10) | <0.001 |
| Patients in panel with CAD, % mean (SD) | 4 (4) | 4 (3) | 4 (5) | 4 (4) | 0.90 |
| Patients in panel with DM, % mean (SD) | 9 (7) | 6 (5) | 9 (8) | 10 (6) | 0.08 |
| Visits per week, mean (SD) | 16 (17) | 37 (15) | 17 (18) | 12 (15) | <0.001 |
| Visits per year, mean (SD) | 710 (839) | 1759 (777) | 781 (871) | 533 (718) | <0.001 |
| Female, N (%) | 123 (53) | 10 (50) | 42 (62) | 71 (49) | 0.19 |
| Level of training, N (%) | | | | | <0.001 |
| Attending physician, N (%) | 125 (53) | 20 (100) | 38 (56) | 67 (46) | |
| Fellow, N (%) | 2 (1) | 0 (0) | 1 (1) | 1 (1) | |
| Resident, N (%) | 107 (46) | 0 (0) | 29 (43) | 78 (53) | |
| Physicians with ≥1000 visits/year, N (%) | 70 (30) | 14 (70) | 21 (31) | 35 (24) | <0.001 |
| Experience with EHR, N (%) | | | | | 0.81 |
| Very experienced, N (%) | 42 (18) | 4 (20) | 11 (16) | 28 (19) | |
| Somewhat experienced, N (%) | 62 (27) | 5 (25) | 20 (29) | 37 (25) | |
| Somewhat or very inexperienced, N (%) | 37 (16) | 1 (5) | 10 (15) | 26 (18) | |
| Did not respond to survey, N (%) | 92 (39) | 10 (50) | 27 (40) | 55 (38) | |
| Intervention group, N (%) | 131 (56) | 10 (50) | 40 (59) | 81 (55) | 0.77 |
CAD, coronary artery disease; DM, diabetes mellitus; EHR, electronic health record.
Table 3.
| Patient characteristic | Total (N=7000) | Dictation (N=960) | Structured (N=2102) | Free text (N=3938) | p Value |
| --- | --- | --- | --- | --- | --- |
| Age (years), mean (SD) | 65 (14) | 67 (13) | 64 (14) | 64 (14) | <0.001 |
| Visits in previous year, mean (SD) | 4 (4) | 4 (4) | 5 (4) | 4 (4) | 0.01 |
| Visits during study period, mean (SD) | 4 (3) | 4 (3) | 4 (3) | 4 (3) | 0.38 |
| Visits with primary care physician during study period, mean (SD) | 3 (2) | 3 (2) | 3 (2) | 3 (2) | 0.16 |
| Hospitalizations in previous year, mean (SD) | 0.3 (0.9) | 0.3 (0.8) | 0.3 (0.8) | 0.4 (0.9) | 0.06 |
| Problems on problem list, mean (SD) | 8 (5) | 11 (6) | 8 (5) | 8 (4) | <0.001 |
| Active medications, mean (SD) | 7 (4) | 7 (4) | 6 (4) | 7 (4) | 0.007 |
| Median household income by zip code ($), mean (SD) | 52 148 (30 412) | 59 665 (36 519) | 53 796 (30 271) | 49 436 (28 421) | <0.001 |
| Female, N (%) | 3605 (52) | 454 (47) | 1045 (50) | 2106 (53) | <0.001 |
| Race/ethnicity, N (%) | | | | | <0.001 |
| White | 4029 (58) | 709 (73) | 1159 (55) | 2161 (55) | |
| Latino | 1220 (17) | 52 (5) | 358 (17) | 810 (21) | |
| Black | 1050 (15) | 82 (9) | 365 (17) | 603 (15) | |
| Other | 319 (5) | 47 (5) | 119 (6) | 153 (4) | |
| Unknown | 382 (5) | 70 (7) | 101 (5) | 211 (5) | |
| Primary insurance, N (%) | | | | | <0.001 |
| Managed care | 925 (13) | 143 (15) | 313 (15) | 469 (12) | |
| Private | 1210 (17) | 224 (23) | 372 (18) | 614 (16) | |
| Medicare | 3671 (52) | 526 (55) | 1044 (50) | 2101 (53) | |
| Medicaid | 926 (13) | 46 (5) | 289 (14) | 591 (15) | |
| Free care/self-pay/other | 268 (4) | 21 (2) | 84 (4) | 163 (4) | |
| Intervention group, N (%) | 3573 (51) | 440 (46) | 822 (39) | 2311 (59) | <0.001 |
| Smart Form used once or more, N (%) | 218 (3) | 5 (<1) | 123 (6) | 90 (2) | <0.001 |
Quality measures
In multivariable modeling, after adjusting for clustering by patient and physician and for patient and physician covariates, quality of care appeared significantly worse for dictators than for physicians using the other two documentation styles on three of 15 measures: antiplatelet medication, tobacco use documentation, and diabetic eye exam (table 4). Quality appeared better for structured documenters than for physicians using the other two documentation styles on three measures: blood pressure documentation, body mass index documentation, and diabetic foot exam. Quality of care appeared better for free text documenters on one measure (influenza vaccination). There was no measure for which dictators had higher quality of care than physicians using the other two documentation styles.
Table 4.
| Quality measure | Eligible visits, N | Dictation, adjusted % visits fulfilled† | Structured, adjusted % visits fulfilled† | Free text, adjusted % visits fulfilled† | p Value* |
| --- | --- | --- | --- | --- | --- |
| Coronary artery disease | | | | | |
| Antiplatelet medication | 16 723 | 51 | 59 | 56 | 0.03 |
| β Blocker | 1054 | 63 | 69 | 72 | 0.68 |
| Coronary artery disease and diabetes | | | | | |
| Cholesterol testing | 18 569 | 92 | 93 | 92 | 0.68 |
| Cholesterol control | 18 569 | 69 | 70 | 69 | 0.68 |
| Blood pressure documentation | 18 569 | 81 | 98 | 89 | <0.001 |
| Blood pressure control | 18 569 | 54 | 59 | 56 | 0.09 |
| Tobacco use documentation | 18 569 | 22 | 38 | 36 | <0.001 |
| Body mass index | 18 569 | 28 | 40 | 35 | <0.001 |
| Influenza vaccination | 8783 | 60 | 64 | 68 | <0.001 |
| Diabetes | | | | | |
| ACE-I/ARB | 13 572 | 60 | 62 | 62 | 0.68 |
| HbA1c testing | 13 736 | 98 | 99 | 98 | 0.68 |
| Glucose control | 13 736 | 53 | 59 | 57 | 0.33 |
| Foot exam | 13 736 | 11 | 14 | 9 | <0.001 |
| Eye exam | 13 736 | 39 | 53 | 54 | <0.001 |
| Microalbuminuria testing | 13 736 | 84 | 88 | 88 | 0.27 |
*p Value from an adaptive, step-up Bonferroni procedure for the difference in proportions between the three documentation styles for each quality measure, adjusted for clustering by patient; clustering by physician; patient age, gender, race/ethnicity, insurance, median household income by zip code, visits per year, number of medications, and number of problems; physician age, gender, and annual number of visits; and intervention group.
†Percentages are adjusted, as is the p value. We do not provide Ns because the denominators differ for each quality measure and because the percentages are adjusted and cannot be calculated directly from the raw numbers.
ACE-I, ACE inhibitor; ARB, angiotensin receptor blocker.
Discussion
We hypothesized, based on our prior work, that physicians with documentation styles that led them to interact with the EHR to a greater degree would have better quality of care.9 We found that physicians who predominantly dictated their notes appeared to have generally lower quality of care than physicians who used structured EHR documentation or typed free text notes. Physicians who used structured EHR documentation appeared to have generally higher quality of care than the other two documentation styles.
None of the three methods of documentation, by itself, fulfilled the quality measures, and all three resulted in text within clinic notes. So, why might structured documentation have been associated with improved quality of care? Physicians who used the EHR more intensively for documentation could have paid more attention to necessary items that were missing from coded fields. In addition, physicians interacting with the EHR had greater potential to see and respond to clinical decision support before, during, or after the patient visit, some of which was relevant to CAD/DM documentation and care. Notes could be dictated without interacting with, or even necessarily looking at, the EHR. Dictations were uploaded to the EHR as unstructured free text. In addition, dictation built in a documentation delay, with unstructured information reaching the chart potentially days after the patient visit, when an opportunity to take action may have passed. Of course, use of structured EHR documentation may simply be a marker for physicians who were more attentive to quality measure-relevant detail, but our analysis was adjusted for clustering by physician.
Although it might appear obvious that physicians who use a more EHR-intensive documentation style would have better EHR-documented quality of care, this is not a foregone conclusion. Physicians who dictate could potentially have more time with their patients, time to review quality reports, and time to direct practice staff to enter structured data. Until large scale natural language processing (NLP) can produce structured data from dictated and free text reports, structured data entry will be an essential input to both clinical decision support and increasingly detailed quality measurement. Even dictation with advanced NLP may not be ideal because it would limit physicians' interaction with the EHR and clinical decision support.
Most of the differences we found were for ‘documentation-dependent’ quality measures that required coded information in the right place, for example, for blood pressure, tobacco use, and body mass index. However, even if documentation style were only associated with better documentation quality, a complete and accurate record is important to demonstrate high-quality care and inform clinical decision support and population management.15 16 Put simply, good documentation drives quality improvement and vice versa. In addition, while there are many ways to measure quality,17 18 because of the limitations in administrative data,19 the future of large scale quality measurement probably lies with EHR-based quality measurement that will be dependent on structured documentation. Indeed, the national Meaningful Use incentive program will increase EHR use and has quality measurement as one important EHR capability.20
Our findings are consistent with other studies showing that the simple presence of an EHR was not associated with improved quality,6–8 but that use of certain EHR features, such as the problem list, radiology result features, and visit note functionality, was associated with improved quality.21 There have been many examples in which structured electronic documentation has been associated with increased timeliness, increased completeness, decreased errors, and increased report quality for operative reports, disability exams, discharge summaries, and radiology reports.22–27
Although structured documentation was associated with better quality of care, it is worth noting that a separate study of self-reported satisfaction with documentation method in our system found that physicians who used structured documentation were the least satisfied with their method of documentation.28 It may be that physicians who use structured documentation are less satisfied but recognize its benefits, such as generating reusable data elements for later documentation, and are more likely to enter coded data, which facilitates decision support and higher quality of care for their patients. Of course, structured documentation and information system rigidity can go too far: overly structured or singularly focused templates or clinical decision support systems may go largely unused by physicians.11 29–35
Our analysis has limitations that should be considered. First, we have described an association and can only speculate about causality. Second, we presumed documentation style was stable during the study period, based it on a broader range of visits than just CAD/DM visits, and did not have data about the documentation type used at each individual CAD/DM visit. Third, we did not consider other methods of documentation, such as the 'scribe' method or voice recognition software. Fourth, our sample size, especially for dictating physicians, was relatively small. Fifth, we assessed a restricted set of 15 quality measures that were specific to CAD/DM; these findings may not generalize to other conditions or measures. Sixth, our quality measures were forgiving: in some cases they required only intensification of therapy or documentation of a contraindication within 30 days of a visit, rather than actual risk factor control. Seventh, our method of adjusting for clustering and confounding was complex, assumed similar clustering and confounding for each of the quality measures, and did not account for unmeasured confounding. In particular, we included proxies for severity of illness such as patient age and number of problems, medications, and visits. Regardless, it is possible that the patients of physicians who dictated their notes were still sicker than the patients of physicians using the other two documentation styles and that this confounded the results of our analyses (perhaps because it is more difficult to document high quality care in sicker patients, although this is not necessarily true). We could not adjust for level of training because all of the dictators were attending physicians; there were proportionally more residents among the free text documenters, but we were able to adjust for other physician, panel, and patient characteristics as well as for clustering by physician. Finally, this study was conducted at a single academically affiliated network of primary care practices. However, this limitation is also a strength, as all of the physicians provided care in a similar system using a single EHR regardless of their documentation style.
Conclusion
We found that physicians who predominantly dictated their notes appeared to have worse quality of care, especially compared with physicians who used structured EHR documentation. Potential solutions include increasing the usability of structured documentation systems so they are more appealing to physicians, improving NLP and other technologies to pull structured and coded data out of free text or dictated notes, and better use of affiliated staff, such as medical assistants or nurses, to enter critical coded data. Whatever documentation style physicians use, practices need systems to ensure that critical coded information is captured and deficits in quality are addressed.
Footnotes
Funding: This study was supported by a grant from the Agency for Healthcare Research and Quality (R01 HS015169). Dr Linder was supported by a career development award (K08 HS014563) from the Agency for Healthcare Research and Quality. Dr Schnipper was supported by a mentored career development award from the National Heart, Lung, and Blood Institute (K08 HL072806). The sponsors had no role in the design and conduct of the study; collection, management, analysis, or interpretation of the data; or preparation, review, or approval of the manuscript. The authors had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Competing interests: None.
Ethics approval: Ethics approval was provided by Partners HealthCare Human Research Committee.
Provenance and peer review: Not commissioned; externally peer reviewed.
References
1. Bush GW. State of the Union Address. 2006. http://georgewbush-whitehouse.archives.gov/stateoftheunion/2006/ (accessed 14 Sep 2011).
2. Obama B. State of the Union Address. 2010. http://www.whitehouse.gov/the_press_office/remarks-of-president-barack-obama-address-to-joint-session-of-congress/ (accessed 14 Sep 2011).
3. Hunt DL, Haynes RB, Hanna SE, et al. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review. JAMA 1998;280:1339–46.
4. Garg AX, Adhikari NK, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005;293:1223–38.
5. Chaudhry B, Wang J, Wu S, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med 2006;144:742–52.
6. Linder JA, Ma J, Bates DW, et al. Electronic health record use and the quality of ambulatory care in the United States. Arch Intern Med 2007;167:1400–5.
7. Keyhani S, Hebert PL, Ross JS, et al. Electronic health record components and the quality of care. Med Care 2008;46:1267–72.
8. Romano MJ, Stafford RS. Electronic health records and clinical decision support systems: impact on national ambulatory care quality. Arch Intern Med 2011;171:897–903.
9. Linder JA, Schnipper JL, Tsurikova R, et al. Barriers to electronic health record use during patient visits. AMIA Annu Symp Proc 2006:499–503.
10. Schnipper JL, Linder JA, Palchuk MB, et al. “Smart Forms” in an electronic medical record: documentation-based clinical decision support to improve disease management. J Am Med Inform Assoc 2008;15:513–23.
11. Schnipper JL, Linder JA, Palchuk MB, et al. Effects of documentation-based decision support on chronic disease management. Am J Manag Care 2010;16(12 Suppl HIT):SP72–81.
12. Certification Commission for Healthcare Information Technology. Partners Healthcare System, Partners Longitudinal Medical Record LMR 9.0. 2011. http://www.cchit.org/products/2011-2012/arrafinalruleeligibleprovider/3065 (accessed 31 Aug 2011).
13. Maviglia SM, Teich JM, Fiskio J, et al. Using an electronic medical record to identify opportunities to improve compliance with cholesterol guidelines. J Gen Intern Med 2001;16:531–7.
14. Hochberg Y, Benjamini Y. More powerful procedures for multiple significance testing. Stat Med 1990;9:811–18.
15. Soto CM, Kleinman KP, Simon SR. Quality and correlates of medical record documentation in the ambulatory care setting. BMC Health Serv Res 2002;2:1–7.
16. Logan JR, Gorman PN, Middleton B. Measuring the quality of medical records: a method for comparing completeness and correctness of clinical encounter data. AMIA Annu Symp Proc 2001:408–12.
17. Peabody JW, Luck J, Glassman P, et al. Comparison of vignettes, standardized patients, and chart abstraction: a prospective validation study of 3 methods for measuring quality. JAMA 2000;283:1715–22.
18. Pronovost PJ, Lilford R. A road map for improving the performance of performance measures. Health Aff (Millwood) 2011;30:569–73.
19. Peabody JW, Luck J, Jain S, et al. Assessing the accuracy of administrative data in health information systems. Med Care 2004;42:1066–72.
20. Blumenthal D, Tavenner M. The “Meaningful use” regulation for electronic health records. N Engl J Med 2010;363:501–4.
21. Poon EG, Wright A, Simon SR, et al. Relationship between use of electronic health record features and health care quality: results of a statewide survey. Med Care 2010;48:203–9.
22. Cowan DA, Sands MB, Rabizadeh SM, et al. Electronic templates versus dictation for the completion of Mohs micrographic surgery operative notes. Dermatol Surg 2007;33:588–95.
23. Johnson AJ, Chen MY, Zapadka ME, et al. Radiology report clarity: a cohort study of structured reporting compared with conventional dictation. J Am Coll Radiol 2010;7:501–6.
24. Fielstein EM, Brown SH, McBrine CS, et al. The effect of standardized, computer-guided templates on quality of VA disability exams. AMIA Annu Symp Proc 2006:249–53.
25. Laflamme MR, Dexter PR, Graham MF, et al. Efficiency, comprehensiveness and cost-effectiveness when comparing dictation and electronic templates for operative reports. AMIA Annu Symp Proc 2005:425–9.
26. van Walraven C, Laupacis A, Seth R, et al. Dictated versus database-generated discharge summaries: a randomized clinical trial. CMAJ 1999;160:319–26.
27. Kuhn K, Gaus W, Wechsler JG, et al. Structured reporting of medical findings: evaluation of a system in gastroenterology. Methods Inf Med 1992;31:268–74.
28. Neri PM, Wilcox AR, Volk LA, et al. Primary care providers' clinical documentation method and electronic health record satisfaction. AMIA Annu Symp Proc 2009:984.
29. Tai SS, Nazareth I, Donegan C, et al. Evaluation of general practice computer templates. Lessons from a pilot randomised controlled trial. Methods Inf Med 1999;38:177–81.
30. Linder JA, Schnipper JL, Tsurikova R, et al. Documentation-based clinical decision support to improve antibiotic prescribing for acute respiratory infections in primary care: a cluster randomised controlled trial. Inform Prim Care 2010;17:231–40.
31. Murray MD, Harris LE, Overhage JM, et al. Failure of computerized treatment suggestions to improve health outcomes of outpatients with uncomplicated hypertension: results of a randomized controlled trial. Pharmacotherapy 2004;24:324–37.
32. Tierney WM, Overhage JM, Murray MD, et al. Can computer-generated evidence-based care suggestions enhance evidence-based management of asthma and chronic obstructive pulmonary disease? A randomized, controlled trial. Health Serv Res 2005;40:477–97.
33. Eccles M, McColl E, Steen N, et al. Effect of computerised evidence based guidelines on management of asthma and angina in adults in primary care: cluster randomised controlled trial. BMJ 2002;325:941.
34. Rousseau N, McColl E, Newton J, et al. Practice based, longitudinal, qualitative interview study of computerised evidence based guidelines in primary care. BMJ 2003;326:314.
35. Middleton B, Renner K, Leavitt M. Ambulatory practice clinical information management: problems and prospects. Healthc Inf Manage 1997;11:97–112.