Journal of the American Medical Informatics Association (JAMIA). 2016 Jun 29;24(1):140–144. doi: 10.1093/jamia/ocw067

Comparison of accuracy of physical examination findings in initial progress notes between paper charts and a newly implemented electronic health record

Siddhartha Yadav 1,2, Noora Kazanji 1, Narayan K C 3, Sudarshan Paudel 4, John Falatko 1, Sandor Shoichet 1, Michael Maddens 1, Michael A Barnes 1
PMCID: PMC7654088  PMID: 27357831

Abstract

Introduction: There have been several concerns about the quality of documentation in electronic health records (EHRs) when compared to paper charts. This study compares the accuracy of physical examination findings documentation between the two in initial progress notes.

Methodology: Initial progress notes from patients admitted to Beaumont Hospital, Royal Oak, between August 2011 and July 2013 with 5 specific diagnoses associated with invariable physical findings were randomly selected for this study. A total of 500 progress notes were retrospectively reviewed. The paper chart arm consisted of progress notes completed prior to the transition to an EHR on August 1, 2012. The remaining charts were placed in the EHR arm. The primary endpoints were accuracy, inaccuracy, and omission of information. Secondary endpoints were time of initiation of the progress note, word count, number of systems documented, and accuracy by level of training.

Results: The rate of inaccurate documentation was significantly higher in the EHRs than in the paper charts (24% vs 4.4%). However, expected physical examination findings were more likely to be omitted in the paper notes than in the EHRs (41.2% vs 17.6%). Resident physicians had lower rates of inaccuracies (5.3% vs 17.3%) and omissions (16.8% vs 33.9%) than attending physicians.

Conclusions: During the initial phase of implementation of an EHR, inaccuracies were more common in progress notes in the EHR compared to the paper charts. Residents had a lower rate of inaccuracies and omissions compared to attending physicians. Further research is needed to identify training methods and incentives that can reduce inaccuracies in EHRs during initial implementation.

Keywords: electronic health record, EHR, EMR, paper chart, accuracy, inaccuracy, physical examination

Introduction

The Health Information Technology for Economic and Clinical Health (HITECH) Act has ushered in the transition from paper charts to electronic health records (EHRs) in the United States.1 HITECH is an incentive payment system, executed through Medicare and Medicaid, for clinicians and hospitals when they use EHRs to achieve specified improvements in care delivery.2 Adoption of EHR systems by nonfederal acute care hospitals has steadily increased since HITECH, with 59% of nonfederal acute care hospitals using at least a basic EHR system in 2013.3

Despite the perceived benefits, the transition to EHRs has not been without problems. Physicians’ resistance to implementation of EHRs continues to be a serious challenge.4–7 Part of this resistance originates from how EHRs affect physicians’ workflow.5,6 Because EHRs can be accessed remotely, physicians are no longer required to compose their notes at the patient’s location, which has raised significant concerns about their ability to recall findings and document them correctly. In addition, use of templates and the copy-paste function has increased verbosity, perhaps impairing communication through the daily progress note.8,9

The accurate delivery of information within the care team is vital when caring for a hospitalized patient. Although EHRs offer several advantages over paper documentation,10–13 concerns regarding accuracy remain. This study evaluates the accuracy of physical examination findings between EHRs and paper medical records following a health system’s conversion to an EHR.

MATERIALS AND METHODS

This was a retrospective study assessing the accuracy of physical examination findings found in paper charts compared to an EHR. Approval by the Human Investigation Committee at Beaumont Health was obtained prior to initiation of the study. Medical records of inpatient admissions from August 1, 2011, to July 31, 2013, at Beaumont Hospital, Royal Oak, Michigan, were reviewed. This time frame was chosen because the hospital converted from paper documentation to a fully functional EHR (Epic Systems Corporation, Verona, WI, USA) on August 1, 2012.

Note selection

Medical charts were identified for this study by searching International Classification of Diseases, 9th Revision (ICD-9) codes for 5 diagnoses: permanent atrial fibrillation, aortic stenosis, intubation, lower limb amputation, and cerebrovascular accident (CVA) with hemiparesis. These diagnoses were chosen because certain physical findings would be expected to be invariably present on physical exam. The paper documentation arm consisted of patients admitted to the hospital between August 2011 and July 2012. The EHR arm consisted of patients admitted to the hospital between August 2012 and July 2013. Based on these criteria, 2820 medical charts in the paper documentation arm and 2767 charts in the EHR arm were identified. A random number generator was then used to select 50 charts from each diagnosis code for each arm. The existence of each diagnosis was verified by review of the medical charts by a reviewer not involved in data collection. Eleven medical charts from the paper documentation arm and 16 medical charts from the EHR arm were excluded due to uncertainty in diagnosis or inappropriate coding. A new medical chart was randomly selected and reviewed to replace any excluded chart. If a patient had more than 1 hospital admission during the study period, only the first hospital admission was included for progress note selection.
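The selection procedure above (50 randomly chosen charts per diagnosis code per arm, sampled without replacement) can be sketched as follows. The chart identifiers, pool sizes, and seed below are illustrative assumptions, not the study's actual data.

```python
import random

# Sketch of the chart-selection step described above. The diagnosis
# labels, synthetic chart IDs, and fixed seed are assumptions made
# for illustration only.
def select_charts(charts_by_diagnosis, per_diagnosis=50, seed=2012):
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    # random.sample draws without replacement, as the study requires
    return {dx: rng.sample(charts, per_diagnosis)
            for dx, charts in charts_by_diagnosis.items()}

diagnoses = ["permanent atrial fibrillation", "aortic stenosis",
             "intubation", "lower limb amputation", "CVA with hemiparesis"]
# Hypothetical pool of eligible chart IDs for one study arm:
arm = {dx: [f"{dx[:4]}-{i:04d}" for i in range(560)] for dx in diagnoses}
selected = select_charts(arm)
```

In practice an excluded chart (for uncertain diagnosis or miscoding) would simply be replaced by another random draw from the remaining pool, as the authors describe.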

The first daily progress note from the hospital admission was reviewed. History and physical examination notes, consults, and discharge summaries were not included. Only notes from internal medicine physicians were used. Both residents’ and attending physicians’ notes were included.

Data collection and outcomes

Data on baseline characteristics and outcome variables were collected by retrospective review of the medical records. It was not possible to blind data collectors to the type of chart being evaluated.

The primary outcome was the accuracy of documentation of the physical examination abnormality, recorded as accurate, inaccurate, or omitted (not recorded). Secondary outcomes were time of initiation of the progress note, word count, number of systems documented, and comparison by level of training. Data on length of stay and number of home medications were collected to compare the complexity of patients between the 2 arms.

Definitions of endpoints

A physical examination finding was considered to be accurate if the expected finding for the diagnosis being evaluated was documented in the physical examination portion of the progress note. A physical examination finding was considered to be inaccurate if a normal finding or the opposite of the expected finding for the diagnosis being evaluated was documented. A physical examination finding was considered omitted if there was no mention of a physical finding associated with the diagnosis. A few examples of these are listed in Table 1.

Table 1.

Examples of accurate, inaccurate, and omitted physical examination findings

Permanent atrial fibrillation (expected finding: irregularly irregular heartbeat)
  • Accurate: Irregular rhythm
  • Inaccurate: Regular rhythm
  • Omitted: No mention of cardiac rhythm in cardiac exam

CVA with hemiparesis (expected finding: hemiparesis)
  • Accurate: Left-sided weakness noted
  • Inaccurate: No focal neurological deficit noted
  • Omitted: No mention of muscle strength in physical examination

Severe aortic stenosis (expected finding: presence of a murmur)
  • Accurate: Murmur present
  • Inaccurate: Murmur absent
  • Omitted: No mention of murmur in cardiac exam

Lower limb amputation (expected finding: absence of a limb)
  • Accurate: Below-knee amputation noted
  • Inaccurate: Bilateral ankle edema noted (in a patient with below-knee amputation)
  • Omitted: No mention of status of extremities

Intubation (expected finding: presence of endotracheal tube)
  • Accurate: Patient is intubated
  • Inaccurate: Normal speech
  • Omitted: No mention of endotracheal tube in subjective or objective portion of progress note
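In code, the three-way classification illustrated in Table 1 might look like the keyword sketch below. The study itself classified notes by manual chart review, so the function and term lists here are illustrative assumptions, not the authors' method.

```python
# Hedged sketch of the accurate / inaccurate / omitted classification
# from Table 1. Keyword lists are illustrative assumptions; the actual
# study relied on human reviewers reading each note.
def classify_finding(note_text, accurate_terms, inaccurate_terms):
    text = note_text.lower()
    if any(t in text for t in accurate_terms):
        return "accurate"      # expected finding documented
    if any(t in text for t in inaccurate_terms):
        return "inaccurate"    # normal or opposite finding documented
    return "omitted"           # finding not mentioned at all

# Example terms for permanent atrial fibrillation (assumed, per Table 1):
accurate_terms = ["irregular rhythm", "irregularly irregular"]
inaccurate_terms = ["regular rate and rhythm", "regular rhythm"]

print(classify_finding("Cardiac: irregular rhythm, no murmur",
                       accurate_terms, inaccurate_terms))
```

Note the order of checks matters: "irregular rhythm" is tested before "regular rhythm", since the latter is a substring of the former.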

Statistical analysis

Statistical analysis was performed using SPSS 21 (SPSS Statistics for Windows, Version 21.0, released 2012, IBM Corp., Armonk, NY). Fisher’s exact test was used for categorical variables and Mann-Whitney U-test was used for continuous variables. All tests were 2-sided. Statistical significance was considered at P < .05.
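As a concreteness check, the headline comparison of inaccuracy rates (Table 3: 11/250 paper vs 60/250 EHR) can be reproduced with a two-sided Fisher's exact test. The sketch below re-derives the test from the hypergeometric distribution using only the standard library; it is an illustration, not the authors' original SPSS analysis.

```python
from math import comb

# Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]],
# applied to the inaccuracy counts reported in Table 3.
def fisher_exact_two_sided(a, b, c, d):
    row1, col1, n = a + b, a + c, a + b + c + d

    def hyper_p(x):
        # hypergeometric probability of observing x in cell (1,1)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = hyper_p(a)
    lo = max(0, row1 - (n - col1))   # smallest feasible cell count
    hi = min(row1, col1)             # largest feasible cell count
    # Sum probabilities of all tables at most as likely as the observed one
    return sum(hyper_p(x) for x in range(lo, hi + 1)
               if hyper_p(x) <= p_obs * (1 + 1e-9))

# Paper chart: 11 of 250 inaccurate; EHR: 60 of 250 inaccurate
p_value = fisher_exact_two_sided(11, 250 - 11, 60, 250 - 60)
print(f"p = {p_value:.3g}")
```

The resulting p-value is far below .05, consistent with the "<.001" reported for this row of Table 3.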

RESULTS

A total of 500 charts were reviewed for this study. There were no significant differences in median age, sex, or medical complexity between the paper notes and the EHR notes (Table 2). The proportion of notes written by residents and attending physicians was similar in both arms of the study; approximately three-quarters of the notes were from attending physicians.

Table 2.

Baseline characteristics and complexity of charts evaluated

Paper chart (n = 250) EHR (n = 250) Total (n = 500) P-value
Median age (years) 75 77 75 .31
Sex, n (%) .17
 Female 107 (42.8) 123 (49.2) 230 (46)
 Male 143 (57.2) 127 (50.8) 270 (54)
Baseline complexity of patients
 Median length of stay (days) 7 6 6 .13
 Median number of home medications 13 13 13 .25
Types of notes evaluated, n (%) .41
 Attending 180 (72) 189 (75.6) 369 (73.8)
 Resident 70 (28) 61 (24.4) 131 (26.2)

Accuracies, inaccuracies, and omissions

Inaccurate documentation was significantly higher in EHRs compared to paper notes (24% vs 4.4%) (Table 3). There were more omissions in the paper notes compared to electronic notes (41.2% vs 17.6%). These findings were noted across most individual diagnoses except intubation. Accurate documentation rates were similar between paper charts and EHRs, at 54.4% and 58.4%, respectively.

Table 3.

Primary outcome: accuracies, inaccuracies, and omissions in physical examination findings documentation

Paper chart (n = 250), n (%) EHR (n = 250), n (%) Total (n = 500), n (%) P-value
Accurate documentation 136 (54.4) 146 (58.4) 282 (56.4) .41
 Permanent atrial fibrillation 29 (58) 28 (56) 57 (57) .84
 CVA with hemiparesis 24 (48) 30 (60) 54 (54) .22
 Severe aortic stenosis 18 (36) 20 (40) 38 (38) .68
 Lower limb amputation 38 (76) 32 (64) 70 (70) .19
 Intubation 27 (54) 36 (72) 63 (63) .06
Inaccurate documentation 11 (4.4) 60 (24) 71 (14.2) <.001
 Permanent atrial fibrillation 4 (8) 16 (32) 20 (20) <.05
 CVA with hemiparesis 2 (4) 10 (20) 12 (12) <.05
 Severe aortic stenosis 4 (8) 20 (40) 24 (24) <.001
 Lower limb amputation 1 (2) 14 (28) 15 (15) <.001
 Intubation 0 0 0
Physical examination finding not documented 103 (41.2) 44 (17.6) 147 (29.4) <.001
 Permanent atrial fibrillation 17 (34) 6 (12) 23 (23) <.001
 CVA with hemiparesis 24 (48) 10 (20) 34 (34) <.001
 Severe aortic stenosis 28 (56) 10 (20) 38 (38) <.001
 Lower limb amputation 11 (22) 4 (8) 15 (15) .05
 Intubation 23 (46) 14 (28) 37 (37) .06
Median time of initiation of notes 9:29 a.m. 11:11 a.m. 10:14 a.m. <.001
Median word count of physical exam 15 60 28 <.001
Median number of systems documented 4 8 6 <.001

Level of training analysis

Resident physicians had a significantly higher rate of accurate documentation compared to attending physicians, at 77.9% and 48.8%, respectively (P < .001) (Table 4). Resident physicians had fewer inaccuracies (5.3% vs 17.3%, P < .001) and omissions (16.8% vs 33.9%, P < .001) compared to attending physicians. When analyzed by the type of chart, resident physicians had higher accuracy compared to attending physicians in both paper charts and EHRs. They also had fewer omissions in paper charts and fewer inaccuracies in EHRs compared to attending physicians.

Table 4.

Accuracies, inaccuracies, and omissions in physical examination findings documentation by level of training

Resident (n = 131), n (%) Attending (n = 369), n (%) Total (n = 500), n (%) P-value
Accurate documentation 102 (77.9) 180 (48.8) 282 (56.4) <.001
 Paper chart 53 (75.7) 83 (46.1) <.001
 EHR 49 (80.3) 97 (51.3) <.001
Inaccurate documentation 7 (5.3) 64 (17.3) 71 (14.2) <.001
 Paper chart 2 (3) 9 (5) .45
 EHR 5 (8.2) 55 (29.1) <.001
Physical examination finding not documented 22 (16.8) 125 (33.9) 147 (29.4) <.001
 Paper chart 15 (21.4) 88 (48.9) <.001
 EHR 7 (11.5) 37 (19.6) .149

Time of initiation, word count, and number of systems documented in progress notes

Notes were initiated earlier in paper charts than in EHRs (median 9:29 a.m. vs 11:11 a.m., P < .001). The median number of words in the physical examination was 15 in paper charts, compared to 60 in EHRs (P < .001). The median number of systems documented was 4 in paper charts, compared to 8 in EHRs (P < .001).

DISCUSSION

This study identifies several important findings. First, physical exam findings were more likely to be documented inaccurately in the EHR system, whereas omissions were more likely in the paper system. Second, resident physician documentation was more accurate than attending physician documentation, independent of the mode of documentation. Third, notes were written earlier in the workday in the paper system. Overall accuracy of documentation was poor: only 54.4% of paper notes and 58.4% of EHR notes accurately documented the expected physical exam finding.

Several studies have compared different aspects of accuracy between paper charts and EHRs, with mixed results.14–19 A study of ophthalmology trainees showed similar accuracy in documenting standard physical examination findings between paper charts and EHRs; however, EHRs consumed more of the physician’s time.14 Another study of parallel paper and electronic notes found that 7% of electronic notes differed significantly from their paper counterparts and that incomplete documents were more common in the EHR.17

Our study identifies several significant differences from previously published studies. First, we identify that the type of error is dependent on the system being used. Omissions were more common in the paper system. We suspect this is due to physician time constraints. The lower number of words and systems documented in paper charts also supports this. Inaccuracies were more common in the EHR. We suspect this is due to the use of macros, templates, and copied notes. This theory would align with published reports regarding the prevalence of copied material in established EHRs.20–23 In addition, the inability to recall observations due to the significant time difference between patient visit and documentation may play a role.24,25

For diagnoses whose findings were obvious at first glance, such as intubation or amputation, accuracy was higher and inaccuracies and omissions were lower in both paper charts and EHRs. The converse was true for diagnoses requiring a more subtle, focused examination, such as atrial fibrillation or aortic stenosis. Prior studies have demonstrated that physicians spend less than half their time in direct patient care, including physical examination,26,27 which may explain why errors were more common for findings that required a more subtle and thorough examination to detect.

For both paper charts and EHRs, residents outperformed attending physicians, with higher accuracy and fewer inaccuracies and omissions. This may be a result of increased oversight from other residents, medical students, and attending physicians in resident cases. Although oversight of resident activities can reduce medical errors,28–30 its role in preventing documentation errors is not known. Residents’ greater comfort with and acceptance of EHRs, owing to their younger age and ability to multitask, may also have contributed to improved documentation.31–33

There are several strengths of this study to highlight. We analyzed physical examination documentation from real patient encounters rather than in a simulated environment. A large number of medical charts with specific physical exam findings were reviewed, which allowed us to detect differences between the 2 systems. Finally, our observations may provide some insight into other health systems as they convert from paper systems to EHRs.

There are several limitations of this study. Although the accuracy of the ICD-9 diagnosis code was reviewed and verified in the medical record, the actual presence of the physical exam abnormality was not confirmed at the time of documentation. The ability of individual physicians to detect physical exam abnormalities was not assessed. The severity of a given abnormality was not identified. Only initial progress notes were included, whereas history and physical examination notes, which may be more comprehensive, were excluded. Data collectors were resident physicians who were not blinded to the type of chart. The study was performed at a single center.

Finally, an important limitation of this study is that the consequences of inaccurate documentation or omission of physical examination findings were not evaluated. Although both are undesirable, their adverse effects on patient care and outcomes are not well understood.

CONCLUSION

During the initial phase of implementation of an EHR system, inaccuracies were more common in progress notes in EHRs than in paper charts. Level of training also influenced the accuracy of documentation. Inaccuracies in the EHR may stem from the use of templates and copied notes and from significant delays between the patient encounter and note initiation. During the initial phase of EHR implementation, hospital systems should discourage the use of copied notes and encourage completion of progress notes at the time of the patient encounter. As EHRs become more widely disseminated, research should focus on training programs and incentives that support accurate documentation.

Contributors

All authors were involved in study design. S.Y. reviewed the accuracy of diagnoses and selected the charts using a random number generator. N.K., N.K.C., and S.P. were involved in data collection. S.Y. performed the statistical analysis. S.S., M.M., and M.B. provided supervision and guidance throughout the study period. All authors were involved in interpretation of data and in manuscript writing, editing, and revision.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sector.

Competing interests

All authors declare that they have no competing interests.

REFERENCES

  • 1. Blumenthal D. Launching HITECH. N Engl J Med. 2010;362(5):382–385.
  • 2. Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med. 2010;363(6):501–504.
  • 3. Charles D, King J, Patel V, Furukawa MF. Adoption of Electronic Health Record Systems Among U.S. Non-federal Acute Care Hospitals: 2008–2012. Washington, DC: Office of the National Coordinator for Health Information Technology; 2013 (ONC Data Brief No. 9). http://www.healthit.gov/sites/default/files/oncdatabrief9final.pdf. Accessed September 15, 2015.
  • 4. Ford EW, Menachemi N, Peterson LT, Huerta TR. Resistance is futile: but it is slowing the pace of EHR adoption nonetheless. J Am Med Inform Assoc. 2009;16(3):274–281.
  • 5. Grabenbauer L, Skinner A, Windle J. Electronic health record adoption—maybe it's not about the money: physician super-users, electronic health records and patient care. Appl Clin Inform. 2011;2(4):460–471.
  • 6. Grabenbauer L, Fraser R, McClay J, et al. Adoption of electronic health records: a qualitative study of academic and private physicians and health administrators. Appl Clin Inform. 2011;2(2):165–176.
  • 7. Zandieh SO, Yoon-Flannery K, Kuperman GJ, et al. Correlates of expected satisfaction with electronic health records in office practices by practitioners. AMIA Annu Symp Proc. 2008:1190.
  • 8. Hirschtick RE. A piece of my mind. John Lennon's elbow. JAMA. 2012;308(5):463–464.
  • 9. Thornton JD, Schold JD, Venkateshaiah L, Lander B. Prevalence of copied information by attendings and residents in critical care progress notes. Crit Care Med. 2013;41(2):382–388.
  • 10. Koppel R, Metlay JP, Cohen A, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005;293(10):1197–1203.
  • 11. Romano MJ, Stafford RS. Electronic health records and clinical decision support systems: impact on national ambulatory care quality. Arch Intern Med. 2011;171(10):897–903.
  • 12. Black AD, Car J, Pagliari C, et al. The impact of eHealth on the quality and safety of health care: a systematic overview. PLoS Med. 2011;8(1):e1000387.
  • 13. Cebul RD, Love TE, Jain AK, et al. Electronic health records and quality of diabetes care. N Engl J Med. 2011;365(9):825–833.
  • 14. Chan P, Thyparampil PJ, Chiang MF. Accuracy and speed of electronic health record versus paper-based ophthalmic documentation strategies. Am J Ophthalmol. 2013;156(1):165–172.e2.
  • 15. Stausberg J, Koch D, Ingenerf J, et al. Comparing paper-based with electronic patient records: lessons learned during a study on diagnosis and procedure codes. J Am Med Inform Assoc. 2003;10(5):470–477.
  • 16. Smith CA, Haque SN. Paper versus electronic documentation in complex chronic illness: a comparison. AMIA Annu Symp Proc. 2006:734–738.
  • 17. Mikkelsen G, Aasly J. Concordance of information in parallel electronic and paper based patient records. Int J Med Inform. 2001;63(3):123–131.
  • 18. Tsai J, Bond G. A comparison of electronic records to paper records in mental health centers. Int J Qual Health Care. 2008;20(2):136–143.
  • 19. Gunningberg L, Dahm MF, Ehrenberg A. Accuracy in the recording of pressure ulcers and prevention after implementing an electronic health record in hospital care. Qual Saf Health Care. 2008;17(4):281–285.
  • 20. Weis JM, Levy PC. Copy, paste, and cloned notes in electronic health records: prevalence, benefits, risks, and best practice recommendations. Chest. 2014;145(3):632–638.
  • 21. Erickson DR. Paradoxical consequences and electronic notes. J Urol. 2013;189(3):793–795.
  • 22. O'Donnell HC, Kaushal R, Barron Y, Callahan MA, Adelman RD, Siegler EL. Physicians' attitudes towards copy and pasting in electronic note writing. J Gen Intern Med. 2009;24(1):63–68.
  • 23. Weir CR, Hurdle JF, Felgar MA, et al. Direct text entry in electronic progress notes. An evaluation of input errors. Methods Inf Med. 2003;42(1):61–67.
  • 24. Carayon P, Wetterneck TB, Alyousef B, et al. Impact of electronic health record technology on the work and workflow of physicians in the intensive care unit. Int J Med Inform. 2015;84(8):578–594.
  • 25. Saleem JJ, Flanagan ME, Russ AL, et al. You and me and the computer makes three: variations in exam room use of the electronic health record. J Am Med Inform Assoc. 2014;21(e1):e147–e151.
  • 26. O'Leary KJ, Liebovitz DM, Baker DW. How hospitalists spend their time: insights on efficiency and safety. J Hosp Med. 2006;1(2):88–93.
  • 27. Sharma S. A single-blinded, direct observational study of PGY-1 interns and PGY-2 residents in evaluating their history-taking and physical-examination skills. Perm J. 2011;15(4):23–29.
  • 28. Sox CM, Burstin HR, Orav EJ, Conn A, Setnik G, Rucker DW, et al. The effect of supervision of residents on quality of care in five university-affiliated emergency departments. Acad Med. 1998;73(7):776–782.
  • 29. Velmahos GC, Fili C, Vassiliu P, Nicolaou N, Radin R, Wilcox A. Around-the-clock attending radiology coverage is essential to avoid mistakes in the care of trauma patients. Am Surg. 2001;67(12):1175–1177.
  • 30. Kennedy TJ, Lingard L, Baker GR, Kitchen L, Regehr G. Clinical oversight: conceptualizing the relationship between supervision and safety. J Gen Intern Med. 2007;22(8):1080–1085.
  • 31. Carayon P, Wetterneck TB, Alyousef B, et al. Impact of electronic health record technology on the work and workflow of physicians in the intensive care unit. Int J Med Inform. 2015;84(8):578–594.
  • 32. Shea S, Hripcsak G. Accelerating the use of electronic health records in physician practices. N Engl J Med. 2010;362(3):192–195.
  • 33. Hier DB, Rothschild A, LeMaistre A, Keeler J. Differing faculty and housestaff acceptance of an electronic health record. Int J Med Inform. 2005;74(7-8):657–662.
