JAMIA Open. 2018 Aug 14;1(2):142–146. doi: 10.1093/jamiaopen/ooy031

Impact of a tailored training on advanced electronic medical records use for providers in a Veterans Health Administration Medical System

Christopher A Lopez 1, Reese K Omizo 1, Julia M Whealin 1,2
PMCID: PMC6952022  PMID: 31984328

Abstract

This quality improvement project evaluated the impact of a tailored, evidence-based training strategy on advanced electronic medical record (EMR) use among Veterans Administration (VA) clinicians experienced in using the EMR. After the curriculum was developed, 20 clinicians completed an online needs assessment tool that evaluated their competency gaps. Responses were used to prioritize clinicians’ training needs. Clinician informaticists then provided 2–4 h of tailored training to groups of 1–5 clinicians. Compared with baseline scores (M = 3.59), scores on EMR Task Comfort showed a large improvement in the week following training (M = 4.60; t = 5.41; P < .001; r = 0.58), regardless of baseline level of computer anxiety. Assessment and tailored training methods can help maximize the benefits of resources allocated to EMR training. This formative evaluation suggests that tailored, hands-on training led by clinician informaticists effectively improved clinicians’ EMR comfort and confidence in only 2–4 h.

Keywords: electronic health records, electronic medical records, tailored training, personalized training, clinicians, providers

INTRODUCTION

Electronic medical records (EMRs), with their systematized collection of digital patient and population health information, can improve patient care in many ways.1–4 The Veterans Health Administration’s (VA’s) EMR is considered one of the best in the world.5–8 However, most clinical users leave its power untapped because time dedicated to learning the system and its packages has not been a priority. Clinical duties take precedence over training, and new clinicians typically receive only basic EMR training prior to seeing patients.

When EMR training is inadequate, clinicians frequently make avoidable errors.2,9 Encounter forms and orders for labs, imaging, and consults may be incomplete, incorrect, or missing. Historical patient information may not be reviewed, and tests/consults may be unnecessarily repeated. Clinicians may spend valuable time going back to review, redo, or finish EMR tasks. These inefficiencies can leave clinicians with insufficient “face time” for patients. At worst, clinicians have entered data that are incomplete or incorrect, placing patients’ wellbeing at risk.10,11 Ultimately, more episodes of care are required, substantially decreasing patients’ access to care.

Weighed against the risks of undertrained clinicians is the high cost of the resources (including clinician time) needed for EMR training.12 Regardless of the expense, the U.S. Office of the National Coordinator for Health Information Technology maintains that investment in EMR training is crucial to maximize EMR’s potential.13 To minimize costs, training that judiciously allocates limited resources is needed.

A sensible solution that makes the most of limited resources is to develop a training curriculum that can be tailored, or personalized, to the unique needs of users. Clinicians in different settings (ie, primary care, specialty care) have markedly different EMR training needs.14 Personalized training interventions have content and/or modalities adapted to meet the unique needs of recipients. Research shows that tailored educational content is more likely to be understood, recalled, and rated as credible than content that is not tailored.10,15,16 However, information on the impact of tailored EMR training programs is lacking.17,18

Objective

Currently, there is a pressing need to evaluate best practice EMR training with a range of providers (ie, General Internists, Rheumatologists, etc.) in varied clinical settings.19 No work of which we are aware has evaluated advanced EMR training for clinicians in a VA setting. Additionally, scholars in the field have called for data evaluating EMR training programs that are tailored to meet the needs of diverse clinicians.9 The objective of this project was to evaluate an innovative tailored training program for experienced VA clinicians designed to increase comfort and confidence in using the VA’s EMR.

METHODS

Setting

The EMR used by the VA is called the Veterans Health Information Systems and Technology Architecture (VISTA). Its graphical interface is the Computerized Patient Record System (known as CPRS). CPRS’ integrated inpatient and outpatient EMR, along with its associated decision tools, has received top scores from providers nationwide.8 Customized menus and templates facilitate CPRS’ order entry and documentation. Health summaries and personalization of views expedite data review, and clinical reminders and alerts are intended to ensure safe care is provided. However, CPRS’ tools are not intuitive and require higher-level training that varies by specialty and site.

Usual training

When new clinicians begin a position at the VA, they attend a group overview on basic CPRS, after which they must demonstrate a minimum competency to obtain access to the EMR. Within the next 2 weeks, a CPRS trainer provides additional, basic training specific to the user’s role (such as setting favorite note titles, clinic schedules, and managing consults). Subject matter experts from the service are then expected to provide more specialized training that is specific to the user’s field; however, there is no standard for competency.

Training development

Generalized steps to develop a tailored training program are summarized in Supplementary Material S1. Two Physician Informaticists (C.L. and R.O.), each considered an EMR “super user” (ie, having extensive EMR experience and the skill set to mentor others on using CPRS), developed the curriculum. Named the Personalized Informatics Educational Clinic for Efficient Solutions (PIECES), this training covers best practice use of CPRS related to 24 specific tasks/competencies (see Supplementary Material S2). Prior to training, clinician trainees complete a needs assessment tool developed by the project evaluator (J.W.); responses are then compiled and summarized to identify “gaps” between the current comfort/knowledge base and the desired end-result. Each clinician or group is then given a personalized curriculum based on identified needs and learning style.
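To make the gap-identification step concrete, the sketch below shows one way needs-assessment responses could be compiled and ranked; it is a minimal illustration with hypothetical task names, ratings, and target comfort level, not the actual PIECES assessment or its data.

```python
# Hypothetical illustration of compiling needs-assessment ratings into a
# prioritized training agenda; task names and ratings are invented examples.
from statistics import mean

# Comfort ratings (1 = strongly disagree ... 5 = strongly agree) for each task,
# one rating per clinician in the training group.
responses = {
    "Create a shortcut for ordering medications": [2, 3, 2],
    "Personalize chart views": [4, 5, 4],
    "Build reusable note templates": [1, 2, 3],
}

TARGET_COMFORT = 5  # assumed desired end-result for every task

# Rank tasks by the gap between the target and the group's mean comfort,
# so the largest gaps are addressed first in the tailored session.
gaps = {task: TARGET_COMFORT - mean(ratings) for task, ratings in responses.items()}
for task, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{task}: gap = {gap:.2f}")
```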

The overall goals of the training program are to increase clinicians’ comfort and confidence in using CPRS. Training is approached from a social learning theoretical perspective,20 which emphasizes that individuals will change behavior if they value the outcomes associated with the change.21 The training process also draws from constructivism, which builds new knowledge in the context of trainees’ larger body of existing information. During training, clinicians are encouraged to demonstrate difficult clinical scenarios, which present opportunities for teaching additional shortcuts and processes. The trainees are then taught to integrate these EMR solutions into their workflow.

The delivery methodology of the training program integrates current best practices, including: (1) assessing and matching user needs2,19,22; (2) employing active learning via observation and hands-on application of practices14,19,22; (3) utilizing fellow clinicians as instructors9,22,23; and (4) building on past training to optimize use.22

Project protocol

Clinicians were referred by their service or self-referred to training. Sessions were held for groups of 1–5 clinician trainees (depending on specialty group and locale) and lasted 2–4 h. Each session was led by one physician trainer (either C.L. or R.O.), who traveled to the clinic sites, which at times required air travel. Four Category 1 Continuing Medical Education credits were awarded to those who completed the course. The protocol was approved by the VA’s Evidence-Based Practice Council. Clinician performance was kept confidential.

Measures

A 30-item online training needs assessment tool was completed prior to the training. It consisted mainly of the EMR Task Comfort Scale, which assesses comfort with completing 24 EMR tasks. Each task follows the sentence stem, “I am comfortable in my ability to…,” for example, “…create a shortcut for ordering medications” (see Supplementary Material S2). Response options ranged from 1 (“strongly disagree”) to 5 (“strongly agree”). The scale score was derived by calculating the mean of the 24 individual items and thus ranged from 1 to 5. Other items measured relevant sociocultural factors (such as computer anxiety14) and the ability to use decision support tools (as a proxy assessment of perceived knowledge/confidence).
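As a simple illustration of the scoring rule described above, the following sketch computes the scale score as the mean of 24 item ratings; the ratings shown are hypothetical, not trainee data.

```python
# Hypothetical EMR Task Comfort Scale scoring: mean of 24 item ratings (1-5).
item_ratings = [4, 3, 5, 2, 4, 3, 4, 5, 3, 2, 4, 4,
                3, 5, 4, 3, 2, 4, 5, 3, 4, 4, 3, 5]  # invented ratings

assert len(item_ratings) == 24  # one rating per EMR task
scale_score = sum(item_ratings) / len(item_ratings)  # ranges from 1 to 5
print(f"EMR Task Comfort Scale score: {scale_score:.2f}")
```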

In the week following training, clinicians completed a post-training assessment tool, evaluating the same 24 EMR tasks assessed at baseline. Additionally, we assessed satisfaction with training, change in perceived knowledge/confidence, and key effective training components.

Statistical analysis

Pre-post training data were analyzed using PSPPIRE (https://www.gnu.org/software/pspp/; GNU PSPP 0.10.4-g50f7b7, 2017). Summary statistics were computed, and χ2 analyses and Pearson r correlations were used to compare clinicians on basic factors and to examine associations between variables. Group EMR Task Comfort Scale scores were compared over time using a paired samples t-test.
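For readers who prefer a scriptable equivalent of the PSPPIRE analysis, the sketch below runs a paired samples t-test and a Pearson correlation with SciPy; the pre- and post-training vectors are hypothetical placeholders rather than the project’s data.

```python
# Illustrative pre/post analysis with hypothetical scale scores (not study data).
import numpy as np
from scipy import stats

pre = np.array([3.2, 3.8, 2.9, 4.1, 3.5, 3.7, 3.0, 4.0])   # baseline scores
post = np.array([4.5, 4.8, 4.1, 4.9, 4.4, 4.7, 4.2, 4.8])  # post-training scores

# Paired samples t-test: the same clinicians measured before and after training.
t_stat, p_value = stats.ttest_rel(post, pre)

# Pearson correlation between the paired scores.
r, r_p = stats.pearsonr(pre, post)

print(f"t = {t_stat:.2f}, P = {p_value:.4f}, r = {r:.2f}")
```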

RESULTS

Twenty prescribing medical providers completed the measures. The majority (90.0%) had been employed by the VA for over 1 year (see Table 1). At baseline, there was no relationship between longer length of time providing care in the VA and responses to “I understand and use the clinical decision support tools embedded in CPRS” (χ2 = 9.76, P = .370). However, users who had been providing care in the VA for over 10 years were least likely to report (1) that they “have the information [they] need to effectively perform [their] job tasks that rely on CPRS” (χ2 = 12.18, P = .058) or (2) that they had received CPRS training in the past 10 years (χ2 = 19.00, P = .025).

Table 1.

Clinician characteristics (n = 20)

Item Frequency (%)
Time providing VA patient care
 <1 year 2 (10)
 Between 1 and 10 years 10 (50)
 Over 10 years 8 (40)
Last training on VISTA/CPRS
 <1 year 6 (30)
 Between 1 and 10 years 11 (55)
 I have never had training 3 (15)

VA: Veterans Administration; VISTA: Veterans Health Information System and Technology Architecture; CPRS: Computerized Patient Record System.

Impact of training

Compared with baseline scores (Mean = 3.59, standard deviation [SD] = 0.81), scores on the EMR Task Comfort Scale showed a large improvement following training (Mean = 4.60, SD = 0.57; t = 5.41; P < .001; r = 0.58). On perceived knowledge/confidence, the response with the greatest frequency (ie, the modal score) increased from 2 (“disagree”) to 5 (“strongly agree”) following training.

To test whether these results were influenced by other variables, we calculated Pearson r correlations between Time 2 EMR Task Comfort and (1) length of time providing patient care in the VA, (2) length of time since the last CPRS training, (3) computer anxiety level, and (4) Time 1 EMR Task Comfort scores. None of the correlations were significant. However, computer anxiety level was associated with the Time 1 EMR Task Comfort score.

All clinicians agreed that the training was useful and could be applied to their clinical practice. Responses to an open-ended item requesting “any feedback about the training” identified five key themes (see Table 2).

Table 2.

Key training themes based upon trainees’ responses to an open-ended question: “please list any feedback about the training here”

Theme Quote
On-site, face-to-face training
  • Really excellent training that the CBOCs (VA Primary Care Clinics) need more of. We need in person training, V-tel is not reliable or relatable

  • Excellent training, Dr “X”! Thanks for taking the time to come out to (clinic “Y”) and understand our limitations.

Active learning and hands-on application
  • Thank you, wish there was more opportunity to have sessions/classes like this. Hands on sessions are really helpful for me to learn these things!

  • It would be beneficial to have regular hands on training in CPRS

Utilizing fellow clinicians as instructors
  • Super helpful to have clinician showing good shortcuts and tricks. thanks for coming!

Informative personalized training
  • “X” was very personable. I wish, we were allowed more time with him, as I learned a lot in the short time we met.

  • Dr “X” was very knowledgeable and helpful. It was great being able to sit with him for a few hours and figure out how to do things on CPRS.

Suggestions/future directions
  • The training was excellent, we need more training for providers who weren't here as well as refreshers for staff

  • This course was fantastic. It is the first time I have received formal training beyond the basics. And I think it just touched the iceberg of what CPRS is capable of in terms of gaining greater efficiency, making progress in quality of care, and ultimately patient access. Please make this an ongoing, on site, face to face course. Every 3 mo in the beginning would be optimal in order to ensure continued building of skills and assessment of process efficiency.

CBOC: Community Based Outpatient Clinic; CPRS: Computerized Patient Record System; VA: Veterans Administration.

DISCUSSION

The PIECES training was developed to efficiently train clinicians to better utilize tools available in CPRS using minimal resources. Our findings suggest that, regardless of clinicians’ length of time delivering care in the VA, their comfort and confidence using the EMR substantially improved following training. Furthermore, clinicians’ comfort using CPRS improved regardless of their level of computer anxiety.

All clinicians agreed that training content was useful and could be applied to their clinical practice. Examination of written comments from the trainees identified four main components that contributed to the training’s success:

  1. Instruction was on-site and face-to-face.

  2. Training involved hands-on application of practices.

  3. Trainers were practicing providers and thus familiar with work flow demands.

  4. Training topics were tailored to trainees’ needs.

These identified training themes are consistent with those identified in research on best practice training conducted with other clinical populations.2,9,14,19–23

Of note, clinicians who had been providing patient care the longest were (1) the least likely to report any EMR training in the past 10 years, and (2) the least likely to report they could effectively perform job tasks that rely on the EMR. Results highlight the need for refresher training for experienced clinicians, who likely have more advanced, specialized needs compared with beginning employees.

The generalized steps an organization can take to develop an advanced EMR training program were delineated in this work. In this program, we sought to train clinicians individually or within small groups with similar functions (such as clinicians from the same specialty area). Ideally, trainees would also be grouped according to the identified training gaps; however, this is often not realistic due to cost and logistical limitations. At minimum, we recommend that trainees be grouped by their experience level (ie, beginner vs advanced) and, when possible, by similar profession/functional role.

Although the PIECES training was designed to minimize use of training resources via its tailored approach, time and financial capital remain challenges to scaling this strategy. Offering continuing education credits, however, proved useful for offsetting clinicians’ time demands. Regardless of cost, we maintain that investment in EMR training is critical to maximize competence and, ultimately, the quality of patient care. For sustainment at this facility, the PIECES training has become standard practice. Clinicians receive a referral once they begin clinical work, and experienced clinicians needing refresher training are referred by their service chiefs.

The program we describe was a quality improvement project and so, by definition, its results are not generalizable. Rather, this project was performed in the context of one EMR with a limited number of trainers/trainees. A variety of usability metrics exist beyond subjective comfort and user satisfaction, such as completion rate (or error rate) and time on task.24 A study that employs objective measures of users’ interactions with the EMR before and after the training experience will be necessary to demonstrate the impact of such a program on clinician behavior/efficiency, cost savings, and collective gains in productivity.

CONCLUSION

Investment in EMR training is crucial to maximize EMR potential.13 To minimize costs, a proven training method that judiciously accommodates limited resources (ie, limited trainers, limited dedicated training time) is needed.12 The findings from this quality improvement project provide formative evidence that tailored, hands-on training effectively improved clinicians’ comfort and confidence with CPRS in only 2–4 h.

CONTRIBUTORS

C.L. made substantial contributions to the conception/design of this project, interpretation of the data, revision, and editing of the drafts for important intellectual content, provided final approval of the published version and agrees to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. R.O. made substantial contributions to the conception/design of this project, interpretation of the data, revision, and editing of the drafts for important intellectual content, provided final approval of the published version and agrees to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. J.W. made substantial contributions to the conception and design of this work and the acquisition, analysis, and interpretation of data; drafted and edited the work for important intellectual content; provided final approval of the version to be published; and agrees to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Conflict of interest statement. None declared.

SUPPLEMENTARY MATERIAL

Supplementary material is available at JAMIA Open online.


ACKNOWLEDGEMENTS

The authors would like to thank the VAPIHCS providers for participating in this evaluation, Dr Judy Carlson for her feedback on this project, and our anonymous reviewers. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States Government.

REFERENCES

1. Gunter TD, Terry NP. The emergence of national electronic health record architectures in the United States and Australia: models, costs, and questions. J Med Internet Res 2005; 7 (1): e3.
2. Chaudhry B, Wang J, Wu S, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med 2006; 144 (10): 742–52.
3. Jamoom E, Patel V, King J, et al. National perceptions of EHR adoption: barriers, impacts, and federal policies. National Conference on Health Statistics; 2012. https://www.cdc.gov/nchs/ppt/nchs2012/ss-03_jamoom.pdf. Accessed March 15, 2018.
4. Jamoom E, Heisey-Grove D, Yang N, et al. Physician opinions about EHR use by EHR experience and by whether the practice had optimized its EHR use. J Health Med Inform 2016; 7 (4): 4.
5. Edsall RL, Adler KG. The 2012 EHR user satisfaction survey: responses from 3,088 family physicians. Fam Pract Manag 2012; 19 (6): 23–30.
6. EHSI (Electronic Health Solutions International). “Innovations in American Government” award. 2006. http://ehs-int.com/why-vista. Accessed September 14, 2017.
7. Evans DC, Nichol WP, Perlin JB. Effect of the implementation of an enterprise-wide electronic health record on productivity in the Veterans Health Administration. Health Econ Policy Law 2006; 1 (2): 163–9.
8. Peckham C, Kane L, Rosensteel S. Medscape EHR report 2016: physicians rate top EHRs. August 25, 2016. https://www.medscape.com/features/slideshow/public/ehr2016. Accessed September 14, 2017.
9. Jalota L, Aryal MR, Mahmood M, Wasser T, Donato A. Interventions to increase physician efficiency and comfort with an electronic health record system. Methods Inf Med 2015; 54 (1): 103–9.
10. Sittig DF, Singh H. Defining health information technology–related errors: new developments since To Err Is Human. Arch Intern Med 2011; 171 (14): 1281–4.
11. Venta K, Baker E, Fidopiastis C, Stanney K. The value of EHR-based assessment of physician competency: an investigative effort with internal medicine physicians. Int J Med Inform 2017; 108: 169–74.
12. Porcheret M, Hughes R, Evans D, Jordan K, Whitehurst T, Ogden H. Data quality of general practice electronic health records: the impact of a program of assessments, feedback, and training. J Am Med Inform Assoc 2004; 11 (1): 78–86.
13. U.S. Department of Health and Human Services (DHHS) Office of the National Coordinator for Health Information Technology. https://www.healthit.gov/. Accessed December 18, 2017.
14. Ventres W, Kooienga S, Vuckovic N, Marlin R, Nygren P, Stewart V. Physicians, patients, and the electronic health record: an ethnographic analysis. Ann Fam Med 2006; 4 (2): 124–31.
15. Albarracín D, Gillette JC, Earl AN, Glasman LR, Durantini MR, Ho M-H. A test of major assumptions about behavior change: a comprehensive look at the effects of passive and active HIV-prevention interventions since the beginning of the epidemic. Psychol Bull 2005; 131 (6): 856–97.
16. Noar SM, Benac CN, Harris MS. Does tailoring matter? Meta-analytic review of tailored print health behavior change interventions. Psychol Bull 2007; 133 (4): 673–93.
17. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med 2006; 81 (3): 207–12.
18. Bredfeldt CE, Awad EB, Joseph K, Snyder MH. Training providers: beyond the basics of electronic health records. BMC Health Serv Res 2013; 13 (1): 503.
19. McAlearney AS, Robbins J, Kowalczyk N, Chisolm DJ, Song PH. The role of cognitive and learning theories in supporting successful EHR system implementation training: a qualitative study. Med Care Res Rev 2012; 69 (3): 294–315.
20. Knowles MS III, Swanson RA. The Adult Learner. Oxford, England: Taylor & Francis; 2011.
21. Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice-Hall; 1986.
22. McAlearney AS, Song PH, Robbins J, et al. Moving from good to great in ambulatory electronic health record implementation. J Healthc Qual 2010; 32 (5): 41–50.
23. Cheney PH, Mann RI, Amoroso DL. Organizational factors affecting the success of end-user computing. J Manag Inf Syst 1986; 3 (1): 65–80.
24. Middleton B, Bloomrosen M, Dente MA, et al. Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. J Am Med Inform Assoc 2013; 20 (e1): e2–8.
