JAMIA Open. 2020 Jan 7;3(1):53–61. doi: 10.1093/jamiaopen/ooz051

Table 1. The challenges physician champions faced

Challenges | Representative quotations
Inappropriate training prior to go-live
 Inappropriate training timeline before go-live
  • The training we received was disjointed—we were asked to create SmartSets when NO ONE (unless they had previously used EPIC) had any clue on how it would work. (surgery)

  • The initial learning curve is high and tips and tricks should come later. (pediatrics)

 Lack of personalized training before go-live
  • More time spent on examples of our own workflows would have been more useful. (dermatology)

  • The pre-rollout training was much too general, and almost no thought appeared to have been given to customizing the training for specific clinics. (internal medicine)

  • There seemed to be little effort to understand the needs and workflows of our clinic and group… (surgery)

 Lack of a live or simulated environment for practicing before go-live
  • It would have been better to have brief classes, then mock patient visits with over-the-shoulder experts helping, then regrouping for role-specific classes. (pediatrics)

  • We need an ongoing robust practice environment populated with a lot of data, to see how things look. (internal medicine)

 Trainers did not have sufficient understanding of the system and workflow before go-live
  • The instructors had very little knowledge of what we do and how things work for us in clinics. (internal medicine)

  • …when we asked the trainers about issues perhaps unique to our own needs, seldom was the answer readily available. (radiology)

  • …the trainer said “I don't really understand inbasket so we'll only cover that briefly,” then proceeded to devote 5 min to the entire section. Inbasket training was terribly inadequate. (internal medicine)

 Not enough training for users after go-live
  • I strongly urge you to offer the recent Epic personnel led optimization sessions (4 h total) to all users. To offer it only to the Michart champions is a huge mistake, only 5–10% or so would take you up on the offer and it would be a true optimization step. (internal medicine)

  • I feel that everyone should have the opportunity to attend a similar class a couple of months after implementation… everyone should have a chance to learn to make this complex system (which is poorly designed and confusing) work better. (internal medicine)

Insufficient at-the-elbow support after go-live
 Poor at-the-elbow support
  • The at-the-elbow people were variable in their utility…some didn't have a handle on UM's system or what was allowed/encouraged in terms of workflow… (surgery)

  • I really think on-site help a month or two after go-live would help solidify what people are doing. (psychiatry)

Communication challenges with builders and the vendor company
 One-way and belated responses
  • My major disappointment has been the inefficiency of MiChart support in answering specific questions correctly in a timely fashion and in communication about questions in general. (surgery)

  • Communication between the MiChart team and the Physician Champions was poor. It's all one way - from the Michart team to the masses. Tickets often go into black holes. This led to delays in addressing problems ranging from serious to minor. (internal medicine)

System design flaws after go-live
 Workflow problems
  • This system is designed for a solo practitioner and little thought was placed toward teaching… (oncology)

  • We were absolutely unprepared for the mess that the inbox creates. Simple tasks like replying to a phone note in the inbox are unnecessarily complex. (internal medicine)

 Issues of functionality
  • Many elements are still missing and data review is very poor in EPIC. (pediatrics)

  • In general, there has been a feeling in my department ([clinical area redacted]) that the system does not easily support many of our basic functions… (clinical area withheld)