Author manuscript; available in PMC 2020 May 1.
Published in final edited form as: J Glaucoma. 2019 May;28(5):415–422. doi: 10.1097/IJG.0000000000001192

Integrating Patient Education into the Glaucoma Clinical Encounter: A Lean Analysis

Paula Anne Newman-Casey 1,2, John A Musser 1, Leslie M Niziol 1, Michele M Heisler 3,4,5, Shivani S Kamat 1, Manjool M Shah 1, Nish Patel 1, Amy M Cohn 2,3
PMCID: PMC6499667  NIHMSID: NIHMS1517956  PMID: 30640805

Abstract

PURPOSE:

To use lean analysis to determine how often and when wait times occur during a glaucoma visit in order to identify opportunities for additional patient engagement.

METHODS:

This prospective observational time-motion study measured process and wait times for 77 patient visits from 12 ophthalmologists at an academic glaucoma clinic over a 3-month period. Value stream maps visually diagrammed the process of a clinic visit from the patient’s perspective. Descriptive statistics were calculated for process times, wait times, and the frequency of 10+ minute wait times during each part of the visit. Key stakeholders participated in a root cause analysis to identify reasons for long wait times. The main outcome measures were average process and wait times (hours:minutes:seconds).

RESULTS:

Twenty-nine new visit (NV) patients and 48 return visit (RV) patients were included. Total time in clinic was 187.1 ± 44.5 (mean ± SD) minutes for NV patients and 102.0 ± 44.7 minutes for RV patients. Wait time for NV patients was 63.7 ± 33.4 minutes (33.1% of total appointment time) and for RV patients was 52.6 ± 31.6 minutes (49.4% of total appointment time). All NV patients and 87.5% of RV patients had at least one 10+ minute wait time during their clinic visit, and the majority (75.9% NV, 60.4% RV) had more than one.

CONCLUSIONS:

Currently, sufficient wait time exists during the visit for key portions of glaucoma education such as teaching eye drop instillation.

Keywords: lean analysis, glaucoma, education, operations engineering

Précis:

Lean analysis demonstrated sufficient wait time during a glaucoma patient encounter to integrate eye drop instillation education prior to the attending physician consultation. Adding more education to standard glaucoma care could improve glaucoma self-management.

Introduction

There is a striking difference between physician and patient perceptions of time during a clinic visit. Physicians report lack of time as a barrier to delivering quality education during a patient’s clinical encounter.1 Patients, on the other hand, express dissatisfaction with wait time and report long wait times as a barrier to coming to clinic for care.2, 3 Patients have also stated that they do not always have enough time with their provider during their clinic visit to have all of their questions answered.4 This dichotomy is obvious to both providers and patients and is a major source of physicians’ dissatisfaction with their work and patients’ dissatisfaction with medical care.5 The Institute of Medicine’s (IOM) report on health care quality, Crossing the Quality Chasm, states that providing timely, efficient, and patient-centered care is critical for improving the quality of medical care in the United States.6 Improving the flow of the clinical encounter so that patients’ time is used efficiently is important, particularly for those with chronic conditions that require significant self-management and frequent clinical visits, such as glaucoma.

With the aging population, the prevalence of glaucoma will continue to rise.7 Ophthalmologists, who already provide high-volume care and struggle with recent additional time requirements from the electronic medical record,8–10 will need to provide care to even larger numbers of patients. The amount of time physicians can spend on patient education and personalized counseling will, therefore, be further compressed. Poor or absent glaucoma education and self-management support are contributing factors to medication non-adherence.11–14 Presently, half of all diagnosed glaucoma patients do not take their prescribed medications, leading to needless vision loss from essentially untreated glaucoma. Prior research has shown that many barriers to medication adherence can be successfully addressed with individual counseling and education.15, 16

Team-based care, in which para-professional staff gain the necessary skills to provide high-quality education and counseling, is a tested strategy for improving patients’ chronic disease self-management skills.17, 18 Utilizing patient wait times to incorporate education and counseling into routine office visits could be an effective way to improve patients’ clinical experience and health outcomes, not only in glaucoma but also in many other chronic diseases that require high levels of self-management. Ten-minute wait times during clinic could be used to teach eye drop instillation, or to review one of a set of selected topics a patient might choose to improve their understanding of glaucoma and help them formulate questions for their physician. Oermann and colleagues found that patients at a Veterans Affairs eye clinic who had a nurse show them a video about glaucoma, go over key points, and answer individual questions about glaucoma in the waiting room were significantly more satisfied with their clinic visit than the control group.19, 20

Lean analysis is a method of process improvement that has increased efficiency throughout the manufacturing sector21, 22 and has now been applied in numerous healthcare settings to decrease patient wait times.23–25 It attempts to create more value for customers using fewer resources. Its notable contributions to the field of quality improvement include its emphasis on asking the people who are doing the work how it could be done better, and on measuring processes to establish metrics for how a system could be improved. As an example of its success, a radiology department with over 2,000 staff used lean techniques to measure patient wait time for imaging studies and, through lean analysis and process improvement, was able to decrease wait times by over 30%.26 Lean analysis classifies time spent on processes, such as different aspects of a clinic visit, as either “value-added,” “non-value added,” or “essential.”27 Wait times for patients, for example, are considered “non-value added,” as no clinical services are being provided. Dilation time is a prime example of “essential” time: it is non-value added from the patient’s perspective but is necessary to perform the value-added work.

This study uses the novel lens of lean analysis of glaucoma clinic flow to quantify and characterize patient wait times during which education and counseling could be optimally provided. It also identifies ways to reduce patient wait times. There is a natural tension between the two goals of using wait times for education and reducing wait time overall. If a patient has already spent much of their clinical encounter waiting, they may not be inclined to stay for an additional education session after the visit. If we could minimize all wait times to <5 minutes, patients might be willing to stay for a 30-minute counseling and education session after the visit. Conversely, if we cannot reduce wait times, we can add value to the wait time for the patient by incorporating education into the visit. Specifically, this study examined three questions: 1) How often are there wait times of at least ten minutes, an amount of time that could be amenable to education? 2) When and where are these wait times occurring during the visit process? 3) What are the root causes of long patient wait times?

Methods

Setting and Participants

A purposive, prospective sample of new visit (NV) and return visit (RV) patients seen in the glaucoma clinic at the University of Michigan - Kellogg Eye Center (Ann Arbor, Michigan) was obtained over a 3-month period (June-August 2016). Recruitment of patients was across 12 different providers on all five days of the workweek and at different appointment times throughout the day. Patients who provided verbal consent to participate in this study were followed through their clinic visit by one of two medical students to obtain time data associated with the components of their visit.

Value Stream Maps

A clinic visit is a fairly complex process as it involves the coordination of highly specific interactions with multiple members of the healthcare team that need to occur in a certain order. To establish the components of a patient visit, a certified Lean coach led the study team (key stakeholders including a physician, two medical students, a technician, a scheduler, and the clinic manager) through creating value-stream maps of patients’ clinical encounters in order to understand how a clinic visit flows from the patient’s perspective. Each process step in a clinic visit is not always visible, either to patients or to the group of health care providers that comprise the overall team. The purpose of a value stream map is to create a visual diagram of the process. The study team collectively agreed on the triggers that marked the beginning and end of each of the process steps in a clinical encounter for the value stream map. For example, the check-in process was defined as the time taken once a patient stepped off of the elevator and provided consent until they had handed in all of their paperwork to the clerk and the clerk finished checking them in.

To add time data to the value stream maps, patients were followed through their visits in time-motion studies. Medical students used stopwatches to measure the duration (hour:minute:second) of each process or wait step of a patient visit. We measured process times from the time patients’ names were called so that patient transit time was included, as patients with different walking abilities differ in the time it takes to transition between processes. The visit process potentially included steps for 1) check-in; 2) technician work-up; 3) visual field; 4) photography; 5) technician dilation and/or intraocular pressure check; 6) resident/fellow physician exam; 7) attending physician exam; and 8) check-out. Wait times were any time before a process step when a patient was not in contact with a clinic staff member. Not all patient visits included all of these processes or wait components. Additionally, any resources needed to perform each process – both personnel and equipment – were documented.
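
For readers interested in reproducing this kind of time-motion bookkeeping, the sketch below shows one way stopwatch observations could be reduced to process durations, wait durations, and counts of 10+ minute waits. It is a minimal illustration with hypothetical step names and times, not the instrument used in this study.

```python
from datetime import datetime

# Hypothetical timestamped events for one visit: (step name, start, end).
# A "wait" is any interval between the end of one step and the start of the
# next, mirroring the study's definition of wait time as time not in contact
# with a clinic staff member.
events = [
    ("check-in",           "08:00:00", "08:02:41"),
    ("technician work-up", "08:20:12", "09:00:57"),
    ("visual field",       "09:12:28", "09:38:32"),
    ("attending exam",     "09:48:22", "10:06:08"),
]

def to_dt(hms):
    return datetime.strptime(hms, "%H:%M:%S")

process_seconds = {}
wait_seconds = []
for i, (name, start, end) in enumerate(events):
    process_seconds[name] = (to_dt(end) - to_dt(start)).total_seconds()
    if i > 0:
        prev_end = to_dt(events[i - 1][2])
        wait_seconds.append((to_dt(start) - prev_end).total_seconds())

total_wait = sum(wait_seconds)
total_visit = (to_dt(events[-1][2]) - to_dt(events[0][1])).total_seconds()
long_waits = sum(w >= 600 for w in wait_seconds)  # waits of 10+ minutes

print(f"Total visit time: {total_visit / 60:.1f} min")
print(f"Total wait time:  {total_wait / 60:.1f} min "
      f"({100 * total_wait / total_visit:.1f}% of the visit)")
print(f"Number of 10+ minute waits: {long_waits}")
```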

After mapping out the operations of a patient visit, the study team assigned value to the different components of the visual diagram. Aspects of a process that directly contribute to what patients value, such as seeing the physician, were considered “value added.” “Non-value added” components are parts of the visit process that do not add any benefit for the patient. Any time that a patient was not in contact with a clinic staff member was counted as “non-value added” wait time. We chose to include the “essential” time of dilation time as wait time in this analysis as it feels like wait time to a patient and it could be used for education. We also analyzed whether wait times differed between those RV patients who did and did not receive dilated fundus examinations as all NV patients received dilated fundus examinations. Once value stream maps for NV and RV patients were completed, they were circulated among technicians, clerks, and physicians in clinic to help visualize wait times. Brief interviews were conducted with patients and clinic staff to gather information on perceptions of causes for wait times.

Root Cause Analysis

As part of the lean analysis, the team of key stakeholders conducted a root cause analysis to evaluate factors leading to patient wait times. The key stakeholders on the study team met with all of the clinic physicians, technicians, clerks, and medical assistants to review the value stream maps describing the process and wait time data. During the meetings, the clinic staff hypothesized reasons for patient wait times during different aspects of the clinical encounter. The Lean coach helped the study team aggregate the suggestions from the clinic staff into key themes and then identify root causes of long wait times from those themes. A root cause was defined as a reason for the problem (e.g., wait time) that, if mitigated, would substantially improve the problem.

Statistical Analysis

Descriptive statistics (mean, median, standard deviation [SD], range) were calculated to summarize process times and wait times, including patient interaction time with the attending physician and with all providers, over all patient visits. The percent of patient visits with at least one wait time of 10+ minutes was calculated. Two-sample t-tests were used to compare the percentage of wait time to total visit time between NV and RV patients. SAS statistical software (version 9.4, SAS Institute, Cary, NC) was used for all statistical analyses.
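
As an illustration of the comparison described above, the following sketch computes the descriptive statistics and applies a two-sample t-test to the percent of visit time spent waiting for NV versus RV patients. The arrays are synthetic placeholders rather than study data, and the study used SAS 9.4; this Python/SciPy version is only an assumed equivalent.

```python
import numpy as np
from scipy import stats

# Synthetic percent-wait values (% of total visit time spent waiting);
# stand-ins for the per-patient values measured in the study.
rng = np.random.default_rng(0)
pct_wait_nv = rng.normal(loc=33, scale=10, size=29)  # 29 new visit patients
pct_wait_rv = rng.normal(loc=49, scale=12, size=48)  # 48 return visit patients

for label, x in [("NV", pct_wait_nv), ("RV", pct_wait_rv)]:
    print(f"{label}: mean {x.mean():.1f}%, SD {x.std(ddof=1):.1f}%, "
          f"median {np.median(x):.1f}%, range {x.min():.1f}-{x.max():.1f}%")

# Two-sample t-test comparing percent wait time between NV and RV patients
t_stat, p_value = stats.ttest_ind(pct_wait_nv, pct_wait_rv)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```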

The University of Michigan institutional review board (IRB) ruled that approval was not required for this study because it was deemed quality improvement and thus IRB exempt. The research adhered to the tenets of the Declaration of Helsinki.

Results

A total sample of 78 patients agreed to participate in this study. One patient was excluded from analysis for incomplete data due to leaving their clinic visit early. Of the 77 patients analyzed, 29 were NV patients (37.6%) and 48 were RV patients (62.3%). Twelve different ophthalmologists saw patients in glaucoma clinic, and there were no more than four ophthalmologists providing patient care on the same day.

The value stream maps for NV and RV patients (SDC-Figures 1A and 1B, respectively) consisted of 8 value-added process steps: 1) check-in; 2) technician work-up; 3) visual field; 4) photography; 5) technician dilation and/or intraocular pressure check; 6) resident/fellow physician exam; 7) attending physician exam; 8) check-out. Additionally, non-value added wait times were observed between process steps. Check-in was defined as the time it took from when a patient exited the elevator until they were finished with the check-in clerk (mean 2:41 (minutes:seconds) NV, 1:48 RV). Patients then went to the general reception area and waited (mean wait 17:31 NV, 14:45 RV) until they were taken back to a clinic room. The technician exam was much more involved for NV patients (mean process time 40:45 NV, 17:05 RV). The end of the work-up was indicated when the technician left the exam room and brought the patient to the in-process waiting area (mean wait 11:31 NV, 12:18 RV). The next process step, visual field testing, began when a patient’s name was called from the in-process waiting room and ended when the patient exited the visual field room (mean process time 26:04 NV, 24:08 RV). We noted significant queuing and batching of patients before each process step besides check-in. There was such significant queuing around visual field testing that we placed a “chaos” sign on this step to denote this on the value stream map (SDC-Figure 1A and 1B).

Following visual field testing, patients went to the photography waiting room (mean wait 7:57 NV, 11:56 RV). Photography process time was measured from the time the patient’s name was called until they exited the photography room (mean process time 13:27 NV, 10:59 RV). Patients then waited again in the in-process waiting area (mean wait 10:23 NV, 12:42 RV). When an exam room became available, a medical assistant or technician roomed the patient and administered dilating drops or checked intraocular pressure (IOP) if needed (mean process time <1 minute NV, 8:25 RV). The patient waited in the exam room until the resident or fellow entered the room (mean wait 11:27 NV, 10:48 RV).

Resident/fellow exam time was measured from the time the trainee entered the room until they left the room (mean process time 27:49 NV, 12:03 RV). The patient then waited in the exam room until the attending entered (mean wait time 9:50 NV, 19:54 RV). The attending exam time was measured from when the attending entered the room until they left the room (mean process time 17:46 NV, 10:54 RV). Standing in line at check-out was measured as wait time (mean wait 0:59 NV, 1:40 RV). Check-out process time was measured from when the check-out clerk called the patient to come to the desk until the patient departed (mean process time 3:12 NV, 2:55 RV).

Process and Wait Times for New Visit (NV) Patients

Time Studies

Total time in clinic from check-in to check-out was on average 187.1 ± 44.5 (mean ± SD) minutes for NV patients (Table 1). Wait time for NV patients was on average 63.7 ± 33.4 minutes, or 33% of the total appointment time. NV patients spent 17.8 ± 8.9 minutes interacting with their attending physician. Patients spent 118.2 ± 21.3 minutes interacting with any provider (including ophthalmic technicians, medical assistants providing ancillary testing, residents, fellows, and attending physicians). All NV patients had at least one 5+ minute wait time and at least one 10+ minute wait time during their clinic visit. Among the 29 NV patients, 7 (24%) had one 10+ minute wait time, 10 (35%) had two, 7 (24%) had three, and 5 (17%) had four or more 10+ minute wait times (Figure 2).

Figure 2. Bar chart displaying the percent of patients who experienced between 0 and 5 wait times of 10+ minutes during their clinic visit.

Clinic Flow Observation

Not every patient completes every single potential step of a clinic visit, so we will describe the denominator for each process step to explain our calculations. Of the 27 NV patients who waited in the in-process waiting room after being seen by the ophthalmic technician but before they were taken to ancillary testing for visual field or photography, 14 (52%) waited at least ten minutes. Of the 21 patients who waited in the in-process waiting room after returning from ancillary testing, 7 (33%) again waited at least ten minutes. Of the 28 patients who were roomed by a technician to see the trainee physicians, 11 (39%) waited at least ten minutes in the room to see the resident or fellow. Similarly, 11/28 (39%) waited at least an additional ten minutes in the room to see the attending physician (Figure 3).

Figure 3. Distribution of wait times experienced by patients during a clinic visit, stratified by A) new visit patients and B) return visit patients. (IPW, in-process waiting)

Process and Wait Times for Return Visit (RV) Patients

Time Studies

RV patients spent an average of 102.0 ± 44.7 minutes in clinic from check-in to check-out, of which mean wait time was 52.6 ± 31.6 minutes (49.4% of the total time RV patients spent in clinic). RV patients who had a dilated fundus exam (n=15) waited for 66.4 ± 25.4 minutes compared to 45.1 ± 32.2 minutes for RV patients who did not have dilated fundus examinations (p=0.03). Because the overall visit time was longer for RV patients who received a dilated fundus examination, the percent of time spent waiting did not differ between patients who did and did not receive dilated exams (49.2% vs. 49.1%, p=0.98). Overall, RV patients spent 10.9 ± 7.4 minutes with their attending physician (10.6% of the total visit time). They spent 45.9 ± 24.5 minutes with all of the health care providers on the care team (ophthalmic technicians, medical assistants providing ancillary testing, residents, fellows, and attending physicians), which was 46.7% of their total clinic visit time. All patients had at least one 5+ minute wait time, and 87.5% of RV patients had at least one 10+ minute wait time. Among RV patients, 6 (12.5%) had no 10+ minute wait times, 13 (27.1%) had one, 15 (31.2%) had two, 11 (22.9%) had three, and 3 (6.3%) had four 10+ minute wait times (Figure 2). The percent of total visit time spent waiting was significantly larger on average for RV patients than for NV patients (49.4% vs. 33.1%, p<0.0001).

Clinic Flow Observation

Not all 48 return visit patients completed every potential process step during a clinic visit; therefore, we give the denominator for each process we report. The location where RV patients most often waited at least ten minutes was the exam room, while waiting to be seen by the attending physician (25 of 43 patients, 58%; Figure 3). Additional areas of significant wait time were also identified. Of the 14 patients who waited in the in-process waiting room after being seen by the ophthalmic technician but before undergoing ancillary testing, 6 (43%) waited at least ten minutes. Of the 18 patients who waited in the in-process waiting area after they finished their ancillary testing but before a room was available for them to be seen by the physicians, 7 (39%) waited more than ten minutes. Of the 37 patients who were seen by a resident or fellow, 15 (40.5%) waited at least ten minutes in the exam room before being seen.

Root cause analysis of factors leading to wait times

Key stakeholders determined the root causes of long patient wait times to be problems with the templates used in clinic for scheduling patients, which led to batching of patient flow (Figure 4). Each physician’s schedule was different, and the types of patient visits on the schedule were not coordinated between physicians in this multi-physician practice. The current clinic schedules were front-loaded to try to ensure that clinic would finish on time. Additionally, unscheduled ancillary testing was another significant contributor to batching of patient flow. For example, there may have been four patients all scheduled for a visual field at 1:00 pm even though there were only three medical assistants available to conduct the testing at that time. When an unscheduled fifth visual field was added at 1:00 pm, two patients scheduled for 1:00 pm visual fields would not begin their testing until 1:30 pm, pushing patients with visual fields scheduled at 1:30 pm behind, and so on. Additionally, the three patients who finished their visual fields at the same time were then all ready to see the physician at once. This situation, in which several patients are at the same stage of their visit process at the same time, is called batching, and it leads to very long wait times for the last patient of the day.28
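
The scheduling arithmetic in the example above can be made concrete with a small queue calculation. The sketch below assigns five 1:00 pm visual-field patients (four scheduled plus one unscheduled add-on) to three testing stations and shows how the overflow both delays start times and releases a batch of patients to the physician at once. The 30-minute test length is an illustrative assumption.

```python
import heapq
from collections import Counter

TEST_MINUTES = 30   # assumed length of a visual field test
STATIONS = 3        # three staffed testing stations, as in the example above

# Five patients (four scheduled plus one unscheduled add-on) all ready for
# visual field testing at 1:00 pm, expressed as minutes after noon.
arrivals = [60, 60, 60, 60, 60]

free_at = [0] * STATIONS   # time each station next becomes free
heapq.heapify(free_at)

finish_times = []
for i, ready in enumerate(arrivals, start=1):
    station_free = heapq.heappop(free_at)
    start = max(ready, station_free)
    finish = start + TEST_MINUTES
    heapq.heappush(free_at, finish)
    finish_times.append(finish)
    print(f"Patient {i}: waits {start - ready:2d} min, tested from minute {start} to {finish}")

# Patients who finish at the same minute form a batch that is ready
# for the physician simultaneously.
for t, n in sorted(Counter(finish_times).items()):
    print(f"{n} patient(s) ready for the physician at minute {t}")
```

Running this reproduces the narrative above: two of the five patients cannot start until 1:30 pm, and three patients become ready for the physician at the same moment.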

Figure 4. Root Cause Analysis of Wait Times in Glaucoma Clinic

Batching also leads to waves of work where certain staff are overly busy while others are idle. For example, first thing in the morning and afternoon, the ophthalmic technicians and medical assistants were very busy, but physicians were idle. At the end of the morning or afternoon, physicians were very busy and behind by three or more patients while the technicians and medical assistants were idle.

Discussion

The key findings from this study are that patients spent between 20 minutes and an hour and a half waiting during their clinic visit. Though all new visit patients had at least one 10+ minute wait time, and the great majority (87.5%) of return visit patients had at least one 10+ minute wait time, there was no consistent point during the clinic visit when every single patient had a wait time besides the time for dilation, and not all patients are dilated at each visit. The longest wait times occurred surrounding ancillary testing for NV patients and waiting in the exam room for the attending physician for RV patients. If patient wait times during the clinic visit cannot be reduced or given some added value, it is less likely that people would stay to participate in additional education after their visit with the physician has ended.

Our lean analysis suggested that one excellent opportunity to incorporate education into the time patients spend waiting in clinic is while patients are waiting in the exam room for the attending physician. One important topic for teaching is eye drop instillation, which can be done by a technician or medical assistant and takes less than ten minutes. A recent study found that 12% of patients who thought they had no problem using their eye drops did not get their drop into their eye at all. Of those who thought they had no problem with eye drop instillation, 54% had poor technique, touching the bottle tip to their eye or ocular adnexa.29 Teaching eye drop instillation by asking patients to demonstrate their technique, asking permission to show new techniques, and asking patients to demonstrate their new skill will be crucial to improving glaucoma self-management skills.

The majority of both NV (75.9%) and RV patients (60.4%) had at least two 10+ minute wait times. The most frequent second wait time for NV patients occurred while waiting in the exam room for the attending physician, so the exam room could be used for education. The most frequent second wait time for RV patients occurred waiting for ancillary testing. As RV patients would be waiting in a waiting room at this time, they would require a separate room for education at this point in their visit.

This second wait time, or wait times at future clinic visits, would provide an opportunity for more targeted counseling around patients’ specific glaucoma self-management problems. This type of more intense and personalized counseling might be best provided at the end of the visit, but could be integrated with brief interactions during wait times. Ophthalmic para-professional staff, such as a technician or medical assistant, could be trained to go over a topic the patient selects from a list (such as barriers to optimal self-management, disease information, explanation of ancillary testing, or information about treatments the physician has recommended) during the wait time, and different topics could be addressed during different encounters. Health information technology can be leveraged to provide para-professional staff with tools that make it possible to deliver standardized yet personalized information on these varying topics.30 Staff could be trained in motivational interviewing-based health coaching, a style of counseling that has been successfully used in brief health coaching sessions for many chronic diseases such as hypertension,31–33 diabetes,17, 18 obesity,33 alcoholism,34 and tobacco abuse.34, 35 A motivational interviewing counseling approach would involve asking patients what they already know and what they would like to learn more about before giving information.36 Staff could help patients write down any questions that come up when they are going through the educational material to facilitate a more nuanced discussion with the physician.

Though wait time could be used as an opportunity to teach eye drop instillation, we also sought to identify ways to minimize wait times as long wait times have been identified as a predictor of patient dissatisfaction with their clinical care.2,5 Three-quarters of glaucoma patients (113/151) interviewed in one study stated that long wait times were key barriers to attending follow-up appointments.3 In our lean analysis, we identified several root causes contributing to long wait times. The most significant was batching of patients resulting from non-optimal scheduling.

Operations engineers have described how batching during production is associated with a wavelike pattern of activity in which multiple products, or patients, are ready for the same process at the same time, such as when three patients are ready to see the same attending physician at once. Physicians are then very busy, but may be idle once they complete the first batch while waiting for the next batch of patients to be ready.37 Operations engineers attempt to move processes towards continuous flow production to maximize efficiency. Continuous flow in the clinic would occur if visits were staggered such that only one patient was ready for the physician at a time, but there was always a patient ready, so the physician is never idle. Scheduling patients individually for each process step of a clinic visit (a separate schedule for the technician visit, ancillary testing, and the physician visit) might help bring the schedule closer to continuous flow production. Patients would also know what time to expect to see each provider and why they need to see so many different providers before interacting with their physician. In this way, time for education and counseling could also be scheduled into the clinic visit.
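
To illustrate the contrast between batching and continuous flow, the sketch below compares total patient wait and physician idle time when four patients become ready for the physician at the same moment versus when their ready times are staggered to match the exam length. The 15-minute exam time and four-patient block are assumptions chosen for the illustration, not parameters measured in the study.

```python
EXAM_MINUTES = 15  # assumed attending exam length

def simulate(ready_times):
    """Single physician serving patients in order of readiness."""
    physician_free = 0
    total_wait = total_idle = 0
    for ready in sorted(ready_times):
        start = max(ready, physician_free)
        total_wait += start - ready                    # patient waits in the exam room
        total_idle += max(0, ready - physician_free)   # physician waits for a patient
        physician_free = start + EXAM_MINUTES
    return total_wait, total_idle

batched   = [0, 0, 0, 0]     # four patients ready at once (batching)
staggered = [0, 15, 30, 45]  # ready times matched to exam length (continuous flow)

for label, ready_times in [("Batched", batched), ("Staggered", staggered)]:
    wait, idle = simulate(ready_times)
    print(f"{label:9s}: total patient wait {wait:>3} min, physician idle {idle:>3} min")
```

Under these assumptions the batched block accumulates 90 minutes of patient wait, while the staggered schedule accumulates none and keeps the physician continuously occupied.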

Our study found that changes need to be made to the scheduling templates in order to decrease wait time and to schedule time for education. If operations engineers have sufficient time study data, they can optimize schedules in the healthcare setting to decrease patient wait times, similar to how they optimize schedules for the airline industry. Future work utilizing big data from the electronic health record8 and sensors, such as radio-frequency identification38 for continuous data capture, could automatically gather time study data, as opposed to manual time study data collection, which is burdensome for a clinic.

This study was subject to certain limitations. No patient-level data was collected as this work was approved as a quality improvement project by the IRB. For example, we do not have data on some clinic-based ancillary tests such as refraction, pachymetry, or gonioscopy, so we cannot assess how different testing requirements may affect the wait time to process time ratios. Lean analysis is a quality improvement process that focuses on understanding the particular workflow of a specific organization and involves all of that organization’s staff in the problem-solving practice. Therefore, while the general principles gleaned from this analysis in this single clinic may be applicable to other glaucoma clinics, the particular nuanced details will vary by practice. We did not implement and evaluate the effect of any interventions on clinic flow, nor did we evaluate the impact of particular interventions on patient outcomes. At this time, we do not know what impact including education or decreasing batching might have on patient wait times; this is an important direction for future lean efforts in our clinic. Further research is needed to automate wait and process time data collection to provide big data and allow us to track possible interventions in real time.

In summary, our lean analysis showed that there is sufficient wait time during a glaucoma patient encounter to integrate eye drop instillation education or other targeted education of a patient’s choice into the visit prior to the attending physician consultation. Teaching eye drop instillation to ensure that patients have the skills to take care of their disease is a high priority for education delivered in the clinic. Finding ways to best leverage technology and team-based care will help patients identify barriers to optimal self-management. Using operations engineering techniques to optimize scheduling may help reduce wait times. If patients feel like less of their time is spent waiting, they may be more amenable to participating in glaucoma education and counseling scheduled after their visit with the physician.

Supplementary Material

SDC-Figure 1A. New Visit Patient Value Stream Map.

The processes and wait times in a new visit patient encounter have been visually depicted in this figure. Patient location for each step is represented in the bottom row of the figure. Sample size, maximum, minimum, and mean descriptive statistics are included for every process and wait time in the format (hours:minutes:seconds; h:mm:ss).

Notes:

* Time to instill dilating drops takes less than 1 minute

n=2 patients went to surgery scheduling after the attending exam with a mean process time of 4:30, min of 2:00, and max of 7:00. Surgery scheduling wait time was 6:00 for both visits studied.

SDC-Figure 1B. Return Visit Patient Value Stream Map.

The processes and wait times in a return visit patient encounter have been visually depicted in this figure. Patient location for each step is represented in the bottom row of the figure. Sample size, maximum, minimum, and mean descriptive statistics are included for every process and wait time in the format (hours:minutes:seconds; h:mm:ss).

Notes:

n=6, patients with no ancillary testing had an average IPW wait time of 13:52 [5:46 – 37:58]

n=1, a post-op return visit patient underwent a B-scan ocular ultrasound (wait time: 6:14 process time: 15:00) after the technician work up

n=1, patient went to surgery scheduling after the attending exam with a process time of 5:07 and wait time of 11:10.

(MA = Medical Assistant, Tech = Ophthalmic Technician, IPW = In-Process Waiting Room, VF = Visual Field, R/F = Resident / Fellow, Photo Tech = Photo Technician)

Table 1. Wait times and process times

Acknowledgments and Disclosures:

Funding / Support: This work was supported by the National Eye Institute K23 Mentored Clinician-Scientist Award (1K23EY025320), Bethesda, MD, (PANC); the Research to Prevent Blindness Career Development Award, New York, NY, (PANC); and the University of Michigan mCubed Award, Ann Arbor, MI, (PANC, Cohn, Heisler).

The sponsor or funding organization had no role in the design or conduct of this research.

Financial Disclosures: Dr. Shah reports consulting fees from Allergan and Glaukos, which were outside the submitted work. Dr. Newman-Casey reports consulting fees from Blue Health Intelligence. None of the other authors have any financial disclosures.

References:

1. Alibhai SM, Han RK, Naglie G. Medication education of acutely hospitalized older patients. J Gen Intern Med. 1999;14:610–616.
2. Bar-dayan Y, Leiba A, Weiss Y, Carroll JS, Benedek P. Waiting time is a major predictor of patient satisfaction in a primary military clinic. Mil Med. 2002;167:842–845.
3. Lee BW, Murakami Y, Duncan MT, Kao AA, Huang JY, Lin S, Singh K. Patient-related and system-related barriers to glaucoma follow-up in a county hospital population. Invest Ophthalmol Vis Sci. 2013;54:6542–6548.
4. Morrell D, Evans M, Morris R, Roland M. The “five minute” consultation: effect of time constraint on clinical content and patient satisfaction. Br Med J (Clin Res Ed). 1986;292:870–873.
5. McMullen M, Netland PA. Wait time as a driver of overall patient satisfaction in an ophthalmology clinic. Clin Ophthalmol. 2013;7:1655–1660.
6. Wolfe A. Institute of Medicine report: crossing the quality chasm: a new health care system for the 21st century. Policy, Politics, & Nursing Practice. 2001;2:233–235.
7. Friedman DS, Wolfs RC, O’Colmain BJ, Klein BE, Taylor HR, West S, Leske MC, Mitchell P, Congdon N, Kempen J. Prevalence of open-angle glaucoma among adults in the United States. Archives of Ophthalmology. 2004;122:532–538.
8. Read-Brown S, Hribar MR, Reznick LG, Lombardi LH, Parikh M, Chamberlain WD, Bailey ST, Wallace JB, Yackel TR, Chiang MF. Time requirements for electronic health record use in an academic ophthalmology center. JAMA Ophthalmology. 2017;135:1250–1257.
9. Chiang MF, Read-Brown S, Tu DC, Choi D, Sanders DS, Hwang TS, Bailey S, Karr DJ, Cottle E, Morrison JC. Evaluation of electronic health record implementation in ophthalmology at an academic medical center (an American Ophthalmological Society thesis). Transactions of the American Ophthalmological Society. 2013;111:70.
10. Poissant L, Pereira J, Tamblyn R, Kawasumi Y. The impact of electronic health records on time efficiency of physicians and nurses: a systematic review. J Am Med Inform Assoc. 2005;12:505–516.
11. Sleath B, Blalock SJ, Carpenter DM, Sayner R, Muir KW, Slota C, Lawrence SD, Giangiacomo AL, Hartnett ME, Tudor G. Ophthalmologist–patient communication, self-efficacy, and glaucoma medication adherence. Ophthalmology. 2015;122:748–754.
12. Lacey J, Cate H, Broadway D. Barriers to adherence with glaucoma medications: a qualitative research study. Eye. 2009;23:924–932.
13. Stryker JE, Beck AD, Primo SA, Echt KV, Bundy L, Pretorius GC, Glanz K. An exploratory study of factors influencing glaucoma treatment adherence. Journal of Glaucoma. 2010;19:66.
14. Taylor SA, Galbraith SM, Mills RP. Causes of non-compliance with drug regimens in glaucoma patients: a qualitative study. Journal of Ocular Pharmacology and Therapeutics. 2002;18:401–409.
15. Okeke CO, Quigley HA, Jampel HD, Ying G-s, Plyler RJ, Jiang Y, Friedman DS. Interventions improve poor adherence with once daily glaucoma medications in electronically monitored patients. Ophthalmology. 2009;116:2286–2293.
16. Okeke CO, Quigley HA, Jampel HD, Ying G-s, Plyler RJ, Jiang Y, Friedman DS. Adherence with topical glaucoma medication monitored electronically: the Travatan Dosing Aid study. Ophthalmology. 2009;116:191–199.
17. Greaves CJ, Middlebrooke A, O’Loughlin L, Holland S, Piper J, Steele A, Gale T, Hammerton F, Daly M. Motivational interviewing for modifying diabetes risk: a randomised controlled trial. Br J Gen Pract. 2008;58:535–540.
18. Chlebowy DO, El-Mallakh P, Myers J, Kubiak N, Cloud R, Wall MP. Motivational interviewing to improve diabetes outcomes in African Americans adults with diabetes. Western Journal of Nursing Research. 2015;37:566–580.
19. Oermann MH. Effects of educational intervention in waiting room on patient satisfaction. J Ambul Care Manage. 2003;26:150–158.
20. Oermann MH, Needham CA, Dobal MT, Sinishtaj L, Lange MP. Filling the waiting time in the clinic with education about glaucoma. Insight. 2001;26:77–80.
21. Engelund EH, Breum G, Friis A. Optimisation of large-scale food production using Lean Manufacturing principles. Journal of Foodservice. 2009;20:4–14.
22. Bhamu J, Sangwan KS. Lean manufacturing: literature review and research issues. International Journal of Operations & Production Management. 2014;34:876–940.
23. Dickson EW, Singh S, Cheung DS, Wyatt CC, Nugent AS. Application of lean manufacturing techniques in the Emergency Department. J Emerg Med. 2009;37:177–182.
24. Cima RR, Brown MJ, Hebl JR, Moore R, Rogers JC, Kollengode A, Amstutz GJ, Weisbrod CA, Narr BJ, Deschamps C, Surgical Process Improvement Team MCR. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center. J Am Coll Surg. 2011;213:83–92; discussion 93–94.
25. Koning H, Verver JP, Heuvel J, Bisgaard S, Does RJ. Lean six sigma in healthcare. Journal for Healthcare Quality. 2006;28:4–11.
26. Lodge A, Bamford D. New development: using lean techniques to reduce radiology waiting times. Public Money & Management. 2008;28:49–52.
27. Poksinska B. The current state of lean implementation in health care: literature review. Quality Management in Healthcare. 2010;19:319–329.
28. Womack JP, Jones DT. Lean Thinking: Banish Waste and Create Wealth in Your Corporation. Free Press; 2010.
29. Tatham A, Sarodia U, Gatrad F, Awan A. Eye drop instillation technique in patients with glaucoma. Eye. 2013;27:1293–1298.
30. Killeen OJ, MacKenzie C, Heisler M, Resnicow K, Lee PP, Newman-Casey PA. User-centered design of the eyeGuide: a tailored glaucoma behavior change program. Journal of Glaucoma. 2016;25:815–821.
31. Ma C, Zhou Y, Zhou W, Huang C. Evaluation of the effect of motivational interviewing counselling on hypertension care. Patient Education and Counseling. 2014;95:231–237.
32. Ogedegbe G, Chaplin W, Schoenthaler A, Statman D, Berger D, Richardson T, Phillips E, Spencer J, Allegrante JP. A practice-based trial of motivational interviewing and adherence in hypertensive African Americans. Am J Hypertens. 2008;21:1137–1143.
33. Rubak S, Sandbæk A, Lauritzen T, Christensen B. Motivational interviewing: a systematic review and meta-analysis. Br J Gen Pract. 2005;55:305–312.
34. Lundahl B, Moleni T, Burke BL, Butters R, Tollefson D, Butler C, Rollnick S. Motivational interviewing in medical care settings: a systematic review and meta-analysis of randomized controlled trials. Patient Education and Counseling. 2013;93:157–168.
35. Stotts AL, DiClemente CC, Dolan-Mullen P. One-to-one: a motivational intervention for resistant pregnant smokers. Addictive Behaviors. 2002;27:275–292.
36. Miller WR, Rollnick S. Motivational Interviewing: Helping People Change. Guilford Press; 2012.
37. Penneys NS. A comparison of hourly block appointments with sequential patient scheduling in a dermatology practice. Journal of the American Academy of Dermatology. 2000;43:809–813.
38. Yao W, Chu CH, Li Z. The use of RFID in healthcare: benefits and barriers. In: 2010 IEEE International Conference on RFID-Technology and Applications; 2010. pp. 128–134.
