Abstract
The Center for Clinical Skills (CCS) at the University of Hawai‘i's John A. Burns School of Medicine (JABSOM) trains medical students in a variety of medical practice education experiences aimed at improving the patient care skills of history taking, physical examination, communication, and counseling. Increasing class sizes accentuate the need for efficient scheduling of faculty and students for clinical skills examinations. This research reports an application of discrete event simulation methodology, using the commercial simulation software package Arena® by Rockwell Automation Inc, to model the flow of students through an objective structured clinical exam (OSCE) built around the basic physical examination sequence (BPES). The goal was to identify the most efficient scheduling of limited volunteer faculty resources so that all student teams complete the OSCE within the allocated 4 hours. The simulation models 11 two-person student teams using 10 examination rooms, where physical examination skills are demonstrated on fellow student subjects and assessed by volunteer faculty. Multiple faculty availability models with constrained time parameters and other resources were evaluated. The results of the discrete event simulation suggest no statistical difference between the baseline model and the alternative models with respect to faculty utilization, but statistically significant differences in student wait times. Two models significantly reduced student wait times without compromising faculty utilization.
Introduction
The Center for Clinical Skills (CCS), at the University of Hawai‘i's John A. Burns School of Medicine (JABSOM), supports the school's medical education program by supplementing the traditional education modes with simulated clinical encounters.
When simulated clinical encounters were first introduced into traditional medical education, they focused on the advanced training given to residents during Graduate Medical Education. As simulation moved into Undergraduate Medical Education, the primary focus shifted to using Standardized Patients (SPs), or actors, to simulate patients with a variety of ailments.1 In the 1990s the National Board of Medical Examiners started an initiative to test the clinical skills of medical students as part of their licensure exam. In turn, the SPs' ability to present different medical scenarios and give feedback became critical to preparing students for the board exam.2,3
Using SPs to train students has become a vital part of the medical school's curriculum.4 One study showed that the use of SPs along with traditional assessment improves students' long-term retention and application of knowledge.5
The SPs are trained to assess and provide feedback regarding a student's clinical skills. These innovative education experiences are aimed at teaching and evaluating the patient care skills of history taking, physical examination, interpersonal communication, and counseling.6
Study Background
The CCS trains medical school students in a variety of medical practice scenarios and performs assessments through objective structured clinical exams (OSCE). OSCEs are graded examinations of medical students in which SPs are used to simulate real patient encounters.7 Student performance in these encounters is graded by trainers/facilitators and SPs. The JABSOM Clinical Skills Center conducts OSCEs with a fixed number of skills stations or examination rooms, instructors, staff, and SPs. One key assessment grades the medical students' ability to conduct a comprehensive basic physical examination.
The Basic Physical Examination Sequence (BPES) is a comprehensive head-to-toe examination to be performed on a patient during their visit with a physician.8 The BPES Exercise requires students to demonstrate competence in performing the steps of the basic physical examination in the correct order, using the correct technique. All JABSOM students are required to successfully complete this exercise.
Decreased funding for faculty with no attrition in enrollment (the first year class size of 66 has remained essentially unchanged) has generated increased staff workload and increased difficulty in securing and scheduling volunteer faculty to facilitate and perform the student assessments. The purpose of this study is to examine the BPES exercise to optimize utilization and minimize wait times. The study uses Arena® Simulation Software, Version 14, by Rockwell Automation, Inc., Wexford, PA, to create a flowchart-style modeling methodology of the BPES exercise. This software helps to measure the performance of the system being modeled.9 The exercise was modeled to analyze current and alternative scheduling designs to identify the most efficient utilization and scheduling of faculty members, while decreasing student wait times.
Description of the BPES Exercise
First year medical student class size varies, so logistics and planning projections approximated 60-66 student participants in the exercise. OSCE sessions are scheduled for three groups of 20–22 students each and held over three days. Each group is divided into 10–11 teams with two student participants per team. Each team completes the OSCE session, with a BPES exam conducted by each student. The entire OSCE session is conducted during a 4 hour block of time. Following a checklist, teammates sequentially complete a full physical exam on each other in one examination room. One volunteer faculty member is assigned (on a first-come, first-served basis) to enter the examination room to facilitate, assess, and give constructive feedback to each student immediately following the patient examination exercise. When the exercise and feedback are completed, the faculty member leaves the room and the student team takes approximately 5 minutes to switch roles as the patient becomes the practitioner and vice versa. A new faculty member then enters the room and another exam sequence is started. On completion of the second exercise, both team members leave the exam room. The faculty members used for this exercise are practicing physicians who volunteer to participate and have been trained by the CCS staff for the OSCE.
For this exercise the CCS provides 10 examination rooms outfitted to simulate medical examination rooms (Figure 1). Each room is equipped with cameras and microphones that can record digital video of student-patient encounters.
Figure 1.
CCS Facility and Exam Room (A–J) Configuration
The general format of the exercise requires the two-member student team to arrive in one of the open rooms. Each student has 30 minutes to complete the BPES and receive feedback from the faculty member. The faculty member then exits the room and students have 5 minutes to change roles, clean up, and reset the room while waiting for the next available faculty member to arrive. CCS OSCE design seeks to avoid use of any given faculty member to assess the same student team twice. This strategy, based on the preference of the CCS staff, allows a wider range of expertise and feedback to be provided to students during the exercise. Once both students on the team have finished the BPES, they exit the CCS.
Problem Description
The CCS staff routinely encounters problems recruiting the necessary number of volunteer faculty for the 4 hour time blocks required for medical student assessment. It is imperative that the CCS staff schedule enough faculty members to allow adequate time to move the students through the exercise within the given time. At the same time, it is equally important not to recruit too many faculty members. Under-recruitment results in long wait times between encounters for students, and undue hardship for faculty who are forced to grade the student team twice. Over-recruitment means that the volunteer faculty members are underutilized and idle time leaves an impression that they are not really needed, making it more difficult to recruit volunteer faculty for future exercises.
Study Objectives
There are two primary objectives for this study. First, we use the Arena Simulation Software to model and analyze the current CCS scheduling methodology to establish baseline parameters for faculty utilization and student wait times. Second, we vary the scheduling parameters, including the number of faculty members and their schedules, to identify an optimally efficient model that maximizes faculty utilization while minimizing wait times. We analyze the effect of changes in the model inputs on three primary outcome measurements: faculty utilization (the percentage of scheduled time that a faculty member is engaged with students); the number of students seen by each faculty member; and student wait times.
Research Questions:
Will increasing or decreasing the number of volunteer faculty and/or their hours decrease student wait times?
Will increasing or decreasing the number of volunteer faculty and/or their hours improve the volunteer faculty utilization?
What is the average number of assessments performed by each faculty member?
Methodology
Logistics and operations performance outcome measures evaluated by the project included three primary simulation model output parameters: (1) faculty utilization (percent of time that faculty member engages with students); (2) the number of students seen by a faculty member; and (3) student wait times in minutes. Observation of these measures will help determine efficiencies in scheduling.
The baseline Arena® faculty scheduling model was transposed from a paper-based method currently in use by the staff at the CCS. Table 1 shows the typical schedule.
Table 1.
Details of the Baseline and Alternative Models
| Model | Number of Faculty for the full session (4 hours) | Number of Faculty for the first 2 hours of the session | Number of Faculty for the second 2 hours of the session | Total Number of Faculty Required | Faculty Full Time Equivalent (FTE) | Avg. Faculty Utilization | Avg. # of Assessments per Faculty | Avg. Student Wait Time (Min) |
| Baseline | 4 | 2 | 2 | 8 | 6 | 57.00% | 2.75 | 23.78 |
| Model 1 | 4 | 0 | 0 | 4 | 4 | Could not finish within the 4 hour time | n/a | n/a |
| Model 2 | 5 | 0 | 0 | 5 | 5 | Could not finish within the 4 hour time | n/a | n/a |
| Model 3 | 8 | 0 | 0 | 8 | 8 | 50.90% | 2.75 | 10.23 |
| Model 4 | 4 | 4 | 0 | 8 | 6 | 62.14% | 2.75 | 32.91 |
| Model 5 | 4 | 0 | 4 | 8 | 6 | 57.25% | 2.75 | 10.23 |
| Model 6 | 4 | 0 | 2 | 6 | 5 | 67.87% | 3.66 | 31.25 |
| Model 7 | 4 | 2 | 0 | 6 | 5 | 58.17% | 3.14 | 18.42 |
CCS staff used a wall clock to manually record the times the student teams entered and exited exam rooms, and the duration of the exam including time used for providing feedback. The method of timing does not afford a high level of precision but gives an approximation suitable for this research.
The design of the baseline model (the control) was compared to the alternative designs (the experimental groups) for evaluation and recommendations to optimize faculty utilization and decrease student wait times. The null hypothesis is that there is no difference in faculty utilization and student wait times between the base and alternative models. The study examines the effect of changes to the number and distribution of volunteer faculty hours and how it affects the system.
For the analysis, the means of the faculty utilization and the student wait times were compared using a one-way Analysis of Variance (ANOVA). The ANOVA was used to evaluate whether the means of the baseline model and the alternative models differed. A .05 level of significance was used.
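The one-way ANOVA used here partitions the variance of an outcome (for example, student wait time) into between-model and within-model components. As an illustration, the computation can be written out directly; the replicate values below are hypothetical placeholders, not the study's observed data:

```python
# One-way ANOVA from first principles: partition total variation into
# between-group and within-group sums of squares, then form the F statistic.
# The replicate values are hypothetical, not the study's observed data.
groups = {
    "baseline": [23.9, 23.7, 23.8],
    "model_5": [10.3, 10.2, 10.2],
    "model_7": [18.5, 18.4, 18.4],
}

all_obs = [x for g in groups.values() for x in g]
grand_mean = sum(all_obs) / len(all_obs)

# Between-groups SS: spread of each group mean around the grand mean.
ss_between = sum(
    len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups.values()
)
# Within-groups SS: spread of observations around their own group mean.
ss_within = sum(
    (x - sum(g) / len(g)) ** 2 for g in groups.values() for x in g
)

df_between = len(groups) - 1            # k - 1 groups
df_within = len(all_obs) - len(groups)  # N - k observations
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {f_stat:.1f}")
```

A large F relative to the critical value at the .05 level (here F(2, 6) ≈ 5.14) leads to rejecting the null hypothesis of equal means.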
Constraints and Assumptions
CCS staff survey and BPES observations generated the following constraints and assumptions to guide development of the Arena® simulation models. These standard parameters were applied across all simulations:
Once the student teams pick an exam room, they will stay there until both students complete the exam.
Faculty members must evaluate the student in the room they are assigned to by the CCS staff.
Students are given 5 minutes to change roles and reset the room between the sequential examinations.
Students are given a maximum of 60 minutes per student to complete the physical exam and receive faculty feedback, controlled by a timer. The CCS set this limit because the individual BPES examination should take no more than 45 minutes.
Due to the capacity of the CCS (10 exam rooms), the staff restricts the number of student teams to 11. This allows for 10 teams to occupy the exam rooms and for one team to be waiting “on deck” in the student orientation room.
Faculty should assess no more than 3 individual students (if possible) regardless of the length of time the faculty member is scheduled.
Faculty should not assess the same student team more than once.
BPES exercise should finish within 4 hours.
Historically, it has been difficult for the CCS staff to recruit more than 8 total volunteer faculty members.
Data Collection
The data for the baseline model were collected from the BPES OSCE on January 29, 30, and 31 of 2013. During the 2013 OSCE, 66 students were divided into 11 two-person teams per day and evaluated over that 3 day period. With the help of CCS staff, examination counts for students and faculty members and specific time data were recorded.
Analysis of Data for Simulation
Arena Input Analyzer® was used to analyze the data observed for student arrival times and exam length times. While the examination was scheduled to start at a specific time, students were free to enter the rooms when ready. The student arrival times are the time intervals between each student team's arrival in the examination room. The exam length times were the observed lengths of time each faculty member was in the examination room. Observed samples of student arrival times (in minutes) fit a beta distribution with the expression 4.5 + 33 * BETA(0.228, 1.54). The observed sample of exam lengths (in minutes) fit a triangular distribution with a minimum of 40 minutes, a maximum of 60 minutes, and a mode of 50 minutes.
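These fitted expressions can be sampled outside Arena with Python's standard library, whose `random.betavariate` and `random.triangular` functions correspond to the two fitted distributions. This is an illustrative sketch of the input model, not the Arena implementation:

```python
import random

random.seed(42)  # reproducible draws for this sketch

def sample_interarrival():
    """Team inter-arrival time (min): 4.5 + 33 * BETA(0.228, 1.54)."""
    return 4.5 + 33 * random.betavariate(0.228, 1.54)

def sample_exam_length():
    """Exam length (min): triangular with min 40, max 60, mode 50."""
    return random.triangular(40, 60, 50)

arrivals = [sample_interarrival() for _ in range(1000)]
exams = [sample_exam_length() for _ in range(1000)]
print(f"mean inter-arrival ≈ {sum(arrivals) / len(arrivals):.1f} min")
print(f"mean exam length ≈ {sum(exams) / len(exams):.1f} min")
```

By construction, inter-arrival gaps fall in the interval [4.5, 37.5] minutes and exam lengths in [40, 60] minutes, with a long-run mean exam length of about 50 minutes.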
Modeling, Design, and Analysis
In this discrete event simulation model, we optimize resource utilization along with the time that objects take to move through the process/exercise. In our model, the faculty are the resource being optimized and the students are the objects that move through the simulation, waiting in the queue for the resource to become available. The baseline simulation model of BPES scheduling was constructed using 8 volunteer faculty members, based on historical recruitment of volunteers and successful completion of the exercise. In this model 4 faculty members each participated in the full (FT) 4 hour session and 4 faculty members participated part time (PT) for 2 hours. The 2 hour part time sessions are balanced, with 2 faculty members in the first half of the OSCE and 2 faculty members in the second half of the exercise. The simulation begins as each student team arrives at the CCS and moves to the first available room (the same point at which the timer was started to record the input data when the exercise was observed in real time). If no faculty resources are available, the student team has to wait. In the Arena software, resources are idle (not assigned to a student team) or busy (assessing a student). The software treats resources (faculty) as stationary, and entities (students) seeking resources must move to find an idle resource. This inverts real life, where the student teams are stationary and the faculty move between rooms, but the time it takes an entity to find an available resource is measured and is equivalent to the real life situation of students waiting in a room for a faculty member to arrive. In the simulation, each student team is assigned a counter, and the faculty member that evaluates each student is recorded. The counter ensures each student completes the BPES only once, and recording the faculty member ensures that the student team does not use the same faculty resource more than once. The simulation ends when all student teams have completed the process.
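The queueing logic described above can be approximated outside Arena with a small greedy scheduler: faculty members form a resource pool, each team requires two sequential exams separated by a 5 minute role swap, and a team waits whenever no faculty member is free. This sketch uses the fitted input distributions and assigns the first free faculty member to each exam; for brevity it omits the no-repeat-faculty rule and the part time schedules of the actual models:

```python
import heapq
import random

random.seed(1)

N_TEAMS = 11     # student teams per session
N_FACULTY = 8    # baseline pool, all treated as full-session here
SWAP_MIN = 5     # minutes to swap roles and reset the room

def interarrival():
    # Fitted team inter-arrival gap (min): 4.5 + 33 * BETA(0.228, 1.54)
    return 4.5 + 33 * random.betavariate(0.228, 1.54)

def exam_len():
    # Fitted exam length (min): triangular(40, 60) with mode 50
    return random.triangular(40, 60, 50)

# Min-heap of the times at which each faculty member next becomes free.
faculty_free = [0.0] * N_FACULTY
heapq.heapify(faculty_free)

t, waits, finish = 0.0, [], 0.0
for _team in range(N_TEAMS):
    t += interarrival()           # team enters its exam room
    ready = t
    for _student in range(2):     # two sequential exams per team
        f = heapq.heappop(faculty_free)  # next faculty member to free up
        start = max(ready, f)            # team waits if nobody is free
        waits.append(start - ready)
        end = start + exam_len()
        heapq.heappush(faculty_free, end)
        ready = end + SWAP_MIN    # students swap roles, reset the room
    finish = max(finish, ready - SWAP_MIN)  # no swap after the second exam

avg_wait = sum(waits) / len(waits)
print(f"avg wait ≈ {avg_wait:.1f} min, last exam ends ≈ {finish:.0f} min")
```

Running many replications of a loop like this with different seeds mirrors the multi-replication design used to stabilize the Arena estimates.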
This base model was validated by discussions with the CCS staff and historical observation. The model is a terminating simulation run for a total of 1000 replications to reduce variance and provide the best results. The CCS staff also validated that the input parameters were consistent with the times measured during the BPES. This information was valuable in the calibration and debugging of the model.
After the baseline simulation was completed and the results were recorded, alternative simulation models were constructed. The first model (Model 1) decreased the faculty number to 4 members scheduled for the full 4 hour OSCE. The next model (Model 2) slightly increased the total number of faculty members to 5, though still far fewer than the 8 faculty members used in the baseline. Model 3 used 8 faculty members for the full OSCE session, and Models 4 and 5 used 8 faculty members, with 4 members used for the full OSCE session and 4 members used in the first or second halves of the OSCE session, respectively. The final 2 models (Models 6 and 7) used 6 faculty members, with 4 members used for the full OSCE session and 2 members used in the second or first halves of the OSCE session, respectively. Table 1 shows the models analyzed.
Results and Output Analysis
Models 1 and 2, with 4 and 5 full time faculty members respectively, could not be completed within the 4 hour exercise limit. Model 3 shows a 13.55 minute reduction in student wait time from the baseline, but a drop in faculty utilization of 6.10%. In Model 4, with 4 full time resources and 4 part time resources in the first half of the BPES, the faculty utilization increases 5.14% but the student wait time increases by 9.13 minutes. Model 5, with 4 full time resources and 4 part time resources in the last half of the exercise, has a similar student wait reduction as Model 3, and its faculty utilization is almost the same as the base model. Although Model 6 shows a utilization increase of 10.87% over the base model, the average wait time increases by 7.47 minutes. With 4 full time resources and 2 part time resources in the first half of the BPES, Model 7 shows a 5.36 minute decrease in wait times but only a 1.17% increase in faculty utilization. Moreover, in Models 6 and 7, the average number of assessments per faculty member is higher than the CCS staff desired in the model constraints (3.66 and 3.14 respectively). All other models were similar to the base model in this area.
Discussion of Results
In this study discrete event simulation and the Arena® Simulation Software was applied in the modeling of medical education scheduling. Discrete event simulation is useful to build the business process model and make changes “virtually” without impacting the real life model. This is effective in evaluating changes without the cost of putting changes in practice.9 The simulation method and software has had many medical applications and more specifically has been used in modeling of clinical patient scheduling and clinical organization resource scheduling. In one study, a hospital used discrete event simulation and the Arena Simulation Software® to increase emergency department efficiency and decrease patient length of stay.10 Discrete event simulation was used in a similar way by the Department of Anesthesia at the University of Iowa, to maximize the operating room utilization and patient scheduling.11 Therefore, by extension, this type of simulation and related software may be used to model medical education training scenarios, where student time, instructor utilization, or equipment use needs to be optimized.
The ANOVA results in Tables 2 and 3 show that we should reject the null hypothesis and conclude that there is a significant difference between the models. Because a statistically significant difference exists between the models, we tested further for significance in faculty utilization and student wait times between the baseline and each of the alternative models individually. A multiple comparisons test was completed between the base model and each alternative model: no significant differences were found between the faculty utilizations, while the results in Table 4 show significant differences between the student wait times. Given that improved faculty utilization was the primary goal of the exercise, the study revealed that varying the scheduling and/or decreasing the number of scheduled faculty members failed to produce any better results in the simulation. Given that the only statistically significant differences between models occurred in student wait times, a practical look at the models for feasibility is warranted. The models with fewer than 6 faculty members were not evaluated against the base model because the system could not complete an average throughput of 11 student teams. The model with 8 full session faculty members (Model 3) displays the lowest student wait times but is not desirable because it shows the lowest faculty utilization. The CCS staff has also determined that it is not a desirable model due to the difficulty of recruiting 8 volunteer faculty members for the full 4 hour exercise. Models 4 and 6 showed non-significantly higher faculty utilization over the baseline model, but significantly higher student wait times. The model with 4 full session faculty members and 4 faculty members in the second half session (Model 5), along with the model with 4 full session faculty members and 2 in the first half session (Model 7), showed significant decreases in student wait times without sacrificing faculty utilization.
Table 2.
ANOVA for Faculty Utilization
| Faculty Utilization | Sum of Squares | df | Mean Square | F | Significance |
| Between Groups | 0.113 | 5 | 0.023 | 2.895 | 0.026 |
| Within Groups | 0.303 | 39 | 0.008 | ||
| Total | 0.416 | 44 |
Table 3.
ANOVA for Student Wait Time
| Student wait time | Sum of Squares | df | Mean Square | F | Significance |
| Between Groups | 986.338 | 5 | 197.268 | 6492.247 | 0.000 |
| Within Groups | 0.182 | 6 | 0.03 | ||
| Total | 986.521 | 11 |
Table 4.
Multiple Comparisons Test for Student Wait Time
| Baseline vs | µ1–µ2 | Significant (P<.05) | t |
| Model 3 | 13.55 | Yes | 78.231 |
| Model 4 | −9.13 | Yes | 52.712 |
| Model 5 | 13.55 | Yes | 78.231 |
| Model 6 | −7.47 | Yes | 43.128 |
| Model 7 | 5.36 | Yes | 30.946 |
Limitations
The study was conducted based on the observation of one BPES exercise over a 3 day period in January 2013. Data collected from OSCEs in previous years were disposed of before the need for this study was recognized. More research and observations are needed to fully analyze the most efficient faculty schedule. Lack of CCS staff also made accurate collection of travel time between exam rooms and arrival time of faculty difficult. Use of video recording in the CCS main walkways would have helped in this process. Additionally, use of time keeping on the evaluation forms would have provided additional accuracy in exercise completion times.
Because the CCS staff limits the number of student teams per four hour period, increases in student numbers and decreases in available faculty resources have caused the CCS staff to extend the BPES over additional days.
Conclusion
Based on the statistical analysis, we suggest that the CCS continue to use the baseline model. As a possible alternative, the model with 4 full session faculty and 4 faculty scheduled in the second half session (Model 5) could be adopted; forgoing this alternative configuration means accepting higher student wait times. The original aim of the study was to increase faculty utilization and decrease student wait times, but because of constantly decreasing faculty resources, utilization is given preference. It is extremely difficult to get practicing/volunteer physicians to commit to participating in the BPES exam. In the last exam, the CCS staff was forced to use faculty who had not practiced medicine in 10 years. Because of this, CCS leadership and staff felt that efficient utilization of the faculty was a higher priority than the wait times of the students.
An important limitation of any simulation study is that the modeling cannot account for the scheduling limitations imposed by the lack of staffing and faculty participation, which severely restrict the operation of this exercise. The BPES is an essential foundation for medical education and clinical practice.4 Knowing this, the school does its best to staff the exercise with volunteer physicians from the community, for whom the biggest incentive is free parking. The volunteer faculty who do participate are dedicated to cultivating future doctors for this region, but more resources are needed to make this exercise, the CCS, and JABSOM successful. JABSOM is isolated in the Pacific region, with one state-run university and one medical school. The school produces 80% of the “Best Doctors” who practice in the state and will continue to need support from the physician community to maintain and continue to improve the quality of medical education.12 For more information on the John A. Burns School of Medicine or becoming a volunteer faculty member, please visit http://jabsom.hawaii.edu/faculty/volunteeraffiliated-faculty/.
Conflict of Interest
None of the authors identify a conflict of interest.
References
- 1.Wallace P. Following the Threads of an Innovation: The History of Standardized Patients in Medical Education. Caduceus. 1997;13:5–28. [PubMed] [Google Scholar]
- 2.Raguso E. Acting Sick. Metro Silicon Valley Website. [March 10, 2013]. http://www.metroactive.com/metro/06.14.06/patient-actors-0624.html.
- 3.Rosen K. The History of Medical Simulation. Journal of Critical Care. 2008 Jun;23(2):157–166. doi: 10.1016/j.jcrc.2007.12.004. [DOI] [PubMed] [Google Scholar]
- 4.Melish JS. Teaching Clinical Skills at John A. Burns School of Medicine: Philosophy and Practice - A Continuing Journey. Hawaii J Med Public Health. 2012;71(5):136–138. [PMC free article] [PubMed] [Google Scholar]
- 5.Larsen DP, Butler AC, Lawson AL, Roediger HL. The importance of seeing the patient: test-enhanced learning with standardized patients and written tests improves clinical application of knowledge. Advances in Health Science Education. 2012 Jan;18(3):409–425. doi: 10.1007/s10459-012-9379-7. [DOI] [PubMed] [Google Scholar]
- 6.John A. Burns School of Medicine Center for Clinical Skills Website. [March 1, 2013]. http://jabsom.hawaii.edu/JABSOM/admissions/clinSkills.php?l1=mdp.
- 7.Yudkowsky R, Downing SM, Ommert D. Prior experiences associated with residents' scores on a communication and interpersonal skill OSCE. Patient Education and Counseling. 2006 Apr;62(3):368–373. doi: 10.1016/j.pec.2006.03.004. [DOI] [PubMed] [Google Scholar]
- 8.Zayyan M. Objective structured clinical examination: The Assessment of Choice. Oman Medical Journal. 2011 Jul;26(4):219–222. doi: 10.5001/omj.2011.55. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Arena Rockwell Automation Website. [3 March 2013]. http://www.arenasimulation.com/
- 10.Hospital Simulation Confirms New Facility Expenditure Not the Solution. Arena Rockwell Automation Website. [1 November 2014]. http://www.arenasimulation.com/public/uploads/files/resources/New_Jersey_Hospital_Simulation.
- 11.Dexter F, Macario A, Traub R, Hopwood M, Lubarsky DA. An Operating Room Scheduling Strategy to Maximize the Use of Operating Room Block Time: Computer Simulation of Patient Scheduling and Survey of Patient's Preferences for Surgical Waiting Time. Anesthesia & Analgesia. 1999;89:7–20. doi: 10.1213/00000539-199907000-00003. [DOI] [PubMed] [Google Scholar]
- 12.Shelton T. Over 80% of Honolulu's 2014 “BEST DOCTORS” Trained or Teach at UH Medical School. UHMedNow. [5 November 2014]. http://blog.hawaii.edu/uhmednow/2014/06/05/more-than-80-of-honolulu%CA%BBs-2014-best-doctors-trained-at-uh-medical-school/

