Author manuscript; available in PMC: 2019 Jun 8.
Published in final edited form as: IISE Trans Occup Ergon Hum Factors. 2018 Jun 8;6(3-4):165–177. doi: 10.1080/24725838.2018.1456988

Ambulatory Clinic Exam Room Design with respect to Computing Devices: A Laboratory Simulation Study

Dustin T Weiler 1,2, Tyler Satterly 1,2, Shakaib U Rehman 3,4, Maury A Nussbaum 5, Neale R Chumbler 6, Gary M Fischer 7, Jason J Saleem 1,2,*
PMCID: PMC6448389  NIHMSID: NIHMS947414  PMID: 30957056

Abstract

Background

Challenges persist regarding how to integrate computing effectively into the exam room, while maintaining patient-centered care.

Purpose

Our objective was to evaluate a new exam room design with respect to the computing layout, which included a wall-mounted monitor for ease of (re)-positioning.

Methods

In a lab-based experiment, 28 providers used prototypes of the new and older “legacy” outpatient exam room layouts in a within-subject comparison using simulated patient encounters. We measured efficiency, errors, workload, patient-centeredness (proportion of time the provider was focused on the patient), amount of screen sharing with the patient, workflow integration, and provider situation awareness.

Results

There were no statistically significant differences between the exam room layouts for efficiency, errors, or time spent focused on the patient. However, when using the new layout providers spent 75% more time in screen sharing activities with the patient, had 31% lower workload, and gave higher ratings for situation awareness (14%) and workflow integration (17%).

Conclusions

Providers seemed to be unwilling to compromise their focus on the patient when the computer was in a fixed position in the corner of the room and, as a result, experienced greater workload, lower situation awareness, and poorer workflow integration when using the old “legacy” layout. A thoughtful design of the exam room with respect to the computing may positively impact providers’ workload, situation awareness, time spent in screen sharing activities, and workflow integration.

Keywords: Human-computer interaction, Computer workstations, Mental workload, Exam room design, Exam room computing, Patient centeredness

1. INTRODUCTION

Substantial research has evaluated the impact of the electronic health record (EHR) on the provider-patient interaction in ambulatory care. However, challenges persist regarding how best to integrate the EHR into patient visits and clinical workflow without adversely influencing the provider-patient interaction and relationship (Patel, Vichich, Lang, Lin, & Zheng, 2017; Saleem et al., 2014). To integrate computerized applications into the patient visit while maintaining patient-centeredness, the computer and EHR should be viewed as a “third party” that serves as a mediator between provider and patient (Saleem et al., 2014). This perspective counters the view that integrating computers and the EHR harms patient-centeredness, owing to the exam room layout and the inability of these systems to substitute effectively for existing paper-based clinical workflows (Saleem et al., 2014). Integrating EHRs into the patient visit while maintaining patient-centeredness may thereby enhance, rather than degrade, the provider-patient relationship.

Several practices support effective integration of computers into exam rooms. A systematic review found that multiple studies support sharing the computer and what is on the screen, adjusting room design, and attending to verbal and nonverbal communication (Patel et al., 2017). However, when the EHR is introduced and used in provider-patient encounters, the provider-patient relationship is affected both by the provider’s body orientation (Frankel, 2016; Pearce, Dwan, Arnold, Phillips, & Trumble, 2009) and by the patient’s behaviors with the computer (Pearce, Arnold, Phillips, Trumble, & Dwan, 2011). In one study, the provider’s body orientation was classified as either ‘unipolar’, oriented toward the computer, or ‘bipolar’, fluctuating between facing the patient and facing the computer (Pearce et al., 2009). Patient behavior with the computer and EHR in the room was classified into three types: ‘screen watching’, ‘screen ignoring’, and ‘screen excluding’, the last used by patients to try to influence the provider’s actions (Pearce et al., 2011). A recent study demonstrated that patients looked at the computer twice as much when the screen was within their gaze, and that the EHR was used for a consistent proportion of the interaction (Kumarapeli & de Lusignan, 2013). Therefore, if increased provider-patient interaction is desired with the inclusion of the EHR or computer, specific layout guidelines are needed to induce interaction and facilitate the computer’s role in it.

Computers are often placed wherever proper wiring is available, and this positioning often affects communication (Ventres et al., 2006). Previous studies have focused on how computer use affects interactions between providers and patients in exam room settings (McGrath, Arar, & Pugh, 2007; Patel et al., 2017; Rouf, Whittle, Lu, & Schwartz, 2007). A systematic review suggests a gap in research evaluating the practice of room design through randomized controlled trials; most studies reviewed were observational (Patel et al., 2017). McGrath et al. (2007) identified three office spatial designs: ‘open’, ‘closed’, and ‘blocked’. An ‘open’ orientation has the physician oriented toward the patient, even when using the computer; a ‘closed’ orientation has the physician’s back turned to the patient while using the computer; and a ‘blocked’ orientation has the physician oriented toward the patient but with the computer monitor obstructing the view between physician and patient. The ‘open’ arrangement put physicians in a position to establish better eye contact and physical orientation than did the other configurations.

This study was conducted to obtain empirical evidence regarding provider preference and performance differences when using a more flexible, easily reconfigured exam room layout. An additional aim was to assess whether a redesigned exam room layout benefits the provider-patient relationship. To do this, we designed and conducted a study comparing two layouts (current version ‘A’ vs. new version ‘B’). The former had a desktop computer placed in the corner of the room (Figure 1), while the latter included an all-in-one computer attached to a wall-mounted armature system that was adjustable along three axes (Figure 2), making it easier for providers to achieve an ‘open’ position (McGrath et al., 2007). Layout A, with the computer monitor placed on a desk in a corner of the room, is a typical arrangement in practice, especially from when computers were initially introduced into exam rooms (Frankel et al., 2005; Frankel & Saleem, 2013). The impact of exam room computer placement on provider-patient communication, both verbal and non-verbal, was in many cases not considered (McGrath et al., 2007), resulting in convenience-based placement of the computer (e.g., by the nearest electrical outlet). Based upon the flexibility and maneuverability offered by the set-up in the new layout, we expected layout B to result in greater efficiency and accuracy, increased evidence of patient-centeredness, better alignment with providers’ clinical workflow, enhanced perceived situation awareness, and decreased perceived workload.

Figure 1. Current design, layout A, with the computer workstation on a fixed desk in the corner of the room.

Figure 2. New design, layout B, with a wall-mounted armature system for the computer monitor.

2. METHODS

2.1. Participants

An a priori power analysis was completed, based on the primary outcome measure of workload as measured by the NASA Task Load Index (TLX) (Hart & Staveland, 1988). From our previous studies measuring human performance, we estimated the standard deviation of NASA-TLX workload scores as 13.2. The NASA-TLX has a range of 100 points, and a difference of 10 points was considered a relevant difference. Assuming Type I and Type II error rates of 0.05 and 0.20, respectively, a sample size of 28 participants provides 80% power for detecting a 10-point difference between the current design and the redesign.
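The exact formula behind this calculation is not stated in the text, but a standard two-sample normal-approximation calculation with the stated inputs reproduces the reported sample size. A minimal sketch (our own illustration in Python with SciPy, not the authors’ code):

```python
# Sample size sketch for detecting a 10-point NASA-TLX difference
# (SD = 13.2, alpha = 0.05, power = 0.80). The two-sample normal
# approximation below reproduces the reported n = 28; a paired-samples
# formula with the same inputs would imply a smaller n, so 28 is
# conservative for the within-subjects design used here.
import math
from scipy.stats import norm

sd, delta = 13.2, 10.0              # SD estimate and smallest relevant difference
alpha, power = 0.05, 0.80

d = delta / sd                      # standardized effect size, ~0.76
z_alpha = norm.ppf(1 - alpha / 2)   # 1.96 for a two-sided test
z_beta = norm.ppf(power)            # 0.84 for 80% power

n = 2 * (z_alpha + z_beta) ** 2 / d ** 2
print(math.ceil(n))                 # -> 28
```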

A total of 28 healthcare providers (17 male, 11 female) completed the study, with a mean age of 31 years (range: 26-59). Using a convenience sampling method, four attending physicians, 23 resident physicians, and one nurse practitioner were recruited. In total, 26 of the 28 providers used the Department of Veterans Affairs (VA) Computerized Patient Record System (CPRS) as their EHR often or occasionally; the majority were resident physicians who had previously rotated through the VA and had used CPRS. At the time of the study, eight providers used a wall-mounted armature system in the exam room, five used a stationary desktop, six used a laptop, seven did not use a computer in any capacity, one used a computer on wheels, and one provider did not respond. All providers had experience working with patients in an outpatient examination room; 24 were employed through the University of Louisville, two through an independent family practice, one through the Baptist Health Center, and one through the Louisville VA Medical Center.

2.2. New exam room design

Our redesigned exam room layout with respect to the computing is based on the VA’s new exam room design standard. The redesigned exam room includes a mobile computing workstation with an armature system and a moveable table that can rotate against the wall or rotate out to form a consult surface for a keyboard or printed materials that can be viewed with the patient. Historically, computers were introduced into exam rooms with the desk and computer fixed to the wall in a way that potentially encouraged the clinician to turn their back to the patient while using the EHR. The VA Office of Construction & Facilities Management decided that the new exam room design should minimize dependency on a built-in desk, which seemed to encourage a ‘move-in and occupy’ mindset. The new exam room was designed for efficiency, encouraging the provider to move from one exam room to another, consistent with new team-based models of care (Helfrich et al., 2016) in which members of the healthcare team rotate to the patient in a single location. We simulated this new exam room design in our laboratory, as well as the older exam room design with a computer on a desk against a wall.

2.3. Experimental design

We used a single-factor, within-subjects experimental design. The single factor was ‘Type of Exam Room Layout’ with two levels (A, B), one representing a current, typical exam room layout (A), and the other representing the redesigned layout, where the EHR/computer is designed to be more easily incorporated into the provider-patient interaction (B). The presentation order of designs A and B was counterbalanced to account for potential crossover effects. Dependent measures addressed efficiency, errors, workload, patient-centeredness, screen sharing, workflow integration, and situation awareness. Table 1 lists and defines the outcome measures, and describes the data collection tool or method used for each.

Table 1.

Outcome measures for comparing a current, typical exam room layout with the redesigned layout during lab simulation study.

Outcome measure | Definition | Measuring tool/method
Efficiency | Efficiency completing scenarios with the given exam room and computing layout | Time to complete test scenarios
Errors | Deviations or omissions from the given clinical scenarios | Completeness of each clinical scenario
Workload | The difference between the amount of resources available within a person and the amount of resources demanded by the task situation (Sanders & McCormick, 1993) | NASA Task Load Index (TLX) (Hart & Staveland, 1988)
Patient-centeredness | Time the provider is focused on the patient compared to the computer | Eye gaze (Montague & Asan, 2014; Montague et al., 2011)
Amount of screen sharing with the patient | Time spent sharing information from the EHR and related software programs where both the provider and patient are viewing the computer monitor | Time spent during screen sharing activities
Workflow integration of computer/EHR | Degree to which new technology is tailored such that it fits into the clinician’s workflow process for delivering patient care | Workflow Integration Survey (WIS) (Flanagan et al., 2011)
Situation awareness | Perception and comprehension of elements in the environment, and projection of their status in the future (Endsley, 1995) | Situation Awareness Rating Technique (SART) (Selcon & Taylor, 1990)

For efficiency, errors, patient-centeredness, and screen sharing, data were collected using video recordings and screen captures from Morae software (version 3.3.4, TechSmith Corporation, Okemos, MI). Specifically, time to complete a scenario (efficiency) was measured through a task-timing function with video recordings, while errors were measured by evaluating screen captures of the provider’s CPRS inputs and video recordings from two cameras: one facing the provider and patient, and the other mounted atop the exam room computing device, which respectively captured screen sharing and patient-centeredness. Data for the NASA-TLX were collected via a computer-based survey with a scale of 1-100. The WIS and SART were paper-based instruments with scales of 1-5 and 1-7, respectively.
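As an illustration of how the video-derived measures reduce to simple interval arithmetic, the sketch below computes patient-centeredness (percentage of encounter time with the provider’s gaze on the patient) from coded gaze intervals. The interval format and function names are hypothetical illustrations, not Morae’s actual export schema:

```python
# Hypothetical post-processing of coded video annotations; the interval
# representation is illustrative, not Morae's export format.
from typing import List, Tuple

Interval = Tuple[float, float]  # (start_s, end_s) of one coded behavior span

def total_duration(intervals: List[Interval]) -> float:
    """Sum the durations of coded intervals (assumed non-overlapping)."""
    return sum(end - start for start, end in intervals)

def patient_centeredness(gaze_on_patient: List[Interval],
                         scenario_duration_s: float) -> float:
    """Percentage of the encounter the provider's gaze was on the patient."""
    return 100.0 * total_duration(gaze_on_patient) / scenario_duration_s

# Example: three coded gaze spans totaling 139 s in a 604-second encounter
# (the layout A means reported in Table 2) give roughly 23%.
gaze = [(10.0, 55.0), (120.0, 171.0), (300.0, 343.0)]
print(f"{patient_centeredness(gaze, 604.0):.1f}% of time focused on patient")
```

Screen sharing time follows the same pattern, summing the coded intervals in which both provider and patient were viewing the monitor.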

2.4. Procedure

Providers were brought to the Center for Ergonomics laboratory, where they read an IRB-approved informed consent form. A brief overview of the study was provided, followed by a five-minute guided familiarization session with the EHR used for the study, the VA’s CPRS. Upon completion of the familiarization session, the first testing session began with one of the two scenarios and layouts. The facilitator took care not to refer to the exam room layouts as “old” and “new”, instead referring to them as “first” and “second”. Once the session was completed, or the 20-minute time limit was reached, the provider left the simulation area to complete the paper-based SART and WIS, as well as the computer-based NASA-TLX. The provider was then brought back into the simulation area to complete the second session using the alternative layout (i.e., the provider’s second simulated scenario and layout differed from the first). As in the first session, once the scenario was completed, or the 20-minute time limit was reached, the provider left the simulation area to complete the SART, WIS, and NASA-TLX. Finally, the provider was guided through a semi-structured debrief session to gather any final thoughts pertaining to the study (see Appendix 1 for the semi-structured interview guide). After the debrief session concluded, the provider was compensated and dismissed. The entire session was designed not to exceed 90 minutes.

2.5. Simulation scenarios

We used similar outpatient visit scenarios for the provider to complete with both room layouts (A and B). These scenarios were reviewed and revised by a physician consultant to ensure a sufficient level of realism. Fictitious patient records for our scenarios were entered into the demo version of CPRS and populated with the scenario data, including historical and current vitals, a previous progress note, and a medication list. A member of the study team [JJS] played the part of the patient. The patient actor asked for similar actions from the provider regardless of the layout and scenario: the patient actor gave the provider a list of current medications and, to show interest in looking at their EHR record, asked to see a history of vital readings from previous visits (blood pressure or respiratory rate, depending on the specific patient scenario). The scenarios differed only in ‘surface-level’ aspects such as the fictitious patient’s name, with a similar chief complaint, similar co-morbidities, and similar medications. The scenarios required providers to complete the same tasks, including creating a progress note, sharing lab results with the patient, medication reconciliation, ordering/renewing medications, and other tasks common to a primary care visit. Providers were asked to complete the clinical tasks; no instructions were given regarding patient-centeredness or screen sharing. The presentation order of the two patient scenarios was counterbalanced across layouts A and B (in addition to the layouts being counterbalanced across providers): the first provider used layout A with scenario 1, then layout B with scenario 2; the second provider used layout B with scenario 1, then layout A with scenario 2; the third provider used layout A with scenario 2, then layout B with scenario 1; and the fourth provider used layout B with scenario 2, then layout A with scenario 1. This counterbalancing scheme was repeated for the next 24 providers, as sketched below. See Appendix 2 for the scenarios.
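A minimal sketch of this counterbalancing scheme, cycling the four layout-by-scenario order conditions across the 28 providers (the condition encoding is our own illustration of the scheme described above):

```python
# Cycle the four order conditions described in the text across providers.
from itertools import cycle

# Each condition lists (layout, scenario) for session 1 and session 2.
conditions = [
    [("A", 1), ("B", 2)],
    [("B", 1), ("A", 2)],
    [("A", 2), ("B", 1)],
    [("B", 2), ("A", 1)],
]

schedule = cycle(conditions)
for provider_id in range(1, 29):       # 28 providers
    first, second = next(schedule)
    print(f"Provider {provider_id:2d}: {first} then {second}")
```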

2.6. Layouts A and B

Layout A, with respect to the computing, is shown in Figure 1, and Layout B in Figure 2. Layout A has a simple computer and 19-inch monitor set up on a desk at the nearest electrical outlet, without regard to the location of the patient, the patient table, or other needed medical tools. Layout B has an all-in-one computer (19.5-inch monitor) attached to a wall mount that moves the screen along three axes, allowing the screen position to be adjusted depending upon the scenario. Placement of the wall mount was determined by where the most open space was located in the exam room, so as not to limit movement of the screen along any axis. This is consistent with the VA’s new exam room design standard, which is the basis for Layout B. Both simulated exam rooms were of high fidelity with regard to the exam room computing device, room layout, and furniture. However, we did not include many smaller items typically found in exam rooms, such as a blood pressure monitor, ophthalmoscope, and supply cart.

2.7. Analysis

Analyses compared the current, typical exam room layout (A) with the redesigned layout (B) on the measures in Table 1. Each provider completed the NASA-TLX, WIS, and SART instruments twice, once for each layout. The SART instrument for situation awareness contained 10 items rated on a Likert-type scale from 1-7. The 10 items map to three subscales: ‘understanding’, ‘demand’, and ‘supply’. A composite SART score for situation awareness (SA) was calculated as SA = U − (D − S), where U is the summed understanding, D the summed demand, and S the summed supply. Paired t tests were used to compare outcomes between the two layouts when parametric assumptions were met, and Wilcoxon Signed Rank tests were used otherwise. Statistically significant differences between layouts were concluded using a significance level of 0.05, as illustrated in the sketch below.
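The sketch below illustrates this scoring and test-selection logic. The grouping of the 10 SART items into the demand, supply, and understanding subscales follows the standard 10-dimension SART (the paper does not enumerate the items), and the Shapiro-Wilk screen is one common way to check the parametric assumption; both are assumptions of this illustration:

```python
# SART composite scoring (SA = U - (D - S)) and layout comparison.
# The item-to-subscale grouping follows the standard 10-D SART and is an
# assumption here; the normality screen is likewise illustrative.
import numpy as np
from scipy.stats import shapiro, ttest_rel, wilcoxon

DEMAND = slice(0, 3)          # instability, complexity, variability
SUPPLY = slice(3, 7)          # arousal, concentration, division, spare capacity
UNDERSTANDING = slice(7, 10)  # information quantity, quality, familiarity

def sart_composite(items: np.ndarray) -> float:
    """items: the 10 SART ratings (1-7), ordered demand, supply, understanding."""
    d = items[DEMAND].sum()
    s = items[SUPPLY].sum()
    u = items[UNDERSTANDING].sum()
    return u - (d - s)

def compare_layouts(scores_a: np.ndarray, scores_b: np.ndarray):
    """Paired t-test if the paired differences look normal, else Wilcoxon."""
    _, p_normal = shapiro(scores_b - scores_a)
    if p_normal > 0.05:
        return ttest_rel(scores_a, scores_b)
    return wilcoxon(scores_a, scores_b)
```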

Debriefing responses were recorded for all 28 providers. The debrief interviews were first transcribed from audio recordings. Then, responses from the debrief interview transcripts were reviewed by a member of the study team for recurrent themes across providers. A second study team member reviewed and verified the summary of interview responses for repeating patterns within the full study sample. Recurrent themes centered around layout preference, provider-patient interaction, and redesign recommendations.

The remote database supporting the demo version of CPRS was inaccessible during the last provider’s session. Therefore, quantitative data for this provider were not included (i.e., the sample size was 27 for the statistical analyses).

3. RESULTS

A summary of statistical results is provided in Table 2. There were no significant differences between layouts for measures of efficiency, errors, or patient centeredness. However, the layouts differed significantly in time spent in screen sharing activities, provider-perceived situation awareness, and workload.

Table 2.

Results for Efficiency, Errors, Patient Centeredness, Screen Sharing, and Situation Awareness (n=27)

Outcome measure | Layout A – Mean (SD) | Layout B – Mean (SD) | Statistical test used | p-value
Efficiency – time to complete scenario (seconds) | 604 (202.9) | 585 (205.0) | Wilcoxon Signed Ranks test | 0.501
Errors – number of errors committed | 1 (0.9) | 1 (0.9) | Wilcoxon Signed Ranks test | 0.529
Patient centeredness – time focused on patient (seconds) | 139 (87.7) | 128 (84.5) | Wilcoxon Signed Ranks test | 0.648
Patient centeredness – percentage of time focused on patient | 22 (9.2) | 21 (8.5) | Paired t-test | 0.482
Screen sharing – time screen sharing with patient (seconds) | 24 (20.5) | 42 (35.8) | Wilcoxon Signed Ranks test | 0.022*
Situation awareness | 22 (6.9) | 25 (5.7) | Paired t-test | 0.017*

Note: * denotes statistical significance.

For workload, five of the six NASA-TLX subscales differed significantly between layouts (Table 3); the mental workload subscale only approached significance.

Table 3.

NASA-TLX Subscale Comparison of Layout A vs. B (comparisons using paired t tests; n=27)

NASA-TLX subscale | Layout A – Mean (SD) | Layout B – Mean (SD) | p-value
Mental workload | 53 (28.7) | 44 (25.9) | 0.054
Physical workload | 35 (28.9) | 16 (12.0) | 0.003*
Temporal | 53 (22.3) | 40 (24.9) | 0.030*
Performance | 54 (25.1) | 44 (28.7) | 0.049*
Effort | 55 (24.6) | 38 (21.7) | <0.001*
Frustration | 60 (29.8) | 35 (25.4) | <0.001*
Overall workload | 52 (20.0) | 36 (17.0) | <0.001*

Note: * denotes statistical significance.

Finally, three of the four WIS subscales, as well as the total WIS score, differed significantly between layouts (Table 4); the paper workarounds subscale approached significance.

Table 4.

Workflow Integration Survey (WIS) analysis Layout A vs. B (n=27)

WIS subscale | Layout A – Mean (SD) | Layout B – Mean (SD) | p-value
Navigation | 3.5 (1.0) | 4.0 (0.8) | 0.008*
Usability | 2.6 (1.2) | 3.4 (1.0) | <0.001*
Paper workarounds | 3.3 (1.1) | 3.5 (1.1) | 0.057
Workload | 2.6 (0.7) | 3.1 (0.9) | 0.002*
Total | 3.0 (0.8) | 3.5 (0.8) | <0.001*

Note: * denotes statistical significance.

Table 5 summarizes the themes revealed by analysis of the semi-structured debrief interviews. Two members of the study team agreed that the interviews revealed concepts related to three main themes: (1) layout preference; (2) provider-patient interaction; and (3) redesign recommendations. All providers indicated a preference for layout B because of the mobility afforded by the wall-mounted armature system and because the patient was within the provider’s field of view. Similarly, providers indicated that layout B facilitated provider-patient interaction because the patient was in close proximity and interacting with the patient did not require ergonomically awkward postures (with layout A, providers turned and contorted their torso, neck, etc. to face the patient). Finally, providers offered redesign recommendations for both layouts. For layout A, they suggested moving the patient to a location within the provider’s field of view (i.e., next to the desk). For layout B, providers recommended that the wall-mounted armature system be fully adjustable in the vertical direction so they could stand if needed.

Table 5.

Debrief Interview Responses; Themes and Subthemes (n=28)

Theme | Subthemes
Layout preference | Mobility; field of view
Provider-patient interaction | Spatial relationship to patient; ergonomic discomfort
Redesign recommendations | Patient location; adjustable work area

4. DISCUSSION

The academic literature supports several practices for promoting provider-patient interaction with the use of exam room computing (Patel, Vichich, Lang, Lin, & Zheng, 2017). Recommended behavioral and communication practices, as supported by evidence, are: (1) using the computer to facilitate conversation; (2) adjusting room design; (3) maintaining eye contact with the patient while typing; (4) separating typing and patient interaction; (5) talking to the patient while gazing at the computer; (6) using a postural style that allows the clinician to face the patient most of the time; (7) inviting the patient to look at the screen before the patient asks; and (8) informing the patient about the functions and role of the computer. Adjusting the exam room design was the focus of our study, as it is both strongly supported by available research evidence and also related to other evidence-based strategies for promoting provider-patient interaction.

Recommended exam room design practices include arranging the computer so that the patient can simultaneously view the record, and using computers that allow for easy repositioning of the screen (Baker, Reifsteck, & Mann, 2003; Ventres et al., 2006). Adjustable and moveable furniture have also been reported to facilitate orienting the room layout to be more patient-centered (Patel et al., 2017). The new exam room design used here incorporated these recommended design practices, and our findings support the notion of ‘using the computer to facilitate conversation’, an evidence-based strategy for promoting provider-patient interaction with the use of exam room computing (Patel et al., 2017). The new exam room design seems to facilitate this strategy. The new design, with the ability to easily reposition the monitor and easily move the workspace furniture, may also facilitate other evidence-based practices for promoting provider-patient interaction such as: maintaining eye contact with the patient while typing; using a postural style that allows the provider to face the patient most of the time; and inviting the patient to look at the screen before the patient asks (Patel et al., 2017).

4.1. Efficiency, Errors, and Patient Centeredness

Objective measurements of efficiency, errors, and patient centeredness (percentage of time focused on the patient) did not differ between layouts. These results are, to the best of our knowledge, unique with respect to related studies. Others have found that the spatial organization of the exam room, including placement of the computer, could inhibit or facilitate communication (Frankel et al., 2005). The arrangement those authors found to facilitate communication was similar to our layout B, with a wall-mounted armature system for the computer monitor for ease of (re)-positioning. However, while Frankel et al. (2005) showed that this type of arrangement facilitated provider-patient communication, their study was qualitative and did not measure visit efficiency, errors, or time focused on the patient. It is therefore unclear whether the providers in their study were predominately focused on the patient or the computer screen while communicating with the patient. One study that did measure time focused on the patient compared only the use of paper-based records with an EHR (Asan, Smith, & Montague, 2014); those authors found that providers spent a significantly smaller proportion of time gazing at the patient when using an EHR than when using a paper chart. One interpretation of the lack of a substantial difference in our study is that neither layout helps (or hinders) a provider’s performance on these measures. Alternatively, the lack of a clear difference may reflect that the provider did not have to rely more or less on the EHR based on the scenario. Moreover, the provider could gather much of the needed information by interacting with the patient rather than the EHR, so the EHR served more as an assistive tool to help facilitate conversation between provider and patient. Because the EHR was not a crutch for the provider’s performance, the provider could dictate how much EHR use to incorporate into the patient visit; the amount of such use is variable, which may have contributed to the lack of significant differences in time, number of errors, and time focused specifically on the patient.

4.2. Workload

We believe the current study is the first to measure changes in perceived workload across different exam room layouts. Layout B was more favorable in terms of perceived physical workload, temporal workload, performance, effort, and frustration. Although the performance subscale of the NASA-TLX favored layout B, the objective performance measures (time and errors) showed no significant differences. Comments given during debriefing match these findings: providers complained about the amount of physical movement and general discomfort encountered while using layout A. The most common complaints were having to turn around constantly to shift attention between the EHR and patient, twisting at the waist to look over their shoulder to check on the patient while interacting with the EHR, and having their back turned toward the patient. Constantly adjusting body posture to accommodate the EHR and patient is a logical explanation for the less favorable physical workload ratings for layout A. Providers also mentioned that they felt rude having their back turned to the patient and that layout A would have been easier had they taken paper notes, consistent with the high frustration scores for layout A. The temporal workload, effort, and frustration subscales were significantly lower with layout B, likely because layout B could be personalized to accommodate various patient locations, easing attention shifts between the EHR and the patient.

4.3. Screen Sharing

To our knowledge, this is the first study to measure differences in time spent in screen sharing activities between exam room layouts. Layout B led to more screen sharing time than layout A. As with the NASA-TLX subscales, the likely cause is the wall-mounted system: with layout B the computer is fully adjustable, potentially making providers more willing to share the screen with the patient. With layout A, the only way to effectively share the screen was to relocate the patient to the screen, whereas with layout B the provider could adjust and move the screen to the patient. This not only promotes screen sharing but also likely promotes patient centeredness. However, during debriefing, providers expressed concern about the potential of a patient seeing information the provider did not intend to share. This concern is consistent with another study (Asan, Carayon, Beasley, & Montague, 2015) that investigated factors influencing providers’ screen sharing behaviors in primary care encounters; providers in that work did not want the patient to see the screen when they were looking at a psychiatrist’s note or when they were documenting embarrassing information or legal issues.

4.4. Workflow

The WIS instrument, and similar workflow integration assessment tools, have not been used in previous studies of exam room layout. The three WIS subscales of navigation, usability, and workload, as well as overall WIS scores, differed significantly between layouts, with layout B scoring better. Providers rated layout B higher, indicating they believed it was easier to incorporate into their clinical workflow than layout A. The debrief interviews help interpret these results. Providers mentioned that layout A involved having their back to the patient, which made interacting with the EHR and the patient very difficult. In contrast, with layout B, shifting focus between the EHR and the patient was nearly seamless, involving a simple shift in eye gaze. This easy shift in attention allowed providers to make changes in the EHR and talk to the patient without changing positions, which may explain layout B’s more favorable WIS scores. The one WIS subscale that was not statistically different, ‘paper-based workarounds’, trended toward significance. The lack of difference for this subscale may reflect the simulation environment: providers did not have access to any paper materials aside from a one-page overview of the patient scenario and a list of medications provided by the patient. In a real-world setting, it is possible that certain paper-based workarounds would develop over time.

4.5. Situation Awareness

Our assessment of changes in providers’ situation awareness with different exam room layouts is, we believe, novel in the existing literature. Perceived situation awareness was higher with layout B, most likely facilitated by the flexibility of the wall mount. The mounting system allows the provider to keep the patient in their peripheral vision, giving the provider freedom to shift eye gaze quickly between the EHR and the patient, and enabling the provider to visually sense a disturbance with the patient while focused on the EHR, and vice versa. With layout A, if a provider needed to visually check the patient, they had to either move their body to bring the patient within their gaze or move the patient next to them.

4.6. Debrief Interviews

Debrief interview results were organized into major themes of layout preference, provider-patient interaction, and redesign recommendations. Providers preferred layout B because it facilitated (1) conversation; (2) maintaining eye contact with the patient while typing; (3) talking to the patient while gazing at the computer; and (4) a postural style that allows the clinician to face the patient most of the time. This is consistent with several practices for promoting provider-patient interaction with the use of exam room computing outlined by Patel et al. (2017), including using the computer to mediate conversation. Indeed, layout B, with its wall-mounted monitor for ease of (re)-positioning, allowed for a “joint focus of attention” (Frankel & Saleem, 2013) that seems to help the provider better manage the medical encounter. Just as an aviation pilot relies on the external field of view as well as the instrument panel during complex coordinated actions, the medical provider can achieve the same joint focus of attention with the patient and the EHR when the layout allows the computer monitor to be positioned in close proximity to the patient.

4.7. Summary

Although there were no significant differences in performance measures between the layouts (i.e., efficiency, number of errors, and patient centeredness), providers experienced lower workload, better workflow integration, more screen sharing, and greater perceived situation awareness with layout B. Providers seemed unwilling to compromise their focus on the patient when using layout A and thus experienced greater mental and physical workload and lower situation awareness. In other words, a thoughtful exam room design such as layout B (and potential future modifications of it) may not improve physician performance or patient centeredness. However, our results support that manipulating the design and placement of exam room computing can reduce physicians’ perception of their overall workload, including physical demand, temporal demand, performance, effort, and frustration. Our results also suggest that a more thoughtful design may improve providers’ perceived situation awareness, as well as their perceived integration of the computing with their clinical workflow in terms of navigation, usability, and workload. These results, in terms of the specific measures used, are unique compared to previous studies. Previous work has demonstrated that an exam room wherein the provider can readily share the computer screen can facilitate direct interaction and communication with the patient; however, those studies were mainly qualitative (e.g., Chen, Ngo, Harrison, & Duong, 2011; Frankel et al., 2005; Ventres, Kooienga, Marlin, Vuckovic, & Stewart, 2005).

From an objective standpoint, a more purposeful exam room computing set-up (layout B) may not increase physician performance, but reducing physicians’ perceived workload and increasing situation awareness through a more thoughtful computing arrangement can lead to an increase in patient centeredness and perhaps even better patient care. This can mainly be achieved through screen sharing: inviting the patient in on care decisions as they relate to the information on the EHR screen gives the patient a feeling of greater involvement.

This study has some limitations that should be noted. Due to the challenges of recruiting physicians to participate in a laboratory simulation away from their clinics, convenience sampling was used, and the majority of participants were resident physicians, whose practices may not generalize to all primary care providers. Although some of the providers had previous experience using a wall-mounted armature system, which may have introduced some learning bias, there was considerable variety in previous experience with exam room computing set-ups across the providers. The patient scenarios also had limitations. The scenarios did not require the provider to conduct a full physical exam, which would be common in an actual patient visit; this was omitted because the focus of the study was the computing arrangement and patient centeredness, not the provider’s ability to conduct a physical examination. Additionally, certain nuances of the provider-patient interaction, such as mutual gaze of the provider and patient on the computer monitor, were not considered as part of patient centeredness, but should be incorporated in future studies. Another limitation was that a study team member played the role of the patient in each visit, which could have introduced bias during the study sessions; this was done because hiring an independent patient actor was cost-prohibitive. However, the study team member who played the patient was the senior member of the study team and, by following a pre-determined patient file and pre-planned responses, took great care to be consistent across layout types and providers and not to compel the provider to share the screen. Also, in both patient scenarios the patient was interested in viewing trends of their blood pressure or respiratory rate values over time. This was purposefully designed into the scenarios to encourage the provider to share the screen at least once with each layout. In reality, some patients may not be interested in viewing the screen at all, which potentially limits the generalizability of the current laboratory simulation.

Finally, it would be interesting to see how layouts A and B compare performance-wise over the course of an entire work day. Future research should examine provider-patient scenarios over a full work day in a real-world clinical environment, focusing on the effects of the different layouts on performance, patient centeredness, workload, workflow integration, and situation awareness across multiple patient interactions, to determine more realistic outcomes. Additionally, future studies could introduce a patient scenario in which providers must reference imaging data (X-rays, CT scans, etc.) to better understand the role of the computing device in a more complex patient visit. Based on the study findings, we argue that layout B would be preferred, given the lower perceived workload, greater perceived situation awareness, and greater workflow integration; this may leave providers feeling less fatigued toward the end of the day. The conclusion that layout B is preferred, however, is based solely on the study findings and does not take into account cost or other organizational factors.

5. CONCLUSION

Although the layouts did not differ significantly in objective performance measures (efficiency, errors, and proportion of time focused on the patient), results show that layout B was the preferred exam room computing layout. Additionally, providers experienced reduced workload, increased situation awareness, and better integration with clinical workflow using layout B compared to layout A. Layout B also encouraged more screen sharing, consistent with the evolving paradigm of the computer and EHR as a third party that serves as a mediator between provider and patient. This study partially supports our hypothesized expectations, but further research is needed on the effects of each layout across multiple provider-patient interactions over the course of an entire workday. As part of this funded work, we will conduct such a study with the same layouts in a live clinic setting, documenting real patients’ perspectives and preferences in addition to collecting provider data.

Supplementary Material

Appendix 1
Appendix 2

OCCUPATIONAL APPLICATIONS.

When comparing a typical exam room layout to the Department of Veterans Affairs (VA) new exam room design, with respect to the exam room computing, primary care providers experienced significantly lower workload and greater situation awareness when using the new exam room design. Further, providers rated the new exam room layout significantly higher in terms of integration with their clinical workflow and spent significantly more time in screen sharing activities with the patient. A more thoughtful design of the exam room layout, with respect to the placement and physical design of the computing set-up, may reduce provider cognitive effort and enhance aspects of patient centeredness by treating the computer and the electronic health record (EHR) it displays as an important mediator between provider and patient. This was achieved here using an all-in-one computer attached to a wall mount that moves the monitor along three axes, allowing screen positioning to be adjusted depending upon the scenario.

Acknowledgments

The views expressed in this article are those of the authors and do not reflect the official position of AHRQ, U.S. Department of Health and Human Services, or the Department of Veterans Affairs.

Funding: This research was supported under grant number 1R03HS024488-01A1 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services.

Footnotes

Conflict of Interest: The authors declare no conflict of interest.

References

  1. Asan O, Carayon P, Beasley JW, Montague E. Work system factors influencing physicians’ screen sharing behaviors in primary care encounters. International Journal of Medical Informatics. 2015;84(10):791–798. doi: 10.1016/j.ijmedinf.2015.05.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Asan O, Smith PD, Montague E. More screen time, less face time–implications for EHR design. Journal of Evaluation in Clinical Practice. 2014;20(6):896–901. doi: 10.1111/jep.12182. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Baker LH, Reifsteck SW, Mann WR. Connected: communication skills for nurses using the electronic health record. Nurs Econ. 2003;21:85–88. [PubMed] [Google Scholar]
  4. Chen Y, Ngo V, Harrison S, Duong V. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM; 2011. Unpacking exam-room computing: negotiating computer-use in patient-physician interactions; pp. 3343–3352. [Google Scholar]
  5. Endsley MR. Toward a Theory of Situation Awareness in Dynamic Systems. Human Factors. 1995;37:32–64. [Google Scholar]
  6. Flanagan M, Arbuckle N, Saleem JJ, Militello LG, Haggstrom DA, Doebbeling BN. Development of a workflow integration survey (WIS) for implementing computerized clinical decision support. AMIA Annual Symposium Proceedings. 2011:427–434. [PMC free article] [PubMed] [Google Scholar]
  7. Frankel RM. Computers in the examination room. JAMA Internal Medicine. 2016;176(1):128–129. doi: 10.1001/jamainternmed.2015.6559. [DOI] [PubMed] [Google Scholar]
  8. Frankel R, Altschuler A, George S, Kinsman J, Jimison H, Robertson NR, Hsu J. Effects of exam‐room computing on clinician–patient communication. Journal of General Internal Medicine. 2005;20(8):677–682. doi: 10.1111/j.1525-1497.2005.0163.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Frankel RM, Saleem JJ. “Attention on the flight deck”: What ambulatory care providers can learn from pilots about complex coordinated actions. Patient Education and Counseling. 2013;93(3):367–372. doi: 10.1016/j.pec.2013.08.011. [DOI] [PubMed] [Google Scholar]
  10. Hart S, Staveland L. Development of the NASA-TLX (Task Load Index): Results of empirical and theoretical research. North-Holland: Elsevier Science Publishers; 1988. [Google Scholar]
  11. Helfrich CD, Sylling PW, Gale RC, Mohr DC, Stockdale SE, Joos S, Nelson KM. The facilitators and barriers associated with implementation of a patient-centered medical home in VHA. Implementation Science. 2016;11:24. doi: 10.1186/s13012-016-0386-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Kumarapeli P, de Lusignan S. Using the computer in the clinical consultation; setting the stage, reviewing, recording, and taking actions: multi-channel video study. Journal of American Medical Informatics Association. 2013;20:67–75. doi: 10.1136/amiajnl-2012-001081. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. McGrath JM, Arar NH, Pugh JA. The influence of electronic medical record usage on nonverbal communication in the medical interview. Health Informatics Journal. 2007;13:105–118. doi: 10.1177/1460458207076466. [DOI] [PubMed] [Google Scholar]
  14. Montague E, Asan O. Dynamic modeling of patient and physician eye gaze to understand the effects of electronic health records on doctor-patient communication and attention. International Journal of Medical Informatics. 2014;83:225–234. doi: 10.1016/j.ijmedinf.2013.11.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Montague E, Xu J, Chen PY, Asan O, Barrett BP, Chewning B. Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis. Human Factors. 2011;53:502–516. doi: 10.1177/0018720811405986. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Patel MR, Vichich J, Lang I, Lin J, Zheng K. Developing an evidence base of best practices for integrating computerized systems into the exam room: a systematic review. Journal of American Medical Informatics Association. 2017;24(1):207–215. doi: 10.1093/jamia/ocw121. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Pearce C, Arnold M, Phillips C, Trumble S, Dwan K. The patient and the computer in the primary care consultation. Journal of American Medical Informatics Association. 2011;18:138–142. doi: 10.1136/jamia.2010.006486. [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Pearce C, Dwan K, Arnold M, Phillips C, Trumble S. Doctor, patient and computer–a framework for the new consultation. International Journal of Medical Informatics. 2009;78:32–38. doi: 10.1016/j.ijmedinf.2008.07.002. [DOI] [PubMed] [Google Scholar]
  19. Rouf E, Whittle J, Lu N, Schwartz MD. Computers in the exam room: differences in physician-patient interaction may be due to physician experience. Journal of General Internal Medicine. 2007;22:43–48. doi: 10.1007/s11606-007-0112-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Saleem JJ, Flanagan ME, Russ AL, McMullen CK, Elli L, Russell SA, Frankel RM. You and me and the computer makes three: variations in exam room use of the electronic health record. Journal of American Medical Informatics Association. 2014;21:147–151. doi: 10.1136/amiajnl-2013-002189. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Sanders MS, McCormick EJ. Human Factors in Engineering and Design. New York, NY: McGraw-Hill, Inc; 1993. [Google Scholar]
  22. Selcon SJ, Taylor RM. Evaluation of the situation awareness rating technique (SART) as a tool for aircrew systems design. Paper presented at the AGARD AMP symposium ‘Situational Awareness in Aerospace Operations’; Neuilly Sur Seine, France. 1990. [Google Scholar]
  23. Ventres W, Kooienga S, Marlin R, Vuckovic N, Stewart V. Clinician style and examination room computers: a video ethnography. Family Medicine. 2005;37(4):276–81. [PubMed] [Google Scholar]
  24. Ventres W, Kooienga S, Vuckovic N, Marlin R, Nygren P, Stewart V. Physicians, patients, and the electronic health record: an ethnographic analysis. The Annals of Family Medicine. 2006;6:124–131. doi: 10.1370/afm.425. [DOI] [PMC free article] [PubMed] [Google Scholar]
