Author manuscript; available in PMC: 2019 Oct 1.
Published in final edited form as: Int J Nurs Knowl. 2017 Sep 19;29(4):242–252. doi: 10.1111/2047-3095.12178

Acceptability of Clinical Decision Support Interface Prototypes for a Nursing Electronic Health Record to Facilitate Supportive Care Outcomes

Janet Stifter 1, Vanessa E C Sousa 2, Alessandro Febretti 3, Karen Dunn Lopez 4, Andrew Johnson 5, Yingwei Yao 6, Gail M Keenan 7, Diana J Wilkie 8
PMCID: PMC5858953  NIHMSID: NIHMS871789  PMID: 28926204

Abstract

Purpose: To determine acceptability, usefulness, and ease of use for four nursing clinical decision support interface prototypes.

Methods: In a simulated hospital environment, 60 registered nurses (48 female; mean age=33.7±10.8 years; mean years of experience=8.1±9.7) participated in a randomized study with four study groups. Measures included acceptability, usefulness, and ease of use scales.

Findings: Mean scores were high for acceptability, usefulness, and ease of use in all four groups. Inexperienced participants (<1 year) reported higher perceived ease of use (p=.05) and perceived usefulness (p=.01) than those with ≥1 year of experience.

Conclusions: Participants completed the protocol and reported that all four interfaces, including the control (HANDS), were acceptable, easy to use, and useful.

Implications for Nursing Practice: Further study is warranted before clinical implementation within the electronic health record.

Keywords: clinical decision support, practice-based evidence, electronic health record, end-of-life care, interface usability, simulation

Purpose

Protocols for end-of-life and supportive care education for nurses have been widely available for nearly 15 years (Malloy, Paice, Virani, Ferrell, & Bednash, 2008; Wilkie, Judge, Wells, & Berkley, 2001). Another method of education, practice-based evidence, is delivered to the nurse via clinical decision support (CDS) interventions or tools within the electronic health record (EHR); however, this method is rarely available for guiding interventions to improve supportive care outcomes. Practice-based evidence is defined as “the aggregated and systematically analyzed data derived from the contexts, experiences, and practices of healthcare providers working in real-world practice settings” (Leeman & Sandelowski, 2012, p. 171). In this era of clinical effectiveness research and widespread availability of EHRs, evidence generated from analyzing real-world EHR practice data has become recognized as an important source for improving supportive care patient outcomes (Casarett, Harrold, Oldanie, Prince-Paul, & Teno, 2012). CDS within EHRs has been made available and studied for physicians (Garg et al., 2005; Schiff & Rucker, 1998) and nurses (Byrne & Lang, 2013; Campion, Waitman, Lorenzi, May, & Gadd, 2011); however, analysis of how clinicians use these real-time data to inform practice is at a nascent stage. The purpose of our pilot study was to determine the acceptability, usefulness, and ease of use of four nursing CDS interface prototypes displaying practice-based evidence of outcomes for hospitalized end-of-life patients.

Contemporary healthcare delivery, with its fast pace, rapidly evolving technology, and expanding evidence base, has created an environment in which nurses want the information they need at their fingertips to make good choices that ensure patient safety and high-quality outcomes. Historically, nurses have sought information to support their practice in policy and procedure manuals, drug reference texts, nursing journals, or from each other. CDS containing practice-based evidence delivered through electronic technology is one way to provide key information in a readily accessible format that can help nurses drive care decisions. The use of computerized decision support to guide patient decision making related to chronic disease management in both adults and adolescents has been extensively reported (Simon, Gude, Holleman, Hoestra, & Peek, 2014; Stinson et al., 2010; Wilkie et al., 2013). A body of literature has also explored nurses’ use of decision support to guide and coach patient decision making related to end-of-life placement (Murray, Wilson, Kryworuchko, Stacey, & O’Connor, 2009) and diabetes care (Yu et al., 2014). Two literature reviews focused on the development, use, and acceptability of CDS to support evidence-based nursing practice (Anderson & Willson, 2008; Piscotty & Kalisch, 2014). Of interest to this team was research by Doran et al. (2007), who created a point-of-care patient system, the e-Volution in Outcomes Focused Knowledge Translation, that used handheld technology. Similar to our system, this solution collected patient data and then used those data to drive evidence-based decision making; it also provided resources such as clinical practice guidelines and drug references. The system we chose to demonstrate in our study integrates CDS directly within the nursing plan of care (POC) EHR software. This system, the Hands on Automated Nursing Data System (HANDS; see Intervention section), allowed nurses to document a patient POC, monitor patients’ responses to their interventions, access practice-based evidence to guide their next steps, and continuously follow trend data to ensure the best outcomes for their patients (Keenan et al., 2012).

In prior studies we found that nurses’ use of the HANDS EHR system, in which they documented nursing diagnoses, interventions, and outcomes, provided a rich data source for understanding end-of-life care. Important findings about pain (Al-masalha et al., 2013; Yao et al., 2013), death anxiety (Lodhi et al., 2014), and anticipatory grieving (Johnson et al., in press) outcomes emerged from analysis of practice-based data generated when nurses used this EHR system for up to two years in four hospitals (Keenan et al., 2012). A series of usability studies with a representative sample of nurses revealed alternate, usable ways of displaying the evidence in HANDS as part of CDS interfaces (Febretti et al., 2013). Although those nurses perceived the interfaces positively, they were not randomly assigned to an interface, nor did they use it in a simulated hospital environment to document their care decisions based on end-of-life case scenarios. In addition, in prior research involving HANDS, nurses were not asked to complete questionnaires reporting their perceptions of the usability of the interfaces. To address these gaps, the study aim was to determine the acceptability, usefulness, and ease of use of three experimental CDS interface prototypes and a control interface prototype (HANDS) as nurses changed the care plans for two simulated patient scenarios involving life-limiting illnesses.

Methods

Design/Setting

The design was a randomized four-group cohort study conducted in a simulation laboratory with virtual (visual, auditory) stimuli similar to hospital nursing stations. The Institutional Review Board at the University of Illinois at Chicago approved the study.

Sample

Purposive sampling was used to ensure a demographically diverse sample (i.e., ethnicity, age, gender, education, experience). Stratified by gender, race, experience (<1 year, ≥1 year), and education (<BSN, ≥BSN), we randomized a diverse sample of 60 nurses to one of four interface groups (three experimental CDS interfaces or the control [HANDS] CDS interface). The participants were 48 women and 12 men (age 21–71 years, mean=33.7±10.8 years; 25 White, 13 African American, 16 Asian, 6 other races). All participants were registered nurses (RNs; newly licensed to 44 years of experience, mean=8.1±9.7 years); 4 had an associate degree in nursing (ADN), and 56 had a bachelor of science in nursing (BSN) or a higher level of education. Participants worked in a variety of direct care (e.g., obstetrics, medical surgical, intensive care, emergency department) and supportive clinical care (e.g., educator, charge nurse) positions. We intentionally included nurses in non-specialty roles or settings since they intermittently provide supportive care.

Procedures

After obtaining signed informed consent, a trained facilitator used a standardized instruction manual and on-screen visual materials to orient the participant to the EHR system. After the orientation, the facilitator presented two simulated patient case scenarios, with shift hand-off reports and patient assessments, and then asked the participant to update the care plans using the EHR system while a software-use capture system (Morae, TechSmith, Okemos, MI) recorded and time-stamped the selected options. Each participant had up to three simulated shifts (today and the next two days) in which to implement the CDS options for each of the two patients. Finally, the participant completed questionnaires and received $100 for time and travel expenses.

Intervention

The intervention included orientation to the HANDS system and two simulated patient scenarios. The scenarios described women who were near the end of their lives (Figure 1).

Figure 1.

Simulated patient scenarios presented as the stimulus for participants to interact with the clinical decision support interfaces. Key: Number in left column represents the current value for the outcome; (number) represents the expected value for the outcome at discharge; NANDA-I = North American Nursing Diagnosis Association International.

Orientation to HANDS

Following a standardized script, subjects received an orientation to the HANDS software. HANDS is an electronic nursing plan of care documentation system that is compatible with any electronic health record. Documentation in HANDS by registered nurses (RNs) occurs upon admission, at shift change, at any point when there is a major change in patient status, and at discharge (Keenan et al., 2012). The HANDS software uses the standardized North American Nursing Diagnosis Association International (NANDA-I) (Herdman & Kamitsuru, 2014), Nursing Interventions Classification (NIC) (Bulechek, Butcher, Dochterman, & Wagner, 2012), and Nursing Outcomes Classification (NOC) (Moorhead, Johnson, Mass, & Swanson, 2012) terminologies. The validity and reliability of nurses’ use of HANDS have been established in earlier studies (Keenan et al., 2012; Yao et al., 2013) in which HANDS was implemented as the nursing POC system on nine units (medical-surgical, geriatric, and critical care) in four hospitals (community and university). Compliance with use of the system for 12 or 24 months by 787 unique RN users averaged 89% (Keenan et al., 2012).

The HANDS home page (Figure 2A) displays identification information and legends defining icons (i.e., “i”, red/yellow/green) along the right side of the screen. On the left side of the display is the section for POC development. Labels located after blue square buttons represent NANDA-I nursing diagnoses, drawn from over 200 internationally recognized nursing diagnoses describing a patient’s response to actual or potential health problems (e.g., for a stroke patient one NANDA-I diagnosis may be Impaired Verbal Communication [00051]). Clicking on the blue square button next to the label displays more information about that nursing diagnosis, including a definition and related factors. Labels located after the green circle buttons represent NOC outcome measures, which are nursing-sensitive patient outcomes specifically developed to evaluate the effects of interventions provided by nurses. Each NOC outcome appears on the POC and allows the nurse to rate the patient’s status on a scale from 1 to 5, with 1 being the lowest possible rating and 5 the best. Labels located after the red triangle buttons are the standardized NIC interventions, which are the actual nursing interventions performed for the patient and are entered each shift by the patient’s nurse.

Figure 2.

Screen shots showing the major features of the three clinical decision support interface prototypes (the control interface was similar to 2A but did not have the blinking red button alert). Prototype 1 (2B) uses a visual presentation of data that involves colors to depict trending of future NOC outcomes (red reflects a poor NOC outcome if current management continues, while green represents an improved outcome when elements of the management plan change), in addition to the narrative practice-based evidence statement. The graph in this prototype highlights actual NOC outcome ratings against the expected outcomes over time. Prototype 2 (2C) also allows for visualization of data, but in this case using numbers in a table in conjunction with the practice-based evidence statement. Similar to the first prototype, NOC outcome projections over time are included in the table, as well as actual versus expected outcome ratings. Prototype 3 (2D) relies solely on a narrative statement to present the practice-based evidence. Each interface prototype alerted the participant to the availability of CDS information through the appearance of a blinking red button alert. As shown in Figure 2, when the blinking red button alert was clicked, the CDS appeared in the randomly assigned format: Text only, Text and Table, Text and Graph, or the original HANDS CDS interface without the blinking red button alert. Analysis of the participants’ adoption of the CDS suggestions is in progress.

To the right of each NANDA-I on the POC are three icons nurses can use to translate data into POC modifications: +, X, and ↑. Selecting the + icon next to a NANDA-I displays a drop-down list of NOC outcomes that can be added to that particular problem on the POC. The NOCs with open checkboxes next to them can be added to the POC, and at the bottom of the list there is an option to type in an outcome that does not appear in the list. The X icon next to a NANDA-I allows the user to remove that nursing diagnosis and all NOC and NIC labels linked to it. Selecting the ↑ next to a NANDA-I moves that problem to the top of the POC; the NANDA-I at the top of the POC is considered the highest priority issue for the patient. The + and X icons adjacent to the NOC labels work similarly. Selecting the + next to a NOC displays a drop-down list of NIC interventions. As with the NANDA-I and NOC picklists, the NIC interventions with open checkboxes next to them can be added to the POC, and there is a space for entering “Other” NICs at the bottom of the list. Clicking the X next to a NOC removes both the NOC label and the NIC labels below it; to remove only a single NIC, the user selects the X next to that NIC label. NICs are removed one at a time.
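As a concrete illustration of the plan-of-care structure and the +, X, and ↑ operations described above, the following Python sketch models a POC as a list of NANDA-I diagnoses, each holding NOC outcomes (rated 1 to 5 against an expected rating) and linked NIC interventions. This is an illustrative data model only, not the HANDS implementation; the class and function names are ours, the placeholder NOC code is hypothetical, and the NANDA-I and NIC codes are taken from the text.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class NIC:
    """A standardized NIC intervention, e.g., Medication Management (2380)."""
    code: str
    label: str


@dataclass
class NOC:
    """A NOC outcome rated 1 (lowest) to 5 (best), with the target set on the POC."""
    code: str
    label: str
    current_rating: int
    expected_rating: int
    interventions: List[NIC] = field(default_factory=list)


@dataclass
class NANDA:
    """A NANDA-I nursing diagnosis with its linked NOC outcomes."""
    code: str
    label: str
    outcomes: List[NOC] = field(default_factory=list)


@dataclass
class PlanOfCare:
    diagnoses: List[NANDA] = field(default_factory=list)

    def add_outcome(self, nanda_code: str, outcome: NOC) -> None:
        """'+' next to a NANDA-I: attach a NOC outcome to that problem."""
        for dx in self.diagnoses:
            if dx.code == nanda_code:
                dx.outcomes.append(outcome)
                return

    def remove_diagnosis(self, nanda_code: str) -> None:
        """'X' next to a NANDA-I: remove the diagnosis and all linked NOCs and NICs."""
        self.diagnoses = [dx for dx in self.diagnoses if dx.code != nanda_code]

    def prioritize(self, nanda_code: str) -> None:
        """'↑' next to a NANDA-I: move that problem to the top of the POC."""
        self.diagnoses.sort(key=lambda dx: dx.code != nanda_code)


# Example drawn from the text: Acute Pain (00132) with one of the pain interventions
# named in Table 1; the NOC code "0000" is a placeholder, not a real NOC code.
poc = PlanOfCare([NANDA("00132", "Acute Pain")])
poc.add_outcome("00132", NOC("0000", "Pain Level", current_rating=2, expected_rating=4))
poc.diagnoses[0].outcomes[0].interventions.append(NIC("2380", "Medication Management"))
poc.prioritize("00132")
```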

CDS Interfaces within HANDS

The CDS information contained in the prototypes for this experiment (Table 1) was derived through statistical and data mining techniques applied to 1,425 episodes of care collected with the HANDS POC documentation system (Al-masalha et al., 2013). For example, Figure 2B depicts the NOC trajectory ratings when pain interventions are not aggressively pursued within the first 24 hours of admission. Figure 2B also provides the practice-based evidence nugget that the combination of medication management, pain management, and positioning is associated with pain relief. The nurse user is invited to add NIC: Positioning (0840) to a POC that already contains NIC: Medication Management (2380) and NIC: Pain Management (1400). Nurses are prompted to add critical practice-based evidence nuggets by a red flashing button alert that appears to the right of a NOC outcome on the care plan. The red flashing button alert is activated when a NOC’s current rating is below the expected rating set by the nurse user who introduced the problem on the POC (Febretti et al., 2013).
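A minimal sketch of the alerting behavior just described, assuming a simple rule representation: the red flashing button shows whenever a NOC outcome's current rating falls below the expected rating set on the POC, and the associated practice-based evidence message and suggested actions (here, the pain example from Table 1) are then displayed. The names, ratings, and rule structure are ours, not HANDS code.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class OutcomeStatus:
    """Current and expected ratings for one NOC outcome on the plan of care."""
    noc_label: str
    current_rating: int   # 1 (lowest) to 5 (best), documented by the nurse
    expected_rating: int  # target set when the problem was added to the POC


@dataclass
class CdsRule:
    """One practice-based evidence 'nugget' with its suggested POC actions."""
    message: str
    suggested_actions: List[str]


def alert_needed(status: OutcomeStatus) -> bool:
    """The red flashing button appears when the current NOC rating falls
    below the expected rating set on the plan of care."""
    return status.current_rating < status.expected_rating


# Pain example from Table 1 / Figure 2B; the ratings below are illustrative.
pain_rule = CdsRule(
    message=("A combination of medication management, positioning, and pain "
             "management has the most positive impact on pain level."),
    suggested_actions=["Add NIC: Positioning (0840)"],
)

status = OutcomeStatus("Pain Level", current_rating=2, expected_rating=4)
if alert_needed(status):
    print(pain_rule.message)
    for action in pain_rule.suggested_actions:
        print(" -", action)
```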

Table 1.

Practice-based evidence messages in the clinical decision support (CDS) embedded in the HANDS prototypes.

A combination of medication management, positioning, and pain management has the most positive impact on pain level.
  • Add NIC: Positioning (0840)

It is more difficult to control pain when the EOL patient has both pain and impaired gas exchange problems.
  • Prioritize NANDA: Acute Pain (00132)

  • Remove NANDA: Impaired Gas Exchange (00030)

All pain can be relieved; however, achieving pain control within the first 24 hours is critical to achieving pain control throughout the hospitalization. Simple interventions control pain for 90% of EOL patients. Palliative care and aggressive interventions are needed for the remaining 10%.
  • Add NIC: Patient Controlled Analgesia (2400)

  • Add NIC: Massage (1480)

  • Add NIC: Relaxation Therapy (6040)

  • Add NIC: Guided Imagery (6000)

Consequences of immobility include pneumonia, pressure ulcers, contractures, constipation, and venous thrombosis.
  • Add NOC: Immobility Consequences (0205)

The physical and emotional demands of caregiving can overwhelm the family.
  • Prioritize NANDA: Death Anxiety (00147)

  • Add new mini care plan: Family Coping (00075)

It is more difficult to control pain when the EOL patient has both pain and impaired gas exchange.
  • Prioritize NANDA: Acute Pain (00132)

  • Add NIC: Consultation Palliative Care (7910)

Palliative care consultants help manage pain, symptoms, comorbidities, and patient/family communication.
  • Adding palliative care consultation is highly recommended. If you choose not to add it, please specify a reason:

    • Patient/Family refusal

    • Doctor refusal

    • Other

The patient’s profile suggests that monitoring Impaired Gas Exchange is no longer indicated.
  • Remove NANDA: Impaired Gas Exchange (00030)

  • Add NIC: Respiratory Monitoring to Acute Pain (3350)

Copyright © 2014 HANDS Research Team, reprinted with permission.

For the purposes of this study, the practice-based evidence was embedded into three different CDS interface prototypes. We incorporated our practice-based evidence into graphical and numerical (table) formats in addition to the narrative text because over 80% of the general population are visual learners and a multimodal approach is considered most effective for learning (Herrman, 2008). In a study of 142 acute care nurses, McCrow, Yevchak, and Lewis (2014) examined how nurses learn and concluded that nurse educators need to understand nurses’ learning styles in order to shape information dissemination strategies that best meet RN learning requirements and thus enhance knowledge uptake. We developed the features of the four interface prototypes based on extensive user testing and feedback in a series of three studies with a different sample of 45 nurses (Febretti et al., 2013). An approximation of the basic HANDS interface (Figure 3) served as the control interface (Keenan et al., 2012). The experimental interfaces (Text only, Text and Table, Text and Graph) included the same information, presented as evidence-based CDS in the respective text, table, or graph formats (Figure 2).
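To make the experimental manipulation concrete, the sketch below renders the same practice-based evidence message in the three display formats compared in the study: narrative text only, text plus a table of projected NOC ratings, and text plus a graph (approximated here as a simple ASCII bar chart). The rendering details and the projection values are ours and only loosely approximate the prototypes shown in Figure 2.

```python
from typing import Dict


def render_cds(message: str, projections: Dict[str, int], fmt: str) -> str:
    """Render one practice-based evidence message in the assigned display format.

    projections maps a shift label to a projected NOC rating (1-5) if the
    current plan of care is left unchanged; the values used below are
    illustrative, not data from the study.
    """
    if fmt == "text":
        return message
    if fmt == "table":
        rows = "\n".join(f"{shift:<8}{rating}" for shift, rating in projections.items())
        return f"{message}\n\nShift   Projected NOC rating\n{rows}"
    if fmt == "graph":
        bars = "\n".join(f"{shift:<8}{'#' * rating} ({rating})"
                         for shift, rating in projections.items())
        return f"{message}\n\n{bars}"
    raise ValueError(f"Unknown format: {fmt}")


message = ("A combination of medication management, positioning, and pain "
           "management has the most positive impact on pain level.")
projections = {"Today": 2, "Day 2": 2, "Day 3": 3}
for fmt in ("text", "table", "graph"):
    print(render_cds(message, projections, fmt), end="\n\n")
```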

Figure 3.

Four screen shots (A–D) show the HANDS control group interface that was used to orient the participant to the major features of the care planning system. Emphasized features were highlighted with yellow and red visual cues to draw the participant’s attention to the part of the screen being presented. Other features were blurred to maintain the participant’s focus on the orientation material being presented for each screen. A total of 35 similar screens were presented during the standardized orientation. Copyright © 2017 HANDS Research Team, reprinted with permission.

Main Instruments

Study instruments measured demographics, computer software acceptability, ease of use, and usefulness. Additional measures were included but not used for this specific study.

The adapted Computer Acceptability Scale (Wilkie et al., 2001) is a 10-item tool that measures the self-reported difficulty of using computer devices and software interfaces under study conditions. Response options ranged from 0 (too difficult/not acceptable) to 2 (not difficult/acceptable), with total scores ranging from 0 to 20. The validity and reliability of the instrument have been demonstrated in prior studies of users who were patients or clinicians (Jha et al., 2010; Wilkie et al., 2001; Wilkie et al., 2003).

The modified Perceived Usefulness and Perceived Ease of Use (Davis, 1989; Holden, Brown, Scanlon, & Karsh, 2012) scales measure users’ perceptions about software usefulness (3 items) and ease of use (4 items). Response options range from not at all (0) to extremely (6), with a don’t know option (coded 7) that was not included in the score. Scores for the Perceived Usefulness and Perceived Ease of Use scales range from a low of 0 (poor) to highs of 18 and 24, respectively (high perceived usefulness or high perceived ease of use). Reliability and validity of the scales have been documented (Adams, Nelson, & Todd, 1992; Davis, 1989; Doll, Hendrickson, & Deng, 1998; Hendrickson, Massey, & Cronan, 1993).
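For clarity, the scoring just described can be sketched as follows: the 10 acceptability items (0-2 each) sum to a 0-20 total, the 4 ease-of-use items and 3 usefulness items (0-6 each) sum to 0-24 and 0-18 totals, and don't know responses (coded 7) are excluded from scoring. The function names and the item values in the example are hypothetical.

```python
from typing import List, Optional


def acceptability_score(items: List[int]) -> int:
    """Sum of the 10 acceptability items, each scored 0 (too difficult /
    not acceptable) to 2 (not difficult / acceptable); total range 0-20."""
    assert len(items) == 10 and all(0 <= x <= 2 for x in items)
    return sum(items)


def tam_score(items: List[int]) -> Optional[int]:
    """Sum of the Perceived Usefulness (3 items) or Perceived Ease of Use
    (4 items) scale, each item rated 0 (not at all) to 6 (extremely);
    'don't know' responses (coded 7) are dropped from the score."""
    valid = [x for x in items if x != 7]
    assert all(0 <= x <= 6 for x in valid)
    return sum(valid) if valid else None


# Hypothetical responses for one participant (not study data).
print(acceptability_score([2, 2, 2, 2, 2, 1, 2, 2, 1, 2]))  # 18 of a possible 20
print(tam_score([5, 6, 4, 5]))                              # ease of use: 20 of 24
print(tam_score([4, 7, 5]))                                 # usefulness with one 'don't know': 9
```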

We calculated descriptive statistics (e.g., mean, variation, frequency, percentage) for the sample and subgroups. We used Pearson’s correlation to examine the association between study measures and accepted p<.05 as statistically significant. For group comparisons, we used analysis of variance (ANOVA).
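The analysis plan above maps onto standard statistical tooling; the sketch below computes descriptive statistics, a Pearson correlation between two scale scores, and a one-way ANOVA across the four interface groups using Python. The data frame, column names, and generated scores are illustrative assumptions, not study data.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical scores for illustration only (the study had N = 60 across 4 groups);
# the column names are ours, not from the study database.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": np.repeat(["HANDS", "Text", "Table", "Graph"], 15),
    "acceptability": rng.integers(15, 21, 60),  # 0-20 scale
    "ease_of_use": rng.integers(10, 25, 60),    # 0-24 scale
    "usefulness": rng.integers(6, 19, 60),      # 0-18 scale
})

# Descriptive statistics overall and by interface group.
print(df[["acceptability", "ease_of_use", "usefulness"]].agg(["mean", "std"]))
print(df.groupby("group")["acceptability"].agg(["mean", "std"]))

# Pearson correlation between perceived ease of use and perceived usefulness.
r, p = stats.pearsonr(df["ease_of_use"], df["usefulness"])
print(f"r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA comparing acceptability across the four interface groups.
groups = [g["acceptability"].to_numpy() for _, g in df.groupby("group")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")
```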

Findings

All of the RN participants reported that the interfaces, including the HANDS control interface, were completely or somewhat acceptable on all but two items (Table 2). For the item about information sharing with the next nurse, two participants in the Text only group indicated that the software allowed sharing of very little information with the next nurse. One participant in the Text and Graph group reported that the software should not be available to all nurses. The overall mean acceptability score was 18.5±1.4 and did not vary by interface group, gender, or race. There was a trend for participants with <1 year of experience to report higher software acceptability than those with ≥1 year of experience (18.9±0.7 and 18.4±1.5, respectively, p=.07). With the average score so close to the maximum score (20) and some items exhibiting zero variability in this sample, the Cronbach’s alpha of .48 observed in this study is not meaningful.

Table 2.

Frequency of computer acceptability scale responses: Overall and by interface groups (N=60).

Question | Response | Overall | HANDS (Control Group) | Text | Table | Graph
Using HANDS system too hard | Not hard at all | 52 | 16 | 13 | 11 | 12
  | Somewhat hard | 8 | 0 | 2 | 3 | 3
  | Too hard | 0 | 0 | 0 | 0 | 0
Lighting adequate to use the HANDS program | Totally adequate | 59 | 16 | 15 | 14 | 14
  | Moderately adequate | 1 | 0 | 0 | 0 | 1
  | Not adequate | 0 | 0 | 0 | 0 | 0
Glare on the computer screen | No glare | 55 | 15 | 14 | 13 | 13
  | Some glare | 5 | 1 | 1 | 1 | 2
  | Substantial glare | 0 | 0 | 0 | 0 | 0
Instructions easy to understand | Easy to understand | 53 | 15 | 13 | 11 | 14
  | Somewhat hard to understand | 7 | 1 | 2 | 3 | 1
  | Too hard to understand | 0 | 0 | 0 | 0 | 0
Mouse easy to use | Easy to use | 60 | 16 | 15 | 14 | 15
  | Somewhat hard to use | 0 | 0 | 0 | 0 | 0
  | Too hard to use | 0 | 0 | 0 | 0 | 0
Information on computer screen easy to see | Easy to see | 58 | 15 | 15 | 13 | 15
  | Somewhat hard to see | 2 | 1 | 0 | 1 | 0
  | Too hard to see | 0 | 0 | 0 | 0 | 0
Feel rushed to complete tasks in HANDS program | Not rushed at all | 57 | 15 | 15 | 13 | 14
  | Somewhat rushed | 3 | 1 | 0 | 1 | 1
  | Too hard to complete | 0 | 0 | 0 | 0 | 0
HANDS program allows sharing all of the information you want to share with nurses on the next shift | All important information | 23 | 6 | 7 | 3 | 7
  | Most of the important information | 35 | 10 | 6 | 11 | 8
  | Very little of the important information | 2 | 0 | 2 | 0 | 0
Like using HANDS program for care planning | Like | 44 | 13 | 10 | 9 | 12
  | Somewhat like | 16 | 3 | 5 | 5 | 3
  | Don’t like | 0 | 0 | 0 | 0 | 0
HANDS program should be available to all nurses who take care of patients | Make available to all nurses who take care of patients | 53 | 15 | 12 | 13 | 13
  | Make available to nurses who take care of patients with life-limiting illness | 6 | 1 | 3 | 1 | 1
  | Do not make available to nurses caring for hospitalized patients | 1 | 0 | 0 | 0 | 1
Total Score (Mean, SD) | | 18.5 (1.4) | 18.9 (1.0) | 18.5 (1.6) | 18.1 (1.4) | 18.5 (1.4)

HANDS = Hands on Automated Nursing Data System, SD = standard deviation; Adapted with permission of D.J. Wilkie, Copyright © 1998; current version Copyright © 2014 HANDS Research Team, reprinted with permission.

Similar to the acceptability scores, the participants reported high perceived ease of use scores (average of 17.1±3.2 out of a maximum total of 24) and relatively high perceived usefulness scores (average of 13.5±3.3 out of a maximum of 18) (Table 3). Neither the perceived ease of use nor perceived usefulness scores varied significantly by interface group (Table 3), gender, or race. Inexperienced participants (<1 year) had higher perceived ease of use (p=.05) and perceived usefulness (p=.01) than those with 1 year or more experience. The Cronbach’s alpha was .66 for the perceived ease of use scale and .88 for perceived usefulness scale in this sample, indicating adequate reliability. There was a moderately strong correlation between the perceived ease of use and perceived usefulness (r=.54, p<.001). They both showed moderately strong correlation with the acceptability scale (r=.66, p<.001 for perceived ease of use scale and r=.40, p=.002 for perceived usefulness scale).

Table 3.

Acceptability, perceived ease of use, perceived usefulness, and ease of use¹ and usefulness² items: Means and standard deviations, by interface groups (N=60)

Scale | Overall | HANDS (Control Group) | Text | Table | Graph
Acceptability | 18.5 (1.4) | 18.9 (1.0) | 18.5 (1.6) | 18.1 (1.4) | 18.5 (1.4)
Perceived Ease of Use | 17.1 (3.2) | 18.4 (2.1) | 17.0 (2.4) | 17.1 (3.6) | 15.9 (4.2)
Perceived Usefulness | 13.5 (3.3) | 12.6 (3.2) | 12.6 (3.0) | 15.2 (2.6) | 13.5 (3.8)
Item | | | | |
Clear | 4.7 (1.0) | 5.1 (0.7) | 4.7 (0.9) | 4.9 (1.2) | 4.3 (1.2)
Easy to use | 5.0 (1.0) | 5.3 (0.7) | 4.9 (0.8) | 5.1 (1.0) | 4.7 (1.4)
Mental effort | 3.4 (1.2) | 3.8 (1.1) | 3.9 (1.1) | 2.9 (1.2) | 3.1 (1.1)
Easy for your task | 4.0 (1.3) | 4.3 (1.3) | 3.6 (1.2) | 4.2 (1.3) | 3.9 (1.6)
Improve patient care | 4.4 (1.2) | 4.3 (1.3) | 4.1 (1.1) | 5.0 (1.0) | 4.3 (1.4)
Facilitate decision making | 4.7 (1.2) | 4.1 (1.3) | 4.5 (1.0) | 5.3 (0.9) | 4.9 (1.2)
Make job easier | 4.4 (1.2) | 4.3 (1.0) | 4.0 (1.3) | 4.9 (1.0) | 4.3 (1.4)

HANDS = Hands on Automated Nursing Data System

¹ First 4 items; ² last 3 items. At the p<.05 level, none of the scores differ significantly by prototype.

Conclusion

The study protocol was successfully completed by all participants, who reported that the EHR CDS software interfaces, including the HANDS control interface prototype, were highly acceptable, useful, and easy to use. The two end-of-life scenarios and software prototypes were sufficiently realistic to allow the participants in each of the three experimental CDS interface groups and the control CDS interface group to complete study tasks regarding decisions for altering the care plan as they deemed appropriate to attend to the dying patients’ nursing care needs. These findings are critical because end-of-life (EOL) care is being provided outside of specialty care units, often by nurses not familiar with care management of this population. This result suggests that, regardless of their professional experiences with end-of-life patients, these nurses were able to successfully navigate the CDS prototype interfaces to locate practice-based evidence that could support their patient care.

In addition, in our realistic simulation environment, it was feasible for nurses to complete the protocol, and the RNs reported that all of the interfaces were acceptable regardless of gender or race. This finding was similar to the work of others (Brown et al., 2011; Gaissmaier et al., 2012; Hawley et al., 2008) and supports the importance of assessing the usability of CDS interfaces not only in terms of perceived acceptability, usefulness, and ease of use but also by gauging accuracy of understanding. A cognitive interview or a knowledge assessment questionnaire is important to more thoroughly understand the knowledge gained from the presentation of the practice-based evidence in the CDS and its ultimate impact on users’ medical decision making. This issue is important to patient safety because misunderstood CDS could have unintended consequences.

Our study results also suggest that RNs with <1 year of experience may find all of the interfaces more acceptable than those with ≥1 year of experience. This result was comparable to the findings of Lee, Mills, Bausell, and Lu (2008), who learned from nurses using a computerized nursing care plan system that the most consistent attribute influencing attitude toward the system was age: younger nurses were more comfortable with technology and thus more apt to use it. Dowding et al. (2009) similarly learned through observations of nurse-patient interactions and nurse interviews that less experienced nurses were more likely to use CDS systems to assist with care decisions, especially in situations in which they had limited practical experience, as compared with their more seasoned counterparts.

Software acceptability and usability are important for successful adoption, especially in busy clinical practice settings such as hospitals. User studies are common for commercial products used at home or in business, but similar studies of the acceptability and usability of EHR systems are less common, which may account for the dissatisfaction that has been noted when EHRs are implemented in healthcare settings (Corrao, Robinson, Swiernik, & Naeim, 2010; Sittig, Krall, Kaalaas-Sittig, & Ash, 2005). Consistent with our ultimate goal of successful implementation of CDS within a well-tested and highly successful nursing EHR system, our CDS features were acceptable to a diverse sample of RNs. This finding supports their use in a larger study comparing the effects of the interfaces in facilitating adoption of care plan changes suggested by the CDS for patients with life-limiting illnesses. Other researchers, too, have reported that patients and providers found technology acceptable and usable, including patient-reported outcome tools for adults receiving cancer care (Wilkie et al., 2003), sickle cell care (Wilkie et al., 2010), or hospice care (Wilkie et al., 2009), for children receiving palliative care (Wolfe et al., 2014), and decision aids for patients or surrogates (Einterz, Gilliam, Lin, McBride, & Hanson, 2014). Well-tested technology tools that are acceptable and easy for clinicians to use offer a potential approach for improving care for patients needing palliative and supportive care, especially if the CDS tools result in care plan changes that are associated with improved patient outcomes.

The end-of-life patient scenarios created for our study were sufficiently relevant to allow the RNs to complete the study protocol. Although none of the RNs were working in supportive or palliative care settings, they all had enough experience with dying patients to make decisions about care plan changes. Our approach of using patient scenarios in a simulated environment to explore clinician decision making could be applied in studies other than those focused on informatics and interface design. For example, outcomes of a palliative care education module for oncology providers, focused on improving knowledge, attitudes, and behaviors in the care of patients with cancer, could be tested and measured in a simulation environment using EHR documentation of behavioral changes in nurses’ care planning. This approach to documenting practice intention after real-time palliative care education holds much promise based on the high acceptability and usability ratings of the CDS interface prototypes in our study.

Although our findings are encouraging and provide direct support for a larger study of CDS adoption, several limitations temper them. The sample did not adequately represent RNs with ADNs, although such nurses continue to make up a large proportion of the U.S. nursing workforce (Committee on the Robert Wood Johnson Foundation Initiative on the Future of Nursing, 2010). Improved recruitment strategies are needed to increase future study participation by nurses with an ADN. In addition, the participants had unlimited and uninterrupted time to interact with the CDS interface, which is unrealistic in clinical practice. In future studies, researchers should include time constraints and process interruptions (e.g., call lights, phone calls) to be more representative of the real-world practice environment. We acknowledge that these environmental factors could affect acceptability of the interfaces and that these real-world issues should be addressed before a CDS system is deployed in practice.

Implications for nursing knowledge

The study protocol was feasible, and the CDS prototype interfaces were acceptable and perceived as useful by male and female RNs from different racial groups, experience levels, and work environments as they documented care plan changes for end-of-life patient scenarios; nevertheless, the findings require replication in a larger, representative sample before any of the experimental prototypes are implemented in clinical practice. Use of a simulation environment that replicates the contemporary practice environment offers the opportunity to address software usability and acceptability by nurses affected by the constraints of time, distractions, interruptions, increased patient complexity, and an ever-growing workload. The use of case scenarios and EHR CDS prototype interfaces developed from practice-based evidence holds promise for future research in supportive, palliative, and hospice care to measure practice decisions after educational interventions designed to improve care for patients with cancer and other illnesses.

Knowledge Translation

CDS for nurses derived from real-world practice data and available in the EHR is currently an underdeveloped resource for nursing practice. Given the changing pace and demands of contemporary healthcare, bedside nurses need immediately available evidence-based information to guide safe, high-quality care for all patients, including those at the end of life. Our study contributes knowledge toward developing this resource by introducing four nursing CDS interface prototypes located within a unique nursing electronic POC documentation system. We derived these CDS prototypes for hospitalized end-of-life patients from real-world outcomes documented by practicing nurses and crafted them to efficiently provide evidence-based data in varied display formats. The findings underscore the importance of designing software with acceptability and usability testing that accounts for the different learning styles of a diverse nursing workforce, and they highlight how an EHR system with immediately accessible evidence-based CDS can support care practices that contribute to a higher level of quality and safety for vulnerable patients.

Acknowledgments

This research was made possible by Grant Number 1R01 NR012949 from the National Institutes of Health, National Institute for Nursing Research. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the National Institute for Nursing Research. The final peer-reviewed manuscript is subject to the National Institutes of Health Public Access Policy. The authors thank Veronica Angulo for subject recruitment and clerical assistance and Dr. Marie Suarez, David Shuey, Kevin McDaniel, Brenda Burke, and GW Douglas for assistance with data collection and processing.

Footnotes

Disclosure

The HANDS software that was used in this study is now owned and distributed by HealthTeam IQ, LLC. Dr. Gail Keenan is currently the President and CEO of this company and has a current conflict of interest statement of explanation and management plan in place with the University of Florida.

Contributor Information

Janet Stifter, College of Nursing, University of Illinois at Chicago, Chicago, IL. Contributions: Lead Author, Background and Discussion Sections, Data Collection.

Vanessa E. C. Sousa, College of Nursing, University of Illinois at Chicago, Chicago, IL. Contributions: Data Collection, Manuscript Reviewer.

Alessandro Febretti, Department of Computer Science, College of Engineering, University of Illinois at Chicago, Chicago, IL. Contributions: Development of HANDS Prototypes, Data Collection, Manuscript Reviewer.

Karen Dunn Lopez, College of Nursing, University of Illinois at Chicago, Chicago, IL. Contributions: Study Designer, Data Collection, Manuscript Review and Editing.

Andrew Johnson, Department of Computer Science, College of Engineering, University of Illinois at Chicago, Chicago, IL. Contributions: Study Designer, Supervised development of prototypes; Developed Simulation setting, Manuscript Review and Editing.

Yingwei Yao, College of Nursing, University of Illinois at Chicago, Chicago, IL. College of Nursing, University of Florida, Gainesville, FL. Contributions: Randomization, Data Analysis, Data Presentation in Manuscript, Manuscript Review and Editing.

Gail M. Keenan, College of Nursing, University of Illinois at Chicago, Chicago, IL. College of Nursing, University of Florida, Gainesville, FL. Contributions: Study Designer, Content Expert in Standardized Nursing Languages, Data Collection, Manuscript Review and Editing.

Diana J. Wilkie, College of Nursing, University of Illinois at Chicago, Chicago, IL. College of Nursing, University of Florida, Gainesville, FL. Contributions: Lead Author, Methods, Results sections, Study Designer.

References

  1. Adams DA, Nelson RR, Todd PA. Perceived usefulness, ease of use, and usage of information technology: A replication. MIS Quarterly. 1992;16:227–247.
  2. Al-masalha F, Xu D, Keenan GM, Khokhar A, Yao Y, Chen YC, … Wilkie DJ. Data mining nursing care plans of end-of-life patients: A study to improve healthcare decision making. International Journal of Nursing Knowledge. 2013;24(1):15–24. doi: 10.1111/j.2047-3095.2012.01217.x.
  3. Anderson JA, Willson P. Clinical decision support systems in nursing. Computers, Informatics, Nursing. 2008;26:151–158. doi: 10.1097/01.NCN.0000304783.72811.8e.
  4. Brown SM, Cluver JO, Osann KE, MacDonald DJ, Sand S, Thornton AA, … Weitzel JN. Health literacy, numeracy, and interpretation of graphical breast cancer risk estimates. Patient Education and Counseling. 2011;83:92–98. doi: 10.1016/j.pec.2010.04.027.
  5. Bulechek G, Butcher H, Dochterman J, Wagner C, editors. Nursing interventions classification (NIC) (6th ed.). Missouri: Mosby; 2012.
  6. Byrne MD, Lang N. Examination of nursing data elements from evidence-based recommendations for clinical decision support. Computers, Informatics, Nursing. 2013;31(12):605–614. doi: 10.1097/CIN.0000000000000013.
  7. Campion TR Jr, Waitman LR, Lorenzi NM, May AK, Gadd CS. Barriers and facilitators to the use of computer-based intensive insulin therapy. International Journal of Medical Informatics. 2011;80(12):863–871. doi: 10.1016/j.ijmedinf.2011.10.003.
  8. Casarett DJ, Harrold J, Oldanie B, Prince-Paul M, Teno J. Advancing the science of hospice care: Coalition of Hospices Organized to Investigate Comparative Effectiveness. Current Opinion in Supportive and Palliative Care. 2012;6(4):459–464. doi: 10.1097/SPC.0b013e32835a66b7.
  9. Committee on the Robert Wood Johnson Foundation Initiative on the Future of Nursing, at the Institute of Medicine. The future of nursing: Leading change, advancing health. Washington, DC: National Academies Press; 2010.
  10. Corrao NJ, Robinson AG, Swiernik MA, Naeim A. Importance of testing for usability when selecting and implementing an electronic health or medical record system. Journal of Oncology Practice. 2010;6(3):120–124. doi: 10.1200/JOP.200017.
  11. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly. 1989;13:319–340.
  12. Doll WJ, Hendrickson A, Deng X. Using Davis’s perceived usefulness and ease of use instruments for decision making: A confirmatory and multigroup invariance analysis. Decision Sciences. 1998;29:839–869.
  13. Doran DM, Mylopoulos J, Kushniruk A, Nagle L, Laurie-Shaw B, Sidani S, … McArthur G. Evidence in the palm of your hand: Development of an outcomes-focused knowledge translation intervention. Worldviews on Evidence-Based Nursing. 2007;Second Quarter:69–77. doi: 10.1111/j.1741-6787.2007.00084.x.
  14. Dowding D, Mitchell N, Randell R, Foster R, Lattimer V, Thompson C. Nurses’ use of computerized clinical decision support systems: A case site analysis. Journal of Clinical Nursing. 2009;18:1159–1167. doi: 10.1111/j.1365-2702.2008.02607.x.
  15. Einterz SF, Gilliam R, Lin FC, McBride JM, Hanson LC. Development and testing of a decision aid on goals of care for advanced dementia. Journal of the American Medical Directors Association. 2014;15(4):251–255. doi: 10.1016/j.jamda.2013.11.020.
  16. Febretti A, Dunn Lopez K, Stifter J, Johnson AE, Keenan GM, Wilkie DJ. A component-based evaluation protocol for clinical decision support interfaces. In: Marcus A, editor. Design, User Experience, and Usability: Design Philosophy, Methods, and Tools. Vol. 8012. Berlin: Springer; 2013. pp. 232–241.
  17. Gaissmaier W, Wegwarth O, Skopec D, Muller SA, Broschinski S, Politi MC. Numbers can be worth a thousand pictures: Individual differences in understanding graphical and numerical representations of health-related information. Health Psychology. 2012;31:286–296. doi: 10.1037/a0024850.
  18. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, … Haynes RB. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: A systematic review. Journal of the American Medical Association. 2005;293(10):1223–1238. doi: 10.1001/jama.293.10.1223.
  19. Hawley ST, Zikmund-Fisher B, Ubel P, Jancovic A, Lucas T, Fagerlin A. The impact of graphical presentation on health-related knowledge and treatment choices. Patient Education and Counseling. 2008;73:448–455. doi: 10.1016/j.pec.2008.07.023.
  20. Hendrickson AR, Massey PD, Cronan TP. On the test-retest reliability of perceived usefulness and perceived ease of use scales. MIS Quarterly. 1993;17:227–230.
  21. Herdman TH, Kamitsuru S, editors. NANDA International nursing diagnoses: Definitions and classification, 2015–2017. Oxford: Wiley-Blackwell; 2014.
  22. Herrman JW. Creative teaching strategies for the nurse educator. Philadelphia: F. A. Davis; 2008.
  23. Holden RJ, Brown RL, Scanlon MC, Karsh BT. Modeling nurses’ acceptance of bar coded medication administration technology at a pediatric hospital. Journal of the American Medical Informatics Association. 2012;19(6):1050–1058. doi: 10.1136/amiajnl-2011-000754.
  24. Jha A, Suarez ML, Ferrans CE, Molokie R, Kim YO, Wilkie DJ. Cognitive testing of PAINReportIt in adult African Americans with sickle cell disease. Computers, Informatics, Nursing. 2010;28(3):141–150. doi: 10.1097/NCN.0b013e3181d7820b.
  25. Johnson J, Lodhi MK, Cheema U, Stifter J, Dunn-Lopez K, Yao Y, … Wilkie DJ. Outcomes for end-of-life patients with anticipatory grieving: Insights from practice with standardized nursing terminologies within an interoperable Internet-based electronic health record. Journal of Hospice and Palliative Nursing. In press. doi: 10.1097/NJH.0000000000000333.
  26. Keenan GM, Yakel E, Yao Y, Xu D, Szalacha L, Tschannen D, … Wilkie DJ. Maintaining a consistent big picture: Meaningful use of a web-based POC EHR system. International Journal of Nursing Knowledge. 2012;23(3):119–133. doi: 10.1111/j.2047-3095.2012.01215.x.
  27. Lee T, Mills ME, Bausell B, Lu M. Two-stage evaluation of the impact of a nursing information system in Taiwan. International Journal of Medical Informatics. 2008;77:698–707. doi: 10.1016/j.ijmedinf.2008.03.004.
  28. Leeman J, Sandelowski M. Practice-based evidence and qualitative inquiry. Journal of Nursing Scholarship. 2012;44:171–179. doi: 10.1111/j.1547-5069.2012.01449.x.
  29. Lodhi MK, Cheema UI, Stifter J, Wilkie DJ, Keenan GM, Yao Y, … Khokhar AA. Death anxiety in hospitalized end-of-life patients as captured from a structured electronic health record: Differences by patient and nurse characteristics. Research in Gerontological Nursing. 2014;7(5):224–234. doi: 10.3928/19404921-20140818-01.
  30. Malloy P, Paice J, Virani R, Ferrell BR, Bednash GP. End-of-life nursing education consortium: 5 years of educating graduate nursing faculty in excellent palliative care. Journal of Professional Nursing. 2008;24(6):352–357. doi: 10.1016/j.profnurs.2008.06.001.
  31. McCrow J, Yevchak A, Lewis P. A prospective cohort study examining the preferred learning styles of acute care registered nurses. Nurse Education in Practice. 2014;14:170–175. doi: 10.1016/j.nepr.2013.08.019.
  32. Moorhead D, Johnson M, Mass M, Swanson F, editors. Nursing outcomes classification (NOC) (5th ed.). Missouri: Mosby; 2012.
  33. Murray MA, Wilson K, Kryworuchko J, Stacey D, O’Connor A. Nurses’ perceptions of factors influencing patient decision support for place of care at the end of life. American Journal of Hospice and Palliative Care Medicine. 2009;26:254–263. doi: 10.1177/1049909108331316.
  34. Piscotty R, Kalisch B. Nurses’ use of clinical decision support. Computers, Informatics, Nursing. 2014;32:562–568. doi: 10.1097/CIN.0000000000000110.
  35. Schiff GD, Rucker TD. Computerized prescribing: Building the electronic infrastructure for better medication usage. Journal of the American Medical Association. 1998;279(13):1024–1029. doi: 10.1001/jama.279.13.1024.
  36. Simon A, Gude WT, Holleman F, Hoestra J, Peek N. Diabetes patients’ experiences with the implementation of insulin therapy and their perceptions of computer-assisted self-management systems for insulin therapy. Journal of Medical Internet Research. 2014;16:e235. doi: 10.2196/jmir.3198.
  37. Sittig DF, Krall M, Kaalaas-Sittig J, Ash JS. Emotional aspects of computer-based provider order entry: A qualitative study. Journal of the American Medical Informatics Association. 2005;12(5):561–567. doi: 10.1197/jamia.M1711.
  38. Stinson J, McGrath P, Hodnett E, Feldman B, Duffy C, Huber A, … White M. Usability testing of an online self-management program for adolescents with juvenile idiopathic arthritis. Journal of Medical Internet Research. 2010;12:e30. doi: 10.2196/jmir.1349.
  39. Wilkie DJ, Gallo AM, Yao Y, Molokie RE, Stahl C, Hershberger PE, … Thompson AA. Reproductive health choices for young adults with sickle cell disease or trait: Randomized controlled trial immediate posttest effects. Nursing Research. 2013;62:352–361. doi: 10.1097/NNR.0b013e3182a0316b.
  40. Wilkie DJ, Huang HY, Berry DL, Schwartz A, Lin YC, Ko NY, … Fitzgibbon D. Cancer symptom control: Feasibility of a tailored, interactive computerized program for patients. Family & Community Health. 2001;24(3):48–62.
  41. Wilkie DJ, Judge MK, Berry DL, Dell J, Zong S, Gilespie R. Usability of a computerized PAINReportIt in the general public with pain and people with cancer pain. Journal of Pain and Symptom Management. 2003;25(3):213–224. doi: 10.1016/s0885-3924(02)00638-3.
  42. Wilkie DJ, Judge MKM, Wells MJ, Berkley IM. Excellence in teaching end-of-life care: A new multimedia toolkit for nurse educators. Nursing and Health Care Perspectives. 2001;22(5):226–230.
  43. Wilkie DJ, Kim YO, Suarez ML, Dauw CM, Stapleton SJ, Gorman G, … Zhao Z. Extending computer technology to hospice research: Interactive pentablet measurement of symptoms by hospice cancer patients in their homes. Journal of Palliative Medicine. 2009;12(7):599–602. doi: 10.1089/jpm.2009.0006.
  44. Wilkie DJ, Molokie R, Boyd-Seal D, Suarez ML, Kim YO, Zong S, … Wang ZJ. Patient-reported outcomes: Nociceptive and neuropathic pain and pain barriers in adult outpatients with sickle cell disease. Journal of the National Medical Association. 2010;102:18–27. doi: 10.1016/s0027-9684(15)30471-5.
  45. Wolfe J, Orellana L, Cook EF, Ullrich C, Kang T, Geyer JR, … Dussel V. Improving the care of children with advanced cancer by using an electronic patient-reported feedback intervention: Results from the PediQUEST randomized controlled trial. Journal of Clinical Oncology. 2014;32(11):1119–1126. doi: 10.1200/JCO.2013.51.5981.
  46. Yao Y, Keenan G, Al-masalha F, Dunn Lopez K, Khokar A, Johnson A, … Wilkie DJ. Current state of pain care for hospitalized patients at end of life. American Journal of Hospice and Palliative Care. 2013;30(2):128–136. doi: 10.1177/1049909112444458.
  47. Yu CH, Stacey D, Sale J, Hall S, Kaplan DM, Ivers N, … Straus SE. Designing and evaluating an interprofessional shared decision-making and goal-setting decision aid for patients in clinical care: Systematic decision aid development and study protocol. Implementation Science. 2014;9:16. doi: 10.1186/1748-5908-9-16.
