Author manuscript; available in PMC: 2019 Sep 1.
Published in final edited form as: Int J Med Inform. 2018 May 21;117:19–25. doi: 10.1016/j.ijmedinf.2018.05.008

The design of decisions: Matching clinical decision support recommendations to Nielsen’s design heuristics

Kristen Miller a,*, Muge Capan b, Danielle Weldon a, Yaman Noaiseh c, Rebecca Kowalski a, Rachel Kraft d, Sanford Schwartz e, William S Weintraub f, Ryan Arnold g
PMCID: PMC6061965  NIHMSID: NIHMS974185  PMID: 30032961

Abstract

Objective

While general design heuristics exist for graphic user interfaces, it remains a challenge to facilitate the implementation of these heuristics for the design of clinical decision support. Our goals were to map a set of recommendations for clinical decision support design found in current literature to Jakob Nielsen’s traditional usability heuristics and to suggest usability areas that need more investigation.

Materials and methods

Using a modified nominal group process, the research team discussed, classified, and mapped recommendations, organized as interface, information, and interaction, to design heuristics. A previous narrative review identified 42 recommendations from the literature to define the design and functional characteristics that impact the performance of CDS in terms of provider preference, process of care, and patient outcomes.

Main findings

We matched 20 out of 42 recommendations to heuristics. The mapping reveals gaps in both heuristics and recommendations, identifying a set of Nielsen’s heuristics that are underrepresented in the literature and subsets of recommendations important to design not covered in Nielsen’s heuristics. We attributed this, in part, to the evolution of technology since the inception of Nielsen’s heuristics. The team created a new interaction heuristic: Integration into real-time workflow to consider the needs of the end-user in the clinical space.

Discussion

Clinical decision support has enabled clinicians to better address arising information needs; however, there remains a lack of evidence-based guidelines in terms of functional and design requirements.

Conclusion

Results from this review suggest that interaction design principles were not fully satisfied by the current literature of clinical decision support.

Keywords: Clinical decision support, Heuristics, Interaction, Interface, Design

1 Introduction

Clinical Decision Support (CDS) is defined as “providing clinicians with clinical knowledge and patient-related information, intelligently filtered, and presented at appropriate times to enhance patient care” [1]. CDS systems are widely used in clinical care, but are often ineffective due to poor usability [2]. Human factors, usability, and human computer interface principles are critical to the success of CDS. By evaluating CDS from a human factors perspective, frustrations and error-prone conditions can be minimized to create tools that are designed with the user in mind. While general design heuristics exist for graphic user interfaces [3,4], it remains a challenge to facilitate the implementation of these traditional interface interaction heuristics for design, and the literature lacks studies that evaluate CDS as interactive computer systems. Guidelines for systems design, such as rules specifically focused on graphical user interfaces, need to be complemented by design and functional characteristics targeting clinical usability. Evaluating the design of these interactive computer systems from a human factors engineering perspective can optimize interactions between clinicians and the system, and ultimately improve both patient and clinician outcomes.

Jakob Nielsen’s ten general principles for human computer interaction design are recognized not only in the world of design [4], but by healthcare researchers as the gold standard of heuristic evaluation [5–7]. In systems engineering, heuristics are defined as systematically designed procedures that do not guarantee an optimal solution, but provide near-optimal solutions [8]. In the context of computer systems design, heuristics represent broad rules of thumb to achieve optimal design. A heuristic evaluation is a usability inspection method for computer software that helps to uncover CDS design and interface deficiencies and to identify usability problems in human computer interface design. This technique typically requires three or more expert usability evaluators to independently apply a set of usability heuristics to a product, identify violations of the heuristics, and assess the severity of each violation [9].

By identifying successful components of current CDS and mapping them to Nielsen’s set of interaction design heuristics, this study provides insight into CDS evaluation and identifies design characteristics that lack guidance in the literature. The aims for this project were two-fold: i) to map a set of recommendations for CDS design found in current literature to Nielsen’s usability heuristics and ii) to suggest usability areas that need more investigation based on design heuristics that are not adequately covered in the matching process. We include a brief discussion of matched recommendations for each heuristic. In better understanding CDS design recommendations and their adherence to usability heuristics, we can provide guidance to improve interaction design.

2 Material and methods

2.1 Literature search and recommendations pulled

A brief narrative review was conducted covering January 1, 2000 to December 31, 2016, using the literature database PubMed and the peer-reviewed journal The Journal of Human Factors and Ergonomics Society, with the following MeSH terms: “clinical decision support”, “decision support systems”, “clinical alerts”, “alert systems”, AND “characteristics”, “features”, “human factors”, “usability heuristics”, “usability factors”, and “recommendation” [9]. Only one database was used for brevity, providing a snapshot of the CDS literature. Articles were included if they were in English, electronic health record (EHR) based, provided specific characteristics in terms of CDS design, and defined success factors used to evaluate CDS performance.

Of the 14 articles identified, 42 recommendations were pulled from the full-text articles to define the design and functional characteristics that impact the performance of CDS in terms of provider preference, process of care, and patient outcomes [9]. Recommendations “had to represent consensus, defined as recommendations identified in a systematic review and/or a design recommendation made by more than one paper (referenced within the article), demonstrating success factors” [9]. The review found that no prior reviews had discussed or recommended design features of CDS using heuristic evaluation. Recommendations from this narrative review were organized into three “I” categories: Interface, Information, and Interaction, further divided by three “P” categories (presentation, placement and positioning, and provision of multiple presentation layers); three “C” categories (clean and concise, content guidance, and consistency); and five “F” categories (fast, fit, feedback, forgiveness, and flexible design) extracted from 14 papers [9]. Included studies assessed CDS success and impact by studying patient/clinician outcomes in clinician workflow [10–23].

2.2 Consensus and evaluation

Members of the research team conducted the initial inclusion of articles, with the final selection of articles discussed in a group setting [9]. A modified nominal group process was conducted with the full-text articles to discuss, classify, and map recommendations to Nielsen’s heuristics.

The research team applied usability knowledge to drive the heuristic evaluation. A codebook was developed based on iterative discussions to assist in matching with the standardized recommendation methods. Recommendations were matched to Nielsen’s heuristics by members of a multi-disciplinary team without prior discussion [24]. Following this independent process, the team discussed each recommendation’s relevance to the applicable design heuristic and a consensus was reached on every recommendation-heuristic match.

2.3 Heuristics

This evaluation utilized Jakob Nielsen’s ten general principles for interaction design (Table 1).

Table 1.

Nielsen’s ten general principles for interaction design.

Principles Definition
1 Visibility of system status The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2 Match between system and the real world The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3 User control and freedom Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4 Consistency and standards Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5 Error prevention Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6 Recognition rather than recall Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7 Flexibility and efficiency of use Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8 Aesthetic and minimalist design Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9 Help users recognize, diagnose, and recover from errors Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10 Help and documentation Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

2.4 Quantitative analysis of associations

Our mapping process refers to mapping a finite number of items in one set (n1 = 10 design heuristics) to a finite number of items in another set (n2 = 42 design recommendations extracted from the studies included in the narrative review, e.g., recommendations regarding font size/spacing/color). In this context, we utilize a generalized and simplified version of a point-to-set mapping framework [25], where the items in one set are assigned to items in the other set using an n-to-n mapping. In other words, since there are n1 = 10 design heuristics and n2 = 42 recommendations, n-to-n mapping indicates that any one item in set one (a design heuristic) can be matched to 0, 1, …, n2 recommendations. Any one recommendation in set two can be matched to 0, 1, …, 10 design heuristics. Recommendations that matched zero design heuristics were considered as revealing a potentially underrepresented area specific to CDS and a need for further expansion of Nielsen’s heuristics. The relationships between recommendations and matched design heuristics were quantified with the goals of: i) analyzing the relevance of each recommendation with regard to CDS design by quantifying the number of matches for each recommendation, where a higher number of matches represents stronger relevance, and ii) analyzing the relevance of the design heuristics with regard to CDS design by quantifying the total number of recommendations matched to each design heuristic. In addition, association rule methods were applied to find patterns of association between recommendations and design heuristics. Association rules explored combinations of recommendation-heuristic matches and searched for groups of matches that are commonly found in a set (e.g., the Interface, Information, and Interaction sets) with the goal of identifying rules such as “if a given combination of design heuristics in set A is matched, then so is it matched in set B with probability > p”.
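As a concrete illustration (a minimal sketch, not the authors' actual analysis code), the n-to-n mapping and the two relevance counts described above can be expressed in Python; the recommendation labels and match assignments below are hypothetical placeholders:

```python
from collections import defaultdict

# Hypothetical subset of recommendation-to-heuristic matches;
# heuristics are numbered 1-10 per Nielsen's list (Table 1).
matches = {
    "Display information in prominent positions": [1],
    "Allow for reading left to right": [2, 4],
    "Localize information": [2, 4],
    "Provide timely feedback": [1],
    "Make it simple": [8],
    "Justify recommendations": [],  # matched zero heuristics
}

# Relevance of each recommendation: the number of heuristics it
# matched (a higher count suggests stronger relevance to CDS design).
rec_relevance = {rec: len(dhs) for rec, dhs in matches.items()}

# Relevance of each heuristic: the total number of recommendations
# matched to it across the set.
dh_relevance = defaultdict(int)
for dhs in matches.values():
    for dh in dhs:
        dh_relevance[dh] += 1

# Recommendations matching zero heuristics flag potential gaps in
# Nielsen's list for the CDS domain.
unmatched = [rec for rec, dhs in matches.items() if not dhs]
```

Applied to the full 42-recommendation set, the same counts produce the per-heuristic totals of the kind reported in Table 2 and the unmatched set discussed in the Results.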

3 Results

Through this consensus process, we were able to match 20 out of 42 recommendations to design heuristics (Table 2). Among the 22 recommendations that were not matched, five were in the Interface category, five in Information, and 12 in Interaction.

Table 2.

Recommendations taken from CDS literature that have been matched to Nielsen’s design heuristics.

Design Heuristica 1 2 3 4 5 6 7 8 9 10
Interface Presentation 1 1
  Make it Simple
  Use appropriate font sizes
  Use meaningful colors
  Ensure acceptable contrast
  Keep presentation consistent
  Deploy space filling techniques
  Make icons bold or bigger in size
Total Matches 1 1
Placement and Positioning 1 2 2
  Display information in prominent positions
  Allow for reading left to right
  Localize information
Total Matches 1 2 2
Provision of Multiple Presentation Layers 1
  Avoid using only text
Total Matches 1
Information Clean and Concise 1 1
  Standardize terminology
  Use concise and effective language
Total Matches 1 1
Content Guidance 1 1 1
  Provide a recommendation
  Justify recommendations
  Suggest alternative recommendations
  Provide additional resources
  Evidence-based recommendations as default
  Keep recommendations up to date
Total Matches 1 1 1
Consistency 2
  Recommendations from same place
  Same display for healthcare team members
Total Matches 2
Interaction Fast 1
  Provide timely feedback
  Reduce the amount of time required for use
Total Matches 1
Fit 3 3
  Minimize cognitive load (e.g. mouse clicks)
  Minimize cognitive load (e.g. manual input)
  Reduce screens to facilitate navigation
  Automatically pull data from the EHR
  Navigate to appropriate locations
  Initiate interventions (interactivity)
  Provide a route to get specific info
  Adapt behavior according to clinician actions
  Incorporate functions supporting dialogue
Total Matches 3 3
Feedback 1 1
  Provide decision support automatically
  Automate alerting
  Request documentation for override
Total Matches 1 1
Forgiveness 2
  Allow the user to modify orders
  Integrate a reset button
Total Matches 2
Flexible Design 1 1
  Involve the patient
  Utilize adaptive design and feedback
  Provide indication of data availability
  Incorporate functions to support the team
  Give access upon request to extended info
Total Matches 1 1
Total Table Matches 3 2 2 6 5 4 2 3 0 1
a

Design heuristics are written as numbers which correspond to Nielsen’s principles listed in Table 1.

3.1 Matched heuristics

3.1.1 Visibility of system status

Three recommendations were found to match the heuristic visibility of system status: the interface recommendation to display information in prominent positions to ensure content is seen [14,22,26,27], and the interaction recommendations to provide timely feedback [10,14,18,19,23,28–35] and to automate alerting and prompting of users by the CDS [12,13,17,21,36].

3.1.2 Match between system and the real world

Two recommendations were found to match the heuristic match between system and the real world: the interface recommendations to allow for conventional mapping by reading left to right [37–39] and to localize information, grouping pieces of information together to facilitate on-screen searches [22,26,27,40].

3.1.3 User control and freedom

Two recommendations were found to match the heuristic user control and freedom: the interaction recommendations to allow the user to modify orders [23] and to integrate a reset button [17,20].

3.1.4 Consistency and standards

Six recommendations were found to match the heuristic consistency and standards: three interface recommendations, to promote consistent terminology [22,27,40–42], to allow for reading left to right [37–39], and to localize information [22]; and three information recommendations, to standardize terminology [22,26,27,40,42], to ensure recommendations come from the same place [19], and to have the same display of basic CDS information for the case at hand for all professionals of the healthcare team [19].

3.1.5 Error prevention

Five recommendations were found to match the heuristic error prevention: one information recommendation, to make evidence-based recommendations the default [12,14,32], and four interaction recommendations, to minimize cognitive load (reducing both mouse clicks and the amount of manual input) [11,12,22], to automatically pull data from the electronic health record (EHR) [11–13,19,31,33,37,43–47], and to automate alerting [12,21].

3.1.6 Recognition rather than recall

Four recommendations were found to match the heuristic recognition rather than recall: one information recommendation, to make evidence-based recommendations the default [12,14,32], and three interaction recommendations, to minimize cognitive load (reducing both mouse clicks and the amount of manual input) [11,12,22] and to automatically pull data from the EHR [11–13,19,31,33,37,43–47].

3.1.7 Flexibility and efficiency of use

Two recommendations were found to match the heuristic flexibility and efficiency of use: the information recommendation to make evidence-based recommendations the default [12,14,32], and the interaction recommendation to give access upon request to extended information (justification of the rule, attached scientific documentation, etc.) that should be structured depending on the user profile [16,19,22,37,40,48–50].

3.1.8 Aesthetic and minimalist design

Three recommendations were found to match the heuristic aesthetic and minimalist design: the interface recommendations to make the design simple [11,16,18,22,32] and to avoid using only text [11,20,22], and the information recommendation to use concise and effective language [22,26,27,40,42].

3.1.9 Help users recognize, diagnose, and recover from errors

No recommendations were found to match the heuristic help users recognize, diagnose, and recover from errors.

3.1.10 Help and documentation

One recommendation was found to match the heuristic help and documentation: the interaction recommendation to give access upon request to extended information (justification of the rule, attached scientific documentation, etc.) that should be structured depending on the user profile [16,19,22,37,40,48–50].

3.2 Associations

Following the iterative matching process of 42 recommendations to ten design heuristics, we first quantified the number of recommendations matched to each heuristic to compare the heuristics’ applicability. Fig. 1 illustrates the ten design heuristics (“DH”) categorized by applicability, where the size and color of the circles are determined by the number of matched recommendations (i.e., larger, darker circles refer to more recommendations matching the given design heuristic). Design heuristic 4 (consistency and standards) matched the greatest number of recommendations (six). Design heuristic 9 is not included because it matched no recommendations. Overall, this visualization highlights the discrepancy between the applicability and practicality of different design heuristics for CDS. While some heuristics are more relevant for clinical interface design, others may not add significant value to the clinical setting.

Fig. 1.

Fig. 1

Recommendations from CDS literature quantified and matched to each heuristic. **Heuristic 9: Help users recognize, diagnose, and recover from errors, was not matched to any recommendation.

Next, we evaluated the individual recommendations grouped by type (where types refer to the categories Interface, Interaction, and Information) with regards to number of design heuristics they are mapped to. Fig. 2 quantifies how frequently the recommendations in each category matched to any design heuristics. In Fig. 2, DH stands for design heuristic. “No Rec to DH match” represents the set of recommendations in each type that were mapped to no design heuristics.

Fig. 2.

Fig. 2

CDS literature recommendations stratified, quantified, and matched to each heuristic.

Fig. 2 highlights the heterogeneity between the three categories of recommendations derived from 14 papers. The Interaction category contained the highest number of total recommendations (n = 21), although the majority of the recommendations in this group were matched to zero design heuristics (n = 13 out of 21). The Information category had the lowest number of total recommendations (n = 9), and the majority of recommendations in this group were matched to either zero design heuristics (n = 4 out of 9) or exactly one design heuristic (n = 4 out of 9). In addition, the Information category was the only group including a recommendation that matched three or more design heuristics. In the Interface category, the majority of the recommendations matched zero or one design heuristics (n = 5 and 4 out of 11, respectively). Due to the limited number of matches in each category and the large number of recommendations matching zero design heuristics, the association rule algorithms were not able to identify statistically significant patterns.
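The stratification underlying Fig. 2 amounts to tallying, within each category, how many design heuristics each recommendation matched. A minimal sketch in Python, using a small hypothetical sample rather than the study's actual 42 recommendations:

```python
from collections import Counter

# Hypothetical (category, number_of_matched_heuristics) pairs.
recs = [
    ("Interface", 0), ("Interface", 1), ("Interface", 2),
    ("Information", 0), ("Information", 1), ("Information", 3),
    ("Interaction", 0), ("Interaction", 0), ("Interaction", 1),
]

# Distribution of match counts within each category, mirroring the
# match-count buckets shown in Fig. 2.
by_category = {}
for category, n_matched in recs:
    by_category.setdefault(category, Counter())[n_matched] += 1

for category, dist in by_category.items():
    print(category, dict(sorted(dist.items())))
```

On the full data set, a recommendation landing in a category's zero bucket corresponds to the “No Rec to DH match” group in Fig. 2.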

4 Discussion

In this follow-up study to a narrative review outlining the current state of CDS design, we successfully mapped a set of recommendations for CDS design to Jakob Nielsen’s ten usability heuristics. The mapping reveals gaps in both heuristics and recommendations, identifying a set of Nielsen’s heuristics that are underrepresented in the literature and subsets of recommendations important to CDS design not covered in Nielsen’s heuristics.

By matching only 20 of 42 recommendations to heuristics, we were able to identify areas of research not represented in Nielsen’s list. The low match rate can be attributed to both the high specificity and the age of Nielsen’s definitions. For example, the heuristic “flexibility and efficiency of use” is narrowly defined as “accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.” While expansion to more general definitions of flexibility and efficiency would have increased our match rate, we were unwilling to compromise the protocol of the study by changing the definitions of well-established heuristics. Similarly, while respected and heavily utilized in the field of CDS design, Nielsen’s heuristics were published over two decades ago, and the capabilities and context of technology have evolved immensely in those years. We identify the need for additional focus on the design interaction between the user and the system’s interface, arguably the most evolved aspect of technological advancement.

To address the consequences of these two limitations, we contemplated additional heuristics that could be considered when designing CDSs, based on the unmatched recommendations. This research identified existing recommendations that do not match any of the ten heuristics, which led to the suggested development of domain-specific recommendations. For example, the team created a new interaction heuristic, integration into real-time workflow, to consider the needs of the end-user in the clinical space. CDS offers powerful technology to aid clinicians in delivering high quality care to patients and reduce costs; however, in order for CDS to reach its full potential, the information must be rapidly accessible and understood from within the clinicians’ workflow. This new heuristic supplements Nielsen’s heuristics to form a more complete set of guidelines for effectively designing CDSs. Future work should consider adding design heuristics to the list that address the evolving technology of CDS.

In addition to unmatched recommendations, we also identified under-researched heuristics in the field of CDS. In particular, the heuristic “help and documentation” was matched to only one recommendation. This heuristic’s definition is extremely limiting (and, arguably, less relevant) for CDSs. While some CDSs do offer documentation, it is more frequently related to topics beyond instructions for use or troubleshooting. For example, MDCalc, a library of interactive score calculators, offers documentation regarding the source of the data used to develop the scoring system, the appropriate settings and populations to use it in, and its limitations [51]. While the heuristic is helpful in designing manual-like documentation, it is too narrow to guide the development of documentation referring to clinical data or predictive model development aspects of the tool that may be of interest to clinicians. Other heuristics with low match rates (two recommendations each for design heuristics 2, 3, and 7) would more likely benefit from additional research than from modification of the heuristic definition.

Previous research has identified similar challenges with regards to traditional interaction design and interface heuristics when applying heuristics to the evaluation of modern day healthcare technologies [14]. The evolution of EHRs, medical devices, and mobile interfaces raises new usability challenges. Traditional heuristics have been modified for application to medical devices [52], radiotherapy systems [53], and health information systems [54]. The application of additional methodologies like ergonomic criteria [55], heuristic walkthroughs [56], and new evaluation checklists [57] has supplemented heuristic evaluation of human-computer interfaces. These methodologies are similar to a heuristic evaluation in that they are easy, fast, cheap, and suitable for every software life-cycle phase while serving as a compilation of recommended best practices. One specific evaluation rearranged traditional heuristics, adding the following new heuristics: skills, defined as “prepare workarounds for frequent users”; pleasurable and respectful interaction with the user; and privacy [57]. Furthermore, this evaluation expanded upon the traditional heuristic definitions by including subheuristics that, while kept unchanged from their corresponding references, provided increased flexibility. For example, the heuristic Help and Documentation includes subheuristics like: “navigation: is information easy to find?” and “is there context-sensitive help?”

There is growing recognition that CDS, when well-designed and implemented, holds great potential to improve healthcare quality, increase efficiency, and reduce healthcare costs [14,32,58]. Optimizing the design of CDS and health information technology in general is of increased importance as providers struggle with high cognitive workloads and challenging workflow [59,60]. Much work has been done to identify features critical to success, evaluating clinician performance, workflow, and building frameworks for implementation [14,32,58,61] and yet, there is little consensus on how CDS should be generated and displayed to the user to optimize response [62]. Successful adoption of CDS requires careful consideration of not only the knowledge driving the alert system but also requires application of human factors principles in alert development and design [62].

5 Conclusion

CDS has enabled clinicians to better address arising information needs with evidence-based guidelines to deliver the best available care; however, there remains a lack of evidence-based guidelines for the functional and design requirements of CDS. Results from this review suggest that interaction design principles are not fully satisfied by the current CDS literature. Considering design and usability heuristics in the development and evaluation of CDS is a proactive approach that provides unique insight into failing CDS and will allow developers and researchers to identify areas for improvement, ensuring a concise, practical, and improved implementation.

Summary Table.

  • CDS systems are widely used in clinical care, but are often ineffective due to poor usability.

  • General design heuristics exist for graphic user interfaces; it remains a challenge to facilitate the implementation of these heuristics for the design of clinical decision support.

  • The mapping reveals gaps in both heuristics and recommendations, identifying a set of Nielsen’s heuristics that are underrepresented in the literature and subsets of recommendations important to CDS design not covered in Nielsen’s heuristics

  • This research created a new interaction heuristic: Integration into real-time workflow to consider the needs of the end-user in the clinical space due, in part, to the gaps in Nielsen’s heuristics.

Highlights.

  • A previous narrative review identified 42 recommendations of design and functional characteristics for clinical decision support; this paper classified and mapped these recommendations to Nielsen’s heuristics.

  • We identified a set of Nielsen’s heuristics that are underrepresented in the literature important to clinical decision support design.

  • We created a new interaction heuristic that was integration into real-time workflow to consider the needs of the end-user in the clinical space.

Acknowledgments

Funding

This work is supported by an Institutional Development Award (IDeA) from the National Institute of General Medical Sciences of the National Institutes of Health under grant number U54-GM104941 (PI: Binder-Macleod), and by the National Library of Medicine of the National Institutes of Health under grant number 1R01LM012300-01A1 (PI: Miller).

Footnotes

Authors’ contributions

KM was responsible for study design and overall management, participated in matching heuristics, and drafted the manuscript. MC contributed to study design, performed the quantitative analysis of associations, participated in matching heuristics, and revised the manuscript. DW, RK, RK, and YN contributed to the study design, participated in matching heuristics, and revised the manuscript. SS, WW, and RA contributed to the study design and revised the manuscript. All authors approved the version of the manuscript to be published.

The authors declare no conflicts of interest.

Articles Used for Review
  1. P. Roshanov, N. Fernandes, J. Wilczynski, B. Hemens, J. You, S. Handler, R. Nieuwlaat, N. Souza, J. Beyene, H. Van Spall, et al., Features of effective computerized clinical decision support systems: Meta regression of 162 randomized trials, BMJ. 346, 2013, 1657.
  2. B. Martinez-Perez, I. de la Torre-Diez, M. Lopez-Coronado, B. Sainz-de-Abajo, M. Robles and J. Garcia-Gomez, Mobile Clinical Decision Support Systems and applications: A Literature and Commercial Review, J Med Syst 38 (1), 2014, 4.
  3. E. Mack, D. Wheeler and P. Embi, Clinical Decision Support Systems in the Pediatric Intensive Care Unit, Pediatr Crit Care Med. 10 (1), 2009, 23–28.
  4. J. Nies, I. Colombet, P. Degoulet and P. Durieux, Determinants of Success for Computerized Clinical Decision Support Systems Integrated into CPOE Systems: A Systematic Review, AMIA Annu Symp Proc. 2006, 2006, 594–598.
  5. K. Kawamoto, C. Houlihan, E. Balas and D. Lobach, Improving clinical practice using clinical decision support systems: A systematic review of trials to identify features critical to success, BMJ. 330, 2005, 765.
  6. K. Kawamoto and D. Lobach, Clinical Decision Support Provided within Physician Order Entry Systems: A Systematic Review of Features Effective for Changing Clinician Behavior, AMIA Annu Symp Proc. 2003, 2003, 361–365.
  7. S. Lee, Features of Computerized Clinical Decision Support Systems Support of Nursing Practice, Comput Inf Nurs. 31 (10), 2013, 477–495.
  8. J. Chase, S. Andreassen, K. Jensen and G. Shaw, Impact of Human Factors on clinical Protocol Performance: A Proposed Assessment Framework and Case Examples, J Diabetes Sci Technol. 2 (3), 2008, 409–416.
  9. M. Wright and A. Robicsek, Clinical Decision Support Systems and Infection Prevention: To Know is not Enough, Am J Infect Control. 43 (6), 2015, 554–558.
  10. S. Pelayo, R. Marcilly, S. Bernonville, N. Leroy and M. Beuscart-Zephir, Human Factors Based Recommendations for the Design of Medication Related Clinical Decision Support Systems (CDSS), Stud Heal Technol Inf. 169, 2011, 412–416.
  11. E. Devine, C. Lee, C. Overby, N. Abernethy, J. McCune, J. Smith and P. Tarczy-Hornoch, Usability Evaluation of Pharmacogenomics Clinical Decision Support Aids and Clinical Knowledge Resources in A Computerized Provider Order Entry System: A Mixed Methods Approach, Int J Med Inf. 83 (7), 2014, 473–483.
  12. A. Kanstrup, M. Christiansen and C. Nohr, Four Principles for User Interface Design of Computerized Clinical Decision Support Systems, Stud Heal Technol Inf. 166, 2011, 65–73.
  13. R. Tsopra, J. Jais, A. Venot and C. Duclos, Comparison of two kinds of interface, based on guided navigation or usability principles, for improving the adoption of computerized decision support systems: Application to the prescription of antibiotics, J Am Med Inf Assoc. 21 (e1), 2014, e107–16.
  14. T. Bright, Transforming User Needs into Functional Requirements for an Antibiotic Clinical Decision Support System, Appl Clin Inf. 4 (4), 2013, 618–635.

References

  1. LaRosa J, Ahmad N, Feinberg M, Shah M, DiBrienza R, Studer S. The use of an early alert system to improve compliance with sepsis bundles and to assess impact on mortality. Crit. Care Res. Pract. 2012;2012(980369):1–8. doi: 10.1155/2012/980369.
  2. Genes N, Kim MS, Thum FL, et al. Usability evaluation of a clinical decision support system for geriatric ED pain treatment. Appl. Clin. Inf. 2016;7(1):128–142. doi: 10.4338/ACI-2015-08-RA-0108.
  3. Shneiderman B. Designing the User Interface: Strategies for Effective Human-Computer Interaction. 2009.
  4. Nielsen J, Molich R. Heuristic evaluation of user interfaces. Proc. ACM CHI'90 Conf.; 1990. pp. 249–256.
  5. Taft T, Staes C, Slager S, Weir C. Adapting Nielsen's design heuristics to dual processing for clinical decision support. AMIA Annu. Symp. Proc.; 2016. pp. 1179–1188.
  6. Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J. Biomed. Inf. 2003;36(1–2):23–30. doi: 10.1016/S1532-0464(03)00060-1.
  7. Zhang J, Patel V, Johnson T, Chung PTJ. Evaluating and Predicting Patient Safety for Medical Devices with Integral Information Technology. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Agency for Healthcare Research and Quality (US); Rockville (MD): 2005.
  8. Hillier F, Lieberman G. Introduction to Operations Research. 9th ed. McGraw-Hill; New York: 2001.
  9. Miller K, Mosby D, Capan M, Kowalski R, Ratwani R, Noaiseh Y, Kraft R, Schwartz S, Weintraub WS, Arnold R. Interface, information, interaction: a narrative review of design and functional requirements for clinical decision support. J. Am. Med. Inf. Assoc. 2017. doi: 10.1093/jamia/ocx118.
  10. Roshanov P, Fernandes N, Wilczynski J, et al. Features of effective computerized clinical decision support systems: meta regression of 162 randomized trials. BMJ. 2013;346:f657. doi: 10.1136/bmj.f657.
  11. Martinez-Perez B, de la Torre-Diez I, Lopez-Coronado M, Sainz-de-Abajo B, Robles M, Garcia-Gomez J. Mobile clinical decision support systems and applications: a literature and commercial review. J. Med. Syst. 2014;38(1):4. doi: 10.1007/s10916-013-0004-y.
  12. Mack E, Wheeler D, Embi P. Clinical decision support systems in the pediatric intensive care unit. Pediatr. Crit. Care Med. 2009;10(1):23–28. doi: 10.1097/PCC.0b013e3181936b23.
  13. Nies J, Colombet I, Degoulet P, Durieux P. Determinants of success for computerized clinical decision support systems integrated into CPOE systems: a systematic review. AMIA Annu. Symp. Proc. 2006;2006:594–598.
  14. Kawamoto K, Houlihan C, Balas E, Lobach D. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330:765. doi: 10.1136/bmj.38398.500764.8F.
  15. Kawamoto K, Lobach D. Clinical decision support provided within physician order entry systems: a systematic review of features effective for changing clinician behavior. AMIA Annu. Symp. Proc. 2003;2003:361–365.
  16. Lee S. Features of computerized clinical decision support systems support of nursing practice. Comput. Inf. Nurs. 2013;31(10):477–495. doi: 10.1097/01.NCN.0000432127.99644.25.
  17. Chase J, Andreassen S, Jensen K, Shaw G. Impact of human factors on clinical protocol performance: a proposed assessment framework and case examples. J. Diabetes Sci. Technol. 2008;2(3):409–416. doi: 10.1177/193229680800200310.
  18. Wright M, Robicsek A. Clinical decision support systems and infection prevention: to know is not enough. Am. J. Infect. Control. 2015;43(6):554–558. doi: 10.1016/j.ajic.2015.02.004.
  19. Pelayo S, Marcilly R, Bernonville S, Leroy N, Beuscart-Zephir M. Human factors based recommendations for the design of medication related clinical decision support systems (CDSS). Stud. Heal Technol. Inf. 2011;169:412–416.
  20. Devine E, Lee C, Overby C, et al. Usability evaluation of pharmacogenomics clinical decision support aids and clinical knowledge resources in a computerized provider order entry system: a mixed methods approach. Int. J. Med. Inf. 2014;83(7):473–483. doi: 10.1016/j.ijmedinf.2014.04.008.
  21. Kanstrup A, Christiansen M, Nohr C. Four principles for user interface design of computerized clinical decision support systems. Stud. Heal Technol. Inf. 2011;166:65–73.
  22. Tsopra R, Jais J, Venot A, Duclos C. Comparison of two kinds of interface, based on guided navigation or usability principles, for improving the adoption of computerized decision support systems: application to the prescription of antibiotics. J. Am. Med. Inf. Assoc. 2014;21(e1):e107–16. doi: 10.1136/amiajnl-2013-002042.
  23. Bright T. Transforming user needs into functional requirements for an antibiotic clinical decision support system. Appl. Clin. Inf. 2013;4(4):618–635. doi: 10.4338/ACI-2013-08-RA-0058.
  24. Fink A, Kosecoff J, Chassin M, Brook R. Consensus methods: characteristics and guidelines for use. AJPH. 1984;74(9). doi: 10.2105/ajph.74.9.979.
  25. Burachik R, Iusem AN. Set-Valued Mappings and Enlargements of Monotone Operators. Springer Sci. Bus. Media. 2018:8.
  26. Khajouei R, Jaspers M. The impact of CPOE medication systems' design aspects on usability, workflow and medication orders: a systematic review. Methods Inf. Med. 2010;49:3–19. doi: 10.3414/ME0630.
  27. Khajouei R, Jaspers M. CPOE system design aspects and their qualitative effect on usability. Stud. Heal Technol. Inf. 2008;136:309–314.
  28. Wyatt J. Lessons learnt from the field trial of ACORN, an expert system to advise on chest pain. Proc. Sixth World Conf. Med. Informatics; Singapore; 1989. pp. 111–115.
  29. Heathfield H, Wyatt J. Philosophies for the design and development of clinical decision-support systems. Methods Inf. Med. 1993;32(1):9–17.
  30. Tierney W. Improving clinical decisions and outcomes with information: a review. Int. J. Med. Inf. 2001;62(1):1–9. doi: 10.1016/s1386-5056(01)00127-7.
  31. Bodenheimer T, Grumbach K. Electronic technology: a spark to revitalize primary care? JAMA. 2003;290(2):259–264. doi: 10.1001/jama.290.2.259.
  32. Bates D, Kuperman G, Wang S. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J. Am. Med. Inf. Assoc. 2003;10:523–530. doi: 10.1197/jamia.M1370.
  33. Kaplan B. Evaluating informatics applications—some alternative approaches: theory, social interactionism, and call for methodological pluralism. Int. J. Med. Inf. 2001;64(1):39–56. doi: 10.1016/s1386-5056(01)00184-8.
  34. Morris A. Developing and implementing computerized protocols for standardization of clinical decisions. Ann. Intern. Med. 2000;132:373–383. doi: 10.7326/0003-4819-132-5-200003070-00007.
  35. Bennett J, Glasziou P. Computerised reminders and feedback in medication management: a systematic review of randomised controlled trials. Med. J. Aust. 2003;178(5):217–222. doi: 10.5694/j.1326-5377.2003.tb05166.x.
  36. Garg A, Adhikari N, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10):1223–1238. doi: 10.1001/jama.293.10.1223.
  37. Shiffman R, Liaw Y, Brandt C, Corb G. Computer-based guideline implementation systems: a systematic review of functionality and effectiveness. J. Am. Med. Inf. Assoc. 1999;6(2):104–114. doi: 10.1136/jamia.1999.0060104.
  38. Maviglia S, Zielstorff R, Paterno M, Teich J, Bates D, Kuperman G. Automating complex guidelines for chronic disease: lessons learned. J. Am. Med. Inf. Assoc. 2003;10(2):154–165. doi: 10.1197/jamia.M1181.
  39. Linder J, Schnipper J, Tsurikova R, et al. Documentation-based clinical decision support to improve antibiotic prescribing for acute respiratory infections in primary care: a cluster randomised controlled trial. Inf. Prim. Care. 2009;17(4):231–240. doi: 10.14236/jhi.v17i4.742.
  40. Horsky J, Schiff G, Johnston D, Mercincavage L, Bell D, Middleton B. Interface design principles for usable decision support: a targeted review of best practices for clinical prescribing interventions. J. Biomed. Inf. 2012;45(6):1202–1216. doi: 10.1016/j.jbi.2012.09.002.
  41. Belden J, Grayson R, Barnes J. Defining and testing EMR usability: principles and proposed methods of EMR usability evaluation and rating. Healthc. Inf. Manag. Syst. Soc. (HIMSS); 2009.
  42. Middleton B, Bloomrosen M, Dente M, et al. Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA. J. Am. Med. Inf. Assoc. 2013;20(e1):e2–e8. doi: 10.1136/amiajnl-2012-001458.
  43. Wetter T. Lessons learnt from bringing knowledge-based decision support into routine use. Artif. Intell. Med. 2002;24(3):195–203. doi: 10.1016/s0933-3657(01)00103-8.
  44. Payne T. Computer decision support systems. Chest. 2000;118(Suppl. 2):47S–52S. doi: 10.1378/chest.118.2_suppl.47s.
  45. Trivedi M, Kern J, Marcee A, et al. Development and implementation of computerized clinical guidelines: barriers and solutions. Methods Inf. Med. 2002;41(5):435–442.
  46. Aronsky D, Chan K, Haug P. Evaluation of a computerized diagnostic decision support system for patients with pneumonia: study design considerations. J. Am. Med. Inf. Assoc. 2001;8(5):473–485. doi: 10.1136/jamia.2001.0080473.
  47. Hersh W. Medical informatics: improving health care through information. JAMA. 2002;288(16):1955–1958. doi: 10.1001/jama.288.16.1955.
  48. Mollon B, Chong J, Holbrook A, Sung M, Thabane L, Foster G. Features predicting the success of computerized decision support for prescribing: a systematic review of randomized controlled trials. BMC Med. Inf. Decis. Making. 2009;9:11. doi: 10.1186/1472-6947-9-11.
  49. Wendt T, Knaup-Gregori P, Winter A. Decision support in medicine: a survey of problems of user acceptance. Stud. Heal Technol. Inf. 2000;77:852–856.
  50. Sim I, Gorman P, Greenes R, et al. Clinical decision support systems for the practice of evidence-based medicine. J. Am. Med. Inf. Assoc. 2001;8(6):527–534. doi: 10.1136/jamia.2001.0080527.
  51. Avery J, Beiner J, Habboushe J, et al. MDCalc.
  52. Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J. Biomed. Inf. 2003;36:23–30. doi: 10.1016/s1532-0464(03)00060-1.
  53. Chan A, Islam M, Rosewall T, Jaffray D, Easty A, Cafazzo J. Applying usability heuristics to radiotherapy systems. Radiother. Oncol. 2012;102(1):142–147. doi: 10.1016/j.radonc.2011.05.077.
  54. Carvalho C, Borycki E, Kushniruk A. Using heuristic evaluations to assess the safety of health information systems. Stud. Heal Technol. Inf. 2009;143:297–301.
  55. Bastien J, Scapin D. A validation of ergonomic criteria for the evaluation of human-computer interaction. Int. J. Hum. Comput. Interact. 1992;4(2):183–196.
  56. Sears A. Heuristic walkthroughs: finding the problems without the noise. Int. J. Hum. Comput. Interact. 1997;9(3):213–234.
  57. Gomez R, Caballero D, Sevillano J. Heuristic evaluation on mobile interfaces: a new checklist. Sci. World J. 2014;2014. doi: 10.1155/2014/434326.
  58. Garg AX, Adhikari NKJ, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10):1223–1238. doi: 10.1001/jama.293.10.1223.
  59. Moacdieh N, Sarter N. Display clutter: a review of definitions and measurement techniques. Hum. Factors. 2015;57(1):61–100. doi: 10.1177/0018720814541145.
  60. American Medical Association. Improving care: priorities to improve electronic health record usability. Am. Med. Assoc. Advis. Commun. EHR Phys. Usab. 2014:12.
  61. Miller A, Moon B, Anders S, Walden R, Brown S, Montella D. Integrating computerized clinical decision support systems into clinical work: a meta-synthesis of qualitative research. Int. J. Med. Inf. 2015;84(12):1009–1018. doi: 10.1016/j.ijmedinf.2015.09.005.
  62. Phansalkar S, Edworthy J, Hellier E, et al. A review of human factors principles for the design and implementation of medication safety alerts in clinical information systems. J. Am. Med. Inf. Assoc. 2010;17(5):493–501. doi: 10.1136/jamia.2010.005264.