Abstract
Clinical trials increasingly rely upon web-based Clinical Trials Management Systems (CTMS). As with clinical care systems, human-computer interaction (HCI) issues can greatly affect the usefulness of such systems. Evaluation of the user interface of one web-based CTMS revealed a number of potential HCI problems, in particular increased workflow complexity associated with the web application delivery model and usability problems resulting from the use of ambiguous icons. Because these design features are shared by a large fraction of current CTMS, the implications extend beyond this individual system.
Introduction
The conduct of clinical research has increasingly become a national enterprise, involving multiple stakeholders, institutions, and sponsors [1]. Given this environment, the application of information technology in the form of clinical trials management systems (CTMS), designed to automate or assist in such processes, is extremely desirable. The NCI-funded CLL Research Consortium (CLLCRC) is emblematic of the shift away from paper-based trials documentation. The CLLCRC conducts nearly paperless clinical trials using a web-based CTMS, the CRC Integrated Information Management System (CIMS), which includes a research participant tracking tool. This tool shares design features with most common CTMS, including scheduling displays that emulate traditional paper-based calendars and heavy use of icons (Figure 1). The importance of human-computer interaction (HCI) factors in clinical care systems has recently received considerable attention [2]. This study evaluates similar HCI issues in a typical CTMS.
Methods
A two-part cognitive analysis of the CIMS, consisting of a cognitive walkthrough and a field usability test, was undertaken. First, the workflow of the CIMS was compared to the generation of paper research participant calendars using conventional word processing software. During the walkthrough, potential usability issues were identified based on standard HCI heuristics. Second, a novice user and an expert user were observed while performing a variety of common CTMS tasks.
Results
The overall complexity of the CIMS web application workflow was comparable to the word processor-based method, requiring 25 discrete actions to generate a research participant calendar. However, the number of screen transitions was significantly higher for CIMS (15 versus 8), as was the number of potential usability problems (44 versus 32). The usability study revealed a high frequency of data interpretation errors, predominantly involving incorrect interpretation of icons and their associated navigation actions. Surprisingly, both expert and novice users experienced nearly identical problems.
Conclusions
The use of a “calendar-like” interface provides a familiar presentation model. However, the shift to a web interaction paradigm increased complexity at the user level. Furthermore, the usability results underscore the importance of intuitive icon design. The CIMS icon set, which is shared with other CTMSs, uses generic icons to convey task status; these icons convey no semantic information about the specific tasks, resulting in ambiguous and incorrect interpretation. These findings highlight the importance of HCI issues in the design of CTMSs and raise the question of whether they will contribute to adverse outcomes, as has been observed in clinical care systems.
Acknowledgements
This work was supported in part by NCI grant PO1 CA81534 and NLM training grant N01-LM07079.
References
1. Sung NS, et al. Central challenges facing the national clinical research enterprise. JAMA. 2003;289(10):1278–87. doi:10.1001/jama.289.10.1278.
2. Koppel R, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005;293(10):1197–203. doi:10.1001/jama.293.10.1197.