AMIA Annual Symposium Proceedings. 2005;2005:921.

Usability comparison of three clinical trial management systems

Byungsuk Choi 1,2, Stan Drozdetski 1,2, Margrethe Hackett 1, Can Lu 1,2, Cari Rottenberg 1, Linda Yu 1,2, Dale Hunscher 1,2, Daniel Clauw
PMCID: PMC1560441  PMID: 16779208

Abstract

To advise in the selection of a clinical trial management system (CTMS), we evaluated three candidate applications. After preliminary analyses, we performed heuristic evaluation and usability testing to assess each system’s usability. Velos eResearch, a commercial CTMS, had the best usability outcome despite offering fewer features than the other candidates. In the decision process, ease of use was valued more highly than functionality.

Introduction

Last year, the Center for the Advancement of Clinical Research (CACR) at the University of Michigan Health System had to decide whether to adopt a clinical trial management system from an external source or to continue developing its own system by transitioning BioDBx from version 4 to version 5. Velos eResearch is a commercial product, and CTMA is the clinical trial management application developed and deployed by the University of Pittsburgh Cancer Center. To inform the decision with respect to system usability, graduate students from the University of Michigan School of Information conducted a series of usability evaluations.

Methods

To better understand the systems, we employed generalized transition network diagrams, visual analysis, vocabulary analysis, and action analysis. We then carried out heuristic evaluations using a list1 adapted from Gregory Abowd, who in turn based his work on that of Clayton Lewis, John Rieman, and Jakob Nielsen.

For hands-on user testing, we recruited 14 clinical research staff members from the University of Michigan Health System. We used Camtasia Studio to capture video, audio, and keyboard and mouse activity to supplement our notes. Two testers were present at each session: one to take notes and the other to guide users. We asked users to ‘think aloud’ to help illuminate their mental processes. Afterwards, users completed a post-test questionnaire and participated in an open-ended interview. We also used a card-sorting exercise in which each subject chose 5 cards from a deck of 52 bearing words that described their feelings about the system.

Results

eResearch was strongest in overall usability but had less extensive functionality than BioDBx and CTMA. On average scores, eResearch outperformed the other systems on every usability criterion. One-way ANOVA of the usability testing results showed that the following criteria were significant at α = 0.01 (although, because N = 9, these results should be considered tentative and require further validation; an illustrative computation follows the list). eResearch excelled in:

  • User control and freedom (p = 0.0005)

  • Recognition rather than recall (p = 0.00943)

  • Diagnosis and error recovery (p = 0.007366)

  • Help and documentation (p = 0.001473)
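
As an illustration of the statistical procedure, the following is a minimal sketch of this one-way ANOVA in Python with SciPy, assuming each of the nine testers rated each system on a given criterion on a 1–5 scale. The ratings shown are hypothetical placeholders, not the study’s data, and the variable names are our own.

    # Minimal sketch: one-way ANOVA across the three systems for a single
    # usability criterion. The ratings below are HYPOTHETICAL placeholders,
    # not the study's data; a 1-5 rating scale is assumed.
    from scipy.stats import f_oneway

    eresearch = [5, 4, 5, 4, 5, 5, 4, 5, 4]  # per-tester ratings, N = 9
    biodbx    = [3, 2, 3, 3, 2, 4, 3, 2, 3]
    ctma      = [3, 3, 4, 2, 3, 3, 4, 3, 3]

    f_stat, p_value = f_oneway(eresearch, biodbx, ctma)
    print(f"F = {f_stat:.2f}, p = {p_value:.5f}")
    # A criterion is significant at alpha = 0.01 when p_value < 0.01.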

User test results showed that BioDBx version 4 was problematic for users in navigation, system response, labeling, and layout; version 5 improved the system’s feedback and response. Most testers agreed that BioDBx required extra scrolling to see necessary information. eResearch had unclear labeling but otherwise drew accolades from users for being organized and understandable, and it handled forms and fields best. In spite of some workflow and labeling issues, users found CTMA consistent and well organized; testers chose CTMA as having the clearest screen layout.

Conclusion

Previous research on the usability of medical data entry systems uncovered problems in user workflow2 and navigation3, consistent with our findings. Such issues are closely tied to a system’s adoption rate, error prevention, and productivity. Although usability is not the sole factor in selecting the best system, it is as important as functionality. Moreover, it is often easier to add functionality to a usable system than to make a functional system usable.

References

