AMIA Annual Symposium Proceedings. 2006;2006:329–333.

Patterns of Use of Decision Support Tools by Clinicians

Robert S Hayward 1, Mohamad El-Hajj 1, Tanya K Voth 1, Kelly Deis 1
PMCID: PMC1839308  PMID: 17238357

Abstract

This paper analyzes information behavior data automatically gathered by an integrated clinical information environment used by internal medicine physicians and trainees at the University of Alberta. The study reviews how clinical information systems, decision-support tools and evidence resources were used over a 13-month period. Aggregate and application-specific frequency and duration of use were compared by location, time of day, physician status, and application type (clinical information system or 5 categories of knowledge resources). Significant differences were observed in when and where resources were used, diurnal patterns of use, minutes spent per encounter, and patterns of use for physicians and trainees. We find that evidence use is not restricted to either the place or time of clinical work, that resources are used for very short periods at the point-of-care, and that use of filtered evidence-based resources is concentrated among trainees.

Keywords: Decision Support Systems, Clinical, Evidence-Based Medicine, Information Services

INTRODUCTION

Clinicians work in an era of decreasing time, shrinking resources, but rapidly increasing evidence about the effects of health care. Recognizing that important questions go unanswered during the clinical workday,1,2 health organizations increasingly deploy computerized information resources that can decrease the time it takes to find answers to clinical questions. Such resources usually include ‘unfiltered’ evidence such as bibliographic, journal and drug information databases, and may also include ‘filtered’ evidence such as evidence-based synopses, systematic reviews, clinical practice guidelines, and clinical decision support tools.3 Although early assessments of access to diverse evidence repositories suggest that filtered evidence can improve clinical care,4 there are still many unanswered questions about how filtered evidence serves busy clinicians at the point of clinical decision-making, how knowledge resources are used alongside clinical information tools, and how to create holistic information environments for clinicians.5,6

Most studies of physician information-seeking behaviors rely on self-reported survey data, direct observation, interviews or analysis of website logs, all of which have inherent limitations.6,7 To improve our empiric understanding of clinicians’ information needs, actual information behaviors must be recorded in real time, when and where clinicians reflect on health care problems. Even simple behaviors, such as the time, location, frequency and duration of use of clinical, decision-support, communication and evidence resources, can inform planning about optimal licensing and deployment of the applications most likely to be used during patient care. To date, however, it has not been possible to record high-fidelity observations about how multiple products from multiple vendors interact during clinical workflow. This study analyzes information behavior data from an integrated clinical information environment that internal medicine physicians and trainees use to support patient care and professional development. We examine patterns of information use to address the following questions:

  • When and where are different types of information resources used by internists?

  • How does the frequency and duration of evidence use vary with time of day?

  • Which evidence-based information resources are most used at the point-of-care?

  • Do patterns of information use differ for internal medicine physicians and trainees?

METHODS

The study was approved by the University of Alberta Health Research Ethics Board, and user consent was obtained for anonymized information behavior monitoring among internal medicine physicians and residents affiliated with the University of Alberta Department of Medicine (DOM), University of Alberta Hospital, Capital Health Region, Edmonton, Alberta, Canada. Internists and trainees care for referred patients in inpatient and ambulatory care settings. We examined data for the time period January 1, 2005 - January 31, 2006, when information systems remained stable for the target population and hospital setting.

The DOM has used computerized “desktops” to access diverse information resources since 1996, initially with the “CLINT” point-of-care informatics laboratory8 and later with the derivative “VIVIDESK” integration system (Figure 1, www.vividesk.com). The VIVIDESK desktop is an Internet-based technology that facilitates access to multiple information sources, networks and applications through a simple, customized, centrally managed and comprehensive information environment. The CCOW-enabled integration engine provides single sign-on to all resources (clinical, knowledge, teaching, safety, administrative, communication) while facilitating context-sensitive links between information resources. Information from one resource (such as a patient condition) can be automatically directed to other resources (such as an electronic textbook). Embedded “quick-searches” are available for most clinical information tools, including the electronic health record (EHR). In this way the desktop provides evidence at the point-of-care.

Figure 1. VIVIDESK Clinician Desktop – Quick-Search (screen shot)
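The context-sensitive “quick-search” behavior described above can be pictured with a short sketch. The resource names and URL templates below are assumptions for illustration only, not the actual VIVIDESK configuration; the point is simply that a term captured in one application (for example, a problem from the EHR) can be rewritten into a ready-made query against a knowledge resource.

```python
# Minimal sketch of context-sensitive "quick-search" link building.
# URL templates and resource names are hypothetical, not the real VIVIDESK setup.
from urllib.parse import quote_plus

# Hypothetical mapping from knowledge resource to a search URL template.
QUICK_SEARCH_TEMPLATES = {
    "pubmed": "https://pubmed.ncbi.nlm.nih.gov/?term={query}",
    "e_textbook": "https://example-textbook.example.org/search?q={query}",
}

def quick_search_links(clinical_context: str) -> dict:
    """Turn a term taken from the clinical context (e.g., a patient's
    problem-list entry) into ready-to-open search links for each resource."""
    encoded = quote_plus(clinical_context)
    return {name: template.format(query=encoded)
            for name, template in QUICK_SEARCH_TEMPLATES.items()}

# Example: a condition selected in the EHR becomes one-click evidence searches.
print(quick_search_links("community acquired pneumonia"))
```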

By 2005, clinician desktops were provided to all internal medicine trainees (90, PGY1–4) and to over 80% of DOM physicians (123/151). All had desktops with single sign-on to clinical (EHR, diagnostic imaging, pharmacy systems, patient tracking, etc.), decision support (drug information, patient safety, clinical calculators, decision rules, pathways, caremaps, guidelines, etc.), communications (email, telephony, paging, telehealth, etc.), education (courses, evaluations, professional development, etc.) and knowledge (Internet, references, e-Library, full-text journals, evidence repositories, etc.) resources. The desktop integrated evidence with clinical information (e.g., direct links from the electronic health record to relevant supporting evidence) while facilitating private communication among physicians about evidence and practice. Although clinicians could log on to some clinical systems without using the desktop, the only way to quickly access evidence at the point-of-care was through the desktop.

Although numerous applications were available to desktop users, only clinical information systems and knowledge-based resources were included in this study. Knowledge resources were sub-grouped using a modified hierarchy of evidence9 (a minimal grouping sketch follows the list below):

  • Systems help connect evidence with action and include clinical decision support tools, clinical practice guidelines, clinical algorithms and prediction rules, drug information databases, and patient education (e.g., Micromedex Healthcare Series, CareNotes, MedCalc 3000).

  • Synopses provide brief, refereed, standardized summaries of high-quality studies and reviews, often emphasizing the clinical utility of evidence (e.g., ACP Journal Club, Best Bets).

  • Syntheses integrate results from multiple studies, usually in the form of systematic reviews (e.g., Cochrane Library, Clinical Evidence).

  • Summaries provide background information about health conditions or interventions and include electronic textbooks and atlases (e.g., Access Medicine, Stat!Ref e-Library).

  • Studies include electronic journals and bibliographic databases providing access to the full text of original health care research (e.g., PubMed, Ovid, Ebsco, etc.).
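A minimal sketch of how monitored applications might be grouped under this hierarchy is shown below. It covers only the example resources named above; the “Clinical” default for anything unlisted is an assumption made for illustration, not a description of the full 98-resource catalogue.

```python
# Illustrative grouping of monitored applications into the modified
# hierarchy-of-evidence categories used in the analysis. Only the example
# resources named in the text are mapped here.
RESOURCE_CATEGORY = {
    "Micromedex Healthcare Series": "Systems",
    "CareNotes": "Systems",
    "MedCalc 3000": "Systems",
    "ACP Journal Club": "Synopses",
    "Best Bets": "Synopses",
    "Cochrane Library": "Syntheses",
    "Clinical Evidence": "Syntheses",
    "Access Medicine": "Summaries",
    "Stat!Ref e-Library": "Summaries",
    "PubMed": "Studies",
    "Ovid": "Studies",
    "Ebsco": "Studies",
}

def categorize(application_name: str) -> str:
    """Return the evidence category for an application.
    Anything not listed is treated as a clinical information system here,
    which is an illustrative simplification."""
    return RESOURCE_CATEGORY.get(application_name, "Clinical")
```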

All users received a core set of information resources covering all of the above categories, with custom additions for 20 different specialty and interest groups. User groups were allowed to build private collections of “internal” evidence relevant to their practice, and a private “my workspace” allowed individuals to further personalize their evidence collections. A total of 98 core and custom resources were used during the study period, with the desktop available 99.94% of the time, 24 hours a day, from hospital, clinic, office, home and University.

Information behaviors were monitored for all study participants and information resources. The desktop recorded user location, workstation identity, time of application access, duration of application focus (the actual time the user interacted with a resource) and informational context. “Point-of-care” devices were deduced from location and workstation identity data. User presence was verified every 4 minutes and, because of single sign-on to private tools such as email and evaluations, both policy and self-interest discouraged account sharing. Data collection began the moment a user opened the desktop and ended when the desktop was closed or timed out after 4 minutes of inactivity. Unlike conventional web server logs, the desktop tracked exactly what was done, for how long, and in what sequence and context, for all information resources irrespective of software vendor, server location, or any intrinsic audit capabilities. Only completed desktop sessions were included in an independent data warehouse, where confidential identifiers allowed information behaviors to be profiled anonymously.
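The kind of usage record implied by this protocol can be sketched as follows. The field names, dataclass and helper function are hypothetical; only the recorded attributes and the 4-minute inactivity rule come from the description above.

```python
# Sketch of an application-focus event as implied by the monitoring protocol.
# Field names and session_expired() are assumptions; the recorded attributes
# and the 4-minute inactivity timeout come from the text.
from dataclasses import dataclass
from datetime import datetime, timedelta

INACTIVITY_TIMEOUT = timedelta(minutes=4)

@dataclass
class FocusEvent:
    user_id: str          # confidential identifier, profiled anonymously
    workstation_id: str   # used to deduce point-of-care devices
    location: str         # e.g., "ward", "clinic", "office", "home"
    application: str      # e.g., "EHR", "ACP Journal Club"
    start: datetime
    focus_seconds: int    # time the application actually held focus

def session_expired(last_activity: datetime, now: datetime) -> bool:
    """A desktop session ends when it is closed or after 4 minutes of inactivity."""
    return now - last_activity >= INACTIVITY_TIMEOUT
```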

Two dependent variables were defined: frequency and duration of monitored resource use. Independent variables included trainee status (physician vs. resident), day of week (weekday vs. weekend), time of day (8 categories: 00:00–02:59, 03:00–05:59, 06:00–08:59, 09:00–11:59, 12:00–14:59, 15:00–17:59, 18:00–20:59, 21:00–23:59), application type (6 categories: Clinical, Studies, Summaries, Syntheses, Synopses, Systems), and location (clinical vs. non-clinical, off-site). When comparing frequency and duration distributions across levels of the independent variables, we used the non-parametric Tukey Multiple Comparisons Test (MCT) as implemented in SPSS version 13,10 and report “significant” results only if the probability of type I error is < 2.5%.
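The exact SPSS procedure is not reproduced here. As a rough stand-in, the sketch below runs non-parametric pairwise comparisons of usage distributions and flags pairs as “significant” only when p < 0.025, substituting SciPy’s Mann–Whitney U test for the Tukey-style multiple-comparison procedure used in the analysis.

```python
# Approximate re-creation of the comparison logic: nonparametric pairwise
# tests of usage distributions across groups, flagged only when p < 0.025.
# Mann-Whitney U is a stand-in for the Tukey-style MCT run in SPSS.
from itertools import combinations
from scipy.stats import mannwhitneyu

def significant_pairs(groups: dict, alpha: float = 0.025):
    """Return pairs of groups whose distributions differ at the alpha level."""
    flagged = []
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        stat, p = mannwhitneyu(a, b, alternative="two-sided")
        if p < alpha:
            flagged.append((name_a, name_b, p))
    return flagged

# Example with made-up minutes-per-encounter samples for two user groups.
example = {
    "residents": [1.5, 2.0, 2.5, 3.0, 1.0, 2.2, 4.0, 1.8],
    "internists": [5.0, 7.5, 6.0, 10.0, 4.5, 8.0, 12.0, 6.5],
}
print(significant_pairs(example))
```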

RESULTS

During the study period, 123 internists and 90 residents used monitored information resources 20,813 times, accruing 4,210 hours with knowledge-based applications. Although trainees accounted for 54% of access frequency and 47% of duration, they were about twice as likely as faculty to use point-of-care resources when usage was adjusted for study-setting exposure time (mean 2 months for residents versus 5 months for physicians): 45 events/person/month (11 hours/person/month) for residents versus 21 events/person/month (4 hours/person/month) for certified internists. Overall, 71% of usage occurred in clinical settings (point-of-care) and 29% in private offices or at home. Between the hours of 17:00 and 07:00, however, 61% of usage occurred off-site.
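The exposure adjustment amounts to dividing a group’s total events (or hours) by its person-months of study-setting exposure; the sketch below illustrates the calculation with placeholder inputs rather than the study’s actual counts.

```python
# Illustrative exposure adjustment: rate = total events (or hours) divided by
# person-months of exposure for the group. The numbers below are placeholders.
def usage_rate(total_events: float, group_size: int, mean_months_exposed: float) -> float:
    """Events (or hours) per person per month of study-setting exposure."""
    return total_events / (group_size * mean_months_exposed)

# e.g., a hypothetical group of 50 users with 3 months' mean exposure:
print(round(usage_rate(total_events=3_000, group_size=50, mean_months_exposed=3), 1))  # 20.0
```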

Diurnal Patterns

Both frequency and duration of access follow a similar diurnal pattern over the course of a day, with the most intensive resource use between 09:00 and 18:00 (Figure 2). The daily distribution of access frequency differs significantly for trainees, who initiate 41% of resource access events between 18:00 and 06:00, compared with 29% for internists in the same time period. Residents also allocate proportionately more of their total usage hours to the evening (47% of resident duration of use, compared with 30% for internists). The diurnal pattern for both access and duration differs between weekdays and weekends; on weekends, resource use is roughly constant across all time intervals between 09:00 and 23:59.

Figure 2. Frequency of Application Use by Time of Day, All Users
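The binning behind Figure 2 can be sketched as follows; the event representation is assumed, while the eight 3-hour bins and the weekday/weekend split come from the Methods.

```python
# Sketch of the diurnal binning behind Figure 2: each access event is placed
# in one of eight 3-hour time-of-day bins, split by weekday vs. weekend.
from collections import Counter
from datetime import datetime

def time_bin(ts: datetime) -> str:
    """Map a timestamp to its 3-hour bin, e.g. '09:00-11:59'."""
    start = (ts.hour // 3) * 3
    return f"{start:02d}:00-{start + 2:02d}:59"

def diurnal_counts(timestamps: list) -> dict:
    """Count access events per time-of-day bin, separately for weekdays and weekends."""
    weekday = Counter(time_bin(t) for t in timestamps if t.weekday() < 5)
    weekend = Counter(time_bin(t) for t in timestamps if t.weekday() >= 5)
    return {"weekday": weekday, "weekend": weekend}
```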

Information Type

We observed significant differences in the diurnal distribution of the frequency and duration of application use for six categories of information resources (Figure 3).

Figure 3. Diurnal Distribution of Application Use by Resource Type

As expected, use of clinical information systems (laboratory results, radiographs, patient lists) peaks at the beginning of the working day, with persistently high usage during typical hours for sign-in rounds, ward rounds, clinic hours, and sign-out rounds. Systems (clinical decision support) usage peaks towards the end of the working day. Syntheses and Synopses closely resemble the distributions for decision support tools (Systems), but Studies and Summaries are proportionately more likely to be used in the early and late evening. Whereas internists were significantly more likely to use Studies and Summaries (60% of their total use, versus 31% for residents), trainees were significantly more likely to use evidence-based Synopses and Syntheses (25% of their total usage, versus 12% for established internists), a difference accentuated when duration of use is analyzed, when residents are off-site, and during non-working hours.

Duration of Information Episode

Considering all applications, 55% of individual application episodes are completed within 5 minutes and 44% within 3 minutes. If more than 5 minutes is spent on an application, it is equally likely to be used for any duration between 10 and 50 minutes, after which usage rapidly tails off. Only 3% of episodes last longer than an hour.

The time spent per application encounter varies by time of day and day of week (users spend significantly more time per application in the late evening and on weekends), trainee status (trainees spend significantly less time per application than internists) and application type. Application encounters can be collapsed into two categories (less than 5 minutes, and 5 minutes or more), highlighting differences between types of evidence resources (Figure 4). Whereas encounters with Syntheses, Synopses and Systems are significantly more likely to take less than 5 minutes, encounters with Studies and Summaries usually take 5 or more minutes. This pattern is accentuated during busy weekdays, when short encounters with Syntheses, Synopses and Systems predominate, and at night, when significantly longer episodes of Studies and Summaries access are observed.

Figure 4. Information Episode Duration by Resource Type
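The two-category collapse used in Figure 4 can be sketched as follows; the episode representation (resource type, minutes of focus) is an assumption for illustration.

```python
# Sketch of collapsing episode durations into the two categories used in
# Figure 4 (< 5 minutes vs. >= 5 minutes), tallied per resource type.
from collections import defaultdict

def collapse_durations(episodes: list) -> dict:
    """episodes: iterable of (resource_type, minutes_of_focus) pairs."""
    counts = defaultdict(lambda: {"short (<5 min)": 0, "long (>=5 min)": 0})
    for resource_type, minutes in episodes:
        key = "short (<5 min)" if minutes < 5 else "long (>=5 min)"
        counts[resource_type][key] += 1
    return dict(counts)

# Example with made-up episodes:
print(collapse_durations([("Synopses", 2.0), ("Studies", 12.5), ("Systems", 1.2)]))
```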

DISCUSSION

The study summarizes observations about how clinical information systems, decision-support tools and evidence resources were used over a 13-month period by internal medicine clinicians and trainees. Although integrated information environments were used in our setting prior to this study, the 2005 calendar year offered optimal conditions for studying what busy clinicians do with health information. There was consistent access to fully integrated clinical and knowledge-based information resources at the point-of-care; no new applications, interventions or services were introduced; and all hardware, security and access protocols were implemented at inpatient, outpatient and off-site facilities. A single point of access for all software allowed automated data collection using a predefined protocol.

This paper examines basic data gathered about when, where and for how long diverse resources were used by physicians and trainees. The large number of user-application encounters (>20,000) and many hours (>4,200) of usage for 213 clinicians allowed detection of statistically significant and clinically important patterns of information use.

That clinical information and decision-support systems are most used between the hours of 06:00 and 18:00 is of little surprise. Other observations may have operational implications for health facilities promoting evidence-based decision-making:

Evidence use is not restricted to either the place or time of clinical work

Evidence repositories, particularly research collections and summaries, are most used in the early and late evening and on weekends, usually outside clinical facilities. The point of questioning and the point of reflection appear separated in time and place, suggesting that evidence-based decision-makers need informational support beyond the usual time-space “borders” of intranets.

Brevity begets value at the point-of-care

When high-quality evidence is no more than 5 seconds or 5 clicks away from clinical information, clinicians appear to favor brief (< 3 minute) visits to Synopses, Syntheses and Systems, all of which pre-filter for valid, important and applicable evidence. This suggests that health organizations may benefit from investment in instantly accessible filtered evidence at the point-of-care, emphasizing scalable licenses that support many simultaneous but brief encounters.

Promoting filtered evidence resources may improve clinician uptake

University of Alberta residents are taught about filtered evidence repositories within the first few months of training, which may enhance their apparent preference for Systems, Synopses and Syntheses. Having shifted EBM training to online self-directed modules, we will test this hypothesis by correlating information behavior changes with online learning interventions.

A number of methodological limitations preclude further conjecture about the observed results. First, desktop users were able to customize the integrated information environment at group, interest and personal levels, so exposure to specific resources was not uniform across users. A larger and more diverse clinician population, observed over longer periods in multiple settings, would allow validation of the information behaviors reported here while supporting comparisons among subgroups. As clinician desktops disseminate through our and others’ health regions, we look forward to analysis of larger datasets.

Second, in the absence of user surveys, interviews or focus groups, we do not know whether information use correlates with information satisfaction. We do know that both internists and trainees demonstrated continuous use throughout the 13-month study period, and we surmise that unrewarding or frustrating evidence access would have discouraged continuing usage.

Third, our internist group includes many clinician-teachers and clinician-researchers, which may explain their relative preference for evidence from Studies and Summaries. Other clinicians in other settings may have usage peaks for clinical or knowledge resources at different times of day, influenced by such things as the timing of rounds and discharges. However, we expect that peaks will occur and that diurnal variation in resource use remains relevant to information resource planning.

Finally, in the absence of health process or outcomes data, we continue to assume that point-of-care access to high-quality evidence is a good thing. Evidence was not forced upon participants in this study: there were no automated reminders, alerts or other intrusive prompts to decision support. Any and all evidence access required an explicit clinician request. However, the simplicity of that act (e.g., clicking on an unfamiliar concept and then selecting the type of evidence desired) makes it easier for clinicians to seek evidence in support of clinical questions. The relatively high levels of sustained evidence use, particularly frequent, short, point-of-care interactions with high-quality knowledge resources, suggest that evidence can be integrated with clinical workflow and that health institutions may benefit from investments in actionable evidence: Systems, Synopses and Syntheses. Having accomplished this, the next step is to discover which resources most promote best practices.

Acknowledgements

The authors wish to thank Ms. Donna Strating for support and facilitated access to integrated information environments in Capital Health Region facilities; and the Department of Medicine and the JWS Health Sciences Library at the University of Alberta for their long-standing promotion of evidence-based health informatics.

REFERENCES

1. Covell DG, Uman GC, Manning PR. Information needs in office practice: are they being met? Ann Intern Med. 1985;103(4):596–9. doi: 10.7326/0003-4819-103-4-596.
2. Smith R. What clinical information do doctors need? BMJ. 1996;313(7064):1062–8. doi: 10.1136/bmj.313.7064.1062.
3. Guyatt G, Rennie D, editors. Users' Guides to the Medical Literature: A Manual for Evidence-Based Clinical Practice. Chicago: American Medical Association; 2002.
4. Sackett DL, Straus SE. Finding and applying evidence during clinical rounds: the "evidence cart". JAMA. 1998;280(15):1336–8. doi: 10.1001/jama.280.15.1336.
5. Currie LM, Graham M, Allen M, Bakken S, Patel V, Cimino JJ. Clinical information needs in context: an observational study of clinicians while using a clinical information system. AMIA Annu Symp Proc. 2003:190–4.
6. Dawes M, Sampson U. Knowledge management in clinical practice: a systematic review of information seeking behavior in physicians. Int J Med Inform. 2003;71(1):9–15. doi: 10.1016/s1386-5056(03)00023-6.
7. Forsythe DE, Buchanan BG, Osheroff JA, Miller RA. Expanding the concept of medical information: an observational study of physicians' information needs. Comput Biomed Res. 1992;25(2):181–200. doi: 10.1016/0010-4809(92)90020-b.
8. Van Wingerden PL, Hayward RS, Langton KB, Carey TT. CLINT: a clinical informatics system for an internal medicine ward. Medinfo. 1995;8 Pt 1:602–5.
9. Haynes RB. Of studies, syntheses, synopses, and systems: the "4S" evolution of services for finding current best evidence. ACP J Club. 2001;134(2):A11–3.
10. Games PA. Multiple comparisons of means. Am Educ Res J. 1971;8:531–565.
