AMIA Annual Symposium Proceedings. 2008;2008:26–30.

A Rapid Assessment Process for Clinical Informatics Interventions

Joan S Ash 1, Dean F Sittig 1,2, Carmit K McMullen 2, Kenneth Guappone 1, Richard Dykstra, James Carpenter 3
PMCID: PMC2656056  PMID: 18999075

Abstract

Informatics interventions generally take place in rapidly changing settings where many variables are outside the control of the evaluator. Assessment must be timely so that feedback can instigate modification of the intervention. Adapting a methodology from international health and epidemiology, we have developed and refined a Rapid Assessment Process (RAP) for informatics while conducting a study of clinical decision support (CDS) in community hospitals. Using RAP, we have not only been able to provide implementers with actionable feedback, but we have also discovered that users and informaticians conceptualize CDS in vastly different ways. Further understanding of this difference will be needed if we are to improve CDS acceptance by users.

Introduction

Clinical informatics interventions such as implementation of computerized provider order entry (CPOE) with clinical decision support (CDS) are moving evaluation targets: they are continuously changing as software and content are updated1. Although the ultimate goal is to improve patient care, and most studies of CDS have therefore assessed outcomes2–4, these studies do not explain why the systems are successful or not, nor do they provide feedback for iterative system improvements. Formative evaluation methods using naturalistic designs have rarely been used for CDS assessment, yet they can best discover how and why systems succeed or fail. Kaplan has noted that “these omissions are impoverishing our understanding of CDSS”5, p.22. The Provider Order Entry Team (POET) at Oregon Health & Science University in Portland, OR, is conducting such a naturalistic study of CDS in community hospitals, with the dual purposes of identifying barriers and facilitators to CDS implementation and of refining research methods for efficiency. We broadly define computerized provider order entry as a system that allows a decision maker to directly enter medical orders via computer, and clinical decision support as “passive and active referential information as well as reminders, alerts, and guidelines”6, p.524.

Qualitative methods are well suited to investigating the “why” questions, yet traditional ethnographic approaches involve lengthy periods of fieldwork7. We often need answers to evaluation questions quickly, while there is still an opportunity to take action and modify the course of the intervention. This ability to respond appropriately and in a timely way is especially important in informatics, where patient safety can be threatened by unintended consequences. A generalizable method of inquiry that can help to rapidly identify and assess a situation is desirable for both research and application purposes. A rapid ethnographic approach therefore seems highly applicable to informatics.

Traditional ethnography takes time because researchers must develop cultural competence and knowledge and build rapport and trust7. Rapid methods use several techniques to expedite this process: data are collected and analyzed by teams; insiders who know the culture are included as team members; and the focus is quite narrow and problem-oriented. Rapid ethnographic assessment using a mix of qualitative and quantitative methods has been used effectively in the public health arena to develop intervention programs for nutrition and primary health care8 and HIV/AIDS9. Also called quick ethnography or the Rapid Assessment Process (RAP) by some authors7,10, it is a way of gathering, analyzing, and interpreting high quality ethnographic data expeditiously so that action can be taken as rapidly as possible. The Rapid Assessment, Response, and Evaluation Project (RARE) has been especially well documented, with manuals available to guide investigators11,12. Another tactic for expediting the process is consistent use of structured tools across field sites while observation and interviews continue to yield high quality data. RAP encompasses many of the methods POET has used in past studies, as well as others13,14. It relies on a team approach that includes those inside the organization as well as the researchers, streamlines the data collection, analysis, and interpretation processes, involves less time in the field, and provides feedback to internal stakeholders. It depends heavily on triangulation of both qualitative and quantitative data. Tools for data collection include 1) site inventory profiles, 2) ethnography guides, 3) interview question guides, and 4) rapid survey instruments. For this study, our research question is: How can RAP be adapted to identify barriers and facilitators to implementing clinical decision support in community hospitals?

Methods

Site selection

We define community hospitals as inpatient facilities that are not members of the Association of American Medical Colleges Council of Teaching Hospitals, meaning that they have private physicians treating most patients. We selected two community hospitals in different states with different commercial systems, one with a two-year history of CPOE use and one with a much longer history of use.

Selection of methodological approaches

RAP differs from prior POET methods in several ways: use of a preliminary Site Inventory Profile instrument allows researchers to target questions and observations; the semi-structured interviews are less oral-history-oriented and involve two interviewers; short structured surveys augment the observation and interview data; and observations are more focused and include informal interviewing using planned questions. RAP can be accomplished in several days' to several weeks' time. Our plan was to spend three intensive days in the field followed by approximately one month of analysis.

Development of the field manual

We began by developing a CDS in Community Hospitals Field Manual, which included: a Site Inventory Profile/CDS and Knowledge Management Assessment Tool; an Interview Guide with a list of questions outlining areas to be covered during formal semi-structured interviews; a schedule for each site visit outlining work for the three-day period; an Observation Guide including informal questions; and a Field Survey. Table 1 shows a few of the questions included in the Site Inventory Assessment Tool, which has been under development for the past year15. Table 2 lists areas covered during the formal semi-structured interviews. The Observation Guide (not shown here) included a list of foci and informal questions designed around the Site Inventory results. For example, a researcher may notice a clinician interacting with a specific CDS module and ask: “Can you tell me what you think of this feature? Is this the way you usually use it? What would you like to change?” Or, if the clinician does not use a feature the researcher knows is available, the researcher might ask why it is not used and how it could be more useful.

Table 1.

Site Inventory Profile Tool: Sample of Areas Covered

Hospital characteristics such as number of staffed inpatient beds
CPOE system information such as vendor and time since first unit go-live
Hospital locations with CPOE and percent of units with CPOE
Order entry system attributes such as availability of different types of medications and therapeutics, diagnostic tests, and coded clinical data
Clinical decision support types available such as subsequent or corollary orders, context-sensitive information retrieval, order sets, etc.
CPOE-related applications available such as an electronic medication administration record (e-MAR), bar code medication administration (BCMA), etc.
CDS-related personnel support including a chief medical information officer, chief nursing informatics officer, etc.
CDS-related organizational support available such as multidisciplinary CPOE/CDS oversight committees

Table 2.

Formal Interview Guide

Culture: What seems to be the motivation for CDS? What are the cultural barriers and facilitators here? How have attitudes towards CDS shifted over the years?
Control, autonomy, trust: What is the organizational structure (either formal or informal) that relates the quality and IT groups? How do they relate to clinical staff? What are the clinical priorities? Who sets the clinical priorities? How stable is this staff? Who is on CDS committees and why? How do CDS-related committees interact with one another? How do the committees communicate with users? How have they changed over the years? In your estimation, who holds the power here?
Cognition, emotions: What are the barriers and facilitators to use? What is the training for CDS like? How do clinicians keep up to date about CDS? How do people feel about CDS?
Content: Where does the organization get its clinical decision support logic from? How customized is the CDS and who does it? How often is the clinical content reviewed? What would motivate this hospital to share its content with others? What was implemented when and why?
Human-computer interface: What are the issues surrounding presentation of CDS to clinicians?

Likewise, the Field Survey is tailored to each site depending on the CDS modules available and on local names given to different features. Questions cover usage, perceptions of CDS, awareness of a CDS committee, involvement of clinicians with development of CDS, communication about new CDS, and training and support. This short, structured survey, administered as an interview, is intended to help us gather information from a wider range of users than those formally interviewed or observed.

Preparation for site visits

Experience has taught us that careful preparation prior to entering the field is a timesaver in the long run. With the help of a local principal investigator/sponsor, we made appointments for interviews and arrangements for on-site observation well before we arrived. Sponsors also assisted us in completing the Site Inventory Profile and the IRB paperwork for each site, a process for which we allowed at least three months. This study received human subjects approval from Oregon Health & Science University and each individual study site.

Subject Selection

Sample of informants: Informants were purposively selected according to role and relevant knowledge about CDS and included, for example, chief medical information officers; clinician users, including physicians, nurses, and pharmacists; quality assurance staff; information technology staff members; and in-house vendor staff. For selection of clinicians, we deliberately sought out skeptics as well as champions and average users by asking each interviewee for suggestions, using a snowball technique.

Recruitment:

The local sponsor invited each selected informant to participate, and the principal investigator then followed up with detailed information and scheduling. Informants were given small thank-you gifts such as coffee cards.

Data collection:

Data collection took place over three days at each site, though we also conducted some follow-up interviews, sometimes by phone. Early on Day 1, we were given a demonstration of the system, which was especially useful for learning the local jargon related to the systems. Interviews were conducted by pairs of researchers and recorded, and brief field notes were written during the interviews so that some notes would be immediately available for preliminary analysis, because transcription can take several weeks. Four other researchers were on the floors conducting observations and informal interviews, and a doctoral student was stationed in an appropriate common gathering place (e.g., the physicians’ lounge) to conduct the field survey. We held debriefings twice a day so that plans could be continuously modified. With seven researchers, we conducted close to 15 formal interviews and 40 hours of observation of individuals or units at each site. We also attended meetings of CDS-related committees at both sites. Each site visit ended with a team debriefing that included the local principal investigator/sponsor. We were able to conduct approximately 15 Field Survey interviews at each site.

Data management:

Interviews were transcribed by professional qualitative research transcriptionists. Field notes, taken by hand on site, were expanded and put into electronic form by the researchers. Files were entered into N6, formerly QSR NUD*IST (QSR International, Doncaster, Victoria, Australia).

Data analysis:

To expedite analysis, each researcher listened to assigned interview recordings, taking notes about our identified foci, and reviewed everyone’s field notes; each researcher was then assigned specific topics to summarize. These topics included user perspectives, administrative perspectives, technology issues, and barriers and facilitators. Case reports were written for each site, and comments were solicited from those inside the organizations, generating some changes. These case studies will form the basis for a comparative analysis of the data.

The interpretive process was both iterative and flexible. Discussions during on-site debriefings, careful formal data analysis, and “member checking”16, a qualitative technique that further establishes the trustworthiness of results by asking insiders for feedback, provided productive and continuous opportunities for interpretation.

Results

The Site Inventory Profile results were tremendously helpful in our site visit planning, and the instrument needed little modification. The Observation Guide was modified for each site several times. The formal Interview Guide also evolved as we learned local terminology for systems and units and as we made discoveries we wanted to investigate further. We found that we needed to make major changes to the Field Survey when we discovered that its questions were inappropriate given what we learned about the local context and culture. Our sense is that, by triangulating data from this variety of sources and by preparing so carefully for visits, we reached saturation at each of the sites within the targeted time period.

Lessons learned about methods

While the sponsors’ assistance was crucially important for initially introducing us via electronic mail to potential interviewees, we found that at each site we also needed an onsite “shepherd,” someone who could walk us to units and provide introductions prior to observing. We were fortunate in gaining the assistance of a skilled, locally well-known and well-liked CPOE trainer at each hospital. These individuals knew the users and the facilities well, had access to on-call schedules, were up to date, and were trusted by the clinicians. We found that half-hour formal interviews are generally sufficient, that attending committee meetings yields rich data, and that observing with foci in mind still allows researchers to gain a sense of the context surrounding CPOE and CDS.

We also found that although RAP techniques are efficient and effective, they take their toll on the researchers during fieldwork. Periods of observation were particularly stressful because researchers were under great pressure to be in the right place at the right time to see activities relevant to CDS. Also, the logistics of conducting five interviews a day in different hospital and clinic locations were sometimes complex.

Discoveries about CDS

It became apparent during our first debriefing on Day 1 at our first site that our concept of CDS, which reflects that of informaticians (two of us have helped to write books about CDS), is vastly different from that of users. In fact, we learned immediately that we should not use the term “clinical decision support” at all except with individuals who have informatics training. Although we defined CDS broadly, our definition was not broad enough. Users view decision support as anything that guides them throughout the ordering process, and this includes what we call interface issues. The higher-level CDS functions, especially alerts, are often viewed as unpleasant annoyances. A common complaint was “there’s too many [darn] clicks to do anything” when alerts had to be overridden. However, simple guidance, such as that offered by consistent and predictable screen layouts that allow users to know where to look for certain values, is highly regarded. Also, we found that from the point of view of users, CDS is inseparable from CPOE, which itself is viewed as inseparable from the computer system in general. Even interviewees involved in CDS had vastly differing views of it. One interviewee, whose title was Manager of Clinical Decision Support, described her role: “I oversee external reporting, registries, and core measures.” This person gets reports from the system concerning clinical outcomes, but has nothing to do with assisting clinicians in their decision making.

We also discovered that, when one tries to understand the many varieties of CDS described by users, a complex picture emerges. The types they define lie along a continuum ranging from low-level workflow support, to stronger workflow support, to different gradations of assistance with making cognitive decisions at different points in the ordering process. During observation periods, users even identified several new types of CDS we had not considered before, such as TallMan lettering and sound-alike medication warnings.

Discussion

RAP served us well in this study of CDS in community hospitals. Thorough preparation, especially careful consideration of the Site Inventory Profile results, knowledge of the information system features gained through in-depth system demonstrations, and the enhanced flexibility afforded by numerous debriefing sessions while in the field were especially productive. We were able to offer useful, objective assessment feedback to our sponsors within a short period of time. We were also able to gain deep insight into the nature of CDS and its meaning to users. RAP is not intended to replace longer-term, more traditional ethnographic fieldwork, but it appears to be highly suitable for assessing the rapidly changing contexts within which informatics interventions exist. Community hospitals depend on their medical staff to bring in patients; they cannot afford to alienate these clinicians and therefore tend to move slowly into CPOE and CDS. Much of the value of CPOE cannot be realized without CDS, however. By improving our understanding of which aspects of CDS are most valued by physicians, we hope to foster development of meaningful and highly acceptable CDS in these environments.

Acknowledgments

This work was supported by grant LM06942 and training grant 2T15LM007088 from the National Library of Medicine. Special thanks go to Eric Pifer, M.D., Emily Campbell, M.S., R.N., and Joshua Richardson, M.S., M.L.I.S.

References

1. Ash JS, Stavri PZ, Kuperman GJ. A consensus statement on considerations for a successful CPOE implementation. J Am Med Inform Assoc. 2003;10(3):229–234. doi: 10.1197/jamia.M1204.
2. Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review. JAMA. 1998;280(15):1339–46. doi: 10.1001/jama.280.15.1339.
3. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765.
4. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, Sam J, Haynes RB. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10):1223–38. doi: 10.1001/jama.293.10.1223.
5. Kaplan B. Evaluating informatics applications—clinical decision support systems literature review. Int J Med Inform. 2001;64:15–37. doi: 10.1016/s1386-5056(01)00183-6.
6. Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003;10(6):523–30. doi: 10.1197/jamia.M1370.
7. Beebe J. Rapid Assessment Process: An Introduction. Walnut Creek, CA: AltaMira Press; 2001.
8. Scrimshaw SCM, Hurtado E. Rapid Assessment Procedures for Nutrition and Primary Health Care: Anthropological Approaches to Improving Programme Effectiveness. Los Angeles, CA: UCLA; 1987.
9. Needle RH, Trotter RT, Goosby E, Bates C, von Zinkermagel D. Crisis Response Teams and Communities Combat HIV/AIDS in Racial and Ethnic Minority Populations: A Guide for Conducting Community-Based Rapid Assessment, Rapid Response, and Evaluation. Washington, DC: DHHS; 2000.
10. Handwerker WP. Quick Ethnography. Walnut Creek, CA: AltaMira Press; 2001.
11. Trotter RT, Needle RH, Goosby E, et al. A methodological model for rapid assessment, response, and evaluation: the RARE program in public health. Field Methods. 2001;13:137–159.
12. Trotter RT, Needle R. RARE Field Team Principal Investigator Guide. Washington, DC: DHHS; 2000.
13. Ash JS, Smith AC, Stavri PZ. Interpretive or qualitative methods: subjectivist traditions responsive to users. In: Friedman CP, Wyatt JC, editors. Evaluation Methods in Medical Informatics. 2nd ed. Springer-Verlag; 2005.
14. Ash JS, Sittig DF, Seshadri V, Dykstra RH, Carpenter JD, Stavri PZ. Adding insight: a qualitative cross-site study of physician order entry. Int J Med Inform. 2005;74:623–628. doi: 10.1016/j.ijmedinf.2005.05.005.
15. Sittig DF, Thomas SM, Campbell E, et al. Consensus recommendations for basic monitoring and evaluation of in-patient computer-based provider order entry systems. Proceedings of the Conference on IT and Communications in Health; Victoria, BC, Canada; Feb 2007.
16. Crabtree BF, Miller WL, editors. Doing Qualitative Research. 2nd ed. Thousand Oaks, CA: Sage; 1999.
