AMIA Annual Symposium Proceedings. 2011 Oct 22;2011:80–87.

Studying the Vendor Perspective on Clinical Decision Support

Joan S Ash 1, Dean F Sittig 2, Carmit K McMullen 3, James L McCormack 1, Adam Wright 4,5,6, Arwen Bunce 1, Joseph Wasserman 1, Vishnu Mohan 1, Deborah J Cohen 1, Michael Shapiro 1, Blackford Middleton 4,5,6
PMCID: PMC3243293  PMID: 22195058

Abstract

In prior work, using a Rapid Assessment Process (RAP), we have investigated clinical decision support (CDS) in ambulatory clinics and hospitals. We realized that individuals in these settings provide only one perspective related to the CDS landscape, which also includes content vendors and electronic health record (EHR) vendors. To discover content vendors’ perspectives and their perceived challenges, we modified RAP for industrial settings. We describe how we employed RAP, and show its utility by describing two illustrative themes. We found that while the content vendors believe they provide unique, much-needed services, the amount of labor involved in content development is underestimated by others. We also found that the content vendors believe their products are resources to be used by practitioners, so they are somewhat protected from liability issues. To promote adequate understanding about these issues, we recommend a “three-way conversation” among content vendors, EHR vendors, and user organizations.

Introduction

Computerized provider order entry (CPOE) with clinical decision support (CDS) can help to improve patient care,1–3 but implementing CPOE can be difficult and acceptance of CDS is often largely responsible for that difficulty.4,5 We broadly define CPOE as a system that allows a decision maker to directly enter medical orders via computer, and clinical decision support as “passive and active referential information as well as reminders, alerts, and guidelines.”6, p. 524 Most studies of CDS have taken place in academic settings in hospitals with locally developed systems and CDS crafted by knowledge engineers who are researchers. With the national impetus to implement CPOE with CDS assisted by incentives funded by the American Recovery and Reinvestment Act (ARRA),7 however, non-academic community hospitals are increasingly purchasing commercial electronic health record (EHR) systems which must include CPOE and CDS to achieve meaningful use goals. Effective use of CDS in these settings is problematic for many reasons, chiefly because CDS needs to be tailored to fit the local situation.8 For the past three years, the Provider Order Entry Team (POET) of researchers based primarily at Oregon Health & Science University has been investigating barriers and facilitators to CDS use in a variety of settings including community hospitals. Because community hospitals buy commercial systems with some CDS embedded in the EHR and some purchased separately, they employ analysts to tailor the CDS with assistance from clinician subject matter experts.9 These hospitals are therefore heavily dependent on the commercial sector for basic CDS content. Our prior investigations indicated that analysts desire more help from vendors and products better suited to their needs. To learn about the perspective of the vendors who supply these products, we felt it necessary to study them ethnographically just as we had studied hospitals and clinics.

Qualitative methods are appropriate for investigating “why” questions, but traditional ethnographic methods involve long periods of fieldwork.10 In informatics, we often need to answer research evaluation questions quickly while we still have the opportunity to take action and modify the direction towards which we are heading. A standard method of inquiry that can help to rapidly identify and assess a situation is desirable for both research and application purposes. A rapid ethnographic approach therefore seems highly applicable to informatics. The Rapid Assessment Process (RAP)10,11 is a method for gathering, analyzing, and interpreting ethnographic data effectively and quickly so that action can be taken rapidly. The chief mechanism for expediting the process is the consistent use of structured tools, gathered into a field manual, across field sites. RAP relies on a team approach including those inside the organization as well as outside researchers. RAP also streamlines the data collection, analysis, and interpretation processes; it involves less time in the field; and it provides feedback to internal stakeholders. It depends heavily on triangulation of data from different sources. The field manual generally includes 1) site inventory profiles, 2) observation guides, 3) interview question guides, and 4) rapid survey instruments.

We have been refining our naturalistic research methods over time to make them more efficient for the dual purposes of 1) studying informatics interventions that are in continuous flux and 2) making the methods more easily adopted by other evaluation researchers.10,11 Having studied seven inpatient hospitals using RAP, we then modified our techniques to investigate five ambulatory settings. Further modification for the vendor setting was necessary primarily because we could not observe clinical processes. This modification of RAP would not only aid in our new CDS content vendor inquiry, but would also illustrate transferability to other potential non-clinical organizations, such as insurance carriers or law firms.

Thus, for this study, our research question is: How can RAP be further adapted for the purpose of learning the CDS content vendor view of the CDS landscape?

Methods

Refining RAP: Exploring the methods and ethics of industrial ethnography

To refine RAP, we first conducted a literature search to identify the range of qualitative methods that have been used to study business. We identified “industrial ethnography” as an interpretive qualitative methodology used to study corporate organizations. We discussed this method with experts to discover how we might apply elements of industrial ethnography to RAP. Often companies themselves commission this type of ethnography for internal studies, which entails a rather different approach and level of entrée. Many of the published papers we found were less about methods and more about ethics. These were helpful to us because, although we have gained IRB approval not only at our own institutions, but at over 20 of our study sites, we now needed to be cognizant of additional more subtle issues than those of concern to IRBs. For example, we always promise subjects that they will not be identified in reports of our findings, but we generally name our study sites so that readers can be apprised of background information about the history, culture, and information systems. Companies might not want us to reveal their names, however, so we planned on discussing this issue with contact people at each vendor site before data collection began. We also knew we needed to be more sensitive about intellectual property, an issue that has been addressed in other research about university-industry relationships.12

We found excellent guidance about conducting industrial fieldwork in a series of papers introducing the notion of “studying up.”13,14 In anthropological terms, this means that instead of studying non-industrial cultures and people unlikely to read the results of these studies, ethnographers study subjects more like themselves. One of these papers, by Diana Forsythe,14 focuses on informatics and describes how she adapted to studying medical informatics in universities after conducting more classical fieldwork in non-scientific settings in other countries. In her case, she worked for and with those she was studying, a strategy that raised ethical and political issues. Other authors of papers on “studying up”12,13 describe our situation, that of outsiders entering private companies. They describe problems of access, difficulty in gaining permission to observe, and the need for creative strategies. They also state that participant observation through periods of naturalistic observation and informal interaction, a hallmark of ethnography, may be unsuitable or infeasible when studying up. As noted by one author, “participant observation is a research technique that does not travel well up the social structure.”13, p.115 In studying “elite” settings such as corporations, authors have suggested that ethnographers deemphasize participant observation in favor of “polymorphous engagement,”14, p.116 meaning other techniques that may even be virtual, such as telephone and e-mail interviewing. Taking these lessons from industrial ethnography and the anthropology of “studying up,” we modified each step in RAP for studying CDS content vendors. We describe the changes below.

Development of a new Field Manual for using RAP in an industrial setting

For our clinical sites, our field manual included a “site inventory profile,” which was a checklist of kinds of CDS and other related factors such as knowledge management processes.10 Completed in advance by our local principal investigator at each site, it helped us to develop our interview questions and lists of foci for observations. For the vendor fieldwork, we decided a site inventory was not appropriate. Instead, we gathered background information on companies through web searches of company sites available to the public so that we would be familiar with their products. We also conducted phone discussions with a contact person within the site to learn more about the company and to mutually identify individuals to interview. For clinical sites, we also experienced demonstrations of their EHR systems, and in a similar way we arranged to have demonstrations of these vendors’ products prior to visits. Our field manual always includes an interview guide including questions we want to have answered. For each individual, we select the most appropriate questions depending on his or her role. The field manual also normally includes a fieldwork observation guide that we use for writing fieldnotes. It briefly notes foci for fieldwork at that particular site. We explored with experts the wisdom of conducting formal observational fieldwork and because of their responses and our reading, we decided against it. In addition to concern about access, our experts thought that watching knowledge workers in their cubicles would not be fruitful. We did, however, decide to ask for tours of facilities to help us learn about the work environment, and we also wrote observational fieldnotes during interviews. Finally, we often include in the field manual a short survey instrument for interview surveys we conduct with a broad array of staff on site to augment what we learn during interviews and observations. 
However, the purpose of the survey, which is to gather user views of clinical systems, is not appropriate for industrial ethnography, so we decided not to use that strategy.

Human Subjects Protection and Confidentiality

Companies such as those we were approaching do not have IRBs, so we did not need to go through that process at each site since we were covered by our home institution’s IRB. OHSU approved a modification of our study protocol to permit our visiting three vendor sites. We expected we would be asked to sign non-disclosure agreements, however, and we were.

Site selection

We purposively selected three different types of content vendors so that we could gain a broad view. Every community hospital we have studied purchases order sets, medication information, and clinical information reference resources, so we approached three companies providing those products. These were Zynx Health in Los Angeles, CA, First DataBank in South San Francisco, CA, and UpToDate in Waltham, MA. We were fortunate that when we approached leadership within these companies and explained our goals, we were greeted with immediate cooperation. This is likely because the CEOs are informatics-oriented and healthcare professionals. In addition, we shared samples of our prior publications with them so they could understand our strategy for aggregating results in what we believe is a nonthreatening way. Information for the company descriptions below was gained primarily through review of their customer Web sites.

Zynx Health, a wholly owned subsidiary of Hearst Publishing, is located in Westwood, CA. Zynx Health has approximately 200 employees, including 20 physicians and 20 registered nurses. We were told that approximately 50% of all U.S. hospitalized patients are cared for by Zynx clients. Zynx provides roughly 1,000 evidence-based order sets to customers, who are expected to customize them so that they become locally appropriate. In addition to building the order sets based on evidence in the literature, Zynx offers a computer-based tool to help with the customization and management of order sets called AuthorSpace.

First DataBank, also a wholly owned subsidiary of Hearst Publishing, is located in South San Francisco, CA. First DataBank has employees at this location, in Indianapolis, and outside the United States. First DataBank’s product offerings include drug databases and decision support tools, content integration software that facilitates embedding this information into the EHR, and drug reference products. Their customers include retail pharmacies, governmental agencies, pharmaceutical manufacturers, electronic health record vendors, healthcare delivery systems, and clinicians. For this study, we focused on those customers (i.e., clinicians, healthcare delivery systems, and EHR vendors) and product offerings (i.e., embedded medication reference databases) associated with the provision of real-time, point-of-care, medication-related CDS. To be more specific, this CDS is “embedded” within, and delivered through, existing electronic health record products.

Our third study site, UpToDate, began in 1992 as an electronic textbook designed to answer specific clinical questions. The Web site notes that it is an “evidence-based peer reviewed information resource available via the Web, desktop/laptop computer and mobile device.” The UpToDate community has “over 4,400 leading physicians, peer reviewers and editors and over 400,000 users.” Customers are either individual or institutional subscribers. The company is international. The resource includes 8,300 topics in 16 specialties and over 97,000 pages of text. Research indicates there is a strong association between hospitals having UpToDate and the quality and efficiency of health care.15 UpToDate was purchased in 2008 by Wolters Kluwer Health. Wolters Kluwer also owns ProVation Order Sets, which can now include UpToDate decision support.

Subject selection

Subject selection differed from our past methods in that normally, at clinical sites, we seek out clinical champions, normal users, and skeptical users in addition to CDS experts. For the vendor visits, we asked to interview individuals in particular roles including the CEO, vice presidents of sales or marketing, other vice presidents who could likely answer our questions, content development and management staff, technical/interoperability staff, and informaticians. We sought interviewees at different levels in the organizations so that we could talk with some who were close to the customers and had more technical backgrounds in addition to the leaders who could describe the history and mission of the organizations. We also asked our contact person to make it clear to the sales staff members we were interviewing that we did not need sales information, since their natural tendency is to supply it; we wanted answers to our questions rather than a sales pitch.

Data collection

RAP interviews are generally semi-structured interviews so that there is a definite focus, but the interviewee is allowed some freedom to elaborate on topics when answering questions and the interviewer can pursue interesting and relevant statements the interviewee makes spontaneously. While using RAP in content vendor organizations, we developed tailored interview guides for individuals depending on their roles in the organization. Table 1 includes a list of all possible questions from which we chose the most appropriate. We also adjusted our questioning during the interview so that we could probe intriguing topics. Because we were not conducting observations aside from site tours and observations during interviews and because we believed that we could gather enough background information ahead of time, we decided that we did not need as much time or as many researchers in the field as we would for clinical site visits. During hospital or outpatient site visits, we routinely spend a concentrated week of 10–12 hour days in the field with six to eight researchers. For the vendor visits, we opted to interview everyone in one day, which was feasible because all interviewees were located in the same building (or could be interviewed by phone from the buildings) and we could stay in one conference room and have interviewees come to us. We decided to send three researchers with different skill sets to ensure that the clinical and technical as well as organizational questions could be most knowledgeably investigated. Unless two interviews had to be conducted at the same time for scheduling reasons, all three interviewers attended each interview. One was the official interviewer and the others were asked to remain silent until appropriate times when they would be allowed to ask questions. In this way, the process remains true to recommended semi-structured interview techniques (e.g., it does not become a conversation), but the assistant interviewers have an opportunity to ask follow-up questions in their particular areas of expertise.16 Another important task for one of the assistant interviewers was to write fieldnotes during the interview. They noted nonverbal interactions and areas to further explore as well as what the person was saying. All interviews were recorded and transcribed.

Table 1.

Interview Guide Sample for CDS Content Vendor Study

1. Background
  • First, we’d like to learn a little about you. Could you give us a few words about your background?

  • Would you please describe your role here?

  • If you could give us a very brief history of [your company], it would help give us background for the rest of our interviews.

  • And for further background, could you describe the type of governance structures you have, in terms of a board of directors, expert panels, etc.?

  • And what is the mission of [your company]?

  • How does [your company] position itself as a company?

2. The Meaning of CDS
  • Our study is a general overview of CDS, and we are finding that people view CDS in many different ways. How would you define clinical decision support?

  • How does your product fit into this definition?

  • How does it relate to other forms of CDS?

3. About Your Customers
  • What kinds of customers do you sell to directly?

  • What do your customers seem to misunderstand about your product?

  • How does your company note and respond to customers’ feedback?

  • What do customers tell you about your product?

  • How do you communicate updates to customers?

  • What do you find frustrating about working with your customers?

4. Roles in Your Organization
  • It seems like the industry is creating new types of professionals with unique skill sets. What are the kinds of essential, special people that make your enterprise run?

  • What new types of positions or job descriptions do you see emerging?

  • What new roles for such essential people are you seeing in customer organizations?

5. Content Management
  • We are interested in how people manage CDS content. What is your process for maintaining, monitoring, and deciding which elements of your product need updating? Do you have any specific software tools to take care of this? If you have internal content management tools, what are they like?

  • What data sources do you use to develop your product? [Cochrane, literature, government regulations and recommendations, specialty society recommendations]

  • How do you store the information you produce?

  • What systems have you developed for reviewing the information or content in your product?

  • If you have content management tools for your customers, what are they like?

6. Use of Your Product
  • Who within the company works with the customers, especially the EMR vendors?

  • How involved is your company with implementation of your products?

  • What kind of training and support do you provide? In person classes? Online training?

  • How do you help the users choose among the available options?

  • If you provide authoring and editing tools, how well are they used?

  • When designing your products, how is the workflow of users determined?

  • What impact do you think your product has on the workflow of users?

  • What do you see as success factors when implementing or using your product?

  • What statistics or aggregated feedback on how clients use your CDS products do you gather?

  • What feedback do you get on which elements are actually used?

  • What data are collected on the effectiveness of your product?

Data analysis

Preliminary data analysis took place between interviews. Using notes taken by the assistant interviewer, the three researchers briefly reviewed what had been learned so that new questions for upcoming interviews could be developed as needed based on prior interviews. After completion of the site visits, interview notes and transcripts were entered into NVivo (QSR International, Doncaster, Victoria, Australia), a program that facilitates the organization and retrieval of qualitative data for analysis.

The interpretive process was both iterative and flexible. For in-depth analysis, our ten-person research team broke into dyads (one clinical/technical person and one social scientist); each dyad was assigned a set number of transcripts. Individuals read the assigned transcripts, noting all recurring or potentially important expressions and key phrases, and the dyads then met to compare and agree on their findings. The ten researchers next met in person to conduct a card sort exercise. Each of the over 300 expressions generated by the dyads was pasted on a separate index card, and using the constant comparison method, the team built piles of cards that seemed to go together.17 This was not simply a sorting and naming of piles; it is a technique to stimulate team discussion using a grounded theory approach. Each card is considered in light of the card(s) that have come before it: each is either grouped with similar cards already in a pile or set aside as the start of a new pile. The team discussed the emerging themes, how they were interconnected, and how they should be interpreted across the content vendor settings studied. The piles, once named, became the themes.
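The control flow of constant comparison (compare each new card against existing piles; join the best match or start a new pile) can be sketched computationally. The sketch below is purely illustrative: the actual card sort was a human, discussion-driven process, and the keyword-overlap similarity measure is an assumption made only for this example, not part of the study's method.

```python
# Illustrative analogy to the constant comparison card sort described above.
# The real process relied on team judgment, not an automated similarity score.

def pile_sort(cards, threshold=0.3):
    """Group cards (short text expressions) into piles by greedy comparison."""
    piles = []  # each pile is a list of cards
    for card in cards:
        words = set(card.lower().split())
        best_pile, best_score = None, 0.0
        for pile in piles:
            pile_words = set(w for c in pile for w in c.lower().split())
            # Jaccard overlap between this card and the pile's vocabulary
            score = len(words & pile_words) / len(words | pile_words)
            if score > best_score:
                best_pile, best_score = pile, score
        if best_pile is not None and best_score >= threshold:
            best_pile.append(card)   # group with similar cards already in a pile
        else:
            piles.append([card])     # set aside as the start of a new pile
    return piles
```

In the human version, the "similarity function" is the team discussion itself, which is precisely what makes the technique useful for surfacing themes rather than merely clustering text.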

We wrote a short report of the findings for each company we studied for two reasons: first, we thought the companies would find them useful, and second, the report was a form of “member checking,”18 a qualitative technique to further establish trustworthiness of results by asking insiders for feedback. Also, at this point, because this was industrial ethnography, we promised contacts at the sites that they would be given the opportunity to review any publications prior to our submitting them. We decided as a team that if there were any ethical issues (such as a request to change our results) arising from company responses to our draft publications, we would openly discuss them with our contacts and negotiate resolutions.

Results

We conducted three site visits and twenty-three interviews at three companies during 2009–2010.

Lessons learned about methods

Gaining entrée was not as difficult as other researchers “studying up” have found it to be. We approached those at the highest levels in the companies because their approval was needed for our gaining cooperation from others in the organization. The CEOs are interested in informatics and, because the companies are knowledge-based clinical content development specialists, they wanted to support our research endeavors.

RAP usually calls for triangulation through multiple methods, multiple perspectives, and multiple researchers. Our data collection methods were more constrained in this setting than in healthcare delivery settings, yet we were able to triangulate19 findings from interview data with some limited informal observations during visits, demonstrations of products during visits, and analysis of written publicly available material about the companies. We used researcher triangulation by selecting researchers of different backgrounds and perspectives to attend the site visits and analyze data. Site triangulation included selection of three quite different companies. Finally, subject triangulation involved interviewing a broad spectrum of company employees.

The analysis process was more difficult than we had envisioned. After the first vendor site visit, we tried to do a comparative content analysis using the codes we had developed during our clinical site visits. This attempt at using a template method for coding failed. The template method is most useful when you have already analyzed enough data so that you know a good deal about the topic.17 While we had thought the vendor data would be a simple amplification of what we had learned during hospital visits, we discovered that was not the case. We abandoned the template method and started from scratch using the pile-sort technique discussed above. The pile sort technique, which we have used in the past to analyze novel topics, worked well with these data.

When we asked contacts within the companies for feedback on the reports we sent to them, they provided constructive comments and explanations, for which we were grateful. Having their approval of our basic findings will give us greater confidence as we generate future publications. We have promised the companies that they could review draft publications and again, not only have they been responsive in offering comments, but they have also allowed us to use their names.

The Content Vendor Perspective on CDS

Among the themes that arose from the data, two are overarching and serve as especially appropriate examples of the power of RAP to uncover issues both in depth and expeditiously. We call these themes “We’re in This Together” and “We Are Like Switzerland”. UpToDate is somewhat different from the other companies in that the product at this point is generally not embedded within an EMR and is accessed electronically outside of the EMR or through a link within the EMR. However, the ability to provide UpToDate information through info buttons is becoming easier because a standard has been agreed upon and that information is also being supplied within ProVation order sets. The following themes relate to all three vendors when these trends are considered.

Theme 1: We’re in This Together

The ultimate mission of each of these organizations, we were told, is better health care. Their customers include health care organizations and individual clinicians, but also at times EHR vendors. We were told that since end users and EHR vendors share the goal of better health care, all three groups need to cooperate to reach it. While the relationships with EHR vendors are complex and they change over time as companies merge and split, these content vendors feel they are in the middle, between end users and EHRs. They are proud that their products are unique and could not be easily duplicated. Each company feels somewhat misunderstood in that the experts they employ and knowledge they sell are more expensive to produce than anyone realizes. In each company, we walked through the buildings that house 50 or more physicians, pharmacists, nurses, and editorial staff members and were impressed by the size and expertise of staff. We were told “it’s very expensive to develop and maintain this. We have to pay a lot of very smart people to do this.” Another interviewee noted “it’s tremendously intensive to pull off what we do.”

We were also told “some EMR vendors originally thought that they were gonna create their own content but they’ve kind of abandoned that as not being their central skill set.” The content vendors provide content and under some circumstances the EHR vendors deliver their content. However, content vendors perceive that the EHR vendors do not make it easy since the content vendors must adapt to the unique technology of each EHR vendor. In other words, standards are either not available or not followed but their use is highly desired by these companies.

Except for UpToDate, the content vendors often hear from end users (often EHR analysts who customize the content) that it is hard to work with their products even though the content vendor might have a tool available that would solve the problem. Users do not always purchase or take advantage of all of the advanced capabilities of these tools. In addition, all three vendors sense that their products are often not used optimally by buyers.

Representatives of these companies desire more interaction with users at the individual, clinic, and larger healthcare organization level as well as with EHR vendors. As one interviewee noted, “We really need to work together to reduce variability in care and improve patient safety. And you know, the software that is being built by EHR companies themselves doesn’t do that. It’s building a kitchen that doesn’t have a refrigerator or an oven in it. And so it’s the content piece of that [we provide].” Another interviewee claimed “It would be nice if you could get everybody together in some manner to talk about what would be best for the industry, I’m not sure that could happen. I don’t know how you would do that.”

Theme 2: We Are Like Switzerland

By this we mean that vendor representatives often spontaneously told us that their companies simply provide a reference source based on evidence, and one likened his company to Switzerland in its neutrality. They repeatedly told us they do not practice healthcare: “I don’t think we can ever take away the judgment of the person looking at our information and applying it to the situation.” The content vendors, again, feel like they are in the middle. They provide content but, when that content is embedded in the EHR, the EHR vendor is responsible for how the content is displayed. The ultimate end user is responsible for how it influences patient care. This neutral stance on the part of the content vendors is also due to the legal situation. Some content vendor representatives spoke strongly about how the legal system in this country influences what they can provide. There are many legal, regulatory, antitrust, and fiduciary constraints that content vendors must navigate while still providing a useful and usable product for all their customers.20 Sometimes, depending on what is being sold, these constraints result in sub-optimal products for clinician end-users. For example, over-alerting is a well-recognized problem that causes frustration on the part of providers, but it is a consequence of different entities (e.g., clinical content vendors, EHR vendors, and hospital administrators responsible for implementing EHRs) trying to protect themselves against liability claims.

Discussion

Our examination of CDS content vendors demonstrates that RAP is a robust method that can be modified for industrial ethnography. The adaptation of RAP, however, took preparation. This included careful consideration of an effective strategy for gaining entrée and for thinking through ethical issues. While RAP is not intended to replace long-term more traditional ethnographic fieldwork, it appears to be highly suitable for assessing a variety of situations ranging in our study from ambulatory to hospital to industrial settings.

The CDS content vendors we studied believe they are neutral developers of clinical content that EHR vendors and clinicians consume. They would like more interaction with the other players in the CDS landscape, including EHR vendors and end users of various types, both organizations and individuals. To promote adequate mutual understanding among the entities critical to successful use of CDS, we recommend a “three way conversation” among content vendors, electronic health record vendors, and users if the shared goal of better health care assisted by CDS is to be reached.

Acknowledgments

This work was supported by grant LM06942 and training grant 2T15LM007088 from the National Library of Medicine and AHRQ contract #HHSA290200810010. We would like to thank all of our interviewees at Zynx Health, First DataBank, and UpToDate for assisting us with this study.

References

1. Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review. JAMA. 1998;280(15):1339–46. doi:10.1001/jama.280.15.1339.
2. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765. doi:10.1136/bmj.38398.500764.8F.
3. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, Sam J, Haynes RB. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10):1223–38. doi:10.1001/jama.293.10.1223.
4. Ash JS, Sittig DF, Campbell EM, Guappone KP, Dykstra RH. Some unintended consequences of clinical decision support systems. Proceedings AMIA; 2007. pp. 26–30.
5. Berner ES. Clinical Decision Support Systems: State of the Art. Rockville, Maryland: Agency for Healthcare Research and Quality; Jun, 2009. AHRQ Publication No. 09-0069-EF.
6. Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: Making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003;10(6):523–30. doi:10.1197/jamia.M1370.
7. H.R. 1 [111th]: American Recovery and Reinvestment Act of 2009 (GovTrack.us) [Internet]. Available from: http://www.govtrack.us/congress/bill.xpd?bill=h111-1. Accessed March 9, 2011.
8. Bates DW, Cohen M, Leape LL, Overhage M, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc. 2001;8:299–308. doi:10.1136/jamia.2001.0080299.
9. Wright A, Sittig DF, Ash JS, Feblowitz J, Meltzer S, McMullen C, Guappone K, Carpenter J, Richardson J, Evans RS, Nicol WP, Middleton B. Development and evaluation of a comprehensive clinical decision support taxonomy: Comparison of front-end tools in commercial and internally-developed electronic health record systems. J Am Med Inform Assoc. 2011. doi:10.1136/amiajnl-2011-000113. (in press).
10. McMullen CK, Ash JS, Sittig DF, Bunce A, Guappone K, Dykstra R, Carpenter J, Richardson J, Wright A. Rapid assessment of clinical information systems in the healthcare setting: An efficient method for time-pressed evaluation. Meth Inform Med. 2010 Dec 20;50(2). doi:10.3414/ME10-01-0042. [Epub ahead of print].
11. Ash JS, Sittig DF, McMullen CK, Guappone K, Dykstra R, Carpenter J. A rapid assessment process for clinical informatics interventions. Proceedings AMIA; 2008 Nov 6; pp. 26–30.
12. Baba ML. Industry-university relationships and the context of intellectual property dynamics: The case of IBM. Multi-level Issues in Social Systems. 2006;5:301–319.
13. Gusterson H. Studying up revisited. Political and Legal Anthropology Review. 1997;20:114–119.
14. Forsythe DE. Ethics and politics of studying up in technoscience. Anthropology of Work Review. 1999;20(1):6.
15. Bonis PA, Pickens GT, Rind DM, Foster DA. Association of a clinical knowledge support system with improved patient safety, reduced complications and shorter length of stay among Medicare beneficiaries in acute care hospitals in the United States. Int J Med Inform. 2008 Nov;77(11):745–53. doi:10.1016/j.ijmedinf.2008.04.002.
16. Beebe J. Rapid Assessment Process: An Introduction. Walnut Creek, CA: AltaMira Press; 2001.
17. Ryan GW, Bernard HR. Techniques to identify themes. Field Methods. 2003;15(1):85–109.
18. Crabtree BF, Miller WL, editors. Doing Qualitative Research. Second edition. Thousand Oaks, CA: Sage; 1999.
19. Berg BL. Qualitative Research Methods for the Social Sciences. 7th ed. Boston, MA: Allyn & Bacon; 2009.
20. Sittig DF, Singh H. Legal, ethical, and financial dilemmas in electronic health record adoption and use. Pediatrics. 2011 Mar 21;127(4):e1042–7. doi:10.1542/peds.2010-2184.
