Abstract
Context:
Current information-rich electronic health record (EHR) interfaces require large, high-resolution screens running on desktop computers. This interface compromises the provider’s already limited time at the bedside by physically separating the patient from the doctor. The case study presented here describes a patient-centered clinical decision support (CDS) design process that aims to bring the physician back to the bedside by integrating a patient decision aid with CDS for shared use by the patient and provider on a touchscreen tablet computer for deciding whether or not to obtain a CT scan for minor head injury in the emergency department, a clinical scenario that could benefit from CDS but has failed previous implementation attempts.
Case Description:
This case study follows the user-centered design (UCD) approach to build a bedside aid that is useful and usable, and that promotes shared decision-making between patients and their providers using a tablet computer at the bedside. The patient-centered decision support design process focuses on the prototype build using agile software development, but also describes the following: (1) the requirement gathering phase including triangulated qualitative research (focus groups and cognitive task analysis) to understand current challenges, (2) features for patient education, the physician, and shared decision-making, (3) system architecture and technical requirements, and (4) future plans for formative usability testing and field testing.
Lessons Learned:
We share specific lessons learned and general recommendations from critical insights gained in the patient-centered decision support design process about early stakeholder engagement, EHR integration, external expert feedback, challenges to two users on a single device, project management, and accessibility.
Conclusions:
Successful implementation of this tool will require seamless integration into the provider’s workflow. This protocol can create an effective interface for shared decision-making and safe resource reduction at the bedside in the austere and dynamic clinical environment of the ED and is generalizable for these purposes in other clinical environments as well.
Keywords: Methods, Informatics, Quality Improvement
Introduction
Counterintuitively, in an information-rich world advances in technology can increase, not decrease, cognitive demands on users.1,2 In the rush to adopt electronic health records (EHRs) to qualify for federal incentive payments, clinicians now find themselves working with products with poor usability that are neither integrated into the clinical workflow nor interoperable with other systems.3–5 Since computerized clinical decision support (CDS) is most effective when integrated as part of the physician’s normal workflow6,7 at the time and location of decision-making, the potential patient safety and outcome benefits of CDS have not yet been fully realized.8–13 Furthermore, current information-rich EHR interfaces require large, high-resolution screens running on desktop computers. This interface compromises the physician’s already limited time at the bedside by physically separating the patient and the doctor.3,5
We developed a possible solution to these interface challenges in a clinical scenario that could benefit from CDS but had failed previous implementation attempts. Diagnostic imaging is the fastest growing segment of health care spending in the United States, increasing twice as fast as total health care costs.14 In the emergency department (ED), use of advanced diagnostic imaging in injured patients has increased dramatically—leading to increased health care costs, exposure to ionizing radiation, and length of stay without objective metrics of improved patient outcomes.15 In particular, despite implementation of validated, highly sensitive clinical guidelines designed to safely reduce the use of computed tomography (CT) in minor head injury, CT is frequently obtained in low-risk, minor head injury patients in whom it is not clinically indicated.16–21
This case study describes a patient-centered CDS design process that aims to bring the physician back to the patient’s bedside by integrating a patient decision aid with CDS for shared use by the patient and provider using a touchscreen tablet computer (Figure 1).22,23 Furthermore, newer generation tablet computers are flat, portable, and potentially less likely to cause hospital-acquired infections than desktop computers as they can be more easily sanitized since they do not house internal fans.24–27 The objective of this case study was to do the necessary foundational work to uncover and disentangle the human and environmental factors, as well as the chaotic clinical workflow, and address them in the design process such that the eventual CDS interface can more effectively support the physician at the point of care.9,13,28–30 The tool described in this case study has been prototyped and will subsequently undergo usability and field-testing, will be interfaced with our institution’s EHR, and will be studied in an implementation trial to demonstrate that technology is better than no technology for patient engagement and safe reduction of diagnostic imaging in the ED—an austere and dynamic clinical environment, where the physician is faced with a substantial volume of high acuity patients, time pressures, and interruptions.31–34
Case Description
Background
More than 1.3 million patients are treated annually in United States EDs for traumatic brain injury.35 Most of these injuries are mild, but in a small proportion of patients with mild injury, clinical deterioration occurs.36 In patients with clinically important traumatic brain injury, CT imaging yields a quick and accurate diagnosis such that neurological intervention can prevent deleterious outcomes from intracranial hematoma. Although CT has greatly improved our diagnostic ability, it exposes patients to significant amounts of ionizing radiation.37 In addition, over 90 percent of CT scans for minor head injury are negative for clinically important brain injury.38
The Canadian CT Head Rule (CCHR) is a clinical decision rule that was developed using a rigorous, evidence-based derivation and validation process to identify appropriate use of CT to differentiate mild traumatic brain injury from clinically important brain injury.38 In both Canada and the United States, the CCHR has been validated to be 100 percent sensitive and more specific than other guidelines and decision rules.38,40–43 A prospective cluster-randomized trial to implement a similar prediction rule—the Canadian C-spine Rule—led to a significant decrease in imaging.16 When the CCHR was implemented at the same centers with many of the same patients, however, CT imaging rates did not decrease.17 In fact, imaging rates were 74–76 percent with the implementation, compared to 63–68 percent without it.17 These rates were more than double compared to 12 years earlier in the same region.44 The authors suspected that “CT imaging has become the local standard of care for patients with minor head injury…[and has] led to expedient over-testing.”17 The CCHR failed to reduce testing due to implementation failures, not rule performance. Indeed, compliance with the CCHR could safely decrease the number of CT scans performed in minor head injury by 35 percent.18,19 If the CCHR were successfully implemented in the United States, a significant number of radiation-induced cancer deaths could be averted with a cost savings of up to $394 million annually.17–19,35 Until workflow barriers, patients’ values and preferences, and how they affect decision-making regarding use of CT are addressed in this clinical scenario, these patients will continue to be exposed to undue radiation risk and cost.19
Patient-Centered Decision Support Design Process
We followed a user-centered design (UCD) approach—an iterative, multistage user interface design and evaluation process—to build a bedside aid that is useful and usable, and that promotes shared decision-making between patients and providers at the patient’s bedside on a tablet computer.45 First, a triangulated qualitative study was performed to identify the factors that either promote or inhibit the appropriate use of CT in patients presenting to the ED with minor head injury.
The findings of this qualitative research have been reported elsewhere.46 Briefly, seven focus groups were performed—three with exclusively providers and four with exclusively patients.47 Understanding that users may “have very limited insight into their own performance, and even more limited ability to articulate what might improve it,” we triangulated the focus group findings with direct field observation in the form of a cognitive task analysis.13 The analysis included more than 150 hours of direct ED observation of peer-nominated, senior emergency-physician subject matter experts (SMEs) skilled in safely minimizing testing through patient engagement, followed by critical decision method interviews with the SMEs.48,49 The qualitative research findings included nonclinical human factors in six primary domains: establishing trust, bedside manner, anxiety (of both the patient and the provider), constraints (e.g., time), the influence of others (e.g., other providers, patient family members, the Internet), and patient expectations. These results informed the conclusion that identifying and disseminating approaches and designing systems that help clinicians establish trust and manage uncertainty within the ED context could optimize CT use in minor head injury.46 To our knowledge, such patient-centered themes have not previously been considered up front in the CDS design process.
Qualitative factors identified during the focus groups and cognitive task analysis were integrated in subsequent design and development of a prototype tool to formulate the initial CDS concept. Although there are many potential options for the specific details of a tool’s interface such that it may eventually be seamlessly integrated into provider workflow, the initial interface must function within the constraints of what typically occurs at the bedside in a patient encounter—e.g., in the ED, patient arrival, waiting for provider, provider evaluation, diagnostic work-up if necessary, and patient-provider discussion regarding patient disposition (i.e., discharged home, transferred to another facility, admitted to the hospital). The tool must facilitate a patient-focused workflow for the provider. Therefore, the prototype was designed to include the following: (1) information for the patient to review while waiting for the provider and awaiting diagnostic work-up and discharge (akin to a patient decision aid) and (2) decision support with content for the provider to complete at the bedside in discussion with the patient.
A rapid prototyping model was used following the International Patient Decision Aid Standards and the agile software development approach, which provides the flexibility to incorporate usability feedback throughout rapid prototyping cycles.61,62 Given the constant evolution of available technology, the prototype was programmed to be device agnostic: capable of running on any modern device regardless of its operating system or form factor. The prototype’s interface evolved through many development cycles. To develop the initial prototype, a multidisciplinary team (including several clinical informaticists with a variety of clinical backgrounds, a systems architect, a computer programmer, key stakeholders from the health system’s information technology (IT) leadership, and potential users) reviewed the factors identified in the requirements-gathering phase, resulting in a “rough draft” prototype. Next, the flexible development process was enhanced by eliciting feedback from end users at every stage. The initial prototype was developed for demonstration and feedback from multiple audiences (including patient decision aid designers, internal- and emergency-medicine physicians, a human computer interaction class for physicians with a variety of clinical backgrounds and experiences, and key stakeholders from our institution’s EHR vendor) on its content, format, and usability as well as its potential to increase patient knowledge and address patient concerns, values, and preferences. This feedback was used to iteratively modify the prototype.
This process is ongoing. At the time of this writing, the next step planned is for the prototype to undergo formative usability testing to maximize its efficiency, effectiveness, and user satisfaction.19 Usability evaluations will be conducted with representative patients and providers (end users) to assess the degree to which the prototype tool matches their needs for shared decision-making and workflow. Optimizing the adoption of this tool in the complex, high-pressure environment of the ED warrants special attention. In this complex sociotechnical system, maximizing the human-computer interaction is not sufficient. The human-environment interaction, or ecology, must also be taken into account. It is crucial that CDS makes work easier for the provider—otherwise providers will not use the tool, will experience information overload and alert fatigue, and will try to ignore or circumnavigate the CDS in order to get other tasks done.12,63,64 Field-testing will rely on the principles of ecological interface design to provide the right information at the right time and in the right way while considering the demands of the ED work environment.59,63
Features
The application was designed to allow users (both patient and provider) smooth, user-friendly navigation through the screens while completing the tasks of patient education, risk communication, and shared decision-making about whether or not to perform a CT. The application is equipped with features to educate and empower patients with knowledge that facilitates expressing concerns. When the provider arrives at the bedside, the patient can make an informed decision, and the provider can efficiently address the patient’s concerns.
The patients fill out two forms when they first receive the tablet. The first is an eligibility form. If the patient is eligible, the patient then continues on to a questionnaire form. The answers to the questionnaire autopopulate the subjective components of the clinical decision rule later on in the provider workflow to streamline the provider’s time at the bedside, thus maximizing the opportunity for a conversation regarding the patient’s specific concerns. Following the forms, the tool is divided into three sections: (1) a patient education section, (2) a physician’s section, and (3) a shared decision-making section. These sections all follow a visual metaphor using a design reminiscent of a decision aid on paper cards.65
The tool’s patient education section contains information for the patient about concussion, more severe brain injuries, and CT scans. The information provided on each card is focused and simplified, given the wide range of education backgrounds of potential users as well as the presence of a recent head injury. If the patient wants to learn more, there is the option to get more detailed information on specific questions on each topic. With the goal of maintaining the patient’s attention, the information section was designed so the patients can read in any order they choose. In this section, the patients are also given the opportunity to flag their specific concerns for future discussion with their providers (Figure 2).
On entering the patient’s room, the provider logs in and is prompted to go through the CCHR clinical decision rule with the patient. The inputs to the rule prompt a risk communication conversation by providing patient-specific risk estimates of any brain injury on CT, clinically important brain injury, the need for neurosurgical intervention, and risk of cancer from a head CT. The “What is best for YOU” screen in the provider section (Figure 3) engages the patient by addressing issues that we identified in our qualitative research—such as identifying and addressing patient concerns, establishing trust, and managing patient uncertainty. This section also has a card that appears only if the patient is 65 years or older and addresses issues specific to older adults.
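The rule evaluation that the provider steps through at the bedside can be sketched as a simple classifier. The criteria below follow the published CCHR (high-risk: GCS below 15 at two hours, suspected open or depressed skull fracture, signs of basal skull fracture, two or more episodes of vomiting, age 65 or older; medium-risk: amnesia of 30 minutes or more before impact, dangerous mechanism); the class, field, and function names are our own illustration, not the prototype’s actual code.

```python
from dataclasses import dataclass

@dataclass
class HeadInjuryAssessment:
    # Subjective items can be autopopulated from the patient questionnaire;
    # objective items are confirmed by the provider at the bedside.
    gcs_at_2h: int
    suspected_open_or_depressed_fracture: bool
    signs_of_basal_skull_fracture: bool
    vomiting_episodes: int
    age: int
    amnesia_before_impact_over_30_min: bool
    dangerous_mechanism: bool  # e.g., pedestrian struck, ejection, fall > 3 ft / 5 stairs

def cchr_risk(a: HeadInjuryAssessment) -> str:
    """Classify risk per the Canadian CT Head Rule (CCHR)."""
    high_risk = (
        a.gcs_at_2h < 15
        or a.suspected_open_or_depressed_fracture
        or a.signs_of_basal_skull_fracture
        or a.vomiting_episodes >= 2
        or a.age >= 65
    )
    if high_risk:
        return "high"    # CT recommended: risk of need for neurosurgical intervention
    if a.amnesia_before_impact_over_30_min or a.dangerous_mechanism:
        return "medium"  # CT recommended: risk of clinically important brain injury
    return "low"         # Rule negative: CT not indicated
```

Autopopulating the subjective fields from the questionnaire leaves only the objective examination findings for the provider to confirm, which is what streamlines the bedside conversation.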
The decision area is where the patient and the provider come together to make a shared decision. Three cards provide the details of three choices: (1) get a CT, (2) go home now with active surveillance for new or concerning symptoms, or (3) stay in the ED for observation (available only to patients under age 65 due to the increased risk of a slow-bleeding subdural hemorrhage). A doctor’s note includes the patient-specific risk estimates and documents the shared decision-making conversation. For example, the following shared decision-making note is generated for a low risk patient who does not undergo CT imaging:
I have used a decision aid to share decision-making with the patient about whether or not to get a CT scan for a minor head injury.65 This patient’s injury is low risk based on the Canadian CT Head Rule. We estimated the patient’s risks as follows: (1) need for neurosurgical intervention to be 0.0%, (2) clinically important brain injury to be 1.1%, (3) any brain injury by CT to be 2.7%, and (4) lifetime risk of cancer from a CT scan today to be 0.007%. After considering the patient’s unique circumstances and the pros and cons of the alternatives, we decided the patient should go home now without a CT.
This note will push to the patient’s chart in the EHR along with the ability for the provider to push an order for a CT or discharge instructions depending on the ultimate results of the shared decision.
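Generating this note is a simple templating exercise over the patient-specific risk estimates. A minimal sketch, using the percentages quoted in the example note above; the template text is taken from the note, while the function name and parameters are illustrative, not the prototype’s actual implementation:

```python
NOTE_TEMPLATE = (
    "I have used a decision aid to share decision-making with the patient about "
    "whether or not to get a CT scan for a minor head injury. This patient's injury "
    "is {level} risk based on the Canadian CT Head Rule. We estimated the patient's "
    "risks as follows: (1) need for neurosurgical intervention to be {neuro:.1f}%, "
    "(2) clinically important brain injury to be {cibi:.1f}%, (3) any brain injury "
    "by CT to be {any_injury:.1f}%, and (4) lifetime risk of cancer from a CT scan "
    "today to be {cancer:.3f}%. After considering the patient's unique circumstances "
    "and the pros and cons of the alternatives, we decided the patient should {plan}."
)

def shared_decision_note(level: str, neuro: float, cibi: float,
                         any_injury: float, cancer: float, plan: str) -> str:
    """Render the shared decision-making note to be pushed to the EHR chart."""
    return NOTE_TEMPLATE.format(level=level, neuro=neuro, cibi=cibi,
                                any_injury=any_injury, cancer=cancer, plan=plan)

# The low-risk example from the text:
note = shared_decision_note("low", 0.0, 1.1, 2.7, 0.007,
                            "go home now without a CT")
```

The same template serves any risk level by swapping in the corresponding estimates and the agreed plan.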
System Database Architecture
The prototype phase also included creation of a database to collect, edit, store, and retrieve data generated by the tool. The database grew throughout the rapid prototyping process as the application’s needs expanded. Originally the model view controller framework provided some tables that handle user authentication. As the application grew, new tables were added for additional functionality. At the end of the prototyping phase, the database had six tables (Figure 4).
Figure 4. The Database Schema, Where the Shared Decision-Making is Reflected in the Database as FinalDecision in the userRisk Relation
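As a rough sketch of what such a schema might look like in code: only the userRisk relation and its FinalDecision column come from Figure 4, and the other table and column names are our own assumptions (the prototype’s six tables are not all enumerated in the text, so only a representative subset appears here).

```python
import sqlite3

# In-memory database for illustration only
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- User-authentication table of the kind originally provided by the MVC framework
CREATE TABLE users (
    id       INTEGER PRIMARY KEY,
    username TEXT UNIQUE NOT NULL,
    role     TEXT NOT NULL            -- 'patient' or 'provider'
);

-- Eligibility and questionnaire answers collected from the patient's two forms
CREATE TABLE eligibility (
    id       INTEGER PRIMARY KEY,
    user_id  INTEGER NOT NULL REFERENCES users(id),
    eligible INTEGER NOT NULL         -- 0 or 1
);
CREATE TABLE questionnaire (
    id       INTEGER PRIMARY KEY,
    user_id  INTEGER NOT NULL REFERENCES users(id),
    question TEXT NOT NULL,
    answer   TEXT
);

-- Risk estimates and the shared decision (FinalDecision per Figure 4)
CREATE TABLE userRisk (
    id            INTEGER PRIMARY KEY,
    user_id       INTEGER NOT NULL REFERENCES users(id),
    riskLevel     TEXT,
    FinalDecision TEXT                -- e.g., 'CT', 'home', 'observe'
);
""")
```

Storing the decision in its own relation lets the tool retrieve the conversation outcome later, for example when generating the note pushed to the chart.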
Technical System Requirements
The application was developed and tested on a machine running Internet Information Services 8 on Microsoft Windows version 6.2 (corresponding to Windows 8/Server 2012), with 16 GB of memory and a 4.5 GHz octa-core processor. Currently, the Web service is hosted on a machine running the Windows Server 2012 R2 Datacenter edition with two processors, 4 GB of RAM, and a 250 GB hard disk. The database server uses similar specifications. Although the production machine has fewer resources than the development machine, these modest specifications allow the application to run smoothly without utilizing all of the server’s resources.
Lessons Learned
Specific lessons gained from the patient-centered decision support design process described above are provided in Table 1 along with generalized, actionable recommendations such that future design and implementation efforts might benefit from the critical insights gained from this process. Given the literature available on similar topics, this list of lessons is by no means exhaustive; rather, it is intended to provide key pitfalls and successes based on our experiences that may not yet appear in the literature or would most benefit an early stage developer.
Table 1.

| | LESSONS LEARNED | RECOMMENDATIONS |
|---|---|---|
| Early stakeholder engagement | Early engagement of health system IT leadership allowed introduction of the project’s goals, networking with leaders locally, and access to EHR training coursework that will support integration of the tool with the EHR. | Engage key stakeholders early to facilitate open dialogue and support when implementation challenges arise. |
| | Prototype pilot testing addressed gaps in instructions provided to new users. | Pilot test the prototype with new users prior to usability testing to identify gaps that developers may overlook due to familiarity with their tool. |
| | Patient input is challenging. A patient advisory committee was recruited from focus group participants for future input on tool development. | Involve patient end users at every stage of development. There is a growing community of patient representatives, advocates, and volunteers who could help provide this input. |
| EHR integration | Despite physician user frustration with EHR usability, vendors are developing patient-centered interfaces, and they know who else is doing similar work and how best to integrate new tools with their software. | |
| | Other researchers have worked on similar integration challenges. One group integrated a Web service into provider decision support workflow with our EHR vendor. As a result, the vendor has incorporated the ability to access a Web service from within the EHR as a standard for the latest version of their software. | |
| | EHR vendors create barriers to research applied to their interface. There are many logistical challenges to completing vendor-specific EHR training to learn the skills to do this type of research. | Begin EHR vendor training coursework early, and anticipate delays and challenges to coursework completion. |
| Feedback from prototype demonstrations to outside expert groups | The tool will achieve its goals only if it can promote conversation between patient and provider. | Include designers on the development team so the tool facilitates high-quality conversation between patient and provider. Consider using psychometrics for shared decision-making—like the OPTION scale and decisional conflict scale.67,68 |
| | It is more difficult to make changes to a computerized prototype if programming is started too early in the development process. | |
| Challenge of having two users | Best practices for authentication of two users for a single tool on a single device are not established. Double user authentication is particularly difficult given HIPAA regulations. | Work within the institution’s EHR constraints to permit authentication of both users and maintain HIPAA compliance. |
| | Usability testing methods for two users on a single device are not established—limiting data collection options for both users using conventional usability evaluation methods and software. | Evaluate one user type at a time (e.g., provider) and simulate the other user with a standardized script. |
| Project management | A high volume of programming specifications can be difficult to track and prioritize. | Use cloud-based project management software to allow asynchronous communication and tracking of specification requests and progress. |
| Accessibility | Section 508 of the Rehabilitation Act of 1973 in the United States requires all federally funded information technology (IT) to be accessible to people with disabilities. | Use online resources to assess 508 compliance as well as accessibility for those with limited literacy and color blindness. |
Early stakeholder involvement is a key tenet in implementation science and change management. A common pitfall occurs when stakeholders resist change because they were not engaged early in the change process. Stakeholders for this project include groups such as ED patients with minor head injury, ED providers locally (and beyond, should the implementation succeed locally), departmental leadership, health system IT leadership and informatics work force, the clinical informatics research community, and the EHR vendor. Engaging health system IT leadership and patients is both challenging and crucial for success. Therefore, we placed special emphasis on recommendations to engage these stakeholders early and throughout the development process.
Provider stakeholders have made it clear in the requirement-gathering phase that they will use a tool if it can streamline their workflow. Therefore, we are working to integrate the tool within our institution’s EHR such that it can facilitate CT order entry for the provider and generate discharge instructions for the patient (patient handout on concussion). We are working with our health system’s IT leadership and our institution’s EHR leadership to optimize this interoperability. Lessons learned thus far in EHR integration are both refreshing and frustrating. Evolution of the EHR interface is driven by market pressure for functionality over usability.4,5,69 Despite the barriers to receiving vendor-specific training coursework for research applied to their interface, there are large initiatives on the vendor’s part to continue to improve EHR usability with a patient-centered focus.
During the prototype development process, we demonstrated the prototype to outside expert groups (in human-computer interaction, health IT usability research, and decision aids) to seek feedback. These outside groups were particularly helpful to set goals in innovative areas of inquiry within the patient-centered decision support design process. For example, traditional usability evaluations test a single user on a single device. Usability evaluation of our tool presents unique challenges to traditional usability evaluation techniques since our tool will be used by two users on a single device. To address this challenge, we have developed a standardized protocol for usability testing that evaluates one user type and simulates the other user with a standardized script. For example, real physician users will test the tool with standardized patients in a simulated clinical environment. The decision aid expert group helped us to appreciate that the tool’s primary goal for both evaluation and implementation will be its ability to facilitate high quality conversation between the patient and provider.70
Conclusions
This case study describes a novel patient-centered decision support design process that aims to bring the physician back to the patient’s bedside by integrating a patient decision aid with CDS for shared use by the patient and provider on a touchscreen tablet computer for deciding whether or not to obtain a CT scan for minor head injury in the ED, a clinical scenario that could benefit from CDS but has failed previous implementation attempts. The study focuses on the prototype building process in the context of requirement gathering, usability- and field-testing, and EHR integration, and a future implementation trial to demonstrate that some technological support is better than no technology for patient engagement and safe reduction of diagnostic imaging in the ED. Once the cycle of development and testing is complete, we aim to prospectively test the effect of using the tool in ED patients with minor head trauma. The ultimate objective of developing this patient-centered decision support tool is that it will engage patients in their care and safely reduce CT use in minor head injury patients in the ED. The tool will facilitate the patient and provider in engaging in transparent, informed, shared decision-making regarding risk communication and CT use in minor head injury. It will be seamlessly integrated into the provider’s EHR workflow including automatically generating a note for the patient’s chart on the shared decision-making conversation, thereby simultaneously streamlining the provider’s workflow and respecting the patient’s preferences.
The current challenges to EHR usability reflect the vendor market’s pressure for functionality over usability.4,5,69 It has been argued that current EHRs contain vast amounts of data and information hidden in unreadable interfaces and that “no system of data management will ever replace…good medicine.”71 This conclusion fails to recognize the trajectory of innovation that EHR usability challenges represent. In Ackoff’s seminal paper, “From Data to Wisdom,” he differentiates data and information from knowledge.72 Data is raw material; it exists in isolation without significance. Information is data that has been given meaning but may or may not be useful; whereas knowledge is the collection of information with the intent to be useful. We believe that medicine is only in the dawn of its information age; the smart phone’s pervasiveness is evidence that as a society we are well into a knowledge age. The complexity of the changing health care system locally, regionally, and nationally and the rapid growth of knowledge in complex fields all delay EHR usability. As medicine catches up to available technologies, future electronic systems must not only be usable, but must also support knowledge and promote conversation between patients and their doctors at the bedside.
Acknowledgments
We would like to thank the Mayo Clinic Knowledge & Evaluation Research Unit, North Shore/LIJ Department of Internal Medicine, University of Maryland, Human-Computer Interaction Lab, Oregon Health & Science University-Human Computer Interaction in Biomedicine course instructors, Pediatric Emergency Care Applied Research Network investigators, and Epic’s Research & Development team for taking the time to view demonstrations of our prototype and provide feedback to improve our final product.
Disciplines
Emergency Medicine | Health Information Technology | Trauma
References
1. Simon HA. Designing Organizations for an Information-Rich World. In: Greenberger M, editor. Computers, Communication, and the Public Interest. Baltimore, MD: The Johns Hopkins Press; 1971. pp. 40–1.
2. Howell WC, Cooke NJ. Training the human information processor: A look at cognitive model. In: Goldstein IL, editor. Training and Development in Work Organizations: Frontiers of Industrial and Organizational Psychology. San Francisco, CA: Jossey-Bass; 1989. pp. 121–82.
3. O’Malley AS. Tapping the unmet potential of health information technology. The New England Journal of Medicine. 2011;364(12):1090–1. doi: 10.1056/NEJMp1011227.
4. Mandl KD, Kohane IS. Escaping the EHR trap—the future of health IT. The New England Journal of Medicine. 2012;366(24):2240–2. doi: 10.1056/NEJMp1203102.
5. Melnick ER. An outdated solution. The New York Times. 2014 Jan 21.
6. May CR, Mair F, Finch T, MacFarlane A, Dowrick C, Treweek S, et al. Development of a theory of implementation and integration: Normalization Process Theory. Implement Sci. 2009;4(29):29. doi: 10.1186/1748-5908-4-29.
7. May C, Finch T, Mair F, Ballini L, Dowrick C, Eccles M, et al. Understanding the implementation of complex interventions in health care: the normalization process model. BMC Health Services Research. 2007;7(1):148. doi: 10.1186/1472-6963-7-148.
8. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765. doi: 10.1136/bmj.38398.500764.8F.
9. Saleem JJ, Patterson ES, Militello L, Render ML, Orshansky G, Asch SM. Exploring barriers and facilitators to the use of computerized clinical reminders. Journal of the American Medical Informatics Association. 2005;12(4):438–47. doi: 10.1197/jamia.M1777.
10. Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, et al. Grand challenges in clinical decision support. Journal of Biomedical Informatics. 2008;41(2):387–92. doi: 10.1016/j.jbi.2007.09.003.
11. Karsh BT. Clinical practice improvement and redesign: how change in workflow can be supported by clinical decision support. Rockville, MD: Agency for Healthcare Research and Quality; 2009. Report No. 09-0054-EF.
12. Melnick ER, Nielson JA, Finnell JT, Bullard MJ, Cantrill SV, Cochrane DG, et al. Delphi consensus on the feasibility of translating the ACEP clinical policies into computerized clinical decision support. Annals of Emergency Medicine. 2010;56(4):317–20. doi: 10.1016/j.annemergmed.2010.03.006.
13. Karsh BT, Weinger MB, Abbott PA, Wears RL. Health information technology: fallacies and sober realities. Journal of the American Medical Informatics Association. 2010;17(6):617–23. doi: 10.1136/jamia.2010.005637.
14. Medicare Payment Advisory Commission. A Data Book: Health Care Spending and the Medicare Program. 2011 [cited January 11, 2012]. Available from: http://www.medpac.gov/documents/Jun11DataBookEntireReport.pdf.
15. Korley FK, Pham JC, Kirsch TD. Use of advanced radiology during visits to US emergency departments for injury-related conditions, 1998–2007. JAMA. 2010;304(13):1465–71. doi: 10.1001/jama.2010.1408.
16. Stiell IG, Clement CM, Grimshaw J, Brison RJ, Rowe BH, Schull MJ, et al. Implementation of the Canadian C-Spine Rule: prospective 12 centre cluster randomised trial. BMJ. 2009;339:b4146. doi: 10.1136/bmj.b4146.
17. Stiell IG, Clement CM, Grimshaw JM, Brison RJ, Rowe BH, Lee JS, et al. A prospective cluster-randomized trial to implement the Canadian CT Head Rule in emergency departments. CMAJ. 2010;182(14):1527–32. doi: 10.1503/cmaj.091974.
- 18.Smits M, Dippel DW, Nederkoorn PJ, Dekker HM, Vos PE, Kool DR, et al. Minor head injury: CT-based strategies for management--a cost-effectiveness analysis. Radiology. 2010;254(2):532–40. doi: 10.1148/radiol.2541081672. [DOI] [PubMed] [Google Scholar]
- 19.Melnick ER, Szlezak CM, Bentley SK, Kotlyar S, Post LA. Overuse of CT for mild traumatic brain injury. The Joint Commission Journal on Quality and Patient Safety. 2012. (Accepted, pending revision). [DOI] [PubMed]
- 20.American College of Emergency Physicians. Five Things Physicians and Patients Should Question 2013. [cited 2014 August 12]. Available from: http://www.choosingwisely.org/doctor-patient-lists/american-college-of-emergency-physicians/.
- 21.Schuur JD, Carney DP, Lyn ET, Raja AS, Michael JA, Ross NG, et al. A top-five list for emergency medicine: a pilot project to improve the value of emergency care. JAMA internal medicine. 2014;174(4):509–15. doi: 10.1001/jamainternmed.2013.12688. [DOI] [PubMed] [Google Scholar]
- 22.Stacey D, Bennett CL, Barry MJ, Col NF, Eden KB, Holmes-Rovner M, et al. Decision aids for people facing health treatment or screening decisions. Cochrane Database Syst Rev. 2011(10):CD001431. doi: 10.1002/14651858.CD001431.pub3. [DOI] [PubMed] [Google Scholar]
- 23.Horng S, Goss FR, Chen RS, Nathanson LA. Prospective pilot study of a tablet computer in an Emergency Department. International journal of medical informatics. 2012;81(5):314–9. doi: 10.1016/j.ijmedinf.2011.12.007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Hartmann B, Benson M, Junger A, Quinzio L, Rohrig R, Fengler B, et al. Computer keyboard and mouse as a reservoir of pathogens in an intensive care unit. Journal of clinical monitoring and computing. 2004;18(1):7–12. doi: 10.1023/b:jocm.0000025279.27084.39. [DOI] [PubMed] [Google Scholar]
- 25.Wilson AP, Hayman S, Folan P, Ostro PT, Birkett A, Batson S, et al. Computer keyboards and the spread of MRSA. The Journal of hospital infection. 2006;62(3):390–2. doi: 10.1016/j.jhin.2005.09.007. [DOI] [PubMed] [Google Scholar]
- 26.Kendrick J. Tablets in the enterprise: Pros and cons. April 13, 2012 [April 9, 2015]. Available from: http://www.zdnet.com/article/tablets-in-the-enterprise-pros-and-cons/.
- 27.Hirsch EB, Raux BR, Lancaster JW, Mann RL, Leonard SN. Surface microbiology of the iPad tablet computer and the potential to serve as a fomite in both inpatient practice settings as well as outside of the hospital environment. PLoS One. 2014;9(10):e111250. doi: 10.1371/journal.pone.0111250. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28. Horsky J, Gutnik L, Patel VL. Technology for emergency care: cognitive and workflow considerations.
- 29. Saleem JJ, Patterson ES, Militello L, Asch SM, Doebbeling BN, Render ML. Using human factors methods to design a new interface for an electronic medical record. 2007.
- 30. Russ AL, Zillich AJ, McManus MS, Doebbeling BN, Saleem JJ. A human factors investigation of medication alerts: barriers to prescriber decision-making and clinical workflow. AMIA Annual Symposium Proceedings. 2009;2009:548–52.
- 31. Levin S, Aronsky D, Hemphill R, Han J, Slagle J, France DJ. Shifting toward balance: measuring the distribution of workload among emergency physician teams. Annals of Emergency Medicine. 2007;50(4):419–23. doi: 10.1016/j.annemergmed.2007.04.007.
- 32. Chisholm CD, Dornfeld AM, Nelson DR, Cordell WH. Work interrupted: a comparison of workplace interruptions in emergency departments and primary care offices. Annals of Emergency Medicine. 2001;38(2):146–51. doi: 10.1067/mem.2001.115440.
- 33. Chisholm CD, Collison EK, Nelson DR, Cordell WH. Emergency department workplace interruptions: are emergency physicians "interrupt-driven" and "multitasking"? Academic Emergency Medicine. 2000;7(11):1239–43. doi: 10.1111/j.1553-2712.2000.tb00469.x.
- 34. Wears RL. Introduction: the approach to the emergency department patient. In: Wolfson AB, Harwood-Nuss A, editors. Harwood-Nuss' Clinical Practice of Emergency Medicine. 4th ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2005.
- 35. Faul M, Xu L, Wald MM, Coronado VG. Traumatic Brain Injury in the United States: Emergency Department Visits, Hospitalizations, and Deaths. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2010.
- 36. Marshall LF, Toole BM, Bowers SA. The National Traumatic Coma Data Bank. Part 2: Patients who talk and deteriorate: implications for treatment. Journal of Neurosurgery. 1983;59(2):285–8. doi: 10.3171/jns.1983.59.2.0285.
- 37. Brenner DJ, Hall EJ. Computed tomography--an increasing source of radiation exposure. The New England Journal of Medicine. 2007;357(22):2277–84. doi: 10.1056/NEJMra072149.
- 38. Stiell IG, Wells GA, Vandemheen K, Clement C, Lesiuk H, Laupacis A, et al. The Canadian CT Head Rule for patients with minor head injury. Lancet. 2001;357(9266):1391–6. doi: 10.1016/s0140-6736(00)04561-x.
- 39. Smith-Bindman R, Lipson J, Marcus R, Kim KP, Mahesh M, Gould R, et al. Radiation dose associated with common computed tomography examinations and the associated lifetime attributable risk of cancer. Archives of Internal Medicine. 2009;169(22):2078–86. doi: 10.1001/archinternmed.2009.427.
- 40. Stiell IG, Wells GA. Methodologic standards for the development of clinical decision rules in emergency medicine. Annals of Emergency Medicine. 1999;33(4):437–47. doi: 10.1016/s0196-0644(99)70309-4.
- 41. Stiell IG, Clement CM, Rowe BH, Schull MJ, Brison R, Cass D, et al. Comparison of the Canadian CT Head Rule and the New Orleans Criteria in patients with minor head injury. JAMA. 2005;294(12):1511–8. doi: 10.1001/jama.294.12.1511.
- 42. Smits M, Dippel DW, de Haan GG, Dekker HM, Vos PE, Kool DR, et al. External validation of the Canadian CT Head Rule and the New Orleans Criteria for CT scanning in patients with minor head injury. JAMA. 2005;294(12):1519–25. doi: 10.1001/jama.294.12.1519.
- 43. Papa L, Stiell IG, Clement CM, Pawlowicz A, Wolfram A, Braga C, et al. Performance of the Canadian CT Head Rule and the New Orleans Criteria for predicting any traumatic intracranial injury on computed tomography in a United States Level I trauma center. Academic Emergency Medicine. 2012;19(1):2–10. doi: 10.1111/j.1553-2712.2011.01247.x.
- 44. Stiell IG, Wells GA, Vandemheen K, Laupacis A, Brison R, Eisenhauer MA, et al. Variation in ED use of computed tomography for patients with minor head injury. Annals of Emergency Medicine. 1997;30(1):14–22. doi: 10.1016/s0196-0644(97)70104-5.
- 45. Goldberg L, Lide B, Lowry S, Massett HA, O'Connell T, Preece J, et al. Usability and accessibility in consumer health informatics: current trends and future challenges. American Journal of Preventive Medicine. 2011;40(5 Suppl 2):S187–97. doi: 10.1016/j.amepre.2011.01.009.
- 46. Melnick ER, Shafer K, Rodulfo N, Shi J, Hess EP, Wears RL, et al. Understanding overuse of CT for minor head injury in the ED: a triangulated qualitative study. Academic Emergency Medicine. 2015 Dec. doi: 10.1111/acem.12824. (In press).
- 47. Curry LA, Nembhard IM, Bradley EH. Qualitative and mixed methods provide unique contributions to outcomes research. Circulation. 2009;119(10):1442–52. doi: 10.1161/CIRCULATIONAHA.107.742775.
- 48. Militello LG. Learning to think like a user: using cognitive task analysis to meet today's health care design challenges. Biomedical Instrumentation & Technology. 1998;32(5):535–40.
- 49. Klein GA, Calderwood R, MacGregor D. Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics. 1989;19(3):462–72.
- 50. Curran JA, Brehaut J, Patey AM, Osmond M, Stiell I, Grimshaw JM. Understanding the Canadian adult CT head rule trial: use of the theoretical domains framework for process evaluation. Implementation Science. 2013;8:25. doi: 10.1186/1748-5908-8-25.
- 51. Sheehan B, Nigrovic LE, Dayan PS, Kuppermann N, Ballard DW, Alessandrini E, et al. Informing the design of clinical decision support services for evaluation of children with minor blunt head trauma in the emergency department: a sociotechnical analysis. Journal of Biomedical Informatics. 2013;46(5):905–13. doi: 10.1016/j.jbi.2013.07.005.
- 52. Probst MA, Kanzaria HK, Schriger DL. A conceptual model of emergency physician decision making for head computed tomography in mild head injury. The American Journal of Emergency Medicine. 2014;32(6):645–50. doi: 10.1016/j.ajem.2014.01.003.
- 53. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. Journal of Biomedical Informatics. 2004;37(1):56–76. doi: 10.1016/j.jbi.2004.01.003.
- 54. Stone D, Jarrett C, Woodroffe E, Minocha S. User Interface Design and Evaluation. San Francisco, CA: Elsevier, Inc; 2005.
- 55. Morgan DL. Focus groups. Annual Review of Sociology. 1996;22(1):129–52.
- 56. Weir CR, Nebeker JJ, Hicken BL, Campo R, Drews F, Lebar B. A cognitive task analysis of information management strategies in a computerized provider order entry environment. Journal of the American Medical Informatics Association. 2007;14(1):65–75. doi: 10.1197/jamia.M2231.
- 57. Nemeth CP, Cook RI, Wears RL. Studying the technical work of emergency care. Annals of Emergency Medicine. 2007;50(4):384–6. doi: 10.1016/j.annemergmed.2007.08.013.
- 58. Nemeth CP, Cook RI, Woods DD. The messy details: insights from the study of technical work in health care. IEEE Transactions on Systems, Man and Cybernetics: Part A. 2004;34(6):689–92.
- 59. Vicente KJ. Ecological interface design: progress and challenges. Human Factors. 2002;44(1):62–78. doi: 10.1518/0018720024494829.
- 60. Legare F, Ratte S, Gravel K, Graham ID. Barriers and facilitators to implementing shared decision-making in clinical practice: update of a systematic review of health professionals' perceptions. Patient Education and Counseling. 2008;73(3):526–35. doi: 10.1016/j.pec.2008.07.018.
- 61. Martin RC. Agile Software Development: Principles, Patterns, and Practices. Prentice Hall PTR; 2003.
- 62. Elwyn G, O'Connor A, Stacey D, Volk R, Edwards A, Coulter A, et al. Developing a quality criteria framework for patient decision aids: online international Delphi consensus process. BMJ. 2006;333(7565):417. doi: 10.1136/bmj.38926.629329.AE.
- 63. Flach JM, Vicente KJ, Tanabe F, Monta K, Rasmussen J. An ecological approach to interface design. In: Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting; 1998. p. 295–9.
- 64. Lomotan EA, Hoeksema LJ, Edmonds DE, Ramirez-Garnica G, Shiffman RN, Horwitz LI. Evaluating the use of a computerized clinical decision support system for asthma by pediatric pulmonologists. International Journal of Medical Informatics. 2012;81(3):157–65. doi: 10.1016/j.ijmedinf.2011.11.004.
- 65. Diabetes Medication Choice Decision Aid. Mayo Clinic; 2012 [cited 2014 Sep 7]. Available from: http://diabetesdecisionaid.mayoclinic.org.
- 66. Mann DM, Lin JJ. Increasing efficacy of primary care-based counseling for diabetes prevention: rationale and design of the ADAPT (Avoiding Diabetes Thru Action Plan Targeting) trial. Implementation Science. 2012;7:6. doi: 10.1186/1748-5908-7-6.
- 67. O'Connor AM. Validation of a decisional conflict scale. Medical Decision Making. 1995;15(1):25–30. doi: 10.1177/0272989X9501500105.
- 68. Elwyn G, Hutchings H, Edwards A, Rapport F, Wensing M, Cheung WY, et al. The OPTION scale: measuring the extent that clinicians involve patients in decision-making tasks. Health Expectations. 2005;8(1):34–42. doi: 10.1111/j.1369-7625.2004.00311.x.
- 69. Gellert GA, Ramirez R, Webster SL. The rise of the medical scribe industry: implications for the advancement of electronic health records. JAMA. 2015;313(13):1315–6. doi: 10.1001/jama.2014.17128.
- 70. Montori VM, Breslin M, Maleska M, Weymiller AJ. Creating a conversation: insights from the development of a decision aid. PLoS Medicine. 2007;4(8):e233. doi: 10.1371/journal.pmed.0040233.
- 71. Zuger A. With electronic medical records, doctors read when they should talk. The New York Times. 2014 Oct 14.
- 72. Ackoff RL. From data to wisdom: presidential address to ISGSR, June 1988. Journal of Applied Systems Analysis. 1989;16(1):3–9.