Abstract
Background
Hospitals are required to have rapid response (RR) systems in place to respond to acute changes in a patient’s condition. In high-stress situations like RR, medical residents face decision-making challenges due to time constraints and perceived pressure. Instituting order panels (OPs) can facilitate clinical decision making and improve residents’ and nurses’ satisfaction and patient safety.
Objective
This quality improvement (QI) project aimed to create and institute standardized OPs for common RR clinical scenarios to improve satisfaction of internal medicine residents and nurses with the RR process.
Methods
This was a single tertiary care center QI project that developed OPs for 10 common RR scenarios. Resident and nursing satisfaction with RR was assessed before and after OP implementation via survey and qualitative data collection.
Results
Residents and nurses expressed high levels of satisfaction across various aspects of the RR process before and after OP implementation in both quantitative and qualitative analyses. Increased satisfaction was observed among residents regarding time spent placing orders (94%; P = 0.02) and time spent correcting wrong orders (87%; P = 0.03) after OP implementation. The nurses’ survey revealed no statistically significant differences in satisfaction before and after the implementation of OPs regarding communication, collaboration, efficiency, and organization of the team.
Conclusion
The introduction of standardized OPs for RRs resulted in increased satisfaction among internal medicine residents in terms of order placement and correcting wrong orders. Nurse satisfaction based on survey responses remained neutral. Qualitative data from both groups demonstrated a positive impact on communication, efficiency, and teamwork.
Keywords: rapid response, quality improvement, order sets, nurse satisfaction, medical education
In alignment with Joint Commission recommendations, hospitals are mandated to establish rapid response (RR) systems. These systems, typically staffed by a multidisciplinary team led by a provider, play a crucial role in promptly identifying and responding to acute changes in a patient’s condition, ultimately preventing adverse outcomes (1, 2).
At Cleveland Clinic Fairview Hospital, internal medicine residents are an important part of the RR team, and the senior resident is the leader of the RR team. However, the complexity of managing patients in high-pressure RR scenarios presents a common dilemma. Effective decision making in these situations demands more than medical knowledge; it requires the ability to navigate time constraints, perceived pressure from team members, and the critical clinical acuity of the scenario.
Instituting systems that can support and improve providers’ clinical decision-making process and patient care workflow is one of the most important ways to ensure best clinical outcomes and satisfaction (3). However, a gap exists in the literature regarding strategies to enhance residents’ satisfaction with the RR process while improving their medical training on decision-making processes (4).
We hypothesized that developing order panels (OPs) of common interventions, standardized for the most common scenarios residents face during RRs, would increase residents’ and nurses’ satisfaction with the RR process, simplify order placement, and enhance communication between RR team members. The primary aim of this quality improvement (QI) project was to create and implement OPs for the RR team at Fairview Hospital and then to evaluate residents’ and nurses’ satisfaction before and after implementation.
Methods
Fairview Hospital is a 500-bed tertiary care hospital with an average of 150–180 RR events each month. The RR team is composed of internal medicine residents (a senior resident and an intern on a dedicated RR rotation), a dedicated RR nurse, a nurse manager, a respiratory therapist, and the bedside nurses.
Order Panels Creation
Based on the Get With the Guidelines–Medical Emergency Team registry study by Lyons and colleagues (5) and our internal registry of RR calls, we identified the 10 most common RR scenarios and developed specific OPs: altered mental status, chest pain, shortness of breath, atrial fibrillation with rapid ventricular response, acute bleeding, abdominal pain, anaphylaxis, hypertensive urgency/emergency, hypotension/shock, and seizure. Cardiac arrest (code blue) was not included, because its management is based on Advanced Cardiac Life Support.
The OPs were developed to provide a standardized sequence of orders within the electronic health record (EHR) interface. The OPs encompass common laboratory and imaging tests at the top of the screen, followed by treatment interventions, transfer orders, telemetry orders, and specialist consultation requests at the bottom of the screen. To streamline the process, some laboratory orders are preselected as “default” orders and appear as checked boxes in the EHR. However, residents can uncheck them if they are not necessary for a specific patient. The most important orders are always presented as checked boxes. Medical treatments, including medications, always appear as unchecked boxes, requiring the resident to intentionally select the order.
The design of these OPs underwent a rigorous approval process, characterized by multiple meetings between the QI team and the internal medicine and nursing leadership. This iterative development involved extensive stakeholder engagement, ensuring panels met the needs of residents and nurses. For reference, the OPs can be found in the supplementary material.
Participants
The Fairview internal medicine residency program consists of 38 residents. For this project, a sample of 34 residents was included, excluding those directly involved in the study.
The survey was electronically distributed to approximately 400 nurses working on the medical-surgical floor and within the RR teams. Respiratory therapists and pharmacists were not surveyed, because they do not participate in all RRs.
Residents were e-mailed weekly about the OPs and educated on the use of OPs in person during monthly residency meetings. Nurses were informed of the OPs through weekly e-mails sent by leadership and direct communication from nursing leadership during nursing rounds.
Survey Development and Data Collection
Residents and nurses were surveyed before and after implementation to gauge satisfaction with the RR process, time spent on orders, OP utilization, and barriers to efficiency.
The residents’ survey was developed and adapted from the Veterans Affairs Learners’ Perceptions Survey, a validated survey addressing residents’ satisfaction during clinical training (6). During five virtual focus-group sessions involving the authors and faculty leadership, questions from the Veterans Affairs survey were either retained without changes or modified to specifically explore RR clinical training. This process resulted in a streamlined survey of 20 questions, achieving both saturation and exhaustiveness. The nurses’ satisfaction survey was adapted from other surveys assessing nurses’ satisfaction with RR (7–11) and approved by our institution’s nursing leadership. More details are available in the data supplement.
We developed two web-based survey tools, one for residents and one for nurses, to evaluate satisfaction levels before and after the intervention. Using a 6-point Likert scale ranging from 0 (unable to evaluate) to 5 (very satisfied), we asked respondents to rate their satisfaction with the RR process, with scores of 1 and 2 indicating negative responses (very dissatisfied and dissatisfied), 3 representing a neutral response, and scores of 4 and 5 indicating satisfaction (satisfied and very satisfied).
For data collection, the anonymous surveys were administered through the SurveyMonkey website (https://www.surveymonkey.com). Only two researchers (M.V.F.G. and S.R.) had access to the data to ensure confidentiality.
Statistical Analysis
Participant data were summarized using descriptive statistics. Only surveys with at least one question response were included in the analysis. The Likert-scale responses were dichotomized into two categories, “satisfied” (including ranks 4 and 5) and “not satisfied” (including ranks 1–3), and a chi-square test was used to analyze the survey data.
Because respondents were not assigned tracking numbers, pairing of pre- and postimplementation responses was not possible. Statistical analysis was performed using IBM SPSS software, Version 23.0. A P value < 0.05 was considered statistically significant. We followed the Standards for Quality Improvement Reporting Excellence (SQUIRE 2.0) guidelines for reporting QI projects (12). Because this was a QI project, institutional review board review was waived. Additional information about the statistical analysis is available in the supplementary material.
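To make the dichotomization and comparison concrete, the sketch below reproduces the analysis for one survey item using the counts reported in Table 1 (“Time spent on placing orders”: 21/30 satisfied before implementation, 29/31 after). This is an illustration only: the study analysis was performed in SPSS, so the use of Python with scipy, and the omission of the continuity correction, are our assumptions rather than the authors’ actual procedure.

```python
# Illustrative sketch only: the study used IBM SPSS; this Python/scipy version is an
# assumed re-creation of the described approach (dichotomize Likert scores, then chi-square).
from scipy.stats import chi2_contingency


def dichotomize(ratings):
    """Collapse 6-point Likert responses into (satisfied, not_satisfied) counts.

    Ratings of 0 ("unable to evaluate") are excluded; 4-5 count as satisfied,
    1-3 as not satisfied, as described in the Methods.
    """
    usable = [r for r in ratings if r != 0]
    satisfied = sum(1 for r in usable if r >= 4)
    return satisfied, len(usable) - satisfied


# 2 x 2 contingency table built from the counts in Table 1 for
# "Time spent on placing orders": 21/30 satisfied before, 29/31 after.
table = [[21, 30 - 21],   # before: satisfied, not satisfied
         [29, 31 - 29]]   # after:  satisfied, not satisfied

# correction=False gives the Pearson chi-square; whether a continuity correction
# was applied in SPSS is not stated in the text, so this setting is an assumption.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")
# Prints roughly chi-square = 5.72, P = 0.017, consistent with the P = 0.02 in Table 1.
```

Under these assumptions, the Pearson chi-square (without continuity correction) reproduces the reported P value for this item; the same two-column structure applies to every dichotomized survey question.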
Qualitative Analysis
A thematic coding system, following Braun and Clarke’s model, was created to analyze the free-text survey responses (13). Two authors (S.R. and M.V.F.G.) completed the coding process to identify themes, and any discrepancies were resolved through discussion and agreement.
Timeline of the Study
This QI project commenced in September 2022 with the creation of the OPs and the pre-OP implementation survey. Institutional approval for the OPs was obtained in October 2022. Subsequently, the preimplementation survey was distributed in November 2022, followed by OP implementation in December 2022. In February 2023, the postimplementation survey was distributed. For a visual representation of the project timeline, refer to Figure 1.
Figure 1.
Timeline of the quality improvement (QI) project.
Results
Residents’ Survey
In the preimplementation survey, 32 out of 34 residents responded (94.1%), and in the postimplementation survey, 31 out of 34 residents responded (91.2%). There was no significant difference in the distribution of respondents’ postgraduate year (PGY) levels between the pre- and postimplementation surveys (P = 0.88).
Regarding formal RR rotations, 68.8% of residents reported having completed a formal RR rotation in the preimplementation survey, compared with 70.0% in the postimplementation survey (P = 0.92). In the preimplementation survey, only 43.8% of residents had completed 4 or more weeks of RR rotations, whereas in the postimplementation survey all residents had completed at least 4 weeks of RR rotation (P < 0.01), reflecting the additional months of training between the two surveys.
After the implementation of the OPs, among the postintervention sample (n = 31), 32.3% of residents reported being aware of the RR OPs but having never used them, 38.7% had used them between one and five times, and 29.1% had used them five or more times. There was no significant difference in OP usage across PGY levels (P = 0.32).
After the implementation of OPs, there was an improvement in resident satisfaction with the time spent placing orders (from 70.0% to 93.5%; P = 0.02) and the time spent correcting wrong orders (from 60.7% to 87.1%; P = 0.03). Although not statistically significant, there was an upward trend in satisfaction regarding the ease of placing orders (from 83.3% to 96.9%; P = 0.07). However, there were no statistically significant differences between the preintervention and postintervention groups in the residents’ survey in terms of autonomy, length of RR, learning process, preparation for future clinical practice, confidence in managing sick patients, quality of care, patient safety, communication, and overall satisfaction. Residents’ survey results are presented in Table 1.
Table 1.
Residents’ satisfaction survey results: before and after rapid response order panels
| Survey Question* | Before Intervention (n = 32) | After Intervention (n = 31) | P Value |
|---|---|---|---|
| Degree of autonomy | 23/27 (85.2) | 24/29 (82.8) | 0.80 |
| Length of rapid response | 20/26 (76.9) | 23/31 (74.2) | 0.81 |
| Learning process | 27/29 (93.1) | 29/31 (93.5) | 0.95 |
| Preparation for future clinical practice | 24/29 (82.8) | 26/31 (83.9) | 0.90 |
| Confidence managing sick patients | 22/26 (84.6) | 25/31 (80.6) | 0.69 |
| Quality of care | 28/28 (100) | 29/30 (96.7) | 0.98 |
| Patient safety | 28/29 (96.6) | 29/30 (96.7) | 0.99 |
| Interdisciplinary work | 23/29 (79.3) | 25/30 (83.3) | 0.74 |
| Time available for chart review and history | 23/29 (79.3) | 27/31 (87.1) | 0.41 |
| Ease of placing orders | 25/30 (83.3) | 30/31 (96.9) | 0.07 |
| Time spent on placing orders | 21/30 (70.0) | 29/31 (93.5) | 0.02 |
| Time spent correcting wrong orders | 17/28 (60.7) | 27/31 (87.1) | 0.03 |
| Friendliness of EMR order panels in general | 20/30 (66.7) | 25/29 (86.2) | 0.12 |
| STAT orders are executed urgently | 20/30 (66.7) | 24/30 (80.0) | 0.38 |
| Delays in care during RR | 15/29 (51.7) | 14/27 (51.9) | 0.99 |
| Communication between residents | 29/30 (96.7) | 30/31 (96.8) | 1.00 |
| Communication with nurses | 24/30 (80.0) | 27/31 (87.1) | 0.50 |
| Satisfaction with the current structure of RR | 22/29 (75.9) | 27/31 (87.1) | 0.32 |
| Stress levels during RR | 19/30 (63.3) | 22/31 (71.0) | 0.59 |
| Overall satisfaction | 29/31 (93.5) | 28/31 (90.3) | 0.64 |
Definition of abbreviations: EMR = electronic medical records; RR = rapid response.
Data are presented as n (%) satisfied.
Satisfied = rank of 4–5 on a 6-point Likert scale.
Nurses’ Survey
A total of 130 nurses participated in the preimplementation survey, and 86 nurses responded to the postimplementation survey. The nurses’ preimplementation survey revealed that 81.5% of respondents were medical-surgical nurses, 11.5% were RR nurses, 3.9% were nurse managers, and 3.1% fell into other categories. In the postimplementation survey, the percentages were 66.3% medical-surgical floor nurses, 19.8% RR nurses, 4.6% nurse managers, and 9.3% other categories.
In both the preimplementation and postimplementation surveys, the median years of practice since graduation were 5 years (interquartile range, 2–12 yr). Notably, 99.1% of respondents reported being personally involved in an RR over the past year.
The satisfaction rates of nurses regarding the clarity of residents’ orders, timeliness of residents’ arrival, quality of care provided by residents to sick patients, teamwork between nurses and residents, patient safety during RRs, residents’ contingency plans, and their overall satisfaction with RR service remained unchanged between the pre- and postintervention periods. Nurses’ survey results are presented in Table 2. Sensitivity analyses were conducted for all questions in both the residents’ and nurses’ surveys, and the results remained unchanged (data supplement).
Table 2.
Nurses’ satisfaction survey before and after rapid response order panels
| Survey Question* | Before Intervention (n = 130) | After Intervention (n = 86) | P Value |
|---|---|---|---|
| Clarity of residents’ orders | 78/113 (69.0) | 57/80 (71.3) | 0.75 |
| Timeliness of residents’ arrival | 87/113 (77.0) | 66/80 (82.5) | 0.37 |
| Residents’ communication with you | 78/113 (69.0) | 53/80 (66.3) | 0.75 |
| Quality of care provided by residents to sick patients | 73/112 (65.2) | 58/80 (72.5) | 0.34 |
| Collaboration between residents and you to develop a plan of care | 62/113 (54.9) | 50/80 (62.5) | 0.30 |
| Efficiency and organization of the residents’ team | 62/114 (54.4) | 48/80 (60.0) | 0.46 |
| Teamwork between nurses and residents | 75/113 (66.4) | 50/80 (62.5) | 0.64 |
| Patient safety during RR | 99/114 (86.8) | 68/80 (85.0) | 0.83 |
| Residents listened to your concerns | 68/113 (60.2) | 48/80 (60.0) | 0.99 |
| Your involvement in the decision-making process | 61/113 (54.0) | 44/80 (55.0) | 0.90 |
| Duration of the rapid response | 59/113 (52.2) | 44/80 (55.0) | 0.77 |
| Resident’s contingency plan if patient decompensates | 60/113 (53.1) | 51/80 (63.7) | 0.18 |
| Your overall satisfaction with RR service | 75/111 (67.7) | 56/80 (70.0) | 0.75 |
Definition of abbreviation: RR = rapid response.
Data are presented as n (%) satisfied.
Satisfied = rank of 4–5 on a 6-point Likert scale.
Qualitative Analysis
The qualitative survey responses provided valuable insights into residents’ and nurses’ perspectives on the RR OP intervention (Table 3). Residents found the RR OPs helpful but believed their use should not be mandatory, to avoid compromising critical thinking. They noted the OPs’ usefulness for STAT orders and as reminders for important orders during RRs. Residents recommended early implementation and reminders during their training. Nurses reported improved communication and teamwork collaboration with residents after implementation.
Table 3.
Qualitative survey responses categorized (Braun and Clarke’s model)
| Residents’ survey |
| Ease of placing orders “It is a helpful resource to have, but should not be mandatory as sometimes it takes away critical thinking and may go against ‘choosing wisely’ when we are biased after seeing an order panel and feel compelled to place orders with minimal indication.” – IM PGY2 postintervention survey |
| Time spent on placing orders “As a PGY1 I am still learning how to run a rapid in a timely efficient manner and RR order sets help a lot in reminding what to order. Especially that changing routine orders into STAT one by one take time so RR order sets would be more efficient in terms of patient care! Thank you!” – IM PGY1 postintervention survey |
| Time spent correcting wrong order “The new RR order panel helps mainly with placing the orders STAT and as a reminder for important orders to be placed.”– IM PGY1 postintervention survey |
| Overall impression “They’re a great tool. The times that I have used it, I have found it very helpful.” – IM PGY1 postintervention survey “Suggest implementing early during the training.” – IM PGY2 postintervention survey “Reminders should be made to use it during the RR, or else we tend to forget about it and thus not to use it.” – IM PGY3 postintervention survey |
| Nurses’ survey |
| Communication between residents and nursing staff “At times there have been missed orders due to orders not being heard and feedback response not used. Rapids can be busy, and residents need to make sure they are being understood.” – Nurse Manager, 5 yr of practice, preintervention survey “I feel that communication between the residents and nursing staff has been effective overall. I appreciate the ‘summary’ or conclusion the resident often provides at the end. Stating ‘if this happens then call this person or do this.’ It lets everyone know our next steps.” – Med/Surg RN, 10 yr of practice, postintervention survey “Residents asking if we are comfortable with the plan has been extremely comforting. We all are given a chance to ask questions pertaining to next steps also. I feel RRs have had improved closed-loop communication over the last months.” – Med/Surg RN, 4 yr of practice, postintervention survey |
| Efficiency of the RR “There can be delays in deciding what to do. Order sets for common issues would be very beneficial for all.” – Med/Surg RN, 8 yr of practice, preintervention survey “I believe standardized order sets for certain types of RR will help our residents feel more confident in emergent situations.” – Med/Surg RN, 3 yr of practice, preintervention survey “I think it is very efficient and timely when one resident is running the rapid and their colleague is entering the orders in real time.” – Med/Surg RN, 10 yr of practice, postintervention survey |
| Teamwork and satisfaction with RR service “They want us to grab things in a timely manner but also not place the orders quickly.” – Med/Surg RN, 2 yr of practice, preintervention survey |
| Overall Impression “In general, I feel the residents do a wonderful job of helping during the RR and have gotten much better at communicating with nursing staff and making sure we are comfortable with the plan.” – Med/Surg RN, 6 yr of practice, postintervention survey. “I have personally noticed great improvement in our rapids over the past few months, specifically in the areas of communication, collaboration, and organization. Keep doing a great job.” – Nurse Manager, 3 yr of practice, postintervention survey |
Definition of abbreviations: IM = internal medicine; PGY = postgraduate year; RR = rapid response; STAT = short turnaround time.
See Reference 13 for description of Braun and Clarke’s model.
Discussion
The RR OP creation and implementation process received strong support from residents, nurses, and leadership. Overall, satisfaction levels among residents and nurses regarding various aspects of the RR process remained high, both before and after introducing standardized OPs. Most surveyed items showed no significant differences, except for residents’ satisfaction with the time spent on order placement and wrong order correction.
Our findings of increased resident satisfaction with the time spent on order placement and the correction of incorrect orders are consistent with previous research. In 2001, Lovis and colleagues studied physician satisfaction with order entry at a Veterans Affairs facility and showed that a command-line interface was easy to learn and enabled physicians to place orders more rapidly than with the traditional menu-driven system (14). Nemeh and colleagues conducted a QI project demonstrating that implementing order sets reduced the time medical residents spent on order placement and navigating the EHR (15).
The majority of the residents in this study had been involved in fewer than 20 RR cases. For residents, relying solely on memory and past experience during an RR can be time consuming and may not always ensure optimal patient outcomes (16); implementing OPs can therefore help change both people and systems by contributing to patient safety, streamlining workflows to improve efficiency, and promoting adherence to standards of care (17–20).
Despite those benefits, we anticipated encountering resistance from residents, especially those who preferred individual laboratory test orders over standardized panels. We considered that senior residents might be more resistant to this change than their junior counterparts, but we found no differences in usage across different PGY levels.
Remarkably, 32% of the residents opted not to use the OPs despite receiving training in their use. Although we did not quantitatively assess the specific reasons residents chose not to use the OPs, qualitative analysis highlighted recurring concerns about over-ordering and unnecessary orders.
Order sets often do not align with what providers need at the point of care during an RR (21), and standard order sets may actually result in excessive ordering of laboratory tests (22). To balance standardization and individualized care, we left most OP boxes unchecked, so the residents had the educational opportunity to quickly review common RR order patterns while allowing for personalized consideration of each RR need.
Previous data show that nurses have consistently reported high levels of satisfaction with RR processes in various healthcare settings, but it remains unclear which specific aspects of the RR teams are responsible for such satisfaction (23). In the qualitative survey, nurses expressed improved satisfaction with the effectiveness of communication between residents and nursing staff and that the implementation of OPs facilitated closed-loop communication by enabling residents to concisely summarize the plan and provide clear next steps. However, in the quantitative analysis, these changes were not statistically significant.
Possible reasons for our neutral findings include the possibility that the OP intervention did not affect areas directly tied to nursing satisfaction; sample size limitations; other contextual factors influencing nursing satisfaction, such as staffing and workload; and the subjective nature of satisfaction in a scenario as complex as an RR.
We did not assess the costs or cost-effectiveness of this intervention at either the patient or hospital level. Recent findings from a systematic review indicate a lack of costing analyses conducted alongside the implementation of hospital-based computerized decision support systems, with considerable variability in existing data (24). Given the intricacies of RR scenarios and the variability in the OPs we developed, it remains uncertain whether our OPs resulted in any cost-effective changes. Further investigation would be needed to evaluate the financial impact of this intervention.
Strengths and Limitations
The current study presents notable strengths, including a high survey response rate and the use of validated measures to evaluate satisfaction. However, the single-center QI design and the specific context of our institution may limit the generalizability of the findings to other healthcare settings, and the project did not capture data on patient outcomes.
In the future, additional studies can provide valuable insights into the impact of standardized OPs on patient-related outcomes such as mortality and length of stay. These investigations should be complemented with an analysis of potential issues related to the overuse of orders, unnecessary orders, overall team efficiency, and cost effectiveness. Moreover, it is essential to consider feedback from team members and make necessary adjustments to the RR process to enhance their satisfaction.
Conclusions
RR is a complex scenario, and this QI project showed that the introduction of standardized OPs for common RRs increased internal medicine residents’ satisfaction with order placement and the correction of wrong orders. Nurse satisfaction based on survey responses remained neutral, but qualitative data from both groups demonstrated a positive impact on communication, efficiency, and teamwork. Overall, our findings support implementing standardized OPs for RRs from the internal medicine residents’ perspective, but further research is needed to explore cost effectiveness, the impact on nurse satisfaction, and patient-centered outcomes.
Acknowledgments
The authors thank the internal medicine residents, bedside nurses, rapid response nurses, nurse managers, pharmacists, respiratory therapists, patient care nursing assistants, and all ancillary staff who make our rapid response team exceptional. The authors appreciate their unwavering commitment to patient care, especially during challenging times.
Footnotes
The views expressed in this article do not represent an official position of the Cleveland Clinic or Fairview Hospital.
Author Contributions: M.V.F.G., S.R., E.K., K.M.H., M.J.A.-J., and F.F. conceptualized and designed the study, collected and analyzed the data, wrote the manuscript, and approved the final version.
This article has a data supplement, which is accessible at the Supplements tab.
Author disclosures are available with the text of this article at www.atsjournals.org.
References
1. Agency for Healthcare Research and Quality, Patient Safety Network. Rapid response systems. 2019. https://psnet.ahrq.gov/primer/rapid-response-systems
2. Benin AL, Borgstrom CP, Jenq GY, Roumanis SA, Horwitz LI. Defining impact of a rapid response team: qualitative study with nurses, physicians and hospital administrators. BMJ Qual Saf. 2012;21:391–398. doi: 10.1136/bmjqs-2011-000390.
3. Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med. 2020;3:17. doi: 10.1038/s41746-020-0221-y.
4. Butcher BW, Quist CE, Harrison JD, Ranji SR. The effect of a rapid response team on resident perceptions of education and autonomy. J Hosp Med. 2015;10:8–12. doi: 10.1002/jhm.2270.
5. Lyons PG, Edelson DP, Carey KA, Twu NM, Chan PS, Peberdy MA, et al.; American Heart Association’s Get With the Guidelines–Resuscitation Investigators. Characteristics of rapid response calls in the United States: an analysis of the first 402,023 adult cases from the Get With the Guidelines Resuscitation–Medical Emergency Team Registry. Crit Care Med. 2019;47:1283–1289. doi: 10.1097/CCM.0000000000003912.
6. Keitz SA, Holland GJ, Melander EH, Bosworth HB, Pincus SH; VA Learners’ Perceptions Working Group. The Veterans Affairs Learners’ Perceptions Survey: the foundation for educational quality improvement. Acad Med. 2003;78:910–917. doi: 10.1097/00001888-200309000-00016.
7. Al Qahtani S. Satisfaction survey on the critical care response team services in a teaching hospital. Int J Gen Med. 2011;4:221–224. doi: 10.2147/IJGM.S17361.
8. Radeschi G, Urso F, Campagna S, Berchialla P, Borga S, Mina A, et al. Factors affecting attitudes and barriers to a medical emergency team among nurses and medical doctors: a multi-centre survey. Resuscitation. 2015;88:92–98. doi: 10.1016/j.resuscitation.2014.12.027.
9. Jones D, Baldwin I, McIntyre T, Story D, Mercer I, Miglic A, et al. Nurses’ attitudes to a medical emergency team service in a teaching hospital. Qual Saf Health Care. 2006;15:427–432. doi: 10.1136/qshc.2005.016956.
10. Galhotra S, Scholle CC, Dew MA, Mininni NC, Clermont G, DeVita MA. Medical emergency teams: a strategy for improving patient care and nursing work environments. J Adv Nurs. 2006;55:180–187. doi: 10.1111/j.1365-2648.2006.03901.x.
11. Astroth KS, Woith WM, Stapleton SJ, Degitz RJ, Jenkins SH. Qualitative exploration of nurses’ decisions to activate rapid response teams. J Clin Nurs. 2013;22:2876–2882. doi: 10.1111/jocn.12067.
12. Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. BMJ Qual Saf. 2016;25:986–992. doi: 10.1136/bmjqs-2015-004411.
13. Braun V, Clarke V. Is thematic analysis used well in health psychology? A critical review of published research, with recommendations for quality practice and reporting. Health Psychol Rev. 2023;17:695–718. doi: 10.1080/17437199.2022.2161594.
14. Lovis C, Chapko MK, Martin DP, Payne TH, Baud RH, Hoey PJ, et al. Evaluation of a command-line parser-based order entry pathway for the Department of Veterans Affairs electronic patient record. J Am Med Inform Assoc. 2001;8:486–498. doi: 10.1136/jamia.2001.0080486.
15. Nemeh CN, Muradali K, Arnell M, Horstman MJ, Kulkarni PA. Implementation of a rapid response order set in a single-center metropolitan Veterans Affairs medical center: a quality improvement study [abstract]. Am J Respir Crit Care Med. 2021;203:A2873.
16. Horsky J, Kaufman DR, Patel VL. When you come to a fork in the road, take it: strategy selection in order entry. AMIA Annu Symp Proc. 2005;2005:350–354.
17. Bobb AM, Payne TH, Gross PA. Viewpoint: controversies surrounding use of order sets for clinical decision support in computerized provider order entry. J Am Med Inform Assoc. 2007;14:41–47. doi: 10.1197/jamia.M2184.
18. Bates DW, Leape LL, Cullen DJ, Laird N, Petersen LA, Teich JM, et al. Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA. 1998;280:1311–1316. doi: 10.1001/jama.280.15.1311.
19. Micek ST, Roubinian N, Heuring T, Bode M, Williams J, Harrison C, et al. Before-after study of a standardized hospital order set for the management of septic shock. Crit Care Med. 2006;34:2707–2713. doi: 10.1097/01.CCM.0000241151.25426.D7.
20. Guirgis FW, Jones L, Esma R, Weiss A, McCurdy K, Ferreira J, et al. Managing sepsis: electronic recognition, rapid response teams, and standardized care save lives. J Crit Care. 2017;40:296–302. doi: 10.1016/j.jcrc.2017.04.005.
21. Li RC, Wang JK, Sharp C, Chen JH. When order sets do not align with clinician workflow: assessing practice patterns in the electronic health record. BMJ Qual Saf. 2019;28:987–996. doi: 10.1136/bmjqs-2018-008968.
22. Leis B, Frost A, Bryce R, Coverett K. Standard admission order sets promote ordering of unnecessary investigations: a quasi-randomised evaluation in a simulated setting. BMJ Qual Saf. 2017;26:938–940. doi: 10.1136/bmjqs-2017-006898.
23. Salvatierra GG, Bindler RC, Daratha KB. Rapid response teams: is it time to reframe the questions of rapid response team measurement? J Nurs Scholarsh. 2016;48:616–623. doi: 10.1111/jnu.12252.
24. Donovan T, Abell B, Fernando M, McPhail SM, Carter HE. Implementation costs of hospital-based computerised decision support systems: a systematic review. Implement Sci. 2023;18:7. doi: 10.1186/s13012-023-01261-8.

