BMJ Open Quality. 2019 Dec 6;8(4):e000588. doi: 10.1136/bmjoq-2018-000588

Filming for auditing of real-life emergency teams: a systematic review

Lise Brogaard,1 Niels Uldbjerg2
PMCID: PMC6937091  PMID: 31909207

Introduction

Delivering high-quality emergency care is the ambition for every emergency team. To succeed requires not only that the individual provider is well trained; it also commands a rapid and coordinated team effort.1 2 However, performance often falls short of expectations.3–5 Therefore, strategies like simulation training, audits, feedback and debriefings have been studied.6–10 Furthermore, filming of emergency teams was introduced back in 1969.11 Filming makes it possible to review and analyse the performance in detail.12–14 Despite the widespread availability and acceptability of video as a method for auditing and quality improvement in healthcare today,15 16 it is still not used by the majority of emergency teams.17

This review describes the current evidence for video review to audit emergency teams' management of real-life patients. Video review is defined in this manuscript as any assessment, evaluation or audit in which video is used.

The key questions in this systematic review are:

  1. Where has video review been used (populations and settings)?

  2. How has video review been used (technical solutions, legal and ethical issues)?

  3. What is the evidence that video review improves patient care?

Methods

This systematic review followed the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P).18 The full study protocol is registered with PROSPERO.

Eligibility criteria

The eligibility criteria were based on the PICOS (Population, Intervention, Control, Outcome and Study design) guideline.19

Population was resuscitation teams, code teams, emergency teams, trauma teams, rapid response teams in hospitals.

Intervention was video review.

Control/comparison was non-exposed teams.

Outcome was any assessment of the team’s performance and/or patient outcome.

Study designs eligible for inclusion were randomised controlled trials (RCTs), non-randomised studies (non-RCTs, interrupted time series, controlled before-and-after studies, cohort studies) and cross-sectional studies. Single case reports and unpublished studies (eg, conference abstracts and non-English papers) were excluded. We also excluded studies of teams performing planned activities (elective procedures or operations) and of teams performing procedures in out-of-hospital settings or simulated environments.

Literature search

The full search strategy for MEDLINE is available in PROSPERO. The search was conducted on 15 March 2018, and this search strategy was adapted to the other databases. The databases used were: (1) Ovid MEDLINE (1946 to present); (2) Embase (1974 to present; Ovid); (3) PsycINFO (1806 to present; Ovid); (4) the Cochrane Central Register of Controlled Trials (CENTRAL); and (5) the Cochrane Database of Systematic Reviews (current issue; part of the Cochrane Library). The literature search was supplemented with studies found by reviewing the references of the included studies.

Study selection and data extraction

LB and our librarian (KRS) conducted the literature search. LB checked for duplicates and conducted an initial screening, drawing up a preliminary list of records (n=157). LB and our colleague KJ independently assessed and extracted the data and conducted the risk of bias assessment.

Data synthesis and risk of bias

The included studies were characterised by setting, population, outcomes, technical solution, ethical solution and risk of bias (low, unclear or high according to four criteria).20

  1. Selection bias was categorised as low risk if inclusion of participants was clearly described and representative of the population.

  2. Performance bias was categorised as low risk if the majority of healthcare providers were included in the study and as high risk if only a small part of the teams agreed to be filmed.

  3. Measurement bias 1 was categorised as low if a validated tool, checklist or protocol was used.

  4. Measurement bias 2 was categorised as low if several raters independently assessed the video and their agreement was acceptable (eg, measured by kappa or intraclass correlation >0.75); a minimal computational sketch of this agreement check follows below.
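Where several raters independently score the same videos, their agreement can be checked before the scores are used. Below is a minimal sketch, in Python, of how such a check could be run; it computes Cohen's kappa for two raters from plain lists, and the example ratings are purely illustrative rather than data from any included study.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters scoring the same items with categorical labels."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        # Observed agreement: proportion of items where both raters gave the same label
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected chance agreement from each rater's marginal label frequencies
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Hypothetical example: two raters judging ten videotaped tasks as done (1) or not done (0)
    rater_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    rater_2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
    print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # values above 0.75 would count as acceptable here

For more than two raters, or for continuous scores, an intraclass correlation coefficient would replace kappa, but the acceptance threshold (>0.75) used in this review is applied in the same way.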

Patient and public involvement

This systematic review was conducted with no patient and public involvement.

Results

Study selection

The literature search identified 7077 papers, of which 157 were assessed in full text by two reviewers. A further eight studies were identified by checking the reference lists. Among these 165 studies, 50 were eligible according to the inclusion criteria (online supplementary table S1),4 5 8 13 21–66 and five of these evaluated the impact of video review with regard to improvement in patient care (PICOS)8 21 23 41 62 (figure 1).

Figure 1. PRISMA flow diagram. PICOS, Population, Intervention, Control, Outcome and Study design; PRISMA, Preferred Reporting Items for Systematic Review and Meta-Analysis.

Supplementary data

bmjoq-2018-000588supp001.pdf (2.7MB, pdf)

Study designs

There were cohort or case–control studies (n=6),8 21–23 27 42 cross-sectional studies (n=27)24 25 28 30 33–40 43 44 46 47 49 51–54 56–59 65 66 and case reports or case series (n=17)4 5 13 26 27 29 31 32 45 48 50 55 60–64 (figure 2). There were no RCTs or protocols for RCTs.

Figure 2. The included studies in the systematic review. CPR, cardiopulmonary resuscitation; ROSC, return of spontaneous circulation.

Population and setting

We identified five categories of team: cardiac arrest team (n=7),41–47 neonatal resuscitation team (n=18),48–65 trauma team (n=15),4 5 8 13 21–31 paediatric trauma team (n=9)32–40 and obstetric emergency team (n=1).66 The 50 studies were conducted at 30 different hospitals situated in the USA (n=12), Australia (n=1), Europe (n=11) and Asia (n=5) (figure 2).

Technical solution: how to capture the team’s management

The technical solution was described in 43 of the studies. The filming procedure was continuous video recording (n=6),4 28 31 43 56 58 manual activation where someone had to ‘press the button’ (n=20),5 8 13 23 24 26 27 29 32 33 44–47 50 53–55 57 64 or recording activated automatically, either by motion triggers (n=3)48 59 60 or by Bluetooth in the team leader’s telephone, which activated the camera when the leader entered the room (n=1).66
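To illustrate the principle behind motion-triggered activation (a generic sketch, not the set-up used in any of the cited studies), a camera feed can be monitored for frame-to-frame change and recording started only once the change exceeds a threshold. The device index, pixel threshold and output file name below are arbitrary assumptions.

    import cv2

    CAMERA_INDEX = 0          # assumed device index of the room camera
    MOTION_THRESHOLD = 5000   # assumed number of changed pixels that counts as motion

    cap = cv2.VideoCapture(CAMERA_INDEX)
    ok, previous = cap.read()
    if not ok:
        raise SystemExit("camera not available")
    previous_gray = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)

    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = None  # recording has not started yet

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Count pixels that changed noticeably since the previous frame
        diff = cv2.absdiff(previous_gray, gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        changed_pixels = cv2.countNonZero(mask)
        previous_gray = gray
        if writer is None and changed_pixels > MOTION_THRESHOLD:
            # Movement detected: start recording automatically
            height, width = frame.shape[:2]
            writer = cv2.VideoWriter("emergency_recording.mp4", fourcc, 25.0, (width, height))
        if writer is not None:
            writer.write(frame)

    cap.release()
    if writer is not None:
        writer.release()

A Bluetooth-based trigger would follow the same pattern, with the ‘start recording’ condition replaced by detection of the team leader’s phone entering the room.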

In 20 studies, the cameras were placed to give either a bird’s eye view or a high-angle view. In a bird’s eye view, the camera is placed directly above the patient; for example, the camera could be mounted on the radiant heater above the neonatal resuscitation table.48 50 53 57–60 62 63 In a high-angle view, the cameras were mounted in a corner of the ceiling, giving an oblique view.32 33 41 45–47 66 Cameras with a low-angle view were used in studies prioritising patient anonymity.8 13 26 27 An example of a low-angle view is a camera mounted on the resuscitation bed, either at the foot of the bed or behind the patient’s head.

Only a few studies described recording sound. Where described, microphones were either integrated into the camera or mounted separately in the ceiling.4 31 57 None of the included studies reported staff wearing microphones, which has been described in other video reviews, for example, of elective operations.67

To inform the video review about the patient’s condition (eg, vital signs and echocardiography), the patient monitor was recorded in five studies,31 42 49 51 65 and in one of those, several views were recorded simultaneously, viz three camera views of the team’s management and one view of the patient’s monitor.

Informed consent

Consent was collected in 14 studies, either from everyone who appeared in the video (n=4)26 40 53 66 or only from either staff or patients (n=10).48–51 54–58 65 In the majority of the studies, informed consent from the patient could not be obtained before video recording owing to the patient’s situation, for example, cardiac arrest or severe trauma. In these studies, the hospital approved the project ethically and legally and waived informed consent on the grounds that video recording was considered a quality assurance measure.

Outcomes

To audit the care provided in the video review, four categories of outcomes were identified: teams’ clinical performance (n=34), teams’ technical performance (n=13), teams’ non-technical performance (n=17) and patient outcome (n=14) (figure 2). Teams’ clinical performance consisted of the overall management of the emergency and was assessed in terms of adherence to protocol or guideline.4 5 8 21–28 30 32–45 47 54 55 57 58 62 63 66 Technical performance was assessed for procedures like intubation or chest compression depth.13 46 48 49 51–53 56 59–61 64 65 Non-technical performance comprised decision-making, situation awareness, communication, leadership, teamwork or vigilance and was assessed in terms of use of checklist or descriptive analysis.24 25 27 28 30 31 35–37 39 40 43 47 53 57 58 63 Patient outcomes comprised return of spontaneous circulation (ROSC), length of hospital stay or survival and were assessed based on either the video or medical charts (n=14).4 21 22 26 28 41–43 48 49 51 65

Evidence that video review improves patient care

Five studies evaluated the effect of video review on the provided care:8 21 23 41 62 trauma teams (n=3),8 21 23 cardiac arrest teams (n=1)41 and neonatal resuscitation teams (n=1)62 (table 1).

Table 1.

Studies evaluating the effect of video review on provided care (PICOS)

Hoyt (1988)8
  Comparator group: n=60 (verbal feedback)
  Video review group: n=180
  Team outcome: ATLS*
  Results: Team: improved in timely delivery of care (ATLS) for trauma patients with ISS†>20 (44/15%, p≤0.001)
  Risk of bias (A B C D): ? + +

Townsend (1993)21
  Comparator group: n=361 (video, no feedback)
  Video review group: n=522
  Team outcome: ATLS*; Patient outcome: survival
  Results: Team: improved significantly in timely delivery of care (mean 97.5/88.6 min, p<0.01). Patient: mortality reduced from 10.8% to 10.4% (p<0.01) by TRISS‡; no difference between groups by ISS†
  Risk of bias (A B C D): ? ? + ?

Scherer (2002)23
  Comparator group: n=27 (verbal feedback)
  Video review group: n=24
  Team outcome: ATLS*
  Results: Team: improved significantly in 8/10 items of the ATLS checklist
  Risk of bias (A B C D): ? +

Carbine (2000)62
  Comparator group: n=25 (first of the 100 consecutive teams)
  Video review group: n=25 (last of the 100 consecutive teams)
  Team outcome: Neonatal Resuscitation Protocol*
  Results: Neonatal resuscitation: no overall improvement in resuscitation score; one action, ‘deep suctioning’, improved significantly (p=0.002)
  Risk of bias (A B C D): ? ? + +

Jiang (2010)41
  Comparator group: n=15 (first of the 45 consecutive teams)
  Video review group: n=15 (last of the 45 consecutive teams)
  Team outcome: ALS*; Patient outcome: ROSC, survival
  Results: Team: improved significantly in 3/8 resuscitation tasks (p<0.001). Survival: no improvement (survival to discharge n=2 in control group and n=1 in video group)
  Risk of bias (A B C D): + + + +

(A) Selection bias was categorised as low risk if inclusion of participants was clearly described and representative of the population.

(B) Performance bias was categorised as low risk if the majority of healthcare providers were included in the study and as high risk if only a small part of the team agreed to be filmed.

(C) Measurement bias 1 was categorised as low if a validated tool, checklist or protocol was used.

(D) Measurement bias 2 was categorised as low if several raters independently assessed the video and their agreement was acceptable (eg, measured by kappa or intraclass correlation >0.75).

Quality items (risk of bias A–D) are scored as: + adequate; ? unclear; − inadequate.

*Adherence to a protocol or clinical guideline, for example, ATLS (Advanced Trauma Life Support), ALS (Advanced Life Support for cardiac arrest).

†ISS: (Injury Severity Score) is an anatomical scoring system that provides an overall score for patients with multiple injuries.

‡TRISS: the probability of survival (Ps) of a patient, calculated from the ISS and the Revised Trauma Score (RTS) using a regression formula.68

PICOS, Population, Intervention, Control, Outcome and Study design; ROSC, return of spontaneous circulation.

Intervention by video review

The teams’ management of real-life patients was filmed, and the teams received feedback at an educational conference where they reviewed their performance.

Controls

The control groups in the five studies were not blinded, as both the intervention group and the control group were aware of the filming. Three studies evaluated the effect before and after the introduction of video review,21 41 62 and two studies compared verbal feedback without video to a video-feedback conference.8 23

Quality of studies

Four studies had high or unclear risk of bias8 21 23 62 and one study of cardiac arrest teams41 had low risk of bias (table 1).

Meta-analysis

Meta-analysis was not performed owing to the heterogeneity of the studies and the lack of consistently applied reference standards.

Level of evidence for video review

All five studies were in favour of video review as an educational intervention to improve patient care. Four of the five studies found that video review significantly improved teams’ clinical performance, assessed either as improved guideline adherence or as less time spent providing patient care8 21 23 41 (table 1). Two studies evaluated whether the improved clinical performance improved survival rates.21 41 The first study, by Townsend et al,21 found a significant reduction in mortality from 10.8% to 10.4% (p<0.01) and adjusted for case mix using the Trauma and Injury Severity Score (TRISS; probability of survival based on the Injury Severity Score68). However, the second study,41 evaluating whether improved clinical performance resulted in better patient outcomes in 45 cardiac arrest cases, reported no improvement in survival or ROSC, as only two patients survived to discharge in the control group and one patient survived to discharge in the video group. Two studies evaluated the educational impact of video review, in which teams reviewed their own performance, compared with verbal feedback.8 23 Both found that video review outperformed verbal feedback alone in improving clinical performance (table 1).
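For readers unfamiliar with the TRISS adjustment used by Townsend et al,21 the method of Boyd et al68 models the probability of survival as a logistic function of physiological and anatomical severity; the general form (without the published coefficient values) is

P_s = \frac{1}{1 + e^{-b}}, \qquad b = b_0 + b_1\,\mathrm{RTS} + b_2\,\mathrm{ISS} + b_3\,A

where RTS is the Revised Trauma Score, ISS the Injury Severity Score, A an age index and b_0 to b_3 regression coefficients that differ for blunt and penetrating injury. Comparing observed mortality with TRISS-predicted mortality is what allows the before-and-after groups to be compared despite differences in case mix.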

Recommendation

Educational intervention by reviewing video of actual emergencies may improve teams’ clinical performance. The Oxford Centre for Evidence-Based Medicine strength of this recommendation is grade B.69

Discussion

This systematic review of 50 observational studies provides insight into the use of video review of resuscitation teams, trauma teams, emergency paediatric teams and emergency obstetric teams. The technical solution is affordable and relatively easy to install; however, legal and ethical issues may be challenging. All five studies investigating the impact of video review on patient care favoured its use as an educational intervention. Four of the five studies found that video review significantly improved teams’ clinical performance, and one study found improved survival of trauma patients.

After systematically reviewing the included studies, the question remains whether we may rely on video review to improve team performance in the future. In an attempt to answer this question, we discuss the strengths, weaknesses, opportunities and threats (SWOT analysis) of video review below.

Strengths

Easy to install cameras

Cameras and microphones are easy to install, and this solution is affordable compared with other medicotechnical equipment.

Data collection

Once data are collected, the data can be reviewed repeatedly.5 58 70

Improve teams’ performance

As the need for high-value, cost-conscious medical education is greater than ever, video may be a learning tool to expedite mastery of necessary techniques.71 Video can capture teams’ behaviour, the timing of medication, the flow of an algorithm and the time taken to master techniques.23 72 73 Video review also allows teams to evaluate local algorithms for patient care. Furthermore, research can be conducted by linking these processes to patient outcomes.8 21 23 41 62

Video in feedback and debriefing

The present literature on video review to improve patient care favours video review over other feedback modalities.8 21–23 41 62

Quality assessment

Video review can be used for quality assessment and benchmarking of performance between departments or countries.

Documentation

Video provides more detailed and accurate information than paper-based records,53 56 and videos can be saved as a supplement to patient records.

Weaknesses

Blinding

When we use video, we are limited by the ‘eye’ of the camera. Thus, we can easily be blind to missing information, for example, information that is obvious to the team but not to the video reviewer.74

Forgetting to turn on the cameras

This is a well-known problem.5 8 21 Therefore, continuous video recording is recommended; however, if this is not feasible, we recommend considering how cameras can be activated automatically by either motion triggers48 59 60 or Bluetooth.66

Hindsight bias

Knowing the outcome affects how we perceive the team’s performance.75 Therefore, teams reviewing their own performance are biased towards overestimating the importance of errors when the outcome is poor and overestimating the importance of their management when the outcome is good.

How to use the video as an educational tool

We have limited information regarding how to use the video as an educational tool; for example, is the video review facilitated by the team itself or by a trained facilitator, and is it used for structured feedback or debriefing? The literature on debriefing in simulation-based education suggests that the effectiveness of facilitator-led debriefings is the same whether or not video is used.76 77 However, in simulation, the facilitator observes the team’s management, whereas in real-life emergencies the facilitator is not present; in those cases, video may serve as an eyewitness.

Difficult to benchmark performance

Standardisation of appropriate outcomes is needed before we can benchmark outcomes between studies, and this would also open the possibility for meta-analysis.78 Guideline adherence is the most frequently used outcome; however, this should be standardised by use of systematically developed checklists for evaluation of teams’ performance.66 79 Time can be a relevant outcome; however, outcomes should always be selected based on relevance for the clinicians and the patients and not because they are easy to assess.78

Time-consuming and costly

To date, the majority of selected outcomes are assessed manually by observers reviewing the video using a checklist or protocol, and this is tedious, time-consuming and costly.74 80

Opportunities

Work environment

Research into teams’ working environment may identify factors that complement teams’ management as well as factors that prevent teams from becoming effective. Such research may be the first step towards innovative solutions to improve these environments, for example, the arrangement of the room or how to diagnose or deliver the right dose of medication. In the future, such developments may improve the quality of care as it simply becomes easier to deliver the right care.81–83

Automatic assessment

Manual analysis of video is time-consuming and expensive as several raters are usually needed. Therefore, innovative solutions are needed to reduce the cost and increase the objectivity and quality of video analyses. Such solutions could combine analysis of the team with other objective measures, for example, technical skills such as chest compression depth measured by the monitoring device.29 55 65
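As one illustration of what such a combination could look like in practice (a sketch only, not a method used in the included studies), timestamped event annotations from a video review could be merged with objective measurements exported from the monitoring device; the column names and values below are hypothetical.

    import pandas as pd

    # Hypothetical exports: events annotated on the video, and compression depth sampled by the monitor
    events = pd.DataFrame({
        "time_s": [12.0, 45.5, 90.0],
        "event": ["compressions started", "rhythm check", "compressions resumed"],
    })
    monitor = pd.DataFrame({
        "time_s": [i / 2 for i in range(200)],      # monitor samples every 0.5 s
        "depth_cm": [5.2, 5.0, 4.8, 5.1] * 50,      # illustrative compression depths
    })

    # Attach the nearest monitor sample (within 1 s) to each annotated video event
    events = events.sort_values("time_s")
    monitor = monitor.sort_values("time_s")
    merged = pd.merge_asof(events, monitor, on="time_s", direction="nearest", tolerance=1.0)
    print(merged)

Combined with automated analysis of the video itself, such a merge would let performance measures be generated without raters reviewing every recording manually.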

Faster education

By reviewing not only our own performance but also that of our colleagues, video review could have the potential to reduce the time to mastery.8 41 63

Threats

Legal issues

Some projects may be stopped before they ever get started because it can be difficult to get legal permission for recording where informed consent cannot be collected, for example, in cardiac arrest cases.

Patient compliance

Overall, patients are reportedly positive regarding the use of video to improve care. In an Australian survey, 96% of parents agreed to the use of video in neonatal resuscitation to improve patient safety.84 In a Danish study, patients and their relatives gave consent in 94% of the cases of major postpartum haemorrhage.85

Staff compliance

Studies report that staff in general are positive and find that video review improves their knowledge and the care provided.8 However, in a survey of trauma teams, 30% of the staff found that video provoked moderate anxiety; still, 90% of the staff agreed on the educational value of using video.86 Although the benefits of video seem to outweigh the potential liability risk, there will always be concern about how videos could be used in malpractice trials, even though this risk is minimal17 87 and video is more likely to provide evidence of good care.88–90

Video versus audio recording

The gold standard is combined video and audio recording. However, if this is not possible, audio recording alone is an alternative; audio has, for example, been used in the analysis of telephone conversations.91

Ethical issues

Video recording of patients without informed consent raises the ethical question of whether the public benefit of video recording outweighs the patient’s right to privacy.88 There is no unambiguous answer to this, and several studies video record without consent.24 39 89 Reviewing the timeline of the included studies, the number of studies using video review has dropped dramatically since 2003.26 In 2003, the privacy provisions of the Health Insurance Portability and Accountability Act (HIPAA) took effect in the USA, and this affected the use of video in trauma centres there, as informed consent was now needed before filming. Hence, a survey from 200590 found that the use of video review in trauma centres dropped from 58% to 18% after HIPAA took effect, and the most cited reasons for discontinuation of video review were the need for informed consent, legal concerns and concerns about patient privacy. A survey from 201017 reported that the use of video had not changed since 2005, as only 20% of 108 trauma centres in the USA used video review, although all trauma centres agreed that it can improve the trauma resuscitation process.

Building on findings from this review, future research may address two main aspects:

  1. The teams’ conditions, for example, the development of new innovative solutions for the emergency room: monitors providing information and guiding the team, reduction of noise, improved lighting of the room, premixed syringes with the correct dose of medication, new tools such as surgical instruments, checklists or other cognitive aids.

  2. Video as an educational tool: as educational strategies for real-life video are lacking, future research should focus on describing all the characteristics involved in how to use video in debriefing to maximise educational efficiency.

Conclusion

The need for high-value, cost-conscious medical education is greater than ever, with shrinking time for training and increasing complexity of the care delivered by emergency teams. Filming emergencies has educational value and is an important priority for clinical research seeking to identify factors that complement teams’ management and factors that prevent teams from becoming effective. However, ethical and legal concerns remain unresolved. If we can solve these concerns, video review can provide us with the opportunity to analyse, understand and improve our performance and the quality of patient care.

Acknowledgments

We would like to thank Kristiane Roed Jensen for reviewing and extracting data and for conducting the risk of bias analysis together with LB. We are also grateful to Kristian Krogh for reviewing the final manuscript.

Footnotes

Contributors: LB and NU designed the review, performed the analysis and extracted the data. LB drafted the paper and NU approved the final version.

Funding: This work was supported by: the Tryg Foundation (Trygfonden) (grant ID no 109507); the Regional Hospital in Horsens, Department of Obstetrics and Gynaecology; and the Regional Postgraduate Medical Education Administration Office North.

Competing interests: None declared.

Patient consent for publication: Not required.

Provenance and peer review: Not commissioned; externally peer reviewed.

Data availability statement: All data relevant to the study are included in the article.

References

  • 1.Charney C. Making a team of experts into an expert team. Adv Neonatal Care 2011;11:334–9. 10.1097/ANC.0b013e318229b4e8 [DOI] [PubMed] [Google Scholar]
  • 2.Norris EM, Lockey AS. Human factors in resuscitation teaching. Resuscitation 2012;83:423–7. 10.1016/j.resuscitation.2011.11.001 [DOI] [PubMed] [Google Scholar]
  • 3.Abella BS, Alvarado JP, Myklebust H, et al. Quality of cardiopulmonary resuscitation during in-hospital cardiac arrest. JAMA 2005;293:305–10. 10.1001/jama.293.3.305 [DOI] [PubMed] [Google Scholar]
  • 4.Spanjersberg WR, Bergs EA, Mushkudiani N, et al. Protocol compliance and time management in blunt trauma resuscitation. Emerg Med J 2009;26:23–7. 10.1136/emj.2008.058073 [DOI] [PubMed] [Google Scholar]
  • 5.Santora TA, Trooskin SZ, Blank CA, et al. Video assessment of trauma response: adherence to ATLS protocols. Am J Emerg Med 1996;14:564–9. 10.1016/S0735-6757(96)90100-X [DOI] [PubMed] [Google Scholar]
  • 6.Handley AJ, Handley SAJ. Improving CPR performance using an audible feedback system suitable for incorporation into an automated external defibrillator. Resuscitation 2003;57:57–62. 10.1016/S0300-9572(02)00400-8 [DOI] [PubMed] [Google Scholar]
  • 7.Mileder L, Urlesberger B, Szyld E, et al. Simulation-Based neonatal and infant resuscitation teaching: a systematic review of randomized controlled trials. Klin Padiatr 2014;226:259–67. 10.1055/s-0034-1372621 [DOI] [PubMed] [Google Scholar]
  • 8.Hoyt DB, Shackford SR, Fridland PH, et al. Video recording trauma resuscitations: an effective teaching technique. J Trauma 1988;28:435–40. 10.1097/00005373-198804000-00003 [DOI] [PubMed] [Google Scholar]
  • 9.Mundell WC, Kennedy CC, Szostek JH, et al. Simulation technology for resuscitation training: a systematic review and meta-analysis. Resuscitation 2013;84:1174–83. 10.1016/j.resuscitation.2013.04.016 [DOI] [PubMed] [Google Scholar]
  • 10.Edelson DP. Improving in-hospital cardiac arrest process and outcomes with performance Debriefing. Arch Intern Med 2008;168:1063–9. 10.1001/archinte.168.10.1063 [DOI] [PubMed] [Google Scholar]
  • 11.Peltier LF, Geertsma RH, Youmans RL. Television videotape recording: an adjunct in teaching emergency medical care. Surgery 1969;66:233–6. [PubMed] [Google Scholar]
  • 12.Chamberlain DA, Hazinski MF. Education in resuscitation. Resuscitation 2003;59:11–43. 10.1016/j.resuscitation.2003.08.011 [DOI] [PubMed] [Google Scholar]
  • 13.Murray L, McCabe M. The video-recorder in the accident and emergency department. Arch Emerg Med 1991;8:182–4. 10.1136/emj.8.3.182 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Georgiou A, Lockey DJ. The performance and assessment of hospital trauma teams. Scand J Trauma Resusc Emerg Med 2010;18:66 10.1186/1757-7241-18-66 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Parry R, Pino M, Faull C, et al. Acceptability and design of video-based research on healthcare communication: evidence and recommendations. Patient Educ Couns 2016;99:1271–84. 10.1016/j.pec.2016.03.013 [DOI] [PubMed] [Google Scholar]
  • 16.Gambadauro P, Magos A. Surgical Videos for accident analysis, performance improvement, and complication prevention. Surg Innov 2012;19:76–80. 10.1177/1553350611415424 [DOI] [PubMed] [Google Scholar]
  • 17.Rogers SC, Dudley NC, McDonnell W, et al. Camera, action. spotlight on trauma video review: an underutilized means of quality improvement and education. Pediatr Emerg Care 2010;26:803–7. [DOI] [PubMed] [Google Scholar]
  • 18.Shamseer L, Moher D, Clarke M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ 2015;349:g7647–25. 10.1136/bmj.g7647 [DOI] [PubMed] [Google Scholar]
  • 19.Stone PW. Popping the (PICO) question in research and evidence-based practice. Appl Nurs Res 2002;15:197–8. 10.1053/apnr.2002.34181 [DOI] [PubMed] [Google Scholar]
  • 20.Khan K, Knuz R, Kleijnen J, et al. Systematic reviews to support evidencebased medicine. 2nd edn New York: CRC Press, 2011: 39–51. [Google Scholar]
  • 21.Townsend RN, Clark R, Ramenofsky ML, et al. ATLS-based videotape trauma resuscitation review: education and outcome. J Trauma 1993;34:133–8. 10.1097/00005373-199301000-00025 [DOI] [PubMed] [Google Scholar]
  • 22.Fitzgerald M, Cameron P, Mackenzie C, et al. Trauma resuscitation errors and computer-assisted decision support. Arch Surg 2011;146:218–25. 10.1001/archsurg.2010.333 [DOI] [PubMed] [Google Scholar]
  • 23.Scherer LA, Chang MC, Meredith JW, et al. Videotape review leads to rapid and sustained learning. Am J Surg 2003;185:516–20. 10.1016/S0002-9610(03)00062-X [DOI] [PubMed] [Google Scholar]
  • 24.Lubbert PHW, Kaasschieter EG, Hoorntje LE, et al. Video registration of trauma team performance in the emergency department: the results of a 2-year analysis in a level 1 trauma center. J Trauma 2009;67:1412–20. 10.1097/TA.0b013e31818d0e43 [DOI] [PubMed] [Google Scholar]
  • 25.Maluso P, Hernandez M, Amdur RL, et al. Trauma team size and task performance in adult trauma resuscitations. J Surg Res 2016;204:176–82. 10.1016/j.jss.2016.05.007 [DOI] [PubMed] [Google Scholar]
  • 26.van Olden GDJ, van Vugt AB, Biert J, et al. Trauma resuscitation time. Injury 2003;34:191–5. 10.1016/S0020-1383(02)00202-4 [DOI] [PubMed] [Google Scholar]
  • 27.Ritchie PD, Cameron PA. An evaluation of trauma team leader performance by video recording. Aust NZ J Surg 1999;69:183–6. 10.1046/j.1440-1622.1999.01519.x [DOI] [PubMed] [Google Scholar]
  • 28.Bergs EAG, Rutten FLPA, Tadros T, et al. Communication during trauma resuscitation: do we know what is happening? Injury 2005;36:905–11. 10.1016/j.injury.2004.12.047 [DOI] [PubMed] [Google Scholar]
  • 29.Mann FA, Walkup RK, Berryman CR, et al. Computer-Based videotape analysis of trauma resuscitations for quality assurance and clinical research. J Trauma 1994;36:226–30. 10.1097/00005373-199402000-00015 [DOI] [PubMed] [Google Scholar]
  • 30.Hoff WS, Reilly PM, Rotondo MF, et al. The importance of the Command-Physician in trauma resuscitation. J Trauma 1997;43:772–7. 10.1097/00005373-199711000-00007 [DOI] [PubMed] [Google Scholar]
  • 31.DeMoor S, Abdel-Rehim S, Olmsted R, et al. Evaluating trauma team performance in a level I trauma center. J Trauma Acute Care Surg 2017;83:159–64. 10.1097/TA.0000000000001526 [DOI] [PubMed] [Google Scholar]
  • 32.Noland J, Treadwell D. Video evaluation of pediatric trauma codes. Int J Trauma Nurs 1996;2:42–8. 10.1016/S1075-4210(96)80006-X [DOI] [PubMed] [Google Scholar]
  • 33.Oakley E, Staubli G, Young S. Using video recording to identify management errors in pediatric trauma resuscitation. Pediatrics 2006;117:658–64. 10.1542/peds.2004-1803 [DOI] [PubMed] [Google Scholar]
  • 34.Carter EA, Waterhouse LJ, Kovler ML, et al. Adherence to ATLS primary and secondary surveys during pediatric trauma resuscitation. Resuscitation 2013;84:66–71. 10.1016/j.resuscitation.2011.10.032 [DOI] [PubMed] [Google Scholar]
  • 35.El-Shafy IA, Delgado J, Akerman M, et al. Closed-Loop communication improves task completion in pediatric trauma resuscitation. J Surg Educ 2017;1:58–66. [DOI] [PubMed] [Google Scholar]
  • 36.Webman RB, Fritzeen JL, Yang J, et al. Classification and team response to nonroutine events occurring during pediatric trauma resuscitation. J Trauma Acute Care Surg 2016;81:666–73. 10.1097/TA.0000000000001196 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Kelleher DC, Kovler ML, Waterhouse LJ, et al. Factors affecting team size and task performance in pediatric trauma resuscitation. Pediatr Emerg Care 2014;30:248–53. 10.1097/PEC.0000000000000106 [DOI] [PubMed] [Google Scholar]
  • 38.Kelleher DC, Carter EA, Waterhouse LJ, et al. Effect of a checklist on advanced trauma life support task performance during pediatric trauma resuscitation. Acad Emerg Med 2014;21:1129–34. 10.1111/acem.12487 [DOI] [PubMed] [Google Scholar]
  • 39.Sarcevic A, Marsic I, Waterhouse LJ, et al. Leadership structures in emergency care settings: a study of two trauma centers. Int J Med Inform 2011;80:227–38. 10.1016/j.ijmedinf.2011.01.004 [DOI] [PubMed] [Google Scholar]
  • 40.Gala PK, Osterhoudt K, Myers SR, et al. Performance in trauma resuscitation at an urban tertiary level I pediatric trauma center. Pediatr Emerg Care 2016;32:756–62. 10.1097/PEC.0000000000000942 [DOI] [PubMed] [Google Scholar]
  • 41.Jiang C, Zhao Y, Chen Z, et al. Improving cardiopulmonary resuscitation in the emergency department by real-time video recording and regular feedback learning. Resuscitation 2010;81:1664–9. 10.1016/j.resuscitation.2010.06.023 [DOI] [PubMed] [Google Scholar]
  • 42.Ong MEH, Quah JLJ, Annathurai A, et al. Improving the quality of cardiopulmonary resuscitation by training dedicated cardiac arrest teams incorporating a mechanical load-distributing device at the emergency department. Resuscitation 2013;84:508–14. 10.1016/j.resuscitation.2012.07.033 [DOI] [PubMed] [Google Scholar]
  • 43.Park SO, Shin DH, Baek KJ, et al. A clinical observational study analysing the factors associated with hyperventilation during actual cardiopulmonary resuscitation in the emergency department. Resuscitation 2013;84:298–303. 10.1016/j.resuscitation.2012.07.028 [DOI] [PubMed] [Google Scholar]
  • 44.Mann CJ, Heyworth J. Comparison of cardiopulmonary resuscitation techniques using video camera recordings. Resuscitation 1996;13:198–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Hossein-Nejad H, Afzalimoghaddam M, Hoseinidavarani H, et al. The validity of cardiopulmonary resuscitation skills in the emergency department using video-assisted surveillance: an Iranian experience. Acta Med Iran H 2013;51:394–8. [PubMed] [Google Scholar]
  • 46.Clattenburg EJ, Wroe P, Brown S, et al. Point-Of-Care ultrasound use in patients with cardiac arrest is associated prolonged cardiopulmonary resuscitation pauses: a prospective cohort study. Resuscitation 2018;122:65–8. 10.1016/j.resuscitation.2017.11.056 [DOI] [PubMed] [Google Scholar]
  • 47.Cooper S, Wakelam A. Leadership of resuscitation teams: ‘Lighthouse Leadership’. Resuscitation 1999;42:27–45. 10.1016/S0300-9572(99)00080-5 [DOI] [PubMed] [Google Scholar]
  • 48.Skåre C, Boldingh A-M, Nakstad B, et al. Ventilation fraction during the first 30 S of neonatal resuscitation. Resuscitation 2016;107:25–30. 10.1016/j.resuscitation.2016.07.231 [DOI] [PubMed] [Google Scholar]
  • 49.Donoghue A, Hsieh T-C, Nishisaki A, et al. Tracheal intubation during pediatric cardiopulmonary resuscitation: a videography-based assessment in an emergency department resuscitation room. Resuscitation 2016;99:38–43. 10.1016/j.resuscitation.2015.11.019 [DOI] [PubMed] [Google Scholar]
  • 50.McCarthy LK, Morley CJ, Davis PG, et al. Timing of interventions in the delivery room: does reality compare with neonatal resuscitation guidelines? J Pediatr 2013;163:1553–7. 10.1016/j.jpeds.2013.06.007 [DOI] [PubMed] [Google Scholar]
  • 51.Hsieh T-C, Wolfe H, Sutton R, et al. A comparison of video review and feedback device measurement of chest compressions quality during pediatric cardiopulmonary resuscitation. Resuscitation 2015;93:35–9. 10.1016/j.resuscitation.2015.05.022 [DOI] [PubMed] [Google Scholar]
  • 52.Mullan PC, Cochrane NH, Chamberlain JM, et al. Accuracy of Postresuscitation team Debriefings in a pediatric emergency department. Ann Emerg Med 2017;70:311–9. 10.1016/j.annemergmed.2017.01.034 [DOI] [PubMed] [Google Scholar]
  • 53.Gelbart B, Hiscock R, Barfield C. Assessment of neonatal resuscitation performance using video recording in a perinatal centre. J Paediatr Child Health 2010;46:378–83. 10.1111/j.1440-1754.2010.01747.x [DOI] [PubMed] [Google Scholar]
  • 54.Schilleman K, Witlox RS, van Vonderen JJ, et al. Auditing documentation on delivery room management using video and physiological recordings. Arch Dis Child Fetal Neonatal Ed 2014;99:F485–90. 10.1136/archdischild-2014-306261 [DOI] [PubMed] [Google Scholar]
  • 55.Schilleman K, Siew ML, Lopriore E, et al. Auditing resuscitation of preterm infants at birth by recording video and physiological parameters. Resuscitation 2012;83:1135–9. 10.1016/j.resuscitation.2012.01.036 [DOI] [PubMed] [Google Scholar]
  • 56.Su L, Waller M, Kaplan S, et al. Cardiac resuscitation events: one eyewitness is not enough. Pediatr Crit Care Med 2015;16:335–42. 10.1097/PCC.0000000000000355 [DOI] [PubMed] [Google Scholar]
  • 57.Thomas EJ, Sexton JB, Lasky RE, et al. Teamwork and quality during neonatal care in the delivery room. J Perinatol 2006;26:163–9. 10.1038/sj.jp.7211451 [DOI] [PubMed] [Google Scholar]
  • 58.Williams AL, Lasky RE, Dannemiller JL, et al. Teamwork behaviours and errors during neonatal resuscitation. Qual Saf Health Care 2010;19:60–4. 10.1136/qshc.2007.025320 [DOI] [PubMed] [Google Scholar]
  • 59.Wrammert J, Zetterlund C, Kc A, et al. Resuscitation practices of low and normal birth weight infants in Nepal: an observational study using video camera recordings. Glob Health Action 2017;10:1322372–10. 10.1080/16549716.2017.1322372 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Lindbäck C, KC A, Wrammert J, et al. Poor adherence to neonatal resuscitation guidelines exposed; an observational study using camera surveillance at a tertiary hospital in Nepal. BMC Pediatr 2014;14:1–7. 10.1186/1471-2431-14-233 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Lane B, Finer N, Rich W. Duration of intubation attempts during neonatal resuscitation. J Pediatr 2004;145:67–70. 10.1016/j.jpeds.2004.03.003 [DOI] [PubMed] [Google Scholar]
  • 62.Carbine DN, Finer NN, Knodel E, et al. Video recording as a means of evaluating neonatal resuscitation performance. Pediatrics 2000;106:654–8. 10.1542/peds.106.4.654 [DOI] [PubMed] [Google Scholar]
  • 63.Finer NN, Rich W. Neonatal resuscitation: toward improved performance. Resuscitation 2002;53:47–51. 10.1016/S0300-9572(01)00494-4 [DOI] [PubMed] [Google Scholar]
  • 64.O'Donnell CPF, Kamlin COF, Davis PG, et al. Feasibility of and delay in obtaining pulse oximetry during neonatal resuscitation. J Pediatr 2005;147:698–9. 10.1016/j.jpeds.2005.07.025 [DOI] [PubMed] [Google Scholar]
  • 65.Jang HY, Wolfe H, Hsieh T-C, et al. Infant chest compression quality: a video-based comparison of two-thumb versus one-hand technique in the emergency department. Resuscitation 2018;122:36–40. 10.1016/j.resuscitation.2017.11.044 [DOI] [PubMed] [Google Scholar]
  • 66.Brogaard L, Hvidman L, Hinshaw K, et al. Development of the TeamOBS-PPH - targeting clinical performance in postpartum hemorrhage. Acta Obstet Gynecol Scand 2018;98:1–11. [DOI] [PubMed] [Google Scholar]
  • 67.Bleakley A, Allard J, Hobbs A. ‘Achieving ensemble’: communication in orthopaedic surgical teams and the development of situation awareness—an observational study using live videotaped examples. Adv Heal Sci Educ 2013;18:33–56. 10.1007/s10459-012-9351-6 [DOI] [PubMed] [Google Scholar]
  • 68.Boyd CR, Tolson MA, Copes WS. Evaluating trauma care: the TRISS method. trauma score and the injury severity score. J Trauma 1987;27:370–8. [PubMed] [Google Scholar]
  • 69.Oxford Centre for Evidence-Based Medicine. Levels of evidence (March 2009). Available: https://www.cebm.net/2009/06/oxford-centre-evidence-based-medicine-levels-evidence-march-2009/
  • 70.Jeffcott SA, Mackenzie CF. Measuring team performance in healthcare: review of research and implications for patient safety. J Crit Care 2008;23:188–96. 10.1016/j.jcrc.2007.12.005 [DOI] [PubMed] [Google Scholar]
  • 71.Cook DA, Beckman TJ. High-Value, Cost-Conscious medical education. JAMA Pediatr 2015;169:109–11. 10.1001/jamapediatrics.2014.2964 [DOI] [PubMed] [Google Scholar]
  • 72.Søreide E, Morrison L, Hillman K, et al. The formula for survival in resuscitation. Resuscitation 2013;84:1487–93. 10.1016/j.resuscitation.2013.07.020 [DOI] [PubMed] [Google Scholar]
  • 73.Manser T. Team performance assessment in healthcare. Simul Healthc J 2008;3:1–3. 10.1097/SIH.0b013e3181663592 [DOI] [PubMed] [Google Scholar]
  • 74.Mackenzie CF, Xiao Y, Hu F-M, et al. Video as a tool for improving tracheal intubation tasks for emergency medical and trauma care. Ann Emerg Med 2007;50:436–42. 10.1016/j.annemergmed.2007.06.487 [DOI] [PubMed] [Google Scholar]
  • 75.Henriksen K, Kaplan H. Hindsight bias, outcome knowledge and adaptive learning. Qual Saf Health Care 2003;12:46–50. 10.1136/qhc.12.suppl_2.ii46 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Cheng A, Eppich W, Grant V, et al. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ 2014;48:657–66. 10.1111/medu.12432 [DOI] [PubMed] [Google Scholar]
  • 77.Garden AL, Le Fevre DM, Waddington HL, et al. Debriefing after simulation-based non-technical skill training in healthcare: a systematic review of effective practice. Anaesth Intensive Care 2015;43:300–8. 10.1177/0310057X1504300303 [DOI] [PubMed] [Google Scholar]
  • 78.Williamson PR, Altman DG, Blazeby JM, et al. Developing core outcome sets for clinical trials: issues to consider. Trials 2012;13:1–8. 10.1186/1745-6215-13-132 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Schmutz J, Eppich WJ, Hoffmann F, et al. Five steps to develop checklists for evaluating clinical performance: an integrative approach. Acad Med 2014;89:996–1005. [DOI] [PubMed] [Google Scholar]
  • 80.Wolfe H, Nishisaki A, Well TW. That went well, or did it? fighting rosy recall in the documentation of in-hospital cardiac arrest. Pediatr Crit Care Med 2015;16:382–3. 10.1097/PCC.0000000000000370 [DOI] [PubMed] [Google Scholar]
  • 81.Bates DW, Gawande AA. Improving safety with information technology. N Engl J Med 2003;348:2526–34. 10.1056/NEJMsa020847 [DOI] [PubMed] [Google Scholar]
  • 82.Pilotto A, D'Onofrio G, Benelli E, et al. Information and communication technology systems to improve quality of life and safety of Alzheimer's disease patients: a multicenter international survey. JAD 2011;23:131–41. 10.3233/JAD-2010-101164 [DOI] [PubMed] [Google Scholar]
  • 83.Buljac-Samardzic M, Dekker-van Doorn CM, van Wijngaarden JDH, et al. Interventions to improve team effectiveness: a systematic review. Health Policy 2010;94:183–95. 10.1016/j.healthpol.2009.09.015 [DOI] [PubMed] [Google Scholar]
  • 84.Taylor K, VanDenberg S, le Huquet A, et al. Parental attitudes to digital recording: a paediatric Hospital survey. J Paediatr Child Health 2011;47:335–9. 10.1111/j.1440-1754.2010.01981.x [DOI] [PubMed] [Google Scholar]
  • 85.Brogaard L, Kierkegaard O, Hvidman L, et al. The importance of non-technical performance for teams managing postpartum haemorrhage: video review of 99 obstetric teams. BJOG 2019;8:1015–23. [DOI] [PubMed] [Google Scholar]
  • 86.Davis L, Johnson L, Allen SR, et al. Practitioner perceptions of trauma video review. J Trauma Nurs 2013;20:150–4. 10.1097/JTN.0b013e3182a172b6 [DOI] [PubMed] [Google Scholar]
  • 87.What are legal risks of videotaping trauma? ED Manag 2007;19:138–9. [PubMed] [Google Scholar]
  • 88.Gelbart B, Barfield C, Watkins A. Ethical and legal considerations in video recording neonatal resuscitations. J Med Ethics 2009;35:120–4. 10.1136/jme.2008.024612 [DOI] [PubMed] [Google Scholar]
  • 89.O'Donnell CPF, Kamlin COF, Davis PG, et al. Ethical and legal aspects of video recording neonatal resuscitation. Arch Dis Child Fetal Neonatal Ed 2008;93:F82–4. 10.1136/adc.2007.118505 [DOI] [PubMed] [Google Scholar]
  • 90.Campbell S, Sosa JA, Rabinovici R, et al. Do not roll the videotape: effects of the health insurance portability and accountability act and the law on trauma videotaping practices. Am J Surg 2006;191:183–90. 10.1016/j.amjsurg.2005.07.033 [DOI] [PubMed] [Google Scholar]
  • 91.Eppich WJ, Dornan T, Rethans J-J, et al. "Learning the Lingo": A Grounded Theory Study of Telephone Talk in Clinical Education. Acad Med 2019;94:1033–9. 10.1097/ACM.0000000000002713 [DOI] [PubMed] [Google Scholar]
