Abstract
Introduction
A long and rich research legacy shows that under the right conditions, simulation-based medical education (SBME) is a powerful intervention to increase medical learner competence. SBME translational science demonstrates that results achieved in the educational laboratory (T1) transfer to improved downstream patient care practices (T2) and improved patient and public health (T3).
Method
This is a qualitative synthesis of SBME translational science research that employs a critical review approach to literature aggregation.
Results
Evidence from SBME and health services research programs that are thematic, sustained, and cumulative shows that measured outcomes can be achieved at T1, T2, and T3 levels. There is also evidence that SBME translational science research can yield a favorable return on financial investment and can contribute to long-term retention of acquired clinical skills. The review identifies best practices in SBME translational science research, presents challenges and critical gaps in the field, and sets forth a translational science research agenda for SBME.
Conclusion
Rigorous SBME translational science research can contribute to better patient care and improved patient safety. Consensus conference outcomes and recommendations should be presented and used judiciously.
Keywords: Simulation, Translational science, Medical education
At least forty years of empirical research shows that simulation-based medical education (SBME) promotes learner acquisition and maintenance of clinical knowledge, attitudes, and skills. Thousands of individual research reports synthesized in five comprehensive reviews1-5 reveal that SBME is a powerful educational intervention that increases medical learner competence, measured both in the learning laboratory and during patient care delivery, and improves quantitatively measured patient health outcomes. The scientific legacy leaves no doubt that SBME technology works to achieve educational goals under the right conditions: (a) coupled with mastery learning and deliberate practice, (b) involving skillful faculty, (c) with curriculum integration and institutional endorsement, and (d) receiving healthcare system acceptance.
Translational science is usually defined as biomedical or biomedical engineering research designed to accelerate movement of results from the laboratory bench to the patient bedside. Translational science advances from bench to bedside in at least three seamless phases. T1 science aims to move basic laboratory discoveries to clinical research. T2 science aims to produce evidence of clinical effectiveness at the level of the patient; compare the success of different treatments to identify “the right treatment for the right patient in the right way at the right time;” and translate these results into practice guidelines for patients, clinicians, and policy-makers.6 T3 science addresses health care delivery, community engagement, and preventive services that yield measurable improvements in the health of individuals and society.6
Simulation-based medical education research qualifies as translational science when it stretches the outcome measurement endpoint just like its biomedical counterparts. SBME translational science demonstrates that results achieved in the educational laboratory (T1) transfer to improved downstream patient care practices (T2) and improved patient and public health (T3).7
METHOD
This is a qualitative synthesis of SBME translational science research (TSR). The review is deliberately selective and critical, rather than exhaustive. It relies on Norman and Eva’s “critical review” approach to literature synthesis.8,9 Eva asserts, “A good educational literature review … is one that presents a critical synthesis of a variety of literatures, identifies knowledge that is well established, highlights gaps in understanding, and provides some guidance regarding what remains to be understood. The result should give a new perspective of an old problem … The author … should feel bound by a moral code to try to represent the literature (and the various perspectives therein) fairly, but need not adopt a guise of absolute systematicity.”9
This article covers five SBME translational science research issues that address distal outcomes involving better healthcare delivery (T2) and improved patient and public health (T3). The five translational science research issues are: (a) its logic and progression, (b) evidence that TSR can yield T2 and T3 outcomes, (c) best practices in translational science research, (d) challenges and critical gaps in TSR, and (e) the translational science research agenda. The report concludes with a summary and a brief discussion about the context of translational science research in health professions education and healthcare.
RESULTS
Logic and progression
Translational science in health professions education progresses from the simulation laboratory to patient outcomes in three seamless phases. Health professions education translational science research at the T1 level, the most common form of such research, involves the design and delivery of education protocols and measurement of educational outcomes in controlled laboratory settings. Translational science research at the T2 level stretches the endpoint beyond the T1 laboratory setting. Better patient care delivery practices (e.g., advanced cardiac life support [ACLS], complicated obstetrical delivery) are the goal of T2 translational science. Translational science research at the T3 level advances the measurement endpoint further downstream. The target of T3 translational science is improved patient or public health outcomes (e.g., faster surgical recovery) directly linked to educational interventions (e.g., skillful laparoscopic surgery).
Most health-related simulation education and research has measured T1 learning outcomes. There are many good reasons for this limited focus. The simultaneous pressures of high-volume training demands, work overload, insufficient staff, tight budgets, and research inexperience hinder researchers from reporting results beyond the laboratory exit door. The body of reliable T2 and T3 knowledge will no doubt expand in breadth and depth as the scientific field matures, research resources increase, and staff research competence grows from training and experience.
The key point about the logic and progression of translational science research is that it stems from a thematic, sustained, and cumulative research program. Effective translational science is not a product of stand-alone, one-shot studies. Instead, productive translational science requires thoughtful, continuous research planning that addresses both educational and health services research outcomes. It is usually performed best by multi-disciplinary, functionally diverse research teams composed of clinicians, research methodologists, reference librarians, and other professionals.
Evidence that TSR can yield T2 and T3 outcomes and beyond
An example of a translational science research program is seen in the work of a team headed by Northwestern University internist Jeffrey Barsuk. This team developed an SBME program featuring mastery learning and deliberate practice to increase internal medicine and emergency medicine residents’ central venous catheter (CVC) insertion skills. T1 outcomes were evaluated in a cohort study that compared internal jugular and subclavian CVC insertion skills of simulation-trained versus traditionally trained residents (historical controls) in a laboratory setting. SBME produced much more skillful residents than traditional bedside training.10 In a subsequent T2 study, Barsuk et al. showed that residents who received internal jugular and subclavian SBME inserted CVCs in the medical intensive care unit with significantly fewer needle passes, catheter adjustments, and arterial punctures, and with higher success rates, than traditionally trained residents (historical controls).11 Barsuk et al. then conducted a before/after observational study in the medical intensive care unit on the incidence of catheter-related bloodstream infections (CRBSI) over 32 months. This team reported an 85% reduction in catheter-related bloodstream infections after the SBME-trained residents entered the intervention intensive care unit (0.50 infections per 1000 catheter days) compared to both the same unit before the intervention (3.20 infections per 1000 catheter days, p = .001) and to a comparison intensive care unit in the same hospital throughout the study period (5.03 infections per 1000 catheter days, p = .001). This is a powerful T3 translational science research outcome.12
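As a rough check on the arithmetic (an unadjusted calculation using only the rates quoted above; the published estimate may reflect the study’s own analysis), the relative reduction within the intervention unit is

$$\frac{3.20 - 0.50}{3.20} \approx 0.84,$$

i.e., a reduction on the order of 84–85%, consistent with the figure reported by Barsuk et al.12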
A subsequent cost-effectiveness study by the Barsuk team demonstrated that the SBME intervention resulted in significant medical care cost savings, a 7:1 return on investment and an index of treatment value.13 The Barsuk research group has also reported that CVC insertion skills acquired in the simulation laboratory under mastery learning conditions are largely retained and robust to decay when measured 6 months and 12 months downstream.14 However, the residents whose performance did not meet mastery standards on follow-up measurement could not be predicted from prior assessments. This suggests that periodic testing and refresher training are needed for invasive procedures like CVC insertion.
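For readers unfamiliar with the return-on-investment metric mentioned above, such a return is commonly expressed as the ratio of savings attributable to the intervention to the cost of delivering it. This generic formulation is offered for orientation only and is not a reproduction of the cost model used in reference 13:

$$\text{return on investment} = \frac{\text{savings from averted complications}}{\text{cost of the training program}}$$

On this reading, a 7:1 return means that roughly seven dollars in medical care costs were recovered for every dollar spent on the simulation-based training.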
Taken together, the Barsuk team’s T1 to T3, treatment value, and retention results link improved patient outcomes, a favorable health care cost return on investment, and skill retention directly to an SBME intervention featuring mastery learning and deliberate practice.
Other illustrations of translational science research programs that lead to T3 outcomes can also be cited, although they are few in number. Obstetrician Tim Draycott and colleagues have reported a series of simulation-based translational science studies in the United Kingdom demonstrating that individual and team training in obstetric emergencies produces statistically and clinically significant reductions in birth complications due to shoulder dystocia (e.g., brachial plexus injury), in low Apgar scores (< 6 on a 0-10 scale) 5 minutes after birth, and in neonatal hypoxic-ischemic encephalopathy (HIE), a brain injury caused by lack of oxygen.15-17 These are all T3 outcomes.
Ophthalmologists at the University of Iowa have developed and implemented an intense, structured surgical curriculum for resident trainees that includes “[a] structured wet lab and simulator training during the first year, [b] backing into cases of senior residents during the first year, [c] formative feedback during the second year, and [d] deliberate practice of capsulorhexis during the second year.” In a before/after study, these investigators compared sentinel events—“a posterior capsule tear (with or without vitreous loss) or vitreous loss (from any cause)”—from surgical cases of residents trained before the surgical curriculum change versus cases of residents trained with the enhanced curriculum. The results show, “There was a statistically [and clinically] significant reduction in the sentinel complication rate, from 7.17% before the curriculum changes to 3.77% with the enhanced curriculum (P = .008, unpaired 2-tailed t test).”18 Better surgical skills acquired by the residents from the enhanced curriculum (T2 outcomes) produced a measurable reduction in sentinel complications among patients during cataract surgery (T3 outcomes).
A selected set of 15 research reports that address SBME translational science research outcomes at the downstream T2 level is summarized in Table 1. The reports are representative, not exhaustive. They are presented here to demonstrate that medical education research outcomes can have a direct impact on medical care practices. A skillful healthcare workforce is a key variable in the patient safety and healthcare outcomes equation.
Table 1. Selected research reports demonstrating SBME translational science outcomes at the T2 level

| Study | Findings |
|---|---|
| 1. Sroka et al., 2010 [19] | Training on the fundamentals of laparoscopic surgery (FLS) simulator led to improved operating room performance in laparoscopic cholecystectomy compared to controls |
| 2. Butter et al., 2010 [20] | Simulation-based mastery learning improves medical students’ cardiac auscultation skills that transfer to actual patients |
| 3. Wayne et al., 2008 [21] | Simulation-trained residents responded to real hospital cardiac arrest events with greater compliance to American Heart Association protocols than more experienced team leaders not trained with simulation |
| 4. Ahlberg et al., 2007 [22] | Resident surgeons trained on a virtual reality (VR) laparoscopic cholecystectomy simulator made fewer errors and were faster during their first 10 cholecystectomies compared to a control group |
| 5. Park et al., 2007 [23] | “Residents trained on a colonoscopy simulator prior to their first patient-based colonoscopy performed significantly better in the clinical setting than controls, demonstrating skill transfer to live patients.” |
| 6. Banks et al., 2007 [24] | Simulation training in laparoscopic tubal ligation improved resident knowledge and performance in the operating room (OR) compared to controls |
| 7. Chaer et al., 2006 [25] | Training on a VR endovascular simulator led to improved clinical performance during catheter-based interventions for lower extremity occlusive disease compared to controls |
| 8. Banks et al., 2006 [26] | Training in episiotomy repair in the skills laboratory improved residents’ knowledge and performance in the clinical setting compared to controls |
| 9. Cohen et al., 2006 [27] | GI fellows trained on a VR colonoscopy simulator demonstrated significantly better clinical performance during their first 80 colonoscopies compared to controls |
| 10. Andreatta et al., 2006 [28] | “Prior training on the LapMentor™ laparoscopic simulator leads to improved resident performance of basic skills in the animate operating room environment” compared to controls |
| 11. Korndorffer et al., 2005 [29] | “… training to a predetermined expert level on a videotrainer suture model provides trainees with skills that translate into improved operative performance” compared to controls |
| 12. Ahlberg et al., 2005 [30] | Training on a VR endoscopy simulator improved colonoscopy performance on patients (shorter procedure time, less patient discomfort, more success at reaching the cecum) compared to controls |
| 13. Grantcharov et al., 2004 [31] | Training on a VR simulator improved laparoscopic cholecystectomy performance in the OR (fewer errors, less time, better motion efficiency) compared to controls |
| 14. Seymour et al., 2002 [32] | “Use of VR surgical simulation to reach specific target criteria significantly improved the OR performance of residents during laparoscopic cholecystectomy” compared to controls |
| 15. Scott et al., 2000 [33] | Laparoscopic skill training using simulated tasks on a video-trainer improves the operative performance of junior surgery residents compared to controls |
Best practices in translational science research
Quality translational science research is grounded in a set of basic principles that shape best practices in the enterprise. The basic principles include the previously stated need to establish thematic, sustained, and cumulative research programs; an emphasis on methodological and measurement rigor; unification of educational and health services research; and a constant search for opportunities to link SBME interventions with better patient care practices and patient and public health outcomes.
These principles do not dictate a specific set of research practices that define the scope and boundaries of acceptable translational science. Instead, we endorse the idea that decisions about research methods and measurement procedures must be fitted to specific research questions. Quality translational science does not imply methodological “gold standards.” It seeks instead to use the best available research approach to each research question in its clinical and professional context.
A wide variety of experimental and quasi-experimental designs are available to conduct rigorous translational science research.34,35 The designs vary in strength on grounds of internal, external, statistical conclusion, and construct validity;35 require investigators to make educated research design decisions; and often involve tradeoffs between rigor and the practical realities of the research situation.
Shadish, Cook, and Campbell teach that quantitative research designs involve decisions about at least five research variables that form the UTOST model.35 Investigators need to decide on the unit (U) of analysis [e.g., individual, team, class]; treatment (T) imposed [e.g., SBME intervention]; observations (O) made [e.g., measurements]; setting (S) where the research is conducted [e.g., laboratory, clinic, OR]; and time (T) [e.g., duration of the study, follow-up, etc.]. UTOST decisions should be made with thought and care to increase the likelihood of informative, useful translational science research results.
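To make these five decisions concrete, the sketch below records a UTOST specification as a simple data structure. It is purely illustrative: the class, field names, and example values are hypothetical and are not taken from Shadish, Cook, and Campbell or from any study cited in this report.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UTOSTPlan:
    """Hypothetical record of the five UTOST research design decisions."""
    unit: str                 # U: unit of analysis (individual, team, class)
    treatment: str            # T: intervention imposed (e.g., an SBME protocol)
    observations: List[str]   # O: measurements to be made at each endpoint
    setting: str              # S: where the research is conducted
    time: str                 # T: study duration and follow-up schedule

# Example specification for an imagined T1-to-T3 study
plan = UTOSTPlan(
    unit="individual resident",
    treatment="simulation-based mastery learning for CVC insertion",
    observations=[
        "skills checklist score in the laboratory",
        "needle passes and arterial punctures at the bedside",
        "CRBSI rate per 1000 catheter days",
    ],
    setting="simulation laboratory, then medical intensive care unit",
    time="baseline, post-training, and 6- and 12-month follow-up",
)
print(plan)
```

Writing the plan down in this way simply forces each UTOST decision to be made explicitly before data collection begins.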
Kirlik reminds TSR investigators to be mindful about the ecological validity of research. Ecological validity aims to ensure that clinical problems and contexts, not just research participants, are sampled systematically to capture the variety and richness of clinical experience.36 Ecological validity is achieved when the methods, materials, and setting of a study closely approximate the situation under investigation.
Rigorous measurement of translational science research processes and outcomes is essential for research success. At a minimum, research measures must yield reliable data at each translational science endpoint to establish solid connections for every link of the cascaded research chain.37 The reliability of data should be estimated and reported in the context of every translational science research study, not assumed from earlier research and publications. Scholars remind translational science investigators that reliable measurement is a necessary but not sufficient condition for valid inferences or decisions from research data.38-40
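The authors do not prescribe a particular reliability estimator; as one widely used example, the internal consistency of a multi-item performance checklist can be estimated with Cronbach’s alpha:

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)$$

where $k$ is the number of items, $\sigma^{2}_{Y_i}$ is the variance of item $i$, and $\sigma^{2}_{X}$ is the variance of the total score. Whatever coefficient is chosen, it should be computed from the data in hand at each endpoint rather than carried forward from prior studies.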
The granularity versus breadth of research data exerts a strong influence on their reliability and usefulness. This is the persistent “fidelity versus bandwidth” measurement issue that has vexed scientists for at least 50 years.41,42 Most performance measures used for SBME research are task-specific, unambiguous, and aim to produce data having high reliability (fidelity). However, there are research situations where more general performance measures having greater bandwidth are needed. Translational scientists are urged to make tradeoff decisions about the fidelity versus bandwidth of research measurements to boost data reliability within practical limits so that valid judgments and decisions can be reached.
Challenges and critical gaps in TSR
There are many challenges involved in conducting translational science, especially when studies aim to affect T3 outcomes. The challenges highlight critical gaps in our thinking about translational science research and its methodological drivers. We choose to name five key challenges.
Translational scientists need to identify clinical problems (e.g., CRBSI, neonatal HIE) whose solutions depend on the decision-making and skillful behavior of health professionals—individuals and teams. SBME interventions featuring mastery learning and deliberate practice can then be designed and implemented to sharpen clinical decision-making and boost clinical skills. Embedding the educational interventions within rigorous research protocols using outcome measures that yield reliable data allows translational scientists to address the clinical problems directly and improve patient or public health.
Measuring and capturing clinical data that reliably represent distal (T3) outcomes is a persistent translational science challenge. Clinical data contained in hospital and clinic records are notoriously unreliable. Thus, the research utility of most clinical data collected and stored routinely is suspect. Recent efforts to develop sophisticated data storage facilities at academic medical centers are attempts to solve the clinical data reliability problem and deserve increased effort. For example, the Northwestern University Enterprise Data Warehouse (EDW), created under the auspices of the Northwestern University Clinical and Translational Sciences (NUCATS) Institute, “… is a single, integrated database of all clinical and research data from all patients receiving treatment throughout Northwestern healthcare affiliates. Consolidating this wealth of [reliable] data into a single database maximizes efficiency and centralizes security, making data available yet controlling access to assure consistency with consents and regulatory requirements.”43
Translational scientists should acknowledge that some important clinical problems cannot be studied beyond the T2 level. Such clinical problems as end-of-life cancer care, ALS, and refractory addictions test the limits of translational science research that seeks T3 outcomes.
The biomedical and educational translational science research community needs to educate health policy and research thought leaders that health professions education can have a measurable impact on patient and public health outcomes. In the U.S., for example, health research policies recently published by the Institute of Medicine (IOM) under the title Knowing What Works in Health Care: A Roadmap for the Nation44 and research priorities published by the Agency for Healthcare Research and Quality (AHRQ)45 are silent about the contribution of health professions education to healthcare delivery. By contrast, we assert that human capital, embodied in competent physicians and other healthcare professionals, is an essential feature of quality healthcare that should also be a translational science research priority.5
The biomedical and educational translational science research community should exercise leadership in healthcare by arguing that clinical performance competency standards for healthcare professionals are needed but remarkably absent. Improved standards of care will be realized when SBME measures that yield reliable data are used to reach increasingly valid competency decisions about healthcare professionals. Translational science research that approximates T3 goals contributes to the creation of such safety-based metrics.46
Translational science research agenda
This short report does not permit detailed discussion of the broad and deep translational science research agenda. Other writing provides a modest start toward this goal.47 The corpus of papers and vigorous discussion presented at the 2011 IMSH Research Consensus Summit, reported in this issue of Simulation in Healthcare, begins to set forth such an agenda on behalf of the Society for Simulation in Healthcare. Consequently, in addition to the translational science research ideas and challenges presented earlier in this report, we choose to add two items to the research agenda.
Translational science research in the healthcare professions should not only address the acquisition and maintenance of procedural skills but also such thorny research targets as clinical judgment, decision-making, mental workload,48 comparative and reflective analysis,49 and other cognitive and affective outcomes. The ability to engage a family in a difficult conversation about end-of-life issues is a clinical skill amenable to SBME just like inserting a chest tube. The translational science research agenda should make room for investigations about a broad array of clinical and educational outcomes.
Rigorous quantitative research, which accounts for the majority of translational science in healthcare, is complemented by qualitative research50 that is also judged using rigorous standards.51 Research based on words rather than numbers can inform and enrich our understanding of complex clinical events.
CONCLUSION
We conclude from this brief, critical review that SBME interventions, especially those that feature deliberate practice toward mastery learning goals, can achieve translational science research outcomes at the T2 and T3 levels. Such translational science research outcomes are more likely when SBME interventions are embedded in rigorous educational and health services research programs that are thematic, sustained, and cumulative. We are also aware that T3 health services research outcomes can be achieved without obvious educational interventions.52-55 Such work complements translational science research featuring SBME because both research models contribute to better patient care and improved patient safety.
This report and other contributions to the SSH Research Consensus Summit are intended to describe and evaluate the state of the science regarding the use of educational simulation in the healthcare professions. These are important and valuable goals. However, we must be mindful about blind spots that can distort consensus conference proceedings and their conclusions. In particular, scholars warn that assembling consensus conference participants from like-minded people can be a source of selection bias.56 Selection bias, in turn, can contribute to confirmation bias, i.e., reaching conclusions that affirm a priori convictions or interests. This is not to discount the potential value of a consensus conference or its results. We only issue a word of caution about the certainty of the Research Consensus Summit conclusions.
Acknowledgments
Dr. McGaghie’s contribution was supported in part by the Jacob R. Suker, MD, professorship in medical education at Northwestern University and by grant UL 1 RR 025741 from the National Center for Research Resources, National Institutes of Health. The National Institutes of Health had no role in the preparation, review, or approval of the article.
Footnotes
The authors have no conflicts of interest to disclose.
Contributor Information
William C. McGaghie, Center for Education in Medicine, Northwestern University Feinberg School of Medicine, Chicago, IL.
Timothy J. Draycott, Southmead Hospital, Bristol, UK.
William F. Dunn, College of Medicine, Mayo Clinic, Rochester, MN.
Connie M. Lopez, Kaiser Permanente Program Offices, Oakland, CA.
Dimitrios Stefanidis, Department of Surgery, Carolinas Healthcare System, Charlotte, NC.
REFERENCES
1. Issenberg SB, McGaghie WC, Hart IR, Mayer JW, Felner JW, Petrusa ER, Waugh RA, Brown DD, Safford RR, Gessner IH, Gordon DL, Ewy GA. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–866. doi:10.1001/jama.282.9.861.
2. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DE, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. doi:10.1080/01421590500046924.
3. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. Effect of practice on standardized learning outcomes in medical education. Med Educ. 2006;40:792–797. doi:10.1111/j.1365-2929.2006.02528.x.
4. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010;44:50–63. doi:10.1111/j.1365-2923.2009.03547.x.
5. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86, in press. doi:10.1097/ACM.0b013e318217e119.
6. Dougherty D, Conway PH. The “3T’s” road map to transform US health care. JAMA. 2008;299(19):2319–2321. doi:10.1001/jama.299.19.2319.
7. McGaghie WC. Medical education research as translational science. Sci Transl Med. 2010;2:19cm8. doi:10.1126/scitranslmed.3000679.
8. Norman G, Eva KW. Quantitative Research Methods in Medical Education. Edinburgh: Association for the Study of Medical Education; 2008.
9. Eva KW. On the limits of systematicity. Med Educ. 2008;42:852–853. doi:10.1111/j.1365-2923.2008.03140.x.
10. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403. doi:10.1002/jhm.468.
11. Barsuk JH, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697–2701.
12. Barsuk JH, Cohen ER, Feinglass J, et al. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169(15):1420–1423. doi:10.1001/archinternmed.2009.215.
13. Cohen ER, Feinglass J, Barsuk JH, Barnard C, O’Donnell A, McGaghie WC, Wayne DB. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Sim Healthcare. 2010;5:98–102. doi:10.1097/SIH.0b013e3181bc8304.
14. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10, Suppl.):S9–S12. doi:10.1097/ACM.0b013e3181ed436c.
15. Crofts JF, Bartlett C, Ellis D, Hunt LP, Fox R, Draycott TJ. Training for shoulder dystocia: a trial of simulation using low-fidelity and high-fidelity mannequins. Obstet Gynecol. 2006;108:1477–1485. doi:10.1097/01.AOG.0000246801.45977.c8.
16. Draycott TJ, Crofts JF, Ash JP, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol. 2008;112:14–20. doi:10.1097/AOG.0b013e31817bbc61.
17. Draycott TJ, Sibanda T, Owen L, et al. Does training in obstetric emergencies improve neonatal outcome? BJOG. 2006;113:177–182. doi:10.1111/j.1471-0528.2006.00800.x.
18. Rogers GM, Oetting TA, Lee AG, et al. Impact of a structured surgical curriculum on ophthalmic resident cataract surgery complication rates. J Cataract Refract Surg. 2009;35:1956–1960. doi:10.1016/j.jcrs.2009.05.046.
19. Sroka G, Feldman LS, Vassiliou MC, et al. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room—a randomized controlled trial. Am J Surg. 2010;199:115–120. doi:10.1016/j.amjsurg.2009.07.035.
20. Butter J, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med. 2010;25:780–785. doi:10.1007/s11606-010-1309-x.
21. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008;133:56–61. doi:10.1378/chest.07-0131.
22. Ahlberg G, Enochsson L, Gallagher AG, et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg. 2007;193:797–804. doi:10.1016/j.amjsurg.2006.06.050.
23. Park J, MacRae H, Musselman LJ, et al. Randomized controlled trial of virtual reality simulator training: transfer to live patients. Am J Surg. 2007;194:205–211. doi:10.1016/j.amjsurg.2006.11.032.
24. Banks EH, Chudnoff S, Kermin I, et al. Does a surgical simulator improve resident operative performance of laparoscopic tubal ligation? Am J Obstet Gynecol. 2007;197:541.e1–541.e5. doi:10.1016/j.ajog.2007.07.028.
25. Chaer RA, DeRubertis BG, Lin SC, et al. Simulation improves resident performance in catheter-based intervention: results of a randomized, controlled study. Ann Surg. 2006;244:343–352. doi:10.1097/01.sla.0000234932.88487.75.
26. Banks E, Pardanani S, King M, et al. A surgical skills laboratory improves residents’ knowledge and performance of episiotomy repair. Am J Obstet Gynecol. 2006;195:1463–1467. doi:10.1016/j.ajog.2006.05.041.
27. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64:361–368. doi:10.1016/j.gie.2005.11.062.
28. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with Lap Mentor™ training: results of a randomized, double-blinded study. Ann Surg. 2006;243:854–863. doi:10.1097/01.sla.0000219641.79092.e5.
29. Korndorffer JR, Dunne JB, Sierra R, et al. Simulator training for laparoscopic suturing using performance goals translates to the operating room. J Am Coll Surg. 2005;201:23–29. doi:10.1016/j.jamcollsurg.2005.02.021.
30. Ahlberg G, Hultcrantz R, Jaramillo E, et al. Virtual reality colonoscopy simulation: a compulsory practice for the future colonoscopist? Endoscopy. 2005;37:1198–1204. doi:10.1055/s-2005-921049.
31. Grantcharov TP, Kristiansen VB, Bendix J, et al. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg. 2004;91:146–150. doi:10.1002/bjs.4407.
32. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458–464. doi:10.1097/00000658-200210000-00008.
33. Scott DJ, Bergen PC, Rege RV, et al. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg. 2000;191:272–283. doi:10.1016/s1072-7515(00)00339-2.
34. Campbell DT, Stanley JC. Experimental and Quasi-Experimental Designs for Research. Chicago: Rand McNally; 1963.
35. Shadish WR, Cook TD, Campbell DT. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton Mifflin Co.; 2002.
36. Kirlik A. Brunswikian theory and method as a foundation for simulation-based research in clinical judgment. Sim Healthcare. 2010;5(5):255–259. doi:10.1097/SIH.0b013e3181f12f03.
37. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38:1006–1012. doi:10.1111/j.1365-2929.2004.01932.x.
38. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ. 2003;37:830–837. doi:10.1046/j.1365-2923.2003.01594.x.
39. Kane MT. Validation. In: Brennan RL, ed. Educational Measurement. 4th ed. Westport, CT: American Council on Education and Praeger Publishers; 2006.
40. Kane MT. An argument-based approach to validity. Psychol Bull. 1992;112:527–535.
41. Cronbach LJ. Essentials of Psychological Testing. 2nd ed. New York: Harper & Row; 1960.
42. Cronbach LJ, Gleser GC. Psychological Tests and Personnel Decisions. Urbana, IL: University of Illinois Press; 1965.
43. Northwestern’s Enterprise Data Warehouse (EDW). Available at: http://www.nucats.northwestern.edu/clinical-research-resources/data-collection-biomedical-informatics-and-nubic/enterprise-data-warehouse.html. Accessed April 21, 2011.
44. Institute of Medicine (IOM). Knowing What Works in Health Care: A Roadmap for the Nation. Washington, DC: The National Academies Press; 2008.
45. Agency for Healthcare Research and Quality (AHRQ). What is the Effective Health Care Program? Available at: http://effectivehealthcare.ahrq.gov/index.cfm/what-is-the-effective-health-care-program1/. Accessed March 1, 2011.
46. Dong Y, Suri HS, Cook DA, et al. Simulation-based objective assessment discerns clinical proficiency in central line placement: a construct validation. Chest. 2010;137:1050–1056. doi:10.1378/chest.09-1451.
47. McGaghie WC. Research opportunities in simulation-based medical education using deliberate practice. Acad Emerg Med. 2008;15:995–1001. doi:10.1111/j.1553-2712.2008.00246.x.
48. Yurko YY, Scerbo MW, Prabhu AS, Acker CE, Stefanidis D. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool. Sim Healthcare. 2010;5(5):267–271. doi:10.1097/SIH.0b013e3181e3f329.
49. McMahon GT, Monaghan C, Falchuk K, Gordon JA, Alexander EK. A simulator-based curriculum to promote comparative and reflective analysis in an internal medicine clerkship. Acad Med. 2005;80:84–89. doi:10.1097/00001888-200501000-00021.
50. Hoff T. Managing the negatives of experience in physician teams. Health Care Manage Rev. 2010;35(1):65–76. doi:10.1097/HMR.0b013e3181c22bfb.
51. Cote L, Turgeon J. Appraising qualitative research articles in medicine and medical education. Med Teach. 2005;27(1):71–75. doi:10.1080/01421590400016308.
52. Berenholtz SM, Pronovost PJ, Lipsett PA, et al. Eliminating catheter-related bloodstream infections in the intensive care unit. Crit Care Med. 2004;32:2014–2020. doi:10.1097/01.ccm.0000142399.70913.2f.
53. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725–2732. doi:10.1056/NEJMoa061115.
54. Lipitz-Snyderman A, Steinwachs D, Needham DM, et al. Impact of a statewide intensive care unit quality improvement initiative on hospital mortality and length of stay: retrospective comparative analysis. BMJ. 2011;342:d219. doi:10.1136/bmj.d219.
55. Woodward HI, Mytton OT, Lemer C, et al. What have we learned about interventions to reduce medical errors? Annu Rev Public Health. 2010;31:479–497. doi:10.1146/annurev.publhealth.012809.103544.
56. Wortman PM, Vinokur A, Sechrest L. Do consensus conferences work? A process evaluation of the NIH consensus development program. J Health Politics Policy Law. 1988;13(3):469–498. doi:10.1215/03616878-13-3-469.