Abstract
Background
Near-peer teaching is recognised for its benefits to both students and facilitators when used as an adjunct to traditional teaching. Simulation is an effective tool for teaching acute management. There are currently no published long-term objective data on the efficacy of near-peer simulation teaching.
Methods
We designed the ‘Immediate Management: Peer Led Simulated Emergencies’ course, a near-peer simulation course for medical students run by junior doctors covering common medical and surgical emergencies. Participants and teachers were objectively tested before and after sessions, and participant confidence in various areas was self-assessed. Participants were followed up at 18 months with both repeat testing and analysis of finals examination results.
Results
Participants’ mean test scores improved significantly postcourse and remained significantly higher than baseline at 18-month follow-up. There was no difference between participants’ and non-participants’ final examination performance. Participant confidence increased in all areas immediately and at 18-month follow-up. Junior doctor facilitator test scores significantly improved after teaching on the course.
Conclusions
Near-peer simulation courses can be effectively designed and run by junior doctors and our results suggest that they result in long-term improvement in test scores. Larger studies with randomised control groups are required to confirm the efficacy of such teaching.
Keywords: teaching and learning, simulation, peer, medical education research, medicine
Introduction
What is near-peer teaching?
While a peer teacher is of the same grade as their students, a near-peer teacher is more senior than their student.1 2 However, the terms are often used interchangeably in the literature, to mean a facilitator of the same or similar level to the learner.3 There is increasing interest in this form of teaching,4 with a push to encourage and formalise such programmes5 in both undergraduate and postgraduate education. This has resulted in efficacious and sustainable near-peer programmes emerging at both the local6 and regional scales.7 These programmes can effectively be designed and delivered by newly qualified junior doctors.7 8
Why use near-peer teaching?
As well as increasing teacher numbers,1 thereby providing additional hours of training during which students can consolidate learning, near-peer teaching offers unique learning advantages for students when compared with traditional senior-led sessions,7 9 10 despite the near peers’ relative lack of knowledge. This has been attributed to the social and cognitive congruence between students and near-peer facilitators.11 12
It has been demonstrated that individuals who act as near-peer teachers develop enhanced knowledge of the subject matter13 and this is cited as a motivation for participating in such teaching.7 The General Medical Council (GMC) stresses the importance of continued learning and teaching for doctors14 and it is a core requirement of the Foundation curriculum15 to ‘develop the clinical teacher’—to develop teaching skills and to enhance knowledge and clinical skills.
While there are numerous examples of near-peer teaching programmes in the literature, near-peer simulation is far rarer.
Why simulation?
Simulation training has been established as an effective tool for improving the knowledge, skill and confidence of medical students and doctors,16 while protecting patients from unnecessary risk.17 Simulation training improves post-test scores and knowledge retention when compared with traditional lectures18 19 and is an important part of the undergraduate curriculum.20 This is in part due to the Acute Care Undergraduate Teaching (ACUTE) initiative,21 published in response to a number of confidential enquiries that identified that acute care, in particular the recognition and initial management of acutely unwell patients, was being delivered suboptimally.22 Simulation has high construct validity in the assessment of management of medical emergencies23 and is therefore well suited to facilitate learning in this area.
While it has been demonstrated that repetition of training improves skill retention,24 students often receive only a limited amount of time in the simulation suite.
By increasing teaching opportunities as well as offering unique advantages to students and junior doctors alike,10 25 near-peer teaching is well placed to deliver the ACUTE initiative aims, improving junior doctors’ recognition and initial management of acutely unwell patients.
Previous near-peer simulation programmes26 have shown improvement in student self-assessed confidence and self-reported improvement in facilitator clinical practice. To our knowledge no studies have examined objective long-term impact of near-peer simulation on student performance. With this in mind, we designed and ran a near-peer simulation course in the Royal Cornwall Hospital (RCH), Truro, UK: the IMPLSE (Immediate Management: Peer Led Simulated Emergencies) course, looking for objective improvement in student and facilitator performance immediately and at 18-month follow-up. Students from the Truro campus of Peninsula Medical School (PMS) were invited to attend.
Method
Course structure and design
The IMPLSE course was designed as a near-peer simulation course, run by junior doctors for senior medical students, teaching basic A–E assessment and management of common medical and surgical emergencies. The course was designed using the Association for Medical Education in Europe Peer Assisted Learning project framework5 and following discussion with senior members of the RCH simulation team and University of Exeter Medical School (UEMS).
The first course ran in October 2014 and has run termly since. Scenarios were chosen to maximise the number of ACUTE learning objectives21 covered and according to the availability of NICE/Trust guidelines. A 6-week course was developed, consisting of a 2-hour session each week in which three simulations would be run from a bank of four possible scenarios. The course outline is shown in table 1.
Table 1.
Course structure, with topics and possible simulations each week
| Week | Topic | Possible scenarios |
| Week 1 | Respiratory distress | Opiate toxicity, acute asthma attack, hypoglycaemia, acute pulmonary oedema |
| Week 2 | Shock | Chest sepsis, lower gastrointestinal bleed, upper gastrointestinal bleed, anaphylaxis |
| Week 3 | Chest pain | Pulmonary embolus, ST elevation myocardial infarction, tension pneumothorax, acute coronary syndrome |
| Week 4 | Coma | Diabetic ketoacidosis, status epilepticus, acute stroke, severe head injury |
| Week 5 | Acute abdomen | Acute pancreatitis, bowel perforation, ruptured abdominal aortic aneurysm, lower gastrointestinal bleed |
| Week 6 | Cardiac arrest | Pulseless ventricular tachycardia, ventricular fibrillation, asystole, pulseless electrical activity |
Six students and four teachers attended each week. The topics to be covered were discussed in a 30-min small-group teaching session at the start.
Students were split into groups of three, with one group in the simulation suite while the other debriefed; simulations and debriefs rotated every 15 min. Every student led one scenario each week.
Attending multiple sessions allowed students to practise the A–E assessment repeatedly in progressively more complex settings, consolidating understanding through the spiral organisation model.27
Recruitment
All third and fourth year students from the PMS Truro campus were invited to attend the IMPLSE course via email and were allocated on a first come first served basis. Students would attend an entire 6-week course.
All Foundation Year One (F1) and Year Two (F2) doctors at the RCH, Truro were invited via email to act as near-peer tutors. No prerequisite qualifications or experience were required other than a desire to teach. Tutors were allocated based on their availability and could generally only commit to one or two sessions. Junior doctors gave up their free time to attend sessions, which ran from 1900 to 2100—typically only those on unbanded rotations with sociable hours could attend.
Teaching material
Students were emailed reading material a week before each session which outlined the local trust/NICE guidance relevant to the topics being covered that week and a PowerPoint presentation covering the relevant ACUTE21 objectives. This material was discussed during a 30-min small group teaching session prior to the simulation.
Simulations ran according to a simulation ‘script’ (see online supplementary file) which included an outline of the scenario for facilitators and a vignette for the student.
bmjstel-2016-000172.supp1.pdf (971.2KB, pdf)
Mock patient notes were available including a clerking proforma, observation chart and drug chart as found on the wards at the RCH, and students were expected to use these as they would in practice. Trust/NICE guidelines were available and students were encouraged to refer to these during the simulation.
Facilitator brief
Before each session, facilitators would meet to discuss the objectives of the teaching. The strengths of near-peer teaching (creating a non-threatening learning environment and offering insights based on personal experience) were emphasised.
Facilitators were encouraged to be available to students during the simulation to offer support if needed and encourage students to refer to guidelines as appropriate.
Quality assurance
All learning materials and assessment tools were directly based on Trust or NICE clinical guidelines and were reviewed and approved by a supervising consultant anaesthetist.
All facilitators attended a 3-hour teaching session covering the delivery of simulation and feedback/debriefing. This session was run by the UEMS senior clinical skills tutor with extensive simulation experience. Each facilitator had attended frequent simulation sessions as part of their foundation training. Precourse reading was emailed to facilitators weekly covering the relevant topics to ensure each team member had the required level of clinical knowledge.
Sustainability
The course is managed by two junior doctors, and each year is handed from the current F2 leads to two F1 volunteers who have shown leadership, enthusiasm and competence when teaching the previous year.
Aims/objectives
To investigate the effect, if any, of participation in the IMPLSE course on immediate and long-term student test scores and on finals results.
Subjective measures
Four questions were designed based on the non-clinical attributes expected from new F1s.28 These aimed to assess the students’ confidence as well as engagement with the human factors encountered in the scenarios. Student performance was self-assessed using a 10-point visual analogue scale immediately before and after each session. The questions are listed in the online supplementary file.
Objective measures
Six white space assessments with a standardised, objective mark scheme were designed, one for each week (see online supplementary file). Questions were based on the ACUTE learning objectives and tested knowledge of NICE/Trust guidelines. Identical assessments were used for ‘pretest’, ‘post-test’ and ‘follow-up’.
Students were tested weekly, immediately before and after each session. After the 6-week course was completed, a mean presession and postsession score was calculated for each student. Results from two consecutive courses were collected and improvements in test scores analysed for significance using the Mann-Whitney U test due to non-normality of distribution.
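As an illustration of the analysis step, the sketch below computes the Mann-Whitney U statistic in pure Python (average ranks for tied values, reporting the smaller of the two U values, as is conventional). The scores shown are hypothetical, not the study data. Note that when every value in one sample exceeds every value in the other, U equals 0, which is what a reported U=0 in the results reflects.

```python
def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic: the smaller of U_a and U_b.

    Tied values receive the average of the ranks they span.
    """
    combined = sorted(sample_a + sample_b)
    # Map each distinct value to its average rank (1-based).
    rank = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        rank[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    n_a, n_b = len(sample_a), len(sample_b)
    r_a = sum(rank[v] for v in sample_a)  # rank sum of sample A
    u_a = r_a - n_a * (n_a + 1) / 2
    u_b = n_a * n_b - u_a
    return min(u_a, u_b)

# Hypothetical presession and postsession percentage scores (not the study data):
pre = [31, 35, 36, 38, 40, 42]
post = [62, 66, 68, 70, 73, 75]
print(mann_whitney_u(pre, post))  # → 0.0: the two samples do not overlap at all
```

A p value would ordinarily be obtained from the distribution of U (e.g. via `scipy.stats.mannwhitneyu`); the point here is only how the statistic itself summarises the separation between two score distributions.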
Facilitators were asked to complete all six pretests before the start of the course and then complete a postsession test each week. This ensured that any benefit from preparing to teach, as well as teaching itself, was captured. Presession and postsession test scores were compared with the Mann-Whitney U test.
Follow-up
We assessed the long-term impact of this project on student knowledge by retesting students 18 months postparticipation and through analysis of medical school finals results.
Retesting
IMPLSE participants were assessed for knowledge retention at 18 months postcourse. Due to the time-consuming nature of the test, to maximise response rate only one of the six tests was selected for follow-up using a random number generator. The pretest, post-test and follow-up tests were identical. Tests were sent and returned via email. Students were advised that results were confidential and asked to complete the test in examination conditions.
Finals examination analysis
The ISCE (Integrated Structured Clinical Examination) is the final examination undertaken by students at PMS. It comprises six long case stations, during which history taking, examination, diagnosis, presentation and discussion of management are assessed. Stations cover a range of topics such as cardiovascular, communication skills and mental health.
Anonymised ISCE results from 2014 to 2016 divided into ‘participant’ and ‘control’ columns were obtained for all students at PMS from the PMS Psychometric Department. The scores of students who attended the IMPLSE course were compared with those of students who had not attended the course using the Mann-Whitney U test to assess for any difference in performance.
Results
Results were collated from two consecutive courses (totalling 12 students). Six students were present at each session, except block two week six when four students attended. Each student took a pretest and post-test at each of the six sessions. Each student’s mean presession score was calculated and compared with their mean postsession score (n=12). There were either three or four junior doctor facilitators present at each session (n=44). Response rate was 100%. At 18-month follow-up, all 12 participants (100%) responded and completed the week 3 test.
Student knowledge of management guidelines
Students’ mean presession score was 37.1% and mean postsession score 69.9%, an improvement of 32.8 points (U=0, P<0.01). In week 3 (randomly selected for follow-up testing), mean presession score was 35.5% and postsession score 61.8% (improvement 26.3 points, U=7, P<0.01). At 18 months, participants’ mean score was 65.7%, an improvement of 30.2 points over presession (U=2, P<0.01) and 3.8 points over postsession (non-significant). Scores are shown in figures 1 and 2.
Figure 1.
Mean presession and postsession student test scores of weeks 1–6 of two consecutive courses.
Figure 2.
Mean presession, postsession and 18-month follow-up student test scores from week 3 of two consecutive courses. Week 3 was randomly selected for 18-month follow-up from the six available tests.
Student confidence
Confidence improved in all areas, both immediately postsession and at 18 months. Results are shown in table 2.
Table 2.
Mean student self-assessed confidence presession, postsession and at 18-month follow-up, with SD (scores out of 10)
| | Presession (n=12) | | Postsession (n=12) | | 18-month follow-up (n=12) | |
| | Mean | SD | Mean | SD | Mean | SD |
| Confidence managing acutely unwell patients in a simulation environment | 3.8 | 1.1 | 7.6 | 0.8 | 8.4 | 1.0 |
| Confidence managing acutely unwell patients in the hospital environment | 1.9 | 1.3 | 5.6 | 1.4 | 6.9 | 1.2 |
| Confidence working as part of a medical team | 4.4 | 2.2 | 6.7 | 1.5 | 7.9 | 1.2 |
| Confidence knowing how, when and who to escalate to | 5.3 | 1.4 | 7.4 | 1.3 | 8.5 | 0.9 |
Facilitator performance
Presession, the mean facilitator score was 42.7%, increasing to 76.4% postsession (improvement of 33.7 points, U=191, P<0.001, n=44). Subgroup analysis showed a similar level of improvement for F1 and F2 doctors. F1 (n=25) mean pretest score was 37.4% and post-test score 74.0% (improvement 36.6 points, U=15.5, P<0.001); F2 (n=19) mean pretest score was 41.9% and post-test score 79.6% (improvement 37.7 points, U=6.5, P<0.001). Results are shown in figure 3.
Figure 3.
Mean junior doctor facilitator test scores preteaching and immediately post-teaching on the IMPLSE course. Subgroup analysis of F1 and F2 doctors shown. F1 and F2, Foundation Year One and Year Two; IMPLSE, Immediate Management: Peer Led Simulated Emergencies.
ISCE score follow-up
Twelve participants and 592 non-participants were identified. Participants’ mean ISCE score was 85.5% while non-participants’ was 84.2%, a difference of 1.3 points (U=6113, P>0.05). The mean number of stations passed (achieving a rating of satisfactory or higher) was 5.71 (95%) for participants and 5.61 (94%) for non-participants, a difference of 0.1 (U=6956, P>0.05). There was no statistically significant difference in either measure of ISCE performance between participants and non-participants. Results are shown in figure 4.
Figure 4.
Mean finals ISCE score of both participants and non-participants as percentage of total score. Difference non-significant (P=0.11). ISCE, Integrated Structured Clinical Examination.
Discussion
The ACUTE initiative identified that the recognition and initial management of acutely unwell patients by junior doctors was suboptimal and developed a number of core competencies that all junior doctors should possess.
Our results suggest that a near-peer simulation course (the IMPLSE course) can significantly improve students’ and facilitators’ knowledge of these competencies as well as NICE/Trust guidelines. Students almost doubled their test scores following sessions and retained this improvement, performing significantly better than baseline at 18 months. Foundation doctors also roughly doubled their test scores. We do not have follow-up data on long-term retention of foundation doctor performance.
It is interesting to note students’ pretest scores were low despite having access to the guidelines in advance of the session. A likely explanation is that students had not reviewed this reading material before attending the session—perhaps unsurprising given they were already giving up their spare time.
Long-term follow-up of medical school final ISCE scores showed no significant difference between participants’ and non-participants’ performance. One might expect improvement in one area of testing to translate into a global increase in performance; however, this is not what we found. What explains this disparity in results?
The most obvious criticism is the lack of a control group. While the immediate improvement seems likely to be due to the IMPLSE course, it is possible that the long-term improvement in test scores is attributable to 18 months of medical school teaching, not the course. If participants scored more highly than a control group that had been tested before intervention and at 18 months, then this would suggest the IMPLSE course was responsible.
Another explanation is that the students’ follow-up scores were falsely elevated. Tests were sent via email and were not invigilated, meaning students could have cheated. We explained that results were anonymous and would have no effect on their grades, so there was no clear motive for cheating—however, this remains a possibility. Alternatively, reusing the exact same examinations could have acted as an aide-memoire for students, boosting their performance over controls. The use of white space questions rather than multiple choice questions minimised this effect.
The sample size (n=12) was small and non-randomly selected and therefore open to selection bias—students who volunteer to give up their evenings for extra teaching may be higher performing than their peers.
These issues could be addressed by randomising students to participation or control groups at the outset and testing both groups at baseline and at 18 months in exam conditions—a process that we did not have the resources to carry out.
Another explanation is that ISCEs and the IMPLSE course are simply testing different things. The IMPLSE course was designed to teach a specific set of skills—emergency management as per the ACUTE initiative. These skills are not tested in the ISCE examination, which instead focuses on chronic conditions—there are no resuscitation stations in the ISCE. The highly specific nature of emergency treatment algorithms means they are unlikely to offer more generalisable improvements in performance. If this is the case, then it strengthens the case for implementing this type of teaching in medical schools, as perhaps this is an area underexamined by traditional final examinations.
Participants’ confidence improved both immediately and in the long term, with significant improvements in all areas specified by the Foundation Programme as essential non-clinical attributes of new doctors. Increasing subjective confidence alone, without an associated objective improvement in ability, could be dangerous; however, our follow-up data suggest that this is not the case.
It is interesting to note that the junior doctors’ presession scores were only slightly better than those of the students. While the ACUTE initiative suggests that all medical students should possess these skills before graduating from medical school,21 our results suggest that either junior doctors forget these skills or that they were not fully learnt in the first place. This is an area that both students and doctors can improve in.
Limitations
Issues relating to controls, selection bias, randomisation and sample size have been addressed above.
While students were the same throughout each course, facilitators changed weekly, introducing possible variation in the quality of teaching. The course was arranged this way for practical reasons, as junior doctors’ free time is inconsistent. Subgroup analysis shows only a small weekly variation in improvement, suggesting any effect due to this is small.
A criticism of the course is that the quality of facilitators cannot be guaranteed by a single training session, and it is certainly true that none of our facilitators were experts in simulation teaching—especially given facilitators’ poor presession test scores. One explanation for these low scores is the highly specific nature of the questions, which relate to details of management guidelines that junior doctors are expected to refer to in clinical practice. Our results suggest that this sample of doctors certainly did not have them memorised before teaching on the course. Students and facilitators were encouraged to refer to guidelines extensively throughout the sessions, so having this knowledge memorised was not necessary to teach effectively. We suggest that this manuscript demonstrates that high-quality teaching delivering long-term improvements in performance can be provided by non-experts who do not have the same level of knowledge as a consultant—the reasons for which are discussed below.
Near-peer versus expert teaching
Numerous studies have found that near-peer teaching is as effective as that delivered by more senior faculty members4 6–8 25 26 29 30 and one study9 found that the peer-led group performed better. This effect is not confined solely to medical education.31
This may appear counter-intuitive, as senior faculty members can be assumed to have a more complete knowledge of the subject matter and greater teaching experience. In what way do near peers compensate for this relative lack of knowledge?
Cornwall11 proposed that the cognitive congruence of near peers—that is, the similarity of the learners’ and facilitators’ cognitive schema—provides this benefit.
‘… his explanation is almost bound to be at the same conceptual level as that of his peers;… The teacher-expert on the other hand must always try to empathize with the learner…. Even the best teacher is not always very successful in this respect.’ (p. 84)
Near peers share social congruence12 with learners, understanding and anticipating the specific problems they may encounter. Junior doctors are well placed to teach human factors and offer guidance on issues specific to newly qualified doctors,7 preparing them for the practicalities of starting work on the wards in a way that consultant led teaching may not.
Teachers as learners
Student test scores improved and junior doctors demonstrated a significant improvement in knowledge. Individuals acting as near-peer teachers develop enhanced knowledge of the subject matter.13 This can be partly explained by the relearning involved when preparing to teach, and our feedback indicated this was a common motivation for junior doctors when signing up. However, there is also a distinct benefit from the teaching process itself, evoking a separate, synergistic learning process32 over and above the revision process.
Finally, this form of teaching offers junior doctors an opportunity to receive training in clinical teaching—a skill deemed essential by the GMC.
Conclusion
We provide the first objective evidence that near-peer simulation is an effective tool for teaching knowledge of acute management guidelines to both students and facilitators. Students who participate in this form of teaching also report significant long-term improvements in confidence. While the conclusions that can be drawn from this study are limited by the lack of control group and lack of randomisation, we have demonstrated that junior doctors are capable of setting up and running a near-peer simulation teaching programme to supplement pre-existing teaching and that there is a significant interest in such teaching by both students and junior doctors. We hope this work will help contribute to the adoption and acceptance of such courses, thus providing the resources for more methodologically rigorous studies in the future through the backing of medical schools and hospital trusts.
Acknowledgments
Ms Kandy Collings, Ms Angela Lait, Mr Jan Sominka, Dr Julie Blundell, Dr William Jewell.
Footnotes
Contributors: JWC: design and implementation of the IMPLSE course 2014–2016; data acquisition, statistical analysis and data interpretation. TB: design and implementation of the IMPLSE course 2014–2016; data acquisition; gaining ethical approval. LAC, AG: implementation of the IMPLSE course 2015–2017; data acquisition. LS: data acquisition.
Competing interests: None declared.
Ethics approval: University of Exeter Medical School Research Ethics Committee.
Provenance and peer review: Not commissioned; externally peer reviewed.
References
1. Ten Cate O, Durning S. Peer teaching in medical education: twelve reasons to move from theory to practice. Med Teach 2007;29:591–9. doi:10.1080/01421590701606799
2. Ten Cate O, Durning S. Dimensions and psychology of peer teaching in medical education. Med Teach 2007;29:546–52. doi:10.1080/01421590701583816
3. Secomb J. A systematic review of peer teaching and learning in clinical education. J Clin Nurs 2008;17:703–16. doi:10.1111/j.1365-2702.2007.01954.x
4. Burgess A, McGregor D, Mellis C. Medical students as peer tutors: a systematic review. BMC Med Educ 2014;14:115. doi:10.1186/1472-6920-14-115
5. Ross MT, Cameron HS. Peer assisted learning: a planning and implementation framework: AMEE Guide no. 30. Med Teach 2007;29:527–45. doi:10.1080/01421590701665886
6. Hughes TC, Jiwaji Z, Lally K, et al. Advanced Cardiac Resuscitation Evaluation (ACRE): a randomised single-blind controlled trial of peer-led vs expert-led advanced resuscitation training. Scand J Trauma Resusc Emerg Med 2010;18:3. doi:10.1186/1757-7241-18-3
7. Rodrigues J, Sengupta A, Mitchell A, et al. The Southeast Scotland foundation doctor teaching programme—is ‘near-peer’ teaching feasible, efficacious and sustainable on a regional scale? Med Teach 2009;31:e51–7. doi:10.1080/01421590802520915
8. Rashid MS, Sobowale O, Gore D. A near-peer teaching program designed, developed and delivered exclusively by recent medical graduates for final year medical students sitting the final objective structured clinical examination (OSCE). BMC Med Educ 2011;11:11. doi:10.1186/1472-6920-11-11
9. Tolsgaard MG, Gustafsson A, Rasmussen MB, et al. Student teachers can be as good as associate professors in teaching clinical skills. Med Teach 2007;29:553–7. doi:10.1080/01421590701682550
10. Moust JHC, Schmidt HG. Facilitating small-group learning: a comparison of student and staff tutors’ behavior. Instr Sci 1995;22:287–301. doi:10.1007/BF00891782
11. Cornwall MG. Students as teachers: peer teaching in higher education. Universiteit, 1980.
12. Lockspeiser TM, O’Sullivan P, Teherani A, et al. Understanding the experience of being taught by peers: the value of social and cognitive congruence. Adv Health Sci Educ Theory Pract 2008;13:361–72. doi:10.1007/s10459-006-9049-8
13. Tang TS, Hernandez EJ, Adams BS. ‘Learning by teaching’: a peer-teaching model for diversity training in medical school. Teach Learn Med 2004;16:60–3. doi:10.1207/s15328015tlm1601_12
14. The General Medical Council. Good medical practice. London: The General Medical Council, 2013.
15. The UK Foundation Programme. The UK foundation programme curriculum 2012. 2012. Retrieved 10 Sep 2015. http://www.foundationprogramme.nhs.uk/download.asp?file=FP_Curriculum_2012_Updated_for_Aug_2015_-_FINAL.pdf
16. Ewy GA, Felner JM, Juul D, et al. Test of a cardiology patient simulator with students in fourth-year electives. J Med Educ 1987;62:738–43. doi:10.1097/00001888-198709000-00005
17. Ziv A, Wolpe PR, Small SD, et al. Simulation-based medical education: an ethical imperative. Acad Med 2003;78:783–8.
18. Tubaishat A, Tawalbeh LI. Effect of cardiac arrhythmia simulation on nursing students’ knowledge acquisition and retention. West J Nurs Res 2015;37:1160–74. doi:10.1177/0193945914545134
19. Ross AJ, Kodate N, Anderson JE, et al. Review of simulation studies in anaesthesia journals, 2001–2010: mapping and content analysis. Br J Anaesth 2012;109:99–109. doi:10.1093/bja/aes184
20. Bradley P. The history of simulation in medical education and possible future directions. Med Educ 2006;40:254–62. doi:10.1111/j.1365-2929.2006.02394.x
21. Perkins GD, Barrett H, Bullock I, et al. The Acute Care Undergraduate TEaching (ACUTE) Initiative: consensus development of core competencies in acute care for undergraduates in the United Kingdom. Intensive Care Med 2005;31:1627–33. doi:10.1007/s00134-005-2837-4
22. Hodgetts TJ, Kenward G, Vlackonikolis I, et al. Incidence, location and reasons for avoidable in-hospital cardiac arrest in a district general hospital. Resuscitation 2002;54:115–23. doi:10.1016/S0300-9572(02)00098-9
23. Devitt JH, Kurrek MM, Cohen MM, et al. The validity of performance assessments using simulation. Anesthesiology 2001;95:36–42. doi:10.1097/00000542-200107000-00011
24. Sutton RM, Niles D, Meaney PA, et al. Low-dose, high-frequency CPR training improves skill retention of in-hospital pediatric providers. Pediatrics 2011;128:e145–51. doi:10.1542/peds.2010-2105
25. Cooper DD, Wilson AB, Huffman GN, et al. Medical students’ perception of residents as teachers: comparing effectiveness of residents and faculty during simulation debriefings. J Grad Med Educ 2012;4:486–9. doi:10.4300/JGME-D-11-00269.1
26. Antonelou M, Krishnamoorthy S, Walker G, et al. Near-peer facilitation: a win-win simulation. Med Educ 2014;48:544–5. doi:10.1111/medu.12443
27. Bruner JS. The process of education. 2nd revised edn. Cambridge, MA: Harvard University Press, 1960.
28. Fitzpatrick S, O’Neil P. Getting into the foundation programme: the new selection methods. BMJ Careers 2012:3–4.
29. Knobe M, Münker R, Sellei RM, et al. Peer teaching: a randomised controlled trial using student-teachers to teach musculoskeletal ultrasound. Med Educ 2010;44:148–55. doi:10.1111/j.1365-2923.2009.03557.x
30. Nestel D, Kidd J. Peer assisted learning in patient-centred interviewing: the impact on student tutors. Med Teach 2005;27:439–44. doi:10.1080/01421590500086813
31. Moust JC, Schmidt HG. Effects of staff and student tutors on student achievement. Higher Education 1994;28:471–82. doi:10.1007/BF01383938
32. Fiorella L, Mayer RE. The relative benefits of learning by teaching and teaching expectancy. Contemp Educ Psychol 2013;38:281–8. doi:10.1016/j.cedpsych.2013.06.001