Abstract
Background
Digital rectal examination (DRE) is a challenging examination to learn.
Objective
To synthesise evidence regarding the effectiveness of technology-enhanced simulation (TES) for acquiring DRE skills.
Study selection
EMBASE, Medline, CINAHL, Cochrane, Web of Knowledge (Science and Social Science), Scopus and IEEE Xplore were searched; the last search was performed on 3 April 2019. Included were original research studies evaluating TES to teach DRE. Data were abstracted on methodological quality, participants, instructional design and outcomes; a descriptive synthesis was performed. Quality was assessed using a modified Medical Education Research Study Quality Instrument. The study design domain was modified by scoring the papers based on (1) evaluation of risk of bias for randomised controlled trials, (2) description of participants and (3) assessment of robustness and degree of simulation fidelity of the assessments used to evaluate learning.
Findings
863 articles were screened; 12 were eligible, enrolling 1507 prequalified medical/clinical students and 20 qualified doctors. For skill acquisition, role player was statistically significantly superior to a static manikin (2 studies). For knowledge acquisition, manikin use was significantly superior to role player (1 study); 2 studies showed no difference. For confidence, manikin use was significantly superior to no manikin (4 studies). For comfort, manikin use was significantly superior to no manikin (2 studies). For anxiety, role player was significantly superior to manikin (1 study).
Median overall quality score (QS) was 48% (27–62). Highest median QS was 73% (33–80) for data analysis; lowest median QS was 20% (7–40) for the validity of instrument. Six papers scored over 50% of the maximum score for overall quality.
Conclusions
TES training is associated with improved DRE skills and should be used more widely.
Keywords: Clinical competence, medical simulation, simulation based learning, systematic review, technology enhanced learning
INTRODUCTION
Digital rectal examination (DRE) is an important physical examination tool that can identify abnormalities of the anus, rectum and (in the male) prostate. DRE is an established examination technique that helps to identify life-threatening diseases (eg, prostate cancer) 1 : DRE detected prostate cancer in 13% of 1905 men with normal levels (<4 ng/mL) of prostate-specific antigen during a prostate cancer screening study. 2 As Verghese et al 3 have described, failing to perform necessary physical examinations may result in missing important diagnostic findings, with potentially dire consequences. The General Medical Council 4 5 has indicated the need for doctors to conduct intimate examinations of the breasts, genitalia and rectum when necessary.
Learning DRE, like some other intimate examinations, is particularly challenging because the examination takes place in an unseen space: the trainer has no visual cues for guidance and feedback, such as might be available in, say, chest percussion (a difficulty shared with examinations such as that of the tympanic membrane). During teaching, the trainer can neither ascertain whether the learner is palpating the correct area or anatomical structure, nor whether the learner can differentiate between normal and abnormal presentations. Clinical students must rely on what their finger feels to locate the correct position and to construct a picture of normal and abnormal anatomical structures and positions. In an Australian study, students reported that they had little idea of what they should be doing or looking for when conducting intimate physical examinations, and stated that, if they could do more examinations, they believed they would become more competent. 6 Similarly, data from foundation year 1 (FY1) doctors at two UK universities show that FY1 doctors had very little experience of performing DRE on a patient. 7 Wong et al 8 surveyed faculty, fellows, medical residents and final-year medical students at four medical centres in four different US states. Of the 652 respondents who completed the questionnaire, 56% indicated that they were not at all, a little, or only somewhat comfortable with a DRE. More than 30% of respondents with less than 4 years of experience reported that they ‘Need confidence/competence to do DRE’. Thus, the lack of comfort among medical students and junior doctors may be explained by inadequate training.
Technology-enhanced simulation (TES) has been introduced to medical education to improve students’ learning and skill acquisition. 9 10 Normal and abnormal presentations and pathological changes can be simulated using synthetic models and manikins, computer-based manikins with pressure-sensor technology, and virtual reality simulators. 11 Although the use of TES in medical education for teaching male rectal examinations has been studied, we are not aware of any systematic reviews synthesising the results.
The aim of this review is to integrate the evidence on the role and effectiveness of technology-enhanced simulation in teaching DRE skills to healthcare students and practitioners.
METHODS
The protocol was registered prospectively on PROSPERO (registration no. CRD42019128808). The review follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement (appendix 1). The registered protocol was used to guide answering the following questions:
How has technology-enhanced simulation been used to improve training for digital rectal examination for clinicians?
How effective is technology-enhanced simulation for acquiring and retaining the required skills to examine the prostate correctly and competently?
Which pedagogical approaches increase learning effectiveness and are associated with improved outcomes?
Study eligibility
We included primary empirical research papers published in peer-reviewed journals or conference proceedings that met the following inclusion criteria: investigating the use of TES to teach or assess DRE skills with hands-on training with haptic, tactile or force feedback. Initially, we included only randomised controlled trials (RCTs) for prequalification or postqualification learners, but only five articles were identified. Therefore, the criteria were broadened to include single-group pretest/post-test comparisons as well as non-randomised studies.
Study identification
We searched seven electronic databases to select eligible articles: EMBASE, Medline, CINAHL, Cochrane, Web of Knowledge (Science and Social Science), Scopus and IEEE Xplore. We used four search term concepts describing the population, intervention, comparator and outcomes to develop a comprehensive search strategy (appendix 2). No date of publication or language restrictions were applied. The last date of the search was 3 April 2019.
The reference lists and citations of the identified papers were scrutinised, and the research profiles of the key expert authors in the field were searched for additional papers. Furthermore, key authors identified during the searching were contacted to identify papers which had not been found using the aforementioned strategies.
Study selection
The references identified through the aforementioned strategies were screened by two of the authors working independently (MAAA and either JP or JE). The title and abstract of the references were screened for inclusion. If the abstract presented insufficient information or there was disagreement between the reviewers, then the full text of the study was reviewed by two of the authors (MAAA and either JP or JE). All conflicts were successfully resolved by consensus.
Data extraction
The data extraction form was developed by MAAA. The form was tested on 10 study samples and amended following discussion between the reviewers (MAAA, JP and JE). MAAA and JP performed data extraction together. Reviewers MAAA, JP and JE tested the previously validated Medical Education Research Study Quality Instrument (MERSQI) 12 on the same 10 samples as were used for data extraction, and the instrument was then modified to improve granularity. This has been done previously in the field of diagnostic accuracy studies when the existing instrument lacks specific criteria that are important to facilitate better quality assessment. 13 Modifications were in four domains: first, the study design domain was modified by scoring the papers based on the risk of bias for RCTs (eg, sequence generation and allocation concealment); this is used to discriminate lower quality RCTs from higher-quality RCTs. 14 Second, in the sampling domain, the participant characteristics were given value (eg, whether gender, age and ethnicity were included); this was to identify whether differences in gender, age and ethnicity may influence learning. Third, in the data type domain, the objective measurements were classified based on their robustness to evaluate learning (eg, recall, analysis or problem-solving type questions); this was to discriminate learners’ level of mastery (‘knows’ or ‘knows how’). 15 Lastly, the outcomes domain was modified to score knowledge and skills outcomes according to the degree of fidelity used in the test setting (eg, teaching associate (TA) vs manikin); this was to estimate the transferability of students’ performance and efficacy of skill transfer to clinical contexts. 
In addition, scores were reweighted in each domain based on the new modifications (ie, RCT with higher risk of bias given lower score than low risk; outcomes and objective measurement which used a true representation of intended learning goals and robust assessment were given higher scores compared to lower fidelity and limited assessment).
All authors (except MSH) assessed all publications for quality using the MMERSQI (both the MERSQI and the MMERSQI can be seen in appendix 3). Discrepancies were resolved by consensus and discussion.
The Kirkpatrick four-level model was used to distinguish the outcomes. 16 Thus, data were abstracted separately for: learners’ reactions (confidence and satisfaction); learning (knowledge and skills); behaviours in the clinical setting with patients; and clinical outcomes (improvement in patient care). If a second time point was reported after a reasonable period (>4 weeks), these data were treated as reflecting skill retention.
Data synthesis
Statistical integration of the data was not possible because there was insufficient consistency in study designs, measurement tools and scales. Heterogeneity was very high (I²=92.86%), so meta-analysis was not appropriate (see, for example, appendix 4, in which an analysis was performed for the largest set of papers with all relevant data). A descriptive synthesis was therefore generated to summarise the methodology, type of learner, interventions and outcomes. Frequency tables are used to summarise the studies’ key features, and the synthesised findings are presented in a narrative structure to address the research questions. 17
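For readers unfamiliar with the I² statistic, it expresses the percentage of variability in effect estimates that is due to between-study heterogeneity rather than chance, computed from Cochran’s Q and its degrees of freedom. The sketch below illustrates the standard formula only; the Q and df values shown are hypothetical illustrations, not figures taken from appendix 4.

```python
def i_squared(q: float, df: int) -> float:
    """Higgins' I-squared: percentage of variability across studies
    attributable to heterogeneity rather than chance. Truncated at 0."""
    if q <= df:
        return 0.0
    return (q - df) / q * 100

# Illustration only (hypothetical Q and df, not the review's actual data):
# a Cochran's Q of 14 with 1 degree of freedom gives the I² reported above.
print(round(i_squared(14, 1), 2))  # 92.86
```

Values of I² above roughly 75% are conventionally read as considerable heterogeneity, which is why a descriptive synthesis was preferred here.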
RESULTS
Trial flow
After removal of duplicates, the abstracts of 863 articles were screened and 5 articles were identified in the first iteration (based on the criterion of RCTs for prequalification or postqualification learners). As already stated, due to the small number of articles identified, the criteria were broadened to include single-group pretest/post-test comparison, as well as non-randomised studies. Fourteen more articles were identified in the second iteration, resulting in a total of 19 papers for detailed evaluation.
We identified a further 16 articles by reviewing the reference lists in the articles identified originally, by using citation searching of these articles and by investigating the research profile of the key expert authors. No articles were added by experts contacted by the authors. Articles published in non-English languages (n=2) were translated to English by bilingual colleagues. Detailed evaluation of these 35 papers resulted in the exclusion of 23 for the following reasons: (1) not DRE (16 papers), (2) intervention not TES (4), (3) insufficient data and no response from author (1) and (4) not intimate examination (2). Thus, we included 12 papers enrolling 1507 prequalified medical/clinical students and 20 postqualified surgeons. Figure 1 shows trial flow; table 1 shows the key features of the papers.
Figure 1.
Trial flow. RCT, randomised controlled trial.
Table 1.
Key features of the 12 included papers
| Author | Design | Sample | Intervention | Comparator(s) | Instructional design | Outcomes |
|---|---|---|---|---|---|---|
| Popadiuk et al (2002) 9 | RCT | Medical students, n=65 | Lecture followed by manikin then TA session | Lecture followed by rectal manikin | FB, MLS, T 1 hour for two students | Correct diagnosis of rectal examination; knowledge; satisfaction with learning method |
| Gerling et al (2009) 18 | RCT | Medical students and nurse practitioner students, n=36 | Virginia Prostate Examination Simulator (VPES) | Life-size prostate model set; rectal manikin | FB, RTD, CV, T 10 min | Correct identification of abnormality (eg, boggy); correct identification of pathology (eg, prostatitis) |
| Robb et al (2013) 11 | RCT | Medical students, n=27 | Augmented virtual patient with rectal manikin | Rectal manikin | FB | Comfort; confidence; percentage of the prostate examined (coverage); time spent performing the exam |
| Carrasco et al (2018) 19 | RCT | Medical students, n=28 | Theoretical explanation followed by rectal manikin | Theoretical explanation | Tx | Confidence; mental load; satisfaction with learning method; skills acquired |
| Hendrickx et al (2009) 20 | Three-group, non-RCT | Medical students, n=233 | Group 2: video, rectal manikins, teaching associate, 2 weeks’ training in urology service; group 3: same as group 2 but with no internship | Rectal manikin | FB, CI, MLS, T 3 hours | Number of intimate examinations done during internship; self-estimated competence; technical performance |
| Siebeck et al (2011) 21 | Crossover with randomisation | Medical students, n=188 | Teaching associate then rectal manikin | Rectal manikin then teaching associate | FB, MLS, CV, CIA, T 30 min ×2, RP | Inhibition concerning performing the rectal examination; knowledge |
| Stolarek (2007) 22 | Two-group, non-RCT | Postgraduate junior doctors, n=20 | Rectal manikin training before the start of clinical work | No rectal manikin training | RP, MLS, CIA, T 8 hours | Confidence |
| Rodriguez-Diez et al (2014) 23 | One-group, pretest and post-test | Medical students, n=173 | Workshop for rectal examination and bladder catheterisation on simulator | None | MLS, T 1 100 min | Confidence; satisfaction |
| Isherwood et al (2012) 24 | One-group, pretest and post-test | Medical students, n=377 | DRE training with manikin | None | RP, CI, CV, CIA | Confidence |
| Pugh et al (2012) 25 | One-group, pretest and post-test | Medical students, n=350 | Rectal manikin | None | FB, CI, MLS, T 45 min | Comfort levels; top cause of anxiety |
| Cohen et al (2013) 26 | One-group, pretest and post-test | Physician associate students, n=30 | Rectal manikin | None | FB, CI, MLS, DP, T 8 hours ×2 | Anxiety |
| Hegele et al (2014) 27 | One-group, post-test | Medical students, n=190 | Rectal manikin | None | MLS, CV | Acceptance of urology as a career; level of learning |
CI, curriculum integration; CIA, cognitive interactivity; CV, clinical variation; DP, distributed practice (whether learners trained on 1 or >1 day); FB, feedback; RCT, randomised controlled trial; RP, repetitive practice; RTD, range of task difficulty; T, time learning; TA, teaching associate; Tx, cannot tell.
Study characteristics
The use of TES to teach DRE is fairly recent, with the oldest study published in 2002. Ten papers were published in English, 9 11 18 20–26 one in Spanish 19 and one in German. 27 The studies were conducted in seven different countries: five papers were from North America (four from the USA 11 18 25 26 and one from Canada 9 ), six from Europe (one from Belgium, 20 two from Germany, 21 27 two from Spain 19 23 and one from the UK 24 ) and one from New Zealand. 22 The participants were medical students in nine of the studies; 9 11 19–21 23–25 27 in the other three, they were physician associate students, 26 nursing practitioner students 18 and postqualified surgical trainees. 22
Most of the 12 papers combined TES with other learning strategies. TES was combined with lectures in five, 9 19 23–25 with video in three 20 25 26 and with TAs in three. 9 20 21 Curricular integration of the simulation training was identifiable in four, 20 24–26 skill acquisition was measured in five 9 11 18–20 and confidence was measured in five. 11 19 22–24 Only three papers 9 21 27 measured knowledge and four 9 19 23 27 measured satisfaction.
Different types of simulators were used as intervention or comparator. All of the papers except one 18 used life-size anatomy models with a torso capable of simulating a variety of different prostate abnormalities. One study 18 used a life-size prostate anatomy model without torso. Pressure sensors were added to a static rectal model in one of the papers, 25 and this gave continual visual feedback by generating a graphic showing the location and pressure exerted by the examining finger. Interactive virtual patients displayed on-screen were used in one of the papers 11 simultaneously with a life-size prostate anatomy model, to simulate history taking and rectal examination. One study 18 used silicone-elastomer materials embedded with balloons and sensors to simulate the feel of normal and abnormal prostate tissue.
Synthesis
In this data synthesis, we have divided outcomes of interest into five categories: acquisition of clinical skills, acquisition of knowledge, learner satisfaction, learner confidence, and learner anxiety and comfort levels (table 2).
Table 2.
Outcome results of pre and post or control versus experimental
| Outcomes | Reference | n, pre or control | Max score | Mean (SD), pre or control | n, post or experimental | Mean (SD), post or experimental | P value | Percentage point change in scores |
|---|---|---|---|---|---|---|---|---|
| Acquisition of clinical skills | Popadiuk et al (2002) 9 | 25 | 33 | 23.80 (2.92) | 25 | 27.52 (4.55) | 0.001 | 11 |
| | Gerling et al (2009) 18 | 12 | 6 | 3.25 (0.754) | 12 | 4.667 (1.073) | 0.002 | 25 |
| | Hendrickx et al (2009) 20 | 19 | 10 | 5.15 (0.525) | 18 | 7.77 (0.59) | 0.0001 | 26 |
| | Robb et al (2013) 11 | 13 | 100 | 36 (11.5) | 13 | 50 (19) | NR | 14 |
| | Carrasco et al (2018) 19 | 11 | 10 | 8.88 (1.47) | 14 | 9.67 (0.50) | 0.078 | 8 |
| Acquisition of knowledge | Popadiuk et al (2002) 9 | 18† | 25 | 23.11 (1.32) | 17 | 21.47 (2.5) | 0.025 | −6 |
| | Siebeck et al (2011) 21 | NR | 17 | 14.65 (2.3) | NR | 15.75 (1.8) | >0.99 | 7 |
| | Hegele et al (2014) 27 | NA | 100 | NA | 147 | >99 | NR | NA |
| Learner confidence | Stolarek (2007) 22 | 10 | 40 | 11 (NR) | 10 | 28 (NR) | 0.006 | 42 |
| | Isherwood et al (2012) 24 | 360 | 6 | 2.22 (NR) | 358 | 4.87 (NR) | 0.0001 | 44 |
| | Robb et al (2013) 11 | 13 | 6 | 3.50 (1.5) | 13 | 3.71 (1.8) | 0.0001 | 4 |
| | Rodriguez-Diez et al (2014) 23 | 155 | 5 | 2.61 (1.31) | 155 | 4.30 (1.05) | 0.001 | 34 |
| | Carrasco et al (2018) 19 | 11 | 10 | 1.63 (1.96) | 14 | 2.56 (1.50) | 0.046 | 10 |
| Learner anxiety and comfort levels | Siebeck et al (2011) 21 | NR | 5 | 2.9* (1.0) | NR | 2.67* (0.90) | 0.10 | 5 |
| | Pugh et al (2012) 25 | 338 | 6 | 2.56 (1.06) | 338 | 3.32 (1.08) | 0.001 | 12 |
| | Cohen et al (2013) 26 | 30 | 6 | 4.10 (1.24) | 30 | 5.10 (0.76) | 0.005 | 17 |
| | Robb et al (2013) 11 | 13 | 6 | 4.2 (NR) | 13 | 4.2 (NR) | 0.139 | 0 |
*Anxiety level.
†Popadiuk reported on different interventions with different groups, hence the variation in numbers.
n, number of participants; NA, not applicable; NR, not reported.
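The final column of table 2 appears to express the change in mean score as a fraction of the maximum attainable score, rounded to the nearest percentage point. A minimal sketch under that assumption (the helper name is ours, not from the paper); it reproduces, for example, the Popadiuk and Robb skills rows:

```python
def pct_point_change(pre_mean: float, post_mean: float, max_score: float) -> int:
    """Change in mean score expressed as percentage points of the maximum score."""
    return round((post_mean - pre_mean) / max_score * 100)

# Checked against two rows of table 2:
print(pct_point_change(23.80, 27.52, 33))  # 11 (Popadiuk et al, skills)
print(pct_point_change(36, 50, 100))       # 14 (Robb et al, skills)
```

A couple of rows differ from this formula by a point, presumably because the authors rounded intermediate values.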
Acquisition of clinical skills
The acquisition of clinical skills was measured in five papers. 9 11 18–20 Two 9 18 focused on skill-product outcomes (correct identification of prostate abnormalities) and three 11 19 20 focused on skill-process outcomes (systematic approach, completeness and duration). One study shows that standard rectal model followed by TA training improved skill-product significantly compared to the standard rectal model followed by peer role-play. 9 One study reported a standard rectal model followed by TA training improved skill-process outcomes compared to the standard rectal model alone. 20 A static low-fidelity rectal model was compared to a dynamic model that provided learners with visual feedback (finger pressure and location) and tactile feedback (produced by pulsating balloons). 18 This study shows both static and dynamic simulators improve diagnostic ability; however, the transfer of training to new exam scenarios was significantly higher when trained with the dynamic simulator compared to the static model. Similarly, the computerised virtual patient combined with the static rectal model was found to improve students’ performance (prostate examination coverage and duration) when compared to a non-augmented reality simulator. 11 Correspondingly, students trained with the standard rectal model simulator performed better in skill process outcomes compared to lectures only. 19 These findings suggest haptic and visual feedback provided to the learners during simulation training improved their clinical skills.
Acquisition of knowledge
Three papers measured knowledge after DRE training with technology simulation. 9 21 27 Two papers show that, when combined with peer role-play, technology simulation training is equal to or better than TA-based training in improving knowledge. 9 21 One study 9 reported that postintervention knowledge assessment scores were significantly higher in the group that used the rectal model followed by peer role-play than in the group that used the rectal model followed by a TA. The second study 21 reported no significant difference in knowledge acquisition between technology simulation and TA training. In the third study 27 (single-group post-test), 99% of the medical students, who trained with a rectal model, reported improvement in knowledge and skills.
Learner satisfaction
Four papers reported some limited data on the level of students’ satisfaction after DRE training with TES. 9 19 23 27 In one study, 9 the TA was scored as the preferred learning method for students. However, students not exposed to the TA in the same study scored the lectures higher than standard rectal model simulation for learning and understanding rectal examination. In two papers, 19 27 students scored the standard rectal model simulation higher than lectures for learning clinical skills. Similarly, the rectal model was scored as a highly satisfactory learning method for acquiring skills and knowledge in a single-group study. 23
Learner confidence
Five papers reported students’ confidence in performing a DRE after training with a technology-enhanced simulator. 11 19 22–24 The rectal model augmented with a computerised virtual patient was reported to have a similar effect on confidence as the non-augmented rectal model after simulation training. 11 The remaining four papers 19 22–24 reported improvements in students’ confidence of more than five percentage points after simulation training with a standard rectal model. According to Janjua et al, 28 a five percentage point increase in competence and confidence is considered clinically/educationally meaningful.
Learner anxiety and comfort levels
The inhibition of skills or knowledge acquisition by anxiety or negative emotion was investigated after male rectal examination training in four papers. 11 21 25 26 High-fidelity simulation (eg, TA) reduced inhibition concerning performing the rectal examination significantly more than low-fidelity simulation (eg, rectal model). 21 Cohen et al 26 reported that both low- and high-fidelity simulation significantly improved comfort levels. These findings were consistent with those of Pugh et al, 25 who showed students were significantly more comfortable after simulation training using a standard rectal model. In contrast, Robb et al 11 reported a greater change in comfort level in the group trained with the standard rectal model than in the group trained with the rectal model augmented with a virtual patient; however, there was no difference in comfort after training between the standard and augmented models.
Study quality
The methodological quality of each study was evaluated independently by five reviewers (MAAA, FB, JE, JP, RJS) using the Modified Medical Education Research Study Quality Instrument (MMERSQI; maximum of 18 points), see table 3. Six of the 12 papers scored more than 50% overall. The average methodological quality score on the MMERSQI was 8.2 of 18 (45%; SD 1.6).
Table 3.
Quality assessment of the 12 included papers showing mean and total scores
| Author (year) | Quality of study design (% of maximum score) | Quality of sampling (% of maximum score) | Quality of type of data (% of maximum score) | Quality of validity of instrument (% of maximum score) | Quality of data analysis (% of maximum score) | Quality of outcomes (% of maximum score) | Overall paper quality (% of maximum score) |
|---|---|---|---|---|---|---|---|
| Popadiuk et al (2002) 9 | 77 | 40 | 57 | 27 | 73 | 40 | 52 |
| Stolarek (2007) 22 | 33 | 33 | 33 | 17 | 33 | 20 | 28 |
| Gerling et al (2009) 18 | 67 | 47 | 63 | 17 | 73 | 47 | 52 |
| Hendrickx et al (2009) 20 | 20 | 27 | 33 | 7 | 53 | 20 | 27 |
| Siebeck et al (2011) 21 | 37 | 27 | 40 | 20 | 73 | 27 | 37 |
| Isherwood et al (2012) 24 | 47 | 50 | 67 | 30 | 80 | 45 | 54 |
| Pugh et al (2012) 25 | 73 | 30 | 77 | 27 | 80 | 67 | 59 |
| Robb et al (2013) 11 | 33 | 47 | 40 | 40 | 73 | 27 | 43 |
| Cohen et al (2013) 26 | 77 | 53 | 80 | 33 | 77 | 50 | 62 |
| Rodriguez-Diez et al (2014) 23 | 33 | 47 | 33 | 7 | 57 | 23 | 35 |
| Hegele et al (2014) 27 | 50 | 53 | 70 | 17 | 80 | 40 | 52 |
| Carrasco et al (2018) 19 | 50 | 13 | 33 | Not reported | 67 | 20 | 30 |
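As a quick arithmetic check, the overall paper quality column of table 3 reproduces the summary figures quoted in the abstract (median 48%, range 27–62, six papers above 50%):

```python
from statistics import median

# Overall paper quality scores (%) copied from the last column of table 3,
# in row order (Popadiuk ... Carrasco):
overall = [52, 28, 52, 27, 37, 54, 59, 43, 62, 35, 52, 30]

print(round(median(overall)))        # 48 (the abstract's median QS of 48%)
print(min(overall), max(overall))    # 27 62 (the reported range 27-62)
print(sum(s > 50 for s in overall))  # 6 papers scoring over 50%
```

The raw median is 47.5, so the abstract's 48% reflects rounding to the nearest whole percentage point.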
DISCUSSION
We are not aware of previous reviews of TES for teaching male DRE. Our review concurs with three previous reviews 10 29 30 of simulation for other intimate examinations, which showed a large, educationally significant benefit in favour of simulation training for healthcare professionals. Technology-enhanced simulator training in DRE is associated with improvements in learner outcomes. The simulators capable of providing feedback are associated with greater skill acquisition but with no difference in knowledge acquisition. TAs were found to be the more effective teaching method for skill acquisition; however, there has been no direct comparison of TAs and TES in DRE. TES was scored by students as a more effective teaching method for skill acquisition than standard rectal model or lectures. The papers which we have included achieved modest overall quality scores when evaluated with MMERSQI. It is worth mentioning that the studies which scored more than 50% on the MMERSQI quality measurement showed that use of TA or dynamic simulator was associated with increased skill acquisition. However, this was not evident with knowledge acquisition.
The capability of technology-enhanced simulators to simulate a wide range of disease with different levels of difficulty may help skills transfer. Low-fidelity simulators offer more time with less stress for learning, which may help learners to better understand the DRE examination and to develop the skills required to detect anatomical landmarks, as well as abnormal pathological structures. Learners trained with low-fidelity simulators reported comfort and confidence in their DRE skills. These confidence scores should be interpreted cautiously, as this improvement could falsely exaggerate the learner’s sense of preparation for conducting DRE in real patients. 31 32 Only one paper 24 of the included papers which reported on the impact of simulation on confidence achieved an overall score more than 50% using MMERSQI. Using different types of simulation to achieve different learning goals could be more efficient and cost-effective, for example, using low fidelity for the cognitive domain of learning and high fidelity for skills and the affective domain of learning.
Limitations and strengths
There are some limitations to our review. We found few papers on TES for DRE, and the quality of the published papers was generally not high. The number of participants ranged from 20 to 377, with a median of 119. Seven of the papers compared two or more groups, and five of these seven used randomisation (71%). Only three papers had blinded outcome assessment, in which performance was measured by the technology-enhanced simulator in two papers 11 18 and by a blinded assessor in one study. 21 There is insufficient information about the random sequence generation process in all seven comparative papers to permit clear judgement of bias risk. The response rate of those who completed the intervention evaluation, post-test or survey was more than 75% in all the papers, except one study that did not report the number of participants lost before the outcome assessment. We were not able to perform meta-analysis due to insufficient statistical data and heterogeneity across the different papers, which used different interventions and scales. We found no direct comparison between TAs and technology-assisted simulators for skill acquisition; TA simulation training was always accompanied by an additional teaching method (eg, rectal model). Only two papers compared different simulators, but the sample sizes of these studies 11 18 were relatively small and training time was limited.
There are numerous strengths to this review. This is a comprehensive systematic review of the acquisition of skill and knowledge. The search was guided with predefined and reproducible inclusion and exclusion criteria. The papers included encompassed a broad range of learners, study designs and outcomes from different parts of the world. We used duplicate, independent and reproducible data abstraction, and rigorous coding of the study methodological criteria. The quality assessments were done by five independent reviewers.
Implications for education and research
The evidence reviewed here indicates that TES training is associated with DRE skills improvement for novice learners. TES may facilitate the transfer of skills that are challenging to learn. Therefore, we recommend further studies to compare high-fidelity and high technology-enhanced simulators with TAs for skill acquisition (the data suggest knowledge can be acquired equally effectively using low fidelity and lower technology). During DRE, learners have no visual cues and the instructors are not able to evaluate learners’ performance or provide feedback. Evidence suggests that feedback is an important element of the learning process, which can be achieved by TES. We speculate that incorporating clinical variation and a range of difficulties in the simulators would help to transfer skills from simulation training to the clinical environment, although empirical evidence to support this proposition is lacking. Further research is required to confirm the effect of TES to train novice learners in terms of skill acquisition and retention using randomisation, blinding of the outcome assessment and sufficient sample size with mixed ethnicity. Also, research is needed to explore the most appropriate pedagogical approaches (eg, clinical variation and range of difficulties) to increase learning effectiveness in simulation training. Lastly, we have not addressed the issue of skills attrition over time and the potential for TES of intimate examinations to ‘reskill’ clinicians who may use these skills rarely but still need to maintain competence through many decades of clinical practice. The ‘cost’ saved to patients (whose inconvenience and embarrassment is largely ignored in current clinical teaching) is not considered by us here.
CONCLUSION
Teaching DRE with high-fidelity methods (eg, TAs) improves student skill acquisition and reduces student anxiety compared with other teaching approaches. Anatomical and clinical knowledge acquisition, student confidence and comfort can be better achieved with low-fidelity methods (eg, a static manikin). Rather than leaving students to opportunistic learning and assuming that trainees will receive sufficient exposure to an intimate examination during their training, simulation-based training can standardise their exposure to routine practice as well as to low-frequency critical events. The opportunity to use simulation to summatively assess trainee clinicians’ skills in DRE (and indeed other intimate examinations) is also worth exploring. Assessing the effect of simulation-based education on patient outcomes remains a major challenge; however, we believe that mastering DRE in simulation will result in the examination being performed more frequently, which could reduce doctor-associated delay in rectal/prostate cancer diagnosis. We recommend that trainees demonstrate a minimal level of competence and comfort before performing this invasive examination on a ‘real’ patient. As discussed elsewhere by Draper et al,33 the ‘old system’ in which patients had no role or opinion in the education of clinicians is no longer tenable: we have a moral duty to prepare students as well as we can. Judgement of students’ competence should be based not simply on how many times a trainee has performed a DRE but on their ability to accurately describe nodules and other findings.
Acknowledgments
The authors thank Professors Roger Kneebone, Debra Nestel and Carla Pugh for assisting with the identification of relevant papers.
Footnotes
Contributors: MAAA proposed the theme of the review and the search protocol, and contributed to the literature search, abstract and full-text screening, study coding, quality assessment, and the initial drafting and redrafting of the manuscript. JP contributed to full-text screening, study coding, quality assessment, contacting experts to identify papers, and drafting and redrafting of the manuscript. JE contributed to abstract screening, quality assessment, and drafting and redrafting of the manuscript. RJS and FB contributed to quality assessment, and drafting and redrafting of the manuscript. SH advised on statistical analysis, evaluated the extracted data for possible meta-analysis, and contributed to drafting and redrafting of the manuscript.
Funding: The first author (MAAA) is funded for his PhD by the Saudi Arabian Ministry of Education. The other authors contributed to the research as part of their regular work duties.
Competing interests: None declared.
Provenance and peer review: Not commissioned; externally peer reviewed.
Data availability statement: Data are available upon reasonable request.
Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
REFERENCES
- 1. National Institute for Health and Care Excellence. Suspected cancer: recognition and referral. 2015. Available https://www.nice.org.uk/guidance/ng12/chapter/1-Recommendations-organised-by-site-of-cancer#urological-cancers (accessed 12 Feb 2019)
- 2. Carvalhal GF, Smith DS, Mager DE, et al. Digital rectal examination for detecting prostate cancer at prostate specific antigen levels of 4 ng./ml. or less. J Urol 1999;161:835–9. 10.1016/S0022-5347(01)61785-3
- 3. Verghese A, Charlton B, Kassirer JP, et al. Inadequacies of physical examination as a cause of medical errors and adverse events: a collection of vignettes. Am J Med 2015;128:1322–1324.e1323. 10.1016/j.amjmed.2015.06.004
- 4. General Medical Council. Intimate examinations and chaperones: guidance. 2013. Available https://www.gmc-uk.org/-/media/documents/maintaining-boundaries-intimate-examinations-and-chaperones_pdf-58835231.pdf (accessed 15 Mar 2019)
- 5. General Medical Council. Duties of a doctor. 1995. Available https://www.gmc-uk.org/ethical-guidance/ethical-guidance-for-doctors/good-medical-practice/duties-of-a-doctor (accessed 12 Sept 2018)
- 6. Dabson AM, Magin PJ, Heading G, et al. Medical students’ experiences learning intimate physical examination skills: a qualitative study. BMC Med Educ 2014;14:39. 10.1186/1472-6920-14-39
- 7. Yeung JM, Yeeles H, Tang SW, et al. How good are newly qualified doctors at digital rectal examination? Colorectal Dis 2011;13:337–40. 10.1111/j.1463-1318.2009.02116.x
- 8. Wong RK, Drossman DA, Bharucha AE, et al. The digital rectal examination: a multicenter survey of physicians’ and students’ perceptions and practice patterns. Am J Gastroenterol 2012;107:1157–63. 10.1038/ajg.2012.23
- 9. Popadiuk C, Pottle M, Curran V. Teaching digital rectal examinations to medical students: an evaluation study of teaching methods. Acad Med 2002;77:1140–6. 10.1097/00001888-200211000-00017
- 10. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306:978–88. 10.1001/jama.2011.1234
- 11. Robb A, Kopper R, Ambani R, et al. Leveraging virtual humans to effectively prepare learners for stressful interpersonal experiences. IEEE Trans Vis Comput Graph 2013;19:662–70. 10.1109/TVCG.2013.35
- 12. Reed DA, Cook DA, Beckman TJ, et al. Association between funding and quality of published medical education research. JAMA 2007;298:1002–9. 10.1001/jama.298.9.1002
- 13. Wade R, Corbett M, Eastwood A. Quality assessment of comparative diagnostic accuracy studies: our experience using a modified version of the QUADAS-2 tool. Res Synth Methods 2013;4:280–6. 10.1002/jrsm.1080
- 14. Higgins JP, Savović J, Page M, et al. Assessing risk of bias in a randomized trial. In: Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA, eds. Cochrane handbook for systematic reviews of interventions. The Cochrane Collaboration and John Wiley & Sons Ltd., 2019:205–28.
- 15. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65:S63–67. 10.1097/00001888-199009000-00045
- 16. Kirkpatrick D. Great ideas revisited. Train Dev 1996;50:54. ISSN 1055-9760.
- 17. Arai L, Britten N, Popay J, et al. Testing methodological developments in the conduct of narrative synthesis: a demonstration review of research on the implementation of smoke alarm interventions. Evid Policy 2007;3:361. 10.1332/174426407781738029
- 18. Gerling GJ, Rigsbee S, Childress RM, et al. The design and evaluation of a computerized and physical simulator for training clinical prostate exams. IEEE Trans Syst Man Cybern A 2009;39:388–403. 10.1109/TSMCA.2008.2009769
- 19. Carrasco J, Gómez E, García J, et al. Impact of the use of simulators on the mental workload and confidence in a digital rectal examination and bladder catheterization workshop. Arch Esp Urol 2018;71:537–42. PMID: 29991662.
- 20. Hendrickx K, De Winter B, Tjalma W, et al. Learning intimate examinations with simulated patients: the evaluation of medical students’ performance. Med Teach 2009;31:e139–47. 10.1080/01421590802516715
- 21. Siebeck M, Schwald B, Frey C, et al. Teaching the rectal examination with simulations: effects on knowledge acquisition and inhibition. Med Educ 2011;45:1025–31. 10.1111/j.1365-2923.2011.04005.x
- 22. Stolarek I. Procedural and examination skills of first-year house surgeons: a comparison of a simulation workshop versus 6 months of clinical ward experience alone. N Z Med J 2007;120:U2516. PMID: 17514217.
- 23. Rodriguez-Diez MC, Diez N, Merino I, et al. Simulators help improve student confidence to acquire skills in urology. Actas Urol Esp 2014;38:367–72. 10.1016/j.acuroe.2014.02.013
- 24. Isherwood J, Ashkir Z, Panteleimonitis S, et al. Teaching digital rectal examination to medical students using a structured workshop - a point in the right direction? J Surg Educ 2012;70:254–7. 10.1016/j.jsurg.2012.09.009
- 25. Pugh CM, Iannitelli KB, Rooney D, et al. Use of mannequin-based simulation to decrease student anxiety prior to interacting with male teaching associates. Teach Learn Med 2012;24:122–7. 10.1080/10401334.2012.664534
- 26. Cohen E, Ononye C, Salud J, et al. Use of simulation to understand the effects of task complexity and time away on clinical confidence. Stud Health Technol Inform 2013;184:92–5. PMID: 23400136.
- 27. Hegele A, Heers H, Bruening F, et al. How can young academics be recruited? Acceptance and effects of urological practice-oriented training. Urologe 2014;53:236–40. 10.1007/s00120-013-3266-6
- 28. Janjua A, Smith P, Chu J, et al. The effectiveness of gynaecology teaching associates in teaching pelvic examination to medical students: a randomised controlled trial. Eur J Obstet Gynecol Reprod Biol 2017;210:58–63. 10.1016/j.ejogrb.2016.10.006
- 29. Dilaveri C, Szostek J, Wang A, et al. Simulation training for breast and pelvic physical examination: a systematic review and meta-analysis. BJOG 2013;120:1171–82. 10.1111/1471-0528.12289
- 30. McGaghie WC, Issenberg SB, Cohen MER, et al. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011;86:706. 10.1097/ACM.0b013e318217e119
- 31. Barnsley L, Lyon PM, Ralston SJ, et al. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ 2004;38:358–67. 10.1046/j.1365-2923.2004.01773.x
- 32. Liaw SY, Scherpbier A, Rethans J-J, et al. Assessment for simulation learning outcomes: a comparison of knowledge and self-reported confidence with observed clinical performance. Nurse Educ Today 2012;32:e35–e39. 10.1016/j.nedt.2011.10.006
- 33. Draper H, Ives J, Parle J, et al. Medical education and patients’ responsibilities: back to the future? J Med Ethics 2008;34:116–19. 10.1136/jme.2006.019257
Supplementary Materials
bmjstel-2020-000587supp001.pdf (379.6KB, pdf)

