Abstract
Colonoscopy is a complex task that requires the interplay of motor and cognitive skill sets. Traditional teaching of colonoscopy involves observation in an apprenticeship model. Individual trainees vary in their rate of skill acquisition, and this trial-and-error method often results in frustration and anxiety for both the educator and the learner. Currently, there are no guidelines to determine the competence or proficiency of an individual for colonoscopy. Furthermore, there is a paucity of information regarding formal training curricula for colonoscopy skills acquisition. The present study investigated a formal and validated educational framework for colonoscopy teaching and compared it with the traditional apprenticeship model in first-year trainees.
Keywords: Colonoscopy, Education, Evaluation, Skills, Teaching
COLONOSCOPY EDUCATION
Attaining proficiency in colonoscopy is a cornerstone of gastroenterology (GI) training and of training for select general surgeons who plan to include colonoscopy in their practice. These skills are most often acquired during a fixed period of residency, and competence is typically determined at the conclusion of training and after completion of a specified number of procedures. In practice, by contrast, competence is defined by efficiency, accuracy and patient comfort.
Colonoscopy instruction has largely followed the apprenticeship model of ‘see one, do one, teach one’. The obvious disadvantages include time management and potential trauma to the patients involved. Furthermore, the apprenticeship model has promoted a trial-and-error culture of skills acquisition with little time for self-reflection or provision of formative feedback. Consequently, the apprenticeship model has been associated with significant frustration for both trainees and teachers. Colonoscopy instructors are now recognizing the limitations of an apprenticeship-based model, and are searching for novel methods to facilitate trainees’ acquisition of endoscopic skills.
There is a paucity of information regarding colonoscopy education and training in the medical literature. From a Canadian perspective, two important, relatively recent articles have brought attention to this topic. Romagnuolo et al (1) published Canadian credentialing guidelines for colonoscopy in 2008, and highlighted the important interplay between cognitive and motor skills in performing colonoscopy. While colonoscopy instruction was not directly addressed, any credentialing guidelines would be built on the premise of appropriate training of future endoscopists. Our group recently proposed a seven-step framework for teaching colonoscopy based on previously validated use in other procedural skills (2). The inadequacies of the traditional apprenticeship model outlined above, along with these two articles, create a platform for further discussion of this under-represented issue.
COLONOSCOPY EVALUATION
Colonoscopy skills evaluation has not been formalized in Canada. Most training centres certify proficiency in endoscopic skills based on the number of procedures performed and the completion of a specified time period in either a GI fellowship or surgical training. Although some training centres have used formalized evaluation criteria, this method of evaluation has not been standardized. One such formal evaluation tool is the direct observation of procedural skills (DOPS).
DOPS
The DOPS tool is used to assess performance in four major domains that have previously been shown to have validity in colonoscopy skills acquisition (3,4):

- Assessment, consent and communication (ACC): obtaining informed consent, demonstrating respect for patients’ views and modesty during the procedure, and communicating with the patient.
- Safety and sedation (SS): use of analgesia and sedation, appropriate monitoring of vital signs and oxygenation, and communication with the nursing staff.
- Endoscopic skills during insertion and withdrawal (ENDO): the domain in which the bulk of technical skills are assessed, including checking the endoscope for function before the procedure, maintenance of the luminal view, torque steering, lumen distention, appropriate suction, recognizing and logically resolving loop formation, using position change and abdominal pressure, and completing the procedure in a reasonable time.
- Diagnostic and therapeutic ability (DIAG): adequate mucosal visualization, recognition of cecal landmarks, accurate identification and management of pathology, therapeutic interventions and management of complications.

The DOPS assessment requires the evaluator to score performance in each of these areas on a scale of 1 to 4. A score of 1 indicates that accepted standards were not met and frequent uncorrected errors occurred; a score of 2 indicates that some standards were not met, with some aspects for improvement noted and some uncorrected errors; a score of 3 indicates a competent and safe procedure without any uncorrected errors; and a score of 4 indicates a highly skilled performance. The maximum raw scores obtainable are 12 for each of the ACC and SS domains, 36 for the ENDO domain and 20 for the DIAG domain, giving a maximum raw score of 80 for an excellent performance. In addition, the DOPS tool provides for documentation of case difficulty rated on a scale of 1 to 5, in which 1 represents a very easy case and 5 represents an extremely challenging case.
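To make the scoring arithmetic concrete, the following is a minimal sketch of how raw DOPS scores could be converted into domain and overall percentages. It assumes each domain comprises several items scored 1 to 4 (three items each for ACC and SS, nine for ENDO and five for DIAG, consistent with the stated domain maxima); the example item values are hypothetical and are not study data.

```python
# Minimal sketch: aggregating hypothetical per-item DOPS scores (1 to 4 each)
# into domain and overall percentages. Domain maxima follow the text:
# ACC 12, SS 12, ENDO 36, DIAG 20 (overall maximum 80).

DOMAIN_MAX = {"ACC": 12, "SS": 12, "ENDO": 36, "DIAG": 20}

def dops_percentages(item_scores):
    """Convert per-domain item scores into percentages per domain and overall."""
    pct = {}
    for domain, scores in item_scores.items():
        pct[domain] = round(100.0 * sum(scores) / DOMAIN_MAX[domain], 1)
    overall_raw = sum(sum(scores) for scores in item_scores.values())
    pct["OVERALL"] = round(100.0 * overall_raw / sum(DOMAIN_MAX.values()), 1)
    return pct

# Hypothetical single assessment (not study data)
example = {
    "ACC": [4, 3, 3],
    "SS": [3, 3, 3],
    "ENDO": [3, 3, 2, 3, 3, 3, 2, 3, 3],
    "DIAG": [3, 4, 3, 3, 3],
}
print(dops_percentages(example))
# {'ACC': 83.3, 'SS': 75.0, 'ENDO': 69.4, 'DIAG': 80.0, 'OVERALL': 75.0}
```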
DOPS validity and reliability:
The Joint Advisory Group on Gastrointestinal Endoscopy in the United Kingdom has validated the DOPS tool for colonoscopy education (3,4). The DOPS form used in the assessment for accreditation was developed by the Joint Advisory Group after extensive consultation with the multiprofessional endoscopy community, and is used as an assessment tool for both formative and summative purposes. The tool is gaining acceptance among endoscopy training centres nationally and worldwide, and serves multiple purposes, including formative assessment to aid endoscopy trainees in skills acquisition, summative assessment for accreditation, and maintenance of skills following the training period.
Because the DOPS tool has been validated in the assessment of colonoscopy skills, we aimed to assess its performance and reliability in colonoscopy evaluation on live patients among Canadian endoscopy trainees. We also investigated additional factors that could affect residents’ performance such as case difficulty and method of skills education, ie, the seven-step feedback model versus the traditional apprenticeship model.
METHODS
In the present pilot study, four first-year gastroenterology fellows (three men and one woman) from the University of Calgary (Calgary, Alberta) were exposed to an endoscopy simulator (GI Mentor II, Simbionix, USA) and taught colonoscopy technique using a formal seven-step approach that integrated all aspects of the procedure, from consent and sedation to manual technique. These seven steps, as outlined by Raman and Donnon (2), are as follows:
Planning/needs assessment. During this phase, the specific objectives to be achieved during each scope encounter were outlined.
Expert demonstration. The preceptor demonstrated one procedure in its entirety while concurrently verbalizing manoeuvres and the rationale for special techniques; the trainee observed the procedure and had an opportunity to ask questions.
Procedure performance by the learner under direct supervision. During this phase, the learner either performed the procedure in a piecemeal fashion based on predetermined, negotiable goals or attempted the full colonoscopy with ongoing iterative direction from the preceptor.
Self-reflection. Following completion of the procedure, the trainee reflected on the procedure and self-assessed aspects of his/her performance.
Descriptive feedback. Following the trainee’s self-assessment, the preceptor provided specific descriptive feedback based on the initially negotiated objectives.
Reflection on feedback. Following feedback from the preceptor, the trainee further reflected on his/her performance and identified areas for improvement.
Ongoing practice. The learner undertook subsequent practice to further proficiency.
These steps are flexible and can be modified to accommodate learning opportunities for individual trainees.
In comparison, three third-year general surgery residents (two men and one woman) selected through a convenience sample were exposed to the conventional method of apprenticeship-based teaching.
The raw scores obtained in each DOPS domain were converted to percentages. Test-retest reliability was assessed by calculating Spearman’s rank correlation coefficient (r) for each pair of assessments. Correlation coefficients are presented according to case difficulty and training. Data were analyzed using Stata version 10.1 (Stata Corporation, USA). Statistical differences were calculated using nonparametric tests, namely the Mann-Whitney and Kruskal-Wallis tests. Data are reported as median and interquartile range (IQR).
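As an illustration of these analyses, the sketch below computes a Spearman rank correlation between a pair of assessments and a Mann-Whitney U comparison between groups using SciPy rather than Stata; all score vectors are hypothetical placeholders, not study data.

```python
# Illustrative sketch of the reported analyses using SciPy (the study itself
# used Stata 10.1). All values below are hypothetical placeholders.
from scipy.stats import mannwhitneyu, spearmanr

# Test-retest reliability: Spearman's rank correlation coefficient (r) between
# two DOPS assessments of the same trainee (domain percentage scores).
assessment_1 = [84, 77, 74, 83]   # hypothetical ACC, SS, ENDO, DIAG percentages
assessment_2 = [81, 83, 72, 76]
r, p = spearmanr(assessment_1, assessment_2)
print(f"Spearman r = {r:.2f} (P = {p:.3f})")

# Between-group comparison: Mann-Whitney U test on overall DOPS scores.
gi_fellows = [79, 76, 84, 80]       # hypothetical overall percentages
surgery_residents = [77, 62, 86]    # hypothetical overall percentages
u, p = mannwhitneyu(gi_fellows, surgery_residents)
print(f"Mann-Whitney U = {u:.1f} (P = {p:.3f})")
```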
RESULTS
Before the present study, the GI fellows had been involved in a median of 30 (IQR 25 to 40) colonoscopies compared with five (IQR 0 to 10) for the general surgery residents (P=0.06).
Using the DOPS assessment tool for colonoscopy performance on live patients, the GI fellows scored 84% (range 81% to 86%) versus 81% (75% to 100%) for the surgery group in the ACC domain; 77% (75% to 81%) versus 83% (79% to 88%) in the SS domain; 74% (69% to 81%) versus 72% (67% to 79%) in the ENDO domain; 83% (78% to 91%) versus 76% (20% to 83%) in the DIAG domain; and 79% (76% to 84%) versus 77% (62% to 86%), respectively, for the overall DOPS score. These findings suggest that performance after training with the seven-step feedback model is comparable with that after the apprenticeship model.
Regarding the test-retest reliability of the DOPS assessment in evaluating both the apprenticeship model and the seven-step feedback model, r values were calculated according to the type of training and case difficulty (Table 1). In the GI fellows group, a wide range of correlations (r=0.05 to r=0.98) was noted, and correlations were stronger in medium-difficulty (3/5) cases than in difficult (4/5) cases. GI residents 3 and 4 each performed colonoscopies in cases of a similar level of difficulty. A similarly wide range of correlations (r=0 to r=0.95) was observed in the surgery residents’ group (Table 1).
TABLE 1.
Direct observation of procedural skills test-retest reliability according to training and case difficulty
| | Test-retest reliability, r | Case difficulty, mean |
|---|---|---|
| Gastroenterology residents | | |
| Resident 1 | | |
| 2 assessments | 0.80 | 3 |
| 2 assessments | 0.42 | 4 |
| Resident 2 | | |
| 3 assessments | 0.05, 0.41, 0.87 | 3 |
| 3 assessments | 0.16, 0.54, 0.58 | 4 |
| Resident 3 | | |
| 3 assessments | 0.21, 0.44, 0.64 | 4 |
| Resident 4 | | |
| 4 assessments | 0.67–0.98 | 3 |
| Surgery residents | | |
| Resident 5 | | |
| 2 assessments | 0.95 | 5 |
| Resident 6 | | |
| 4 assessments | 0–0.95 | 3 |
| 2 assessments | 0.56 | 5 |
DISCUSSION
To our knowledge, the present article describes the first formal study of the evaluation of colonoscopy skills among Canadian endoscopy trainees. Historically, assessment of colonoscopy performance among trainees has been based on the number of procedures performed, time spent in skills evaluation and cecal intubation rate. These criteria have been arbitrarily accepted as appropriate evaluation end points for credentialing purposes. However, in the early formative period of skills acquisition, these end points may not be appropriate and may lack content validity as assessment criteria. In our study, we investigated the reliability of the DOPS tool for colonoscopy skills evaluation among trainees who acquired colonoscopy skills using a seven-step feedback model in conjunction with a colonoscopy simulator, and among trainees who learned colonoscopy through the apprenticeship model. Based on the DOPS tool scores, we found that first-year GI fellows performed well in colonoscopy procedures after one month of intervention with the seven-step teaching approach and use of an endoscopy simulator.
However, the DOPS tool, which has both face and content validity, did not appear to be a reliable tool for assessing performance in our study. Although a single expert colonoscopy assessor evaluated performance among the trainees, a wide range of reliability, as assessed by test-retest correlation, was seen. This wide range in reliability indicates that skills among trainees were not consistent and were subject to variability. Interestingly, reliability of the DOPS tool was poor both among trainees learning colonoscopy through the seven-step feedback model and among trainees using the apprenticeship method, suggesting that neither method of colonoscopy training provided trainees with consistent and reliable skills. The seven-step feedback method of teaching colonoscopy is relatively new in the endoscopy world; consequently, few endoscopists may yet have the skills to teach colonoscopy using this method, which makes the finding of poor reliability in this group, compared with trainees taught using the apprenticeship model, less surprising. This wide range of reliability is a warning sign, and suggests to colonoscopy education experts that greater attention should be focused on formal skills acquisition and the need for reliable, periodic assessment.
The importance of proficient skills acquisition in colonoscopy is underscored by several factors. Colonoscopies performed by trainees are believed to carry a small yet appreciable increase in procedural complications (5). Similarly, training is often associated with increased patient discomfort, the need for more sedation and longer procedure times. Based on these premises, shortening the learning curve in the early phases of training may reduce these unwanted outcomes.
Formal educational curricula are important in colonoscopy instruction. A shift from the traditional apprenticeship model to a formal curriculum for skills acquisition and proficiency may result in improved trainee performance, reduced patient discomfort, and less frustration for both students and teachers.
However, it is important for instructors who teach colonoscopy to be consistently educated in how to best impart colonoscopy skills to novice trainees. Therefore, initiatives such as the Canadian Association of Gastroenterology ‘Train the Trainers’ programs are critical in helping colonoscopy experts achieve competence in teaching colonoscopy skills because technical expertise is not synonymous with teaching or ‘coaching’ expertise.
Strengths and limitations
The present investigation was the first Canadian study to assess the reliability of both a novel colonoscopy educational curriculum and the standard apprenticeship model. Feedback during colonoscopy skills acquisition has the potential to improve colonoscopy skill beyond the capability of simulator-based training alone, and allows for more specific objectives and goal setting based on the needs of the trainee. However, any colonoscopy curriculum must be delivered by instructors trained in these methods. Our study has several limitations, including the fact that colonoscopy instructors were not formally taught how to deliver this particular curriculum. Because of the small sample size, we may have failed to observe significant differences between the apprenticeship model and the colonoscopy educational curriculum; however, the small sample would not account for the wide variation in observed DOPS scores. Heterogeneity of the trainee populations may limit the applicability of these results. Finally, inconsistent adherence by trainees and staff to the formal curriculum may have limited our ability to interpret the magnitude of this training method's effect.
FUTURE RESEARCH DIRECTIVES
The present pilot study creates a platform for several future research avenues, such as assessing the quantitative impact of a structured feedback curriculum by establishing baseline performance, instituting the appropriate curriculum and assessing postintervention performance in a homogeneous cohort. Studies comparing colonoscopy training and performance across the country are also needed, as are larger studies adjusting for potential centre-, trainee- and patient-related factors to improve trainee assessment tools.
CONCLUSIONS
Competent performance of colonoscopy is a mandatory requirement for GI trainees before certification. Competency should be defined not by the completion of a fixed training period, but rather by the use of objective measures of performance.
REFERENCES
1. Romagnuolo J, Enns R, Ponich T, Springer J, Armstrong D, Barkun AN. Canadian credentialing guidelines for colonoscopy. Can J Gastroenterol 2008;22:17-22. doi:10.1155/2008/837347.
2. Raman M, Donnon T. Procedural skills education – colonoscopy as a model. Can J Gastroenterol 2008;22:767-70. doi:10.1155/2008/386851.
3. Barton JR. Validity and reliability of the Joint Advisory Group/Bowel Cancer Screening Programme accreditation assessment for colonoscopy. Gut 2008;57(Suppl 1):A2.
4. Barton R. Accrediting competence in colonoscopy: Validity and reliability of the UK Joint Advisory Group/NHS Bowel Cancer Screening Programme Accreditation Assessment. Gut 2008;57(Suppl 1):A1-A72.
5. Arora G, Mannalithara A, Singh G, et al. Risk of perforation from a colonoscopy in adults: A large population-based study. Gastrointest Endosc 2009;69(3 Suppl):654-64. doi:10.1016/j.gie.2008.09.008.
