Abstract
The Directly Observed Procedural/Practical Skill (DOPS) is a relatively new but reliable tool for formative assessment. The lack of awareness regarding DOPS among the Otorhinolaryngologists of India prompted us to conduct this study. The aim of the study was to introduce DOPS in an Oto-rhino-laryngology Department. The objectives of the study were: (1) To prepare lists of Oto-rhino-laryngology procedures for DOPS, (2) To conduct an orientation program on DOPS for the participants, (3) To prepare a structured list of items for the rating scale, (4) To facilitate and conduct DOPS encounters for different Oto-rhino-laryngology procedures. The study was conducted in a tertiary care medical college hospital from April 2018 to August 2018. Thirty-three trainees and 5 trainers participated, and the 421 DOPS encounters involved 41 Oto-rhino-laryngology procedures. The χ2 test, a nonparametric test, was used to check the association of clinical setting and of average time with the distribution of Oto-rhino-laryngology procedures and DOPS encounters. Male trainees (64%) outnumbered female trainees, and most trainees (91%) were aged 22–25 years. Approximately half (49%) of the Oto-rhino-laryngology procedures (20/41) and about nine-tenths (86%) of the DOPS encounters (363/421) were conducted in the OPD. The time taken to complete the procedure and DOPS encounter was 15 min or less for the majority of Oto-rhino-laryngology procedures (38/41, 93%) and DOPS encounters (414/421, 98%). DOPS was introduced as a learning tool in the Oto-rhino-laryngology Department of our medical college. For assessing the “competency level” of trainees in E.N.T. procedures, DOPS is a high-quality instrument as it tests the candidate at the “does” level.
Keywords: Directly Observed Procedural Skills, Workplace-based assessment, Otorhinolaryngology
Introduction
The Directly Observed Procedural/Practical Skill (DOPS) is a relatively new but reliable tool for formative assessment in the field of competency-based medical education [1, 2]. DOPS is a Workplace-Based Assessment (WPBA) tool for assessing the competency of students, in which the trainer directly observes the trainee performing a procedure on a real patient in a real clinical setting, which may be the outpatient department, inpatient ward, emergency room, operation room or intensive care unit [2–4]. The trainer immediately rates the trainee’s performance on a structured checklist and simultaneously provides constructive feedback in a friendly environment [2, 4]. DOPS has been found to be an effective learning tool for developing competency in undergraduate medical students as well as in specialist training [2, 3].
At present, there is no provision for formative assessment during internship and postgraduate (PG) training in India. Because of increased government and public scrutiny, the Medical Council of India (MCI) has decided to move medical education in India from merely completing accredited posts within a set period of time towards a more ‘competency-based curriculum’. The Vision 2015 document of the MCI reaffirms the need for a ‘competency-based curriculum’ for both undergraduate (UG) and postgraduate (PG) medical students so that Indian Medical Graduates (IMG) can carry out medical procedures efficiently and safely [5].
Despite these developments, DOPS remains unheard of in the Oto-rhino-laryngology curriculum in India. This lack of awareness regarding DOPS in the E.N.T. Departments of the medical colleges of India prompted us to conduct this short-term educational research project. The aim of this study was therefore to introduce DOPS as a formative assessment and learning tool for honing the E.N.T. procedural skills of the residents and interns posted in the Department of E.N.T. Head and Neck Surgery. The objectives of this study were: (1) To prepare lists of E.N.T. procedures for DOPS for the interns and post-graduate residents, (2) To conduct an orientation program on DOPS for the participants (trainers and trainees), (3) To prepare a structured list of items for the rating scale, and (4) To facilitate and conduct DOPS encounters for different E.N.T. procedures.
Methods
After obtaining approval from the Institutional Ethical Committee (IEC), this short-term educational research study [part of the Curriculum Innovation Project of MCI’s Advanced Course in Medical Education (ACME)] was conducted over a period of 5 months (1st April 2018 to 31st August 2018) in the E.N.T. Department of a tertiary care medical college hospital. Thirty-three trainees (4 PG residents and 29 interns) and five trainers (3 faculty members and 2 senior residents) gave their informed consent and participated in this study. Participants were informed of the experimental nature of the study and were free to leave it at any time. The trainees were also informed that the marks obtained would not count towards the final computation (summative assessment) of their scores.
The interns who were posted in the Department of E.N.T. between April and August 2018 and were interested in participating were enrolled in the study. The trainers and trainees were sensitized to the methods and principles of DOPS, and a demonstration session was conducted for those enrolled.
As per the E.N.T. curriculum (UG and PG) of the MCI, 41 E.N.T. procedures for DOPS encounters were unanimously decided upon by the faculty of the E.N.T. Department, ranging from frequently performed simple procedures for interns (e.g. examination of the oral cavity, anterior rhinoscopy and otoscopy) to more advanced skills for PG residents (e.g. tracheostomy, caloric test and Dix–Hallpike test). The trainees were observed performing the procedures and the trainers noted the trainees’ areas for improvement. Immediately after the procedure, trainers gave the necessary feedback, including corrective steps, to the trainees. Trainers and trainees were advised to maintain a logbook of the DOPS procedures conducted. Trainees were encouraged to perform the procedures using DOPS whenever they got the opportunity. All the DOPS procedures were done on the job (ad hoc). The trainees were told in advance which E.N.T. procedures would be assessed so that they could prepare accordingly, as the aim of DOPS was learning. All the trainees knew how the DOPS assessments would be conducted, as they had been sensitized before joining the study. No trainee or trainer had previous exposure to DOPS assessment.
A total of 421 DOPS encounters involving 41 E.N.T. procedures were performed on actual patients; simulation technology and standardized patients were not employed. The trainers assessed the trainees for each DOPS encounter using a structured, standardized rating scale (Box 1) that was prepared after reviewing the related literature and thorough discussion among the faculty of the E.N.T. Department. The χ2 test, a nonparametric test, was used to check whether the distribution of E.N.T. procedures and DOPS encounters was associated with clinical setting and with average procedure time.
Box 1.
Structured rating scale form for Direct Observation of Procedural Skills (DOPS)
Date and time:
Trainee’s name:
Trainer’s name:
Patient’s name:
E.N.T. procedure:
Clinical environment: OPD/IPD/Minor OT/Major OT/ICU/Emergency/Other (specify)
Time taken for assessment and feedback:

Areas of assessment
1. Pertinent anatomy: Poor/Satisfactory/Good/Not applicable
2. Indications: Poor/Satisfactory/Good/Not applicable
3. Informed consent: Poor/Satisfactory/Good/Not applicable
4. Pre-procedural preparation: Poor/Satisfactory/Good/Not applicable
5. Analgesia or safe sedation: Poor/Satisfactory/Good/Not applicable
6. Aseptic technique: Poor/Satisfactory/Good/Not applicable
7. Technique of procedure: Poor/Satisfactory/Good/Not applicable
8. Seeks help when necessary: Poor/Satisfactory/Good/Not applicable
9. Post-procedural management: Poor/Satisfactory/Good/Not applicable
10. Communication skills/professionalism: Poor/Satisfactory/Good/Not applicable
11. Overall performance: Poor/Satisfactory/Good

Was verbal feedback given to the trainee? Yes/No
Trainer’s remarks, if any:
Trainer’s signature:
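Purely as an illustration (the study did not describe any electronic data capture, and paper forms may well have been used), the Box 1 items could be stored as a structured record for later tabulation and analysis. The field names and types in the following minimal sketch are our own assumptions, not part of the published instrument.

```python
# Illustrative sketch only: one possible structured record mirroring the
# Box 1 rating form, e.g. for transcribing completed forms into a database.
from dataclasses import dataclass
from enum import Enum
from typing import Dict, Optional


class Rating(Enum):
    POOR = "Poor"
    SATISFACTORY = "Satisfactory"
    GOOD = "Good"
    NOT_APPLICABLE = "Not applicable"


# The ten itemized areas of assessment from Box 1 (item 11, overall
# performance, is kept as a separate field because it has no "Not applicable").
ASSESSMENT_ITEMS = [
    "Pertinent anatomy", "Indications", "Informed consent",
    "Pre-procedural preparation", "Analgesia or safe sedation",
    "Aseptic technique", "Technique of procedure",
    "Seeks help when necessary", "Post-procedural management",
    "Communication skills/professionalism",
]


@dataclass
class DopsEncounter:
    date_time: str
    trainee_name: str
    trainer_name: str
    patient_name: str
    ent_procedure: str
    clinical_environment: str            # OPD / IPD / Minor OT / Major OT / ICU / Emergency / Other
    minutes_for_assessment_and_feedback: float
    item_ratings: Dict[str, Rating]      # keyed by the ASSESSMENT_ITEMS above
    overall_performance: Rating          # Poor / Satisfactory / Good only
    verbal_feedback_given: bool
    trainer_remarks: Optional[str] = None
```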
Results
Thirty-three trainees (4 PG E.N.T. residents and 29 interns posted in the E.N.T. Department) and five trainers (1 professor, 1 associate professor, 1 assistant professor and 2 senior residents of the E.N.T. Department) participated in this study. Four hundred and twenty-one DOPS encounters covering 41 E.N.T. procedures (Table 1) were completed between 1st April 2018 and 31st August 2018.
Table 1.
E.N.T. procedures and DOPS encounters, average time, clinical settings and numbers done by trainees
| S. no. | E.N.T. procedures | Average time (min) | Clinical settings | R3 | R2 | R1 | I | Total DOPS encounters |
|---|---|---|---|---|---|---|---|---|
| 1 | Pure tone audiometry | 14.57 | Audio room | 1 | 1 | 1 | 3 | |
| 2 | Impedance audiometry | 10.02 | Audio room | 1 | 1 | 1 | 3 | |
| 3 | Wound suturing | 14.55 | Casualty | 2 | 1 | 1 | 4 | |
| 4 | Tracheostomy | 27.3 | ICU | 2 | 2 | |||
| 5 | Anterior nasal packing | 4.58 | Indoor | 2 | 1 | 1 | 4 | |
| 6 | Nasogastric feeding | 4.55 | Indoor | 2 | 1 | 1 | 4 | |
| 7 | Direct laryngoscopy | 10.26 | Major OT | 1 | 1 | |||
| 8 | Esophagoscopy | 10.1 | Major OT | 1 | 1 | |||
| 9 | Cricopharynx FB removal | 9.59 | Major OT | 1 | 1 | |||
| 10 | Oropharynx FB removal | 9.56 | Major OT | 1 | 1 | 2 | ||
| 11 | Laryngopharynx FB removal | 9.48 | Major OT | 1 | 1 | 1 | 3 | |
| 12 | Tonsillectomy | 31.2 | Major OT | 1 | 1 | |||
| 13 | Bithermal caloric test | 40.32 | Minor OT | 2 | 1 | 1 | 4 | |
| 14 | Nasal endoscopy | 5.05 | Minor OT | 2 | 1 | 1 | 4 | |
| 15 | Microear examination | 3.1 | Minor OT | 2 | 1 | 3 | ||
| 16 | Epley’s maneuver | 7.3 | Minor OT | 2 | 1 | 1 | 4 | |
| 17 | Semont maneuver | 4.4 | Minor OT | 1 | 1 | 1 | 3 | |
| 18 | Stroboscopy | 7.23 | Minor OT | 1 | 1 | 2 | ||
| 19 | Biopsy | 5 | Minor OT | 2 | 1 | 3 | ||
| 20 | Aspiration of cystic swellings | 4.53 | Minor OT | 2 | 1 | 3 | ||
| 21 | Abscess drainage | 10.5 | Minor OT | 2 | 1 | 3 | ||
| 22 | Anterior and posterior rhinoscopy | 4.58 | OPD | 2 | 1 | 1 | 29 | 33 |
| 23 | Oral cavity examination | 2.5 | OPD | 2 | 1 | 1 | 29 | 33 |
| 24 | Indirect mirror laryngoscopy | 3.2 | OPD | 2 | 1 | 1 | 29 | 33 |
| 25 | Otoscopy | 3 | OPD | 2 | 1 | 1 | 29 | 33 |
| 26 | Otoendoscopy | 5 | OPD | 2 | 1 | 1 | 4 | |
| 27 | Tuning fork hearing tests | 5.5 | OPD | 1 | 1 | 1 | 29 | 32 |
| 28 | Dix–Hallpike test | 5.42 | OPD | 2 | 1 | 1 | 4 | |
| 29 | Valsalva maneuver | 2.4 | OPD | 1 | 1 | 1 | 29 | 32 |
| 30 | Catheterization of eustachian tube | 9 | OPD | 1 | 1 | 1 | 3 | |
| 31 | Spatula test | 2.1 | OPD | 2 | 1 | 1 | 29 | 33 |
| 32 | Transillumination test for PNS | 4.55 | OPD | 1 | 1 | 1 | 3 | |
| 33 | Neck examination | 5.36 | OPD | 2 | 1 | 1 | 29 | 33 |
| 34 | Examination of cranial nerves | 9.59 | OPD | 1 | 1 | 1 | 29 | 32 |
| 35 | Examination of cervical lymph nodes | 3.59 | OPD | 2 | 1 | 1 | 29 | 33 |
| 36 | 70-degree laryngoscopy | 4.57 | OPD | 2 | 1 | 1 | 4 | |
| 37 | FNAC/Core biopsy | 4.59 | OPD | 1 | 1 | 2 | ||
| 38 | Ear packing | 4.15 | OPD | 2 | 1 | 1 | 4 | |
| 39 | Ear syringing | 4.1 | OPD | 2 | 1 | 1 | 4 | |
| 40 | Ear FB removal | 4.5 | OPD | 2 | 1 | 1 | 4 | |
| 41 | Nasal FB removal | 5.05 | OPD | 2 | 1 | 1 | 4 | |
| – | Total | 65 | 37 | 29 | 290 | 421 |
R1, R2 and R3: 1st, 2nd and 3rd year residents, respectively; I: interns; FB: foreign body; ICU: intensive care unit; OT: operation theatre; OPD: out-patient department
All the trainers were male. Male trainees (64%) outnumbered female trainees, with a male:female ratio of 7:4 (Table 2). Most trainees (91%, 30/33) were in the 22–25 year age group, and most (91%, 30/33) were graduates of CUSMC, where the study was conducted. Approximately nine-tenths (88%) of the trainees (29/33), all of whom were interns, performed a minimum of 10 E.N.T. procedures, while only one trainee, a 3rd year post-graduate resident, performed all 41 E.N.T. procedures.
Table 2.
Demographics of trainees
| | Trainee n/33 (%) |
|---|---|
| Gender | |
| Male | 21 (63.63%) |
| Female | 12 (36.36%) |
| Age | |
| 22–25 years | 30 (90.90%) |
| 26–29 years | 01 (3.03%) |
| 30–33 years | 02 (6.06%) |
| College of graduation | |
| CUSMC | 30 (90.90%) |
| Other | 03 (09.09%) |
| Grade of training | |
| Interns | 29 (87.87%) |
| 1st year PG (R1) | 01 (3.03%) |
| 2nd year PG (R2) | 01 (3.03%) |
| 3rd year PG (R3) | 02 (6.06%) |
| Number of DOPS encounters attended | |
| 1–10 | 29 (87.87%) |
| 11–20 | 00 (0.0%) |
| 21–30 | 02 (6.06%) |
| 31–40 | 01 (3.03%) |
| 41 | 01 (3.03%) |
PG, Post-graduate; R1, R2 and R3: 1st, 2nd, and 3rd year residents respectively
Approximately half (49%) of the E.N.T. procedures (20/41) and about nine-tenths (86%) of the DOPS encounters (363/421) were conducted in the OPD (Table 3). The χ2 test, a nonparametric test, was used to check the association between clinical setting and the distribution of E.N.T. procedures and DOPS encounters; χ2 = 41.68 with a probability value of 0.00001 indicated a highly significant association between these variables. The mean time for a DOPS encounter (observation and feedback) was 5.43 min (SD 5.10; range 1.1–42.3 min). The time taken to complete a DOPS encounter was 15 min or less in the majority (98%) of encounters (414/421) (Table 3). For the association between average time and the distribution of E.N.T. procedures and DOPS encounters, χ2 = 10.52 with a probability value of 0.06 indicated only a weak association that did not reach conventional statistical significance (an illustrative computation of both tests is sketched after Table 3).
Table 3.
Procedures and DOPS encounters—clinical settings and average time
| | E.N.T. procedures | DOPS encounters |
|---|---|---|
| | n/41 (%) | n/421 (%) |
| Clinical settings | ||
| OPD | 20 (48.78%) | 363 (86.22%) |
| Minor OT | 09 (21.95%) | 29 (6.88%) |
| Major OT | 06 (14.63%) | 9 (2.13%) |
| Audiometry room | 02 (4.88%) | 6 (1.42%) |
| Indoor | 02 (4.88%) | 8 (1.90%) |
| Casualty | 01 (2.44%) | 4 (0.95%) |
| ICU | 01 (2.44%) | 2 (0.47%) |
| Time duration | ||
| 00.01–05 min | 20 (48.78%) | 275 (65.32%) |
| 05.01–10 min | 12 (29.26%) | 124 (29.45%) |
| 10.01–15 min | 06 (14.63%) | 15 (3.56%) |
| 15.01–20 min | – | – |
| 20.01–25 min | – | – |
| 25.01–30 min | 01 (2.44%) | 2 (0.47%) |
| 30.01–35 min | 01 (2.44%) | 1 (0.23%) |
| 35.01–40 min | – | – |
| 40.01–45 min | 01 (2.44%) | 4 (0.95%) |
| Mean | 5.43 | |
| Standard deviation | 5.10 | |
| Minimum time | 1.1 | |
| Maximum time | 42.3 | |
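As an illustration of how the association tests reported above can be re-run, the following is a minimal Python sketch using the counts from Table 3. It assumes a 7×2 contingency table for the clinical settings and a 6×2 table for the non-empty time bands; the original analysis does not state how the time categories were grouped or whether any correction was applied, so the computed statistics (particularly for the time bands) may not match the reported values exactly.

```python
# Sketch of the chi-square association tests on the Table 3 counts.
# Assumptions: row grouping as shown below; no continuity correction.
from scipy.stats import chi2_contingency

# Rows: OPD, Minor OT, Major OT, Audiometry room, Indoor, Casualty, ICU
# Columns: E.N.T. procedures (of 41), DOPS encounters (of 421)
settings = [
    [20, 363],
    [9, 29],
    [6, 9],
    [2, 6],
    [2, 8],
    [1, 4],
    [1, 2],
]

# Rows: non-empty time bands (0-5, 5-10, 10-15, 25-30, 30-35, 40-45 min)
time_bands = [
    [20, 275],
    [12, 124],
    [6, 15],
    [1, 2],
    [1, 1],
    [1, 4],
]

for label, table in [("clinical setting", settings), ("time band", time_bands)]:
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"{label}: chi2 = {chi2:.2f}, df = {dof}, p = {p:.5f}")
```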
Discussion
The present study was conducted as part of the Curriculum Innovation Project of the MCI’s Advanced Course in Medical Education (ACME), and the faculty and participants of the ACME reviewed its proposal and results. The study included large numbers of DOPS encounters, E.N.T. procedures and trainees: 33 trainees participated in 421 DOPS encounters involving 41 E.N.T. procedures, whereas the study of Kara et al. [6] had 5 trainees and 55 DOPS encounters involving 18 E.N.T. procedures.
There is a need for some form of procedural assessment during medical training, as traditional supervisor evaluations have been reported to be unreliable [7]. Logbooks have their own limitations; in comparison with a logbook-based system, the Royal College of Physicians considers DOPS a highly valid and reliable instrument [8].
DOPS-based studies have reported its feasibility in many clinical departments, such as Anesthesia, Ophthalmology, Obstetrics and Gynecology, Surgery and Pediatrics [1, 9–12]. However, we could find only two DOPS-based studies in Oto-rhino-laryngology [6, 13], and no such study has been reported from India. The aim of introducing DOPS in the E.N.T. department was to generate awareness of competency-based medical education and assessment among the E.N.T. surgeons of India. In the Medical Education Units (MEU) of medical colleges in India there is an emphasis on a competency-based medical curriculum, and the MCI has decided to train medical teachers in workplace-based formative assessment [5, 12].
As observed in other reports [3, 12, 14], our faculty and students also had initial problems in scheduling the DOPS encounters. Feasibility can be influenced by the availability of the patient and the faculty for a particular procedure at short notice amid a busy OPD and operation schedule; enthusiasm, better organization and liaison helped overcome these problems. As suggested by Morris et al. [3], to make the DOPS project feasible in our institute, the schedule was designed around the convenience of the trainers so that DOPS could be integrated into their normal routine work. In some studies, however, certain DOPS had to be conducted outside normal working hours because teachers or trainees were otherwise unavailable [3]. Both the trainee and the trainer should set aside sufficient time for each DOPS encounter. The freedom to decide when, and by whom, trainees are assessed has been criticised [15]. In the present study, faculty members were the trainers for PG residents while the senior residents assessed the interns. Davies et al. [15] reported easy access to teachers; in their study, trainees were able to achieve the six compulsory DOPS during their foundation year. In our study, the minimum number of DOPS encounters presented by a trainee was 10; perhaps, DOPS being a new introduction, the enthusiasm of the participants was high.
The time required for DOPS varies with the nature of the E.N.T. procedure. DOPS has been reported to take around 10–20 min [6]. In our study too, the time taken to complete the DOPS was 15 min or less in the majority (98%) of the DOPS encounters (414/421) (Table 3). One DOPS encounter (bithermal caloric test) took 42.3 min. More than half (65%) of the DOPS encounters (275/421) finished within 5 min, as they were simple and short and did not require much feedback.
Feedback from the trainer builds a strong rapport between trainer and trainee and also promotes student-directed learning. As studies [4, 12] have observed that the educational value of DOPS depends on immediate feedback covering the trainee’s strengths and weaknesses, our trainers provided immediate feedback and an action plan that met the trainees’ learning needs. Immediate feedback after workplace-based assessments was also found useful in a cohort study of core medical trainees [14]. Accordingly, as part of DOPS, our trainers pointed out the trainees’ areas of good and poor practice and provided a plan for their future learning. All 421 DOPS encounters in the present study were conducted prospectively, as Thompson et al. [16] reported poor reliability when assessments were done retrospectively.
The number of procedures that need to be done before a trainee is deemed competent would vary with the trainee and the type of procedure, and the number of procedures alone cannot guarantee that they were performed competently. A formative assessment should facilitate improvement in the procedural skills of the students, and DOPS does so by providing immediate feedback. Objectivity in rating competency decreases inter-observer differences in scoring the students [6].
Whether DOPS can predict future performance is unknown, and further studies on the reliability of this assessment and learning tool are warranted [8]. Trainees’ behaviour might be influenced by anxiety once they know that they are being observed [11]. The review by Wilkinson et al. [8] could not find any validated DOPS methods in the literature. Although evidence on its quality is lacking, DOPS does have good face validity because it involves direct observation of procedural skills on real patients [11].
DOPS was introduced as a learning tool in the Oto-rhino-laryngology Department of our medical college as it is cost-effective and does not need a special set-up. The inbuilt feedback is the strongest feature of DOPS. DOPS can be considered a feasible and accurate method of Workplace-Based Assessment (WPBA), as it assesses the candidate at the highest level of Miller’s pyramid, i.e. “does”.
Acknowledgements
I would like to thank the following for their support: Dr. Dimple Mapara Mehta (Dean); Dr. Sanjay Mehta (Academic Dean); Dr. Pankaj Shah (Prof and HOD, E.N.T.) and Dr. Bhargaw Jadav, Dr. Alaap Shah and Dr. Bhavik Gosai of the E.N.T. Department; and Dr. Krupal Joshi (PSM Department) of CU Shah Medical College Hospital, Surendranagar, Gujarat.
Compliance with Ethical Standards
Conflict of interest
The author declares that there is no conflict of interest.
Human and Animal Rights
Research does not involve human participants and/or animals
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Setna Z, Jha V, Boursicot KAM, Roberts TE. Evaluating the utility of work-place assessment tools for specialty training. Best Pract Res Clin Obstet Gynaecol. 2010;24:767–782. doi: 10.1016/j.bpobgyn.2010.04.003.
- 2. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29:855–871. doi: 10.1080/01421590701775453.
- 3. Morris A, Hewitt J, Roberts C. Practical experience of using directly observed procedures, mini clinical examinations, and peer observation in pre-registration house officer (FY1) trainees. Postgrad Med J. 2006;82:285–288. doi: 10.1136/pgmj.2005.040477.
- 4. Boursicort K, Etheridge L, Setna Z, Sturrock A, Ker J, Smee S, Sambandam S. Performance in assessment: consensus statement and recommendations from the Ottawa conference. Med Teach. 2011;33:370–383. doi: 10.3109/0142159X.2011.565831.
- 5. Reforms in Under-graduate and Post-graduate Medical Education (2011) Vision 2015, Medical Council of India. https://www.tnmgrmu.ac.in/images/medical-council-of-india/MCI_book.pdf. Accessed 19 Sept 2018.
- 6. Kara CO, Mengi E, Tumkaya F, Topuz B, Ardıc FN. Direct observation of procedural skills in otorhinolaryngology training. Turk Arch Otorhinolaryngol. 2018;56:7–14. doi: 10.5152/tao.2018.3065.
- 7. Turnbull J, Gray J, MacFadyen J. Improving in-training evaluation programs. J Gen Intern Med. 1998;13:317–323. doi: 10.1046/j.1525-1497.1998.00097.x.
- 8. Wilkinson J, Benjamin A, Wade W. Assessing the performance of doctors in training. BMJ. 2003;327:s91–s92. doi: 10.1136/bmj.327.7416.s91.
- 9. Phillips AW, Madhavan A, Bookless LR, Macafee DA. Surgical trainers’ experience and perspectives on workplace-based assessments. J Surg Educ. 2015;72:979–984. doi: 10.1016/j.jsurg.2015.03.015.
- 10. Kumar N, Singh NK, Rudra S, Pathak S. Effect of formative evaluation using direct observation of procedural skills in assessment of postgraduate students of obstetrics and gynecology: prospective study. J Adv Med Educ Prof. 2017;5:1–5.
- 11. Hays RB, Davies HA, Beard JD, Caldon LJ, Farmer EA, Finucane PM, et al. Selecting performance assessment methods for experienced physicians. Med Educ. 2002;36:910–917. doi: 10.1046/j.1365-2923.2002.01307.x.
- 12. Modi JN, Gupta P, Singh T. Competency-based medical education, entrustment and assessment. Indian Pediatr. 2015;52:413–420. doi: 10.1007/s13312-015-0647-5.
- 13. Awad Z, Hayden L, Muthuswamy K, Ziprin P, Darzi A, Tolley NS. Does direct observation of procedural skills reflect trainee’s progress in otolaryngology? Clin Otolaryngol. 2014;39:169–173. doi: 10.1111/coa.12251.
- 14. Johnson GJ, Barrett J, Jones M, Wade W. Feedback from educational supervisors and trainees on the implementation of curricula and the assessment system for core medical training. Clin Med. 2008;8:484–489. doi: 10.7861/clinmedicine.8-5-484.
- 15. Davies H, Archer J, Southgate L, Norcini J. Initial evaluation of the first year of the Foundation Assessment Programme. Med Educ. 2009;43:74–81. doi: 10.1111/j.1365-2923.2008.03249.x.
- 16. Thompson WG, Lipkin M, et al. Evaluating evaluation: assessment of the American Board of Internal Medicine resident evaluation form. J Gen Intern Med. 1990;5:214–217. doi: 10.1007/BF02600537.
