Abstract
The SingHealth Pathology Residency Program (SHPRP) is a 5-year postgraduate training program in Singapore. We face the problem of resident attrition, which has a significant impact on the individual, the program and healthcare providers. Our residents are regularly evaluated, using in-house evaluations as well as assessments required under our partnership with the Accreditation Council for Graduate Medical Education International (ACGME-I). We hence sought to determine if these assessments were able to distinguish residents who would attrite from residents who would graduate successfully. A retrospective analysis of existing residency assessments was performed for all residents who had separated from SHPRP, and these were compared with residents currently in senior residency or who had graduated from the program. Statistical analysis was performed on the quantitative assessment methods: the Resident In-Service Examination (RISE), 360-degree feedback, faculty assessment, Milestones and our own annual departmental mock examination. Word frequency analysis of narrative feedback from faculty assessments was used to generate themes.
Since 2011, 10 out of 34 residents have separated from the program. RISE, Milestone data and the departmental mock examination showed statistical significance in discriminating residents at risk of attrition for specialty-related reasons from successful residents. Analysis of narrative feedback showed that successful residents performed better in the areas of organization, preparation with clinical history, application of knowledge, interpersonal communication and achieving sustained progress. Existing assessment methods used in our pathology residency program are effective in detecting residents at risk of attrition. This also has implications for the way that we select, assess and teach residents.
Keywords: Assessment, Attrition, Pathology, Residency
Introduction
In 2010, postgraduate medical education in Singapore transitioned to a new residency system, for which all Sponsoring Institutions (SI) and many of the programs have been successfully accredited by the Accreditation Council for Graduate Medical Education International (ACGME-I).1 SingHealth is the largest SI for residency programs in Singapore2 and, at the time of writing this article, the SingHealth Pathology Residency Program (SHPRP) has admitted a total of 34 residents into its training program since its inception in 2011. We are a 5-year training program for anatomic pathology (AP) comprising three years of junior residency followed by an overseas intermediate exam (a joint exam with the American Board of Medical Specialties (ABMS), the Fellow of the Royal College of Pathologists (FRCPath) Part 1 examination or the Royal College of Pathologists of Australasia (RCPA) Part 1 examination), then two years of senior residency, at the end of which residents are required to pass an overseas certifying exam (FRCPath Part 2 or RCPA Part 2) before graduating and becoming accredited as specialists.
Attrition is a problem that residencies internationally face,3,4 and our residency program is no exception. Attrition has widespread and significant ramifications: these include a delay of the resident's entry into the practicing specialist workforce; effects on morale, scheduling, certification and funding of the residency program; and difficulties regarding manpower and sustaining patient care for healthcare providers and at a national level.4,5,6
Our residents are systematically assessed with formative and summative evaluations under the policies of ACGME-I,7 as well as with local assessments that were already in place prior to the partnership with ACGME-I. We hence set out to determine whether these assessments were able to distinguish residents who would drop out of residency from those who would successfully complete it.
Material and methods
This study was regarded as a service evaluation project and used de-identified existing statistics without additional risk or burden to the participants, and was thus deemed as not requiring further review and approval by the SingHealth Centralised Institutional Review Board (IRB). Approval from the relevant Head of Department was obtained and the study was conducted in compliance with existing institutional policies, acts and regulations.
Participants
All residents admitted into SHPRP since its first intake in 2011 were considered for this study. Any resident who separated from the program prior to graduation was regarded as an instance of attrition. As part of administrative proceedings prior to cessation of training, residents underwent a formal interview at which the reason(s) for separation was recorded. These reasons could be broadly classified into two groups. Reasons pertaining to difficulty with the discipline of AP, a wish to return to clinical practice or a desire to explore other specialties were viewed as reflecting a misfit with the discipline of pathology and were labelled as ‘specialty-related’ reasons. The other group comprised personal reasons unrelated to work, such as having to care for family or emigration. Those with specialty-related reasons were further analysed as a subgroup.
Residents in the attrition group were compared against a group of ‘success’ residents, defined as residents who had graduated from the residency program or who were currently in senior residency. We used this definition as in our experience, residents who reached senior residency would in all likelihood graduate from the program. In our eleven years of experience, only one resident left during senior residency, for personal reasons.
Quantitative data analysis
Epidemiological data and residency evaluations were collated from electronic documents into a Microsoft Excel spreadsheet. As a significant number of residents from the attrition group had left the program by the second year of residency, precluding meaningful comparisons thereafter, statistical analysis was restricted to data from the first and second years of training. A number was assigned to each resident and the dataset was de-identified prior to statistical analysis. Data collected and used in statistical analysis are listed below. Continuous variables were compared using the Wilcoxon rank-sum (Mann-Whitney) test, while categorical variables were compared using Fisher's exact test or the chi-square test, using Stata 17 software.
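For illustration only, the same classes of comparison can be reproduced outside Stata; the sketch below uses Python with SciPy, with hypothetical continuous scores (the categorical counts are the prior-attachment figures from Table 1), and is not the authors' actual analysis code.

```python
# Illustrative sketch only: the study's analysis was performed in Stata 17.
# This shows the same classes of tests (Mann-Whitney U for continuous
# variables, Fisher's exact test for categorical variables) in Python/SciPy.
from scipy.stats import mannwhitneyu, fisher_exact

# Continuous variable: hypothetical Year 1 mock examination scores (not study data)
success_scores = [56.2, 46.3, 61.5, 58.0, 49.5]
attrition_scores = [42.5, 39.5, 61.3, 40.0, 41.0]
_, p_continuous = mannwhitneyu(success_scores, attrition_scores,
                               alternative="two-sided")

# Categorical variable: prior attachment to the department (counts from Table 1)
#                        success  attrition
attachment_counts = [[9, 6],   # attachment: yes
                     [6, 4]]   # attachment: no
_, p_categorical = fisher_exact(attachment_counts)

print(f"Mann-Whitney U p = {p_continuous:.3f}")
print(f"Fisher's exact p = {p_categorical:.3f}")  # 1.000, as reported in Table 1
```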
Epidemiological data
Epidemiological data was extracted from each resident's individual portfolio. Resident characteristics analysed included gender, age, citizenship, marital status, number of years post-graduation from medical school, local or overseas degree and prior attachment to the department before residency.
Residency evaluations under ACGME-I
360-Degree feedback
Also known as Multisource feedback (MSF), this comprises a survey rating form, completed by multiple people who interact and work with the resident.8 Assessors include laboratory technicians, clerical staff, other residents and faculty that work with the resident on a regular basis.
Resident In-Service Examination (RISE)
The American Society for Clinical Pathology (ASCP) RISE is an in-training exam that is an established part of graduate medical education in pathology.9 Held annually, with 100% of United States pathology residency programs participating, it comprises more than 350 multiple-choice questions. Residents receive individual reports with a scaled score as well as a percentile equivalent in comparison to their peer training group, for individual content areas as well as for the total examination.10
In SHPRP, junior residents participate in RISE and take the AP and special topics (ST) sections. The total examination percentile equivalent (which includes comparison to United States residencies) was used for statistical analysis.
Milestone data
Milestones are specialty-specific descriptions of abilities at various stages of professional development based on a core set of six general competencies developed by the ACGME and ABMS: Patient Care and Procedural Skills (PC); Medical Knowledge (MK); Practice-based Learning and Improvement (PBLI); Interpersonal and Communication Skills (ICS); Professionalism (Pro); and Systems-based Practice (SBP). The aim of Milestones is to provide a uniform framework to guide the progressive development of a learner. Residents are to be assessed by a Clinical Competency Committee (CCC) at least twice a year.8,11, 12, 13
SHPRP has labelled the five levels of Milestones as Level 1 (Novice), 2 (Advanced beginner), 3 (Competent), 4 (Proficient) and 5 (Expert). Residents are assessed twice a year by the CCC. In addition to the individual competency Milestones, our institution combines these into an overall Milestone score that summarises a resident's overall progress. For the purposes of this study, we felt the overall Milestone score was sufficiently representative, and this was used for statistical analysis.
Faculty assessment forms
Faculty members supervising scheduled rotations or educational experiences are to assess each learner. This assessment includes a rating scale and section for written comments in each of the six core competencies, based on specific frameworks that define the expectations for what is being assessed.8
SHPRP faculty are requested to assess, on a weekly basis, residents who have been attached to them. Based on the rating scale, an overall score was generated and used for data analysis. Written comments were compiled and used for qualitative analysis (see below).
Residency evaluations not under ACGME-I
Department mock examination
A time-honoured tradition in our training program prior to the partnership with ACGME-I, the mock examination is set by faculty members and administered to residents and trainees at all levels approximately once a year. The primary aim of this examination is to allow residents to gauge their progress and prepare them for certifying exit examinations. The examination predominantly focuses on surgical histology questions, in which candidates are required to review H&E-stained slides and provide their written diagnosis with relevant descriptions, workup and discussion. Over the years, additional components including cytopathology, long cases (cases with additional immunohistochemical and/or ancillary investigations), frozen sections, macroscopic pathology and an Objective Structured Practical Examination (OSPE) have also been included. The process of scoring the mock examination has undergone refinement over the years, but has for a number of years now included marking by more than one examiner and the use of a fixed marking scheme to minimise subjectivity. Overall percentage scores were used for statistical analysis.
Qualitative data analysis
All narrative comments from faculty assessment forms throughout a resident's entire period of training were compiled into a Microsoft Excel datasheet that was similarly de-identified by assigning numbers to each resident. All identifying references such as name and gender were also removed. One resident from the success group had received a disproportionately large amount of feedback over a period of remediation, and this was excluded from word frequency analysis.
The comments were dichotomised into positive and negative comments, and further subcategorised by each of the six general competencies (PC, MK, PBLI, ICS, Pro and SBP). Where appropriate, the tense or form of keywords was adjusted to allow for word frequency generation; however, the root word was not changed. Positive and negative comments for each of the general competencies were compared between attrition and success groups using two online text analysers, which provided statistical information on the frequency of keywords and two- or three-word key phrases. Information from one of the analysers was visualised as a ‘word cloud’ (in which a word's size is proportional to its frequency). With the knowledge of these keywords and phrases, three authors (who are also residency faculty) individually reviewed the de-identified datasheet to identify themes, then collectively came to a consensus on the conclusions made.
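As a minimal sketch of the word-frequency step (the study itself used two online text analysers, so the snippet below is an illustrative stand-in, and the comments shown are invented examples rather than study data), keyword and two-word phrase counts can be generated as follows.

```python
# Minimal sketch of keyword and two-word phrase frequency counting.
# Illustrative stand-in for the online text analysers used in the study;
# the comments below are invented examples, not actual faculty feedback.
import re
from collections import Counter

comments = [
    "Hardworking and conscientious, completes tasks on time.",
    "Well organized, prepared with relevant clinical history before sign out.",
    "Needs to read more and pay attention to details in reports.",
]

def tokens(text):
    """Lowercase word tokens with punctuation stripped."""
    return re.findall(r"[a-z']+", text.lower())

word_counts = Counter()
phrase_counts = Counter()
for comment in comments:
    toks = tokens(comment)
    word_counts.update(toks)
    phrase_counts.update(zip(toks, toks[1:]))  # consecutive two-word phrases

# The most frequent keywords could then be sized proportionally in a word cloud.
print(word_counts.most_common(10))
print(phrase_counts.most_common(5))
```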
Results
Since 2011, the SHPRP has admitted a total of 34 residents into its program, of which 25 residents met the criteria for our study, the attrition group comprising 10 residents and the success group comprising 15 residents. The remaining 9 residents are currently enrolled in junior residency. Within the attrition group, 9 residents (90%) had dropped out in junior residency: 5 residents (50%) left in the second year of training, 2 in the first year and 2 in the third year. One resident dropped out in the fifth year of training, in senior residency. Six residents (60%) dropped out due to specialty-related reasons, all of whom left during junior residency. Demographic characteristics of the study subjects are given in Table 1. We did not find any significant difference between the attrition and success groups with regard to age at entry to residency, gender, local or foreign citizenship, marital status, post-graduate year at entry to residency, local or overseas medical degree or having done a posting in the department prior to starting residency.
Table 1.
Demographic characteristics of the study subjects (N = 25).
Variable | Success (N = 15) | Attrition (N = 10) | P value |
---|---|---|---|
Age at entry to residency (years) | |||
25 | 2 | 0 | |
26 | 0 | 1 | |
27 | 6 | 1 | |
28 | 4 | 3 | |
29 | 2 | 1 | |
30 | 0 | 1 | |
32 | 1 | 2 | |
34 | 0 | 1 | 0.061 |
Sex N(%) | |||
Male | 7 (46.67) | 6 (60.00) | |
Female | 8 (53.33) | 4 (40.00) | 0.688 |
Citizenship N(%) | |||
Singaporean | 7 (46.67) | 7 (70.00) | |
Non-Singaporean | 8 (53.33) | 3 (30.00) | 0.414 |
Marital status in junior residency N(%) | |||
Married | 4 (26.67) | 4 (40.00) | |
Single | 11 (73.33) | 6 (60.00) | 0.667 |
Post-graduate year at entering residency | |||
1 | 3 | 1 | |
2 | 3 | 2 | |
3 | 6 | 2 | |
4 | 2 | 1 | |
5 | 1 | 2 | |
7 | 0 | 1 | |
10 | 0 | 1 | 0.147 |
Place of medical degree N(%) | |||
Local (Singapore) | 5 (33.33) | 6 (60.00) | |
Overseas | 10 (66.67) | 4 (40.00) | 0.214 |
Attachment to department before residency N(%) | |||
Yes | 9 (60.00) | 6 (60.00) | |
No | 6 (40.00) | 4 (40.00) | 1.000 |
Quantitative data results
Results of the statistical analysis of the quantitative assessment methods are shown in Tables 2 and 3. For RISE and the department mock examination, although the attrition group appeared to perform more poorly than the success group, this did not initially reach statistical significance. However, restricting the analysis to the subgroup of attrition due to specialty-related reasons showed that this group performed significantly worse than the success group in RISE (year 1) as well as in the department mock examination (years 1 and 2). For Milestone data, we found that the summed value of the two overall Milestone scores for the first year of residency had significant predictive value: a summed score of 2.5 or less was predictive of subsequent attrition, again when restricted to the subgroup of residents who attrited due to specialty-related reasons. There was no significant difference between the attrition and success groups for 360-degree feedback and faculty assessment form scores.
Table 2.
Quantitative data analysis.
Quantitative assessment | Success | Attrition (all) | P value^a | Attrition (due to specialty-related reasons) | P value^b |
---|---|---|---|---|---|
Department mock examination Median% (IQR) (Max = 100%) | |||||
Year 1 | 56.2 (46.3–61.5) | 42.5 (39.5–61.3) | 0.280 | 40.0 (39.0–41.0) | 0.015 |
Year 2 | 64.0 (60.0–76.0) | 42.0 (41.0–60.5) | 0.089 | 41.5 (36.0–51.3) | 0.022 |
RISE percentile Median (IQR) (Max = 100) | |||||
Year 1 | 85.0 (80.0–90.0) | 72.5 (49.6–85.0) | 0.145 | 60.9 (47.5–75.0) | 0.008 |
Year 2 | 77.5 (50.0–90.0) | 57.5 (27.5–86.0) | 0.394 | 35.0 (20.0–80.0) | 0.126 |
Faculty feedback score Median (IQR) (Max score = 5) | |||||
Year 1 | 3.50 (2.35–3.80) | 3.10 (1.69–3.50) | 0.174 | 2.87 (1.66–3.30) | 0.119 |
Year 2 | 3.45 (3.30–4.00) | 3.37 (3.11–3.60) | 0.310 | 3.40 (3.11–3.60) | 0.405 |
360-degree feedback score Median (IQR) (Max score = 9) | |||||
Year 1 | 6.95 (6.60–7.10) | 6.80 (6.70–7.04) | 0.801 | 6.80 (6.65–7.04) | 0.535 |
Year 2 | 7.00 (6.32–7.30) | 6.55 (6.25–6.84) | 0.259 | 6.60 (6.50–6.84) | 0.485 |
IQR – Interquartile range. RISE – Resident In-Service Examination.
^a Differences in assessment scores between subjects in the success group and those who failed to complete the program were examined using the non-parametric Mann-Whitney test.
^b Differences in assessment scores between subjects in the success group and those who failed to complete the program due to specialty-related reasons were examined using the non-parametric Mann-Whitney test.
Table 3.
Summed overall Milestone scores for first year of residency.
Summed overall Milestone scores for Year 1 | Success | Attrition (all) | P value | Attrition (due to specialty-related reasons) | P value |
---|---|---|---|---|---|
≤2.5 (n) | 2 | 5 | 0.226 | 4 | 0.031 |
≥3.0 (n) | 13 | 5 | | 2 |
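As a worked check of the specialty-related comparison above, and assuming the reported p value was derived from a two-sided Fisher's exact test on the corresponding 2 × 2 counts (the Methods name Fisher's exact and chi-square tests for categorical variables), the figure of 0.031 can be reproduced directly from the hypergeometric distribution. The sketch below is illustrative and is not the authors' Stata workflow.

```python
# Worked check of the specialty-related comparison in Table 3, assuming the
# reported p value came from a two-sided Fisher's exact test on the 2x2 counts.
#
# Counts from Table 3:          success   attrition (specialty-related)
#   summed score <= 2.5            2                4
#   summed score >= 3.0           13                2
from math import comb

row1, col1, col2, n = 6, 15, 6, 21  # margins: low-score row, success, attrition, total

def table_probability(a):
    """Hypergeometric probability of 'a' success-group residents in the low-score row."""
    return comb(col1, a) * comb(col2, row1 - a) / comb(n, row1)

p_observed = table_probability(2)
# Two-sided Fisher's exact p: sum over all tables no more probable than the observed one.
p_two_sided = sum(table_probability(a)
                  for a in range(max(0, row1 - col2), min(row1, col1) + 1)
                  if table_probability(a) <= p_observed + 1e-12)
print(round(p_two_sided, 3))  # ~0.031, consistent with the value reported in Table 3
```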
Qualitative data results
The following themes emerged from analysis of the narrative comments in faculty assessment forms, based on the keywords and phrases identified with the highest frequencies.
Work-centered culture
Both the success and attrition groups were noted to be hardworking and conscientious, with the drive to complete tasks on time. Related descriptors such as diligent, meticulous and efficient also featured, especially for the success group.
Organization and preparation
These were strong features of the success group not often seen in the attrition group. Preparation was often in the context of acquiring the necessary clinical history and background prior to signing out with faculty.
Report writing
Both the success and attrition groups received feedback requesting that residents be more careful, pay attention to descriptions and details, and correct errors (in particular grammatical and typographical errors).
Medical knowledge
The success group was often noted to possess good knowledge. In addition, they had greater ability to apply this knowledge in the workup of cases to reach the diagnosis independently. This was observed far less in the attrition group.
In contrast, some residents from the attrition group were noted to have gaps or be lacking in knowledge, with basic and foundational knowledge specifically highlighted. They showed problems with concepts and sometimes lacked an approach to cases.
Experience
Both success and attrition groups frequently received feedback to read more and see more.
Professional behavior
Both success and attrition groups were noted to be responsible and demonstrate good attitude. Residents were described as cheerful and pleasant.
Unprofessional behavior was scarcely mentioned.
Interpersonal communication
Residents from the success group were complimented on communication with colleagues, which included clinicians, other pathologists and faculty, while this was seen far less in the attrition group. A few residents in the success group were sometimes shy and needed encouragement to speak up.
In contrast, some residents from the attrition group had difficulties in interpersonal interactions, with clerical and support staff and technicians being specifically mentioned.
Learning and improvement
Both success and attrition groups were keen, willing and eager to learn. Residents responded to feedback and showed improvement thereafter.
Residents from the success group demonstrated progress that was steady and consistent, and were described as being on par with their level of training; this was not well demonstrated by the attrition group.
The success group was able to ask relevant and higher-level questions as well as able to refer to relevant sources of information.
Discussion
All but one resident in the attrition group dropped out in junior residency, with most leaving in the second year of training. Vance reported similar findings, in which 61% of attrition from United States pathology residencies occurred during or immediately on completion of the first year of training.4 This highlights the early years of pathology training as a particularly vulnerable period. This is hardly surprising, as in our experience there is variable and often little exposure to pathology in undergraduate medical education, compounded by the knowledge-intensive nature of the discipline, resulting in a steep learning curve in the initial years of training. We were able to detect statistically significant differences between attrition and success groups using currently available evaluations endorsed by ACGME as well as our department mock examination. This implies that characteristics associated with attrition are detectable as early as the first two years of residency. A study by Ducatman et al. similarly showed that pathology trainees at risk of failure to complete certifying examinations or the training program could be identified by a case-based evaluation system as early as the first postgraduate year and possibly even the first surgical pathology rotation.14
Quantitative data
RISE and the department mock examination, which target the competency of MK,8 were found to discriminate between successful residents and residents who would attrite, in particular for specialty-related reasons. RISE performance has been found to be predictive of success in the certification examinations of the American Board of Pathology, as well as in board certifying examinations across other specialties.9,10,15 Mock board examinations have also shown a correlation with performance in certifying examinations.16 One of the major challenges in diagnostic pathologic work is the requirement for an almost encyclopaedic knowledge of a wide range of ever-expanding disease entities,17 both common and esoteric, from all organ systems. As such, it is reasonable to suggest that the specialty-related difficulties faced by residents who attrite may include a significant component related to the acquisition or application of knowledge, a finding corroborated by our qualitative analysis (see below). In our experience, there may be a tendency for faculty or residents themselves not to place much emphasis on the results of such tests, either viewing them as simulated experiences or making concessions for a resident who is only at an early stage of training. However, our results suggest that the outcomes of these assessments should be taken seriously from the start, and that poor performance may indicate a need for intervention.
Educational Milestones are developmentally based, and their value lies in tracking the longitudinal trajectory of a resident's professional development to ensure they acquire competency in essential elements by the conclusion of residency.8,11, 12, 13,18 Residents who attrite mostly do so in junior residency and hence may have their Milestones assessed only two or three times by the CCC. It is therefore important to investigate whether early Milestone scores are able to identify a resident at risk. Holmboe et al. examined residencies of three clinical disciplines and found that mid-residency (3rd and 4th reviews) Milestone ratings were useful in identifying and predicting residents at risk of not achieving recommended competency Milestone goals by the completion of residency.11 Our study similarly shows that early Milestone ratings, possibly even in the first year of residency, may have value in discerning residents at risk. Milestone data have also been associated with performance in in-training and certifying board examinations in other specialties.19, 20, 21 However, to our knowledge, this association with successful outcomes has not been investigated in the setting of a pathology residency. Here, we corroborate that Milestones are a robust method of assessing residents in a pathology residency.
We did not note a statistically significant difference between the attrition and success groups in 360-degree feedback or the numerical rating scales of faculty assessment forms. However, qualitative analysis of comments in faculty assessment forms (see below) did suggest differences between these two groups. This may be attributable in part to inherent limitations and difficulties with psychometric tests. The way that assessors translate assessment of complex tasks into numerical values is variable, with one assessor using a different frame of reference from another. Numerical scores are also unable to take into account external or contextual factors that influence ratings.8,22, 23, 24 Furthermore, it has been shown that assessors tend to inflate their scores in these forms of feedback, reducing their educational and feedback value.8,24 In our experience, there is also an element of ‘questionnaire fatigue’ leading to imprecise scoring, especially if faculty are requested to perform these assessments frequently or for multiple residents.
Qualitative data
Although numerical psychometric scores have become the mainstay of assessment in medical education, many still advocate the utility of narrative comments. Narrative evaluation has been validated to be at least as effective as, and possibly even more informative than, quantitative evaluations in measuring performance,23, 24, 25, 26, 27 including being able to predict learners who are facing difficulties.28, 29, 30 Furthermore, qualitative comments have several added advantages – these include providing the context in which feedback is given, resulting in more holistic and integrated opinions24,26; being able to capture comments on areas outside competency frameworks and numeric scores26; and providing actionable feedback and strategies to improve,27 to name a few.
In the interpretation of narrative feedback, one needs to be able to discern comments signalling a learner in true difficulty and requiring intervention from generic, non-discriminatory comments for which intervention would be an over-reaction – in other words, to separate a true signal from noise.25 In our qualitative analysis, we found that comments requesting more care with the written report (such as typographical errors) or generic advice to acquire more experience or knowledge were seen in both the success and attrition groups. Although still important, these did not appear to constitute a ‘fatal flaw’ condemning a learner to failure. Conversely, we did not note any outstanding behaviors or traits within the attrition group that would be deemed undesirable or unacceptable for any medical professional. For example, similar to the success group, residents in the attrition group were hardworking and completed their work, were eager to receive feedback and improve, and lacked overtly unprofessional behavior. Rather, the key differences between the two groups are expounded below.
Being organized
Compliments for the success group regarding organization generally pertained to workflow. The pathology laboratory has a high throughput, and a single pathologist (or resident) is at any one point in time juggling multiple cases at various stages of diagnostic work-up, not to mention other tasks such as multidisciplinary meetings and administrative and educational duties. There is an ever-present danger of making an error, especially while attempting to maintain critical key performance indicators such as turnaround times.31,32 Possessing an organized mind and workflow appears to be part of the key to coping with the workload, while ensuring each and every case is reported in a safe, accurate and timely manner.
Being prepared with relevant history
Feedback regarding organization was also closely tied to comments regarding being prepared with appropriate and relevant background clinical and radiological details – in other words, being organized with regard to the approach and workup of a case. Clinical information remains an essential element in histological interpretation and can greatly add value to a diagnosis.31 Nested within the competencies of PC and SBP, this reflected the ability of residents in the success group to understand the need for clinicopathological correlation as well as the contribution of pathology to patient care within a larger clinical system. Being aware of the ‘big picture’ and the ability to assimilate relevant patient data have already been recognized in the literature as important skills, both in histopathology and in other specialties.25,33
Application of knowledge
Pathology is a knowledge-intensive specialty.34 Feedback from the attrition group reflected an inability to keep pace with the steep learning curve and the presence of key foundational gaps, whereas successful residents not only possessed knowledge, but were also able to apply the knowledge they acquired. In a Delphi study by Brierley et al., consultant histopathologists were seen to highly value an approach to diagnosis based on sound pathological principles, even more so than the eventual diagnosis itself.35 This also corroborates the significant difference seen between the two groups for RISE and the departmental mock examination – the former requiring interpretation of images, diagrams or a case scenario rather than mere recollection of facts, while the latter tests the entire interpretative process from glass slide to diagnosis – further validating these modalities of assessment.
Interpersonal communication
Pathology is a unique specialty in which there is limited to no direct communication with patients. In fact, the stereotype of being ‘unsocial’ or having mild Asperger's syndrome is even regarded by some as being a trait of a good pathologist!34,36 However, our analysis of narrative comments showed that communication was also an important area in which residents in the success group outperformed the attrition group. It is an important responsibility of the pathologist to ensure that the diagnostic product is communicated to those responsible for the care of the patient,33 for example at frozen section, or in communication of a final diagnosis or critical and unexpected findings.31 Communication is also essential to working as a team, from which the pathologist is not exempt – requiring cooperation with technical, administrative and secretarial colleagues, internal consultation with other pathologists, as well as participation in multidisciplinary meetings, to ensure effective collaboration in a professional and timely manner to benefit patient care.33
Sustained progress and trajectory
The analysis of comments under the competency of PBLI yielded two apparently contradictory trends. On one hand, residents from both groups appeared keen to learn and improve following feedback. Yet only the success group received feedback acknowledging more longitudinal and sustained progress and the ability to keep pace with the level of training, which corroborated our statistical analysis of Milestone data. This is most likely because residents receive feedback following a week-long sign out with faculty. As such, a resident may be observed by a single faculty member to improve in the immediate time frame following feedback given during sign out; however, this ‘reactionary’ betterment may not equate to the long-term, self-driven progress necessary to propel a resident to the end of the training program. Rather, what deserves more weight are comments regarding long-term development, which can only be made if faculty have repeated encounters with residents over a period of time. In many training programs today, residents often pass through rotations of short duration and interact with attending physicians on a weekly or even daily basis, resulting in fragmented and interrupted interactions between mentor and learner.24 This limits the ability of mentors to observe and give integrated feedback on a resident's trajectory – which would include how far along a learner has come, future potential and predicted difficulties that may be encountered.26 A long-term advisor would hence likely be more effective in recognising patterns and trends in personal development.27
Recommendations
This study yields several practical insights which can be applied to a pathology residency program.
Support during early residency
Recognition of early residency as a vulnerable period suggests that increased support and guidance should be provided early in residency, or even prior to starting residency. Recent literature has increasingly recognized the utility of teaching curricula targeting this stage of training. Pathology training programs have described ‘onboarding’ programs that impart knowledge and skills to matched individuals prior to residency, as well as ‘boot camps’ and other intensive forms of teaching in the initial phase of training, and have reported good reception by residents and success in rapidly transitioning medical students into pathology residents.37, 38, 39, 40, 41
Evaluation modalities to detect residents at risk
Our study has shown that the current quantitative assessment methods under ACGME-I are useful evaluation tools, and they should be administered conscientiously with close monitoring of results to detect struggling residents. Narrative feedback is equally important, but further work is needed to develop means to capture these opinions in systematic ways.24,26 It is also important to select, train and develop faculty to perform these evaluations in a reliable and meaningful way. Assessors need to be familiar with evaluation frameworks such as Milestone descriptors,24 and should also be equipped to provide rich, descriptive and insightful narrative feedback that can be acted upon.24,27,30 They should also stay updated on upcoming modalities of assessment – one such evaluation deserving mention being entrustable professional activities (EPAs), which describe essential clinical work and activities required of a profession.8 Long-term mentorship and effective supervision would provide valuable insights.27
Methods of teaching and intervention
Our results also suggest that teaching needs to go beyond imparting knowledge to incorporating how that knowledge can be applied. For example, a resident must not just recite lists of differential diagnoses or immunohistochemical stains, but be able to narrow and tailor investigations to be relevant to the specific morphology and the appropriate clinical setting. An effective teaching strategy or program has been shown to improve resident performance in pathology,15,42 and it remains to be investigated which teaching styles or modalities would accomplish this most effectively.
Pathology residencies also need to explore the teaching of non-cognitive skills such as organizational and communication skills. These ‘soft’ skills are not formally taught and are generally acquired, to a variable degree, through trainees' observation and emulation of teachers and peers.31,33 Our clinical colleagues appear to have adopted a more active role in inculcating these skills – including the use of mock patients and objective structured clinical examinations. A pathology residency should not neglect these skills simply because there is no direct patient contact, but rather should adapt these teaching tools to cater to scenarios in the laboratory or to prepare pathology residents for interactions with other medical colleagues.
Selection of candidates for residency
Findings from our study may be extrapolated to the selection of candidates for residency. SHPRP receives applications from a variety of individuals from both local and overseas universities, with varying degrees of postgraduate clinical experience. There is also no widely subscribed examination akin to the USMLE (United States Medical Licensing Examination). This precludes the selection of residency applicants using standardised and systematic criteria similar to residencies in the United States. Furthermore, these traditional selection factors (for example, USMLE scores, selection to the Alpha Omega Alpha (AOA) Honor Society, medical school rank, letters of recommendation etc.) have been shown to have little or, at best, variable ability to predict future resident success.43,44,45,46,47,48,49 In our study, we have highlighted traits of successful residents, including in non-cognitive domains, and further efforts should be made to see how these qualities can be identified in potential residents who may be a good fit for the specialty.
Limitations of study
The main limitation of our study is the small sample size, involving only a single residency program. Inclusion of other pathology residencies nationwide would be helpful to see if similar trends exist and may result in greater statistical strength.
We also recognize that differences in the structure of residency programs between institutions and in different countries may limit the generalisability of our results. Most ACGME-accredited pathology residency programs in the United States train residents in a combined AP/Clinical Pathology (CP) program over a period of 48 months. Only a few centers offer an AP-only program, for which training is conducted over a period of 36 months – significantly shorter than our 5-year AP-only program.50,51 We would also like to acknowledge that the social and cultural contexts of this study, comprising primarily residents and faculty of Asian nationalities, may be quite different from a Western perspective. It would be interesting to see if our findings would also hold true in different sociocultural and educational settings.
Lastly, we did not study the effects of intervention in this study. In SHPRP, residents perceived to have weaker performance are generally provided with informal help initially – this often includes counselling, direction to appropriate learning resources and closer supervision by core faculty. Only in more severe cases would the CCC recommend formal remediation, with its attendant impact on resident progression. Amongst our study subjects, two residents were deemed to have performed poorly enough that formal remediation was required. This included non-progression for that residency year, a temporary reduction and subsequent graduated increases in workload, and regular monitoring and feedback by supervisors and core faculty, with set targets and minimum expectations to be fulfilled. One resident eventually graduated from the residency program, whilst the other eventually separated from the program (but transferred to, and successfully completed training in, a different field within pathology). The remaining individuals from the attrition group voluntarily requested separation from the program before formal remediation could be initiated. As these residents often left early in residency training, this limited the data available for analysis. Furthermore, there was no opportunity to study whether the characteristics and traits observed were amenable to change and improvement. We cannot exclude the possibility that, had a resident facing difficulty chosen to persevere instead of leaving, he/she might have improved and eventually shown similar characteristics to residents in our success group.
Conclusions
To conclude, our study shows that existing modalities of evaluation in our residency program – both quantitative metrics and qualitative narrative data – have good utility in identifying residents at risk of attrition. In doing so, we have found certain key traits in cognitive and non-cognitive domains that differ between residents who attrite and those who are successful. These findings can be applied to the way we select, assess and teach residents.
Funding
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Declaration of competing interest
The authors declare that there are no competing interests.
Acknowledgements
The authors would like to acknowledge SingHealth Duke-NUS Pathology Academic Clinical Program for the support in this publication.
References
- 1.MOH Holdings. About residency. https://www.physician.mohh.com.sg/medicine/residency/about-residency
- 2.Tan H.K. Singhealth residency. Welcome message. https://www.singhealthacademy.edu.sg/residency/about-us/welcome-message
- 3.Accreditation Council for Graduate Medical Education. Data Resource Book Academic Year 2020-2021. Accreditation Council for Graduate Medical Education (ACGME); 2021. Graduating residents and residents leaving prior to completion; pp. 97–116. [Google Scholar]
- 4.Vance R.P. Pathology trainee attrition: a new variable in the manpower equation. Hum Pathol. 1992;23(2):104–106. doi: 10.1016/0046-8177(92)90230-z. [DOI] [PubMed] [Google Scholar]
- 5.Andriole D.A., Jeffe D.B., Hageman H.L., Klingensmith M.E., McAlister R.P., Whelan A.J. Attrition during graduate medical education: medical school perspective. Arch Surg. 2008;143(12):1172–1177. doi: 10.1001/archsurg.143.12.1172. [DOI] [PubMed] [Google Scholar]
- 6.Lu D.W., Hartman N.D., Druck J., Mitzman J., Strout T.D. Why residents quit: national rates of and reasons for attrition among emergency medicine physicians in training. West J Emerg Med. 2019;20(2):351–356. doi: 10.5811/westjem.2018.11.40449. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.ACGME International Pathology. ACGME international foundational program requirements for graduate medical education. https://www.acgme-i.org/globalassets/acgme-international/specialties/common/foundinternationalresidency.pdf
- 8.Accreditation Council for Graduate Medical Education Milestones resources. Guidebooks. Assessment guidebook. https://www.acgme.org/globalassets/pdfs/milestones/guidebooks/assessmentguidebook.pdf
- 9.McKenna B.J. The American Society for Clinical Pathology resident in-service examination: does resident performance provide insight into the effectiveness of clinical pathology education? Clin Lab Med. 2007;27(2):283–291. doi: 10.1016/j.cll.2007.03.005. [DOI] [PubMed] [Google Scholar]
- 10.Rinder H.M., Grimes M.M., Wagner J., Bennett B.D. RISE Committee, American Society for Clinical Pathology and the American Board of Pathology. Senior pathology resident in-service examination scores correlate with outcomes of the American Board of Pathology certifying examinations. Am J Clin Pathol. 2011;136(4):499–506. doi: 10.1309/AJCPA7O4BBUGLSWW. [DOI] [PubMed] [Google Scholar]
- 11.Holmboe E.S., Yamazaki K., Nasca T.J., Hamstra S.J. Using longitudinal milestones data and learning analytics to facilitate the professional development of residents: early lessons from three specialties. Acad Med. 2020;95(1):97–103. doi: 10.1097/ACM.0000000000002899. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Naritoku W.Y., Alexander C.B., Bennett B.D., et al. The pathology milestones and the next accreditation system. Arch Pathol Lab Med. 2014;138(3):307–315. doi: 10.5858/arpa.2013-0260-SA. [DOI] [PubMed] [Google Scholar]
- 13.Naritoku W.Y., Alexander C.B.; on behalf of the Pathology Milestone Working Group. Pathology milestones. J Grad Med Educ. 2014;6(1s1):180–181. doi: 10.4300/JGME-06-01s1-10. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Ducatman B.S., Williams H.J., Hobbs G., Gyure K.A. Vital signs: how early can resident evaluation predict acquisition of competency in surgical pathology? J Grad Med Educ. 2009;1(1):37–44. doi: 10.4300/01.01.0007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Forcucci J.A., Hyer J.M., Bruner E.T., Lewin D.N., Batalis N.I. Success in implementation of a resident in-service examination review series. Am J Clin Pathol. 2017;147(4):370–373. doi: 10.1093/ajcp/aqx013. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Newkirk K.M., Xiaocun S., Bailey M.R. Correlation of mock board examination scores during anatomic pathology residency training with performance on the certifying examination. J Vet Med Educ. 2020;47(1):39–43. doi: 10.3138/jvme.1117-177r. [DOI] [PubMed] [Google Scholar]
- 17.Underwood J.C. More than meets the eye: the changing face of histopathology. Histopathology. 2017;70(1):4–9. doi: 10.1111/his.13047. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.ACGME International Pathology. Milestones. Milestones for Singapore. Milestones Guidebook for Residents and Fellows Singapore Edition. https://www.acgme-i.org/globalassets/acgme-international/specialties/common/milestones_guidebook_for_residents_and_fellows_singapore_edition.pdf
- 19.Hauer K.E., Vandergrift J., Hess B., et al. Correlations between ratings on the resident annual evaluation summary and the internal medicine milestones and association with ABIM certification examination scores among US internal medicine residents, 2013-2014. JAMA. 2016;316(21):2253–2262. doi: 10.1001/jama.2016.17357. [DOI] [PubMed] [Google Scholar]
- 20.Ottum S., Chao C., Tamakuwala S., et al. Can ACGME Milestones predict surgical specialty board passage: an example in Obstetrics and Gynecology. Clin Exp Obstet Gynecol. 2021;48(5):1048–1055. doi: 10.31083/j.ceog4805168. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Kimbrough M.K., Thrush C.R., Barrett E., Bentley F.R., Sexton K.W. Are surgical Milestone assessments predictive of in-training examination scores? J Surg Educ. 2018;75(1):29–32. doi: 10.1016/j.jsurg.2017.06.021. [DOI] [PubMed] [Google Scholar]
- 22.Kogan J.R., Conforti L., Bernabeo E., Iobst W., Holmboe E. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45(10):1048–1060. doi: 10.1111/j.1365-2923.2011.04025.x. [DOI] [PubMed] [Google Scholar]
- 23.Bartels J., Mooney C.J., Stone R.T. Numerical versus narrative: a comparison between methods to measure medical student performance during clinical clerkships. Med Teach. 2017;39(11):1154–1158. doi: 10.1080/0142159X.2017.1368467. [DOI] [PubMed] [Google Scholar]
- 24.Hanson J.L., Rosenberg A.A., Lane J.L. Narrative descriptions should replace grades and numerical ratings for clinical performance in medical education in the United States. Front Psychol. 2013;4:668. doi: 10.3389/fpsyg.2013.00668. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Kelleher M., Kinnear B., Sall D.R., et al. Warnings in early narrative assessment that might predict performance in residency: signal from an internal medicine residency program. Perspect Med Educ. 2021;10(6):334–340. doi: 10.1007/s40037-021-00681-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Ginsburg S., Gold W., Cavalcanti R.B., Kurabi B., McDonald-Blumer H. Competencies "plus": the nature of written comments on internal medicine residents' evaluation forms. Acad Med. 2011;86(10 Suppl):S30–S34. doi: 10.1097/ACM.0b013e31822a6d92. [DOI] [PubMed] [Google Scholar]
- 27.Marcotte L., Egan R., Soleas E., Dalgarno N., Norris M., Smith C. Assessing the quality of feedback to general internal medicine residents in a competency-based environment. Can Med Educ J. 2019;10(4):e32–e47. [PMC free article] [PubMed] [Google Scholar]
- 28.Ginsburg S., van der Vleuten C.P.M., Eva K.W. The hidden value of narrative comments for assessment: a quantitative reliability analysis of qualitative data. Acad Med. 2017;92(11):1617–1621. doi: 10.1097/ACM.0000000000001669. [DOI] [PubMed] [Google Scholar]
- 29.Tremblay G., Carmichael P.H., Maziade J., Grégoire M. Detection of residents with progress issues using a keyword-specific algorithm. J Grad Med Educ. 2019;11(6):656–662. doi: 10.4300/JGME-D-19-00386.1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Hatala R., Sawatsky A.P., Dudek N., Ginsburg S., Cook D.A. Using in-training evaluation report (ITER) qualitative comments to assess medical students and residents: a systematic review. Acad Med. 2017;92(6):868–879. doi: 10.1097/ACM.0000000000001506. [DOI] [PubMed] [Google Scholar]
- 31.Lehr H.A., Bosman F.T. Communication skills in diagnostic pathology. Virchows Arch. 2016;468(1):61–67. doi: 10.1007/s00428-015-1848-y. [DOI] [PubMed] [Google Scholar]
- 32.Condel J.L., Jukic D.M., Sharbaugh D.T., Raab S.S. Histology errors: use of real-time root cause analysis to improve practice. Pathol Case Rev. 2005;10(2):82–87. doi: 10.1097/01.pcr.0000155793.51378.ba. [DOI] [Google Scholar]
- 33.Johnston P.W., Fioratou E., Flin R. Non-technical skills in histopathology: definition and discussion. Histopathology. 2011;59(3):359–367. doi: 10.1111/j.1365-2559.2010.03710.x. [DOI] [PubMed] [Google Scholar]
- 34.Houghton J. So you want to be a Histopathologist? Ulster Med J. 2013;82(3):212. [PMC free article] [PubMed] [Google Scholar]
- 35.Brierley D.J., Farthing P.M., Zijlstra-Shaw S. Delphi study to determine the key qualities consultant histopathologists look for in their trainees. J Clin Pathol. 2020;73(10):642–647. doi: 10.1136/jclinpath-2019-206345. [DOI] [PubMed] [Google Scholar]
- 36.Booth A.L., Roy-Chowdhuri S. Becoming an engaged pathologist. Arch Pathol Lab Med. 2019;143(2):149–150. doi: 10.5858/arpa.2018-0365-. [DOI] [PubMed] [Google Scholar]
- 37.Hébert T.M., Szymanski J., Mantilla J., et al. Onboarding for pathology residency programs – the Montefiore experience. Acad Pathol. 2016;3 doi: 10.1177/2374289516639979. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.Hébert T.M., Cole A., Panarelli N., et al. Training the next generation of pathologists: a novel residency program curriculum at Montefiore Medical Center/Albert Einstein College of Medicine. Acad Pathol. 2019;6 doi: 10.1177/2374289519848099. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Mehr C.R., Montone K.T., Schwartz L.E. The implementation of an introductory surgical pathology didactic series to transition first year residents and facilitate upper level resident teaching. Adv Anat Pathol. 2019;26(3):210–214. doi: 10.1097/PAP.0000000000000229. [DOI] [PubMed] [Google Scholar]
- 40.Smith N.E., Collins R., Hall J. Surgical pathology "boot camp": a military experience. Arch Pathol Lab Med. 2019;143(9):1144–1148. doi: 10.5858/arpa.2018-0318-EP. [DOI] [PubMed] [Google Scholar]
- 41.Black-Schaffer W.S., Morrow J.S., Prystowsky M.B., Steinberg J.J. Training pathology residents to practice 21st century medicine: a proposal. Acad Pathol. 2016;3 doi: 10.1177/2374289516665393. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Cotta C.V., Chute D.J., Theil K.S. Quantification of the effectiveness of a residency program using the resident in-service examination. Acad Pathol. 2018;5 doi: 10.1177/2374289518781575. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Busha M.E., McMillen B., Greene J., Gibson K., Milnes C., Ziemkowski P. One Institution's evaluation of family medicine residency applicant data for academic predictors of success. BMC Med Educ. 2021;21(1):84. doi: 10.1186/s12909-021-02518-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Chole R.A., Ogden M.A. Predictors of future success in otolaryngology residency applicants. Arch Otolaryngol Head Neck Surg. 2012;138(8):707–712. doi: 10.1001/archoto.2012.1374. [DOI] [PubMed] [Google Scholar]
- 45.Grewal S.G., Yeung L.S., Brandes S.B. Predictors of success in a urology residency program. J Surg Educ. 2013;70(1):138–143. doi: 10.1016/j.jsurg.2012.06.015. [DOI] [PubMed] [Google Scholar]
- 46.Valley B., Camp C., Grawe B. Non-cognitive factors predicting success in orthopedic surgery residency. Orthop Rev. 2018;10(3):7559. doi: 10.4081/or.2018.7559. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Quillin R.C., 3rd, Pritts T.A., Hanseman D.J., Edwards M.J., Davis B.R. How residents learn predicts success in surgical residency. J Surg Educ. 2013;70(6):725–730. doi: 10.1016/j.jsurg.2013.09.016. [DOI] [PubMed] [Google Scholar]
- 48.Hayek S.A., Wickizer A.P., Lane S.M., et al. Application factors may not be predictors of success among general surgery residents as measured by ACGME milestones. J Surg Res. 2020;253:34–40. doi: 10.1016/j.jss.2020.03.029. [DOI] [PubMed] [Google Scholar]
- 49.Agarwal V., Bump G.M., Heller M.T., et al. Do residency selection factors predict radiology resident performance? Acad Radiol. 2018;25(3):397–402. doi: 10.1016/j.acra.2017.09.020. [DOI] [PubMed] [Google Scholar]
- 50.Naritoku W.Y., Powell S.Z., Black-Schaffer W.S. Evolution of the pathology residency curriculum: preparing for a positive future. Acad Pathol. 2016;3 doi: 10.1177/2374289516667746. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.ACGME pathology program requirements and FAQs. Published July 1, 2022. https://www.acgme.org/globalassets/pfassets/programrequirements/300_pathology_2022_tcc.pdf