American Journal of Clinical Pathology. 2017 Mar 18;147(4):370–373. doi: 10.1093/ajcp/aqx013

Success in Implementation of a Resident In-Service Examination Review Series

Jessica A Forcucci 1, J Madison Hyer 2, Evelyn T Bruner 1, David N Lewin 1, Nicholas I Batalis 1
PMCID: PMC5848425  PMID: 28340222

Abstract

Objectives: Primary pathology board certification has been correlated with senior resident in-service examination (RISE) performance. We describe our success with an annual, month-long review series.

Methods: Aggregate program RISE performance data were gathered for 3 years prior to and 3 years following initiation of the review series. In addition, mean United States Medical Licensing Examination Step 1 and 2 Clinical Knowledge scores for residents participating in each RISE examination were obtained to control for incoming knowledge and test-taking ability. Linear models were used to evaluate differences in average RISE performance prior to and following the initiation of the review series in addition to controlling for relevant covariates.

Results: Significant improvement was noted in the grand total, anatomic pathology section average, clinical pathology section average, and transfusion medicine section. Although not statistically significant, improvement was noted in the cytopathology and clinical chemistry sections. There were no significant differences in scores for hematopathology, molecular pathology, and the special topics section average. Primary pathology board certification rates also improved.

Conclusions: Institution of a month-long RISE review series demonstrated improved overall performance within our training program. The success could easily be replicated in any training program without significant disruption to an annual didactic series.

Keywords: Resident in-service examination, RISE, Pathology graduate medical education


The annual American Society for Clinical Pathology resident in-service examination (RISE)1 is a valuable tool used nationally to evaluate medical knowledge and assess trainees’ progress throughout residency. In addition, resident performance in the latter years of residency has been shown to correlate with primary board certification.2 As such, many programs seek strategies to improve resident performance and board pass rate. Herein, we describe our experience with an annual, month-long review series, instituted in 2013, designed to highlight important high-yield concepts in each pathology subspecialty.

Materials and Methods

Beginning in 2013, a month-long RISE review series has been held each February and March in lieu of the typical morning resident didactics, which all residents are expected to attend, with the purpose of emphasizing important, high-yield topics in each subspecialty. Typical resident education consists of daily hour-long morning didactics covering both anatomic pathology (AP) and clinical pathology (CP) topics within a general 2-year curriculum. With the exception of the addition of the RISE review series, no other changes to the resident educational activities were made during this time period.

Every morning during the review series, one to two subspecialties are covered within the hour dedicated to resident didactics, each taught by an attending with expertise in the subject. Topics include transfusion medicine, clinical chemistry, microbiology, molecular pathology, cytology, hematopathology, forensic pathology, and most surgical pathology subspecialties (eg, genitourinary, breast, head and neck, soft tissue). The format of the review is at the discretion of the attending and includes reviews of deficiencies noted on the previous year’s program performance report, audience response quizzes and Jeopardy-style games, and general overviews such as “blood banking in 56 slides” (Table 1).

Table 1.

2016 Resident In-Service Examination Review Curriculum (Topics and Format)

| Subject | Time, h | Format |
|---|---|---|
| Autopsy | 1 | Practice questions (audience response software) |
| Bone and soft tissue | 2 | Practice questions (“round-robin” style) |
| Breast | 1 | Practice questions (“round-robin” style) |
| Clinical chemistry | 1 | Lecture-based, high-yield topics from prior years |
| Cytogenetics | ½ | Practice questions (Jeopardy style) |
| Cytology | 1 | Lecture-based, high-yield topics from prior years |
| Dermatopathology | 1 | High-yield images |
| Gastrointestinal | 1 | Practice questions (“round-robin” style) |
| Genitourinary | 1 | Practice questions (“round-robin” style) |
| Gynecologic | 1 | Practice questions (audience response software) |
| Head and neck | 1 | Lecture-based with integrated practice questions |
| Hematopathology/coagulation | 1 | Practice question competition (audience response software) |
| Immunology | 1 | Case-based practice questions |
| Informatics | ½ | Lecture-based, high-yield topics from prior years |
| Laboratory management | ½ | Lecture-based, high-yield topics from prior years |
| Microbiology | 1 | Practice question competition (team based) |
| Molecular pathology | ½ | Lecture-based, high-yield topics from prior years |
| Neuropathology | 1 | Lecture-based, high-yield topics and images |
| Transfusion medicine | 1 | Lecture-based, high-yield topics from prior years |

Aggregate program RISE performance data were gathered for 3 years prior to and 3 years following initiation of the review series (Table 2). Primary AP and CP board certification rates for each graduating class over the same time period were also collected. In addition, postgraduate year level, graduation year, and mean United States Medical Licensing Examination (USMLE) Step 1 and Step 2 Clinical Knowledge scores for residents participating in each RISE were obtained and served as covariates to control for incoming knowledge and test-taking ability, as Kay et al3 demonstrated a correlation between poor USMLE Step 1 performance and poor Internal Medicine Board examination performance. All procedures were approved by the institutional review board at the Medical University of South Carolina.

Table 2.

Residency Program Resident In-Service Examination Percentiles (2010-2015)a

| Year | Grand Total | CP | FP | SP | CC | HE | MB | TM | STHP | LA | ST |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 2010 | 60 | 65 | 65 | 55 | 60 | 65 | 70 | 55 | 70 | 75 | 80 |
| 2011 | 65 | 70 | 75 | 50 | 65 | 50 | 70 | 60 | 65 | 80 | 80 |
| 2012 | 55 | 45 | 80 | 45 | 50 | 55 | 65 | 50 | 60 | 80 | 75 |
| 2013 | 65 | 65 | 85 | 55 | 65 | 60 | 70 | 75 | 70 | 80 | 75 |
| 2014 | 70 | 70 | 90 | 60 | 70 | 70 | 80 | 60 | 70 | 90 | 75 |
| 2015 | 65 | 65 | 85 | 55 | 60 | 60 | 65 | 70 | 80 | 75 | 75 |

CP, FP, and SP are anatomic pathology sections; CC, HE, MB, and TM are clinical pathology sections; STHP, LA, and ST are special topics sections.

CC, clinical chemistry; CP, cytopathology; FP, forensic pathology; HE, hematology; LA, laboratory administration; MB, microbiology; SP, surgical pathology; ST, special topics; STHP, hematopathology; TM, transfusion medicine.

a All values are presented as percentiles.

Summary statistics are presented as least squares means and standard errors, estimated from linear models controlling for the relevant covariates; the same models were used to assess the effect of the review series. All analyses were performed using SAS version 9.4 (SAS Institute, Cary, NC). Statistical significance was assessed at α = .05.
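As a rough illustration of this kind of analysis (a minimal sketch, not the authors’ SAS code), the snippet below fits an ordinary least squares model in Python with a pre/post indicator and covariates. The grand-total percentiles are taken from Table 2; the USMLE covariate values are hypothetical placeholders, since resident-level scores are not reported in the article, and the published analysis also adjusted for additional covariates such as postgraduate year level.

```python
# Illustrative sketch only: the published analysis used SAS 9.4 with least
# squares means. Grand-total percentiles below come from Table 2; the USMLE
# covariate columns are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

rise = pd.DataFrame({
    "year":         [2010, 2011, 2012, 2013, 2014, 2015],
    "grand_total":  [60,   65,   55,   65,   70,   65],   # Table 2
    "post_review":  [0,    0,    0,    1,    1,    1],    # review series began in 2013
    "mean_step1":   [225,  228,  224,  227,  226,  229],  # hypothetical
    "mean_step2ck": [238,  240,  236,  239,  241,  240],  # hypothetical
})

# Linear model with a pre/post indicator plus covariates; the coefficient on
# post_review is the covariate-adjusted change in mean percentile, analogous
# to the "Change in Percentile" column of Table 3.
fit = smf.ols("grand_total ~ post_review + mean_step1 + mean_step2ck",
              data=rise).fit()
print(fit.params)    # adjusted pre/post difference and covariate effects
print(fit.bse)       # standard errors
print(fit.pvalues)   # t-test P values
```

Note that this sketch models one row per examination year; the authors’ model, with resident-level covariates such as postgraduate year level, would have more observations and different degrees of freedom.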

Results

Following the review series, improvement in RISE performance was documented (Table 3) in the grand total, AP section average, CP section average, and transfusion medicine section. Although not statistically significant, improvement was noted in most other individual sections, with the cytopathology and clinical chemistry sections showing the greatest gains. In addition, the surgical pathology section scores improved from below to above the national average. Interestingly, hematopathology, molecular pathology, and the special topics section average had lower scores following the review series, although the decreases were not statistically significant.

Table 3.

Pre- and Postreview Mean Percentile

| RISE Component | Prereview Percentile, Mean (SE) | Postreview Percentile, Mean (SE) | Change in Percentile (SE) | P Value |
|---|---|---|---|---|
| Grand total | 58.5 (2.54) | 71.5 (2.54) | 13 (4.09) | .007 |
| Cytopathology | 54.4 (5.55) | 70.4 (5.55) | 16 (8.93) | .09 |
| Forensic pathology | 74.6 (4.45) | 83.6 (4.45) | 9 (7.16) | .23 |
| Surgical pathology | 45.2 (3.79) | 53.2 (3.79) | 8 (6.10) | .21 |
| AP average | 58.1 (2.62) | 69.1 (2.62) | 11 (4.22) | .021 |
| Clinical chemistry | 53.6 (4.93) | 68.6 (4.93) | 15 (7.93) | .08 |
| Hematology | 54.8 (6.75) | 59.3 (6.75) | 4.5 (10.85) | .68 |
| Microbiology | 61.9 (5.16) | 70.9 (5.16) | 9 (8.30) | .30 |
| Transfusion medicine | 58.9 (3.81) | 72.8 (3.81) | 13.9 (6.13) | .040 |
| CP average | 57.3 (2.01) | 67.9 (2.01) | 10.6 (3.24) | .006 |
| Hematopathology | 73.7 (7.12) | 70.0 (7.12) | −3.7 (11.46) | .75 |
| Laboratory administration | 74.7 (5.55) | 82.7 (5.55) | 8 (8.93) | .39 |
| Molecular pathology | 80.9 (3.56) | 71.4 (3.56) | −9.5 (5.72) | .12 |
| Special topics average | 76.4 (3.17) | 74.7 (3.17) | −1.7 (5.09) | .74 |

AP, anatomic pathology; CP, clinical pathology; RISE, resident in-service examination.
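As a brief worked check of how the table’s columns relate (assuming, as is standard for linear models, that each P value derives from a t test of the adjusted difference against its standard error): for the grand total, t = 13/4.09 ≈ 3.2, a ratio consistent with the reported P = .007 at the model’s (unreported) degrees of freedom.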

In addition, we have noted an improvement in primary certification rates (Figure 1). Although the sample size is small, with typically five to six graduating residents each year, the mean AP certification rate has increased from 88% (prereview series, n = 17) to 100% (postreview series, n = 16), while the mean CP certification rate has increased from 88% (prereview series, n = 17) to 93% (postreview series, n = 16).

Figure 1. Program RISE grand total percentile and primary certification rates. AP, anatomic pathology; CP, clinical pathology; RISE, resident in-service examination.

Discussion

While in-service examinations generally serve to assess resident knowledge and progression throughout residency, a particular resident’s or program’s in-service examination performance may also play other important roles. Aggregate program performance may be used, in part, to judge the quality and effectiveness of resident recruitment and education, while individual performance may be used, although such use is generally uncommon within pathology, in decisions about promotion, remediation, retention, and permission to moonlight. At our institution, each resident’s performance is typically discussed at his or her subsequent semiannual review with the program director, associate program director, and the resident’s chosen faculty mentor in attendance. Punitive measures have never been imposed on residents because of individual or group performance, although individual results have been used to help residents develop a personalized study plan. Likewise, group performance in individual sections has been used by the faculty to highlight opportunities for improvement in educating residents in those disciplines.

Across specialties, in-service examination performance generally correlates with board certification.4‐9 This was confirmed among pathology residents by Rinder et al,2 who demonstrated a correlation between senior RISE performance and primary board certification. As such, many specialties have sought to identify factors that affect in-service examination performance and methods to enhance performance. As previously mentioned, USMLE Step 1 failure has been correlated with increased risk of failure on the Internal Medicine Board examination.3 Interestingly, Sugar et al10 noted statistically significant improvement in American Board of Surgery In-Training Examination performance among residents who took vacation in January prior to the examination, while Talmon et al11 noted a weak positive correlation between average numbers of hours spent teaching and RISE performance among pathology residents. Prior night call status has not been proven to affect in-service examination performance.10

While knowledge of mitigating and exacerbating factors may be helpful, many program directors and educators seek specific interventions that will improve resident in-service and certification examination performance. Within the general surgery education literature, statistically significant improvement in resident performance was noted after the following interventions: mock in-service examination, increased resident ownership of a structured curriculum, review question completion, direct faculty involvement, protected educational time,12 and mandatory remedial programs.13 To our knowledge, no interventions among pathology residents have been previously described.

The institution of the annual RISE review series has been both subjectively and objectively successful. Although not documented in a formal postparticipation evaluation or survey, residents enjoy the fast-paced, high-yield review sessions, and since the teaching responsibilities are distributed throughout the faculty, the time and effort required are not overwhelmingly burdensome. Objectively, statistically significant improvement in performance was noted in the grand total, AP section average, and CP section average, which likely represent the best simulation of the AP and CP certification examinations, respectively. Improved performance was also noted within eight (80%) of the 10 sections, which represent the majority of pathology subspecialties. Although molecular pathology and hematopathology demonstrated a slight decrease in performance, which affected the special topics average, both were among the top five highest performing sections prior to the review series, perhaps leaving less room for improvement.

In addition to the noted improvement in RISE performance, residents receive distinct educational benefits based on their level of training. While senior residents find this review series very helpful as a summation in preparation for board examinations, first- and second-year residents are exposed to high-yield information relating to services on which they have yet to rotate. These tidbits may prove useful when covering call or interfacing between pathology services. Regardless of the level of resident training, the adage “Repetition is the mother of all learning” certainly holds true.

Conclusion

Institution of a month-long RISE review series demonstrated improved performance within our training program, both overall and within most sections, excluding molecular pathology and hematopathology. Furthermore, the review series has been well received by trainees and attendings, with most sessions emphasizing high-yield material in a fun, interactive environment. The success of this review series could easily be replicated in any training program without disruption to an annual didactic series.

Funding

This work has been supported by funding from the National Institutes of Health: Medical University of South Carolina Clinical and Translational Science Award (UL1 RR029882).

Acknowledgments

We acknowledge the time and effort spent by each faculty member teaching the review series.

References

1. American Society for Clinical Pathology. ASCP’s resident in-service examination (RISE). 2016. https://www.ascp.org/content/residents/rise. Accessed June 7, 2016.
2. Rinder H, Grimes M, Wagner J, et al. Senior pathology resident in-service examination scores correlate with outcomes of the American Board of Pathology Certifying Examinations. Am J Clin Pathol. 2011;136:499-506.
3. Kay C, Jackson J, Frank M. The relationship between internal medicine residency graduate performance on the ABIM certifying examination, yearly in-service training examinations, and the USMLE Step 1 examination. Acad Med. 2015;90:100-104.
4. Kempainen R, Hess B, Addrizzo-Harris D, et al. Pulmonary and critical care in-service training examination score as a predictor of board certification examination performance. Ann Am Thorac Soc. 2016;13:481-488.
5. Lohr K, Clauser A, Hess B, et al. Performance on the adult rheumatology in-training examination and relationship to outcomes on the rheumatology certification examination. Arthritis Rheumatol. 2015;67:3082-3090.
6. Grabovsky I, Hess B, Haist S, et al. The relationship between performance on the infectious diseases in-training and certification examinations. Clin Infect Dis. 2015;60:677-683.
7. Levy D, Dvorkin R, Schwartz A, et al. Correlation of the emergency medicine resident in-service examination with the American Osteopathic Board of Emergency Medicine Part I. West J Emerg Med. 2014;15:45-50.
8. Bedno S, Soltis M, Mancuso J, et al. The in-service examination score as a predictor of success on the American Board of Preventive Medicine Certification Examination. Am J Prev Med. 2011;41:641-644.
9. Baverstock R, MacNeily A, Cole G. The American Urological Association In-Service Examination: performance correlates with Canadian and American specialty examinations. J Urol. 2003;170:527-529.
10. Sugar J, Chu Q, Cole P, et al. Effect of January vacations and prior night call status on resident ABSITE performance. J Surg Educ. 2013;70:720-724.
11. Talmon G, Czarnecki D, Sayles H. Does how much a resident teaches impact performance? A comparison of preclinical teaching hours to pathology residents’ in-service examination scores. Adv Med Educ Pract. 2015;6:331-335.
12. Buckley E, Markwell S, Farr D, et al. Improving resident performance on standardized assessments of medical knowledge: a retrospective analysis of interventions correlated to American Board of Surgery In-Service Training Examination performance. Am J Surg. 2015;210:734-738.
13. Kim R, Tan T. Interventions that affect resident performance on the American Board of Surgery In-Training Examination: a systematic review. J Surg Educ. 2015;72:418-429.
