Author manuscript; available in PMC 2021 Aug 5.
Published in final edited form as: J Cutan Pathol 2015 May 20; 42(10): 779–781. doi: 10.1111/cup.12503

Integrating virtual dermatopathology as part of formative and summative assessment of residents: a feasibility pilot study

Ryan Gertz 1, B Jack Longley 2, Daniel Bennett 2, Erik Ranheim 3, Victoria Rajamanickam 4, Tisha Kawahara 2, Justin Endo 2
PMCID: PMC8341838; NIHMSID: NIHMS1725951; PMID: 25990466

To the Editor,

We are writing in response to the innovative articles by Brick et al. and Mooney et al. on employing virtual dermatopathology (DP) for clinical use and pathology education.[1,2] Because comparative studies support the diagnostic equivalence of virtual and conventional microscopy, virtual DP has been implemented in the dermatology certifying examination.[3] The literature on implementing this technology in postgraduate education has reported mixed results: Brick et al. noted no preference for glass or virtual slides among their residents,[1] whereas Koch et al. found that residents preferred virtual DP to conventional slides as a regular study aid but not as a testing medium.[3] Given the rising use of virtual DP on certifying examinations and differing resident attitudes toward this technology for formative (e.g. low-stakes pop quiz, self-assessment) vs. summative (e.g. high-stakes examination) purposes, we conducted a study to explore the feasibility and acceptability of implementing this technology for residents for both formative and summative purposes.[1,3]

This research project was conducted under University of Wisconsin-Madison Minimal Risk Institutional Review Board (IRB) approval number M-2012-0466. Dermatology and pathology residents at the University of Wisconsin-Madison consented to participate. Our prior curriculum consisted of self-directed reading with review of corresponding glass slides, followed by 'sign out' time with a dermatopathologist. An anonymous pretest survey was administered to assess current and 'ideal' DP study habits as well as perceptions of virtual DP (free-text and Likert-scale ratings). Next, residents completed a 21-item multiple-choice, single-correct-answer self-assessment quiz that tested recognition of inflammatory and neoplastic histopathology using whole-slide scanned images (Aperio Technologies, Inc, Vista, CA, USA) on a proprietary, server-based learning platform (University of Wisconsin, Madison, USA). Residents were permitted to take the test at any time over a 2-week period and at any location, using the computer, operating system and internet browser of their choice. A posttest survey assessed difficulties experienced during the test as well as perceptions of virtual DP. Scores were linked to survey responses by a study coordinator and given as aggregated, deidentified data to the investigators. A statistician (VR) performed chi-square tests for the comparison analyses. Poststudy semistructured debriefing sessions with dermatology and pathology residents were conducted to discuss the study results and obtain further qualitative data. JE and RG performed qualitative analysis of the free-text survey items and the poststudy debriefing sessions using open coding and the constant comparative method.[4]
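As a rough illustration of the comparison analysis described above, a chi-square test of independence can be sketched as follows. This is a minimal sketch, assuming the use of SciPy; the 2 × 2 counts are hypothetical placeholders, not the study's actual data:

```python
# Hedged sketch of a chi-square test of independence.
# Rows: residents who used virtual DP more than once in the prior 6 months (yes/no).
# Columns: quiz items answered correctly vs. incorrectly (pooled).
# All counts below are HYPOTHETICAL, for illustration only.
from scipy.stats import chi2_contingency

table = [
    [115, 74],   # frequent users: correct, incorrect (hypothetical)
    [97, 123],   # infrequent users: correct, incorrect (hypothetical)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```

A p-value below the conventional 0.05 threshold would, as in the study's reported comparison (p = 0.044), suggest an association between recent virtual DP use and quiz performance, though not causality.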

Twenty-seven residents submitted a pretest survey; 20 residents completed the self-assessment virtual DP quiz (mean completion time 85.7 min); 18 residents completed the quiz and both pretest and posttest surveys. Only nine residents who completed the pretest survey (33%) had used virtual DP more than once in the preceding 6 months. Despite this low current use, 22 residents (81%) expressed a desire to use virtual DP more frequently for studying. The degree of utilization was not associated with year of training. Residents who had used virtual DP more than once in the previous 6 months had better average scores on the quiz than those who had not (61 vs. 44%, p = 0.044). The most common overall study modalities reported were reviewing glass slides from the dermatopathologist's study set and textbook reading. However, the most commonly reported ideal methods of studying were reviewing one's own biopsies, reviewing glass slides from the study set, and textbook reading.

Figure 1 illustrates resident posttest preferences for glass slides or virtual DP in contexts ranging from low stakes to high stakes. For lower-stakes purposes, glass slides and no preference were the most common responses. However, for the board certifying examination (high stakes), no preference and virtual DP were the most frequent responses. No statistically significant differences were found between the pretest and posttest surveys regarding preferences for virtual DP. Navigation was a key difficulty, reported by 72% of residents.

Fig. 1.

Posttest survey responses of residents’ preferences for using virtual dermatopathology (vs. glass slides) for board certifying exam, low to medium stakes formative assessment quizzes, low-stakes weekly unknown cases that supplement assigned topic readings, and self-directed study.

The poststudy debriefing session comments helped clarify learner ambivalence toward virtual DP. Residents liked the potential flexibility of virtual DP for remote access to study sets, unknown slides or quizzes. Many also emphasized a desire for decentralized board examinations, obviating the need for a single test center with glass slides. The most common reason for preferring virtual DP for the board certifying examination was the variability of glass slide quality. However, residents expressed reservations about the certifying examination's virtual DP platform: a fixed view; the inability to pan, zoom or adjust the field of focus; slow image loading times; and low image resolution.

Our study is limited by a small sample size. However, this mixed-methods study provides important insights into how virtual DP might be integrated into a resident curriculum. Navigation difficulty potentially introduces measurement error unrelated to DP knowledge and skill (construct irrelevance). Other important factors affecting learners' virtual DP experiences are the quality of workstation hardware and the internet connection. Our study also questions the common assumption that today's learners are early adopters of technology, as evidenced by the relatively low number of residents who currently use or prefer virtual DP.[5] Our results raise the possibility that the frequency of using virtual DP as a self-study tool might affect learners' performance on virtual DP assessments. However, our study design cannot prove causality (e.g. virtual DP use could be associated with a higher volume of reviewed cases or greater interest in DP). Koch et al. concluded that prior virtual DP experience did not affect performance on glass or virtual DP quizzes; however, unlike our study, they did not ask about the recency of virtual DP experience.[3]

While virtual DP has many theoretical advantages over glass slides, cost remains a barrier. As of 2013, a small scanner costs $50,000 plus maintenance and licensing (unpublished correspondence, William Lidwin). As the technology becomes cheaper, faster and more widely available, further study will be important to determine how virtual DP can be integrated into the curriculum in a cost-effective manner.

Acknowledgments

We thank William Lidwin for technical assistance in creating the quiz, the University of Wisconsin Division of Information Technology (DoIT) staff for help with the online testing portal, and Drs. Scott Florell and Chong Foo for consultation about their experiences with the Aperio system. This study would not have been possible without the participation of the dermatology and pathology residents at the University of Wisconsin. This project was supported by the Clinical and Translational Science Award (CTSA) program, through the NIH National Center for Advancing Translational Sciences (NCATS), grant UL1TR000427. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

References

1. Brick KE, Sluzevich JC, Cappel MA, DiCaudo DJ, Comfere NI, Wieland CN. Comparison of virtual microscopy and glass slide microscopy among dermatology residents during a simulated in-training examination. J Cutan Pathol 2013; 40: 807.
2. Mooney E, Kempf W, Jemec GB, Koch L, Hood A. Diagnostic accuracy in virtual dermatopathology. J Cutan Pathol 2012; 39: 758.
3. Koch LH, Lampros JN, Delong LK, Chen SC, Woosley JT, Hood AF. Randomized comparison of virtual microscopy and traditional glass microscopy in diagnostic accuracy among dermatology and pathology residents. Hum Pathol 2009; 40: 662.
4. Harris I. What does "the discovery of grounded theory" have to say to medical education? Adv Health Sci Educ Theory Pract 2003; 8: 49.
5. Linder JA, Rigotti NA, Schneider LI, et al. Clinician characteristics and use of novel electronic health record functionality in primary care. J Am Med Inform Assoc 2011; 18(Suppl. 1): i87.
