Abstract
Introduction
Telepathology (TP) allows for remote slide review with performance comparable to traditional light microscopy. Use of TP in the intraoperative setting allows for faster turnaround and greater user convenience by obviating the need for the attending pathologist's physical presence. We sought to perform a practical validation of an intraoperative TP system using the Leica Aperio LV1 scanner in tandem with Zoom teleconferencing software.
Methods
A validation was performed in accordance with recommendations from CAP/ASCP, using a retrospectively identified sample of surgical pathology cases with a 1-year washout period. Only cases with frozen-final concordance were included. Validators underwent training in the operation of the instrument and conferencing interface, then reviewed the blinded slide set annotated with clinical information. Validator diagnoses were compared to original diagnoses for concordance.
Results
60 slides were chosen for inclusion. 8 validators completed the slide review, each requiring approximately 2 h. The validation was completed in 2 weeks. Overall concordance was 96.5%; intraobserver concordance was 97.3%. No major technical hurdles were encountered.
Conclusion
Validation of the intraoperative TP system was completed rapidly and with high concordance, comparable to traditional light microscopy. Institutional teleconferencing implementation driven by the COVID pandemic facilitated ease of adoption.
Keywords: Telepathology, Frozen, Intraoperative, Frozen section, Intraoperative consultation, Teleconferencing, Zoom, Scanner
Introduction
The evolution of telecommunications drives changes in the healthcare landscape, including in the practice of surgical pathology. As a relative latecomer to the field of telemedicine, telepathology (TP) techniques were first investigated in the 1980s and were focused primarily on use in the intraoperative (frozen) setting, where pathologist availability is critical but can be limited by circumstance.1 The diagnostic accuracy of TP methods has increased over a 20–30 year span as technologies have progressively improved. A systematic review performed by Dietz et al in 2020 found concordance rates between TP-derived diagnoses and primary light microscopy diagnoses of paraffinized tissue to be 91.1% for studies before the year 2000, and to be 97.2% for studies on or after the year 2000.2 A unifying feature of these studies is the persistent recommendation to thoroughly train all involved parties on the use of the TP instrument. However, most published validations lacked detailed explanations about user training.
In 2013, the College of American Pathologists (CAP) published guidelines for the validation of WSI systems in anatomic pathology. According to CAP guidelines, every pathology laboratory planning to use TP for clinical diagnostics is required to perform an internal validation.3 The goals of such validation would be to emulate a real-world environment to test whole-system performance and to assess intra- or interobserver variability by establishing diagnostic concordance between digital and glass slides. In an update to these guidelines, the American Telemedicine Association (ATA) and CAP, in conjunction with the American Society for Clinical Pathology (ASCP) and the Association of Pathology Informatics (API), have affirmed specific recommendations for validation of WSI, including using at least 60 cases with a washout period of at least 2 weeks.4 This update also added several good practice statements, including the recommendation to review the validation slide set in a random order.
At the time of this study, our institution employed a dynamic TP system for intraoperative pathologist-to-pathologist consultation, which was solely accessible on campus via intranet. Due to the shortcomings of a dynamic TP system, off-campus accessibility limitations, and the imminent expiration of a third-party digital software contract agreement, we sought to modernize our digital resources for intraoperative consultation. The Aperio LV1 system (Leica Biosystems, Richmond, VA) is a hybrid digital-robotic and WSI instrument with a 4-slide capacity and a relatively small physical footprint. Users can traverse up to 4 slides in real-time through 3 axes of movement with multiple objective lenses. Although external internet streaming functionality was limited on the software build provided, the ubiquitous implementation of videoconferencing via Zoom (Zoom Video Communications, San Jose, CA) brought on by the COVID-19 pandemic provided a promising workaround. We sought to validate this combined system for use in intraoperative consultations, using the above guidelines proposed for WSI validation as a model.
Materials & methods
System and apparatus
At our institution, the Aperio LV1 system is located in the surgical pathology intraoperative consultation room, adjacent to the main anatomic pathology (gross) lab. The benchtop instrument can accept 4 slides at a time, loaded through a horizontally fed tray. This operation must be performed by in-person staff (hereafter referred to as the primary user). Slides are viewed in a live fashion similar to standard bright-field light microscopy. Slides can be navigated in 3 axes of movement (X, Y, and Z-planes) and at multiple magnifications (1.25x, 2.5x [digital], 5x, 10x [digital], 20x, 40x, and 63x [digital]). WSI and image exporting functions are also available. The primary user interacts with the slide through a connected computer workstation using provided proprietary software. The remote user connects to and interacts with the system by means of a persistent Zoom room that is hosted on the connected workstation and operated by the primary user. The connected workstation broadcasts at a 1920 × 1080 pixel resolution. For ease of use, a standing conference invitation link is pre-distributed to all potential remote users. Remote users can directly interact with the software interface by means of Zoom’s “share screen control” function, in a fashion analogous to remote desktop sharing.
Installation qualification and operational qualification were performed by hospital and vendor services; the only notable considerations were access of both the LV1 instrument and the computer workstation to a standard 120V power source, workstation access to a robust internet connection (at least 50 Mbps per manufacturer recommendations), and placement away from hazardous materials, ventilation ducts, and direct sunlight. Performance qualification is addressed below.
Operator training
All pathologist validators and administrating primary users were required to undergo training on the software’s live view workflow (loading and selecting slides, navigating slides, changing magnification, adjusting post-processing such as brightness and contrast, capturing screenshots, and editing annotations), hosting and connecting to the persistent Zoom session, and basic troubleshooting. The training sessions were administered by a Pathologists’ Assistant and recorded on a retained worksheet (supplement A). The Pathologists’ Assistant was trained by Aperio LV1’s Application Specialist during system installation. The training sessions were designed to reflect the instrument’s intended use in future live operation. Namely, all users who interact with the instrument on a clinical basis will be required to complete the same standard user training.
Case selection
Archival patient cases were retrospectively evaluated via computerized records for inclusion, spanning the breadth of general surgical pathology intraoperative services. Subspecialty services of interest included bone & soft tissue, breast, head & neck, gastrointestinal (excluding medical hepatic), genitourinary (excluding medical renal), gynecologic, and thoracic. Medical hepatic and renal cases, such as those evaluating for transplant candidacy, were excluded from this primary validation due to their highly subspecialized nature. Cytology, hematopathology, and neuropathology specimens were similarly excluded from this validation. Only cases signed out greater than 1 year prior were considered, constituting a washout period far stricter than the 2-week period recommended by ASCP and API guidance. All cases considered had a component sent for frozen section evaluation. Cases were selected on the basis of breadth and variable complexity and were chosen to mirror the general frozen volume per surgical service. Exclusion criteria included discrepancy between original intraoperative and final diagnosis and poor slide quality due to age or mishandling.
Performance of validation
For each case, the following historic parameters were recorded:
1. Original intraoperative diagnosis. The initial intraoperative diagnosis rendered by an in-person attending pathologist using standard light microscopy. At our institution, intraoperative consultations are provided by surgical pathologists of all subspecialty training backgrounds (so-called general coverage).
2. Pathologist rendering original intraoperative diagnosis. Historic intraoperative diagnoses were rendered by a total of 17 pathologists.
3. Original final diagnosis. The final diagnosis, then rendered by a subspecialty attending pathologist using standard light microscopy.
4. Pathologist rendering original final diagnosis. Historic final diagnoses were rendered by a total of 15 pathologists.
5. Subspecialty service. The surgical pathology subspecialty service to which the case was assigned, and in which the pathologist who rendered the final diagnosis is specialized.
6. Gold-standard diagnosis. Since there was no discordance between intraoperative and final diagnosis for any case (see exclusion criteria above), the gold-standard diagnosis for the purpose of this validation was considered to be either the intraoperative or final diagnosis.
Following case selection, all slides for each case were pulled from departmental archives. A single frozen slide from each case was selected on the basis of intactness and featuring the pathology described in the original intraoperative diagnosis. These frozen slides were then randomized and de-identified. Slide labels were covered, as the LV1 displays a slide label snapshot during normal use. A worksheet including patient age, sex, brief clinical history, and specimen designation was provided to each validating pathologist (supplement B). Each validating pathologist reviewed the slide set from a remote location, such as an office or home workstation, with a primary user loading the instrument. The remote workstation was required to meet certain minimum requirements, including a display resolution of at least 1920 × 1080 pixels and at least 13” screen size. A mouse or trackpad was required for remote screen control functions, as touchscreen controls (such as on a smartphone or tablet) were incompatible. The workstation was also required to have audio input (microphone) and output (speakers or headphones) for communication between the remote and primary users, as well as a reliable internet connection (no more than 1% expected downtime).
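The randomization and de-identification step above can be sketched as follows. This is an illustrative sketch only; the case identifiers and the "V##" study-ID scheme are hypothetical, not the study's actual system.

```python
# Illustrative sketch of slide randomization/de-identification for a
# blinded validation set; identifiers and naming scheme are hypothetical.
import random


def blind_slide_set(case_ids, seed=None):
    """Shuffle cases into a random review order and assign opaque study IDs.

    Returns the key mapping study ID -> original case. In practice the key
    would be retained by the study coordinator, not shared with validators.
    """
    rng = random.Random(seed)
    shuffled = list(case_ids)
    rng.shuffle(shuffled)  # random review order, per good-practice guidance
    return {f"V{i + 1:02d}": case for i, case in enumerate(shuffled)}


key = blind_slide_set(["S21-1001", "S21-1002", "S21-1003"], seed=7)
```

Fixing a seed makes the blinded ordering reproducible for audit while keeping validators unaware of the original case identities.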
Validating pathologists were given remote screen control and rendered a diagnosis for each case. The diagnosis was documented on the provided worksheet. The principal investigating pathologist reviewed the diagnoses and compared them to the gold-standard diagnoses in order to determine concordance. Major discordance was defined as a change in diagnosis such that a change in intraoperative patient management was likely to have occurred (e.g., from benign to malignant or vice versa). Minor discordance was defined as a change in diagnosis such that a change in intraoperative patient management was not likely to have occurred (e.g., change in subtype of malignant diagnosis). Only major discordance was considered as significantly discordant for the purposes of this validation. Percent concordance for each pathologist was determined. A goal for both individual pathologist concordance and average overall concordance was set at a minimum of 90% for validation to be considered successful.
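The concordance bookkeeping described above can be expressed as a short sketch. This is a minimal illustration under our own assumptions; the record structure and labels are ours, not the study's actual software.

```python
# Minimal sketch of the concordance criteria described above: minor
# discordance does not count against concordance; both each pathologist
# and the overall average must reach the 90% threshold.
from dataclasses import dataclass

CONCORDANT, MINOR, MAJOR = "concordant", "minor", "major"


@dataclass
class Review:
    pathologist: int
    case_id: int
    discordance: str  # one of CONCORDANT, MINOR, MAJOR


def percent_concordance(reviews):
    """Only major discordance is considered significantly discordant."""
    ok = sum(1 for r in reviews if r.discordance != MAJOR)
    return 100.0 * ok / len(reviews)


def validation_passes(reviews, threshold=90.0):
    """True if every pathologist and the overall set meet the threshold."""
    by_path = {}
    for r in reviews:
        by_path.setdefault(r.pathologist, []).append(r)
    per_path = [percent_concordance(rs) for rs in by_path.values()]
    return min(per_path) >= threshold and percent_concordance(reviews) >= threshold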
Results
General surgical pathology cases
A total of 60 cases spanning 7 subspecialty services (Table 1, supplemental Table 1) were reviewed by 8 pathologists who normally serve as general coverage for the intraoperative consultation (frozen) service, constituting 40% of pathologists on this service. Validating pathologists generally required 2 h of one-on-one review to complete the set. The validation was completed in 2 weeks. Overall, there were 480 unique slide reviews. No deferred or referred diagnoses were rendered. Frozen case re-exposure (when the validating pathologist was the same as the pathologist rendering the original intraoperative diagnosis) occurred in 37 (7.7%) of these reviews. Final case re-exposure (when the validating pathologist was the same as the pathologist rendering the original final diagnosis) occurred in 27 (5.6%) of these reviews. Most slide reviews, 420 (87.5%), had no case re-exposure.
Table 1.
General validation cases by subspecialty.
| Subspecialty service | # Cases | # Cases reviewed concordantly by all pathologists (% total) | # Cases reviewed discordantly by at least one pathologist (% total) |
|---|---|---|---|
| Gastrointestinal | 16 | 16 (100%) | 0 (0%) |
| Thoracic | 7 | 6 (85.7%) | 1 (14.3%) |
| Gynecologic | 7 | 6 (85.7%) | 1 (14.3%) |
| Breast (all sentinel nodes) | 6 | 5 (83.3%) | 1 (16.7%) |
| Genitourinary | 4 | 3 (75.0%) | 1 (25.0%) |
| Head & Neck | 15 | 10 (66.7%) | 5 (33.3%) |
| Bone & Soft Tissue | 5 | 3 (60.0%) | 2 (40.0%) |
| Total | 60 | 49 (81.7%) | 11 (18.3%) |
Of 480 slide reviews, 463 (96.5%) were concordant. Of the 37 cases of frozen case re-exposure, constituting an intrapathologist comparison, there was only 1 discordant review, constituting a 97.3% (36/37) intrapathologist concordance rate. Of the 443 reviews without frozen case re-exposure, 16 were discordant, constituting a 96.4% (427/443) inter-pathologist concordance rate.
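The rates reported above follow directly from the review counts; as a quick arithmetic check (a sketch, with the counts copied from the text):

```python
# Arithmetic check of the reported concordance rates.
total_reviews = 480
concordant = 463
frozen_reexposure = 37             # reviews forming an intrapathologist comparison
frozen_reexposure_discordant = 1
no_reexposure = total_reviews - frozen_reexposure   # 443
no_reexposure_discordant = 16

overall = 100 * concordant / total_reviews
intra = 100 * (frozen_reexposure - frozen_reexposure_discordant) / frozen_reexposure
inter = 100 * (no_reexposure - no_reexposure_discordant) / no_reexposure

print(round(overall, 1), round(intra, 1), round(inter, 1))  # 96.5 97.3 96.4
```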
Individual pathologist concordance rates ranged from 95.0% (57/60) to 98.3% (59/60) (Table 2, supplemental Table 2). 49 cases (81.7%) were diagnosed concordantly by all validating pathologists. 11 cases (18.3%) were interpreted discordantly by at least one pathologist. Of these, 2 were interpreted discordantly by multiple pathologists. The subspecialty service with the highest rate of discordant review was head & neck, followed by genitourinary and bone & soft tissue (Table 1).
Table 2.
General validation cases by pathologist.
| Pathologist | Number of concordant cases (% total) |
|---|---|
| 1 | 58/60 (96.7%) |
| 2 | 57/60 (95.0%) |
| 3 | 58/60 (96.7%) |
| 4 | 59/60 (98.3%) |
| 5 | 59/60 (98.3%) |
| 6 | 57/60 (95.0%) |
| 7 | 57/60 (95.0%) |
| 8 | 58/60 (96.7%) |
| Overall | 463/480 (96.5%) |
The 2 cases diagnosed discordantly by more than 1 pathologist were: (1) A biopsy of periurethral tissue in a patient undergoing resection of penile squamous cell carcinoma, originally diagnosed as carcinoma in situ. 3 of 8 pathologists diagnosed reactive atypia rather than dysplasia/carcinoma in situ. (2) An axillary sentinel lymph node in a patient with treated grade 1 invasive ductal carcinoma, originally diagnosed as a <0.5 cm focus of metastatic carcinoma in a background of biopsy (clip) site changes and treatment effect. 5 of 8 pathologists commented on the foreign body granuloma but did not identify the small focus of carcinoma. Over a year after the main study, the glass slides for these 2 cases were subjected to blinded re-review by the same validators using traditional light microscopy. Concordance rate upon re-review was 50% for both cases. Notably, intraobserver agreement between the main validation and this re-review was also 50%, indicating these cases to be intrinsically challenging.
In 2 out of the 8 validator sessions, the instrument or software interface froze or transiently malfunctioned. Restarting both systems resolved the interruption. One validator experienced transient difficulty in using the remote share screen function of Zoom, which resolved spontaneously.
Discussion
We found the overall performance of the Leica Aperio LV1 with Zoom video conferencing to be comparable to that reported in other TP studies comparing telemicroscopy and traditional light microscopy. Our overall concordance of 96.5% is similar to the 95.1% concordance reported in a large comparison of digital WSI to traditional light microscopy.5 Intra- and interobserver concordance did not appear to differ significantly; however, our limited sample size restricts assessment of intraobserver variability. Identifying meaningful differences between organ systems in this intraoperative setting likewise requires greater data collection.
Our implementation and validation of the combined system was executed quickly and with a minimum of cost beyond the capital investment of the instrument. We did not encounter any technical limitations that significantly impeded this implementation—essentially all computer workstations met our stated requirements, and the network demands placed by teleconferencing were well within routine expectations of our hospital internet systems. All technical issues encountered were quickly corrected by instrument or software rebooting.
Telemicroscopy offers many immediate and cost-effective benefits to surgical pathology workflows. The earliest efforts used only static TP methods, in which still histologic images were transmitted over remote and closed connection, due to limitations in data transmission bandwidth and the underdevelopment of robotic methods at that time. Control could only be exerted by the remote pathologist indirectly through the help of on-site personnel. Dynamic TP methods, which use live video streaming, generally combined with live telephone or voice-over-IP (VoIP) communication, allow for faster histologic evaluation and greater user satisfaction.6 Image quality is of paramount importance, and poor quality often correlated with misdiagnosis in early studies.7 Dynamic robotic (DR) TP methods emerged early in the 21st century. These technologies were defined by allowing a remote pathologist the ability to directly control the microscopic examination of a slide, including scanning and selection of fields, adjusting magnification, and sometimes selection or exchange of entire slides. Early studies also utilized open web-based protocols for remote connection.8 Finally, whole slide imaging (WSI) emerged, enabling scanning of an entire histologic slide, often at multiple magnifications, to a high-resolution-format image file that can be reviewed by a networked remote user. WSI has vast applications, both for primary diagnosis, where it has been demonstrated as comparable to traditional light microscopy,5 and for the intraoperative setting, where it confers the ability for multiple users to review a slide simultaneously without having to share common viewfields. 
Other benefits to have emerged from the development of these digital methods include saving of live images for QA/QC efforts and improving access to subspecialty consultation.9 Although medical hepatic and renal pathology were excluded from our study, access to a remote pathologist seems of most salience in evaluation of tissue for transplant during organ procurement. A meta-analysis of studies in this context has highlighted TP’s efficacy, owing mainly to lack of available pathologists in all procurement settings and a lack of familiarity by general pathologists with transplant pathology.10
More generally, removing the requirement for in-person staffing by an attending pathologist allows for greater flexibility in providing accurate and reproducible intraoperative diagnoses in both routine and extenuating practice settings. Dynamic robotic systems enhance this flexibility by offering the remote user nearly full control over instrument operation, in our case greatly improving ease of use over our prior dynamic video-feed TP system. Under that prior system, the primary operator was responsible for microscope operation, reducing control for the remote user and potentially introducing bias from the primary user responsible for moving the slide. Conferring the remote telepathologist full control has been shown to shorten slide review time and improve frozen section turnaround time.11
Remote live view was achieved through use of video teleconferencing software with shared screen control functions, in this case Zoom, in a manner ensuring compliance with Health Insurance Portability and Accountability Act (HIPAA) requirements. Following the start of the nationwide emergency brought about by the COVID-19 pandemic, the US Department of Health and Human Services issued a notification of enforcement discretion for telehealth remote communications. In this notice, the Office for Civil Rights indicated an exercise of enforcement discretion for those telehealth systems that may not be compliant with all regulatory requirements under HIPAA. Those seeking additional privacy protections are advised to enter into business associate agreements with vendors that are HIPAA compliant, including Zoom, Skype/Microsoft Teams, Cisco Webex, and GoToMeeting; a longer list of approved vendors is available online. Institutional licenses for Zoom teleconferencing or equivalent competitors have become ubiquitous in most healthcare settings, and most hospital-based pathology practices will not need to seek their own business associate agreements.
This validation demonstrated good concordance between light microscopy and robotic TP and was determined to be safe for clinical application in intraoperative consultation. However, this method does suffer from certain limitations. Slide preparation is not different from our routine workflow, and as such it is still susceptible to pre-analytic errors. Proper tissue accessioning, labeling, and sampling are essential, and slide preparation should be performed to minimize slide artifacts including tissue folds and air bubbles. In our experience, frozen tissue sections greater than 4 microns in thickness yielded poor histomorphology in intraoperative evaluation using either TP or traditional light microscopy. Prior instrument training of all involved personnel on both intraoperative workflow and TP instrument use is essential to minimize the rate of diagnostic errors stemming from primary or remote user inexperience.7 Troubleshooting of instrument or software errors should be included in this training. Access to on-site personnel capable of troubleshooting should be ensured at all service hours. In addition, many early studies identified a possible increase in turnaround time relating to operator inexperience and various technical limitations including bandwidth and difficult viewfinding.9,11,12 Turnaround time in live clinical practice was not captured during our validation, which was performed in controlled settings. However, general technological advances in file transfer, streaming, and user interfaces are expected to improve turnaround time.
Following this validation, the Aperio LV1 system was approved for histologic diagnosis in intraoperative consultation for all users having completed operator training. Operator training is also required of all primary users, including trainees (residents and fellows) and surgical pathology staff (technicians and Pathologists’ Assistants) who may operate the instrument for clinical use, and has become standardized as part of our surgical pathology onboarding training. The majority of intraoperative frozen service pathologists continue to prefer in-person coverage under normal circumstances, citing the benefit of gross examination of specimens and overall comfort levels with conventional glass microscopy. The instrument was also received well by technical staff. The LV1 can be left on overnight, eliminating the need for daily restarting of hardware or software. The persistent Zoom conference room only requires regular checks for uptime and necessary software updates. Better remote access also reduced technical staff burden in locating primary or secondary pathologists.
Remote review of gross specimens, so-called telemacropathology, is not addressed in our study. However, anecdotally, gross review has been achieved by specimen photography, person-to-person video chat (such as FaceTime), and videoconferencing via Zoom through the use of the MacroPATH imaging system (Milestone Medical, Kalamazoo, MI). The instrument has seen greater use in secondary (subspecialty) consultation, when guidance is sought for a case by the on-site pathologist, often in an area outside their subspecialty.
Use of the LV1 for either primary diagnosis or secondary consultation is tracked internally on intraoperative consultation worksheets and is reviewed at monthly quality management meetings. These data will be used to identify any impacts on turnaround time as well as to detect emergent problems with instrument performance. We anticipate modest time increases in routine settings, reflecting instrument loading and establishing the remote connection as additional steps relative to the in-person intraoperative workflow. We expect this to worsen in complicated consultations that require many cut slides or that have many frozen parts. However, this may be offset by improved turnaround in cases where pathologist unavailability would cause significant delays.
Although implementation and validation of the Leica Aperio LV1 robotic microscope was not initially driven by the onset of the COVID-19 pandemic, the workplace changes widely driven by the pandemic clearly highlighted the benefit of telepathology in the intraoperative setting, and the widespread increase in computer and teleconferencing literacy made adoption straightforward. Performance qualification was completed quickly and at low cost and effort, with easy adoption by participating pathologists. Concordance between light microscopy and remote live view was high (96.5%), facilitating adoption of the LV1 system into routine clinical practice.
Author Contribution
TC contributed to the design of the study, collected data, interpreted results, and wrote the manuscript.
FS contributed to the design of the study, collected data, interpreted results, and wrote the manuscript.
MA contributed to the design of the study and collected data.
RP contributed to the design of the study and collected data.
CN contributed to the design of the study and collected data.
CG contributed to the design of the study and collected data.
NC designed the study, collected data, interpreted results, and wrote the manuscript.
Declaration of interests
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
We would like to thank our dedicated pathologist validators who contributed to this project, including Drs. Tatjana Antic, Jennifer Bennett, Anthony Chang, Kammi Henriksen, Aliya Husain, Ricardo Lastra, and Husain Sattar.
Footnotes
Supplementary data to this article can be found online at https://doi.org/10.1016/j.jpi.2023.100194.
Appendix A. Supplementary data
Supplementary material 1
Supplementary material 2
Supplementary material 3
References
- 1. Weinstein R.S. Prospects for telepathology. Hum Pathol. 1986;17:433–434. doi:10.1016/s0046-8177(86)80028-4.
- 2. Dietz R.L., Hartman D.J., Pantanowitz L. Systematic review of the use of telepathology during intraoperative consultation. Am J Clin Pathol. 2020;153:198–209. doi:10.1093/ajcp/aqz155.
- 3. Pantanowitz L., Sinard J.H., Henricks W.H., et al. Validating whole slide imaging for diagnostic purposes in pathology: guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med. 2013;137:1710–1722. doi:10.5858/arpa.2013-0093-CP.
- 4. Pantanowitz L., Dickinson K., Evans A.J., et al. American Telemedicine Association clinical guidelines for telepathology. J Pathol Inform. 2014;5:39. doi:10.4103/2153-3539.143329.
- 5. Mukhopadhyay S., Feldman M.D., Abels E., et al. Whole slide imaging versus microscopy for primary diagnosis in surgical pathology: a multicenter blinded randomized noninferiority study of 1992 cases (pivotal study). Am J Surg Pathol. 2018;42:39–52. doi:10.1097/PAS.0000000000000948.
- 6. Baak J.P., van Diest P.J., Meijer G.A. Experience with a dynamic inexpensive video-conferencing system for frozen section telepathology. Anal Cell Pathol. 2000;21:169–175. doi:10.1155/2000/908426.
- 7. Stauch G., Schweppe K.W., Kayser K. Diagnostic errors in interactive telepathology. Anal Cell Pathol. 2000;21:201–206. doi:10.1155/2000/154031.
- 8. Kaplan K.J., Burgess J.R., Sandberg G.D., Myers C.P., Bigott T.R., Greenspan R.B. Use of robotic telepathology for frozen-section diagnosis: a retrospective trial of a telepathology system for intraoperative consultation. Mod Pathol. 2002;15:1197–1204. doi:10.1097/01.MP.0000033928.11585.42.
- 9. Horbinski C., Hamilton R.L. Application of telepathology for neuropathologic intraoperative consultations. Brain Pathol. 2009;19:317–322. doi:10.1111/j.1750-3639.2009.00265.x.
- 10. Eccher A., Girolami I., Brunelli M., et al. Digital pathology for second opinion consultation and donor assessment during organ procurement: review of the literature and guidance for deployment in transplant practice. Transplant Rev (Orlando). 2020;34. doi:10.1016/j.trre.2020.100562.
- 11. Hufnagl P., Guski H., Hering J., et al. Comparing conventional and telepathological diagnosis in routine frozen section service. Diagn Pathol. 2016;2.
- 12. Adachi H., Inoue J., Nozu T., Aoki H., Ito H. Frozen-section services by telepathology: experience of 100 cases in the San-in District, Japan. Pathol Int. 1996;46:436–441. doi:10.1111/j.1440-1827.1996.tb03634.x.