Abstract
Background
In recent years, there has been a surge of interest in clinical digital pathology (DP). Hardware and software platforms have matured and become more affordable, and advances in artificial intelligence promise to transform the practice of pathology. At our institution, we have launched a stepwise process of DP adoption that will eventually encompass our entire workflow. Out of necessity, we began by establishing a whole slide imaging (WSI)-based frozen section service.
Methods
We proceeded in a systematic manner by first assembling a team of key stakeholders. We carefully evaluated the various options for digitizing frozen sections before deciding that a WSI-based solution made the most sense for us. We used a formalized evaluation system to quantify performance metrics that were relevant to us. After deciding on a WSI-based system, we likewise carefully considered the various whole slide scanners and digital slide management systems available before making decisions.
Results
During formal evaluation by pathologists, the WSI-based system outperformed competing platforms. Although implementation was relatively complex, we have been happy with the results and have observed significant improvements in our frozen section turnaround time. Users have also been pleased with the slide management system, which we plan to use in future DP efforts.
Conclusions
There are various options for digitizing frozen section slides. Although WSI-based systems are more complex and expensive than some alternatives, they perform well and may make sense for institutions with a pre-existing or planned larger DP infrastructure.
Keywords: Digital pathology, Frozen sections, Whole slide imaging
Background
In recent years, digital pathology (DP) has experienced a surge in interest. An increasing number of pathology departments have converted to a purely digital workflow, and many more have incorporated digital pathology in some capacity.1–5 The acceleration in DP adoption is due to a convergence of factors, including competition-driven price decreases for hardware and storage, the availability of robust and user-friendly software, and regulatory advances.6–8 The ongoing COVID-19 pandemic appears to have spurred adoption, as the utility of remote pathology signout became obvious5,9,10; CLIA restrictions surrounding remote signout have been relaxed due to the pandemic.11
The prospect of using computational pathology software tools on digital pathology images is exciting and is likely to transform the practice of pathology.12–16 Deep-learning-based “artificial intelligence” (AI) tools are being developed and deployed at a rapid pace17–21; the first FDA approval was recently granted to such a tool, for AI-assisted detection of prostate cancer.22 As more such tools come to market, facilitating increased pathologist productivity and accuracy, interest in DP will only grow.
At our institution, we have embarked on a stepwise plan for deploying DP, with the ultimate goal of a fully digital workflow. City of Hope is an NCI-designated cancer treatment center and research institute in Southern California, with one 217-bed inpatient hospital, 21 clinical network sites, and a soon-to-open second ambulatory cancer center located 40 miles away.
We opted to begin by implementing a digital remote frozen section service. This decision was spurred by the announcement of the new ambulatory cancer center. This facility will host operating rooms that will need frozen section support, but the initial volume of pathology cases will not be enough to justify stationing a full-time surgical pathologist on-site. In addition, our main campus operating rooms are in separate buildings from our pathology grossing room, requiring manual couriering of specimens. This process adds 10–20 min to our frozen section turnaround time. We therefore decided to first develop the capability for remote frozen sections on our main campus, and use the lessons learned to replicate the infrastructure at our new cancer center. We ultimately decided to deploy a whole slide-based platform, but other possibilities were considered.
In this paper, we will discuss the process of implementing this digital remote frozen section service, with an emphasis on our logistical and technical decision-making and lessons learned.
Methods
Assembling a team
Before any major decisions were made, we assembled a team composed of pathologists with subject matter expertise (“SMEs”), laboratory executive leadership (including our department chair), and representatives from information technology (IT). The IT department identified a project manager (PM) who was responsible for coordinating meetings, seeking additional expertise when needed, and ensuring adherence to project timelines. The PM proved invaluable in shepherding the project and allowing the SMEs to concentrate on concerns most relevant to the end users. The IT department also provided a dedicated user support specialist to provide ongoing assistance after the go-live date. Although the team grew and shrank over the course of the project as needed, the core group remained intact and provided necessary continuity.
Choosing a platform
There are a multitude of solutions to the problem of remote pathology diagnosis, i.e., telepathology.23 Indeed, telepathology has been practiced for decades, predating the widespread availability of whole slide imaging (WSI).24–26 Broadly, the available solutions can be grouped into 3 categories: (1) static image-based systems, (2) real-time telepathology, and (3) whole slide imaging-based systems. Several hybrid systems exist that incorporate both whole slide imaging and real-time remote telepathology in a “live view” (LV) mode. These technologies have all been previously described in detail (e.g., in reference 23).
We quickly decided that full remote control by the interpreting pathologist was a core requirement for patient safety reasons; our pathologists also strongly preferred full control. Static image-based systems were therefore excluded, as were remote telepathology systems that did not allow full remote control. We thus opted to formally evaluate an LV remote telepathology system against a WSI-based system, choosing the Leica Aperio LV1 device, with which we had some prior experience.
Even before formal evaluation, the two platforms offered clear advantages and disadvantages. The live view device enables full remote viewing and control of up to 4 glass slides. The viewing pathologist can control magnification and can pan the slide view. Focus can also be controlled in real-time, which is not possible with WSI systems. This feature can be helpful when evaluating slides with three-dimensionality such as cytologic preparations (e.g., touch preparations, smears) or slides with tissue folds. Such specimens are occasionally encountered on our frozen section service, mostly in the form of neuropathology cytologic preparations.
Since LV devices simply provide a temporary remote view of the slide and do not generate any digital images, the network integration requirements are minimal, and the overall cost is lower when compared to a whole slide-based system.
By contrast, whole slide imaging-based systems are significantly more complex. At a minimum, network storage and an image management system (IMS) are required to view and manage the digitized slides. Integration with the hospital electronic health record (EHR) is usually desired, which entails additional complexity, effort, and cost. Multiple focal planes can be captured as a “z stack”, but this significantly increases scan time and file size. We hypothesized that a WSI-based system would offer the best image quality and overall user experience, but we wanted to formally test this hypothesis.
We needed to choose a whole slide scanner suitable for digitizing frozen section slides. For this application, we aimed to prioritize scanning speed, image quality, and reliability. High capacity was not required. We had been using the Roche/Ventana DP200 scanner in our department for several years for miscellaneous non-clinical scanning tasks and were satisfied with its performance in terms of visual quality and scanning speed. The slide handling features of this scanner also appeared well suited to wet-mounted frozen section slides (e.g., slides held in place in a tray—see Fig. 1). Therefore, we chose this scanner model for further evaluation.
Fig. 1.
Roche/Ventana DP200 slide handling.
Slides are loaded into slots in a plastic tray (A) and are securely held in place as the tray is loaded into the scanner (B). This arrangement avoids individual slide handling by the scanner and works well for frozen section slides, which may contain incompletely dried mounting media.
An image management system (IMS) was required to view and organize the scanned whole slide images. Our Department of Radiology had been using the Sectra imaging system for several years, and our institution had purchased the optional Sectra digital pathology module. While we had no personal experience with this system, we were aware of other institutions using Sectra as an IMS.4,27,28 We did have combined experience with several other IMSs, including Inspirata (Inspirata, Inc., Tampa, FL) and the Philips IMS (Koninklijke Philips N.V., Amsterdam, Netherlands). Since Sectra was already available to us, we chose to formally evaluate its capabilities before considering other options.
Network architecture and storage
We estimated our storage needs based on historical frozen section volume and real-world average file sizes (scanned at 20X magnification). Since this was a relatively modest amount (approximately 6 TB/year), we were able to utilize existing storage capacity already procured for Sectra Radiology. This was a high speed (flash-based, “Tier 1”) storage pool hosted on a network-attached storage (NAS) platform with replication to an off-site data center.
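The sizing arithmetic itself is simple. The following Python sketch uses invented slide counts and file sizes (our actual historical figures are not reproduced here), chosen only to illustrate the order-of-magnitude calculation:

```python
# Back-of-the-envelope storage sizing for scanned frozen sections.
# The slide volume and file size below are illustrative placeholders,
# not our actual historical figures.

ANNUAL_FROZEN_SLIDES = 4_000   # hypothetical slides scanned per year
AVG_FILE_SIZE_GB = 1.5         # hypothetical mean WSI size at 20X
REPLICATION_FACTOR = 2         # primary copy plus off-site replica

def annual_storage_tb(slides: int, avg_gb: float, replicas: int = 1) -> float:
    """Estimated storage consumed per year, in terabytes."""
    return slides * avg_gb * replicas / 1_000

print(f"~{annual_storage_tb(ANNUAL_FROZEN_SLIDES, AVG_FILE_SIZE_GB):.0f} TB/year (primary)")
print(f"~{annual_storage_tb(ANNUAL_FROZEN_SLIDES, AVG_FILE_SIZE_GB, REPLICATION_FACTOR):.0f} TB/year (with off-site replica)")
```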
Interfacing the DP200 scanner proved somewhat more involved than anticipated, as the vendor had implemented a firewall that precluded direct connectivity between the scanner workstation PC and our internal network. Instead, the scanned images were transmitted to an image management service hosted on a virtual machine. This service was configured to move the images to a designated network folder, which was monitored by Sectra for ingestion into the IMS. A Sectra “test” instance was configured to enable evaluation without electronic health record integration. Fig. 2 shows the network architecture, and a minimal sketch of the watch-folder hand-off follows the figure.
Fig. 2.
DP200 network architecture.
A simplified network diagram showing scanner connectivity. The firewall does not permit direct connectivity of the scanner to network storage. Instead, images transit through a vendor-supplied image management service hosted on a virtual machine, which is configured to deposit images in a pre-specified network folder. The Sectra PACS monitors the folder and ingests newly deposited images.
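The watch-folder hand-off in Fig. 2 can be summarized as: poll the drop folder, wait for each file to finish writing, then move it to the folder the IMS monitors. The Python sketch below mimics this pattern under stated assumptions: the share paths, polling interval, file extension, and stability check are all hypothetical, and in production the hand-off is performed by the vendor's image management service and Sectra's own ingestion process.

```python
# Minimal sketch of a watch-folder hand-off (see Fig. 2). All paths and
# parameters are illustrative assumptions, not the production configuration.

import shutil
import time
from pathlib import Path

SCANNER_DROP = Path(r"\\vm-host\dp200-output")  # hypothetical network share
IMS_INGEST = Path(r"\\nas\sectra-ingest")       # hypothetical watched folder

def is_stable(path: Path, wait_s: float = 2.0) -> bool:
    """Treat a file as fully written if its size stops changing."""
    size = path.stat().st_size
    time.sleep(wait_s)
    return path.stat().st_size == size

def transfer_once() -> None:
    for wsi in SCANNER_DROP.glob("*.bif"):  # assuming the vendor's .bif output
        if is_stable(wsi):
            shutil.move(str(wsi), str(IMS_INGEST / wsi.name))

if __name__ == "__main__":
    while True:
        transfer_once()
        time.sleep(10)  # poll every 10 s
```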
Formal evaluation process
We developed an evaluation scoring spreadsheet to formally compare the performance of the live view system versus the WSI-based system. Metrics were identified by the SMEs and placed into categories of “speed”, “image quality”, or “ease of use” (Fig. 3). Four pathologists and two pathologists' assistants (PAs) participated in the evaluation. Metrics relating to image manipulation were evaluated by the pathologists, while metrics related to slide handling were evaluated by the PAs. Each metric was assigned a weight reflecting its perceived importance, and evaluators were asked to assign a score of “not acceptable (1)”, “acceptable (2)”, or “exceeds expectations (3)”. Scores were multiplied by their weights, and the results were summed across all metrics to arrive at a final score for each modality. A short sketch of this arithmetic follows Fig. 3.
Fig. 3.
Live view vs. whole slide imaging evaluation worksheet.
This worksheet was used to quantitatively compare the live view (LV) device versus the whole slide imaging (WSI) platform. Metrics were developed and weights were assigned by the subject matter experts (SMEs). Evaluators supplied a score ranging from 1 to 3, which was multiplied by the weight to arrive at a weighted score. All weighted scores were summed to arrive at a total weighted score.
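The scoring arithmetic is a weighted sum. In the minimal Python sketch below, the metric names, weights, and scores are invented for illustration and do not reproduce the actual worksheet:

```python
# Weighted scoring arithmetic from the evaluation worksheet (Fig. 3):
# each metric receives a 1-3 score, multiplied by its SME-assigned weight;
# weighted scores are summed per modality. All values below are invented.

def total_weighted_score(weights: dict[str, float], scores: dict[str, int]) -> float:
    """Sum of score x weight across all metrics for one evaluator and modality."""
    return sum(weights[m] * scores[m] for m in weights)

# 1 = not acceptable, 2 = acceptable, 3 = exceeds expectations
weights = {"scan/load speed": 3, "image sharpness": 5, "ease of navigation": 4}
wsi_scores = {"scan/load speed": 2, "image sharpness": 3, "ease of navigation": 3}
lv_scores = {"scan/load speed": 3, "image sharpness": 2, "ease of navigation": 1}

print("WSI total:", total_weighted_score(weights, wsi_scores))  # 33
print("LV total:", total_weighted_score(weights, lv_scores))    # 23
```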
In addition to these subjective user experience-related metrics, detailed timing data were gathered for each step of the 2 workflows. For example, for the WSI scanner, the median times to complete the following steps were calculated: slide loading, slide prepping, scanning, file saving, and IMS upload/ingestion (Table 1; a sketch of these summary statistics follows the table). In addition, the average time before the pathologist could manipulate the digital slide and the total time to render a diagnosis were calculated.
Table 1.
Whole-slide imaging timing data.
Step | Median (s) | Range (s) |
---|---|---|
Slide load time | 15 | 14–20 |
Slide prep time | 12.5 | 3–21 |
Scan time | 64.5 | 20–100 |
File save time | 7 | 3–16 |
IMS ingestion time | 20 | 6–38 |
Timing data is shown for each sub-step of the whole slide imaging process. The slide scanning time is the most time-consuming step, and is also the most variable (depending on tissue size).
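The per-step figures in Table 1 are simple medians and ranges over repeated measurements. A short sketch of this bookkeeping, with invented sample times, follows:

```python
# Per-step timing summaries as in Table 1: median and range over repeated
# measurements. The sample times below are invented, not our raw data.

from statistics import median

def summarize(times_s: list[float]) -> tuple[float, float, float]:
    """Return (median, min, max) for one workflow step, in seconds."""
    return median(times_s), min(times_s), max(times_s)

scan_times = [20, 48, 64, 65, 82, 100]  # hypothetical scan times (s)
med, lo, hi = summarize(scan_times)
print(f"Scan time: median {med} s, range {lo}-{hi} s")
```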
Twelve frozen section cases were evaluated to gather subjective user feedback, while 44 additional cases were scanned to gather WSI timing data.
Electronic health record integration
During evaluation, integration with the EHR was not required, since we were primarily concerned with the performance and usability of the Sectra software in isolation. After we decided to proceed with the WSI-based solution, integration with Epic Beaker, our pathology laboratory information system (LIS), became necessary. Our institution uses Epic as the EHR, and our department uses Epic Beaker as the LIS. We opted to implement an “EHR-driven” workflow, with case management and reporting performed in the EHR. Sectra can be configured to perform these functions in an “IMS-driven” workflow, but the EHR-driven option most closely matched our existing glass slide-based workflow and was therefore chosen to minimize user disruption and smooth adoption.
Our frozen section reporting workflow also needed to be changed. We had been writing our diagnoses by hand for later transcription and scanning into the EHR, but this was obviously not possible in a remote scenario. Therefore, we built a new frozen section reporting module for our EHR (see Fig. 4).
Fig. 4.
New Epic Beaker frozen section reporting workflow.
Frozen section diagnoses are entered by clicking an “Intra-Op” toolbar button (step 1). Diagnoses are entered per specimen (step 2), and the case is posted to the patient’s chart by clicking a “Prelim intra-op” button (step 3). The intraoperative diagnosis is then visible to the operating room staff (step 4).
Particularly for frozen sections, we wanted to enforce correspondence between the case selected in the EHR and the digitized slide being viewed in the IMS. After some trial and error, we decided to enable WSI viewing via a hyperlink in the EHR, which opens the Sectra IMS with the appropriate case displayed. Fig. 5 outlines this workflow, and a sketch of a case-scoped launch link follows the figure.
Fig. 5.
EHR-based launching of IMS with enforcement of patient context.
When the “Intra-Op” tab is active a link to launch associated digital pathology images is present (step 1, highlighted in red). When this link is clicked, the Sectra IMS is launched and the scanned images for the case are displayed. This mechanism enforces patient context by only displaying images belonging to the case being viewed in the EHR.
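To make the patient-context idea concrete, the sketch below builds a launch link scoped to a single case: the EHR constructs a URL carrying only the current case's accession number, so the viewer opens with (and only with) that case's images. The host, path, and parameter names are hypothetical and do not reflect Sectra's actual launch interface.

```python
# Case-scoped IMS launch link (hypothetical URL scheme, for illustration).

from urllib.parse import urlencode

def build_ims_launch_url(accession: str, user_id: str) -> str:
    """Build a viewer link restricted to one case (placeholder scheme)."""
    base = "https://ims.example.org/launch"  # placeholder host
    return f"{base}?{urlencode({'accession': accession, 'user': user_id})}"

print(build_ims_launch_url("S22-12345", "pathologist01"))
# -> https://ims.example.org/launch?accession=S22-12345&user=pathologist01
```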
Staffing
We stationed a PA in an existing room adjacent to one of our main OR suites. This room had been originally designed for frozen section support but had not been routinely used for that purpose due to its distance from our main grossing room and the pathologists' offices. The necessary equipment was therefore already in place (e.g., a cryostat and grossing hood), and we only needed to find room for the DP200 slide scanner.
For the main campus pilot testing, the option of reverting to our preexisting glass slide-based workflow remained available. Thus, we imposed a limit of 4 blocks per frozen section case and decided that only 1 digital frozen section case should be performed at a time, with “overflow” cases reverting to glass slide diagnosis.
Validation and training
Once the decision was made to proceed with the WSI-based platform, validation was performed according to CAP guidelines.29,30 Representative frozen section slides were chosen from 60 previously diagnosed cases, and the slides were digitized at 20X magnification after anonymization. One neuropathology smear was included in the validation set. Three pathologists were recruited to participate in validation and were asked to render a diagnosis starting with either the glass or the digital version of each slide. A washout period of at least 2 weeks was allowed to elapse, after which each participant again rendered a diagnosis using the other modality. Intraobserver variability between glass slide and WSI diagnoses was used as the metric of diagnostic concordance. Discordant diagnoses were categorized as either minor or major, with major discordances having a possible impact on patient care. Concordance was also assessed relative to the actual diagnosis rendered at the time of the initial frozen section. Validation was performed using standard office PCs and monitors, all of which met the minimum hardware requirements provided by Sectra. The average minor concordance rate was 96.7%, while the average major concordance rate was 97.8%.
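The concordance bookkeeping can be illustrated with a short sketch: each case has a paired glass and digital diagnosis per reader, and disagreements are adjudicated as minor or major. The case data below are fabricated placeholders, and the minor/major grading is of course a human judgment.

```python
# Concordance bookkeeping for a glass-vs-digital validation study.
# Cases below are fabricated placeholders for illustration only.

from dataclasses import dataclass

@dataclass
class ValidationCase:
    glass_dx: str
    digital_dx: str
    major: bool = False  # set during adjudication if discordance could affect care

def concordance_rates(cases: list[ValidationCase]) -> tuple[float, float]:
    """Return (overall concordance, concordance ignoring minor discordances)."""
    n = len(cases)
    concordant = sum(c.glass_dx == c.digital_dx for c in cases)
    major_disc = sum(c.glass_dx != c.digital_dx and c.major for c in cases)
    return concordant / n, (n - major_disc) / n

cases = [
    ValidationCase("benign", "benign"),
    ValidationCase("carcinoma", "carcinoma"),
    ValidationCase("atypical", "benign"),  # minor discordance
]
overall, major_only = concordance_rates(cases)
print(f"overall {overall:.1%}; counting only major discordances {major_only:.1%}")
```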
Training for all users of the system was provided by the IMS vendor (Sectra) shortly before our go-live date. It was conducted via several virtual meeting sessions, which were recorded for later reference.
Results
Fig. 6 and Table 2 show the results of LV versus WSI-based head-to-head testing. The median time to diagnosis was 45 s longer for WSI than for live view. This was largely due to the time needed to scan the slides (median 64.5 s). By contrast, slides were available in a median time of 15 s for the LV device.
Fig. 6.
Results of Live View vs. WSI head-to-head testing.
Panel A shows summarized weighted scores across the categories of “ease of use”, “image quality”, and “speed” for the four pathologist evaluators. The WSI system did not receive any “unacceptable” scores, and its summarized weighted scores are higher on average. Panel B shows the difference in time to diagnosis between the live view and WSI systems. A time difference greater than zero indicates a faster time to diagnosis for the LV system. The median time difference was 45 s.
Table 2.
Live view versus WSI diagnosis time by case type.
Anatomic site | Tissue size (mm²) | LV diagnosis time (s) | WSI diagnosis time (s) |
---|---|---|---|
Distal ureter | 42 | 183 | 60 |
Left ureter | 390 | 260 | 120 |
Delphian node | 88 | 110 | 20 |
Gallbladder | 540 | 180 | 52 |
Sentinel lymph node | 374 | 164 | 85 |
Left lower lip | 28 | 47 | 50 |
Left nasal cavity | 55 | 200 | 120 |
Peritoneal implant | 128 | 82 | 80 |
Whipple | 390 | 119 | 120 |
Sentinel lymph node | 153 | 96 | 60 |
Sentinel lymph node | 136 | 102 | 52 |
Uterus | 330 | 160 | 80 |
Distal ureter | 42 | 109 | 50 |
Left ureter | 390 | 229 | 120 |
Delphian node | 88 | 23 | 7 |
Gallbladder | 540 | 237 | 72 |
Sentinel lymph node | 374 | 180 | 108 |
Left lower lip | 28 | 64 | 50 |
Left nasal cavity | 55 | 232 | 100 |
Peritoneal implant | 128 | 200 | 80 |
Whipple | 390 | 225 | 120 |
Sentinel lymph node | 153 | 100 | 80 |
Sentinel lymph node | 136 | 90 | 50 |
Uterus | 330 | 120 | 75 |
Distal ureter | 42 | 53 | 30 |
Left ureter | 390 | 183 | 88 |
Delphian node | 88 | 7 | 5 |
Gallbladder | 540 | 164 | 35 |
Sentinel lymph node | 374 | 51 | 73 |
Left lower lip | 28 | 115 | 23 |
Left nasal cavity | 55 | 144 | 120 |
Peritoneal implant | 128 | 84 | 54 |
Whipple | 390 | 159 | 54 |
Sentinel lymph node | 153 | 32 | 42 |
Sentinel lymph node | 136 | 42 | 17 |
Uterus | 330 | 85 | 80 |
Distal ureter | 42 | 28 | 34 |
Left ureter | 390 | 165 | 58 |
Delphian node | 88 | 30 | 10 |
Gallbladder | 540 | 115 | 38 |
Sentinel lymph node | 374 | 105 | 60 |
Left lower lip | 28 | 53 | 48 |
Left nasal cavity | 55 | 105 | 85 |
Peritoneal implant | 128 | 93 | 50 |
Whipple | 390 | 82 | 50 |
Sentinel lymph node | 153 | 48 | 46 |
Sentinel lymph node | 136 | 120 | 39 |
Uterus | 330 | 180 | 60 |
The time to diagnosis in seconds is shown for both the live view and WSI systems, by case type.
The summarized subjective user experience scores are also shown in Fig. 6. Several evaluators deemed the LV device “unacceptable” in terms of either speed or ease of use. By contrast, the WSI system did not receive any “unacceptable” ratings, and indeed was rated as “exceeds expectations” for many metrics. The summed scores were higher for the WSI system for all evaluators.
We have not formally quantified changes in turnaround time between glass slide and digital frozen sections, as improving turnaround time was not the primary goal of our pilot project. However, the 10–20 min spent manually couriering frozen section specimens from our ORs to our main gross room has been eliminated with digital frozen sections. This time savings is partially offset by the additional time spent scanning the slides; nevertheless, we estimate at least a 10-min improvement in turnaround time on average.
Subjective user feedback has been very positive. Pathologists have generally found the Sectra platform to be user friendly and responsive. We have experienced no performance issues such as network lag.
Conclusions
We elected to proceed with a WSI-based remote frozen section platform for several reasons. Perhaps most importantly, we had in place a plan for stepwise full digitization of our department. While we started with remote frozen sections out of necessity, we knew that whole slide scanning hardware and the associated infrastructure would soon be needed regardless of which platform was chosen for frozen sections. By standardizing early on an IMS (Sectra), we could gain experience and better prepare ourselves for eventual higher-throughput clinical digital pathology. For example, we are currently converting our large outside case volume to a digital workflow. While multiple high-throughput whole slide scanners will be deployed, we plan to continue using Sectra as our IMS, as well as the interfaces to Epic Beaker that we have built.
The ability to control focus in real time is an advantage of live view-based systems that is particularly relevant for specimens with three-dimensionality, such as cytologic preparations. In our frozen section practice, such specimens are only occasionally encountered, largely in the form of neuropathology specimens. In our validation set and in subsequent live cases, pathologists have been able to render accurate diagnoses on such specimens using only the single focal plane provided by the WSI scanner.
Committing to the Sectra IMS came with a certain amount of risk, since we had no personal prior experience with the software. However, Sectra has a large clinical digital pathology footprint in Europe, and several institutions in the United States are using it for a fully digital workflow.27,28 Importantly for us, other institutions had successfully integrated Sectra with Epic Beaker. Our radiology department had been using Sectra for their own needs for several years, and the pathology module had already been purchased. We therefore had an opportunity to test the software before committing.
Since going live with the remote digital frozen sections, our pathologists have expressed satisfaction with the selected platform. We chose an “EHR-driven” workflow, where our case management and reporting are performed in our EHR (Epic Beaker), with Sectra serving purely as a “digital microscope”. This configuration made sense for us, since it most closely mimicked our established glass slide workflow. Sectra can also be configured to perform case management and reporting, but this seemed unnecessary in our case. Regardless, we have found the software generally easy to use and responsive. Even pathologists with little prior digital pathology experience have been able to quickly learn the digital workflow, and overall feedback has been very positive. Sectra offers an “open API” that allows software developers to easily interface with the system, and this feature was very attractive to us given our combined interests in computational pathology/artificial intelligence. In the future, we plan to leverage this API for the development and/or interfacing of additional software packages (e.g., for biomarker scoring and AI-assisted diagnosis).
For patient safety reasons, we chose to enforce correspondence between the case open in the EHR and the digital slides open in Sectra. To facilitate this, we built a hyperlink-based interface in the EHR that launches Sectra, with the appropriate images displayed, directly from the patient’s chart. Sectra offers a web-based “thin client” application as well as a “thick client” desktop application; we wanted to use the thick client, since it appeared to offer a more responsive user experience with full functionality (e.g., GPU acceleration). Enabling launch of the thick client from within Epic Beaker, with maintenance of patient context, proved to be a considerable technical challenge, but was ultimately worth the effort.
Our decision to proceed with the Roche DP200 whole slide scanner was based largely on our prior experience with this model, which we had been using for non-clinical scanning needs for several years. In that role, scanning approximately 50 slides daily, it has performed reliably. Image quality is subjectively excellent, and scanning times are relatively fast. Indeed, there are few other options for fast, high-quality, low-throughput scanners currently on the market. Slides are held in place in a tray that is fed into the scanner, an arrangement that is well suited to potentially wet frozen section slides.
Others have reported success using the DP200 for remote frozen sections. For example, Menter et al.32 compared a live view-type system to WSI using the DP200 and reported both improved image quality and faster overall turnaround time with the WSI system. In their case, the additional time needed for scanning was offset by a significantly faster diagnosis, owing to the better usability of the whole slide image management software compared to the live view software. By contrast, in our case the time to diagnosis using the WSI system was slightly longer than with the LV system. We considered this an acceptable tradeoff given the much better image quality and overall user experience.
CLIA regulations surrounding remote signout have been relaxed due to the ongoing COVID-19 pandemic.11 Specifically, a CLIA license is now not required to sign out cases from a remote location (e.g., from home). However, enabling frozen section signout from home is not an immediate goal for us, and thus far we have not completed the necessary validation study. We may enable remote signout in the future, particularly if the regulatory changes become permanent.
The entire process of evaluating, choosing, and deploying a digital frozen section platform took much longer than anticipated. Much of this time was spent simply coordinating the various people and teams, from IT and the pathologist SMEs to the scanner and IMS vendors. No doubt this is partially attributable to the additional complexity of a whole slide-based platform, with its attendant infrastructure and integration requirements. However, even the scope of a seemingly modest digital pathology project may not be apparent until the project is well underway, and it seems prudent to avoid over-optimistic delivery estimates. We were fortunate in that we did not need to expend time and energy obtaining “buy-in” from hospital or departmental leadership, which is a frequent concern in digital pathology deployments.31 Instead, our decision to pursue full digitization was in large part spurred by the arrival of new leadership already interested in the idea.
There are many ways to enable remote frozen section signout. We chose a whole slide imaging-based platform with direct integration into the LIS as a strategic foundational building block for our department and future needs. However, such a solution may be oversized relative to frozen section signout alone, and may be regarded as overly complex and/or expensive if further digitization efforts are not planned. In such a situation, a live view-type device may be appropriate, and will be less expensive and simpler to integrate.
Regardless of the specific platform, we found it helpful to carefully and formally evaluate the candidate options before committing. From the beginning, the establishment of a core team responsible for project governance was essential. A project manager (in our case provided by the IT department) was very useful for staying on task and coordinating the many meetings required. Any digital pathology implementation is a complicated endeavor with many moving parts, and the time required should not be underestimated.
Ultimately, our whole slide imaging-based system has been working well for digital frozen sections in the setting of a pilot study, and we plan to deploy the system at our new cancer center.
Competing interests
The authors declare no competing interests.
Authors' contributions
DS is responsible for the conception and design of this manuscript. All co-authors contributed to manuscript preparation and editing and approved the final version.
Acknowledgments
The authors would like to acknowledge the efforts of Erick Nad in ensuring the success of this project.
Contributor Information
Sue Chang, Email: suchang@coh.org.
Evita Sadimin, Email: esadimin@coh.org.
Keluo Yao, Email: kyao@coh.org.
Stanley Hamilton, Email: shamilton@coh.org.
Patricia Aoun, Email: paaoun@coh.org.
Raju Pillai, Email: rpillai@coh.org.
David Muirhead, Email: dmuirhead@coh.org.
Daniel Schmolze, Email: dschmolze@coh.org.
References
- 1. Griffin J., Treanor D. Digital pathology in clinical use: where are we now and what is holding us back? Histopathology. 2017 Jan;70(1):134–145. doi: 10.1111/his.12993.
- 2. Williams B.J., Bottoms D., Treanor D. Future-proofing pathology: the case for clinical adoption of digital pathology. J Clin Pathol. 2017 Dec;70(12):1010–1018. doi: 10.1136/jclinpath-2017-204644.
- 3. Betmouni S. Diagnostic digital pathology implementation: learning from the digital health experience. Digit Health. 2021 Jan;7. doi: 10.1177/20552076211020240.
- 4. Stathonikos N., Nguyen T.Q., Spoto C.P., Verdaasdonk M.A.M., van Diest P.J. Being fully digital: perspective of a Dutch academic pathology laboratory. Histopathology. 2019 Nov;75(5):621–635. doi: 10.1111/his.13953.
- 5. Lujan G.M., Savage J., Shana’ah A., et al. Digital pathology initiatives and experience of a large academic institution during the Coronavirus Disease 2019 (COVID-19) pandemic. Arch Pathol Lab Med. 2021 Sep 1;145(9):1051–1061. doi: 10.5858/arpa.2020-0715-SA.
- 6. Pantanowitz L. Digital images and the future of digital pathology. J Pathol Inform. 2010;1(1):15. doi: 10.4103/2153-3539.68332.
- 7. Al-Janabi S., Huisman A., Van Diest P.J. Digital pathology: current status and future perspectives. Histopathology. 2012 Jul;61(1):1–9. doi: 10.1111/j.1365-2559.2011.03814.x.
- 8. Pantanowitz L., Sharma A., Carter A., Kurc T., Sussman A., Saltz J. Twenty years of digital pathology: an overview of the road travelled, what is on the horizon, and the emergence of vendor-neutral archives. J Pathol Inform. 2018;9(1):40. doi: 10.4103/jpi.jpi_69_18.
- 9. Browning L., Fryer E., Roskell D., et al. Role of digital pathology in diagnostic histopathology in the response to COVID-19: results from a survey of experience in a UK tertiary referral hospital. J Clin Pathol. 2021 Feb;74(2):129–132. doi: 10.1136/jclinpath-2020-206786.
- 10. Cimadamore A., Lopez-Beltran A., Scarpelli M., Cheng L., Montironi R. Digital pathology and COVID-19 and future crises: pathologists can safely diagnose cases from home using a consumer monitor and a mini PC. J Clin Pathol. 2020 Nov;73(11):695–696. doi: 10.1136/jclinpath-2020-206943.
- 11. Centers for Medicare and Medicaid Services. Clinical Laboratory Improvement Amendments (CLIA) laboratory guidance during COVID-19 public health emergency [Internet]. 2020 [cited 2022 Feb 17]. Available from: https://www.cms.gov/files/document/qso-20-21-clia.pdf-0
- 12. Baxi V., Edwards R., Montalto M., Saha S. Digital pathology and artificial intelligence in translational medicine and clinical practice. Mod Pathol. 2022 Jan;35(1):23–32. doi: 10.1038/s41379-021-00919-2.
- 13. Bera K., Schalper K.A., Rimm D.L., Velcheti V., Madabhushi A. Artificial intelligence in digital pathology — new tools for diagnosis and precision oncology. Nat Rev Clin Oncol. 2019 Nov;16(11):703–715. doi: 10.1038/s41571-019-0252-y.
- 14. Colling R., Pitman H., Oien K., et al. Artificial intelligence in digital pathology: a roadmap to routine use in clinical practice. J Pathol. 2019 Oct;249(2):143–150. doi: 10.1002/path.5310.
- 15. Cui M., Zhang D.Y. Artificial intelligence and computational pathology. Lab Invest. 2021 Apr;101(4):412–422. doi: 10.1038/s41374-020-00514-0.
- 16. Parwani A.V. Next generation diagnostic pathology: use of digital pathology and artificial intelligence tools to augment a pathological diagnosis. Diagn Pathol. 2019 Dec;14(1):138. doi: 10.1186/s13000-019-0921-2.
- 17. Madabhushi A., Lee G. Image analysis and machine learning in digital pathology: challenges and opportunities. Med Image Anal. 2016 Oct;33:170–175. doi: 10.1016/j.media.2016.06.037.
- 18. Ching T., Himmelstein D.S., Beaulieu-Jones B.K., et al. Opportunities and obstacles for deep learning in biology and medicine. J R Soc Interface. 2018 Apr;15(141):20170387. doi: 10.1098/rsif.2017.0387.
- 19. Coccia M. Deep learning technology for improving cancer care in society: new directions in cancer imaging driven by artificial intelligence. Technol Soc. 2020;60.
- 20. Janowczyk A., Madabhushi A. Deep learning for digital pathology image analysis: a comprehensive tutorial with selected use cases. J Pathol Inform. 2016;7(1):29. doi: 10.4103/2153-3539.186902.
- 21. Dimitriou N., Arandjelović O., Caie P.D. Deep learning for whole slide image analysis: an overview. Front Med. 2019 Nov 22;6:264. doi: 10.3389/fmed.2019.00264.
- 22. U.S. Food and Drug Administration. Paige Prostate FDA approval letter [Internet]. 2021 [cited 2022 Feb 17]. Available from: https://www.accessdata.fda.gov/cdrh_docs/pdf20/DEN200080.pdf
- 23. Farahani N., Pantanowitz L. Overview of telepathology. Surg Pathol Clin. 2015 Jun;8(2):223–231. doi: 10.1016/j.path.2015.02.018.
- 24. Weinstein R.S., Bloom K.J., Rozek L.S. Telepathology. Long-distance diagnosis. Am J Clin Pathol. 1989 Apr;91(4 Suppl 1):S39–S42.
- 25. Weinberg D.S., Allaert F.A., Dusserre P., et al. Telepathology diagnosis by means of digital still images: an international validation study. Hum Pathol. 1996 Feb;27(2):111–118. doi: 10.1016/s0046-8177(96)90363-9.
- 26. Cross S.S., Dennis T., Start R.D. Telepathology: current status and future prospects in diagnostic histopathology. Histopathology. 2002 Aug;41(2):91–109. doi: 10.1046/j.1365-2559.2002.01423.x.
- 27. Nation’s Top Orthopedic Hospital Launches the First FDA Approved Digital Pathology Platform [Internet]. 2021 [cited 2022 Feb 18]. Available from: https://news.hss.edu/nations-top-orthopedic-hospital-launches-the-first-fda-approved-digital-pathology-platform/
- 28. Sectra’s digital pathology solution chosen by US healthcare provider for primary diagnostics [Internet]. 2021 [cited 2022 Feb 18]. Available from: https://medical.sectra.com/news-press-releases/news-item/74ABD5E1DD555A5C/
- 29. Pantanowitz L., Sinard J.H., Henricks W.H., et al. Validating whole slide imaging for diagnostic purposes in pathology: guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med. 2013 Dec;137(12):1710–1722. doi: 10.5858/arpa.2013-0093-CP.
- 30. Evans A.J., Brown R.W., Bui M.M., et al. Validating whole slide imaging systems for diagnostic purposes in pathology: guideline update from the College of American Pathologists in collaboration with the American Society for Clinical Pathology and the Association for Pathology Informatics. Arch Pathol Lab Med [Internet]. 2021 May 18 [cited 2022 Feb 18]. doi: 10.5858/arpa.2020-0723-CP.
- 31. Lujan G., Quigley J., Hartman D., et al. Dissecting the business case for adoption and implementation of digital pathology: a white paper from the Digital Pathology Association. J Pathol Inform. 2021;12(1):17. doi: 10.4103/jpi.jpi_67_20.
- 32. Menter T., Nicolet S., Baumhoer D., Tolnay M., Tzankov A. Intraoperative frozen section consultation by remote whole-slide imaging analysis–validation and comparison to robotic remote microscopy. J Clin Pathol. 2020;73(6):350. doi: 10.1136/jclinpath-2019-206261.