Abstract
Artificial intelligence is a strong focus of interest in global health development. Diagnostic endoscopy is an attractive substrate for artificial intelligence, with real potential to improve patient care through standardisation of endoscopic diagnosis and as an adjunct to enhanced imaging diagnosis. The possibility of amassing large datasets to refine algorithms makes the adoption of artificial intelligence into global practice a potential reality. Initial studies in luminal endoscopy involved machine learning and were retrospective. Appreciable improvement in diagnostic performance has come with the adoption of deep learning. Research foci in the upper gastrointestinal tract include the diagnosis of neoplasia, including Barrett’s, squamous cell and gastric neoplasia, where prospective and real-time artificial intelligence studies have been completed, demonstrating a benefit of artificial intelligence–augmented endoscopy. Deep learning applied to small bowel capsule endoscopy also appears to enhance pathology detection and reduce capsule reading time. Prospective evaluation, including the first randomised trials, has been performed in the colon, demonstrating improved polyp and adenoma detection rates; however, this benefit appears to be limited to small polyps. There are additional potential roles for artificial intelligence in improving the quality of endoscopic examinations, training and the triaging of referrals. Further large-scale, multicentre and cross-platform validation studies are required for the robust incorporation of artificial intelligence–augmented diagnostic luminal endoscopy into routine clinical practice.
Keywords: AI, endoscopy, imaging
Introduction
Artificial intelligence (AI) systems in luminal endoscopy are now on the precipice of being widely commercially available. The development of deep learning (DL) in diagnostic imaging has further extended the reach of AI in luminal endoscopy and overcomes some of the limitations of machine learning (ML) through the ability to process high-dimensional endoscopic data and to self-identify trainable parameters not appreciable to humans. One of the most researched DL methods uses convolutional neural networks (CNNs) (Figure 1), designed to emulate biological neural networks.1 Here, we review the existing data for AI in diagnostic endoscopy encompassing upper gastrointestinal, small bowel capsule and colonic examinations. Additional roles of AI outside diagnosis relevant to endoscopy are discussed, alongside future requirements for research and the global adoption of AI into routine endoscopic practice.
Figure 1.
Schematic diagram of a CNN. CNNs differ from traditional fully connected networks in that each perceptron connects to a few neurons instead of all neurons. In each hidden layer of the CNN, the input values are multiplied by weights and added to biases. Each resulting value is then passed through an activation function (ReLU); if the value is above the threshold, the neuron fires. The output of this layer becomes the input of the next hidden layer and follows the same formula. The final layer is fully connected and is where image classification ensues, with final activation by pixel values from the pooling layer exceeding the threshold: a high value will correctly identify the image, a low value will not.
CNN, convolutional neural network; ReLU, rectified linear activation unit.
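The weight–bias–activation pipeline in the caption can be made concrete in a few lines of NumPy. The sketch below is purely illustrative: a single 3 × 3 filter, one ReLU and pooling step and a sigmoid output, with random numbers standing in for trained weights and for the input frame; real diagnostic CNNs stack many such layers and learn their parameters from labelled endoscopic images.

```python
import numpy as np

def relu(x):
    # Rectified linear activation: passes positive values, zeroes the rest
    return np.maximum(0.0, x)

def conv2d(image, kernel, bias):
    # Valid 2D convolution: each output neuron connects only to a small
    # local patch of the input, unlike a fully connected layer
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel) + bias
    return out

def max_pool(fmap, size=2):
    # Downsample by keeping the strongest activation in each window
    h, w = fmap.shape
    h, w = h - h % size, w - w % size
    return fmap[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.random((28, 28))          # toy greyscale endoscopic frame
kernel = rng.standard_normal((3, 3))  # trainable filter weights
hidden = max_pool(relu(conv2d(image, kernel, bias=0.1)))

# Final fully connected layer: flatten pooled features, apply weights,
# and squash to a class probability with a sigmoid
w_fc = rng.standard_normal(hidden.size)
prob = 1.0 / (1.0 + np.exp(-(hidden.ravel() @ w_fc + 0.1)))
print(f"P(lesion) = {prob:.3f}")
```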
Upper gastrointestinal endoscopy
AI-augmented endoscopic imaging has been studied across benign and malignant pathologies of the upper gastrointestinal (GI) tract. The majority of data of AI in the upper GI tract are retrospective,2–4 with two real-time diagnostic evaluations for the diagnosis of early gastric cancer5 and Barrett’s neoplasia.6
Helicobacter pylori
AI could potentially improve diagnostic yield by mitigating the sampling error inherent to conventional biopsy protocols. All studies for the diagnosis of H. pylori to date involve DL and are summarised in Table 1.
Table 1.
Diagnostic test summaries for artificial intelligence–augmented endoscopy of the upper gastrointestinal tract.
| Author | P/R | AI model/imaging modality | Test dataset (no. of images) | Sensitivity (%) | Specificity (%) | Accuracy (%)/AUC |
|---|---|---|---|---|---|---|
| *Helicobacter pylori* | | | | | | |
| Shichijo and colleagues7 | R | DL WLE | 32,208 | 81.9 | 83.4 | 83.1 |
| | | DL WLE | 11,481 | 85.2 | 89.3 | 88.6 |
| Itoh and colleagues8 | R | DL WLE | 30 | 86.7 | 86.7 | 0.956 |
| Huang and colleagues3 | R | DL WLE | 74 patients | 72.7–84.8 | 85.4–95.1 | 85.1 |
| Yasuda and colleagues9 | R | DL LCI | 525 | 90.4 | 85.7 | 87.6 |
| Nakashima and colleagues4 | R | DL WLE | 60 | 66.7 | 60.0 | 0.66 |
| | | DL BLI | 60 | 96.7 | 86.7 | 0.96 |
| | | DL LCI | 60 | 96.7 | 83.3 | 0.95 |
| *Barrett’s neoplasia* | | | | | | |
| Ebigbo and colleagues6 | P | DL WLE | 62 | 83.7 | 100 | 89.9 |
| Qi and colleagues11 | R | CAD WLE | 314 | 82 | 74 | 83 |
| Sommen and colleagues12 | R | CAD WLE | 100 | 86 | 87 | |
| Shin and colleagues13 | R | CAD HRME | 153 | 88 | 85 | 85 |
| Swager and colleagues14 | R | CAD VLE | 40 | 90 | 93 | 0.95 |
| Sehgal and colleagues15 | R | CAD (DT) WLE/ACA | 40 videos | 97 | 88 | 92 |
| *Squamous cell neoplasia* | | | | | | |
| Tang and colleagues17 | R | DL WLE | 946 | 97 | 94 | 97 |
| Shiroma and colleagues18 | R | DL WLE/NBI | 40 videos | 80 | 63.3 | 67.5 |
| Zhao and colleagues19 | R | DL mNBI | 1383 | 83 | 95.7 | 89.2 |
| Horie and colleagues20 | R | DL WLE | 1118 | 98 | 98 | |
| Guo and colleagues21 | P | DL NBI | 6671 (1480 neoplastic images) | 91.5 | 99.9 | 0.989 |
| Quang and colleagues22 | R | ML HRME | 167 | 95 | 91 | 0.937 |
| Kumagai and colleagues23 | R | DL endocytoscopy | 1520 (55 patients) | 92.6 | 89.3 | 90.9 |
| Tokai and colleagues24 | R | DL WLE | 293 | 87.7 | 72.5 | 77.8 |
| | | DL NBI | 40 | 92.3 | 77.8 | |
| *Gastric neoplasia* | | | | | | |
| Hirasawa and colleagues25 | R | DL WLE | 2296 (714 cancers) | 92.2 | NR | NR |
| Ishioka and colleagues5 | R | DL WLE | 68 videos | 94.1 | NR | NR |
| Luo and colleagues26 | P | DL WLE | 1,036,496 | 90.7–98.2 | 91.3–97.9 | 0.91–0.97 |
| Miyaki and colleagues27 | R | CAD FICE | 46 cancers | 84.8 | 87.0 | 85.9 |
| Miyaki and colleagues28 | R | CAD BLI | 100 | NR | NR | SVM value 0.846 ± 0.22 |
| Kanesaka and colleagues29 | R | CAD mNBI | 81 images (61 cancers) | 96.7 | 95 | 96.3 |
| Li and colleagues30 | R | DL NBI | 341 (170 cancers) | 91.1 | 90.6 | 90.9 |
| Horiuchi and colleagues31 | R | DL mNBI | 258 images (151 cancers) | 95.4 | 71.0 | 85.3 |
| Yoon and colleagues32 | R | DL WLE | 660 images (330 cancers) | 91 | 97.6 | 0.981 |
| Kubota and colleagues33 | R | DL WLE | 90 | NR | NR | 64.7 |
| Zhu and colleagues34 | R | DL WLE | 203 | 76.47 | 95.56 | 0.94 |
P, prospective; R, retrospective; ACA, acetic acid; AI, artificial intelligence; AUC, area under the curve; BLI, blue laser imaging; CAD, computer-assisted diagnosis; DL, deep learning; DT, decision tree; FICE, Fujinon intelligent chromoendoscopy; HRME, high-resolution microendoscopy; LCI, linked colour imaging; ML, machine learning; mNBI, magnification narrow band imaging; NBI, narrow band imaging; NR, not reported; SVM, support vector machine; VLE, volumetric laser endomicroscopy; WLE, white light endoscopy.
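The metrics reported throughout Table 1 derive from a 2 × 2 confusion matrix at a chosen probability cut-off, with the AUC summarising performance across all cut-offs. A minimal sketch of how they are computed from a model's per-image scores (labels and scores invented for illustration; scikit-learn assumed available):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical per-image outputs: ground-truth labels (1 = pathology)
# and the model's predicted probability of pathology
y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0])
y_prob = np.array([0.9, 0.8, 0.4, 0.3, 0.1, 0.6, 0.7, 0.2])

y_pred = (y_prob >= 0.5).astype(int)  # binarise at a chosen cut-off

tp = np.sum((y_pred == 1) & (y_true == 1))
tn = np.sum((y_pred == 0) & (y_true == 0))
fp = np.sum((y_pred == 1) & (y_true == 0))
fn = np.sum((y_pred == 0) & (y_true == 1))

sensitivity = tp / (tp + fn)         # proportion of pathology detected
specificity = tn / (tn + fp)         # proportion of normals correctly cleared
accuracy = (tp + tn) / len(y_true)   # overall proportion correct
auc = roc_auc_score(y_true, y_prob)  # threshold-free ranking performance

print(f"Sens {sensitivity:.0%}, spec {specificity:.0%}, "
      f"acc {accuracy:.0%}, AUC {auc:.2f}")
```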
Three retrospective CNN studies achieved comparable diagnostic performance, with sensitivities of 79–87% and specificities of 83.2–87%.7,8 One of these CNNs used images classified by location and achieved a significantly higher accuracy than endoscopists (by 5.3%).7
Preliminary analysis of prospectively collected case–control data, using an artificial neural network (ANN) trained with refined feature selection, yielded an accuracy of greater than 80% for the detection of intestinal metaplasia, atrophy and the severity of gastric inflammation.3
Linked colour imaging (LCI) demonstrates encouraging preliminary findings, with a high accuracy of greater than 93%4,9 and performance comparable to blue laser imaging (BLI), with areas under the curve (AUCs) of 0.95 and 0.96, respectively,4 but requires large-scale study.
Barrett’s neoplasia
Application of AI in this remit is for the detection and diagnosis of Barrett’s-associated neoplasia, currently surveyed using the conventional Seattle biopsy protocol, and is summarised in Table 1.10 Retrospective studies demonstrate an overall accuracy for the detection of Barrett’s neoplasia of approximately 83–85%.11–13 Preliminary novel findings of a recent study evaluating histograms of volumetric laser endomicroscopy (VLE) images appear to further optimise diagnostic performance.14 ML using decision trees (DTs), applied to videos of nondysplastic Barrett’s and Barrett’s-associated dysplasia undergoing acetic acid examination with i-SCAN (PENTAX), was built upon the mucosal evaluation of three expert endoscopists. The overall diagnostic accuracy of the DT model was 92%, and nonexpert endoscopists’ accuracy improved when they used the DT algorithm.15 A real-time diagnostic study using DL demonstrated an overall accuracy of 89.9% in a small test sample of 14 cases [36 images of early adenocarcinoma and 26 of nondysplastic Barrett’s oesophagus (BE)], having used only 129 images for training.6
Although the current data do not satisfy the preservation and incorporation of valuable endoscopic innovation (PIVI) thresholds for advanced imaging technologies of a per-patient sensitivity of ⩾90% and a negative predictive value of ⩾98%,16 this appears achievable through further optimisation of algorithms in large-scale studies.
Squamous cell neoplasia
Applications of AI include the diagnosis of squamous cell neoplasia (SCC) and the estimation of depth of invasion, a pertinent factor in deciding the appropriate management strategy. Studies span a range of imaging modalities and are summarised in Table 1.
Diagnosis
White light endoscopy
A retrospective CNN demonstrated excellent diagnostic performance for the detection of early SCC and its differentiation from inflammation.17 In preliminary evaluation, a CNN developed on static images detected SCC on unseen videos in 8 of 10 patients.18
Narrow band imaging
DL using a CNN was piloted on manually generated regions of interest (ROIs) of narrow band imaging (NBI) intrapapillary capillary loops.19 A subsequent large-scale study demonstrated rapid, accurate analysis, taking 27 s to process the test database (1118 images).20 A multicentre NBI DL study trained a model for real-time diagnosis on 2770 images of dysplasia and early neoplasia and 3703 images of benign lesions. Validation on images and videos, with heatmap generation for neoplasia, demonstrated high performance on static images, with an AUC of 0.989 over four validation datasets.21
High-resolution microendoscopy and endocytoscopy
A tablet-interfaced high-resolution microendoscopy system was designed to facilitate morphometric nuclear analysis, from which neoplasia can be diagnosed with an AUC of 0.937.22
Preliminary findings of endocytoscopy-based CNNs demonstrated encouraging results, with a sensitivity of 92.6% on a test database of 55 patients (n = 27 with neoplasia). A step further was the development of a GoogLeNet CNN with a larger training dataset: a model trained on 4715 images was evaluated on a test set of 1520 images from 55 patients. The area under the receiver operating characteristic (ROC) curve was 0.9 with magnification, and the model correctly diagnosed 25 of 27 neoplastic and 25 of 28 benign oesophageal lesions.23
Depth of invasion
Evaluation of a retrospectively trained CNN on SCCs to predict the depth of invasion in 68 patients (n = 52 SM1 and n = 16 SM2) demonstrated a high negative predictive value (NPV) of 91.6% for ⩾SM2 invasion under white light endoscopy (WLE).24
Gastric neoplasia
The functions of AI related to gastric neoplasia include diagnosis, estimation of depth of invasion and delineation of borders: vital information for treatment strategising. AI models driven by various imaging modalities have been evaluated and are summarised in Table 1.
Diagnosis and delineation of neoplastic borders
WLE
The inaugural CNN study yielded high sensitivity (92.2%) but a poor positive predictive value (30.6%).25 Favourable results for real-time diagnosis have been demonstrated, with detection of 64 of the 68 cancerous lesions in 62 patients.5
A real-time multicentre study developed and validated the DL-based Gastrointestinal Artificial Intelligence Diagnostic System (GRAIDS). A large training set of 1,036,496 images from 84,424 patients was used, with accuracy in the external validation sets ranging from 91.5% to 97.7%: similar to expert endoscopists but superior to senior doctors/trainees, demonstrating the potential to optimise and standardise practice.26
Virtual chromoendoscopy (Fujinon intelligent chromoendoscopy)
Early studies using Fujinon intelligent chromoendoscopy (FICE)-driven AI models revealed modest results (Table 1);27 however, diagnostic accuracy improved with BLI and a larger validation set of 100 early gastric cancers.28
NBI
Diagnosis with magnification NBI using support vector machine algorithms, trained on only 126 images, demonstrated excellent diagnostic performance with a sensitivity of 96.7% and specificity of 95%; however, performance was limited for delineating lesion borders.29 Further study using magnification NBI CNNs on a larger training dataset of more than 2000 images yielded an overall accuracy of 90.9% on a test dataset of 341 images (170 cancer images).30
Further CNNs have been trained to differentiate gastric neoplasia from gastritis using NBI with high NPV (91.7%) in a modest test dataset of 258 images with a rapid image evaluation (0.02 s per image).31
Depth of invasion
Unvalidated neural networks have demonstrated potential for prediction of depth of invasion, discriminating T1a from T1b lesions with an AUC of 0.851.32 A feed-forward ANN demonstrated an overall accuracy of 64.7%, with per-depth accuracies of 77% for T1, 49% for T2, 51% for T3 and 55% for T4.33 Using a larger training and testing dataset, DL was highly specific (96%) for SM2 or deeper invasive cancers: a useful attribute where specificity is the priority for therapy planning.34
Surveillance
Gastric mapping using image retrieval networks from an index endoscopic examination offers the potential for real-time guidance in retargeting areas of concern, which could be extended to colonic surveillance.35
Small bowel
In addition to pathology detection, applications of AI in the small bowel include image enhancement, three-dimensional (3D) image reconstruction and localisation. AI-augmented capsule reading may automate the reading process and therefore reduce reading time.
Image enhancement
ML algorithms have been developed to reduce artefact interference within frames using wavelet transformation,36 deblurring algorithms and adaptive learning algorithms.37,38
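As an illustration of the wavelet-based approach, the generic denoising sketch below uses the PyWavelets library to soft-threshold detail coefficients with a universal threshold; it is a textbook scheme under stated assumptions, not any of the published algorithms.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(channel, wavelet="db4", level=2):
    """Soft-threshold the detail coefficients of one image channel (2D array).

    Generic wavelet denoising: decompose, shrink the detail bands with a
    universal threshold estimated from the finest band, then reconstruct.
    """
    coeffs = pywt.wavedec2(channel, wavelet, level=level)
    # Noise sigma estimated from the finest diagonal detail band (robust MAD)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(channel.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thresh, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

noisy = np.random.default_rng(1).random((256, 256))  # stand-in frame channel
clean = wavelet_denoise(noisy)
```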
Localisation and 3D image reconstruction
Algorithms for vision-based simultaneous localisation and mapping (vSLAM) have been developed ex vivo to localise and orientate the capsule with respectable accuracy, with the potential to accurately guide therapy, including 3D luminal map reconstruction.39
Pathology detection
Video capsule endoscopy (VCE) reading is a time-consuming process that requires patience (mean capsule reading time of 30–40 min), with an inherent risk of missed pathology and high inter-reader variability, given the 50,000–100,000 frames to be viewed. QuickView (Medtronic, Dublin, Ireland) selects 10% of the most prominent images for review, with a sensitivity for pathology detection of 94%.40,41 ML algorithms using static image analysis have been proposed in relatively small datasets, with an AUC of 0.89.42
DL appears promising in this field. AlexNet, a network pretrained on static images of erosions and healthy mucosa, achieved an accuracy of circa 95%.43 Deep convolutional neural network (DCNN) performance (5360 training images of small bowel ulcerations/erosions) was tested on 10,440 images (440 containing such pathology). The AUC was 0.958 [95% confidence interval (CI), 0.947–0.968], sensitivity was 88.2% and accuracy was 90.8% using a cut-off probability score of 0.481. The DCNN evaluated the WCE images at a speed of 44.8 images/s.44
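A cut-off such as the 0.481 probability score above defines an operating point on the ROC curve. The method the authors used to select it is not stated here; one common approach, sketched below with synthetic scores, is to maximise Youden's J across candidate thresholds.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical validation outputs: labels and DCNN probability scores
rng = np.random.default_rng(2)
y_true = rng.integers(0, 2, 500)
y_prob = np.clip(0.35 * rng.standard_normal(500) + 0.3 + 0.4 * y_true, 0, 1)

fpr, tpr, thresholds = roc_curve(y_true, y_prob)
j = tpr - fpr  # Youden's J = sensitivity + specificity - 1
best = np.argmax(j)
print(f"Best cut-off {thresholds[best]:.3f}: "
      f"sens {tpr[best]:.1%}, spec {1 - fpr[best]:.1%}")
```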
Bleeding
The performance of commercially available bleeding detection software such as the Suspected Blood Indicator algorithm (Medtronic) has been suboptimal (sensitivity of 55%, specificity of 58%), necessitating further development.40,45
ML demonstrated improved performance (sensitivity of 89.5%, specificity of 96.8%) using ROI generation and DT-based learning algorithms, with the ground truth determined from human-labelled images.46
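A schematic version of such a pipeline follows, with assumed per-ROI colour features (the published feature set is not detailed here) and a scikit-learn decision tree trained on hypothetical human-labelled ROIs.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

def roi_features(roi):
    # Simple colour features per region of interest: channel means and
    # the red-dominance ratio often associated with luminal blood
    r, g, b = roi[..., 0].mean(), roi[..., 1].mean(), roi[..., 2].mean()
    return [r, g, b, r / (g + b + 1e-6)]

# Hypothetical human-labelled ROIs (1 = bleeding): random stand-ins here
rng = np.random.default_rng(3)
rois = rng.random((200, 32, 32, 3))
labels = rng.integers(0, 2, 200)
X = np.array([roi_features(roi) for roi in rois])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
print(f"Held-out accuracy: {clf.score(X_te, y_te):.1%}")
```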
DL offers advantages over ML through superior feature selection for the detection of bleeding. The consistency of static image and video collection in capsule endoscopy makes it an attractive substrate for AI.
DCNNs appear to further improve the detection of small bowel angioectasia, with an F1 score of 0.9955 on a test dataset of 10,000 images.47 Subsequent analysis of 10,488 images (488 with angioectasia) revealed an AUC of 0.998 and an NPV of 99.9%.48 Comparable data have been observed using a semantic segmentation approach with a CNN. A total of 4166 VCE videos were used from a national multicentre database, from which 600 control and 600 angioectasia frames were selected and divided for training and testing. The CNN yielded a sensitivity of 100%, specificity of 96% and NPV of 100%. The total automated video analysis time was 39 min for a capsule of 50,000 frames.49
Colonoscopy
Colonoscopy with computer-aided detection (CADe) is one of the most attractive applications and the subject of the broadest literature available to date. Additional roles for AI include polyp characterisation and the assessment of colitis for activity and dysplasia. The availability of large datasets enables refined training of algorithms and validation studies.50
Polyp detection
CADe aims to mitigate human factors behind missed lesions, such as fatigue, distraction/inattention blindness, loss of mucosal inspection during visual scanning and endoscopist skill. DL appears to enhance polyp detection compared with ML; pre-DL data were limited by small-scale studies with few polyps (<30) and artefactual interference. Several studies of real-time CNNs illustrate the potential of AI in this area. A 3D video-trained CNN (3,017,088 manually annotated frames, 930 polyps) yielded a sensitivity of 84% for protruded lesions and 87% for flat lesions in real time.51 Further real-time evaluation of a CNN (trained on >8000 images from 2000 patients) demonstrated an impressive diagnostic accuracy of 96% and appeared superior to human detection of polyps (45 versus 36), although this requires validation.52 Large data, including training from 1290 patients and validation cohorts of 1138 patients (27,113 images), demonstrated a DL algorithm sensitivity of 94.3% and an AUC of 0.984.53 Further development of real-time AI models echoes these findings, achieving real-time feedback within 0.03 s/frame with a sensitivity of 93.7% for the detection of nonpolypoid lesions.54
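All of these real-time systems share the same basic loop: grab a frame, run the detector and overlay detections within the per-frame latency budget. The sketch below is schematic, using OpenCV for video I/O; `detect_polyps` is a hypothetical placeholder for a trained CADe model.

```python
import time
import cv2  # OpenCV for video I/O and drawing

def detect_polyps(frame):
    """Hypothetical stand-in for a trained CADe model.

    A real system would run a CNN here and return bounding boxes
    as (x, y, width, height) tuples.
    """
    return []

cap = cv2.VideoCapture(0)  # live endoscope feed or a recorded video file
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    start = time.perf_counter()
    for (x, y, w, h) in detect_polyps(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    latency = time.perf_counter() - start
    # Real-time feedback requires per-frame latency around 0.03 s or less
    cv2.putText(frame, f"{latency * 1000:.0f} ms", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("CADe", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```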
There are five randomised trials of CADe to date with adenoma detection rate (ADR) as the primary outcome (Table 2). The comparator in all randomised trials is conventional colonoscopy (CC) using WLE, and all have demonstrated a significantly higher ADR with CADe.55–59 The first randomised controlled trial (RCT) demonstrated a higher polyp detection rate and ADR using CADe (0.29 versus 0.2); however, the increased benefit was limited to diminutive lesions in the left colon.56 A large validation study used a commercially available AI system (GI-Genius; Medtronic) on videos of 2684 polyps from 840 patients, with 1.5 million manually annotated frames of these polyps randomised to validation and training datasets by patient. A test dataset of n = 105 (338 polyps) yielded a sensitivity of 99.7% (one missed polyp) and a false positive proportion of less than 1% of frames per colonoscopy. The reaction time of AI to polyp detection was faster than that of endoscopists in the majority of cases.60 In the double-blind trial, subanalysis showed that the polyps initially missed by the endoscopist in the CADe arm were small or on the edge of the visual field.55 An automated quality control system (AQCS) developed on DCNNs additionally measures withdrawal time, withdrawal stability and bowel preparation alongside polyp detection; it demonstrated a significant increase in ADR along with prolonged exposure time (7.03 versus 5.68 min, p < 0.001) and an improved adequate bowel preparation rate.58 A randomised trial evaluated a 3D DCNN against CC in 1026 patients in the complete case population. Withdrawal time was again prolonged in the CADe cohort, with a false positive risk of 0.071 per colonoscopy and no polyps missed.59
Table 2.
Summary of randomised trials with adenoma detection rate (ADR) as the primary outcome.
| Author | Centres | Randomisation | CADe model | Comparator | No. of patients | ADR AI (%) | ADR CC (%) | OR (95% CI) |
|---|---|---|---|---|---|---|---|---|
| Wang and colleagues56 | 1 | 1:1, unblinded | DCNN | CC (WLE) | 1058 (CADe 522, CC 536) | 29.1 | 20.3 | 1.61 (1.213–2.135) |
| Wang and colleagues55 | 1 | 1:1, double-blinded | DCNN | CC (WLE) | 1010 (CADe 508, CC 502) | 34 | 28 | 1.36 (1.03–1.79) |
| Gong and colleagues57 | 1 | 1:1, unblinded | DCNN (ENDOANGEL) | CC (WLE) | 704 (CADe 355, CC 349) | 16 | 8 | 2.30 (1.40–3.77) |
| Su and colleagues58 | 1 | 1:1, unblinded | DCNN | CC (WLE) | 659 (CADe 308, CC 315) | 28.9 | 16.5 | 2.055 (1.397–3.024) |
| Liu and colleagues59 | 1 | 1:1, unblinded | DCNN | CC (WLE) | 1026 (CADe 508, CC 518) | 39 | 23 | 1.64 (1.201–2.220) |
AI, artificial intelligence; CADe, computer-assisted detection; CC, conventional colonoscopy; CI, confidence interval; DCNN, deep convolutional neural network; OR, odds ratio; WLE, white light endoscopy.
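The odds ratios and CIs in Table 2 follow directly from 2 × 2 adenoma-detection counts. As a sanity check, the sketch below reproduces the first trial's OR and CI; the counts 152/522 and 109/536 are back-calculated from the reported ADRs, so they are inferred rather than quoted figures.

```python
import math

def odds_ratio(a, b, c, d):
    """OR with a 95% Wald CI from a 2x2 table:
    a/b = with/without adenoma (CADe), c/d = with/without adenoma (CC)."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    return or_, lo, hi

# First RCT: 152/522 CADe and 109/536 CC patients with >=1 adenoma
print("OR %.2f (95%% CI %.3f-%.3f)" % odds_ratio(152, 522 - 152, 109, 536 - 109))
# -> OR 1.61 (95% CI 1.213-2.135), matching Table 2 to rounding
```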
Polyp diagnosis
AI-augmented polyp characterisation may serve to support the PIVI criteria for diminutive polyps, including resect and discard (PIVI-1) and diagnose and leave (PIVI-2).61 Furthermore, it can provide an educational adjunct to endoscopist skill development in polyp diagnosis and, finally, guide endoscopic therapy through the evaluation of depth of invasion. AI models have been studied across all advanced imaging modalities using a range of image substrates. WLE-based models, although an attractive option given the global availability of WLE, have limited performance (accuracy ca. 70%) even with DL and a reasonable sample size.62 Further refinement of DL with the ‘deep capsule neural network’ demonstrates the ability to diagnose and differentiate hyperplastic polyps, adenomas and serrated adenomas, traditionally difficult for existing AI models.63
NBI
NBI-driven (Olympus, Tokyo, Japan) AI is the most extensively studied modality to date. ML of magnification NBI differentiated neoplastic and nonneoplastic lesions with a sensitivity of 85–93%.64,65 Retrospective DL studies surpassed PIVI-2 criteria with an NPV of 91.5–97%.66,67 Prospective evaluation with real-time magnification NBI also exceeded the PIVI-2 criteria with an NPV of 93.3% and was also able to predict surveillance strategy in exact agreement with histology in 92.7%.68
Chromoendoscopy
Preliminary models identifying pit patterns are encouraging (accuracy of 98.5%).69 Image segmentation using wavelet textural approaches also demonstrates potential and awaits large-scale study.70
Autofluorescence imaging
Preliminary study of autofluorescence imaging (AFI) based on green:red ratios demonstrates an NPV of 96.1% in a small patient population (n = 27).71 Real-time assessment demonstrated encouraging results with a sensitivity of 94.2%, specificity of 88.9% and NPV of 85.2%.72
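A minimal sketch of the green:red ratio principle follows, assuming that neoplasia lowers the ratio and using an invented cut-off; published systems derive their thresholds from training data.

```python
import numpy as np

def classify_afi(frame, cutoff=0.6):
    """Classify an AFI frame from its mean green:red channel ratio.

    frame: H x W x 3 RGB array. Neoplastic tissue attenuates
    autofluorescence, shifting the green:red balance; the cut-off
    here is purely illustrative.
    """
    mean_green = frame[..., 1].mean()
    mean_red = frame[..., 0].mean() + 1e-6  # avoid division by zero
    ratio = mean_green / mean_red
    return ("neoplastic" if ratio < cutoff else "non-neoplastic"), ratio

frame = np.random.default_rng(4).random((480, 640, 3))  # stand-in AFI frame
label, ratio = classify_afi(frame)
print(f"{label} (green:red = {ratio:.2f})")
```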
Confocal laser endomicroscopy
Initial studies of confocal laser endomicroscopy (CLE) differentiated advanced colorectal cancer with an accuracy of 84.5%74 and neoplastic from nonneoplastic polyps with an accuracy of 89.6%,73 in addition to identifying optimum images for clinician review.75,76
Endocytoscopy (EndoBrain)
EndoBrain is one of the first regulatory body–approved AI systems. Initial studies using methylene blue nuclear staining identified neoplastic features with 89.2% accuracy.77,78 Further development for the diagnosis of diminutive polyps (n = 144), involving textural analysis and a support vector machine classifier, achieved an NPV of 98%, satisfying PIVI-2.79 With NBI replacing the need for methylene blue, an accuracy of 90% was achieved.80 Retrospective data of 5843 endocytoscopic images (375 lesions in n = 242) differentiated invasive cancer with an accuracy of 94.1% in 200 test images.81 Prospective real-time evaluation of EndoBrain on 466 diminutive lesions (n = 791 patients) in the rectosigmoid area satisfied the PIVI-2 criteria with an NPV of 93.7%.82 A recent retrospective comparison of endoscopists (trainee and expert) against EndoBrain on both methylene blue (MB) endocytoscopy and NBI images demonstrated excellent overall accuracy for EndoBrain (98% MB and 96% NBI), which outperformed all endoscopists in sensitivity and NPV.83
Assessment of colitis
NBI-EndoBrain has been evaluated for the assessment of mucosal healing in ulcerative colitis (UC). A retrospective analysis of 187 patients (100 used for validation) demonstrated a high specificity of 97% for mucosal inflammation.84 A retrospective WLE CNN (GoogLeNet) model performed favourably, with an area under the receiver operating characteristic curve (AUROC) of 0.86 for Mayo 0 and 0.98 for Mayo ⩽1, and the highest accuracy for rectal images (AUROC of 0.92).85 A large-scale study of a DCNN constructed on 40,758 images from 2012 patients with UC, validated prospectively in 875 patients, detected endoscopic remission with 90.1% accuracy and, importantly, histological remission with 92.9% accuracy and a kappa coefficient of agreement with histology of 0.86.86
Quality of endoscopy
Education
The first RCT of AI-augmented gastroscopy revealed significantly fewer blind spots with AI-assisted WLE (WISENSE) than with conventional examination (5.86% versus 22.46%, p < 0.001),87 which could potentially translate into improved endoscopist skill and quality.
ML with DTs built on video substrates of nondysplastic Barrett’s and Barrett’s-associated dysplasia using i-SCAN (PENTAX) in 40 patients demonstrated improved diagnostic accuracy among nonexpert endoscopists using the DT algorithm.15
Automated endoscopy reporting
Piloting of CNN-automated procedure labelling, such as colonic intubation time, caecal recognition and withdrawal time, on video-recorded colonoscopy illustrates high accuracy when compared with manual recording (R² = 0.995).88
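Given per-frame phase labels from such a CNN, the timing metrics reduce to simple arithmetic; the sketch below assumes a hypothetical label stream at 25 frames/s.

```python
# Hypothetical per-frame phase labels emitted by a CNN, at an assumed 25 fps:
# "insertion" until the caecum is first recognised, then "withdrawal"
FPS = 25
labels = ["insertion"] * (6 * 60 * FPS) + ["withdrawal"] * (8 * 60 * FPS)

caecum_frame = labels.index("withdrawal")  # first caecal recognition
intubation_time = caecum_frame / FPS / 60  # minutes to reach the caecum
withdrawal_time = (len(labels) - caecum_frame) / FPS / 60

print(f"Intubation {intubation_time:.1f} min, withdrawal {withdrawal_time:.1f} min")
```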
Triaging of endoscopy referrals
The triaging of referrals in GI endoscopy is subject to high variance depending on the route of referral and straight-to-endoscopy qualifiers, which can result in delays in patient pathways. Natural language processing can assist with ‘auto-triaging’ suspected cancer referrals, providing community clinicians with a vetting tool for onward management, including direct endoscopy referrals.89 NHS England (UK) recently introduced an AI-supported application, ‘C the Signs’, a class 1 device registered with the Medicines and Healthcare products Regulatory Agency (MHRA) that uses National Institute for Health and Care Excellence (NICE) guidance and is designed to support primary care clinicians in identifying the investigations and referrals required.90
Future considerations and directions
Digital reform is inevitable and necessary, and has arrived in multiple spectra of healthcare. AI-augmented colonoscopy has been acknowledged in the recently updated European Society of Gastrointestinal Endoscopy (ESGE) advanced imaging guidance for the diagnosis and detection of colorectal neoplasia.91 Further multicentre randomised trials are warranted to evaluate both the effect and the validity of AI software across patient populations and are required to support its global uptake. Current AI software is trained on pristine images; further development of algorithms that can adapt to real-life artefacts such as faecal residue and mucus, yet still detect polyps, is required. Adopting AI into routine endoscopic practice will bring a new age of responsibility regarding data storage and protection, which will require further attention. A potential limitation to the incorporation of AI-augmented endoscopy is cost, relating to the installation of hardware and software, ongoing maintenance and upgrades, and additional computational requirements. In addition, there is the hypothetical threat of behavioural reliance on AI for endoscopic diagnosis; however, the role of AI is as an adjunct to diagnosis, and it can potentially be implemented to improve endoscopic diagnosis through self-learning. Furthermore, endoscopic management is a multifaceted, patient-specific decision process, of which AI-augmented endoscopic imaging is one facet. Future focus on trainees/nonexpert endoscopists and QALY analyses will be required to inform the potential wider benefits and impact of adopting AI-augmented endoscopy into routine clinical practice. It is our duty to patients to use technology to advance care to the maximum benefit, making for an exciting future ahead for endoscopic practice.
Footnotes
Conflict of interest statement: The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The authors received no financial support for the research, authorship, and/or publication of this article.
ORCID iDs: Shraddha Gulati
https://orcid.org/0000-0002-1394-011X
Helmut Neumann
https://orcid.org/0000-0002-7433-405X
Contributor Information
Shraddha Gulati, King’s Institute of Therapeutic Endoscopy, King’s College Hospital NHS Foundation Trust, London, UK.
Andrew Emmanuel, King’s Institute of Therapeutic Endoscopy, King’s College Hospital NHS Foundation Trust, London, UK.
Mehul Patel, King’s Institute of Therapeutic Endoscopy, King’s College Hospital NHS Foundation Trust, London, UK.
Sophie Williams, King’s Institute of Therapeutic Endoscopy, King’s College Hospital NHS Foundation Trust, London, UK.
Amyn Haji, King’s Institute of Therapeutic Endoscopy, King’s College Hospital NHS Foundation Trust, London, UK.
Bu’Hussain Hayee, King’s Institute of Therapeutic Endoscopy, King’s College Hospital NHS Foundation Trust, London, UK.
Helmut Neumann, Department of Interdisciplinary Endoscopy, University Hospital Mainz, 55131 Mainz, Germany.
References
- 1. Gulati S, Patel M, Emmanuel A, et al. The future of endoscopy: advances in endoscopic image innovations. Dig Endosc 2020; 32: 512–522.
- 2. Mori Y, Kudo S, Mohmed HEN, et al. Artificial intelligence and upper gastrointestinal endoscopy: current status and future perspective. Dig Endosc 2019; 31: 378–388. DOI: 10.1111/den.13317.
- 3. Huang C, Sheu B, Chung P, et al. Computerized diagnosis of Helicobacter pylori infection and associated gastric inflammation from endoscopic images by refined feature selection using a neural network. Endoscopy 2004; 36: 601–608.
- 4. Nakashima H, Kawahira H, Kawachi H, et al. Artificial intelligence diagnosis of Helicobacter pylori infection using blue laser imaging-bright and linked color imaging: a single-center prospective study. Ann Gastroenterol 2018; 31: 462–468.
- 5. Ishioka M, Hirasawa T, Tada T. Detecting gastric cancer from video images using convolutional neural networks. Dig Endosc 2019; 31: e34–e35.
- 6. Ebigbo A, Mendel R, Probst A, et al. Real-time use of artificial intelligence in the evaluation of cancer in Barrett’s oesophagus. Gut 2020; 69: 615–616.
- 7. Shichijo S, Nomura S, Aoyama K, et al. Application of convolutional neural networks in the diagnosis of Helicobacter pylori infection based on endoscopic images. EBioMedicine 2017; 25: 106–111. DOI: 10.1016/j.ebiom.2017.10.014.
- 8. Itoh T, Kawahira H, Nakashima H, et al. Deep learning analyzes Helicobacter pylori infection by upper gastrointestinal endoscopy images. Endosc Int Open 2018; 6: E139–E144.
- 9. Yasuda T, Hiroyasu T, Hiwa S, et al. Potential of automatic diagnosis system with linked color imaging for diagnosis of Helicobacter pylori infection. Dig Endosc 2020; 32: 373–381.
- 10. American Gastroenterological Association, Spechler S, Sharma P, et al. American Gastroenterological Association medical position statement on the management of Barrett’s esophagus. Gastroenterology 2011; 140: 1084–1091. DOI: 10.1053/j.gastro.2011.01.030.
- 11. Qi X, Sivak M, Isenberg G, et al. Computer-aided diagnosis of dysplasia in Barrett’s esophagus using endoscopic optical coherence tomography. J Biomed Opt 2006; 11: 044010.
- 12. Sommen FVD, Zinger S, Curvers WL, et al. Computer-aided detection of early neoplastic lesions in Barrett’s esophagus. Endoscopy 2016; 48: 617–624.
- 13. Shin D, Lee M, Polydorides A, et al. Quantitative analysis of high-resolution microendoscopic images for diagnosis of neoplasia in patients with Barrett’s esophagus. Gastrointest Endosc 2016; 83: 107–114.
- 14. Swager A, Sommen F, Van Der Klomp SR, et al. Computer-aided detection of early Barrett’s neoplasia using volumetric laser endomicroscopy. Gastrointest Endosc 2017; 86: 839–846. DOI: 10.1016/j.gie.2017.03.011.
- 15. Sehgal V, Rosenfeld A, Graham DG, et al. Machine learning creates a simple endoscopic classification system that improves dysplasia detection in Barrett’s oesophagus amongst non-expert endoscopists. Gastroenterol Res Pract 2018; 2018: 1872437.
- 16. Thosani N, Abu Dayyeh BK, et al. ASGE Technology Committee systematic review and meta-analysis assessing the ASGE Preservation and Incorporation of Valuable Endoscopic Innovations thresholds for adopting real-time imaging-assisted endoscopic targeted biopsy during endoscopic surveillance of Barrett’s esophagus. Gastrointest Endosc 2016; 83: 684–698.e7. DOI: 10.1016/j.gie.2016.01.007.
- 17. Tang D, Wang X, Wang L, et al. Artificial intelligence network to aid the diagnosis of early squamous cell carcinoma and esophageal inflammations in white light endoscopic images. Gastrointest Endosc 2019; 89: AB654. DOI: 10.1016/j.gie.2019.03.1148.
- 18. Shiroma S, Yoshio T, Aoyama K, et al. The application of artificial intelligence to detect esophageal squamous cell carcinoma in movies using convolutional neural networks. Gastrointest Endosc 2019; 89: AB189. DOI: 10.1016/j.gie.2019.03.142.
- 19. Zhao Y, Xue D, Wang Y, et al. Computer-assisted diagnosis of early esophageal squamous cell carcinoma using narrow-band imaging magnifying endoscopy. Endoscopy 2019; 51: 333–341.
- 20. Horie Y, Yoshio T, Aoyama K, et al. Diagnostic outcomes of esophageal cancer by artificial intelligence using convolutional neural networks. Gastrointest Endosc 2019; 89: 25–32. DOI: 10.1016/j.gie.2018.07.037.
- 21. Guo L, Xiao X, Wu C, et al. Real-time automated diagnosis of precancerous lesions and early esophageal squamous cell carcinoma using a deep learning model (with videos). Gastrointest Endosc 2020; 91: 41–51. DOI: 10.1016/j.gie.2019.08.018.
- 22. Quang T, Schwarz RA, Dawsey SM, et al. A tablet-interfaced high-resolution microendoscope with automated image interpretation for real-time evaluation of esophageal squamous cell neoplasia. Gastrointest Endosc 2016; 84: 834–841.
- 23. Kumagai Y, Takubo K, Kawada K, et al. Diagnosis using deep-learning artificial intelligence based on the endocytoscopic observation of the esophagus. Esophagus 2019; 16: 180–187.
- 24. Tokai Y, Yoshio T, Fujisaki J, et al. Application of artificial intelligence using convolutional neural networks in diagnosing the invasion depth of esophageal squamous cell carcinoma. Gastrointest Endosc 2019; 89: AB169. DOI: 10.1016/j.gie.2019.03.100.
- 25. Hirasawa T, Aoyama K, Tanimoto T, et al. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer 2018; 21: 653–660. DOI: 10.1007/s10120-018-0793-2.
- 26. Luo H, Xu G, Li C, et al. Real-time artificial intelligence for detection of upper gastrointestinal cancer by endoscopy: a multicentre, case-control, diagnostic study. Lancet Oncol 2019; 20: 1645–1654.
- 27. Miyaki R, Yoshida S, Tanaka S, et al. Quantitative identification of mucosal gastric cancer under magnifying endoscopy with flexible spectral imaging color enhancement. J Gastroenterol Hepatol 2013; 28: 841–847.
- 28. Miyaki R, Yoshida S, Tanaka S, et al. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer. J Clin Gastroenterol 2015; 49: 108–115.
- 29. Kanesaka T, Lee T, Uedo N, et al. Computer-aided diagnosis for identifying and delineating early gastric cancers in magnifying narrow-band imaging. Gastrointest Endosc 2018; 87: 1339–1344. DOI: 10.1016/j.gie.2017.11.029.
- 30. Li L, Chen Y, Shen Z, et al. Convolutional neural network for the diagnosis of early gastric cancer based on magnifying narrow band imaging. Gastric Cancer 2020; 23: 126–132. DOI: 10.1007/s10120-019-00992-2.
- 31. Horiuchi Y, Aoyama K, Tokai Y, et al. Convolutional neural network for differentiating gastric cancer from gastritis using magnified endoscopy with narrow band imaging. Dig Dis Sci 2020; 65: 1355–1363.
- 32. Yoon HJ, Kim S, Kim J, et al. A lesion-based convolutional neural network improves endoscopic detection and depth prediction of early gastric cancer. J Clin Med 2019; 8: 1–10.
- 33. Kubota K, Kuroda J, Yoshida M, et al. Medical image analysis: computer-aided diagnosis of gastric cancer invasion on endoscopic images. Surg Endosc 2012; 26: 1485–1489.
- 34. Zhu Y, Wang Q, Xu M, et al. Application of convolutional neural network in the diagnosis of the invasion depth of gastric cancer based on conventional endoscopy. Gastrointest Endosc 2019; 89: 806–815. DOI: 10.1016/j.gie.2018.11.011.
- 35. Ye M, Johns E, Walter B, et al. An image retrieval framework for real-time endoscopic image retargeting. Int J Comput Assist Radiol Surg 2017; 12: 1281–1292.
- 36. Gopi V, Palanisamy P, Niwas S. Capsule endoscopic colour image denoising using complex wavelet transform. In: Venugopal KR, Patnaik LM (eds) Wireless networks and computational intelligence. Berlin; Heidelberg: Springer, 2012, pp. 220–229.
- 37. Liu H, Lu W, Meng M. De-blurring wireless capsule endoscopy images by total variation minimization. In: Proceedings of the 2011 IEEE Pacific Rim conference on communications, computers and signal processing, Victoria, BC, Canada, 23–26 August 2011, pp. 102–106. New York: IEEE.
- 38. Wang Y, Cai C, Zou Y. Single image super-resolution via adaptive dictionary pair learning for wireless capsule endoscopy image. In: 2015 IEEE international conference on digital signal processing, Singapore, 21–24 July 2015, pp. 595–599. New York: IEEE.
- 39. Turan M, Almalioglu Y, Araujo H, et al. A non-rigid map fusion-based direct SLAM method for endoscopic capsule robots. Int J Intell Robot Appl 2017; 1: 399–409. DOI: 10.1007/s41315-017-0036-4.
- 40. McAlindon ME, Ching H, Yung D, et al. Capsule endoscopy of the small bowel. Ann Transl Med 2016; 4: 1–8.
- 41. Saurin J, Lapalus M, Cholet F, et al. Can we shorten the small-bowel capsule reading time with the ‘Quick-view’ image detection system? Dig Liver Dis 2012; 44: 477–481.
- 42. Iakovidis DK, Koulaouzidis A. Automatic lesion detection in capsule endoscopy based on color saliency: closer to an essential adjunct for reviewing software. Gastrointest Endosc 2014; 80: 877–883. DOI: 10.1016/j.gie.2014.06.026.
- 43. Fan S, Xu L, Fan Y, et al. Computer-aided detection of small intestinal ulcer and erosion in wireless capsule endoscopy images. Phys Med Biol 2018; 63: 165001.
- 44. Aoki T, Yamada A, Aoyama K, et al. Automatic detection of erosions and ulcerations in wireless capsule endoscopy images based on a deep convolutional neural network. Gastrointest Endosc 2019; 89: 357–363. DOI: 10.1016/j.gie.2018.10.027.
- 45. Yung D, Sykes C, Koulaouzidis A. The validity of suspected blood indicator software in capsule endoscopy: a systematic review and meta-analysis. Expert Rev Gastroenterol Hepatol 2017; 11: 43–51.
- 46. Noya F, Alvarez-Gonzalez M, Benitez R. Automated angiodysplasia detection from wireless capsule endoscopy. In: 2017 39th annual international conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, South Korea, 11–15 July 2017, pp. 3158–3161. New York: IEEE.
- 47. Jia X, Meng MQH. A deep convolutional neural network for bleeding detection in wireless capsule endoscopy images. In: 2016 38th annual international conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, 16–20 August 2016, pp. 639–642. New York: IEEE.
- 48. Tsuboi A, Oka S, Aoyama K, et al. Artificial intelligence using a convolutional neural network for automatic detection of small-bowel angioectasia in capsule endoscopy images. Dig Endosc 2019; 31: 13507.
- 49. Leenhardt R, Vasseur P, Li C, et al. A neural network algorithm for detection of GI angiectasia during small-bowel capsule endoscopy. Gastrointest Endosc 2019; 89: 189–194. DOI: 10.1016/j.gie.2018.06.036.
- 50. Kudo S, Mori Y, Misawa M, et al. Artificial intelligence and colonoscopy: current status and future perspectives. Dig Endosc 2019; 31: 363–371. DOI: 10.1111/den.13340.
- 51. Misawa M, Kudo S, Mori Y, et al. Artificial intelligence-assisted polyp detection system for colonoscopy, based on the largest available collection of clinical video data for machine learning. Gastrointest Endosc 2019; 89: AB646–AB647. DOI: 10.1016/j.gie.2019.03.1134.
- 52. Urban G, Tripathi P, Alkayali T, et al. Deep learning localizes and identifies polyps in real time with 96% accuracy in screening colonoscopy. Gastroenterology 2018; 155: 1069–1078. DOI: 10.1053/j.gastro.2018.06.037.
- 53. Wang P, Xiao X, Glissen Brown JR, et al. Development and validation of a deep-learning algorithm for the detection of polyps during colonoscopy. Nat Biomed Eng 2018; 2: 741–748.
- 54. Yamada M, Saito Y, Imao H, et al. Development of a real-time endoscopic image diagnosis support system using deep learning technology in colonoscopy. Sci Rep 2019; 9: 1–9.
- 55. Wang P, Liu X, Berzin T, et al. Effect of a deep-learning computer-aided detection system on adenoma detection during colonoscopy (CADe-DB trial): a double-blind randomised study. Lancet Gastroenterol Hepatol 2020; 5: 343–351.
- 56. Wang P, Berzin TM, Glissen Brown JR, et al. Real-time automatic detection system increases colonoscopic polyp and adenoma detection rates: a prospective randomised controlled study. Gut 2019; 68: 1813–1819.
- 57. Gong D, Wu L, Zhang J, et al. Detection of colorectal adenomas with a real-time computer-aided system (ENDOANGEL): a randomised controlled study. Lancet Gastroenterol Hepatol 2020; 5: 352–361.
- 58. Su J, Li Z, Shao X, et al. Impact of a real-time automatic quality control system on colorectal polyp and adenoma detection: a prospective randomized controlled study (with videos). Gastrointest Endosc 2020; 91: 415–424.
- 59. Liu W, Zhang Y, Bian X, et al. Study on detection rate of polyps and adenomas in artificial-intelligence-aided colonoscopy. Saudi J Gastroenterol 2020; 26: 13–19.
- 60. Hassan C, Wallace MB, Sharma P, et al. New artificial intelligence system: first validation study versus experienced endoscopists for colorectal polyp detection. Gut 2020; 69: 799–800.
- 61. Rex DK, Kahi C, O’Brien M, et al. The American Society for Gastrointestinal Endoscopy PIVI (Preservation and Incorporation of Valuable Endoscopic Innovations) on real-time endoscopic assessment of the histology of diminutive colorectal polyps statements. Gastrointest Endosc 2011; 73: 419–422.
- 62. Komeda Y, Handa H, Watanabe T, et al. Computer-aided diagnosis based on convolutional neural network system for colorectal polyp classification: preliminary experience. Oncology 2017; 93: 30–34.
- 63. Kandel P, LaLonde R, Ciofoaia V, et al. Colorectal polyp diagnosis with contemporary artificial intelligence. Gastrointest Endosc 2019; 89: AB403. DOI: 10.1016/j.gie.2019.03.613.
- 64. Tischendorf JJW, Gross S, Winograd R, et al. Computer-aided classification of colorectal polyps based on vascular patterns: a pilot study. Endoscopy 2010; 42: 203–207.
- 65. Gross S, Trautwein C, Behrens A, et al. Computer-based classification of small colorectal polyps by using narrow-band imaging with optical magnification. Gastrointest Endosc 2011; 74: 1354–1359. DOI: 10.1016/j.gie.2011.08.001.
- 66. Byrne MF, Chapados N, Soudan F, et al. Real-time differentiation of adenomatous and hyperplastic diminutive colorectal polyps during analysis of unaltered videos of standard colonoscopy using a deep learning model. Gut 2019; 68: 94–100.
- 67. Chen P, Lin M, Lai M, et al. Accurate classification of diminutive colorectal polyps using computer-aided analysis. Gastroenterology 2018; 154: 568–575. DOI: 10.1053/j.gastro.2017.10.010.
- 68. Kominami Y, Yoshida S, Tanaka S, et al. Computer-aided diagnosis of colorectal polyp histology by using a real-time image recognition system and narrow-band imaging magnifying colonoscopy. Gastrointest Endosc 2016; 83: 643–649. DOI: 10.1016/j.gie.2015.08.004.
- 69. Takemura Y, Yoshida S, Tanaka S, et al. Quantitative analysis and development of a computer-aided system for identification of regular pit patterns of colorectal lesions. Gastrointest Endosc 2010; 72: 1047–1051. DOI: 10.1016/j.gie.2010.07.037.
- 70. Hafner M, Gangl A, Kwitt R, et al. Improving pit-pattern classification of endoscopy images by a combination of experts. Med Image Comput Comput Assist Interv 2009; 12: 247–254.
- 71. Rath T, Tontini GE, Vieth M, et al. In vivo real-time assessment of colorectal polyp histology using an optical biopsy forceps system based on laser-induced fluorescence spectroscopy. Endoscopy 2016; 48: 557–562.
- 72. Aihara H, Saito S, Inomata H, et al. Computer-aided diagnosis of neoplastic colorectal lesions using ‘real-time’ numerical color analysis during autofluorescence endoscopy. Eur J Gastroenterol Hepatol 2013; 25: 488–494.
- 73. André B, Vercauteren T, Buchner AM, et al. Software for automated classification of probe-based confocal laser endomicroscopy videos of colorectal polyps. World J Gastroenterol 2012; 18: 5560–5569.
- 74. Stefanescu D, Streba C, Cartana E, et al. Computer aided diagnosis for confocal laser endomicroscopy in advanced colorectal adenocarcinoma. PLoS ONE 2016; 11: e0154863.
- 75. Tafreshi M, Linard N, Andre B, et al. Semi-automated query construction for content-based endomicroscopy video retrieval. Med Image Comput Comput Assist Interv 2014; 17: 89–96.
- 76. Prieto SP, Lai KK, Laryea JA, et al. Quantitative analysis of ex vivo colorectal epithelium using an automated feature extraction algorithm for microendoscopy image data. J Med Imaging 2016; 3: 1–13.
- 77. Mori Y, Kudo SE, Wakamura K, et al. Novel computer-aided diagnostic system for colorectal lesions by using endocytoscopy (with videos). Gastrointest Endosc 2015; 81: 621–629. DOI: 10.1016/j.gie.2014.09.008.
- 78. Mori Y, Kudo S, Wai P, et al. Impact of an automated system for endocytoscopic diagnosis of small colorectal lesions: an international web-based study. Endoscopy 2016; 48: 1110–1118.
- 79. Mori Y, Kudo S, Mori K. Potential of artificial intelligence-assisted colonoscopy using an endocytoscope (with video). Dig Endosc 2018; 30: 52–53.
- 80. Misawa M, Kudo SE, Mori Y, et al. Characterization of colorectal lesions using a computer-aided diagnostic system for narrow-band imaging endocytoscopy. Gastroenterology 2016; 150: 1531–1532.
- 81. Takeda K, Kudo S, Mori Y, et al. Accuracy of diagnosing invasive colorectal cancer using computer-aided endocytoscopy. Endoscopy 2017; 49: 798–802.
- 82. Mori Y, Kudo S, Misawa M, et al. Real-time use of artificial intelligence in identification of diminutive polyps during colonoscopy: a prospective study. Ann Intern Med 2018; 169: 357–366.
- 83. Kudo S, Misawa M, Mori Y, et al. Artificial intelligence-assisted system improves endoscopic identification of colorectal neoplasms. Clin Gastroenterol Hepatol. Epub ahead of print 13 September 2019. DOI: 10.1016/j.cgh.2019.09.009.
- 84. Maeda Y, Kudo S, Mori Y, et al. Fully automated diagnostic system with artificial intelligence using endocytoscopy to identify the presence of histologic inflammation associated with ulcerative colitis (with video). Gastrointest Endosc 2019; 89: 408–415. DOI: 10.1016/j.gie.2018.09.024.
- 85. Ozawa T, Ishihara S, Fujishiro M, et al. Novel computer-assisted diagnosis system for endoscopic disease activity in patients with ulcerative colitis. Gastrointest Endosc 2019; 89: 416–421. DOI: 10.1016/j.gie.2018.10.020.
- 86. Takenaka K, Ohtsuka K, Fujii T, et al. Development and validation of a deep neural network for accurate evaluation of endoscopic images from patients with ulcerative colitis. Gastroenterology 2020; 158: 2150–2157.
- 87. Wu L, Zhang J, Zhou W, et al. Randomised controlled trial of WISENSE, a real-time quality improving system for monitoring blind spots during esophagogastroduodenoscopy. Gut 2019; 68: 2161–2169.
- 88. Rombaoa C, Kalra A, Dao T, et al. Automated insertion time, cecal intubation and withdrawal time during live colonoscopy using convolutional neural networks: a video validation study. Gastrointest Endosc 2019; 89: AB619. DOI: 10.1016/j.gie.2019.03.1076.
- 89. Arnott I, Lock P. Intelligent automated triage: using robotic and cognitive automation to improve the triage and referral management pathway in gastroenterology in NHS Lothian. Edinburgh: NHS Lothian, 2019.
- 90. NHSE. C the Signs: how artificial intelligence (AI) is supporting referrals. NHS England, 2019. https://www.england.nhs.uk/cancer/case-studies/c-the-signs-how-artificial-intelligence-ai-is-supporting-referrals/
- 91. Bisschops R, East J, Hassan C, et al. Advanced imaging for detection and differentiation of colorectal neoplasia: European Society of Gastrointestinal Endoscopy (ESGE) guideline – update 2019. Endoscopy 2019; 51: 1155–1179.

