Iranian Endodontic Journal. 2024;19(2):85–98. doi: 10.22037/iej.v19i2.44842

Artificial Intelligence in Endodontics: A Scoping Review

Saeed Asgary a,*
PMCID: PMC10988643  PMID: 38577001

Abstract

Artificial intelligence (AI) is transforming diagnostic methods and treatment approaches in the constantly evolving field of endodontics. The current review discusses recent advancements in AI, with a specific focus on convolutional and artificial neural networks. AI models have proved highly beneficial in analyzing root canal anatomy, detecting periapical lesions at early stages, and providing accurate working-length determination. Moreover, they appear effective in predicting treatment success as well as in identifying various conditions (e.g., dental caries, pulpal inflammation, and vertical root fractures) and in providing second opinions for non-surgical root canal treatments. Furthermore, AI has demonstrated an exceptional ability to recognize landmarks and lesions in cone-beam computed tomography scans with consistently high precision. While AI has significantly improved the accuracy and efficiency of endodontic procedures, it remains essential to continue validating its reliability and practicality before widespread integration into daily clinical practice. Additionally, ethical considerations related to patient privacy, data security, and potential bias should be carefully examined to ensure the responsible implementation of AI in endodontics.

Key Words: Artificial Intelligence, Artificial Neural Networks, Deep Learning, Diagnostic Precision, Endodontics, Convolutional Neural Networks

Introduction

Artificial Intelligence (AI) is transforming healthcare by bridging the gap between computers and humans, enabling machines to exhibit intelligent behavior in pursuit of specific goals. Formally established as a discipline in the 1950s, AI has since evolved into a cornerstone of healthcare and promises a paradigm shift in diagnostic accuracy, treatment planning, and clinical decision-making [1]. Classical machine learning (ML) applications have garnered attention, particularly in precision medicine, where they can predict treatment success from patient characteristics. Deep neural networks (DNNs) have likewise revolutionized medical image analysis, demonstrating superior performance in tasks such as detecting cancer metastases in lymph nodes. Implementing AI can improve diagnostic accuracy, reduce healthcare costs, and enhance overall patient care [2]. Robotic process automation, a facet of AI, can handle administrative procedures and alleviate physician burnout by automating repetitive tasks. AI’s ability to extract medical insights enhances clinical decision-making, contributing to anticipatory care and targeted treatment, while ML algorithms support radiological analysis, early disease detection, and diagnosis. Digitization of provider-patient interactions through AI-driven technologies improves user experience and personalization in healthcare services, and AI has the potential to provide up-to-date guidelines, foster economic growth, and drive innovation. However, misconceptions surrounding AI fuel sensationalism and unrealistic expectations, emphasizing the need for accurate understanding and awareness in healthcare. Overall, the integration of AI represents a transformative era in healthcare, promising improved patient care, efficiency, and advancement across the medical field.

AI in dentistry

AI technology has brought about a significant transformation in the field of dentistry [3, 4]. AI applications leverage models such as convolutional neural networks (CNN) and artificial neural networks (ANN) to perform a wide range of functions within dental practice. Virtual dental assistants powered by AI ensure precision and efficiency in dental offices, performing tasks with reduced manpower and high accuracy. AI's diagnostic capabilities are particularly useful in oral and maxillofacial surgery, aiding in procedures such as dental implants and tumor removal. Design assistants such as RaPiD contribute to prosthetic dentistry by ensuring optimal aesthetic prostheses through anthropometric calculations and patient preferences. Moreover, AI facilitates personalized care in orthodontics by analyzing radiographs, predicting malocclusion, identifying cephalometric landmarks, eliminating the need for multiple laboratory procedures, and offering more precise diagnostics than human perception alone. Forensic odontology benefits from AI applications that can determine biological age and gender and analyze bite marks. Dental radiology harnesses AI's ability to recognize teeth, diagnose conditions such as caries, and predict issues like root caries and TMJ osteoarthritis. Also, periodontics and endodontics benefit from AI algorithms that enhance the diagnosis of compromised teeth [3, 4].

Rapid evolution of AI in endodontics

The rapid progress of AI in endodontics illustrates its potential to revolutionize patient care in this specialized field. Specifically, the emergence of CNN has led to remarkable advancements in diagnostic accuracy and treatment planning [5]. For instance, CNNs have demonstrated exceptional capabilities in tasks such as identifying intricate root canal morphology, determining working length (WL), detecting vertical root fractures (VRF), predicting thrust force and torque in canal preparation, and detecting subtle signs of pathology in radiographic images [6]. These advances enable endodontists to make more precise diagnoses and develop tailored treatment plans, ultimately improving patient outcomes.

In addition to enhancing precision/efficiency, AI offers a range of other benefits in endodontics. For example, AI-powered algorithms can aid in treatment planning by analyzing patient data and predicting the optimal course of action based on individual characteristics and treatment objectives. Furthermore, AI can help reduce errors by providing real-time feedback during procedures and alerting clinicians to potential complications or deviations from optimal treatment protocols. Moreover, AI-driven technologies enable earlier detection of endodontic issues, allowing for timely intervention and prevention of disease progression.

The integration of AI into endodontic practice is supported by thorough research methodologies aimed at evaluating the reliability/accuracy of AI-driven diagnostic tools and treatment algorithms. Studies utilizing large datasets of clinical cases and employing rigorous validation techniques, such as cross-validation and external validation, provide compelling evidence for the efficacy of AI in endodontics. Moreover, the seamless integration of AI into clinical workflows is facilitated by user-friendly interfaces and interoperability with existing dental software systems.
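
For illustration only, the following scikit-learn sketch shows what such a validation workflow (internal cross-validation plus a held-out external test set) can look like in code; the feature matrices, labels, and random-forest model are synthetic placeholders and do not correspond to any study in this review.

```python
# Minimal sketch: internal cross-validation plus external validation (scikit-learn).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature matrices: rows = cases, columns = radiographic/clinical features.
X_internal, y_internal = rng.normal(size=(200, 12)), rng.integers(0, 2, 200)
X_external, y_external = rng.normal(size=(80, 12)), rng.integers(0, 2, 80)

model = RandomForestClassifier(n_estimators=200, random_state=0)

# 5-fold cross-validation on the development (internal) dataset.
cv_auc = cross_val_score(model, X_internal, y_internal, cv=5, scoring="roc_auc")
print(f"Internal 5-fold AUC: {cv_auc.mean():.2f} +/- {cv_auc.std():.2f}")

# External validation: train on all internal data, test on unseen external cases.
model.fit(X_internal, y_internal)
print(f"External accuracy: {model.score(X_external, y_external):.2f}")
```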

Despite its transformative potential, the integration of AI into endodontic practice is not without challenges and limitations. For example, concerns related to data privacy and security must be addressed to ensure the ethical and responsible use of patient data in AI-driven applications. Additionally, the reliance on AI algorithms may pose challenges in cases where clinical judgment and expertise are required to interpret complex clinical scenarios or make treatment decisions.

Role of CNN and ANN

The combination of CNNs and ANNs has led to significant progress in endodontics, especially in tasks related to image recognition. Endodontists frequently use CNNs, which are a type of neural network (NN), to determine the working length (WL), detect vertical root fractures (VRFs), and evaluate root morphology. These networks comprise convolutional and pooling layers, which extract relevant features from input images, enabling accurate predictions [7]. Training CNNs involves exposing them to labeled datasets, allowing the network to learn associations between extracted features and correct labels through backpropagation and optimization. Each layer in CNNs, including convolutional, pooling, activation, batch normalization, dropout, and fully connected layers, contributes to the network’s ability to extract intricate features and make precise predictions. Understanding the role of each layer assists researchers and practitioners in designing effective CNN architectures for various computer vision tasks.
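
To make the layer roles above concrete, the minimal PyTorch sketch below assembles the named layer types into a toy classifier; the architecture, 64×64 grayscale input, and two-class output (e.g., lesion versus no lesion) are illustrative assumptions rather than a model from any of the reviewed studies.

```python
# Toy CNN combining the layer types discussed above (PyTorch).
import torch
import torch.nn as nn

class SmallEndoCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # convolutional layer
            nn.BatchNorm2d(16),                           # batch normalization
            nn.ReLU(),                                    # activation
            nn.MaxPool2d(2),                              # pooling
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),                              # dropout
            nn.Linear(32 * 16 * 16, num_classes),         # fully connected layer
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example forward pass on a batch of 64x64 grayscale radiograph crops.
logits = SmallEndoCNN()(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```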

Moreover, the advancement of ANNs has been instrumental in the development of AI in endodontics. The first generation of ANNs, characterized by linear logic networks, treated neural events using propositional logic. The second generation, known as connectionist networks, emerged in the mid-1980s and experienced a resurgence driven by the demand for AI development and debugging. This period saw the creation of various neural models and learning algorithms, contributing to the revival of ANNs. The third generation, represented by spiking neural networks (SNNs), incorporates neurobiological findings related to synaptic transmission, ion channel conductance, and spike-timing-dependent plasticity [8]. SNNs, with categories like rate encoding, paired-pulse ratio (PPR) encoding, and spike-time encoding, demonstrate enhanced capabilities in temporal coincidence and noise considerations, aligning with the intricacies of endodontic diagnoses and predictions.

Objectives of the review

The purpose of this review is to analyze the recent advances and performances of AI models in endodontics, and to examine how they can enhance prognosis predictions and improve treatment outcomes. The review will cover various applications of AI in endodontics, including determining critical parameters such as WL, VRFs, root canal failures, root morphology, and diagnostic aspects like pulpal diseases and periapical lesions. By synthesizing findings from various sources, this review aims to provide a contemporary understanding of AI’s potential to increase the accuracy, efficiency, and accessibility of endodontic services. The ultimate goal is to be a valuable resource for practitioners and researchers in endodontics, guiding the development of precise AI models tailored to the field’s unique challenges and informing future research directions.

Materials and Methods

A systematic search was conducted to find relevant articles about AI in endodontics, covering studies published up to October 15, 2023. The search protocol involved the following steps:

1. Database selection and search strategy: PubMed and Scopus were chosen as primary sources for the systematic search because they extensively cover medical/dental literature. The search strategy used carefully selected terms combined with Boolean operators: ("artificial intelligence" OR "machine learning" OR "deep learning" OR "neural network*" OR "dental diagnosis" OR "prognosis prediction") AND "endodontic*". Combining these terms with Boolean operators ensured comprehensive retrieval of relevant articles (an illustrative script for running this query appears after this list). Filters/limitations were applied to narrow the search results to articles published prior to the specified cutoff date, increasing their relevance and specificity.

2. Adherence to guidelines: Although no official reporting guidelines were followed, best practices were observed to ensure reproducibility and rigor throughout the search/review process.

3. Inclusion and exclusion criteria: Inclusion criteria were established to select articles relevant to the topic of AI applications in endodontics. Articles were included if they addressed AI technologies, machine learning algorithms, or deep learning models in the context of endodontic diagnosis, treatment planning, or clinical decision-making. Studies published in peer-reviewed journals up to the specified cutoff date were considered. Exclusion criteria included non-English articles, studies unrelated to endodontics or AI, and articles not relevant to the research focus.
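
For illustration only, the sketch below shows how the Boolean query from step 1 could be executed against PubMed using Biopython's Entrez utilities; the e-mail address, retrieval limit, and date-filter values are placeholders, and this is not the search script actually used for this review.

```python
# Illustrative sketch: running the Boolean query against PubMed via Biopython.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # required by NCBI; placeholder address

query = (
    '("artificial intelligence" OR "machine learning" OR "deep learning" '
    'OR "neural network*" OR "dental diagnosis" OR "prognosis prediction") '
    'AND "endodontic*"'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=500,
                        mindate="1900/01/01", maxdate="2023/10/15", datetype="pdat")
record = Entrez.read(handle)
handle.close()

print(f'{record["Count"]} PubMed records matched the query')
print(record["IdList"][:10])  # first ten PMIDs
```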

Each selected article underwent complete data extraction, including author identification, title, year of publication, and extraction of key findings and conclusions. Articles were systematically categorized based on their research focus and specific topics related to AI applications in endodontics to facilitate subsequent data analysis and synthesis.

Results

The systematic search yielded a substantial number of articles from the two databases. PubMed provided 131 results, while Scopus contributed 103 relevant documents. Additionally, a thorough manual search uncovered 18 pertinent articles that may have been initially overlooked. After removing duplicate publications and those deemed unrelated, 58 unique articles were identified and selected for detailed scrutiny/synthesis (Table 1). This curation process ensured a comprehensive review of AI in endodontics.

Table 1.

A succinct summary of the selected studies delving into the application of AI in endodontics is presented below. The table encompasses crucial details such as the study title, author, publication year, employed algorithms, objectives, study factors, modality, and the number of patients or images involved. The results column succinctly outlines the key findings or outcomes of each study. The classification of studies is based on their research focus, with emphasis on automated root/tooth morphology detection systems, caries detection, pulpal diagnosis, working length determination, vertical root fracture detection, automated endodontic/periapical lesions detection, endodontic prediction, case difficulty, fluid behavior, and other applications of AI in endodontics. This structured categorization facilitates a nuanced exploration of AI’s multifaceted contributions to various facets of endodontic practice.

Research focus | Objective | Author, Year [Ref.] | Algorithm | Study Factor | Modality | No. of Patients/Images | Results
Automated canal/root/tooth morphology detection systems Automated detection system for endodontic-treated teeth Chen et al. 2022 [13] CNN Retained roots, endodontic treated teeth, implants OPG Not specified Improved image segmentation and anomaly detection
Detection of root canal obturation from noisy radiographs Hasan et al. 2023 [14] YOLOv5s, YOLOv5x, YOLOv7 Endodontic treatment outcomes PRs 250 PRs Successful classification of obturation and mishaps
Assessment of root morphology on OPG Hiraiwa et al. 2019 [15] DLM Root morphology of molars OPGs 760 mandibular first molars High accuracy in root morphology diagnosis
Tooth and pulp segmentation in CBCTs Duan et al. 2021 [16] U-Net Tooth and pulp segmentation CBCT Feature Pyramid Network Accurate segmentation of tooth and pulp in CBCT images
Tooth Segmentation Lahoud et al. 2021 [17] Feature Pyramid Network Tooth morphology CBCT 433 images Fast and accurate tooth segmentation on CBCT imaging
Tooth detection and segmentation Leite et al. 2021 [18] Deep CNN Tooth detection OPG 153 OPGs Accurate and fast tooth detection and segmentation
Pulp cavity and tooth segmentation Lin et al. 2021 [19] U-Net Network Micro-CT data CBCT 30 teeth Enhanced tooth and pulp cavity segmentation
C-shaped canal detection and classification Sherwood et al. 2021 [20] DL (U-Nets) C-shaped canal classification CBCT 100 training and 35 testing Improved C-shaped canal detection and classification
C-shaped Canals Prediction Jeon et al. 2021 [21] CNN C-shaped canal prediction OPGs 1020 patients Accurate prediction of C-shaped canals on panoramic radiographs
C-shaped Canal Classification Yang et al. 2022 [22] Deep CNN C-shaped canal classification PRs 1000 teeth Effective C-shaped canal diagnosis in mandibular second molars
Accurate/automatic root canal segmentation Wang et al. 2023 [23] DentalNet and PulpNet Automatic tooth and root canal segmentation CBCT Two clinical datasets Efficient, precise, and fully automatic tooth and root canal segmentation in difficult RCTs
Caries detection Automatic diagnosis of dental diseases using CNN Ghaznavi Bidgoli et al. 2021 [24] Deep NN Dental diseases diagnosis OPG Standard dataset Automatic diagnosis of decayed, root-canaled, and restored teeth
Dental Caries Detection Oztekin et al. 2023 [25] ML models Different pre-trained models OPGs 562 subjects Accurate dental caries detection
Pulpal diagnosis To identify pulpitis through PRs Tumbelaka et al. 2014 [26] ANN Normal pulp, pulpitis, necrotic pulp PRs 20 (10 molar and 10 canine teeth) Direct reading radiography is better to be digitized for improved diagnosis validation
To diagnose deep caries and pulpitis on PRs Zheng et al. 2021 [27] CNN (VGG19, Inception V3, ResNet18) Deep caries and pulpitis PRs 844 PRs (717 for training, 127 for testing) Multi-modal CNN (ResNet18 integrated with clinical parameters) demonstrated significantly enhanced performance
WL determination To locate the minor apical foramen using radiograph features Saghiri et al. 2012 [28] ANN Perceptron ANN model PRs 50 straight single-rooted teeth ANN enhances accuracy in WL determination by radiography.
To evaluate ANN's accuracy in a cadaver model Saghiri et al. 2012 [29] ANN Location of the file in relation to AF Human cadaver 50 single-rooted teeth ANN outperformed endodontists in WL determination when compared to real measurements
To measure working length (WL) using multifrequency impedance Qiao et al. 2020 [30] NN Impedance ratios, type of tooth, and file Circuit system & impedance ratios Not specified The multifrequency impedance method using NNs showed improved accuracy and robustness in WL measurement.
Vertical root fracture detection Develop a PNN for VRF detection using DR Kositbowornchai et al. 2013 [31] Probabilistic NN 150 VRF and 50 sound teeth DR 200 images PNN proves to be an effective model for VRF detection in DR
Design a PNN to diagnose VRFs Johari et al. 2017 [32] Probabilistic NN Intact/endodontically treated teeth PRs, CBCTs 240 radiographs of teeth PNN-based models effectively diagnose VRFs in both PRs and CBCT images
Demonstrate DSR in the detection of VRFs Mikrogeorgis et al. 2018 [33] DSR Endodontically treated teeth DSR images Four clinical cases DSR proves to be a useful diagnostic tool for VRF detection
Evaluate the use of CNN for detecting VRF on OPG Fukuda et al. 2020 [34] CNN CNN-based DLM OPG 300 OPGs The CNN model detected VRFs on OPGs and serves as a CAD tool
Develop an algorithm for detecting microfractures Vicory et al. 2021 [35] AIA and ML Wavelet Features and ML CBCTs 22 teeth (14 with microfractures) The algorithm enables the quantification of microfractures in teeth
Efficiency of DLM in diagnosing VRFs on CBCT images Hu et al. 2022 [36] DLM Three DLNs CBCTs 276 teeth ResNet50 showed promise in diagnosing in vivo VRFs
Automated endodontic/periapical lesions detection Differential diagnosis of periapical lesions in CBCT scans Okada et al. 2015 [37] CAD Differential Diagnosis Histology and CBCT 28 CBCT scans CAD showed promise for noninvasive differential diagnosis of apical lesions
Automated detection of apical lesions in OPGs Birdal et al. 2016 [38] DWT Apical lesion OPGs Not specified The used methodology can efficiently assist in examining radiographs for apical lesions
DL for radiographic detection of apical lesions on OPGs Ekert et al. 2019 [39] 7-layer Deep CNN Apical Lesion Detection OPG 2001 tooth segments The deep CNN detected apical lesions on OPGs.
DLA for periapical disease detection Endres et al. 2020 [40] DLA OMF surgeons' assessments and DLM OPG 2902 OPGs DLA has the potential to assist in detecting periapical lesions.
Diagnostic performance of AI in detecting periapical pathosis Orhan et al. 2020 [41] Deep CNN Localization, lesion detection, and lesion volume CBCTs 153 lesions in 109 patients AI-based DLS were useful for detecting apical pathosis on CBCT
AI for the CAD of apical lesions Setzer et al. 2020 [42] DL Periapical lesion detection CBCTs 20 scans DL algorithm displayed high lesion detection accuracy
Disease detection on PRs Chen et al. 2021 [43] Deep CNNs Disease categories, severity levels, train strategies PRs Not specified Deep CNNs detected diseases, with varying performance based on severity and training strategies
CNNs for detecting apical lesions Li et al. 2021 [44] CNN Image database of individual tooth images Automatic diagnosis Standardized database The CNN model efficiently diagnosed apical lesions
Diagnostic performance of CNNs vs. human observers Pauwels et al. 2021 [45] Convolutional CNNs Comparison of CNNs and human observers PRs, CBCTs Simulated periapical lesions CNNs showed promise for periapical lesion detection, surpassing human observers
Automatic detection of endodontic lesions in CBCTs Calazans et al. 2022 [46] CNN (Siamese Network) Apical lesion classification CBCT 1000 scans The proposed system achieved an accuracy of about 70%, offering diagnostic support in endodontics
Determine the efficacy of AI in detecting apical radiolucencies Hamdan et al. 2022 [47] Denti.AI DL Tool Dentists' performance PRs 68 PRs Enhanced diagnostic accuracy for apical radiolucencies
Automated detection of osteolytic apical lesions Kirnbauer et al. 2022 [48] Deep CNNs Tooth localization and lesion detection CBCTs 144 CBCT images The method provided excellent results in detecting osteolytic apical lesions in CBCT.
DL for caries and apical periodontitis detection Li et al. 2022 [49] DLM Detection of dental caries and apical periodontitis PRs 4129 images DLM achieved scores of 0.829 for dental caries and 0.828 for apical periodontitis
Categorization of apical lesions based on PAI Moidu et al. 2022 [50] CNN Different PAI scores PRs 3000 areas (1950 digital PRs) The CNN model performed well in categorizing endodontic lesions
PRs image classification using Deep NN Vasdev et al. 2022 [51] Pipelined Deep NN (AlexNet model) Healthy vs. Non-Healthy Classification PRs 16000 images The AlexNet model outperformed other models in dental disease classification
Accuracy of AI in detecting periapical periodontitis on PRs Issa et al. 2023 [52] Diagnocat AI System Diagnostic Test Accuracy PRs 20 PRs (60 teeth) The AI algorithm showed high accuracy in detecting apical periodontitis
Automatic differential diagnosis of apical lesions Patel et al. 2023 [53] Image Processing Tool Differential Diagnosis PRs 60 images (gold standard dataset) The tool achieved high sensitivity, specificity, and accuracy in differential diagnosis of apical lesions
Endodontic prediction Predict the practicality of performing or not performing a retreatment Campo et al. 2016 [54] Case-Based Reasoning (CBR) Dental retreatment Not specified Not specified The system minimizes false negatives
Assess factors influencing endodontic failure & and predict failure using ML Herbst et al. 2022 [55] ML (LogR, RF, GBM, XGB) Tooth-, treatment-, and patient-level covariates Endodontic treatments 458 patients (591 teeth) Tooth-level factors strongly associated with failure
Predict prognosis of endodontic microsurgery Qu et al. 2022 [56] GBM, RF Tooth type, lesion size, type of bone defect, root filling density, etc. CBCT 234 teeth (178 patients) ML models improved the efficiency of clinical decision-making
Automated evaluation of RCT results from X-ray images Li et al. 2022 [57] AGMB-Transformer Network Anatomy features and multi-branch Transformer network X-ray images 245 endodontically treated teeth AGMB-Transformer significantly improved evaluation of RCT outcomes
Predict endodontic treatment outcome based on preoperative PRs Lee et al. 2023 [58] Deep CNN Seven clinical features PRs 598 single-root premolars Efficient, precise, and fully automatic root canal segmentation to support clinical decisions
Identify factors affecting optimal RFL during RCT and predict RFL Herbst et al. 2023 [59] ML (LogR, SVM, DT, GBM, XGB) Operator, indistinct canal paths, root canals reduced in size, retreatments Apical extent prediction 555 completed RCTs (343 patients) Limited predictive ability
Enhancing endodontic precision Latke & Narawade 2023 [60] Hybrid Ensemble Classifier Root canal curvature and calcification Dental Imaging N/A Refining endodontic treatments
Case difficulty Predict case difficulty in endodontic microsurgery Qu et al. 2023 [61] LR, SVR, XGB Lesion size, anatomical structures, root filling density, etc. CBCT 261 patients (341 teeth) XGBoost outperformed LR and SVR models
Automated assessment of case difficulty and referral decisions Mallishery et al. 2020 [62] ANN AAE Endodontic Case Difficulty Assessment Form Standardized AAE form 500 cases Automation of case difficulty assessment
Fluid behavior Fluid Motion Estimation Peeters et al. 2022 [63] AI EDDY® tip fluid flow behavior High-speed Imaging N/A Detailed fluid flow behavior analysis
EndoActivator Fluid Behavior Peeters et al. 2022 [64] AI EndoActivator tip fluid flow High-speed Imaging N/A Visualization of fluid behavior and bubbles
Other uses of AI in endodontics Detect and segment unobturated MB2 canals in CBCTs Albitar et al. 2022 [65] CNN (U-Net) Unobturated MB2 canals CBCT 57 scans Potential for identifying obturated and unobturated canals
Detect separated instruments on radiographs Buyuk et al. 2023 [66] CNN (Gabor filtered) and LSTM Separated endodontic instrument OPG 915 teeth Gabor filtered-CNN model achieved the best performance
Differentiating stress from EPT-induced electrodermal activity Kong et al. 2023 [67] Multilayer Perceptron Stress and EPT stimulation EDA Signals 51 subjects Successful discrimination between stress and EPT stimulation
Predicting pulp exposure risk in radiographic images Ramezanzade et al. 2023 [68] Multi-path NN Dental pulp exposure Bitewing radiographs 292 images DenseNet provided best predictive effect for pulp exposure
Interactive system for access cavity assessment in preclinic Choi et al. 2023 [69] Software Access cavity assessment Three-dimensional models 44/79 students Integration into preclinical curriculum for dental students
Evaluate consistency and accuracy of ChatGPT in endodontics Suárez et al. 2023 [70] ChatGPT (AI Chatbot) ChatGPT performance Clinical questions 91 dichotomous (yes/no) questions Currently, ChatGPT is not capable of replacing dentists in clinical decision-making

AGMB: Anatomy-Guided Multi-Branch; AIA: Advanced Image Analysis; ANN: Artificial Neural Networks; CAD: Computer-Aided Detection; CBCT: Cone-beam Computed Tomography; CNN: Convolutional Neural Networks; DLA: Deep Learning Algorithm; DLM: Deep Learning Models; DLN: Deep Learning Networks; DLS: Deep Learning Segmentation; DR: Digital Radiography; DSR: Digital Subtraction Radiography; DT: Decision Tree; DWT: Discrete Wavelet Transformation; GBM: Gradient Boosting Machine; LogR: Logistic Regression; LR: Linear Regression; ML: Machine Learning; NN: Neural Networks; OPG: Panoramic Radiograph; PRs: Periapical Radiographs; RF: Random Forests; RFL: Root Filling Length; SVM: Support Vector Machine; SVR: Support Vector Regression; VRF: Vertical Root Fracture; XGB: Extreme Gradient Boosting

Discussion

This review is focused solely on original research contributions in AI applications in endodontics, setting it apart from other scholarly discussions. The review process only included original studies, which provide new insights and methodologies directly from primary research. While acknowledging previous review articles [9-12], this compilation prioritizes original research, offering an authentic snapshot of the latest advancements in integrating AI technologies into endodontic practice. This approach enhances the reliability and credibility of the included studies, reinforcing the scholarly rigor of this comprehensive exploration.

Furthermore, it is important to recognize the diversity in diagnostic accuracy among different imaging modalities employed across these studies. Variations in diagnostic accuracy among panoramic radiographs/orthopantomographs (OPGs), periapical radiographs (PRs), cone-beam computed tomography (CBCT) scans, and micro-CT scans should be considered when interpreting the results. Each modality has its strengths and limitations, which can significantly influence diagnostic outcomes. Therefore, discussions should account for these differences to ensure accurate assessments and informed decision-making in clinical practice.

Automated canal/root/tooth morphology detection systems

Over the years, several studies have explored the potential of using AI to improve endodontic diagnostics. Automated systems have been developed to detect canal, root, and tooth morphology in endodontics, using advanced algorithms to analyze dental images. These systems provide endodontists with precise assessments of root canal configurations and tooth structures.

One study by Chen et al. [13] used a convolutional neural network (CNN) to create an automated detection system for endodontic-treated teeth using orthopantomograms (OPGs). This algorithm significantly improved image segmentation and anomaly detection, demonstrating its potential to enhance endodontic treatment planning. Another study by Hasan et al. [14] used YOLOv5s, YOLOv5x, and YOLOv7 to detect root canal obturation from noisy periapical radiographs (PRs), successfully classifying obturation and mishaps and demonstrating the efficacy of these algorithms in assessing endodontic treatment outcomes. Hiraiwa et al. [15] utilized a deep learning model (DLM) to assess root morphology, particularly focusing on mandibular first molars using OPGs. Their study achieved high accuracy in root morphology diagnosis, showcasing the potential of AI in enhancing diagnostic precision. Duan et al. [16] explored tooth and pulp segmentation in CBCT scans using the U-Net algorithm and a Feature Pyramid Network, demonstrating accurate segmentation and showcasing the potential for AI to contribute to detailed diagnostic processes.
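
To illustrate the U-Net family of architectures referenced in these segmentation studies, the highly simplified PyTorch encoder-decoder below includes a single skip connection; the depth, channel counts, and three-class output (e.g., background/tooth/pulp) are illustrative assumptions, not a reconstruction of any published model.

```python
# Highly simplified U-Net-style encoder-decoder with one skip connection (PyTorch).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(),
    )

class TinyUNet(nn.Module):
    def __init__(self, num_classes=3):  # e.g., background / tooth / pulp
        super().__init__()
        self.enc = conv_block(1, 16)
        self.down = nn.MaxPool2d(2)
        self.bottleneck = conv_block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec = conv_block(32, 16)          # 32 = upsampled 16 + skip 16
        self.head = nn.Conv2d(16, num_classes, kernel_size=1)

    def forward(self, x):
        e = self.enc(x)                        # encoder features (skip connection)
        b = self.bottleneck(self.down(e))      # bottleneck at half resolution
        d = self.dec(torch.cat([self.up(b), e], dim=1))  # decode with skip
        return self.head(d)                    # per-pixel class logits

masks = TinyUNet()(torch.randn(2, 1, 128, 128))
print(masks.shape)  # torch.Size([2, 3, 128, 128])
```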

Lahoud et al. [17] utilized the Feature Pyramid Network for tooth segmentation, focusing on tooth morphology in CBCT imaging. This indicates the versatility of AI in addressing specific diagnostic needs. Leite et al. [18] implemented a Deep CNN for tooth detection using OPGs, demonstrating accurate and rapid detection and segmentation of teeth, emphasizing the efficiency AI can bring to diagnostic processes in endodontics.

Lin et al. [19] applied a U-Net, guided by micro-CT reference data, to enhance pulp cavity and tooth segmentation on CBCT scans, showcasing improved segmentation and contributing to the precision of endodontic diagnostics. Sherwood et al. [20] utilized deep learning (DL; U-Nets) for the detection and classification of C-shaped canals in CBCT scans, significantly improving C-shaped canal detection and classification. Jeon et al. [21] focused on CNNs for the prediction of C-shaped canals on OPGs, accurately predicting C-shaped canals and showcasing the potential of AI in addressing complex diagnostic challenges.

Finally, Yang et al. [22] explored the application of Deep CNNs for the classification of C-shaped canals in PRs, demonstrating effective C-shaped canal diagnosis in mandibular second molars. Wang et al. [23] employed DentalNet and PulpNet for automatic tooth and root canal segmentation in CBCT scans, showcasing efficient, precise, and fully automatic segmentation that is particularly beneficial in challenging root canal treatments.

Caries detection

Caries detection is an important area where AI has made significant strides. Machine learning algorithms have enabled accurate identification of carious lesions in dental images. For instance, Ghaznavi Bidgoli et al. [24] utilized a CNN within a deep NN framework to diagnose dental diseases automatically, using OPGs from a standard dataset. Their approach included identifying decayed, root-canaled, and restored teeth, which showcases the potential of AI in comprehensive dental disease diagnosis. Similarly, Oztekin et al. [25] focused on dental caries detection using various ML models and pre-trained models. They conducted their study on OPGs, with a dataset comprising 562 subjects, and demonstrated the accuracy of AI in detecting dental caries. These studies highlight the versatility of AI applications in dentistry, particularly in the area of caries detection, where ML models are effective in providing precise and automated diagnoses.
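
As a hedged illustration of how a pre-trained model can be adapted to such a task, the sketch below freezes an ImageNet-pretrained ResNet-18 backbone (recent torchvision) and retrains only a new two-class head; the backbone choice, class labels, and hyperparameters are assumptions and do not reproduce the cited studies.

```python
# Sketch of transfer learning for binary caries detection (torchvision assumed >= 0.13).
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet-18 and replace its classification head.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False                           # freeze pretrained features
backbone.fc = nn.Linear(backbone.fc.in_features, 2)       # caries / no caries (placeholder classes)

# Only the new head is optimized during fine-tuning.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```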

Pulpal diagnosis

The field of pulpal diagnosis has benefited greatly from the integration of AI, as demonstrated by various studies. For example, Tumbelaka et al. [26] used an ANN to differentiate between normal pulp, pulpitis, and necrotic pulp based on PRs. Their study, which involved 20 teeth (10 molars and 10 canine teeth), showed that digitizing direct reading radiography could enhance the validation of pulpal diagnoses. In a more recent investigation, Zheng et al. [27] explored the diagnosis of deep caries and pulpitis on PRs, using convolutional neural networks (CNNs) such as VGG19, Inception V3, and ResNet18. Their study, which involved a comprehensive dataset of 844 PRs (717 for training and 127 for testing), revealed that a multi-modal CNN, particularly ResNet18 integrated with clinical parameters, significantly improved the accuracy of diagnosing deep caries and pulpitis.
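
The sketch below illustrates, in general terms, the multi-modal idea of concatenating CNN image features with clinical parameters before classification, in the spirit of the approach described by Zheng et al. [27]; the layer sizes, the number of clinical variables, and the three output classes are purely illustrative assumptions.

```python
# Conceptual multi-modal classifier: image features + clinical parameters (PyTorch).
import torch
import torch.nn as nn
from torchvision import models

class MultiModalPulpNet(nn.Module):
    def __init__(self, n_clinical: int = 5, n_classes: int = 3):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()                  # expose 512-d image features
        self.backbone = backbone
        self.classifier = nn.Sequential(
            nn.Linear(512 + n_clinical, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, image, clinical):
        feats = self.backbone(image)                 # (B, 512) image embedding
        fused = torch.cat([feats, clinical], dim=1)  # append clinical parameters
        return self.classifier(fused)

logits = MultiModalPulpNet()(torch.randn(2, 3, 224, 224), torch.randn(2, 5))
print(logits.shape)  # torch.Size([2, 3])
```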

Although AI has shown promise in distinguishing between different pulpal conditions using radiographs, it is important to recognize the limitations associated with relying solely on radiographic assessment. It is crucial to emphasize the complementary role of clinical and radiographic examinations alongside other diagnostic tools, such as pulp and periapical tests. This integrated diagnostic approach ensures a thorough evaluation, thereby enhancing the accuracy and reliability of pulpal diagnoses in clinical practice.

Working length determination

Accurately determining the WL is crucial for successful endodontic treatment, and AI has played a significant role in improving this process. Saghiri et al. [28] conducted a study on locating the minor apical foramen using radiograph features, employing a perceptron-based ANN on PRs of 50 straight single-rooted teeth. The study concluded that the ANN model was effective in enhancing the accuracy of WL determination through radiography. In another study, Saghiri et al. [29] evaluated the accuracy of an ANN in a cadaver model by determining the location of the file in relation to the apical foramen. Using human cadavers with 50 single-rooted teeth, the study demonstrated that the ANN outperformed endodontists in determining the WL when compared against real measurements. This suggests that AI, particularly ANNs, has the potential to provide more precise WL measurements, even surpassing human expertise in certain scenarios. Qiao et al. [30] explored a different approach by utilizing an NN to measure WL using multifrequency impedance, incorporating factors such as impedance ratios, tooth type, and file characteristics into the circuit system. Although the number of cases was not specified, the multifrequency impedance method using NNs demonstrated improved accuracy and robustness in WL measurement. These studies collectively emphasize the potential of AI in revolutionizing WL determination in endodontic procedures, offering more accurate and reliable outcomes.
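
For illustration only, a small scikit-learn multilayer perceptron could map hand-crafted radiographic features to a working-length estimate as sketched below; the features, targets, and network size are synthetic placeholders, not the models used in the cited studies.

```python
# Minimal sketch: MLP regression from radiographic features to working length (mm).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))                     # e.g., densitometric/geometric features (synthetic)
y = 19 + 2 * X[:, 0] + rng.normal(0, 0.3, 50)    # synthetic "true" working length in millimetres

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print(f"Mean absolute error: {np.abs(model.predict(X_te) - y_te).mean():.2f} mm")
```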

Vertical root fracture detection

The detection of VRF is a difficult aspect of endodontic diagnosis, but AI has made significant advancements in this field. Several studies have contributed to the development and validation of AI models that accurately detect VRF, providing clinicians with valuable tools to identify potential treatment complications.

Kositbowornchai et al. [31] used a Probabilistic Neural Network (PNN) to develop an effective model for VRF detection using dental radiographs. Their study included 150 cases of VRF and 50 sound teeth, utilizing 200 images for training and validation. The PNN-based model demonstrated effectiveness in VRF detection in dental radiographs, showcasing the potential of AI in addressing this challenging aspect of endodontic diagnosis. Johari et al. [32] designed a PNN to diagnose VRFs in both intact and endodontically treated teeth. Their study utilized PRs and CBCT, involving 240 radiographs for evaluation. The PNN-based models were found to be effective in diagnosing VRFs in both PRs and CBCT images, highlighting the versatility of AI in fracture detection across different imaging modalities. Mikrogeorgis et al. [33] demonstrated the utility of Digital Subtraction Radiography (DSR) in the detection of VRFs in endodontically treated teeth. The study, based on four clinical cases and DSR images, showed that DSR could be a useful diagnostic tool for VRF detection, adding to the array of AI applications in this domain.
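
Because several of these studies rely on probabilistic neural networks, the didactic sketch below implements the core PNN idea: a Gaussian Parzen-window classifier that assigns each case to the class with the highest average kernel response. The four-dimensional features, smoothing parameter, and class balance are illustrative assumptions.

```python
# Didactic sketch of a probabilistic neural network (Gaussian Parzen-window classifier).
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Assign each test sample to the class with the largest average Gaussian kernel response."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = np.sum((Xc - x) ** 2, axis=1)                      # squared distances to class-c patterns
            scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))  # average kernel response
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

rng = np.random.default_rng(2)
X_vrf, X_sound = rng.normal(1.0, 0.5, (150, 4)), rng.normal(-1.0, 0.5, (50, 4))
X = np.vstack([X_vrf, X_sound])
y = np.array([1] * 150 + [0] * 50)                                  # 1 = fracture, 0 = sound
print(pnn_predict(X, y, rng.normal(1.0, 0.5, (5, 4))))              # expected mostly class 1
```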

Fukuda et al. [34] explored the use of CNNs for detecting VRFs on OPGs. Analyzing 300 OPGs, the CNN-based DLM effectively detected VRFs, serving as a computer-aided diagnosis (CAD) tool for clinicians. Vicory et al. [35] introduced an algorithm for detecting microfractures, employing advanced image analysis (AIA) and ML with wavelet features. Utilizing CBCTs of 22 teeth (14 with microfractures), their algorithm demonstrated the capability to quantify microfractures in teeth, showcasing the potential for AI in addressing subtler aspects of fracture detection. Hu et al. [36] evaluated the efficiency of DLMs in diagnosing VRFs on CBCT images. Utilizing three different deep learning networks (DLNs) and 276 teeth, ResNet50 showed promise in diagnosing in vivo VRFs, emphasizing the potential for AI in real-world clinical scenarios. These studies collectively highlight the positive impact of AI in enhancing the detection of VRFs, providing clinicians with valuable tools for improved diagnostic accuracy and treatment planning.

Automated endodontic/periapical lesion detection

The use of AI in the automated detection of endodontic and periapical lesions has revolutionized dental diagnostics. Multiple studies have demonstrated the feasibility of using ML algorithms to identify and classify lesions, leading to faster diagnosis and targeted treatment planning.

Okada et al. [37] employed CAD to distinguish between periapical lesions in CBCT scans. The study, which included 28 CBCT scans, demonstrated the potential of CAD in non-invasive differential diagnosis of apical lesions, highlighting the usefulness of AI in identifying different pathologies. Birdal et al. [38] utilized Discrete Wavelet Transform (DWT) to detect apical lesions in OPGs. Although the number of subjects was not specified, the methodology efficiently assisted in examining radiographs for apical lesions, showcasing the versatility of AI applications in different imaging modalities. Ekert et al. [39] implemented a 7-layer deep CNN for the radiographic detection of apical lesions on OPGs. With 2001 tooth segments involved in the study, the deep CNN demonstrated effectiveness in detecting apical lesions, highlighting the potential of DL in enhancing lesion detection.

In 2020, Endres et al. [40] introduced a Deep Learning Algorithm (DLA) for periapical disease detection. Utilizing 2902 OPGs and incorporating assessments by oral and maxillofacial surgeons, the DLA showed promise in assisting in the detection of periapical lesions, demonstrating the collaborative potential between AI and clinical expertise. Orhan et al. [41] investigated the diagnostic performance of AI in detecting periapical pathosis using Deep CNNs. Analyzing 153 lesions in 109 patients through CBCTs, AI-based Deep Learning Systems (DLS) were found to be useful for detecting apical pathosis, underscoring the potential for AI in contributing to accurate diagnoses. Setzer et al. [42] focused on the use of DLA for periapical lesion detection in CBCTs. With 20 scans included in the study, the DL algorithm displayed high accuracy in lesion detection, emphasizing its potential as a valuable tool in endodontic diagnostics.

In 2021, Chen et al. [43] used deep CNNs to detect disease on PRs, achieving varying performance based on severity and training strategies. Li et al. [44] implemented CNNs for detecting apical lesions using a standardized image database, demonstrating the potential for AI in lesion detection in diverse clinical scenarios. Pauwels et al. [45] compared the diagnostic performance of CNNs with human observers in the detection of simulated periapical lesions. The CNNs surpassed human observers in certain aspects, showcasing their potential for periapical lesion detection.

In 2022, Calazans et al. [46] proposed a CNN-based Siamese Network for apical lesion classification in CBCTs. The proposed system achieved an accuracy of about 70%, offering diagnostic support in endodontics. Hamdan et al. [47] evaluated the Denti.AI DL tool for the automated detection of apical radiolucencies, showcasing its potential as an adjunctive diagnostic tool. Kirnbauer et al. [48] utilized deep CNNs for the automated detection of osteolytic apical lesions in CBCTs, emphasizing the precision and efficiency of AI in lesion detection. Li et al. [49] explored the use of a DLM for the detection of dental caries and apical periodontitis in PRs, achieving high scores for both pathologies. Moidu et al. [50] employed CNNs for the categorization of apical lesions based on Periapical Index (PAI) scores, indicating their potential for precise lesion classification. Vasdev et al. [51] utilized a pipelined deep NN model (AlexNet) for PR image classification, outperforming other models in dental disease classification.

In 2023, Issa et al. [52] assessed the diagnostic test accuracy of the Diagnocat AI system in detecting apical periodontitis on PRs, showing high accuracy. Patel et al. [53] developed an image processing tool for the automatic differential diagnosis of apical lesions in PRs, achieving high sensitivity, specificity, and accuracy in the differential diagnosis of apical lesions.

Overall, these studies demonstrate the versatility and transformative potential of AI in automating the detection and classification of endodontic and periapical lesions, providing valuable tools for clinicians in their diagnostic and treatment-planning endeavors.

Endodontic prediction

Predictive modeling in endodontics has gained momentum through AI applications. In one study, Campo et al. [54] introduced a predictive model that utilized Case-Based Reasoning (CBR) to assess the practicality of performing or not performing retreatment for dental cases. The system was designed to minimize false negatives, offering valuable insights into the decision-making process. Herbst et al. [55] delved into assessing factors influencing endodontic failure and predicting failure using ML techniques such as Logistic Regression (LogR), Random Forest (RF), Gradient Boosting Machine (GBM), and XGBoost. With a study involving 458 patients and 591 teeth, the research highlighted tooth-level factors strongly associated with failure, contributing to a more nuanced understanding of treatment outcomes.
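
As a generic illustration of this kind of multi-model benchmarking (not the authors' code or data), the scikit-learn sketch below cross-validates logistic regression, random forest, and gradient boosting classifiers on synthetic tooth-level covariates; XGBoost is omitted to avoid an extra dependency, and all values are placeholders.

```python
# Sketch: comparing several classifiers for failure prediction with cross-validation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(591, 10))           # covariates for 591 teeth (placeholder values)
y = rng.integers(0, 2, 591)              # 1 = endodontic failure, 0 = success (synthetic)

models = {
    "LogR": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
    "GBM": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {auc.mean():.2f} +/- {auc.std():.2f}")
```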

Qu et al. [56] ventured into predicting the prognosis of endodontic microsurgery using a Gradient Boosting Machine (GBM) and Random Forest (RF). The study considered factors such as tooth type, lesion size, type of bone defect, and root filling density with a dataset of 234 teeth from 178 patients, demonstrating enhanced efficiency in clinical decision-making. Li et al. [57] explored the automated evaluation of root canal treatment (RCT) outcomes from X-ray images, introducing an AGMB-Transformer Network. By incorporating anatomy features and a multi-branch Transformer network, the study focused on 245 endodontically treated teeth, showcasing that the AGMB-Transformer significantly improved the evaluation of RCT outcomes.

Lee et al. [58] employed a deep CNN to predict endodontic treatment outcomes from preoperative PRs of 598 single-root premolars, incorporating seven clinical features to support clinical decision-making. In another study by Herbst et al. [59], ML techniques including LogR, support vector machines (SVM), decision trees (DT), GBM, and XGBoost were employed to identify factors affecting optimal root filling length (RFL) during RCT and to predict RFL. The study, based on 555 completed RCTs involving 343 patients, revealed limited predictive ability in this context. Latke and Narawade [60] focused on enhancing endodontic precision through a hybrid ensemble classifier, considering root canal curvature and calcification in dental imaging. Although specific case numbers were not provided, the research aimed at refining endodontic treatments through improved classification methods.

Case difficulty

The difficulty of endodontic cases can be predicted using AI applications, which can provide valuable insights for practitioners. A study by Qu et al. [61] used LR, SVR, and XGB models to assess case difficulty in endodontic microsurgery. By considering factors such as lesion size, anatomical structures, and root filling density from CBCT scans of 261 patients (341 teeth), the study found that XGBoost outperformed LR and SVR models. This makes it an advanced tool for anticipating surgical challenges. Another approach developed by Mallishery et al. [62] involved an automated system that uses ANN to assess case difficulty and support referral decisions. The system analyzed 500 cases, using the AAE Endodontic Case Difficulty Assessment Form as a standardized input. This demonstrated the potential for automation in evaluating the complexity of endodontic cases. This innovative use of AI contributes to more efficient and standardized case difficulty assessments in clinical practice.
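
The sketch below gives a hedged sense of how structured case-difficulty inputs (for example, numerically encoded AAE assessment-form items) could feed a gradient-boosted tree model; the feature encoding, labels, and use of the xgboost package are assumptions for illustration only.

```python
# Illustrative sketch: gradient-boosted trees for case-difficulty/referral classification.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(4)
X = rng.integers(0, 3, size=(500, 15))       # 15 form items scored 0-2 (placeholder encoding)
y = (X.sum(axis=1) > 15).astype(int)         # 1 = "refer" (high difficulty), synthetic rule

model = XGBClassifier(n_estimators=200, max_depth=3, eval_metric="logloss")
model.fit(X, y)
print(model.predict(X[:5]))                  # predicted referral decisions for five cases
```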

Fluid behavior

In recent years, innovative studies have explored the potential of AI in endodontics beyond traditional diagnostic tasks. Peeters et al. [63] conducted a study on fluid behavior during endodontic procedures, using AI for fluid motion estimation. The study utilized high-speed imaging to analyze the EDDY® tip fluid flow behavior, providing valuable insights into the complex dynamics of fluid motion. Another study by Peeters et al. [64] focused on the fluid behavior associated with the EndoActivator, using AI to analyze EndoActivator tip fluid flow. Through high-speed imaging, the study aimed to visualize fluid behavior and bubbles, contributing to a comprehensive understanding of the dynamics involved in this specific endodontic procedure. These studies highlight the versatility of AI applications in endodontics, showcasing its potential in addressing practical challenges and enhancing procedural insights, in addition to its diagnostic capabilities.

Other uses of AI in endodontics

AI has a wide range of applications in endodontics beyond traditional diagnostic tasks. It can address various challenges in clinical practice, improve procedural outcomes, and provide valuable insights for comprehensive endodontic assessments. For instance, a study by Albitar et al. [65] utilized a CNN with U-Net architecture to identify both obturated and unobturated canals, while Buyuk et al. [66] introduced a CNN model with Gabor filtering and Long Short-Term Memory (LSTM) networks to detect separated endodontic instruments in OPGs. Kong et al. [67] used a multilayer perceptron to differentiate stress from electric pulp tester (EPT)-induced electrodermal activity signals, and Ramezanzade et al. [68] introduced a multi-path NN to predict pulp exposure risk in bitewing radiographs. Choi et al. [69] developed an interactive software system for access cavity assessment using three-dimensional models, while Suárez et al. [70] evaluated the consistency and accuracy of ChatGPT, an AI chatbot, in endodontics. Additionally, ChatGPT was found to have weaknesses and limitations in understanding clinical situations and making treatment-planning decisions [71].

These studies highlight the transformative potential of AI in endodontics and how it can redefine the standards of dental practice. The preceding subsections have provided a detailed exploration of each thematic area, highlighting key methodologies, outcomes, and the implications of these studies for the broader landscape of endodontics.

Limitations

It is important to note that this review is limited to articles published before October 15, 2023, which means that it excludes more recent publications.

Ethical considerations and future directions

The integration of AI into endodontic practice shows great promise, but it is crucial to address ethical considerations, limitations, and emerging challenges to ensure responsible and effective implementation [72]. Ethical concerns regarding patient privacy, data security, and algorithmic bias require careful scrutiny and robust regulatory frameworks to safeguard patient welfare and uphold professional standards. The use of historical datasets for AI training raises concerns about representativeness and generalizability, highlighting the need for diverse and inclusive datasets to mitigate bias and improve model reliability. Additionally, the lack of interpretability of AI algorithms remains a challenge, limiting their acceptance and adoption by clinicians. Future research should prioritize transparency and explainability, facilitating trust and comprehension among endodontic practitioners. Furthermore, exploring novel applications such as predictive modeling for treatment outcomes, real-time procedural guidance, and patient-centered decision support systems offers exciting prospects for advancing clinical practice. Continuous interdisciplinary dialogue, ethical reflexivity, and technological innovation are essential to harnessing the transformative potential of AI while ensuring its ethical and equitable integration into endodontic healthcare.

Conclusions

The integration of AI in endodontics has had a transformative impact on the field. This has been exemplified by the use of CNN, ANN, and various ML models, which have enabled greater diagnostic precision, better treatment planning, and improved clinical decision-making. There are many diverse applications of AI in this field, from automated canal morphology detection to caries diagnosis, pulpal condition assessment, WL determination, and VRF detection. These applications underscore the multifaceted impact of AI on endodontic practice. The findings presented in this review are robust and supported by studies employing PNN, DLM, and innovative algorithms like CBR and AGMB networks. They emphasize AI’s potential to enhance efficiency, accuracy, and personalized treatment strategies. As AI continues to demonstrate its prowess in predicting treatment outcomes, assessing case difficulty, analyzing fluid behavior, and venturing into novel applications, it emerges as an invaluable ally in elevating standards in patient care and reshaping the landscape of endodontic healthcare.

While acknowledging these strides, careful consideration of reliability, practicality, and cost-effectiveness is paramount for the seamless integration of AI into routine endodontic procedures, which will ensure sustained advancements in clinical outcomes. It is imperative to recognize the ongoing evolution of AI and to address any associated limitations or challenges to foster its responsible and effective use in endodontics.

Acknowledgments/Declaration of AI-assisted technologies

During the preparation of this paper, the author utilized ChatGPT (chat.openai.com) to improve language/readability. The AI model provided suggestions/guidance, which were then reviewed/edited by the author. The author takes full responsibility for the final content of the publication and its accuracy.

Conflict of interest

None.

Funding support

None.

Author's contributions

S.A. Conceptualization, Methodology, Investigation, Data Curation, Writing Original Draft, Review & Editing.

References

1. Akinrinmade AO, Adebile TM, Ezuma-Ebong C, Bolaji K, Ajufo A, Adigun AO, Mohammad M, Dike JC, Okobi OE. Artificial Intelligence in Healthcare: Perception and Reality. Cureus. 2023;15(9):e45594. doi: 10.7759/cureus.45594.
2. Alanazi A. Clinicians' Views on Using Artificial Intelligence in Healthcare: Opportunities, Challenges, and Beyond. Cureus. 2023;15(9):e45255. doi: 10.7759/cureus.45255.
3. Agrawal P, Nikhade P. Artificial Intelligence in Dentistry: Past, Present, and Future. Cureus. 2022;14(7):e27405. doi: 10.7759/cureus.27405.
4. Rajaram Mohan K, Mathew Fenn S. Artificial Intelligence and Its Theranostic Applications in Dentistry. Cureus. 2023;15(5):e38711. doi: 10.7759/cureus.38711.
5. Khanagar SB, Alfadley A, Alfouzan K, Awawdeh M, Alaqla A, Jamleh A. Developments and Performance of Artificial Intelligence Models Designed for Application in Endodontics: A Systematic Review. Diagnostics. 2023;13(3). doi: 10.3390/diagnostics13030414.
6. Alzaid N, Ghulam O, Albani M, Alharbi R, Othman M, Taher H, Albaradie S, Ahmed S. Revolutionizing Dental Care: A Comprehensive Review of Artificial Intelligence Applications Among Various Dental Specialties. Cureus. 2023;15(10):e47033. doi: 10.7759/cureus.47033.
7. Krichen M. Convolutional neural networks: A survey. Computers. 2023;12(8):151.
8. Shao F, Shen Z. How can artificial neural networks approximate the brain? Front Psychol. 2022;13:970214. doi: 10.3389/fpsyg.2022.970214.
9. Aminoshariae A, Kulild J, Nagendrababu V. Artificial Intelligence in Endodontics: Current Applications and Future Directions. J Endod. 2021;47(9):1352–7. doi: 10.1016/j.joen.2021.06.003.
10. Sudeep P, Gehlot PM, Murali B, Mariswamy AB. Artificial intelligence in endodontics: A narrative review. J Int Oral Health. 2023;15(2):134–41.
  • 11.Asgary S. Emphasizing the impact of artificial intelligence in dentistry: A call for integration and exploration. J Dent Sci. 2023;18(4):1929–30. doi: 10.1016/j.jds.2023.06.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Asiri AF, Altuwalah AS. The role of neural artificial intelligence for diagnosis and treatment planning in endodontics: A qualitative review. Saudi Dent J. 2022;34(4):270–81. doi: 10.1016/j.sdentj.2022.04.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Chen SL, Chen TY, Mao YC, Lin SY, Huang YY, Chen CA, Lin YJ, Hsu YM, Li CA, Chiang WY, Wong KY, Abu PAR. Automated Detection System Based on Convolution Neural Networks for Retained Root, Endodontic Treated Teeth, and Implant Recognition on Dental Panoramic Images. IEEE Sensors Journal. 2022;22(23):23293–306. [Google Scholar]
  • 14.Hasan HA, Saad FH, Ahmed S, Mohammed N, Farook TH, Dudley J. Experimental validation of computer-vision methods for the successful detection of endodontic treatment obturation and progression from noisy radiographs. Oral Radiol. 2023;39(4):683–98. doi: 10.1007/s11282-023-00685-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Hiraiwa T, Ariji Y, Fukuda M, Kise Y, Nakata K, Katsumata A, Fujita H, Ariji E. A deep-learning artificial intelligence system for assessment of root morphology of the mandibular first molar on panoramic radiography. Dentomaxillofac Radiol. 2019;48(3) doi: 10.1259/dmfr.20180218. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Duan W, Chen Y, Zhang Q, Lin X, Yang X. Refined tooth and pulp segmentation using U-Net in CBCT image. Dentomaxillofac Radiol. 2021;50(6) doi: 10.1259/dmfr.20200251. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Lahoud P, EzEldeen M, Beznik T, Willems H, Leite A, Van Gerven A, Jacobs R. Artificial Intelligence for Fast and Accurate 3-Dimensional Tooth Segmentation on Cone-beam Computed Tomography. J Endod. 2021;47(5):827–35. doi: 10.1016/j.joen.2020.12.020. [DOI] [PubMed] [Google Scholar]
  • 18.Leite AF, Gerven AV, Willems H, Beznik T, Lahoud P, Gaêta-Araujo H, Vranckx M, Jacobs R. Artificial intelligence-driven novel tool for tooth detection and segmentation on panoramic radiographs. Clin Oral Investig. 2021;25(4):2257–67. doi: 10.1007/s00784-020-03544-6. [DOI] [PubMed] [Google Scholar]
  • 19.Lin X, Fu Y, Ren G, Yang X, Duan W, Chen Y, Zhang Q. Micro–Computed Tomography–Guided Artificial Intelligence for Pulp Cavity and Tooth Segmentation on Cone-beam Computed Tomography. J Endod. 2021;47(12):1933–41. doi: 10.1016/j.joen.2021.09.001. [DOI] [PubMed] [Google Scholar]
  • 20.Sherwood AA, Sherwood AI, Setzer FC, K SD, Shamili JV, John C, Schwendicke F. A Deep Learning Approach to Segment and Classify C-Shaped Canal Morphologies in Mandibular Second Molars Using Cone-beam Computed Tomography. J Endod. 2021;47(12):1907–16. doi: 10.1016/j.joen.2021.09.009. [DOI] [PubMed] [Google Scholar]
  • 21.Jeon SJ, Yun JP, Yeom HG, Shin WS, Lee JH, Jeong SH, Seo MS. Deep-learning for predicting C-shaped canals in mandibular second molars on panoramic radiographs. Dentomaxillofac Radiol. 2021;50(5):20200513. doi: 10.1259/dmfr.20200513. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Yang S, Lee H, Jang B, Kim KD, Kim J, Kim H, Park W. Development and Validation of a Visually Explainable Deep Learning Model for Classification of C-shaped Canals of the Mandibular Second Molars in Periapical and Panoramic Dental Radiographs. J Endod. 2022;48(7):914–21. doi: 10.1016/j.joen.2022.04.007. [DOI] [PubMed] [Google Scholar]
  • 23.Wang Y, Xia W, Yan Z, Zhao L, Bian X, Liu C, Qi Z, Zhang S, Tang Z. Root canal treatment planning by automatic tooth and root canal segmentation in dental CBCT with deep multi-task feature learning. Med Image Anal. 2023;85. doi: 10.1016/j.media.2023.102750. [DOI] [PubMed] [Google Scholar]
  • 24.Ghaznavi Bidgoli SA, Sharifi A, Manthouri M. Automatic diagnosis of dental diseases using convolutional neural network and panoramic radiographic images. Computer Methods in Biomechanics and Biomedical Engineering: Imaging and Visualization. 2021;9(5):447–55. [Google Scholar]
  • 25.Oztekin F, Katar O, Sadak F, Yildirim M, Cakar H, Aydogan M, Ozpolat Z, Talo Yildirim T, Yildirim O, Faust O, Acharya UR. An Explainable Deep Learning Model to Prediction Dental Caries Using Panoramic Radiograph Images. Diagnostics (Basel). 2023;13(2) doi: 10.3390/diagnostics13020226. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Tumbelaka BY, Oscandar F, Baihaki FN, Sitam S, Rukmo M. Identification of pulpitis at dental X-ray periapical radiography based on edge detection, texture description and artificial neural networks. Saudi Endod J. 2014;4(3):115–21. [Google Scholar]
  • 27.Zheng L, Wang H, Mei L, Chen Q, Zhang Y, Zhang H. Artificial intelligence in digital cariology: a new tool for the diagnosis of deep caries and pulpitis using convolutional neural networks. Ann Transl Med. 2021;9(9):763. doi: 10.21037/atm-21-119. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Saghiri MA, Asgar K, Boukani KK, Lotfi M, Aghili H, Delvarani A, Karamifar K, Saghiri AM, Mehrvarzfar P, Garcia-Godoy F. A new approach for locating the minor apical foramen using an artificial neural network. Int Endod J. 2012;45(3):257–65. doi: 10.1111/j.1365-2591.2011.01970.x. [DOI] [PubMed] [Google Scholar]
  • 29.Saghiri MA, Garcia-Godoy F, Gutmann JL, Lotfi M, Asgar K. The reliability of artificial neural network in locating minor apical foramen: a cadaver study. J Endod. 2012;38(8):1130–4. doi: 10.1016/j.joen.2012.05.004. [DOI] [PubMed] [Google Scholar]
  • 30.Qiao X, Zhang Z, Chen X. Multifrequency impedance method based on neural network for root canal length measurement. Appl Sci. 2020;10(21). [Google Scholar]
  • 31.Kositbowornchai S, Plermkamon S, Tangkosol T. Performance of an artificial neural network for vertical root fracture detection: an ex vivo study. Dent Traumatol. 2013;29(2):151–5. doi: 10.1111/j.1600-9657.2012.01148.x. [DOI] [PubMed] [Google Scholar]
  • 32.Johari M, Esmaeili F, Andalib A, Garjani S, Saberkari H. Detection of vertical root fractures in intact and endodontically treated premolar teeth by designing a probabilistic neural network: an ex vivo study. Dentomaxillofac Radiol. 2017;46(2):20160107. doi: 10.1259/dmfr.20160107. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Mikrogeorgis G, Eirinaki E, Kapralos V, Koutroulis A, Lyroudia K, Pitas I. Diagnosis of vertical root fractures in endodontically treated teeth utilising Digital Subtraction Radiography: A case series report. Aust Endod J. 2018;44(3):286–91. doi: 10.1111/aej.12240. [DOI] [PubMed] [Google Scholar]
  • 34.Fukuda M, Inamoto K, Shibata N, Ariji Y, Yanashita Y, Kutsuna S, Nakata K, Katsumata A, Fujita H, Ariji E. Evaluation of an artificial intelligence system for detecting vertical root fracture on panoramic radiography. Oral Radiol. 2020;36(4):337–43. doi: 10.1007/s11282-019-00409-x. [DOI] [PubMed] [Google Scholar]
  • 35.Vicory J, Chandradevan R, Hernandez-Cerdan P, Huang WA, Fox D, Qdais LA, McCormick M, Mol A, Walters R, Marron JS, Geha H, Khan A, Paniagua B. Dental microfracture detection using wavelet features and machine learning. Proc SPIE Int Soc Opt Eng. 2021 doi: 10.1117/12.2580744. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Hu Z, Cao D, Hu Y, Wang B, Zhang Y, Tang R, Zhuang J, Gao A, Chen Y, Lin Z. Diagnosis of in vivo vertical root fracture using deep learning on cone-beam CT images. BMC Oral Health. 2022;22(1):382. doi: 10.1186/s12903-022-02422-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Okada K, Rysavy S, Flores A, Linguraru MG. Noninvasive differential diagnosis of dental periapical lesions in cone-beam CT scans. Med Phys. 2015;42(4):1653–65. doi: 10.1118/1.4914418. [DOI] [PubMed] [Google Scholar]
  • 38.Birdal RG, Gumus E, Sertbas A, Birdal IS. Automated lesion detection in panoramic dental radiographs. Oral Radiol. 2016;32(2):111–8. [Google Scholar]
  • 39.Ekert T, Krois J, Meinhold L, Elhennawy K, Emara R, Golla T, Schwendicke F. Deep Learning for the Radiographic Detection of Apical Lesions. J Endod. 2019;45(7):917–22. doi: 10.1016/j.joen.2019.03.016. [DOI] [PubMed] [Google Scholar]
  • 40.Endres MG, Hillen F, Salloumis M, Sedaghat AR, Niehues SM, Quatela O, Hanken H, Smeets R, Beck-Broichsitter B, Rendenbach C, Lakhani K, Heiland M, Gaudin RA. Development of a Deep Learning Algorithm for Periapical Disease Detection in Dental Radiographs. Diagnostics (Basel). 2020;10(6) doi: 10.3390/diagnostics10060430. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Orhan K, Bayrakdar IS, Ezhov M, Kravtsov A, Özyürek T. Evaluation of artificial intelligence for detecting periapical pathosis on cone-beam computed tomography scans. Int Endod J. 2020;53(5):680–9. doi: 10.1111/iej.13265. [DOI] [PubMed] [Google Scholar]
  • 42.Setzer FC, Shi KJ, Zhang Z, Yan H, Yoon H, Mupparapu M, Li J. Artificial Intelligence for the Computer-aided Detection of Periapical Lesions in Cone-beam Computed Tomographic Images. J Endod. 2020;46(7):987–93. doi: 10.1016/j.joen.2020.03.025. [DOI] [PubMed] [Google Scholar]
  • 43.Chen H, Li H, Zhao Y, Zhao J, Wang Y. Dental disease detection on periapical radiographs based on deep convolutional neural networks. Int J Comput Assist Radiol Surg. 2021;16(4):649–61. doi: 10.1007/s11548-021-02319-y. [DOI] [PubMed] [Google Scholar]
  • 44.Li CW, Lin SY, Chou HS, Chen TY, Chen YA, Liu SY, Liu YL, Chen CA, Huang YC, Chen SL, Mao YC, Abu PAR, Chiang WY, Lo WS. Detection of dental apical lesions using cnns on periapical radiograph. Sensors. 2021;21(21) doi: 10.3390/s21217049. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Pauwels R, Brasil DM, Yamasaki MC, Jacobs R, Bosmans H, Freitas DQ, Haiter-Neto F. Artificial intelligence for detection of periapical lesions on intraoral radiographs: Comparison between convolutional neural networks and human observers. Oral Surg Oral Med Oral Pathol Oral Radiol. 2021;131(5):610–6. doi: 10.1016/j.oooo.2021.01.018. [DOI] [PubMed] [Google Scholar]
  • 46.Calazans MAA, Ferreira F, Alcoforado M, Santos AD, Pontual ADA, Madeiro F. Automatic Classification System for Periapical Lesions in Cone-Beam Computed Tomography. Sensors (Basel). 2022;22(17) doi: 10.3390/s22176481. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47.Hamdan MH, Tuzova L, Mol A, Tawil PZ, Tuzoff D, Tyndall DA. The effect of a deep-learning tool on dentists’ performances in detecting apical radiolucencies on periapical radiographs. Dentomaxillofac Radiol. 2022;51(7) doi: 10.1259/dmfr.20220122. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Kirnbauer B, Hadzic A, Jakse N, Bischof H, Stern D. Automatic Detection of Periapical Osteolytic Lesions on Cone-beam Computed Tomography Using Deep Convolutional Neuronal Networks. J Endod. 2022;48(11):1434–40. doi: 10.1016/j.joen.2022.07.013. [DOI] [PubMed] [Google Scholar]
  • 49.Li S, Liu J, Zhou Z, Zhou Z, Wu X, Li Y, Wang S, Liao W, Ying S, Zhao Z. Artificial intelligence for caries and periapical periodontitis detection. J Dent. 2022;122. doi: 10.1016/j.jdent.2022.104107. [DOI] [PubMed] [Google Scholar]
  • 50.Moidu NP, Sharma S, Chawla A, Kumar V, Logani A. Deep learning for categorization of endodontic lesion based on radiographic periapical index scoring system. Clin Oral Investig. 2022;26(1):651–8. doi: 10.1007/s00784-021-04043-y. [DOI] [PubMed] [Google Scholar]
  • 51.Vasdev D, Gupta V, Shubham S, Chaudhary A, Jain N, Salimi M, Ahmadian A. Periapical dental X-ray image classification using deep neural networks. Ann Oper Res. 2022:1–29. doi: 10.1007/s10479-022-04961-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Issa J, Jaber M, Rifai I, Mozdziak P, Kempisty B, Dyszkiewicz-Konwińska M. Diagnostic Test Accuracy of Artificial Intelligence in Detecting Periapical Periodontitis on Two-Dimensional Radiographs: A Retrospective Study and Literature Review. Medicina (Kaunas). 2023;59(4) doi: 10.3390/medicina59040768. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Patel J, Mital D, Singhal V, Srinivasan S, Wu H, Mehta S. Feasibility of automatic differential diagnosis of endodontic origin periapical lesions - a pilot study. Int J Med Eng Inform. 2023;15(5):430–41. [Google Scholar]
  • 54.Campo L, Aliaga IJ, De Paz JF, García AE, Bajo J, Villarubia G, Corchado JM. Retreatment Predictions in Odontology by means of CBR Systems. Comput Intell Neurosci. 2016;2016:7485250. doi: 10.1155/2016/7485250. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Herbst CS, Schwendicke F, Krois J, Herbst SR. Association between patient-, tooth- and treatment-level factors and root canal treatment failure: A retrospective longitudinal and machine learning study. J Dent. 2022;117. doi: 10.1016/j.jdent.2021.103937. [DOI] [PubMed] [Google Scholar]
  • 56.Qu Y, Lin Z, Yang Z, Lin H, Huang X, Gu L. Machine learning models for prognosis prediction in endodontic microsurgery. J Dent. 2022;118. doi: 10.1016/j.jdent.2022.103947. [DOI] [PubMed] [Google Scholar]
  • 57.Li Y, Zeng G, Zhang Y, Wang J, Jin Q, Sun L, Zhang Q, Lian Q, Qian G, Xia N, Peng R, Tang K, Wang S, Wang Y. AGMB-Transformer: Anatomy-Guided Multi-Branch Transformer Network for Automated Evaluation of Root Canal Therapy. IEEE Journal of Biomedical and Health Informatics. 2022;26(4):1684–95. doi: 10.1109/JBHI.2021.3129245. [DOI] [PubMed] [Google Scholar]
  • 58.Lee J, Seo H, Choi YJ, Lee C, Kim S, Lee YS, Lee S, Kim E. An Endodontic Forecasting Model Based on the Analysis of Preoperative Dental Radiographs: A Pilot Study on an Endodontic Predictive Deep Neural Network. J Endod. 2023;49(6):710–9. doi: 10.1016/j.joen.2023.03.015. [DOI] [PubMed] [Google Scholar]
  • 59.Herbst SR, Herbst CS, Schwendicke F. Preoperative risk assessment does not allow to predict root filling length using machine learning: A longitudinal study. J Dent. 2023;128. doi: 10.1016/j.jdent.2022.104378. [DOI] [PubMed] [Google Scholar]
  • 60.Latke V, Narawade V. Enhancing Endodontic Precision: A Novel AI-Powered Hybrid Ensemble Approach for Refining Treatment Strategies. International Journal of Intelligent Systems and Applications in Engineering. 2023;11(11):73–84. [Google Scholar]
  • 61.Qu Y, Wen Y, Chen M, Guo K, Huang X, Gu L. Predicting case difficulty in endodontic microsurgery using machine learning algorithms. J Dent. 2023;133. doi: 10.1016/j.jdent.2023.104522. [DOI] [PubMed] [Google Scholar]
  • 62.Mallishery S, Chhatpar P, Banga KS, Shah T, Gupta P. The precision of case difficulty and referral decisions: an innovative automated approach. Clin Oral Investig. 2020;24(6):1909–15. doi: 10.1007/s00784-019-03050-4. [DOI] [PubMed] [Google Scholar]
  • 63.Peeters HH, Silitonga F, Zuhal L. Application of artificial intelligence in a visual-based fluid motion estimator surrounding a vibrating EDDY® tip. G Ital Endod. 2022;36(1):151–9. [Google Scholar]
  • 64.Peeters HH, Judith ET, Silitonga FY, Zuhal LR. Visualizing the velocity fields and fluid behavior of a solution using artificial intelligence during EndoActivator activation. Dent J. 2022;55(3):125–9. [Google Scholar]
  • 65.Albitar L, Zhao T, Huang C, Mahdian M. Artificial Intelligence (AI) for Detection and Localization of Unobturated Second Mesial Buccal (MB2) Canals in Cone-Beam Computed Tomography (CBCT). Diagnostics (Basel). 2022;12(12). doi: 10.3390/diagnostics12123214. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Buyuk C, Arican Alpay B, Er F. Detection of the separated root canal instrument on panoramic radiograph: a comparison of LSTM and CNN deep learning methods. Dentomaxillofac Radiol. 2023;52(3) doi: 10.1259/dmfr.20220209. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67.Kong Y, Posada-Quintero HF, Tran H, Talati A, Acquista TJ, Chen IP, Chon KH. Differentiating between stress- and EPT-induced electrodermal activity during dental examination. Comput Biol Med. 2023;155. doi: 10.1016/j.compbiomed.2023.106695. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Ramezanzade S, Dascalu TL, Ibragimov B, Bakhshandeh A, Bjørndal L. Prediction of pulp exposure before caries excavation using artificial intelligence: Deep learning-based image data versus standard dental radiographs. J Dent. 2023;138. doi: 10.1016/j.jdent.2023.104732. [DOI] [PubMed] [Google Scholar]
  • 69.Choi S, Choi J, Peters OA, Peters CI. Design of an interactive system for access cavity assessment: A novel feedback tool for preclinical endodontics. Eur J Dent Educ. 2023 doi: 10.1111/eje.12895. [DOI] [PubMed] [Google Scholar]
  • 70.Suárez A, Díaz-Flores García V, Algar J, Gómez Sánchez M, Llorente de Pedro M, Freire Y. Unveiling the ChatGPT phenomenon: Evaluating the consistency and accuracy of endodontic question answers. Int Endod J. 2023 doi: 10.1111/iej.13985. [DOI] [PubMed] [Google Scholar]
  • 71.Farajollahi M, Modaberi A. Can ChatGPT pass the "Iranian Endodontics Specialist Board" exam? Iran Endod J. 2023;18(3):192. [PMC free article] [PubMed] [Google Scholar]
  • 72.Asgary S. The future of endodontics: Harnessing the potential of artificial intelligence. Saudi Endod J. 2024;14(1):137–8. [Google Scholar]