Abstract
Artificial intelligence (AI) holds immense promise in revolutionising dentistry, spanning diagnostics, treatment planning, and educational realms. This narrative review, in two parts, explores the fundamentals and the multifaceted potential of AI in dentistry. The current article examines the profound impact of AI in dentistry, encompassing diagnostic tools, treatment planning, and patient care. Part 2 of the article delves into the potential of AI in patient education, ethics, and the FDI communique on AI in dentistry. The review begins by elucidating the historical context of AI, outlining its recent widespread use in various sectors, including medicine and dentistry. The narrative then turns to the fundamental concepts of AI, which entails developing machines capable of executing tasks that typically necessitate human intellect. In the biomedical realm, AI has evolved from exploring computational models to constructing systems for clinical data processing and interpretation, aiming to enhance medical and dental decision-making. The discussion highlights the pivotal role of AI models in dentistry, such as Large Language Models (LLM), Large Vision Models (LVM), and Multimodality Models (MM), which are transforming processes from clinical documentation to treatment planning. The narrative extends to the applications of AI in dental specialties such as periodontics, endodontics, oral medicine and pathology, restorative dentistry, prosthodontics, paediatric dentistry, forensic odontology, oral and maxillofacial surgery, orthodontics, and orofacial pain management. AI's role in improving treatment outcomes, diagnostic accuracy, and decision-making processes is evident across these specialties, showcasing its potential to transform dental care. The review concludes by highlighting the need for continued validation, interdisciplinary collaboration, and regulatory frameworks to ensure the seamless integration of AI into dentistry, paving the way for enhanced patient outcomes and evidence-based practice in the field.
Keywords: Artificial intelligence; Dentistry; Challenge; Literacy; Oral health; AI; Patient care
Introduction
The term artificial intelligence (AI) has been used widely across the globe for decades. However, the realisation of its vast potential dawned when a large language model (LLM) was first made generally available to the public in 2022. Since then, AI use has become widespread in various domains, including medicine and dentistry. In essence, AI is the concept of developing machines capable of carrying out intellectual tasks typically performed by humans. When a machine demonstrates the ability to make informed decisions, it can be referred to as being artificially intelligent.1
AI is dedicated to comprehending and constructing intelligent entities, frequently embodied as software programs.2 It can be elucidated as a series of operations devised to execute a particular task.3 In biomedical sciences, the initial AI endeavours can be viewed as a sequence of efforts to explore, comprehend, and construct computational models mirroring scientific knowledge and problem-solving strategies. These efforts involve developing and evaluating computational systems for clinical data processing and interpretation alongside modelling clinical reasoning. This approach aimed to surpass the logical, statistical, and pattern-recognition models for medical decision-making that gained popularity from the 1950s onward.4
Two primary types of AI prevalent today are narrow AI (weak AI) and general AI (strong AI). Narrow AI can be classified into various subclasses, such as machine learning (ML) and expert-based systems.5 ML, in turn, comprises supervised, unsupervised, and semi-supervised learning. Supervised learning employs labelled data for training, which can limit algorithm adaptability and performance. Unsupervised learning processes unlabelled data but requires greater algorithmic complexity. Semi-supervised learning combines a small amount of labelled data with large volumes of unlabelled data for training.5 A key subset of ML is deep learning (DL), which utilises artificial neural networks (NNs) to identify intricate patterns within the processed data. The primary distinction between DL and traditional ML lies not in how they learn but in their ability to capture increasingly complex and hierarchical data representations.
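To make the distinction between these learning paradigms concrete, the following minimal sketch (using the scikit-learn library on synthetic data; the features and labels are hypothetical and purely illustrative) contrasts a supervised classifier trained on labelled records with an unsupervised clustering of the same records without any labels.

```python
# Minimal illustration of supervised vs unsupervised learning (hypothetical data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical patient records: [probing depth (mm), bleeding-on-probing %]
X = rng.normal(loc=[[3.0, 20.0]], scale=[[1.0, 10.0]], size=(200, 2))
y = (X[:, 0] + 0.05 * X[:, 1] > 4.0).astype(int)   # hypothetical "periodontitis" label

# Supervised learning: the model is fitted to labelled examples.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("Supervised test accuracy:", clf.score(X_test, y_test))

# Unsupervised learning: the same records, but no labels are provided;
# the algorithm groups patients purely by similarity of their features.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("Cluster sizes:", np.bincount(clusters))
```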
Neural networks (NNs), which form the foundation of DL, include artificial neural networks (ANNs), convolutional neural networks (CNNs), and generative adversarial networks (GANs).5 ANNs process information in a feedforward manner, from input through hidden layers to output. CNNs specialize in processing structured grid-like data, employing layers of convolutional filters to extract features. GANs, trained through adversarial learning, generate new data resembling their input.5
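As a concrete illustration of the feedforward and convolutional architectures described above, the following PyTorch sketch defines a small CNN of the kind that could, in principle, be trained to classify a greyscale radiograph patch; the layer sizes, 64×64 input resolution, and two-class output are illustrative assumptions rather than any published dental model.

```python
import torch
import torch.nn as nn

class TinyRadiographCNN(nn.Module):
    """Illustrative CNN: convolutional feature extractor + feedforward classifier."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # learnable convolutional filters
            nn.ReLU(),
            nn.MaxPool2d(2),                              # downsample 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(                  # feedforward (ANN-style) head
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# One forward pass on a dummy batch of 64x64 greyscale patches.
model = TinyRadiographCNN()
logits = model(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```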
Presently, within the health professional education (HPE) domain, AI predominantly leverages models such as Large Language Models (LLM), Large Vision Models (LVM), Multimodality Models (MM), Reinforcement Learning (RL) models, and Generative Adversarial Networks (GANs), among others.6,7 These are briefly described below.
LLMs, epitomised by models such as the GPT (Generative Pretrained Transformer) family, have revolutionised natural language processing (NLP) tasks. These models, trained on vast amounts of textual data, exhibit remarkable abilities in understanding, generating, and summarising textual information. In dentistry, LLMs can be harnessed for many tasks, ranging from automating clinical documentation to aiding patient communication through chatbots, thus enhancing overall efficiency and patient satisfaction.6,8,9,10
LVMs, represented by architectures such as convolutional neural networks (CNNs), excel in image and video analysis and interpretation. In dentistry, LVMs can facilitate the interpretation of radiographs, aid in the detection of pathologies, and assist in treatment planning by analysing intraoral and extraoral images. LVMs contribute significantly to diagnostic accuracy and treatment efficacy by providing precise and rapid assessments.11,12 In addition to CNNs, vision transformers (ViTs) are gaining recognition as advanced tools for image and video analysis. Research suggests that transformer-based models can outperform CNN-based algorithms in diagnostic tasks such as caries detection in intraoral images, highlighting their potential to further improve diagnostic accuracy and efficiency.13
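In practice, such vision models are often applied via transfer learning: a network pretrained on general images is reused and only its classification head is replaced and retrained on dental images. The sketch below illustrates this pattern with torchvision's pretrained ViT-B/16; treating caries detection as a binary task is an illustrative assumption, not a reference to any specific published system.

```python
import torch
import torch.nn as nn
from torchvision.models import vit_b_16, ViT_B_16_Weights

# Load a vision transformer pretrained on ImageNet.
weights = ViT_B_16_Weights.DEFAULT
model = vit_b_16(weights=weights)

# Replace the classification head for a hypothetical binary task (caries vs no caries).
model.heads = nn.Sequential(nn.Linear(model.hidden_dim, 2))

# Freeze the pretrained backbone; only the new head would be trained.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("heads")

preprocess = weights.transforms()          # resizing/normalisation expected by the model
dummy = torch.randn(1, 3, 224, 224)        # stand-in for a preprocessed intraoral image
print(model(dummy).shape)                  # torch.Size([1, 2])
```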
MMs enable AI systems to process and interpret information from diverse modalities such as text, images, and voice. These models hold potential in dentistry by facilitating comprehensive patient data analysis, enabling holistic treatment planning, and enhancing interdisciplinary collaboration among dental professionals.6
Reinforcement Learning (RL) models are unique in ML, as they learn through trial and error by interacting with their environment.14 This approach has evolved into a dynamic and transformative element of AI, enabling intelligent decision-making in complex and ever-changing situations. The distinctive capabilities of RL make it particularly useful in simulation-based learning, such as robotic surgery training and clinical decision-making. These models are powerful tools for developing adaptive and interactive learning environments, especially in areas that require decision-making skills and real-time feedback. In dental education, RL can enhance haptic simulations, virtual reality experiences, and robotic-assisted dental surgeries.15,16
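To illustrate the trial-and-error principle behind RL, the short sketch below implements tabular Q-learning on a toy sequence of procedural decisions, in which a simulated learner is rewarded only for choosing the correct action at each step; the environment, actions, and rewards are entirely hypothetical stand-ins for a simulation-based training scenario.

```python
import numpy as np

rng = np.random.default_rng(0)

n_steps, n_actions = 5, 3                             # toy task: 5 sequential decisions, 3 options each
correct = rng.integers(n_actions, size=n_steps)       # hidden "correct" action per step
Q = np.zeros((n_steps, n_actions))                    # action-value table
alpha, gamma, eps = 0.5, 0.9, 0.2                     # learning rate, discount, exploration rate

for episode in range(500):
    for s in range(n_steps):
        # epsilon-greedy: mostly exploit current knowledge, sometimes explore
        a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
        reward = 1.0 if a == correct[s] else -1.0
        next_value = 0.0 if s == n_steps - 1 else np.max(Q[s + 1])
        # Q-learning update driven by the observed reward (trial and error)
        Q[s, a] += alpha * (reward + gamma * next_value - Q[s, a])

learned_policy = Q.argmax(axis=1)
print("learned:", learned_policy, "target:", correct)   # should match after training
```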
In dentistry, input data encompass visual (eg, clinical images and radiographs), textual (eg, electronic health records), or audio modes. Neural networks (NN) process these inputs to generate various outputs, such as diagnoses, treatment plans, disease predictions, or prognoses. Diagnosis may involve deciphering clinical cues, cephalometric analysis, or lesion identification. AI in dentistry also aids in recognising structures, analysing results, converting speech data, and facilitating data acquisition for computer-aided design/computer-aided manufacturing (CAD/CAM) processes. Additionally, AI may utilise gene analysis, prioritise risk factors, or predict disease outcomes.17
As we explore the role of AI in dentistry, it becomes evident that LLMs, LVMs, and MMs offer unprecedented opportunities for optimising clinical workflows, improving diagnostic accuracy, and ultimately enhancing patient care. However, alongside these opportunities come challenges that need to be addressed. A schematic illustration of the working of AI can be seen in Figures 1 and 2.
Fig. 1.
The working of AI in a schematic format.5
Fig. 2.
Schematic representation of the working of AI in dentistry.18
Currently, AI software has only a limited presence in dentistry. However, progress has been made in areas such as dental image analysis, caries detection, orthodontic treatment planning and analysis, endodontic lesion detection on periapical radiographs, oral pathological lesion detection, and electronic patient record management with the aid of AI.5
While existing reviews have examined the landscape of dental AI, this article aims to provide a comprehensive narrative of AI evolution within dentistry. It summarises recent advancements in AI research specific to dentistry and explores the intricate relationship between evidence-based dentistry and AI. Additionally, it addresses the challenges of integrating AI in dentistry, offering a holistic perspective crucial for navigating the future trajectory of AI-driven dentistry.
Applications of AI in dentistry
We review below the multiple ways in which AI could be applied to almost all facets of dentistry.
AI in periodontics
Various treatment strategies exist for managing periodontally compromised teeth,19 but disease prognosis has been hampered by poor diagnostics and human error. AI has been explored and advanced in several respects to enhance the diagnosis, treatment, and monitoring of periodontal care. For instance, CNNs have been trained to detect chronic gingivitis from clinical intraoral images. ANNs have been used experimentally to assess periodontitis grades in a number of studies, with some models achieving up to 85% accuracy in correctly classifying periodontal patients based on their radiographic bone loss profiles.20 Determining bone loss in both panoramic and periapical radiographs has been investigated, facilitating the accurate diagnosis of periodontal bone loss.21,22 A meta-analysis of 10 publications revealed that AI demonstrates high mean sensitivity (87%), specificity (76%), and accuracy (84%) in assessing alveolar bone loss and periodontitis from panoramic and periapical radiographs.23 Further, the use of AI for detecting furcation involvement in axial CBCT images has been investigated; results show that the ResNet101V2 deep-learning model performs with distinctly high accuracy, precision, and F1 score.24 A high correlation has been observed between AI models and radiological diagnoses: one study reported an overall whole-jaw Pearson correlation coefficient of 73% between AI and radiologists and an intraclass correlation value of 91%.25
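The diagnostic figures quoted throughout this review (sensitivity, specificity, precision, accuracy, and F1 score) all derive from the confusion matrix of a model's predictions against a reference standard. The short sketch below computes them from hypothetical counts, purely to make the definitions explicit.

```python
# Hypothetical confusion-matrix counts for a radiographic bone-loss classifier.
tp, fp, fn, tn = 87, 24, 13, 76

sensitivity = tp / (tp + fn)          # diseased sites correctly flagged (recall)
specificity = tn / (tn + fp)          # healthy sites correctly cleared
precision   = tp / (tp + fp)          # flagged sites that are truly diseased
accuracy    = (tp + tn) / (tp + fp + fn + tn)
f1          = 2 * precision * sensitivity / (precision + sensitivity)

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"precision={precision:.2f} accuracy={accuracy:.2f} F1={f1:.2f}")
```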
Furthermore, CNNs can be utilised for the prediction of periodontal disease. In one study, a CNN exhibited precision rates of 73.4% and 82.8% for predicting the necessity of tooth extraction for premolars and molars, respectively.26 Another study demonstrated that a logistic regression model and a NN for the prediction of tooth loss in periodontitis patients exhibited moderate specificity and high sensitivity and precision in comparison with clinical tooth prognostic systems.27 A study of periodontal disease risk prediction using ML in diabetic patients found that smoking habits, low educational levels, high income-to-poverty ratio, high albumin levels, and high alanine aminotransferase levels were relevant variables for predicting periodontitis.28 At the molecular level, ML integrated with Mendelian randomisation and single-cell sequencing data has provided a novel approach to identifying gene interactions in periodontitis,46 opening up the potential use of these genes in predictive diagnosis and treatment follow-up for periodontitis patients.29
ChatGPT has been explored for its ability to classify periodontal disease status according to the relatively new 2018 classification. In one study, the model correctly identified the stage, grade, and extent of periodontal disease, although additional fine-tuning was required to improve accuracy.30 Among four LLMs (ChatGPT model GPT 4.0, Google Gemini, Google Gemini Advanced, and Microsoft Copilot), ChatGPT 4.0 was superior (scoring >8 out of 10) in response to open-ended questions in periodontology, confirming its potential to enhance patient education and to serve as a co-pilot for dental professionals.31 LLMs can also be tailored to focus on periodontology, yielding higher accuracy rates (up to 81.0%).32
ML models can effectively identify correlations between systemic well-being and poor periodontal status.33 In this context, localised variations in gingival indices and periodontal diseases were associated with blood pressure, body mass index, and a number of other systemic findings. Interestingly, one model has revealed a link between early periodontal disease, gingivitis, and optic nerve abnormalities.50 Additionally, co-occurrence of periodontal diseases has been observed with joint swelling and a family history of eye diseases.33 Another approach, local interpretable model-agnostic explanations (LIME), identifies key systemic factors contributing to periodontitis, such as arthritis, sleep disorders, hypertension, elevated cholesterol levels, and obesity, aligning with clinically recognised associated conditions.34
The foregoing indicates that AI has significantly advanced periodontics by improving the diagnosis, treatment, and monitoring of periodontal diseases. ANNs and CNNs demonstrate accuracy in detecting bone loss and predicting treatment outcomes and disease progression, while generative AI can be utilised to enhance patient education and professional support. These innovations highlight AI's transformative potential for precision diagnostics, personalised care, and interdisciplinary integration in periodontology.
AI in endodontics
Early detection of periapical lesions due to pulp infections is vital to improve treatment outcomes, prevent the spread of infection, and mitigate related adverse consequences.35,36 Clinician visualisation of CBCT scans demonstrates superior accuracy in identifying periapical lesions compared with conventional and digital periapical radiographs.37 However, AI technology appears to further improve the accuracy of detecting pathology in CBCT images. In one study, DL demonstrated 93% detection accuracy with high specificity, positive predictive value (PPV), and negative predictive value (NPV).36 A CNN system achieved 92.8% reliability in correctly identifying periapical lesions on CBCT images.38 In panoramic radiographs, a DL algorithm detected periapical pathoses at 60% precision and a 58% F1 score compared with evaluations by experienced oral and maxillofacial surgeons.39 A deep CNN model based on the U-Net algorithm achieved sensitivity, precision, and F1 scores of 92%, 84%, and 88%, respectively.40 Another study utilised a similar algorithm but reported lower accuracy.41 A modified DL model achieved an 82% F1 score in detecting periapical periodontitis on periapical radiographs, indicating potential for improved diagnostic accuracy,42 and such tools have improved dental professionals' ability to detect apical radiolucencies on intraoral periapical radiographs.43 A recent systematic review concluded that DL algorithms are more effective than expert clinicians in accurately detecting periapical radiolucent lesions in dental radiographs.44 Collectively, these studies indicate that AI is superior in detecting apical pathology and is feasible for future endodontic practice.
AI-driven dental pulp cavity segmentation in CBCT images is another application of AI that facilitates endodontic treatment. Automated pulp cavity segmentation assists the endodontic workflow and minimally invasive endodontic procedures, and reduces the time needed compared with manual segmentation.45 It has also been shown that CNN models (ResNet-101 and -50, DenseNet-121 and -161, and Inception-V3) are able to detect C-shaped canals in panoramic radiographs.46 These models outperform dental practitioners in identifying C-shaped canals in mandibular second molars. Additionally, AI has been explored to diagnose the status of root canal fillings visualised in CBCT images, including the adequacy of obturation, overfilling, short filling, and voids in fillings; results indicate very high accuracy (over 80%) in detecting these parameters.37 Further, ANNs can be employed to determine the working length from a radiograph, and have determined the correct position of a file in a canal more precisely (96%) than endodontists (76%).47 LLMs have also been explored for pulpal and periradicular diagnosis: Bing and ChatGPT 4.0 outperformed ChatGPT 3.5 and Bard in the accuracy of diagnosis and treatment recommendations,48 and using the English language for case information resulted in better performance.48 Google Gemini exhibited an accuracy similar to that of endodontists in response to questions related to the management of traumatised permanent teeth.49
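Automated segmentation models such as these are typically evaluated with overlap metrics, most commonly the Dice similarity coefficient between the predicted and the reference (e.g., manually annotated) voxel masks. A minimal sketch of that computation, on hypothetical binary masks, is shown below.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Hypothetical 3D masks standing in for a CBCT pulp-cavity segmentation.
truth = np.zeros((32, 32, 32), dtype=bool)
truth[10:20, 10:20, 10:20] = True                 # reference annotation
pred = np.zeros_like(truth)
pred[11:21, 10:20, 10:20] = True                  # model prediction, slightly shifted
print(f"Dice = {dice_coefficient(pred, truth):.3f}")   # 0.900
```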
AI in oral medicine and pathology
Recent advances in cancer diagnostics have emphasised the virtues of digital image analysis, which entails extracting critical information from images for tasks such as feature delineation (segmentation) or label assignment (classification).25,50 Numerous customised ML methods, leveraging feature analysis, have succeeded in various diagnostic applications by predefining specific features and processing workflows (Figure 1). Such data, including intraoral images, radiographs, and other diagnostic data, have been explored with AI models to facilitate oral cancer diagnosis, for instance. AI is also employed to facilitate pathologists' tasks by assisting histopathological diagnoses derived from tissue sections. High-quality data from these technologies will be amenable to evaluation by ML methods in the future, enabling swift, precise, and accurate diagnosis of head and neck tumours, thereby paving the way for improved prognosis. Below are some AI interventions that could enhance the quality of life for head and neck cancer patients.
Clinical data from intraoral or endoscopic images have been utilised to train AI programmes for detecting oral cancers,51 nasopharyngeal cancers,52 oropharyngeal cancers,53 laryngeal cancers,54 and miscellaneous oral mucosal lesions.55 ML models, for instance, were able to differentiate normal laryngeal tissues from malignant tissues in endoscopic images using textural information.54 Hence, incorporating AI technology into an endoscopic system could enhance the early detection of head and neck cancer lesions, and with such technology the detection of oral cancer in clinical oral images can be improved. ML classification models using multispectral narrow-band imaging outperformed those using white-light endoscopy in distinguishing oropharyngeal squamous cell carcinoma from healthy mucosa, with improved accuracy linked to angiogenic and inflammatory surface changes.53 Further, CNN models are able to classify oral cancer using fluorescence visualisation, with precision of 80.3%, 82.1%, 84.2%, and 94.1% for oral cancer, oral potentially malignant disorders, benign disease, and normal mucosa, respectively.56 With a smartphone-based intraoral dual-modality imaging platform, CNN models were trained to distinguish oral premalignant and malignant lesions with an accuracy of more than 86%.57 CNN models have exhibited very high precision (100%), specificity (99%), and accuracy (97.5%) in detecting oral cancer from digital images.58 However, distinct models perform particular tasks: DenseNet121, VGG19, and EfficientNet-B0 are excellent for binary classification, while EfficientNet-B4, Inception-V4, and Faster R-CNN excel in multiclass classification and object detection.58 A schematic of the extraction and analysis of key features from primary diagnostic imaging modalities for predicting clinical outcomes is shown in Figure 3.
Fig. 3.
A schematic of feature extraction from primary diagnostic imaging modalities to aid outcome prediction. Images are reproduced from Charoenlarp et al.,59 under the terms of the Creative Commons Attribution Noncommercial License (http://creativecommons.org/licenses/by-nc/3.0).
AI technologies are increasingly utilised to analyse a variety of radiological and special-technique-derived images depicting head and neck cancer and other soft and hard tissue abnormalities. CT-based textural analysis has been utilised to assess malignancies in the head and neck region.60 Spectral dual-energy computed tomography data obtained from multienergy virtual monochromatic image datasets have been used to systematically classify Warthin tumour and pleomorphic adenoma with a remarkable accuracy of 92%.60 An automated algorithm has been formulated to identify nasopharyngeal squamous cell carcinoma on PET/CT imaging, achieving a remarkable accuracy of 100% in identifying hypermetabolic lesions exceeding 1 cm in size.61 A systematic review of publications related to AI for the detection of head and neck cancer using CT, PET, MRI, planar scans, or panoramic radiographs demonstrated a wide range of accuracy (82.6%-100%), sensitivity (74%-99.68%), and specificity (66.6%-90.1%).62 Hence, AI integration into radiographic data could enhance the accuracy of detecting head and neck cancer.62 In contrast, the usefulness of ML in detecting intraosseous lesions in the jaw bones has yet to be determined, as a systematic review shows a moderate mean F1 score of 0.71, accuracy of 0.86, sensitivity of 0.82, specificity of 0.88, and precision of 0.67.63
Numerous studies have utilised whole-slide imaging of stained histopathological slides to develop algorithms for evaluating oral squamous cell carcinoma (OSCC),60 oral potentially malignant disorders (OPMD),64 laryngeal squamous cell carcinoma (SCC),65 oropharyngeal SCC,66 and multiple head and neck cancer sites.67 In these studies, various ML approaches were employed to delineate specific histological features and conduct downstream statistical analysis, aiming to differentiate between benign and malignant lesions based on differences in spatial architectural patterns.64,68 Unsupervised ML techniques have been utilised to investigate the capacity for identifying tissue compartments in oropharyngeal squamous cell carcinoma and tissue microarrays, where morphometric classification of epithelial and stromal tissues achieved a pixel-level F1 score ranging from 80% to 81%.66 DL algorithms for stimulated Raman scattering histology demonstrated significant potential in diagnosing laryngeal squamous cell carcinoma, achieving an accuracy of 90%.65 This method effectively identified tissue neoplasia at simulated resection margins, enhancing the delineation of the borders between healthy and tumour tissues, assisting precise surgical resection, and reducing the potential for disease recurrence.
AI in restorative dentistry and prosthodontics
AI technology has potential applications in restorative dentistry for various diagnostic functions and treatment approaches, including the detection of vertical root fractures,69 apical lesions,36 overhanging restorative materials,70 tooth wear evaluation,71 tooth shade matching, identification of dental caries,72,73 and classification of edentulous areas.74 These applications offer promising prospects for improving clinical practice and enhancing patient care.
As described above, numerous studies have investigated the utilisation of DL systems in various oral and maxillofacial imaging procedures, including periapical radiography,72 panoramic radiography,75 and CT images.76 Object detection functionality has been applied to identify abnormalities and anatomical structures in panoramic images.76
In addition to the above, AI has been used in restorative dentistry to predict the longevity of CAD/CAM restorations77 and to improve the prediction and precision of colour matching of artificial teeth.78 AI has also been utilised for smile analysis in the realm of aesthetic dentistry.79 Smile design, in both 2D and 3D, is employed mainly in patient communication as well as in planning restorations. However, one study has shown that participants preferred a manually created smile design over a design generated by AI.80
The treatment process for a dental crown typically involves tooth preparation, impression taking, cast trimming, restoration design, fabrication, try-in, and cementation. While CAD/CAM systems have revolutionised the design work, commercial systems have primarily relied on pre-existing tooth libraries for crown design, lacking customisation for individual patients.77 Recently, AI has been applied to address this limitation. Hwang et al.81 and Tian et al.82 introduced novel approaches based on 2D-GAN models to generate customised crowns by learning from technicians' designs, using 2D depth maps derived from 3D tooth models as training data. Additionally, Ding83 utilised a 3D-DCGAN network for crown generation, employing 3D data directly and producing crowns with morphology resembling natural teeth. Integrating AI with CAD/CAM or 3D/4D printing offers a more efficient workflow. Several groups have investigated the use of generative AI for dental prosthesis design.84 Integrating AI into design software yielded superior outcomes compared with conventional software devoid of AI. However, compared with experienced humans, knowledge-based AI remains ineffective for crown design, particularly with regard to anatomy and morphology. While generative AI can enhance the efficiency of dental practices and improve patient outcomes, its integration into dental workflows must be approached with caution, ensuring rigorous testing and continuous oversight.
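The GAN approaches referenced above train two networks against each other: a generator that produces a crown depth map and a discriminator that tries to distinguish generated maps from technician-designed ones. The PyTorch skeleton below is a heavily simplified, unconditioned sketch of that adversarial setup; the layer sizes, the 64×64 depth-map resolution, and the random training data are illustrative assumptions, not the published architectures.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a latent noise vector to a 64x64 single-channel 'depth map'."""
    def __init__(self, latent_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 64 * 64), nn.Tanh(),
        )
    def forward(self, z):
        return self.net(z).view(-1, 1, 64, 64)

class Discriminator(nn.Module):
    """Scores how 'real' (technician-designed) a depth map looks."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 64, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1),
        )
    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.rand(8, 1, 64, 64)            # stand-in for technician-designed depth maps
z = torch.randn(8, 64)

# Discriminator step: real maps labelled 1, generated maps labelled 0.
fake = G(z).detach()
loss_d = bce(D(real), torch.ones(8, 1)) + bce(D(fake), torch.zeros(8, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator step: try to make the discriminator label generated maps as real.
loss_g = bce(D(G(z)), torch.ones(8, 1))
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
print(float(loss_d), float(loss_g))
```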
Furthermore, AI has been used for shade matching85 and predicting the debonding of CAD/CAM restorations. A recent scoping review by Kong and Kim focused on the use of AI to enhance the efficiency and precision of dental crown finish lines, colour matching, and prediction of debonding probability.86 Ali et al. presented an advanced approach for automated detection and enumeration of teeth and dental prostheses in panoramic X-rays using a dual CNN system combined with an optimisation algorithm.87 The proposed method integrates separate YOLOv7 models for teeth and prosthesis detection, addressing the limitations of existing systems that fail to consider complex dental restorations such as implants, crowns, and bridges.
This novel integration of prosthesis data into the tooth numbering process resulted in high precision and recall values, demonstrating its potential to automate dental charting and improve clinical record-keeping accuracy in prosthodontics.
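One plausible way to integrate prosthesis detections into tooth numbering, as described above, is to match the bounding boxes from the two detectors by overlap and relabel any tooth that coincides with a detected crown, bridge, or implant. The sketch below illustrates such an intersection-over-union (IoU) matching step; the box format, tooth numbers, and threshold are assumptions for illustration, not the published optimisation algorithm.

```python
from typing import Dict, List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in image coordinates

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def merge(teeth: Dict[int, Box], prostheses: List[Tuple[str, Box]],
          thr: float = 0.5) -> Dict[int, str]:
    """Label each detected tooth as natural or by the prosthesis overlapping it."""
    labels = {num: "natural tooth" for num in teeth}
    for kind, pbox in prostheses:
        for num, tbox in teeth.items():
            if iou(tbox, pbox) >= thr:
                labels[num] = kind
    return labels

# Hypothetical detections (FDI tooth numbers and bounding boxes).
teeth = {36: (100, 200, 140, 260), 37: (145, 200, 185, 260)}
prostheses = [("crown", (98, 198, 142, 262))]
print(merge(teeth, prostheses))   # {36: 'crown', 37: 'natural tooth'}
```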
AI in paediatric dentistry
The use of AI in pedodontics has thus far focused primarily on detection and prediction models. Numerous ANNs and CNNs have been trained for image analysis to detect and classify supernumerary teeth, mesiodens, early childhood caries, stage of the dentition, pit and fissure sealants, dental caries, dental plaque, and ectopic eruptions.88 ML has been employed to develop oral health assessment toolkits to predict the Children's Oral Health Status Index (COHSI) score and referral for treatment needs.120 The latter toolkit had a sensitivity of 93% and a specificity of 49%.89 It can be used to evaluate treatment needs in school-based programmes or to compare pre- and post-intervention programmes to determine the change in treatment needs. Utilising an array of variables, including the criteria established by the International Caries Detection and Assessment System (ICDAS) in conjunction with additional clinical factors, ML algorithms such as decision trees, random forests, and extreme gradient boosting (XGBoost) have been applied alongside logistic regression for caries prediction.90 Studies indicate that AI models have the potential to determine the onset of caries development in both primary and permanent dentitions. Notably, clinical factors such as caries experience, the non-utilisation of fluoridated toothpaste, parental education levels, elevated frequencies of sugar intake, and inadequate parental perceptions of their children's oral health emerged as predominant predictors of caries in the permanent dentition.90 Additionally, a caries risk prediction model for public health purposes has been developed by Qu et al.; using twelve nonbiological questions, this model predicted the onset of dental caries in children aged <60 months.91
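A simplified sketch of such tabular caries-risk modelling is shown below, using scikit-learn's gradient boosting classifier on hypothetical child-level predictors (caries experience, fluoridated-toothpaste use, sugar-intake frequency, parental education); the features, data, and effect sizes are illustrative assumptions, not those of the cited studies.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400

# Hypothetical predictors per child.
caries_experience = rng.integers(0, 2, n)        # prior caries (0/1)
fluoride_paste    = rng.integers(0, 2, n)        # uses fluoridated toothpaste (0/1)
sugar_freq        = rng.integers(0, 6, n)        # sugary intakes per day
parental_edu      = rng.integers(0, 3, n)        # 0=low, 1=medium, 2=high

X = np.column_stack([caries_experience, fluoride_paste, sugar_freq, parental_edu])

# Hypothetical outcome generated so that the known risk factors actually matter.
risk = 0.8 * caries_experience - 0.6 * fluoride_paste + 0.3 * sugar_freq - 0.3 * parental_edu
y = (risk + rng.normal(0, 0.5, n) > 0.6).astype(int)   # 1 = new caries lesion

model = GradientBoostingClassifier(random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("Cross-validated AUC:", scores.mean().round(2))

model.fit(X, y)
print("Feature importances:", model.feature_importances_.round(2))
```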
AI-based dental plaque detection could be advantageous for dentists and guardians in monitoring the oral health of children. A CNN-based model demonstrated superior mean results compared with dentists for detecting dental plaque on photographs of the primary dentition.92 The model exhibited clinically acceptable performance when compared with an experienced paediatric dentist. With an accessible intraoral camera, parents should in future be able to capture intraoral images and employ such models to assess residual dental plaque.92
Further, AI models have demonstrated a superior ability to accurately identify proximal caries on bitewing radiographs in comparison with dentists. One study reported high performance metrics for AI detection of proximal caries, with the model achieving, by surface, a sensitivity of 88.8%, specificity of 98.8%, precision of 95.8%, accuracy of 96.4%, and an F1-score of 92%.93 Additionally, it has been shown that dentists using AI assistance exhibit enhanced efficiency in detecting caries on bitewing radiographs.94 A systematic review indicates that AI models are clinically acceptable for identifying proximal caries on bitewing radiographs. Additionally, AI-based smartphone applications for caries detection provide ease of home screening for parents, thus engaging both children and guardians in paediatric oral health care.95
In terms of early childhood caries, AI has been employed for caries detection, classification, and localisation. One well-developed model achieved an overall diagnostic accuracy of 97.2% using image analysis,96 although it exhibited lower accuracy in cavity segmentation. LLMs have been investigated to determine their potential for parent education regarding early childhood caries; these models responded appropriately to commonly asked questions formulated by experts.97
One of the major treatment components in paediatric patients is behaviour management.98 Integrating AI into behaviour management has been proposed through a variety of approaches. AI-adaptive personalised lessons and gamification have been introduced to develop patients' motivation and engagement in oral health and treatment compliance. AI-based virtual assistants could be beneficial in supporting explanation and feedback, and can engage children in various tasks that distract them from the treatment process. Interaction via text chatbots or voice conversation could emotionally support patients and alleviate anxiety and stress.
AI in forensic odontology
Forensic odontology plays a crucial role in identifying victims of mass disasters and in cases involving decomposed, burned, or skeletonised remains. Within this field, age estimation, sex determination, and facial reconstruction are key subdisciplines, particularly valuable when information about the deceased is scarce. AI models for these purposes have been developed and utilised in several studies, including human bite mark analysis, prediction of mandibular morphology, sex determination, age estimation, and dental comparison.85 Recent research has focused on creating automated identification systems using AI algorithms to improve these forensic procedures. Most of the developed models are based on CNNs and ANNs.
In one study, an ANN identified, on average, 82% of correct matches between bite marks and the perpetrators. Another model showed superior performance compared with support vector regression for predicting mandibular morphology from craniomaxillary variables for the purpose of forensic facial reconstruction.99 With regard to sex determination, multiple variables have been investigated in multiple models, including the hyoid bone, mandibular condyles, coronoid process, coronoid height, condyle height, ramus height, sigmoid notch, mandibular canine width, nostril width, and intermolar width.100 Notably, an ML model demonstrated markedly enhanced predictive accuracy, highlighting its potential for sex determination in forensic and anthropological work.
Age estimation is one of the fundamental components of forensic odontology, and AI can now be employed to determine an individual's age for personal identification. In particular, in scenarios where adequate documentation is absent, age estimation becomes essential for delineating the specific legal actions that may apply to that individual or to those associated with the individual. Many variables have been employed as parameters for age estimation, including hand-wrist radiographs, panoramic radiographs, and 3-dimensional computed tomography.101 High accuracy (above 95%) has been reported for AI-based age estimation using orthopantomography. The mean absolute errors (MAE) associated with age estimation from panoramic radiographs were lower in the younger age groups, whereas higher errors were observed in the older population: the MAE was 1.94 years for the 10-20 years age cohort, compared with 13.40 years for the 90-100 years group.102
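The mean absolute error quoted above is simply the average magnitude of the difference between estimated and chronological ages, as the short worked sketch below (with hypothetical ages) makes explicit.

```python
import numpy as np

# Hypothetical chronological vs AI-estimated ages (years).
true_age      = np.array([12.0, 15.5, 19.0, 64.0, 93.0])
estimated_age = np.array([13.1, 14.9, 20.2, 58.5, 80.0])

mae = np.mean(np.abs(estimated_age - true_age))
print(f"MAE = {mae:.2f} years")   # note how the larger errors in older cases inflate the MAE
```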
Lastly, personal identification is a significant aspect of forensic odontology. The process of correlating premortem and postmortem data from individuals affected by mass casualty events represents a substantial workload. Six CNN models (VGG16, ResNet50, Inception-v3, InceptionResNet-v2, Xception, and MobileNet-v2) were compared for their sensitivity in personal identification using orthopantomography.103 All models demonstrated accuracies of 80% or above,103 with the VGG16 model exhibiting the highest accuracy at 100.0%. These preliminary results demonstrate the potential of AI-assisted personal identification from a large database of orthopantomography records. These models have shown encouraging results, ushering in a new realm of AI-driven research in this area. It is highly likely that, in future, AI will play a key role in forensic odontology when the identification of individuals from hard tissues such as teeth and bone is required.
AI in oral and maxillofacial surgery
AI tools have been developed and explored for various applications in oral and maxillofacial surgery, ranging from image analysis and surgical planning to robotic-assisted treatment and clinical decision-making. One notable use of AI in image analysis is assessing impacted mandibular third molars on radiographic images to determine surgical difficulty and inform treatment planning.52 With AI assistance, less experienced practitioners can achieve diagnosis and classification of impacted mandibular third molars comparable to that of experts.76 Automated detection of oral structures in clinical and radiographic images has enhanced AI-facilitated analysis of complex datasets for optimal treatment strategies,104 for example, in assessing condylar seating prior to orthognathic surgery.105 With AI assistance, 71% of the condylar heads were correctly seated.
With AI-enhanced visualisation, surgeons can precisely evaluate treatment approaches and predicted outcomes in order to adapt and modify the appropriate options for each patient, thus supporting clinical decision-making.106 Supervised CNN models have been trained on lateral cephalograms and occlusal views of scanned dental models, and the trained models exhibited high performance in predicting the surgery-first approach.107 Such approaches could facilitate the surgical treatment planning of skeletal Class III patients, for example. A study of treatment planning for dental implant placement revealed alignment between an AI-generated plan and a clinical plan.108 Although there were some discrepancies between the two approaches, AI assistance could reduce the planning time required. Although LLMs may assist clinical decision-making, all models tested in a recent study (Bard, GPT-3.5, GPT-4, Claude-Instant, and Bing) exhibited low accuracy (less than 40%).109
Robotic-assisted surgery has been implemented in oral and maxillofacial surgery, especially in dental implant placement. Automated robot-assisted surgery augments the precision and accuracy of dental implant placement.110 In one study, such a system exhibited a high degree of accuracy, evidenced by mean deviations of 0.61 mm in the coronal plane, 0.79 mm in the apical plane, and 2.56 degrees of angular deviation, indicating precise positioning.110 Robotic-assisted surgery augmented with AI also resulted in lower point errors on bilateral osteotomy planes in a mandibular tumour model study, demonstrating the precision and accuracy of such surgery.111 Therefore, AI integration into robotic systems could enhance diagnosis, personalise surgical plans, and improve the surgical performance of less experienced surgeons in particular.
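The deviation figures reported for robot-assisted implant placement compare the planned and the achieved implant axes: linear offsets are measured at the coronal (platform) and apical ends, and the angular deviation is the angle between the two axes. The sketch below computes these three quantities for hypothetical planned and placed positions.

```python
import numpy as np

# Hypothetical planned vs achieved implant endpoints (mm, in scan coordinates).
planned_coronal, planned_apical = np.array([10.0, 20.0, 5.0]), np.array([10.0, 20.0, 16.0])
placed_coronal,  placed_apical  = np.array([10.5, 20.3, 5.0]), np.array([10.8, 20.6, 16.0])

coronal_dev = np.linalg.norm(placed_coronal - planned_coronal)   # platform offset
apical_dev  = np.linalg.norm(placed_apical - planned_apical)     # apex offset

# Angle between the planned and achieved implant axes.
v1 = planned_apical - planned_coronal
v2 = placed_apical - placed_coronal
cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
angular_dev = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

print(f"coronal {coronal_dev:.2f} mm, apical {apical_dev:.2f} mm, angle {angular_dev:.2f} deg")
```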
AI in orthodontics
Integrating AI into orthodontics enhances diagnostic efficacy and accuracy, improving treatment planning, decision-making, and patient outcomes. Automated detection of cephalometric landmarks using AI algorithms is now relatively common; in one study, accurate detection within a 2 mm precision threshold was achieved for more than 85% of landmarks.112 A systematic review revealed variability in landmark detection accuracy across studies, with reported success rates of 65%, 81%, 86%, 91%, and 96% at 1-mm, 2-mm, 2.5-mm, 3-mm, and 4-mm precision thresholds, respectively.113 Additionally, CNNs have been utilised to analyse intraoral photographs for detecting and classifying crossbites, demonstrating excellent potential in clinical image processing by identifying crossbites from non-crossbites with an accuracy of 98.57% and classifying frontal crossbites from lateral crossbites with an accuracy of 91.43%.114 Further, as AI engages in various steps of orthodontic treatment, it can be used to predict treatment plans and outcomes as well as to improve clinical decision-making.115 When seven ML models were tested for predicting tooth extraction decisions in orthodontic treatment,116 the stacking classifier model exhibited the highest accuracy and area under the curve (91.2%).
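The precision thresholds quoted for cephalometric landmark detection correspond to the success detection rate: the proportion of landmarks whose predicted position lies within a given radial distance of the expert annotation. A minimal sketch of that computation, on hypothetical coordinates, is given below.

```python
import numpy as np

# Hypothetical expert vs model landmark coordinates (mm) for one cephalogram.
expert = np.array([[10.0, 40.0], [55.2, 18.9], [80.1, 60.3], [33.3, 72.0]])
model  = np.array([[10.8, 40.5], [57.0, 19.4], [80.0, 63.5], [33.1, 71.8]])

errors = np.linalg.norm(model - expert, axis=1)      # radial error per landmark

for threshold in (1.0, 2.0, 2.5, 3.0, 4.0):          # mm thresholds used in the literature
    sdr = np.mean(errors <= threshold) * 100
    print(f"SDR @ {threshold} mm: {sdr:.0f}%")
```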
AI in orofacial pain, temporomandibular disorders, and sleep apnea management
Diagnosis of orofacial pain and temporomandibular disorders (TMD) is a challenging task. To assist and facilitate such patient care, AI models have been developed and explored. A multilayer perceptron neural network model demonstrated significantly higher diagnostic accuracy than evaluation by general dentists, especially in conditions involving nondental and referred orofacial pain.117 Further, AI algorithms developed for automated TMD diagnosis can serve as decision-support tools for clinicians. In this regard, a systematic review reports that AI models demonstrated a pooled accuracy of 0.91 and specificity ranging from 73% to 100% for the diagnosis of temporomandibular disorders, although a high risk of bias was apparent in the 17 included studies.117 Random forest and logistic regression models applied to brain imaging data were able to classify subtypes of neuropathic facial pain (trigeminal neuralgia and trigeminal neuropathic pain) and healthy controls,118 although the accuracy of this model was only 51%.118 An LLM for assisting the differential diagnosis of patients with odontogenic pain and temporomandibular disorder based on validated questionnaires has been developed, achieving an accuracy rate of 86%.119 The incorporation of advanced algorithms and extensive datasets may further enhance the diagnostic precision of orofacial pain and temporomandibular joint disorders going forward.
An automatic prediction model for diagnosing obstructive sleep apnea in temporomandibular disorder patients has been developed, achieving 80%-91% accuracy.120 Incorporating MRI data enhanced the model's performance, yielding an excellent area-under-the-curve value of 1.00. The obstructive apnea index was identified as the key predictor, while heatmap visualisations highlighted anatomical regions associated with obstructive sleep apnea, including the nasopharynx, oropharynx, uvula, larynx, and epiglottis.120 An ML model was also able to automate the analysis of mandibular jaw movements for detecting sleep bruxism.121 AI integration with conventional diagnostic approaches has the potential to transform the management of sleep disorders, facilitating personalised and effective treatment.
Conclusions
AI is redefining dentistry by enhancing diagnostic accuracy, treatment planning, and patient care. AI-driven models, including machine learning, deep learning, and neural networks, have demonstrated high precision in radiographic analysis, disease detection, prosthodontic design, and orthodontic treatment planning. Indeed, AI applications extend beyond clinical practice to fields such as forensic odontology and dental education, improving diagnostic efficacy and treatment planning. In addition, the application of AI-driven models in dental education should vastly improve the quality of both undergraduate and postgraduate education in the future.
While AI continues to evolve, its successful integration into dentistry requires robust validation, interdisciplinary collaboration, and regulatory frameworks to ensure clinical reliability. Future advancements in AI will further refine treatment modalities, optimize workflows, and contribute to evidence-based, data-driven dentistry, ultimately improving patient outcomes and reshaping the future of dental care.
Conflict of interest
All the authors declare no conflict of interest.
Acknowledgments
Availability of data and material
All the data available have been included in the manuscript.
Declaration of generative AI and AI-assisted technologies in the writing process
During the preparation of this work, the authors used AI tools to improve readability and language. After using this tool/service, the authors reviewed and edited the content as needed and took full responsibility for the publication's content.
Funding
This research received no specific grant from funding agencies in the public, commercial, or not-for-profit sectors. L.S. is partially supported by the Second Century High Potential Professoriate by Chulalongkorn University, Thailand.
Author contributions
Lakshman Samaranayake and Nozimjon Tuygunov: Data curation; investigation; methodology; formal analysis; software; writing—original draft; writing—review and editing. Falk Schwendicke: Formal analysis; investigation; methodology; software; writing—original draft; writing—review and editing. Thanaphum Osathanon and Zohaib Khurshid: Investigation; resources; software; writing—original draft; writing—review and editing. Shukhrat A. Boymuradov and Arief Cahyanto: Conceptualization; data curation; writing—review and editing.
Contributor Information
Lakshman Samaranayake, Email: lakshman@hku.hk.
Nozimjon Tuygunov, Email: nozimtuygunov@gmail.com.
References
- 1.Schwendicke F., Samek W., Krois J. Artificial intelligence in dentistry: chances and challenges. J Dent Res. 2020;99:769–774. doi: 10.1177/0022034520915714. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Russell Stuart J., Peter N. Pearson; London: 2016. Artificial Intelligence: A Modern Approach. [Google Scholar]
- 3.Amisha Malik P., Pathania M., Rathaur V.K. Overview of artificial intelligence in medicine. J Family Med Prim Care. 2019;8:2328–2331. doi: 10.4103/jfmpc.jfmpc_440_19. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.November Joseph A. JHU Press; Baltimore, MD: 2012. Biomedical Computing: Digitizing Life in the United States. [Google Scholar]
- 5.Ding H., Wu J., Zhao W., Matinlinna J.P., Burrow M.F., Tsoi J.K.H. Artificial intelligence in dentistry-A review. Front Dent Med. 2023 Feb 20;4 doi: 10.3389/fdmed.2023.1085251. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Meskó B. The impact of multimodal large language models on health care's future. J Med Internet Res. 2023;25:e52865. doi: 10.2196/52865. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Vijayakumar S., Magazzù G., Moon P., Occhipinti A., Angione C. A practical guide to integrating multimodal machine learning and metabolic modeling. Methods Mol Biol. 2022;2399:87–122. doi: 10.1007/978-1-0716-1831-8_5. [DOI] [PubMed] [Google Scholar]
- 8.Thirunavukarasu A.J., Ting D.S.J., Elangovan K., Gutierrez L., Tan T.F., Ting D.S.W. Large language models in medicine. Nat Med. 2023 Aug;29(8):1930–1940. doi: 10.1038/s41591-023-02448-8. [DOI] [PubMed] [Google Scholar]
- 9.Suárez A., Díaz-Flores García V., Algar J., Gómez Sánchez M., Llorente de Pedro M., Freire Y. Unveiling the ChatGPT phenomenon: Evaluating the consistency and accuracy of endodontic question answers. Int Endod J. 2024 Jan;57(1):108–113. doi: 10.1111/iej.13985. [DOI] [PubMed] [Google Scholar]
- 10.Rewthamrongsris P., Burapacheep J., Trachoo V., Porntaveetus T. Accuracy of large language models for infective endocarditis prophylaxis in dental procedures. Int Dent J. 2025;75:206–212. doi: 10.1016/j.identj.2024.09.033. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Rischke R., Schneider L., Müller K., Samek W., Schwendicke F., Krois J. Federated Learning in Dentistry: Chances and Challenges. J Dent Res. 2022 Oct;101(11):1269–1273. doi: 10.1177/00220345221108953. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Schneider L., Rischke R., Krois J., Krasowski A., Büttner M., Mohammad-Rahimi H., Chaurasia A., Pereira N.S., Lee J.H., Uribe S.E., Shahab S., Koca-Ünsal R.B., Ünsal G., Martinez-Beneyto Y., Brinz J., Tryfonos O., Schwendicke F. Federated vs Local vs Central Deep Learning of Tooth Segmentation on Panoramic Radiographs. J Dent. 2023 Aug;135 doi: 10.1016/j.jdent.2023.104556. [DOI] [PubMed] [Google Scholar]
- 13.Felsch Marco, Meyer Ole, Schlickenrieder Anne, Engels Paula, Schönewolf Jule, Zöllner Felicitas, Heinrich-Weltzien Roswitha, Hesenius Marc, Hickel Reinhard, Gruhn Volker, Kühnisch Jan. Detection and localization of caries and hypomineralization on dental photographs with a vision transformer model. npj Digit. Med. 2023;6:198. doi: 10.1038/s41746-023-00944-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Al-Hamadani M.N.A., Fadhel M.A., Alzubaidi L., Harangi B. Reinforcement learning algorithms and applications in healthcare and robotics: a comprehensive and systematic review. Sensors. 2024;24:2461. doi: 10.3390/s24082461. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Bahrami R., Pourhajibagher M., Nikparto N., Bahador A. Robot-assisted dental implant surgery procedure: a literature review. J Dent Sci. 2024;19:1359–1368. doi: 10.1016/j.jds.2024.03.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Khurshid Z. Digital dentistry: transformation of oral health and dental education with technology. Eur J Dent. 2023;17:943–944. doi: 10.1055/s-0043-1772674. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Srivastava B., Chandra S., Singh S., Srivastava T. Artificial intelligence in dentistry: it's applications, impact and challenges. Asian J Oral Health Allied Sci. 2023;13:7. doi: 10.25259/AJOHAS_10_2023. [DOI] [Google Scholar]
- 18.Zheng Z., Yan H., Setzer F.C., Shi K.J., Mupparapu M., Li J. Anatomically Constrained Deep Learning for Automating Dental CBCT Segmentation and Lesion Detection. IEEE Transactions on Automation Science and Engineering. 2021;18(2):603–614. doi: 10.1109/TASE.2020.3025871. Article 9219218. [DOI] [Google Scholar]
- 19.Papantonopoulos G., Takahashi K., Bountis T., Loos B. Using cellular automata experiments to model periodontitis: a first step towards understanding the nonlinear dynamics of the disease. Int J Bifurcation Chaos. 2012;23(3):2417. doi: 10.1142/S0218127413500569. [DOI] [Google Scholar]
- 20.Ossowska A., Kusiak A., Świetlik D. Evaluation of the progression of periodontitis with the use of neural networks. J Clin Med. 2022;11:4667. doi: 10.3390/jcm11164667. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Krois J., Ekert T., Meinhold L., Golla T., Kharbot B., Wittemeier A., Dörfer C., Schwendicke F. Deep Learning for the Radiographic Detection of Periodontal Bone Loss. Sci Rep. 2019 Jun 11;9(1):8495. doi: 10.1038/s41598-019-44839-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Thanathornwong B., Suebnukarn S. Automatic detection of periodontal compromised teeth in digital panoramic radiographs using faster regional convolutional neural networks. Imaging Sci Dent. 2020;50:169–174. doi: 10.5624/isd.2020.50.2.169. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Khubrani Y.H., Thomas D., Slator P.J., White R.D., Farnell D.J.J. Detection of periodontal bone loss and periodontitis from 2D dental radiographs via machine learning and deep learning: systematic review employing APPRAISE-AI and meta-analysis. Dentomaxillofac Radiol. 2025;54(2):89–108. doi: 10.1093/dmfr/twae070. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Shetty S., Talaat W., AlKawas S., Al-Rawi N., Reddy S., Hamdoon Z., Kheder W., Acharya A., Ozsahin D.U., David LR. Application of artificial intelligence-based detection of furcation involvement in mandibular first molar using cone beam tomography images- a preliminary study. BMC Oral Health. 2024 Dec 4;24(1):1476. doi: 10.1186/s12903-024-05268-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Chang H.J., Lee S.J., Yong T.H., Shin N.Y., Jang B.G., Kim J.E., Huh K.H., Lee S.S., Heo M.S., Choi S.C., Kim T.I., Yi WJ. Deep Learning Hybrid Method to Automatically Diagnose Periodontal Bone Loss and Stage Periodontitis. Sci Rep. 2020 May 5;10(1):7531. doi: 10.1038/s41598-020-64509-z. PMID: 32372049; PMCID: PMC7200807. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Lee J.H., Kim D.H., Jeong S.N., Choi S.H. Diagnosis and prediction of periodontally compromised teeth using a deep learning-based convolutional neural network algorithm. J Periodontal Implant Sci. 2018;48:114–123. doi: 10.5051/jpis.2018.48.2.114. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Santamaria P., Troiano G., Serroni M., Araùjo T.G., Ravidà A., Nibali L. Exploring the accuracy of tooth loss prediction between a clinical periodontal prognostic system and a machine learning prognostic model. Journal of Clinical Periodontology. 2024;51(10):1333–1341. doi: 10.1111/jcpe.14023. [DOI] [PubMed] [Google Scholar]
- 28.Zhao A., Chen Y., Yang H., Chen T., Rao X., Li Z. Exploring the risk factors and clustering patterns of periodontitis in patients with different subtypes of diabetes through machine learning and cluster analysis. Acta Odontol Scand. 2024 Dec 3;83:653–665. doi: 10.2340/aos.v83.42435. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Li Y., Zhang B., Li D., Zhang Y., Xue Y., Hu K. Machine Learning and Mendelian Randomization Reveal Molecular Mechanisms and Causal Relationships of Immune-Related Biomarkers in Periodontitis. Mediators Inflamm. 2024 Dec 16;2024 doi: 10.1155/mi/9983323. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Tastan Eroglu Z., Babayigit O., Ozkan Sen D., Ucan Yarkac F. Performance of ChatGPT in classifying periodontitis according to the 2018 classification of periodontal diseases. Clin Oral Investig. 2024;28:407. doi: 10.1007/s00784-024-05799-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Chatzopoulos G.S., Koidou V.P., Tsalikis L., Kaklamanos EG. Large language models in periodontology: Assessing their performance in clinically relevant questions. J Prosthet Dent. 2024 Nov 18 doi: 10.1016/j.prosdent.2024.10.020. S0022-3913(24)00714-5. [DOI] [PubMed] [Google Scholar]
- 32.Fanelli F., Saleh M., Santamaria P., Zhurakivska K., Nibali L., Troiano G. Development and Comparative Evaluation of a Reinstructed GPT-4o Model Specialized in Periodontology. J Clin Periodontol. 2024 Dec;26 doi: 10.1111/jcpe.14101. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Yauney G., Rana A., Wong L.C., Javia P., Muftu A., Shah P. Automated Process Incorporating Machine Learning Segmentation and Correlation of Oral Diseases with Systemic Health. Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:3387–3393. doi: 10.1109/EMBC.2019.8857965.
- 34.Suh B., Yu H., Cha J.K., Choi J., Kim J.W. Explainable deep learning approaches for risk screening of periodontitis. J Dent Res. 2025;104:45–53. doi: 10.1177/00220345241286488.
- 35.Chapman M.N., Nadgir R.N., Akman A.S., Saito N., Sekiya K., Kaneda T., Sakai O. Periapical lucency around the tooth: radiologic evaluation and differential diagnosis. Radiographics. 2013 Jan-Feb;33(1):E15–E32. doi: 10.1148/rg.331125172.
- 36.Setzer F.C., Shi K.J., Zhang Z., Yan H., Yoon H., Mupparapu M., Li J. Artificial Intelligence for the Computer-aided Detection of Periapical Lesions in Cone-beam Computed Tomographic Images. J Endod. 2020 Jul;46(7):987–993. doi: 10.1016/j.joen.2020.03.025.
- 37.Leonardi Dutra K., Haas L., Porporatti A.L., Flores-Mir C., Nascimento Santos J., Mezzomo L.A., Corrêa M., De Luca Canto G. Diagnostic Accuracy of Cone-beam Computed Tomography and Conventional Radiography on Apical Periodontitis: A Systematic Review and Meta-analysis. J Endod. 2016 Mar;42(3):356–364. doi: 10.1016/j.joen.2015.12.015.
- 38.Orhan K., Bayrakdar I.S., Ezhov M., Kravtsov A., Özyürek T. Evaluation of artificial intelligence for detecting periapical pathosis on cone-beam computed tomography scans. Int Endod J. 2020;53:680–689. doi: 10.1111/iej.13265.
- 39.Endres M.G., Hillen F., Salloumis M., Sedaghat A.R., Niehues S.M., Quatela O., Hanken H., Smeets R., Beck-Broichsitter B., Rendenbach C., Lakhani K., Heiland M., Gaudin R.A. Development of a Deep Learning Algorithm for Periapical Disease Detection in Dental Radiographs. Diagnostics (Basel). 2020 Jun 24;10(6):430. doi: 10.3390/diagnostics10060430.
- 40.Bayrakdar I.S., Orhan K., Çelik Ö., Bilgir E., Sağlam H., Kaplan F.A., Görür S.A., Odabaş A., Aslan A.F., Różyło-Kalinowska I. A U-Net Approach to Apical Lesion Segmentation on Panoramic Radiographs. Biomed Res Int. 2022 Jan 15;2022. doi: 10.1155/2022/7035367.
- 41.Song I.S., Shin H.K., Kang J.H., Kim J.E., Huh K.H., Yi W.J., Lee S.S., Heo M.S. Deep learning-based apical lesion segmentation from panoramic radiographs. Imaging Sci Dent. 2022 Dec;52(4):351–357. doi: 10.5624/isd.20220078.
- 42.Li S., Liu J., Zhou Z., Zhou Z., Wu X., Li Y., Wang S., Liao W., Ying S., Zhao Z. Artificial intelligence for caries and periapical periodontitis detection. J Dent. 2022 Jul;122. doi: 10.1016/j.jdent.2022.104107.
- 43.Hamdan M.H., Tuzova L., Mol A., Tawil P.Z., Tuzoff D., Tyndall D.A. The effect of a deep-learning tool on dentists' performances in detecting apical radiolucencies on periapical radiographs. Dentomaxillofac Radiol. 2022 Sep 1;51(7). doi: 10.1259/dmfr.20220122.
- 44.Sadr S., Mohammad-Rahimi H., Motamedian S.R., Zahedrozegar S., Motie P., Vinayahalingam S., Dianat O., Nosrat A. Deep Learning for Detection of Periapical Radiolucent Lesions: A Systematic Review and Meta-analysis of Diagnostic Test Accuracy. J Endod. 2023 Mar;49(3):248–261.e3. doi: 10.1016/j.joen.2022.12.007.
- 45.Slim M.L., Jacobs R., de Souza Leal R.M., Fontenele R.C. AI-driven segmentation of the pulp cavity system in mandibular molars on CBCT images using convolutional neural networks. Clin Oral Investig. 2024;28:650. doi: 10.1007/s00784-024-06009-2.
- 46.Jin L., Zhou W., Tang Y., Yu Z., Fan J., Wang L., Liu C., Gu Y., Zhang P. Detection of C-shaped mandibular second molars on panoramic radiographs using deep convolutional neural networks. Clin Oral Investig. 2024 Nov 18;28(12):646. doi: 10.1007/s00784-024-06049-8.
- 47.Saghiri M.A., Garcia-Godoy F., Gutmann J.L., Lotfi M., Asgar K. The reliability of artificial neural network in locating minor apical foramen: a cadaver study. J Endod. 2012;38:1130–1134. doi: 10.1016/j.joen.2012.05.004.
- 48.Mendonça de Moura J.D., Fontana C.E., Reis da Silva Lima V.H., de Souza Alves I., André de Melo Santos P., de Almeida Rodrigues P. Comparative accuracy of artificial intelligence chatbots in pulpal and periradicular diagnosis: A cross-sectional study. Comput Biol Med. 2024 Dec;183. doi: 10.1016/j.compbiomed.2024.109332.
- 49.Portilla N.D., Garcia-Font M., Nagendrababu V., Abbott P.V., Sanchez J.A.G., Abella F. Accuracy and Consistency of Gemini Responses Regarding the Management of Traumatized Permanent Teeth. Dent Traumatol. 2024 Oct 26. doi: 10.1111/edt.13004.
- 50.Litjens G., Kooi T., Bejnordi B.E., Setio A.A.A., Ciompi F., Ghafoorian M., van der Laak J.A.W.M., van Ginneken B., Sánchez C.I. A survey on deep learning in medical image analysis. Med Image Anal. 2017 Dec;42:60–88. doi: 10.1016/j.media.2017.07.005.
- 51.Aubreville M., Knipfer C., Oetter N., Jaremenko C., Rodner E., Denzler J., Bohr C., Neumann H., Stelzle F., Maier A. Automatic Classification of Cancerous Tissue in Laserendomicroscopy Images of the Oral Cavity using Deep Learning. Sci Rep. 2017 Sep 20;7(1):11979. doi: 10.1038/s41598-017-12320-8.
- 52.Li C., Jing B., Ke L., Li B., Xia W., He C., Qian C., Zhao C., Mai H., Chen M., Cao K., Mo H., Guo L., Chen Q., Tang L., Qiu W., Yu Y., Liang H., Huang X., Liu G., Li W., Wang L., Sun R., Zou X., Guo S., Huang P., Luo D., Qiu F., Wu Y., Hua Y., Liu K., Lv S., Miao J., Xiang Y., Sun Y., Guo X., Lv X. Development and validation of an endoscopic images-based deep learning model for detection with nasopharyngeal malignancies. Cancer Commun (Lond). 2018 Sep 25;38(1):59. doi: 10.1186/s40880-018-0325-9.
- 53.Mascharak S., Baird B.J., Holsinger F.C. Detecting oropharyngeal carcinoma using multispectral, narrow-band imaging and machine learning. Laryngoscope. 2018;128:2514–2520. doi: 10.1002/lary.27159.
- 54.Moccia S., De Momi E., Guarnaschelli M., Savazzi M., Laborai A., Guastini L., Peretti G., Mattos L.S. Confident texture-based laryngeal tissue classification for early stage diagnosis support. J Med Imaging (Bellingham). 2017 Jul;4(3). doi: 10.1117/1.JMI.4.3.034502.
- 55.Achararit P., Manaspon C., Jongwannasiri C., Phattarataratip E., Osathanon T., Sappayatosok K. Artificial Intelligence-Based Diagnosis of Oral Lichen Planus Using Deep Convolutional Neural Networks. Eur J Dent. 2023 Oct;17(4):1275–1282. doi: 10.1055/s-0042-1760300.
- 56.Warin K., Limprasert W., Suebnukarn S., Jinaporntham S., Jantana P. Performance of deep convolutional neural network for classification and detection of oral potentially malignant disorders in photographic images. Int J Oral Maxillofac Surg. 2022 May;51(5):699–704. doi: 10.1016/j.ijom.2021.09.001.
- 57.Song B., Sunny S., Uthoff R.D., Patrick S., Suresh A., Kolur T., Keerthi G., Anbarani A., Wilder-Smith P., Kuriakose M.A., Birur P., Rodriguez J.J., Liang R. Automatic classification of dual-modality, smartphone-based oral dysplasia and malignancy images using deep learning. Biomed Opt Express. 2018 Oct 10;9(11):5318–5329. doi: 10.1364/BOE.9.005318.
- 58.Mahmood H., Shaban M., Indave B.I., Santos-Silva A.R., Rajpoot N., Khurram S.A. Use of artificial intelligence in diagnosis of head and neck precancerous and cancerous lesions: A systematic review. Oral Oncol. 2020 Nov;110. doi: 10.1016/j.oraloncology.2020.104885.
- 59.Charoenlarp P., Silkosessak-Chaiudom O., Vipismakul V. Atypical periosteal reaction and unusual bone involvement of ameloblastoma: a case report with 8-year follow-up. Imaging Sci Dent. 2021;51:195–201. doi: 10.5624/isd.20200264.
- 60.Lu C., Lewis J.S., Jr., Dupont W.D., Plummer W.D., Jr., Janowczyk A., Madabhushi A. An oral cavity squamous cell carcinoma quantitative histomorphometric-based image classifier of nuclear morphology can risk stratify patients for disease-specific survival. Mod Pathol. 2017 Dec;30(12):1655–1665. doi: 10.1038/modpathol.2017.98.
- 61.Wu B., Khong P.L., Chan T. Automatic detection and classification of nasopharyngeal carcinoma on PET/CT with support vector machine. Int J Comput Assist Radiol Surg. 2012;7:635–646. doi: 10.1007/s11548-011-0669-y.
- 62.Rokhshad R., Salehi S.N., Yavari A., Shobeiri P., Esmaeili M., Manila N., Motamedian S.R., Mohammad-Rahimi H. Deep learning for diagnosis of head and neck cancers through radiographic data: a systematic review and meta-analysis. Oral Radiol. 2024 Jan;40(1):1–20. doi: 10.1007/s11282-023-00715-5.
- 63.Giraldo-Roldán D., Araújo A.L.D., Moraes M.C., da Silva V.M., Ribeiro E.C.C., Cerqueira M., Saldivia-Siracusa C., Sousa-Neto S.S., Pérez-de-Oliveira M.E., Lopes M.A., Kowalski L.P., de Carvalho A.C.P.L.F., Santos-Silva A.R., Vargas P.A. Artificial intelligence and radiomics in the diagnosis of intraosseous lesions of the gnathic bones: A systematic review. J Oral Pathol Med. 2024 Aug;53(7):415–433. doi: 10.1111/jop.13548.
- 64.Baik J., Ye Q., Zhang L., Poh C., Rosin M., MacAulay C., Guillaud M. Automated classification of oral premalignant lesions using image cytometry and Random Forests-based algorithms. Cell Oncol (Dordr). 2014 Jun;37(3):193–202. doi: 10.1007/s13402-014-0172-x.
- 65.Zhang L., Wu Y., Zheng B., Su L., Chen Y., Ma S., Hu Q., Zou X., Yao L., Yang Y., Chen L., Mao Y., Chen Y., Ji M. Rapid histology of laryngeal squamous cell carcinoma with deep-learning based stimulated Raman scattering microscopy. Theranostics. 2019 Apr 13;9(9):2541–2554. doi: 10.7150/thno.32655.
- 66.Fouad S., Randell D., Galton A., Mehanna H., Landini G. Unsupervised morphological segmentation of tissue compartments in histopathological images. PLoS One. 2017 Nov 30;12(11). doi: 10.1371/journal.pone.0188717.
- 67.Halicek M., Shahedi M., Little J.V., Chen A.Y., Myers L.L., Sumer B.D., Fei B. Head and Neck Cancer Detection in Digitized Whole-Slide Histology Using Convolutional Neural Networks. Sci Rep. 2019 Oct 1;9(1):14043. doi: 10.1038/s41598-019-50313-x.
- 68.Christie B., Musri N., Djustiana N., Takarini V., Tuygunov N., Zakaria M.N., Cahyanto A. Advances and challenges in regenerative dentistry: A systematic review of calcium phosphate and silicate-based materials on human dental pulp stem cells. Mater Today Bio. 2023 Sep 23;23. doi: 10.1016/j.mtbio.2023.100815.
- 69.Fukuda M., Inamoto K., Shibata N., Ariji Y., Yanashita Y., Kutsuna S., Nakata K., Katsumata A., Fujita H., Ariji E. Evaluation of an artificial intelligence system for detecting vertical root fracture on panoramic radiography. Oral Radiol. 2020 Oct;36(4):337–343. doi: 10.1007/s11282-019-00409-x.
- 70.Magat G., Altındag A., Pertek Hatipoglu F., Hatipoglu O., Bayrakdar İ.S., Celik O., Orhan K. Automatic deep learning detection of overhanging restorations in bitewing radiographs. Dentomaxillofac Radiol. 2024 Oct 1;53(7):468–477. doi: 10.1093/dmfr/twae036.
- 71.Jaiswal P., Bhirud S.G. Study and Analysis of an Approach Towards the Classification of Tooth Wear in Dentistry Using Machine Learning Technique. 2021 IEEE International Conference on Technology, Research, and Innovation for Betterment of Society (TRIBES); 2021. pp. 1–6.
- 72.Lee J.H., Kim D.H., Jeong S.N., Choi S.H. Detection and diagnosis of dental caries using a deep learning-based convolutional neural network algorithm. J Dent. 2018;77:106–111. doi: 10.1016/j.jdent.2018.07.015.
- 73.Tuygunov N., Khairunnisa Z., Yahya N.A., Aziz A.A., Zakaria M.N., Israilova N.A., Cahyanto A. Bioactivity and remineralization potential of modified glass ionomer cement: A systematic review of the impact of calcium and phosphate ion release. Dent Mater J. 2024 Jan 30;43(1):1–10. doi: 10.4012/dmj.2023-132.
- 74.Khurshid Z., Waqas M., Hasan S., Kazmi S., Faheemuddin M. Deep learning architecture to infer Kennedy classification of partially edentulous arches using object detection techniques and piecewise annotations. Int Dent J. 2025;75:223–235. doi: 10.1016/j.identj.2024.11.005.
- 75.Murata M., Ariji Y., Ohashi Y., Kawai T., Fukuda M., Funakoshi T., Kise Y., Nozawa M., Katsumata A., Fujita H., Ariji E. Deep-learning classification using convolutional neural network for evaluation of maxillary sinusitis on panoramic radiography. Oral Radiol. 2019 Sep;35(3):301–307. doi: 10.1007/s11282-018-0363-7.
- 76.Vinayahalingam S., Xi T., Bergé S., Maal T., de Jong G. Automated detection of third molars and mandibular nerve by deep learning. Sci Rep. 2019;9:9007. doi: 10.1038/s41598-019-45487-3.
- 77.Chen Y., Lee J.K.Y., Kwong G., Pow E.H.N., Tsoi J.K.H. Morphology and fracture behavior of lithium disilicate dental crowns designed by human and knowledge-based AI. J Mech Behav Biomed Mater. 2022;131. doi: 10.1016/j.jmbbm.2022.105256.
- 78.Yamaguchi S., Lee C., Karaer O., Ban S., Mine A., Imazato S. Predicting the Debonding of CAD/CAM Composite Resin Crowns with AI. J Dent Res. 2019 Oct;98(11):1234–1238. doi: 10.1177/0022034519867641.
- 79.Baaj R.E., Alangari T.A. Artificial intelligence applications in smile design dentistry: A scoping review. J Prosthodont. 2024 Dec 9. doi: 10.1111/jopr.14000.
- 80.Ceylan G., Özel G.S., Memişoglu G., Emir F., Şen S. Evaluating the Facial Esthetic Outcomes of Digital Smile Designs Generated by Artificial Intelligence and Dental Professionals. Applied Sciences. 2023;13(15):9001. doi: 10.3390/app13159001.
- 81.Hwang J.J., Azernikov S., Efros A., Yu S. Learning Beyond Human Expertise with Generative Models for Dental Restorations. arXiv. 2018. doi: 10.48550/arXiv.1804.00064.
- 82.Tian S., Wang M., Dai N., Ma H., Li L., Fiorenza L., Sun Y., Li Y. DCPR-GAN: Dental Crown Prosthesis Restoration Using Two-Stage Generative Adversarial Networks. IEEE J Biomed Health Inform. 2022 Jan;26(1):151–160. doi: 10.1109/JBHI.2021.3119394.
- 83.Ding H., Cui Z., Maghami E., Chen Y., Matinlinna J.P., Pow E.H.N., Fok A.S.L., Burrow M.F., Wang W., Tsoi J.K.H. Morphology and mechanical performance of dental crown designed by 3D-DCGAN. Dent Mater. 2023 Mar;39(3):320–332. doi: 10.1016/j.dental.2023.02.001.
- 84.Cho J.H., Yi Y., Choi J., Ahn J., Yoon H.I., Yilmaz B. Time efficiency, occlusal morphology, and internal fit of anatomic contour crowns designed by dental software powered by generative adversarial network: A comparative study. J Dent. 2023 Nov;138. doi: 10.1016/j.jdent.2023.104739.
- 85.Shetty S., Gali S., Augustine D., Sv S. Artificial intelligence systems in dental shade-matching: a systematic review. J Prosthodont. 2024;33:519–532. doi: 10.1111/jopr.13805.
- 86.Kong H.-J., Kim Y.-L. Application of artificial intelligence in dental crown prosthesis: a scoping review. BMC Oral Health. 2024;24:937. doi: 10.1186/s12903-024-04657-0.
- 87.Ali M.A., Fujita D., Kobashi S. Teeth and prostheses detection in dental panoramic X-rays using CNN-based object detector and a priori knowledge-based algorithm. Sci Rep. 2023;13:16542. doi: 10.1038/s41598-023-43591-z.
- 88.Alharbi N., Alharbi A.S. AI-driven innovations in pediatric dentistry: enhancing care and improving outcome. Cureus. 2024;16:e69250. doi: 10.7759/cureus.69250.
- 89.Wang Y., Hays R.D., Marcus M., Maida C.A., Shen J., Xiong D., Coulter I.D., Lee S.Y., Spolsky V.W., Crall J.J., Liu H. Developing Children's Oral Health Assessment Toolkits Using Machine Learning Algorithm. JDR Clin Trans Res. 2020 Jul;5(3):233–243. doi: 10.1177/2380084419885612.
- 90.Toledo Reyes L., Knorst J.K., Ortiz F.R., Brondani B., Emmanuelli B., Saraiva Guedes R., Mendes F.M., Ardenghi T.M. Early Childhood Predictors for Dental Caries: A Machine Learning Approach. J Dent Res. 2023 Aug;102(9):999–1006. doi: 10.1177/00220345231170535.
- 91.Qu X., Zhang C., Houser S.H., Zhang J., Zou J., Zhang W., Zhang Q. Prediction model for early childhood caries risk based on behavioral determinants using a machine learning algorithm. Comput Methods Programs Biomed. 2022 Dec;227. doi: 10.1016/j.cmpb.2022.107221.
- 92.You W., Hao A., Li S., Wang Y., Xia B. Deep learning-based dental plaque detection on primary teeth: a comparison with clinical assessments. BMC Oral Health. 2020;20:141. doi: 10.1186/s12903-020-01114-6.
- 93.Gonzalez C., Badr Z., Güngör H.C., Han S., Hamdan M.D. Identifying primary proximal caries lesions in pediatric patients from bitewing radiographs using artificial intelligence. Pediatr Dent. 2024;46:332–336.
- 94.Mertens S., Krois J., Cantu A.G., Arsiwala L.T., Schwendicke F. Artificial intelligence for caries detection: randomized trial. J Dent. 2021;115. doi: 10.1016/j.jdent.2021.103849.
- 95.Al-Jallad N., Ly-Mapes O., Hao P., Ruan J., Ramesh A., Luo J., Wu T.T., Dye T., Rashwan N., Ren J., Jang H., Mendez L., Alomeir N., Bullock S., Fiscella K., Xiao J. Artificial intelligence-powered smartphone application, AICaries, improves at-home dental caries screening in children: Moderated and unmoderated usability test. PLOS Digit Health. 2022;1(6). doi: 10.1371/journal.pdig.0000046.
- 96.Schwarzmaier J., Frenkel E., Neumayr J., Ammar N., Kessler A., Schwendicke F., Kühnisch J., Dujic H. Validation of an Artificial Intelligence-Based Model for Early Childhood Caries Detection in Dental Photographs. J Clin Med. 2024 Sep 3;13(17):5215. doi: 10.3390/jcm13175215.
- 97.Elkarmi R., Abu-Ghazaleh S., Sonbol H., Haha O., AL-Haddad A., Hassona Y. ChatGPT for parents' education about early childhood caries: A friend or foe? Int J Paediatr Dent. 2024;00:1–8. doi: 10.1111/ipd.13283.
- 98.Acharya S., Godhi B.S., Saxena V., Assiry A.A., Alessa N.A., Dawasaz A.A., Alqarni A., Karobari M.I. Role of artificial intelligence in behavior management of pediatric dental patients-a mini review. J Clin Pediatr Dent. 2024 May;48(3):24–30. doi: 10.22514/jocpd.2024.055.
- 99.Niño-Sandoval T.C., Guevara Pérez S.V., González F.A., Jaque R.A., Infante-Contreras C. Use of automated learning techniques for predicting mandibular morphology in skeletal class I, II and III. Forensic Sci Int. 2017;281:187.e1–187.e7. doi: 10.1016/j.forsciint.2017.10.004.
- 100.Ferraz A.X., Schroder Â.G.D., Gonçalves F.M., Küchler E.C., Santos R.S., Zeigelboim B.S., Pezzin A.P.T., Taveira K.V., Abuabara A., Baratto-Filho F., de Araujo C.M. Artificial intelligence model for predicting sexual dimorphism through the hyoid bone in adult patients. PLoS One. 2024 Nov 19;19(11). doi: 10.1371/journal.pone.0310811.
- 101.Keyrouz Y., Saade M., Nahas Gholmieh M., Saadé A. A machine learning approach for age prediction based on trigeminal landmarks. J Forensic Leg Med. 2024;107. doi: 10.1016/j.jflm.2024.102742.
- 102.Bizjak Ž., Robič T. Dentage: deep learning for automated age prediction using panoramic dental X-ray images. J Forensic Sci. 2024;69:2069–2074. doi: 10.1111/1556-4029.15629.
- 103.Matsuda S., Miyamoto T., Yoshimura H., Hasegawa T. Personal identification with orthopantomography using simple convolutional neural networks: a preliminary study. Sci Rep. 2020;10:13559. doi: 10.1038/s41598-020-70474-4.
- 104.Macrì M., D'Albis V., D'Albis G., Forte M., Capodiferro S., Favia G., Alrashadah A.O., García V.D., Festa F. The Role and Applications of Artificial Intelligence in Dental Implant Planning: A Systematic Review. Bioengineering (Basel). 2024 Jul 31;11(8):778. doi: 10.3390/bioengineering11080778.
- 105.Berends B., Vinayahalingam S., Baan F., Flügge T., Maal T., Bergé S., de Jong G., Xi T. Automated condylar seating assessment using a deep learning-based three-step approach. Clin Oral Investig. 2024 Sep 4;28(9):512. doi: 10.1007/s00784-024-05895-w.
- 106.Thompson R.F., Valdes G., Fuller C.D., Carpenter C.M., Morin O., Aneja S., Lindsay W.D., Aerts H.J.W.L., Agrimson B., Deville C., Jr., Rosenthal S.A., Yu J.B., Thomas C.R., Jr. Artificial intelligence in radiation oncology: A specialty-wide disruptive transformation? Radiother Oncol. 2018 Dec;129(3):421–426. doi: 10.1016/j.radonc.2018.05.030.
- 107.Chang J.S., Ma C.Y., Ko E.W.C. Prediction of surgery-first approach orthognathic surgery using deep learning models. Int J Oral Maxillofac Surg. 2024;53:942–949. doi: 10.1016/j.ijom.2024.05.003.
- 108.Satapathy S.K., Kunam A., Rashme R., Sudarsanam P.P., Gupta A., Kumar H.S.K. AI-Assisted Treatment Planning for Dental Implant Placement: Clinical vs AI-Generated Plans. J Pharm Bioallied Sci. 2024 Feb;16(Suppl 1):S939–S941. doi: 10.4103/jpbs.jpbs_1121_23.
- 109.Azadi A., Gorjinejad F., Mohammad-Rahimi H., Tabrizi R., Alam M., Golkar M. Evaluation of AI-generated responses by different artificial intelligence chatbots to the clinical decision-making case-based questions in oral and maxillofacial surgery. Oral Surg Oral Med Oral Pathol Oral Radiol. 2024 Jun;137(6):587–593. doi: 10.1016/j.oooo.2024.02.018.
- 110.Wu Y., Zou S., Lv P., Wang X. Accuracy of an autonomous dental implant robotic system in dental implant surgery. J Prosthet Dent. 2024 Aug 13:S0022-3913(24)00501-8. doi: 10.1016/j.prosdent.2024.07.020.
- 111.Zhao Z., Zhang Y., Lin L., Huang W., Xiao C., Liu J., Chai G. Intelligent electromagnetic navigation system for robot-assisted intraoral osteotomy in mandibular tumor resection: a model experiment. Front Immunol. 2024 Jul 25;15. doi: 10.3389/fimmu.2024.1436276.
- 112.Ye H., Cheng Z., Ungvijanpunya N., Chen W., Cao L., Gou Y. Is automatic cephalometric software using artificial intelligence better than orthodontist experts in landmark identification? BMC Oral Health. 2023 Jul 8;23(1):467. doi: 10.1186/s12903-023-03188-4.
- 113.Londono J., Ghasemi S., Hussain Shah A., Fahimipour A., Ghadimi N., Hashemi S., Khurshid Z., Dashti M. Evaluation of deep learning and convolutional neural network algorithms accuracy for detecting and predicting anatomical landmarks on 2D lateral cephalometric images: A systematic review and meta-analysis. Saudi Dent J. 2023 Jul;35(5):487–497. doi: 10.1016/j.sdentj.2023.05.014.
- 114.Noeldeke B., Vassis S., Sefidroodi M., Pauwels R., Stoustrup P. Comparison of deep learning models to detect crossbites on 2D intraoral photographs. Head Face Med. 2024;20:45. doi: 10.1186/s13005-024-00448-8.
- 115.Etemad L.E., Heiner J.P., Amin A.A., Wu T.H., Chao W.L., Hsieh S.J., Sun Z., Guez C., Ko C.C. Effectiveness of Machine Learning in Predicting Orthodontic Tooth Extractions: A Multi-Institutional Study. Bioengineering (Basel). 2024 Aug 31;11(9):888. doi: 10.3390/bioengineering11090888.
- 116.Köktürk B., Pamukçu H., Gözüaçık Ö. Evaluation of different machine learning algorithms for extraction decision in orthodontic treatment. Orthod Craniofac Res. 2024;27(Suppl 2):13–24. doi: 10.1111/ocr.12811.
- 117.Kreiner M., Viloria J. A novel artificial neural network for the diagnosis of orofacial pain and temporomandibular disorders. J Oral Rehabil. 2022;49:884–889. doi: 10.1111/joor.13350.
- 118.Latypov T.H., So M.C., Hung P.S., Tsai P., Walker M.R., Tohyama S., Tawfik M., Rudzicz F., Hodaie M. Brain imaging signatures of neuropathic facial pain derived by artificial intelligence. Sci Rep. 2023 Jul 3;13(1):10699. doi: 10.1038/s41598-023-37034-y.
- 119.de Araujo B.M.M., de Jesus Freitas P.F., Deliga Schroder A.G., Küchler E.C., Baratto-Filho F., Ditzel Westphalen V.P., Carneiro E., Xavier da Silva-Neto U., de Araujo C.M. PAINe: An Artificial Intelligence-based Virtual Assistant to Aid in the Differentiation of Pain of Odontogenic versus Temporomandibular Origin. J Endod. 2024 Dec;50(12):1761–1765.e2. doi: 10.1016/j.joen.2024.09.008.
- 120.Li Z., Jia Y., Li Y., Han D. Automatic prediction of obstructive sleep apnea event using deep learning algorithm based on ECG and thoracic movement signals. Acta Otolaryngol. 2024;144:52–57. doi: 10.1080/00016489.2024.2301732.
- 121.Martinot J.B., Le-Dong N.N., Cuthbert V., Denison S., Gozal D., Lavigne G., Pépin J.L. Artificial Intelligence Analysis of Mandibular Movements Enables Accurate Detection of Phasic Sleep Bruxism in OSA Patients: A Pilot Study. Nat Sci Sleep. 2021 Aug 23;13:1449–1459. doi: 10.2147/NSS.S320664.
Data Availability Statement
All available data have been included in the manuscript.



