Abstract
Background
Age and sex estimation, which is crucial in forensic odontology, traditionally relies on complex, time-consuming methods prone to human error. This study proposes an AI-driven approach using deep learning to estimate age and sex from panoramic radiographs of Thai children and adolescents.
Methods
This study analyzed 4627 images from 2491 panoramic radiographs of Thai individuals aged 7 to 23 years. A supervised multitask model, built upon the EfficientNetB0 architecture, was developed to simultaneously estimate age and classify sex. The model was trained using a 2-phase process of transfer learning and fine-tuning. Following the development of an initial baseline model for the entire 7 to 23-year cohort, 2 primary age-stratified models (7-14 and 15-23 years) were subsequently developed to enhance predictive accuracy. All models were validated against the subjects' chronological age and biological sex.
Results
The age estimation model for individuals aged 7 to 23 years yielded a root mean square error (RMSE) of 1.67 and mean absolute error (MAE) of 1.15, with 71.0% accuracy in predicting dental-chronological age differences within 1 year. Age-stratified analysis revealed that the model showed superior performance in the younger cohort (7-14 years), with RMSE of 0.95, MAE of 0.62, and accuracy of 90.3%. Performance declined substantially in the older age group (15-23 years), where RMSE, MAE, and accuracy values were 1.87, 1.41, and 63.8%, respectively. The sex recognition model achieved good overall performance for individuals aged 7 to 23 years (area under curve [AUC] = 0.94, accuracy = 87.8%, sensitivity = 89%, specificity = 87%). In contrast to age estimation, sex recognition performance improved notably in the older cohort (15-23 years): AUC of 0.99, 94.7% accuracy, 92% sensitivity, and 98% specificity.
Conclusion
This novel AI-based age and sex identification model exhibited good performance metrics, suggesting the potential to serve as an alternative to traditional methods as a diagnostic tool for characterizing both living individuals and deceased bodies.
Key words: Age estimation, Artificial intelligence, Deep learning, Sex estimation, Panoramic radiograph, Forensic odontology
Introduction
Age and sex estimation are among the most significant contributions of forensic odontology to the identification of living humans and deceased bodies. Teeth and jawbones, which are durable tissues and less influenced by nutritional and environmental factors, serve as major sources of evidence for estimating both chronological age and sex in unidentified human remains.1 This is particularly valuable when individuals cannot provide identity documentation, such as adopted persons, unaccompanied minors seeking asylum, illegal cross-border immigration, and other civil issues. Furthermore, age estimation through dental development plays a crucial role in assessing the biological age of children and adolescents. This information is essential for diagnosis and treatment planning in pediatric dentistry and orthodontics, especially when the biological/physiological age does not align with the chronological age of the individual.
Several methods have been proposed to estimate chronological age using dental evidence obtained from dental radiographs. In children and adolescents, the stage of crown-root formation and tooth eruption of primary or permanent teeth or both are utilized to estimate dental age, which is then converted to predict chronological age.2 However, in individuals who have completed root formation in permanent teeth (typically 16 to 23 years old, excluding the third molar), age is commonly predicted based on the developmental stages of the third molar.3 Another proposed assessment method is the morphological approach, which estimates age in individuals whose third molars have fully developed roots. This technique involves either evaluating physical and histological changes in teeth and surrounding tissues or assessing the ratio between dental pulp and root dimensions.4, 5, 6
Previous studies have shown that the size and shape of the jawbone can provide evidence for estimating the sex of an unidentified victim.7,8 Mandibular morphology is primarily influenced by masticatory forces, which often vary between males and females. Canine teeth, which are less prone to developing dental caries and periodontal disease compared to the other teeth, have been utilized to predict sex and represent one of the highest levels of human sexual dimorphism.9,10
The feasibility of assessing age or sex using radiographs of the teeth and jawbones has been demonstrated, as these techniques are simple, minimally invasive, and applicable to both living and deceased individuals. However, most conventional methods used to analyze age and sex are complex and time-consuming, and they rely on human vision, which is susceptible to human error.11
The application of artificial intelligence (AI) technology in dentistry and medicine is currently experiencing significant growth.12,13 AI is a field that involves software and hardware systems designed to perform tasks through learning, perception, reasoning, and decision-making, using mathematical models and computational processes. It can perform human-like activities more quickly, accurately, and precisely than humans, particularly in complex data pattern recognition and expedited diagnostics.14
Many previous studies have utilized AI technology to estimate age and identify sex from dental radiographs.15 In the early stages of applying AI technology for age and sex estimation, most studies employed AI-based systems under the guidance of traditional methods, that is, hybrid approaches that combine automated deep learning techniques with traditional anatomical knowledge or constraints, such as incorporating established forensic landmarks or measurement criteria. For instance, De Tobel et al. utilized an AI-based model to stage the development of lower third molars on panoramic radiographs,16 while Galibourg et al. applied AI technology to enhance the Demirjian and Williams methods for dental age estimation.17 Similarly, Shen et al. employed AI technology to assist the Cameriere method for dental age estimation.18 Recently, there has been a paradigm shift toward fully autonomous deep learning approaches that eliminate the need for manual feature extraction or traditional anatomical landmark identification. These methods employ sophisticated neural network architectures, including convolutional neural networks (CNNs) and attention-based networks, to process raw radiographic images directly and learn discriminative features automatically. Several notable examples demonstrate the effectiveness of autonomous approaches. Park et al. developed ForensicNet, a multitask deep learning network for automatic sex and chronological age estimation from panoramic radiographs.19 Similarly, Wang et al. employed 2 CNNs (VGG16 and ResNet101) for dental age estimation,20 while Vila-Blanco et al. developed DASNet for the same purpose.21
Although several deep learning studies have attempted age estimation from panoramic radiographs, few have been conducted in the Thai population. Moreover, various CNN architectures have been used to develop age and sex estimation models from panoramic radiographs. While each architecture has distinct strengths and weaknesses, EfficientNetB0 stands out for its optimal balance of accuracy and efficiency, achieving competitive performance with fewer parameters and reduced computational requirements compared to conventional architectures.22 Despite these advantages, relatively few studies have explored the application of EfficientNetB0 for age and sex prediction from dental radiographs.23,24
This research therefore aimed to develop an artificial intelligence model using EfficientNetB0 architecture for age and sex estimation from panoramic radiographs in a Thai population. Focusing on the 7-23 years age range, where dental development markers remain most reliable, this study sought to establish a robust baseline model with improved accuracy as a foundation for future expansion to broader age groups.
Methods
This retrospective cross-sectional diagnostic accuracy study was reviewed and approved by the Institutional Review Board of the Faculty of Dentistry/Faculty of Pharmacy, Mahidol University (COA.No.MU-DT/PY-IRB 2022/055.2010).
Sample size determination
A list of patients who underwent panoramic radiography at the Oral and Maxillofacial Radiology Clinic at the Faculty of Dentistry, Mahidol University, between January 2012 and November 2022 was retrieved from the hospital database. The inclusion criteria were Thai nationality, age 7 to 23 years, and good general health. Subjects were excluded if their radiographic images met either of these exclusion criteria: (a) low-resolution panoramic image, defined as less than 649 × 490 pixels for 8-bit DICOM files or less than 2105 × 1528 pixels for 12-bit DICOM files; or (b) the image showed agenesis of permanent teeth, tooth anomalies, malposition or extracted permanent teeth, or pathology in both arches on the same side or in the maxillary and mandibular arches on contralateral sides.
Sample size determination was based on ensuring adequate representation across 17 age categories and 2 sex groups. Previous studies have shown that deep learning models for medical imaging typically require 50-200 samples per class for convergence.25,26 We targeted a minimum of 100 samples per age-sex combination, resulting in 4627 images after exclusions.
Two investigators screened the medical status of the patients from their dental charts, evaluated their panoramic image quality, and included only the images that met the eligible criteria. The ages of the subjects at the time of taking the radiograph were recorded. Age and sex from hospital records served as the reference standards, with chronological age calculated from birth date and panoramic radiograph acquisition date. All radiographs were exported from the hospital database in de-identified format, with all patient-identifying information removed.
Panoramic image preparation
Images were preprocessed through a standardized pipeline. From each panoramic radiograph, we extracted 2 images by cropping 60% of the central region from both the right and left sides (Figure 1), focusing on the dental structures while excluding lateral artifacts. The left-side images were then horizontally flipped to create a uniform right-side orientation for all images (Figure 1). Images were subsequently resized from their original resolutions (649 × 490 or 2105 × 1528 pixels) to 224 × 224 pixels using bilinear interpolation with antialiasing to prevent noise introduction during downsampling. Pixel values were normalized to a [0,1] range.
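The cropping, flipping, and normalization steps above can be sketched as follows. This is a minimal NumPy illustration: the exact crop geometry is our assumption, the function names are hypothetical, and the bilinear 224 × 224 resize with antialiasing (e.g., via `tf.image.resize`) is omitted for brevity.

```python
import numpy as np

def crop_sides(img, frac=0.60):
    """Split a panoramic image into right and left halves, keep the
    central `frac` of columns of each half, and flip the left-side
    crop so all images share a uniform right-side orientation.
    (Sketch: the study's exact crop geometry may differ.)"""
    h, w = img.shape
    half = w // 2
    right, left = img[:, :half], img[:, half:]

    def central(a):
        cw = int(a.shape[1] * frac)
        start = (a.shape[1] - cw) // 2
        return a[:, start:start + cw]

    right_crop = central(right)
    left_crop = central(left)[:, ::-1]  # horizontal flip
    return right_crop, left_crop

def normalize(img):
    """Scale 8-bit pixel values into the [0, 1] range."""
    return img.astype("float32") / 255.0
```

In a full pipeline, each crop would then be resized to 224 × 224 (bilinear, antialiased) before normalization.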
Figure 1.
Panoramic image preparation and model design.
The images were randomly divided into 2 groups, with approximately 80% allocated for developing the model and the remainder assigned to the test set. Within the development set, the images were further randomly divided into 2 groups, with about 75% reserved for training and the remainder for model validation. Data partitioning was performed at the patient level to ensure that all images from the same individual were confined to a single dataset.
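The patient-level partitioning can be sketched as follows (an illustrative implementation, not the study's actual code; `patient_of` maps each image ID to its patient ID, and all names are hypothetical):

```python
import random

def patient_level_split(image_ids, patient_of,
                        dev_frac=0.80, train_frac=0.75, seed=42):
    """Partition images at the patient level (sketch): every image of a
    given patient ends up in exactly one of train/validation/test,
    so no individual leaks across partitions."""
    patients = sorted({patient_of[i] for i in image_ids})
    random.Random(seed).shuffle(patients)
    n_dev = int(len(patients) * dev_frac)          # ~80% development
    dev, test = patients[:n_dev], patients[n_dev:]
    n_train = int(len(dev) * train_frac)           # ~75% of development
    train, val, test = set(dev[:n_train]), set(dev[n_train:]), set(test)

    def pick(pset):
        return [i for i in image_ids if patient_of[i] in pset]

    return pick(train), pick(val), pick(test)
```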
Model development
This study employed the DeepToothDuo supervised multitask learning approach to estimate age and sex simultaneously (Figure 1).27 All models were developed using Google Colaboratory Pro with a GPU kernel. The models were implemented in Python 3.9 using TensorFlow 2.15. The model development process was based on the transfer learning paradigm using a pretrained EfficientNetB0 architecture. Our methodology involved a phased approach resulting in 3 distinct models. First, a baseline model was trained on the entire 7 to 23 years cohort. Analysis of this model's performance prompted the development of our 2 primary, age-stratified models: one for the 7 to 14 years cohort and another for the 15 to 23-year cohort.
Training phase
The training for each of these 3 models comprised 2 main phases: Transfer Learning and Fine-tuning (Figure 2).
Figure 2.
Illustration of the overall development phases of a multitask age-sex estimation model. (A) The backbone of deep CNN is the EfficientNetB0 pretrained model with ImageNet dataset. (B) The model is trained with the transfer learning technique to simultaneously perform age regression and sex classification. (C) Finally, the model undergoes fine-tuning to optimize its performance for both the age and sex estimation tasks.
Phase I: Transfer learning
This study modified the pretrained EfficientNetB0 by replacing the original classification layer with 2 task-specific output nodes: one regression node for age estimation and one binary classification node for sex determination. With all convolutional layers of the base model frozen, we trained only these new output layers using a weighted composite loss (75% mean squared error [MSE] for age, 25% binary cross-entropy for sex) to adapt them to our specific tasks. For the initial baseline model (7-23 years), this phase consisted of 3250 epochs, optimized with the Adam optimizer and a batch size of 16. For the subsequent primary models, this phase involved 3500 epochs, as exemplified by the 7 to 14 years model.
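The 75/25 weighting of the composite loss can be illustrated numerically. This NumPy sketch is ours (in Keras, the equivalent effect is typically achieved with per-output losses and `loss_weights` at compile time); the function name and signature are hypothetical.

```python
import numpy as np

def composite_loss(age_true, age_pred, sex_true, sex_prob,
                   w_age=0.75, w_sex=0.25, eps=1e-7):
    """Weighted multitask loss (sketch): 75% MSE on the age regression
    output + 25% binary cross-entropy on the sex probability output."""
    age_true, age_pred = np.asarray(age_true, float), np.asarray(age_pred, float)
    mse = np.mean((age_true - age_pred) ** 2)

    # Clip probabilities away from 0/1 to keep the log terms finite.
    p = np.clip(np.asarray(sex_prob, float), eps, 1 - eps)
    y = np.asarray(sex_true, float)
    bce = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    return w_age * mse + w_sex * bce
```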
Phase II: Fine tuning
Following the initial transfer learning phase, we unfroze all network weights except for the first 2 convolutional blocks and continued training with a reduced learning rate. This phase allowed for small, gradual adjustments to the pretrained weights, refining the learned features for the specific nuances of our panoramic radiograph dataset. For the baseline model, this fine-tuning phase consisted of 3250 epochs. For the primary models, this phase was more extensive, involving 7000 epochs, as exemplified by the 7 to 14 years model. The complete training for the baseline model required approximately 12 hours. The more extensive training for the primary models required approximately 19 hours, with comparable durations for both the 7 to 14 and 15 to 23 years models. The final trained models can process a single panoramic radiograph in less than 1 second on standard clinical hardware without GPU acceleration.
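The selective unfreezing in this phase can be shown schematically. This is a toy sketch, not the study's Keras code: the `Layer` stand-in and the block-name prefixes (patterned after EfficientNetB0's Keras layer naming, e.g. `block1a_...`) are assumptions.

```python
class Layer:
    """Minimal stand-in for a Keras layer with a trainable flag."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

def selective_unfreeze(layers, frozen_prefixes=("stem", "block1", "block2")):
    """Phase II sketch: unfreeze all weights except the stem and the
    first 2 convolutional blocks, which stay frozen."""
    for layer in layers:
        layer.trainable = not layer.name.startswith(frozen_prefixes)
    return layers
```

In practice this runs together with a reduced learning rate, so the pretrained weights shift only gradually toward the radiograph domain.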
Validation process
After the training process, each model underwent validation. The model scores were collected and summarized. If the results were not satisfactory, the classifier configuration and loss weights were readjusted, and this process was repeated until satisfactory results were achieved. The final model was chosen based on the best validation performance, prioritizing the lowest mean absolute error (MAE) for age and the highest accuracy for sex.
Evaluation of model performance
The final models for age and sex estimation were assessed for their performance using the test image sets. Age estimation performance was evaluated based on root mean square error (RMSE), mean absolute error (MAE), and accuracy in predicting absolute differences between dental and chronological age within 1 year. Sex estimation performance was evaluated using accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC). No ensembling methods were applied; all results were based on single trained models.
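The reported metrics can be computed as in the following NumPy sketch (function names are ours; the rank-based AUC shown ignores tied scores, which standard implementations handle by rank averaging):

```python
import numpy as np

def age_metrics(y_true, y_pred, tol=1.0):
    """RMSE, MAE, and the fraction of predictions whose absolute
    dental-chronological age difference is within `tol` years."""
    err = np.asarray(y_pred, float) - np.asarray(y_true, float)
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    acc_within = float(np.mean(np.abs(err) <= tol))
    return rmse, mae, acc_within

def roc_auc(labels, scores):
    """Rank-based AUC (Mann-Whitney U statistic); sketch without
    tie handling."""
    labels = np.asarray(labels)
    ranks = np.empty(len(labels), dtype=float)
    ranks[np.argsort(scores)] = np.arange(1, len(labels) + 1)
    n_pos = int(labels.sum())
    n_neg = len(labels) - n_pos
    u = ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2.0
    return u / (n_pos * n_neg)
```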
Results
A total of 6546 panoramic radiographs were screened for eligibility. Of these, 2491 radiographs met the inclusion criteria. After cropping each of these radiographs into 2 images, 4627 of the resulting 4982 images met the eligibility criteria. Model development used 3785 images, while 842 images were reserved to evaluate model performance (Figure 3). The distribution of images across age and sex categories ranged approximately from 97 to 128 per group in the training and validation partitions, and from 22 to 26 per group in the test partition, as illustrated in Supplementary Table 1.
Figure 3.
Flow diagram of panoramic radiograph inclusion and exclusion for model development and testing.
The multitask deep learning model was developed to simultaneously predict sex and age from panoramic radiographs. For age estimation across the entire cohort (ages 7-23 years), the model yielded an RMSE of 1.67 and an MAE of 1.15. Furthermore, it displayed an accuracy of 71.0% in predicting absolute differences between dental and chronological age within 1 year (Table 1, Figure 4A). However, analysis of the confusion matrix showed that the accuracy of age prediction varied across age groups, with higher accuracy in the younger age group (7-14 years) than in the older age group (15-23 years) (Figure 4A). This led to the development of 2 separate models to enhance performance by dividing the age groups into the 7- to 14-year-old (Figure 4B) and 15- to 23-year-old categories (Figure 4C).
Table 1.
Accuracy metrics from age estimation.
| Age | Root mean square error (RMSE) | Mean absolute error (MAE) | Accuracy (absolute difference values within 1 year) | Samples (test set) |
|---|---|---|---|---|
| 7-23 | 1.67 | 1.15 | 71.0% | 842 |
| 7-14 | 0.95 | 0.62 | 90.3% | 392 |
| 15-23 | 1.87 | 1.41 | 63.8% | 450 |
Figure 4.
Confusion matrix of age estimation in the age range of (A) 7-23 years, (B) 7-14 years, and (C) 15-23 years.
The results indicated an improvement in the model performance for individuals aged 7 to 14 years with the RMSE value decreasing to 0.95 and the MAE decreasing to 0.62. Additionally, the accuracy in predicting the absolute difference values between the dental and chronological ages within 1 year increased to 90.3% (Table 1). However, there was no discernible improvement in the performance of the model for individuals between the ages of 15 and 23, which was indicated by the MAE, RMSE, and accuracy values of 1.41, 1.87, and 63.8%, respectively (Table 1).
For sex recognition across the entire cohort (ages 7-23 years), the model achieved an AUC of 0.94, 87.8% accuracy, 89% sensitivity, and 87% specificity (Table 2, Figure 5A), demonstrating good discriminative performance. Age-stratified analysis revealed notable performance differences between age groups. Sex recognition accuracy was markedly higher in the older cohort (15-23 years), which achieved excellent metrics: 94.7% accuracy, AUC of 0.99, 92% sensitivity, and 98% specificity (Table 2, Figure 5B). In contrast, the younger cohort (7-14 years) showed moderate performance with 79.9% accuracy, AUC of 0.87, 86% sensitivity, and 74% specificity (Table 2, Figure 5C).
Table 2.
Accuracy metrics of sex determination.
| Age | Specificity | Sensitivity | Accuracy | AUC | Samples (test set) |
|---|---|---|---|---|---|
| 7-23 | 0.87 | 0.89 | 87.8% | 0.94 | 842 |
| 7-14 | 0.74 | 0.86 | 79.9% | 0.87 | 392 |
| 15-23 | 0.98 | 0.92 | 94.7% | 0.99 | 450 |
AUC, area under the receiver operating characteristic curve.
Figure 5.
The area under the receiver operating characteristic curve (AUC) and receiver operating characteristic curve (ROC) of sex determination in the age ranges of (A) 7-23 years, (B) 7-14 years, and (C) 15-23 years.
Discussion
This study leveraged AI technology for age and sex estimation from dental panoramic radiographs to address traditional method limitations and improve accuracy. The study employed a multitask learning approach for simultaneous age and sex estimation using a fully autonomous deep learning method that processes images directly without the guidance of traditional methods. Several previous studies have trained their models using traditionally-defined anatomical features as constraints or ground truth.16, 17, 18 While this approach has proven valuable, it potentially limits the AI's ability to discover novel patterns or features that may be indicative of age or sex. Therefore, this study adopted a more open-ended approach, allowing the AI to independently develop its own feature extraction and classification algorithms rather than relying on predefined anatomical measurements or expert-defined landmarks.
The age estimation AI model demonstrated good performance with an RMSE of 1.67 and an MAE of 1.15 years. These results are comparable to previous research by Vila-Blanco et al., which reported an MAE of 1.17 ± 1.11 years in subjects under 25 years using DASNet, a CNN model.21 Additionally, the current model revealed that age prediction accuracy varied significantly across different age ranges. The younger age group (7-14 years) demonstrated more precise age estimation compared to the older age group (15-23 years). This age-related performance difference can be attributed to the varying developmental dynamics of dental structures across life stages. Previous studies have consistently observed that model predictive accuracy tends to decrease as age ranges broaden, particularly when including older subjects.19,20,28 These findings highlight the importance of developing age-specific models or using narrowly defined age ranges to optimize estimation accuracy.
This AI model achieved an MAE of 0.62 years and 90.3% accuracy for age estimation in individuals aged 7 to 14 years. These results align with recent deep learning research showing an MAE of 0.72 years for subjects aged 3 to 15 years.29 Our result remained competitive when compared to other architectures. Wang et al. reported that VGG16 achieved 81.21% to 93.63% accuracy compared to ResNet101's 75.36% to 88.73% for age estimation in the same pediatric population (ages 6-14 years).20 However, EfficientNetB0's primary advantage lies in its computational efficiency, requiring only 5.3M parameters versus ResNet50's 25.6M parameters. This efficiency provides an optimal balance between predictive accuracy and practical deployment requirements in clinical settings.
This AI model demonstrated superior performance compared to traditional methods. Duangto et al.'s regression model, based on Demirjian's method for evaluating 7 left permanent mandibular teeth in Thai children aged 6 to 15 years, achieved only 74.5 to 76.3% accuracy.30 However, the current AI model's MAE values were comparable to another traditional study focusing solely on mandibular second molar development in Thai children aged 7 to 14 years, which achieved MAE values of 0.7 to 0.8 years.31 This performance variation among studies may be attributed to differences in methodology and the inherent human error in traditional tooth development stage assessment, where accuracy typically decreases as more teeth are evaluated simultaneously. The AI model's competitive performance likely stems from its ability to minimize human error while possibly integrating multiple conventional approaches and identifying novel patterns in dental images that may not be apparent to human observers.
The age estimation AI model demonstrated reduced accuracy for individuals aged 15 to 23 years compared to the 7 to 14 year cohort, primarily due to decreased developmental dynamics of dental structures in older subjects. The roots of most permanent teeth, including second premolars, are fully developed by age 14 years, while second molars approach completion by age 16 years, leaving only third molars in active development.32 Consequently, the 15 to 23 year age group exhibits relatively fewer ongoing dental structural changes compared to younger individuals. This reduced developmental activity during late adolescence creates challenges for precise age prediction, as minimal morphological changes in dental structures limit the discriminative features available for age assessment from panoramic radiographs. To achieve higher accuracy in this age range, alternative or supplementary approaches are necessary.
This study achieved high accuracy in sex prediction using an AI-based approach with panoramic radiographs of Thai children and adolescents (accuracy: 87.8%, AUC: 0.94). The results were in line with previous studies.33,34 Differences in AI methodologies and age ranges of the subjects may account for the variations in accuracy across studies. In contrast to age estimation, this study found higher sex prediction accuracy in the oldest age group compared to the groups with younger subjects. Sex estimation in younger individuals is particularly challenging as craniofacial morphology undergoes rapid developmental changes during early adolescence. Key sexually dimorphic features—including the mandibular condyle, ramus, and gonial angle—experience substantial morphological transformations throughout this period.35,36 This critical developmental period introduces considerable variability in morphological markers, making sex prediction less reliable. As individuals progress through puberty, sexual dimorphism becomes progressively more distinct. The gradual emergence of definitive morphological characteristics means that sex estimation accuracy improves with age, with more stable and reliable markers becoming apparent in later adolescent stages.
In addition to validity and accuracy issues, traditional methods face the limitation of time-consuming manual feature extraction. A previous study reported that the estimated mean time for calculating an individual's chronological age using the Demirjian-Chaillet method was 10 minutes.37 In our study, although the training phase for each model required approximately 19 hours of computational time, once trained, the final models demonstrated exceptional efficiency during the inference phase, processing individual panoramic radiographs in less than 1 second per image. This rapid inference capability makes the system highly suitable for real-time clinical applications, as it can provide immediate age and sex estimation results without causing delays in clinical workflow, even when processing large populations.
The study presents notable limitations in its generalizability. Developed using a Thai sample aged 7 to 23 years, the AI model demonstrates high accuracy for dental age estimation within this specific demographic, yet its broader clinical application remains uncertain. Key constraints include the narrow focus on a single ethnic population and healthy dental conditions. The model's performance, based exclusively on panoramic radiographs from patients without dental anomalies, may significantly reduce its accuracy when applied to individuals with diverse dental characteristics or ethnic backgrounds.
Additionally, the preprocessing approach using 60% central crop from each side of the panoramic radiograph, while deliberate in enhancing data quality and model focus, presents practical implementation challenges. This method effectively isolates dental structures of primary interest and systematically excludes positioning artifacts commonly found in lateral regions. Furthermore, the bilateral cropping strategy enabled inclusion of patients with unilateral anomalies by utilizing the unaffected contralateral side. However, from a clinical deployment perspective, this preprocessing requirement adds complexity compared to using full panoramic images directly. Model architectures capable of handling complete radiographs while maintaining comparable performance would simplify the workflow for clinical practitioners.
Future development should prioritize validation across broader age ranges and diverse dental conditions. Advanced analytical techniques, such as heatmap analysis, could provide deeper insights into the model's predictive mechanisms and potentially uncover novel relationships between dental imaging and age estimation. Subsequent studies must explore the model's adaptability to patients with tooth anomalies, agenesis, or pathological variations to establish its comprehensive clinical utility. This model is considered a tool to assist clinicians and researchers with the interpretation process along with saving time and reducing human error. This prediction model should be developed further to improve accessibility and usability and implemented as an online or mobile application.
Conclusion
In conclusion, this novel AI-based age and sex identification model exhibited good performance metrics, suggesting the potential to serve as an alternative to traditional methods as a diagnostic tool for characterizing both living individuals and deceased bodies.
Authors’ contributions
NS contributed to study design, performed experiment, data analysis, and writing the first draft of the manuscript. TI contributed to conceptualization, methodology, supervision, data interpretation and writing and revision of the manuscript. NH performed experiment, data analysis, and interpretation. SPD contributed to conceptualization, study design and supervision. VJ contributed to conceptualization, study design, project administration, supervision, data analysis, writing and revision of the manuscript. All authors gave final approval and agreed to be accountable for all aspects of the work.
Funding
This study did not receive any specific grant from any funding agencies.
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Footnotes
Supplementary material associated with this article can be found in the online version at doi:10.1016/j.identj.2025.103967.
Appendix. Supplementary materials
References
- 1. Panchbhai A.S. Dental radiographic indicators, a key to age estimation. Dentomaxillofac Radiol. 2011;40(4):199–212. doi: 10.1259/dmfr/19478385.
- 2. De Donno A., Angrisani C., Mele F., Introna F., Santoro V. Dental age estimation: Demirjian's versus the other methods in different populations. A literature review. Med Sci Law. 2021;61(1_suppl):125–129. doi: 10.1177/0025802420934253.
- 3. Lewis J.M., Senn D.R. Dental age estimation utilizing third molar development: a review of principles, methods, and population studies used in the United States. Forensic Sci Int. 2010;201(1-3):79–83. doi: 10.1016/j.forsciint.2010.04.042.
- 4. Gustafson G. Age determination on teeth. J Am Dent Assoc. 1950;41(1):45–54. doi: 10.14219/jada.archive.1950.0132.
- 5. Ohtani S., Yamamoto T. Age estimation by amino acid racemization in human teeth. J Forensic Sci. 2010;55(6):1630–1633. doi: 10.1111/j.1556-4029.2010.01472.x.
- 6. Cameriere R., Ferrante L., Cingolani M. Precision and reliability of pulp/tooth area ratio (RA) of second molar as indicator of adult age. J Forensic Sci. 2004;49(6):1319–1323. Erratum in: J Forensic Sci. 2005;50(2):486.
- 7. Franklin D., O'Higgins P., Oxnard C.E., Dadour I. Sexual dimorphism and population variation in the adult mandible: forensic applications of geometric morphometrics. Forensic Sci Med Pathol. 2007;3(1):15–22. doi: 10.1385/FSMP:3:1:15.
- 8. Hazari P., Hazari R.S., Mishra S.K., Agrawal S., Yadav M. Is there enough evidence so that mandible can be used as a tool for sex dimorphism? A systematic review. J Forensic Dent Sci. 2016;8(3):174. doi: 10.4103/0975-1475.195111.
- 9. Angadi P.V., Hemani S., Prabhu S., Acharya A.B. Analyses of odontometric sexual dimorphism and sex assessment accuracy on a large sample. J Forensic Leg Med. 2013;20(6):673–677. doi: 10.1016/j.jflm.2013.03.040.
- 10. Pereira C., Bernardo M., Pestana D., Santos J.C., Mendonça M.C. Contribution of teeth in human forensic identification—discriminant function sexing odontometrical techniques in Portuguese population. J Forensic Leg Med. 2010;17(2):105–110. doi: 10.1016/j.jflm.2009.09.001.
- 11. Marroquin T.Y., Karkhanis S., Kvaal S.I., Vasudavan S., Kruger E., Tennant M. Age estimation in adults by dental imaging assessment: systematic review. Forensic Sci Int. 2017;275:203–211. doi: 10.1016/j.forsciint.2017.03.007.
- 12. Samaranayake L., Tuygunov N., Schwendicke F., et al. The transformative role of artificial intelligence in dentistry: a comprehensive overview. Part 1: fundamentals of AI, and its contemporary applications in dentistry. Int Dent J. 2025;75(2):383–396. doi: 10.1016/j.identj.2025.02.005.
- 13. Tuygunov N., Samaranayake L., Khurshid Z., et al. The transformative role of artificial intelligence in dentistry: a comprehensive overview. Part 2: the promise and perils, and the International Dental Federation communique. Int Dent J. 2025;75(2):397–404. doi: 10.1016/j.identj.2025.02.006.
- 14. Vila-Blanco N., Varas-Quintana P., Tomás I., Carreira M.J. A systematic overview of dental methods for age assessment in living individuals: from traditional to artificial intelligence-based approaches. Int J Legal Med. 2023;137(4):1117–1146. doi: 10.1007/s00414-023-02960-z.
- 15. Singh S., Singha B., Kumar S. Artificial intelligence in age and sex determination using maxillofacial radiographs: a systematic review. J Forensic Odontostomatol. 2024;42(1):30–37. doi: 10.5281/zenodo.11088513.
- 16. De Tobel J., Radesh P., Vandermeulen D., Thevissen P.W. An automated technique to stage lower third molar development on panoramic radiographs for age estimation: a pilot study. J Forensic Odontostomatol. 2017;35(2):42–54.
- 17. Galibourg A., Cussat-Blanc S., Dumoncel J., Telmon N., Monsarrat P., Maret D. Comparison of different machine learning approaches to predict dental age using Demirjian's staging approach. Int J Legal Med. 2021;135(2):665–675. doi: 10.1007/s00414-020-02489-5.
- 18. Shen S., Liu Z., Wang J., Fan L., Ji F., Tao J. Machine learning assisted Cameriere method for dental age estimation. BMC Oral Health. 2021;21(1):641. doi: 10.1186/s12903-021-01996-0.
- 19.Park S.J., Yang S., Kim J.M., et al. Automatic and robust estimation of sex and chronological age from panoramic radiographs using a multi-task deep learning network: a study on a South Korean population. Int J Legal Med. 2024;138(4):1741–1757. doi: 10.1007/s00414-024-03204-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Wang J., Dou J., Han J., Li G., Tao J. A population-based study to assess two convolutional neural networks for dental age estimation. BMC Oral Health. 2023;23(1):109. doi: 10.1186/s12903-023-02817-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Vila-Blanco N., Carreira M.J., Varas-Quintana P., Balsa-Castro C., Tomas I. Deep neural networks for chronological age estimation from OPG images. IEEE Trans Med Imaging. 2020;39(7):2374–2384. doi: 10.1109/TMI.2020.2968765. [DOI] [PubMed] [Google Scholar]
- 22.Tan M., Le Q.V. EfficientNet: rethinking model scaling for convolutional neural networks. Proceedings of the 36th International Conference on Machine Learning (ICML); 2019. pp. 6105–6114. [Google Scholar]
- 23.Upalananda W., Phisutphithayakun C., Assawasuksant P., Tanwattana P., Prasatkaew P. A fully automated deep learning framework for age estimation in adults using periapical radiographs of canine teeth. Int J Legal Med. 2025 doi: 10.1007/s00414-025-03558-3. [DOI] [PubMed] [Google Scholar]
- 24.Baydogan M.P., Baybars S.C., Tuncer S.A. Age-Net: an advanced hybrid deep learning model for age estimation using orthopantomograph images. Traitement du Signal. 2023;40:1553–1563. [Google Scholar]
- 25.Figueroa R.L., Zeng-Treitler Q., Kandula S., Ngo L.H. Predicting sample size required for classification performance. BMC Med Inform Decis Mak. 2012;12:8. doi: 10.1186/1472-6947-12-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Fang Y., Wang J., Ou X., Ying H., Hu C., Zhang Z., Hu W. The impact of training sample size on deep learning-based organ auto-segmentation for head-and-neck patients. Phys Med Biol. 2021;66(18) doi: 10.1088/1361-6560/ac2206. [DOI] [PubMed] [Google Scholar]
- 27.Hirunchavarod N., Phuphatham P., Sributsayakarn N., et al. DeepToothDuo: multi-task age-sex estimation and understanding via panoramic radiograph. 21st IEEE International Symposium on Biomedical Imaging (ISBI 2024); Athens, Greece; 2024. [DOI] [Google Scholar]
- 28.Wang X., Liu Y., Miao X., et al. DENSEN: a convolutional neural network for estimating chronological ages from panoramic radiographs. BMC Bioinformatics. 2022;23(Suppl 3):426. doi: 10.1186/s12859-022-04935-0. Erratum in: BMC Bioinformatics. 2022 Dec 22;23(1):557. 10.1186/s12859-022-05115-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Shi Y., Ye Z., Guo J., Tang Y., et al. Deep learning methods for fully automated dental age estimation on orthopantomograms. Clin Oral Investig. 2024;28(3):198. doi: 10.1007/s00784-024-05598-2. [DOI] [PubMed] [Google Scholar]
- 30.Duangto P., Janhom A., Prasitwattanaseree S., Mahakkanukrauh P., Iamaroon A. New prediction models for dental age estimation in Thai children and adolescents. Forensic Sci Int. 2016;266:583.e1–583.e5. doi: 10.1016/j.forsciint.2016.05.005. [DOI] [PubMed] [Google Scholar]
- 31.Duangto P., Iamaroon A., Prasitwattanaseree S., Mahakkanukrauh P., Janhom A. New models for age estimation and assessment of their accuracy using developing mandibular third molar teeth in a Thai population. Int J Legal Med. 2017;131(2):559–568. doi: 10.1007/s00414-016-1467-4. [DOI] [PubMed] [Google Scholar]
- 32.Logan W.H.G., Kronfeld R. Development of the human jaws and surrounding structures from birth to the age of fifteen years. J Am Dent Assoc. 1933;20(3):379–427. [Google Scholar]
- 33.Ciconelle A.C.M., da Silva R.L.B., Kim J.H., et al. Deep learning for sex determination: analyzing over 200,000 panoramic radiographs. J Forensic Sci. 2023;68(6):2057–2064. doi: 10.1111/1556-4029.15376. [DOI] [PubMed] [Google Scholar]
- 34.Zhou Y., Jiang F., Cheng F., Li J. Detecting representative characteristics of different genders using intraoral photographs: a deep learning model with interpretation of gradient-weighted class activation mapping. BMC Oral Health. 2023;23(1):327. doi: 10.1186/s12903-023-03033-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Fan Y., Penington A., Kilpatrick N., et al. Quantification of mandibular sexual dimorphism during adolescence. J Anat. 2019;234(5):709–717. doi: 10.1111/joa.12949. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Humphrey L.T. Growth patterns in the modern human skeleton. Am J Phys Anthropol. 1998;105(1):57–72. doi: 10.1002/(SICI)1096-8644(199801)105:1<57::AID-AJPA6>3.0.CO;2-A. [DOI] [PubMed] [Google Scholar]
- 37.Kapoor P., Jain V. Comprehensive chart for dental age estimation (DAEcc8) based on Demirjian 8-teeth method: simplified for operator ease. J Forensic Leg Med. 2018;59:45–49. doi: 10.1016/j.jflm.2018.07.014. [DOI] [PubMed] [Google Scholar]