Breast Cancer: Targets and Therapy. 2025 Oct 21;17:927–947. doi: 10.2147/BCTT.S550307

Exploring AI Approaches for Breast Cancer Detection and Diagnosis: A Review Article

Akbar Ali 1, Mansoor Alghamdi 2, Shahira Sofea Marzuki 1, Tengku Ahmad Damitri Al Astani Tengku Din 1, Muhamad Syahmi Yamin 1, Malek Alrashidi 2, Ibrahim S Alkhazi 3, Naveed Ahmed 4
PMCID: PMC12553387  PMID: 41141218

Abstract

Artificial intelligence (AI), particularly deep learning, is reshaping breast cancer diagnostics in the radiology and pathology fields. This review synthesizes recent advances in mammography, digital breast tomosynthesis (DBT), ultrasound, MRI, and whole-slide imaging, with an emphasis on convolutional neural networks (CNNs), Vision Transformers (ViTs), and generative adversarial networks (GANs). When embedded within established screening and diagnostic workflows, AI systems can enhance lesion detection and triage, as well as reduce interpretive variability. However, performance and generalizability depend on dataset quality, population and vendor heterogeneity, acquisition protocols, and calibrated probability outputs; diminished performance on external datasets and miscalibration remain recurrent risks that require explicit mitigation during development and deployment of these models. Beyond detection and classification, segmentation and risk prediction models increasingly integrate imaging with clinicopathological and, where available, genomic variables to enable individualized risk stratification and follow-up planning. Data generation strategies, including GAN-based augmentation, can partially address data scarcity and class imbalance but require rigorous quality control and bias monitoring. Persistent barriers to clinical adoption include uneven external validation, domain shifts across institutions, variability in reporting standards, limited interpretability, and ethical, privacy, and regulatory constraints. Overall, AI should augment, rather than replace, the role of clinicians. Priorities for responsible integration include multi-site prospective evaluations, transparent and standardized reporting, bias mitigation, robust calibration, and lifecycle monitoring to ensure sustained safety and equity.

Keywords: breast cancer, mammography/DBT, digital pathology, deep learning, CNNs, ViTs, GANs, risk prediction, external validation

Introduction

Breast cancer remains a global health challenge, with over 2.3 million new cases diagnosed annually, underscoring the need for effective early detection and accurate diagnosis methods.1,2 Traditional screening methods, such as mammography, are widely used but have limited sensitivity, particularly for dense breast tissue, often leading to false negatives and delayed diagnosis.3 Magnetic Resonance Imaging (MRI) offers a higher sensitivity (94.6%) than mammography (54.5%) and ultrasound (67.2%), but at the cost of specificity, which results in false positives that complicate patient management.3 To address these limitations, combining multiple imaging modalities has shown promise in improving the diagnostic accuracy and providing a more comprehensive evaluation.4

Mammography (X-ray) is the cornerstone of population screening, and digital breast tomosynthesis (DBT) is increasingly being adopted to enhance lesion conspicuity, reduce recall rates, and lower the incidence of interval cancers.5 Studies show that DBT increases the detection of invasive cancers without a surge in overdiagnosis (in situ lesions) and significantly lowers recall rates for normal findings.5 Ultrasound serves as a complementary test, especially in dense breasts, for the targeted evaluation of palpable or mammography-occult findings, while also supporting cyst/solid differentiation and biopsy guidance, with greater operator dependence and modestly lower specificity when used for screening purposes.6–8 Breast MRI is reserved for high-risk screening and diagnostic problem-solving due to its high sensitivity; however, its lower specificity, cost, and availability limit its use in average-risk populations. Abbreviated MRI protocols are under evaluation to mitigate these constraints. Nuclear medicine techniques (eg, FDG PET/CT and MBI) and CT are not used for routine screening, but they contribute to staging, assessment of therapy response, and selected preoperative decisions. In contemporary practice, breast imaging follows a multimodal, stepwise approach that balances sensitivity and specificity, linking imaging to BI-RADS assessment and triaging patients to image-guided biopsy and management algorithms.9 Mammography/DBT is the first-line examination for most patients, with ultrasound or MRI added according to individual risk or imaging findings, thereby minimizing unnecessary biopsies while maintaining cancer detection.5

Disease biology is heterogeneous, comprising hormone receptor–positive/HER2-negative, HER2+, and triple-negative subtypes, with distinct patterns of growth, metastasis, and therapeutic responsiveness.10,11 Prognosis and therapeutic choices are further modulated by grade, stage, and biomarkers from histopathology and genomics. Risk reflects multiple interacting determinants: age, family history, pathogenic germline variants (eg, BRCA1/2), breast density, reproductive factors (early menarche, late menopause, nulliparity, late first birth), and modifiable exposures such as obesity and alcohol use. The global burden remains high and unevenly distributed, with a later stage at diagnosis and limited access to high-quality imaging and systemic therapy, driving excess mortality in many settings.10,12 Contemporary surveillance and guideline reports, therefore, emphasize risk-adapted screening, standardized reporting and quality assurance, and integration of imaging with clinicopathologic data to support timely diagnosis and equitable care. These clinical fundamentals frame the role of AI and deep learning in augmenting detection, characterization, and treatment planning.

Recent advances in AI, particularly in deep learning models, have introduced transformative potential for breast cancer diagnosis. These models excel in processing complex imaging data, enhancing diagnostic precision across modalities such as mammography, ultrasound, and MRI.13,14 AI-driven techniques are effective not only for image analysis but also for risk assessment, tumor characterization, and treatment monitoring. For example, AI-based mammography has demonstrated superior diagnostic accuracy in dense breast tissue compared with traditional radiological assessments.15 AI models trained on extensive datasets also exhibit reduced false-positive and false-negative rates, thereby minimizing diagnostic inconsistencies and missed diagnoses.16 In addition to imaging, AI is used for predictive modeling and personalized diagnosis. By integrating multimodal data, including genetic, histological, and clinical information, AI models enable individualized risk assessment and outcome prediction based on patient-specific factors.17,18 Such comprehensive insights support tailored treatment strategies and advance precision oncology in breast cancer.19

Despite these advancements, challenges remain in clinical implementation. Ensuring extensive validation, enhancing model interpretability, and addressing biases, particularly those stemming from underrepresented demographics in the training data, are critical for widespread adoption.20 Additionally, ethical and regulatory considerations such as patient data privacy and model transparency are essential for the integration of AI into clinical practice.20 Sechopoulos and Mann (2021) advocated continuous validation across diverse populations to mitigate bias and foster equitable diagnostic capabilities.21 This review synthesizes the recent advancements in AI and deep learning applications for breast cancer diagnosis, covering foundational models, data integration strategies, and the ethical landscape. As the field evolves, addressing these challenges will likely establish new standards for early detection, diagnostic accuracy, and personalized care in breast cancer treatment.

Advanced Deep Learning Models in Breast Cancer Detection

Deep learning has revolutionized breast cancer diagnosis by offering unparalleled accuracy, sensitivity, and specificity across various imaging modalities. This section discusses advanced architectures such as CNNs, residual networks (ResNet), DenseNet, ViTs, and Generative Adversarial Networks (GANs), and explores their contributions to the field. In addition, we discuss integrative approaches, challenges, and future research directions to fully leverage the potential of these models in clinical practice.

CNNs, Residual Networks (ResNet), and DenseNet

Convolutional Neural Networks (CNNs) have fundamentally transformed medical image analysis, offering significant advances in breast cancer detection. Key architectures such as AlexNet, VGGNet, InceptionNet, DenseNet, and ResNet have been pivotal in addressing challenges such as computational inefficiency and vanishing gradients.22,23 AlexNet and VGGNet laid the foundation for deep feature extraction, enabling early breakthroughs in image classification research. InceptionNet expands these capabilities through multi-scale feature extraction, whereas DenseNet introduces dense layer connections, promoting efficient gradient flow and feature reuse to improve detection in complex cases, such as dense breast tissue.23,24

ResNet, through its innovative use of skip connections, has proven particularly effective for analyzing high-dimensional and complex imaging datasets such as Digital Breast Tomosynthesis (DBT).25 These connections mitigate the vanishing gradient problem, allowing for the training of deeper networks and facilitating the detection of subtle abnormalities, which are crucial for early diagnosis.24 Transfer learning has further enhanced the utility of CNNs, enabling pre-trained models on large datasets, such as ImageNet, to be fine-tuned for specific tasks, such as breast lesion detection, risk stratification, and image classification. This approach significantly alleviates the challenge of limited annotated medical datasets, which are often costly and time-consuming to curate.26,27
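
To make the transfer-learning workflow described above concrete, the sketch below fine-tunes an ImageNet-pretrained ResNet-50 for a hypothetical benign-versus-malignant patch classification task. The dataset path, frozen layers, and hyperparameters are illustrative assumptions, not settings from the cited studies.

```python
# Illustrative sketch: fine-tuning an ImageNet-pretrained ResNet-50 for a
# hypothetical binary benign/malignant mammogram-patch classification task.
# Dataset layout and hyperparameters are placeholders, not from the review.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load an ImageNet-pretrained backbone and replace the classification head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 2)  # benign vs malignant
model = model.to(device)

# Optionally freeze early layers so only the head (and the last block) adapt
# to the small medical dataset -- a common transfer-learning strategy.
for name, param in model.named_parameters():
    if not name.startswith(("layer4", "fc")):
        param.requires_grad = False

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: patches/train/benign, patches/train/malignant
train_set = datasets.ImageFolder("patches/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(
    [p for p in model.parameters() if p.requires_grad], lr=1e-4)

model.train()
for images, labels in loader:          # one pass shown for brevity
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```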

Despite their transformative impact, CNNs continue to face several challenges. These include optimizing architectures for specific medical imaging tasks, addressing class imbalances inherent in medical datasets, and improving interpretability for clinical adoption.28 Future research must focus on refining the CNN frameworks to adapt to these unique challenges and ensure their robust and reliable application in breast cancer diagnosis.29,30

Vision Transformers (ViTs) in Breast Cancer Imaging: A Transformative Approach

Vision Transformers (ViTs) represent a groundbreaking shift in image analysis by replacing traditional convolutional operations with self-attention mechanisms, thereby enabling simultaneous capture of local and global contextual information. Unlike CNNs, which excel at detecting localized patterns, ViTs divide images into patches and treat them as sequences, rendering them particularly effective for breast cancer imaging. Breast tumors often exhibit complex morphological and spatial relationships that span multiple regions, which ViTs are well equipped to analyze.31 The integration of self-supervised learning has further enhanced the utility of ViTs, allowing them to be pre-trained on vast unlabeled medical image datasets.32 This pre-training enables the models to learn generalized features that can be fine-tuned on smaller annotated datasets, which is a critical advantage in breast cancer diagnostics, where labeled data are often scarce and costly to produce. Hybrid models that combine CNNs for local feature extraction with ViTs for capturing long-range dependencies have demonstrated superior performance, particularly in challenging cases such as dense breast tissue and multifocal tumors.33 Recent studies underscore the versatility of ViTs across various breast cancer imaging modalities. For instance, ViTs have achieved remarkable accuracy rates of up to 99.92% in mammography image classification.34
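
The following minimal sketch illustrates the patch-and-attend idea behind ViTs: an image is split into patch tokens, a learnable class token and positional embeddings are added, and a transformer encoder relates every patch to every other patch before classification. Dimensions, depth, and the two-class head are illustrative choices, not a published breast imaging model.

```python
# Minimal sketch of the ViT idea: split an image into patches, embed each
# patch as a token, and let self-attention relate all patches to each other.
# All dimensions are illustrative.
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    def __init__(self, img_size=224, patch=16, dim=256, heads=8, depth=4,
                 num_classes=2):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        # Patch embedding: a strided convolution turns the image into a
        # sequence of patch tokens.
        self.to_tokens = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, n_patches + 1, dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=4 * dim,
            batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):                            # x: (B, 3, H, W)
        tokens = self.to_tokens(x)                   # (B, dim, H/p, W/p)
        tokens = tokens.flatten(2).transpose(1, 2)   # (B, n_patches, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        tokens = self.encoder(tokens)                # global self-attention
        return self.head(tokens[:, 0])               # classify from [CLS] token

logits = TinyViT()(torch.randn(2, 3, 224, 224))      # -> shape (2, 2)
```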

Recent studies have demonstrated the effectiveness of Vision Transformers (ViTs) in breast cancer imaging. In breast ultrasound classification, ViTs have shown performance comparable to or surpassing that of state-of-the-art CNNs.35 The BU ViTNet model, which utilizes multistage transfer learning, achieved superior results in breast ultrasound detection compared to conventional methods.36 For histopathological image classification, ViT-based models have exhibited significant improvements through pre-training on ImageNet and the employment of advanced data augmentation techniques.37 The self-attention mechanism in ViTs excels in capturing intricate morphological and spatial features crucial for breast cancer diagnosis, offering a compelling alternative to traditional CNN-based approaches.38 These findings highlight the potential of ViTs to enhance diagnostic precision across various breast imaging modalities, potentially revolutionizing breast cancer diagnostics and supporting improved clinical decision-making (Table 1).

Table 1.

Vision-Transformer-Based Approaches for Breast Cancer Imaging (2023–2024). Representative Studies Summarizing Key Findings, Headline Performance Metrics, Imaging Modalities, and Reference Numbers

| Key Insight | Accuracy | Modality | Ref. |
| --- | --- | --- | --- |
| Fine-tuned ViT on BreakHis achieved 99.99% accuracy, showcasing potential in histopathology analysis | 99.99% | Histopathology | [39] |
| Proposed ViT-based hashing method, achieving MAP scores of 98.9% on BreakHis | 98.9% (MAP) | Medical image retrieval | [40] |
| Wavelet-based ViT achieved an AUC of 0.984, excelling in dense breast tissue classification | AUC: 0.984 | Ultrasound | [41] |
| ViT with majority voting mechanism achieved 98.2% accuracy in MRI classification | 98.20% | MRI | [42] |
| ViT model achieved 99.42% accuracy with multi-resolution ROI and post-processing | 99.42% | Histopathology | [43] |
| Compared ViT, CCT, and TViT, with CCT achieving 99.92% accuracy | 99.92% | Mammography | [34] |
| Explored pretraining and augmentation techniques, achieving 91% accuracy on BACH dataset | 91% | Histopathology | [44] |
| Introduced SupCon-ViT, achieving F1-score of 0.8188 and specificity of 0.8971 | 81.88% (F1-score) | Histopathology | [45] |
| Combined ViTs and LSTM, achieving 99.2% accuracy in histopathology | 99.20% | Histopathology | [46] |
| Proposed Dual-CapsViT for 3D ABUS, improving accuracy and AUC metrics | High AUC | Ultrasound (3D ABUS) | [47] |

Notes: Metrics were reproduced from the original studies. Percentage-based measures (accuracy, F1-score, MAP) are shown to two decimal places; AUC is reported on a 0–1 scale to three decimals unless otherwise specified. Specificity was presented on a 0–1 scale, as previously reported. Reference numbers correspond to the primary bibliography.

Abbreviations: ViT, Vision Transformer; ROI, region of interest; AUC, area under the ROC curve; MAP, mean average precision; MRI, magnetic resonance imaging; ABUS, automated breast ultrasound; LSTM, long short-term memory; SupCon-ViT, supervised contrastive ViT; CCT, TViT, model variants as defined in the cited studies; BreakHis, BACH, public histopathology datasets.

Generative Adversarial Networks (GANs) for Data Augmentation

GANs have emerged as robust solutions for data scarcity and class imbalance issues in breast cancer research. By generating synthetic mammograms and histopathological images, GANs enrich the training datasets, ensuring that the models generalize better across diverse patient populations.48,49 This is particularly important for rare breast cancer subtypes for which real-world data are often insufficient.50 GANs have shown significant potential across medical imaging applications, including data augmentation to address the scarcity of training data. GANs have demonstrated efficacy in super-resolution imaging by enhancing low-quality images to improve the visibility of critical diagnostic markers.51 They have also been applied in cross-modality synthesis, converting images between different modalities such as MRI and CT.51 GANs have proven helpful in image reconstruction and denoising and in reducing radiation dose and scan time.52 In mammography, GANs have been used to generate high-resolution mammograms for radiology education53 and to synthesize images with architectural distortions.54 These applications not only improve model training but also support radiologists in making informed decisions regarding patient care.
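
As a concrete illustration of GAN-based augmentation, the sketch below pairs a small DCGAN-style generator and discriminator and runs one adversarial training step on placeholder 64x64 grayscale patches. Architecture sizes, learning rates, and the stand-in data are assumptions for illustration; published pipelines additionally screen synthetic images for realism and bias before using them for training.

```python
# Minimal DCGAN-style sketch for synthesizing grayscale 64x64 image patches
# (eg, mammogram ROIs) for augmentation. Sizes are illustrative only.
import torch
import torch.nn as nn

latent_dim = 100

generator = nn.Sequential(
    nn.ConvTranspose2d(latent_dim, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(True),
    nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
    nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),
    nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Tanh(),          # 64x64 output
)

discriminator = nn.Sequential(
    nn.Conv2d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),
    nn.Conv2d(64, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2, inplace=True),
    nn.Conv2d(128, 256, 4, 2, 1), nn.BatchNorm2d(256), nn.LeakyReLU(0.2, inplace=True),
    nn.Conv2d(256, 1, 8, 1, 0),                              # real/fake logit
)

bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4, betas=(0.5, 0.999))

real = torch.rand(16, 1, 64, 64) * 2 - 1      # stand-in for a real batch

# Discriminator step: real patches labelled 1, synthetic patches labelled 0.
z = torch.randn(16, latent_dim, 1, 1)
fake = generator(z)
d_loss = bce(discriminator(real).view(-1), torch.ones(16)) + \
         bce(discriminator(fake.detach()).view(-1), torch.zeros(16))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: try to make the discriminator label synthetic patches as real.
g_loss = bce(discriminator(fake).view(-1), torch.ones(16))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```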

Integrative Approaches

The integration of these advanced architectures has led to significant improvements in breast cancer detection. Multi-task learning (MTL), which enables models to perform multiple related tasks, such as classification, segmentation, and tumor grading concurrently, has demonstrated enhanced diagnostic performance.55,56 This integrated approach reduces the need for separate analyses and provides a holistic assessment of breast abnormalities.57 Explainable AI (XAI) techniques are becoming increasingly important in healthcare as they address the “black-box” nature of traditional AI models and foster trust among clinicians.58,59 Various XAI methods, including LIME, SHAP, and Grad-CAM, have been applied to make deep learning models more interpretable for disease diagnosis and medical imaging.59,60 These techniques provide visual explanations for model predictions, thereby enhancing the transparency and understanding of AI-assisted diagnosis.60 Federated learning is emerging as a solution to address data-sharing barriers while maintaining patient privacy in distributed healthcare environments such as the Internet of Medical Things (IoMT). The integration of XAI and federated learning contributes to the development of more privacy-preserving, trustworthy, and explainable AI systems, which are essential for ethically sound healthcare applications.60
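
Of the XAI methods named above, Grad-CAM is straightforward to sketch: gradients of the predicted class with respect to the last convolutional feature map are pooled into channel weights, and the weighted map is upsampled into a heatmap over the input. The example below uses a generic ImageNet-pretrained ResNet-18 and a random placeholder image rather than a breast imaging model.

```python
# Hedged sketch of Grad-CAM on a generic ImageNet-pretrained ResNet-18 (a
# stand-in, not a breast imaging model): gradients of the top predicted class
# with respect to the last convolutional feature map weight that map, giving
# a coarse heatmap of the regions driving the prediction.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
feature_maps, gradients = {}, {}

def fwd_hook(_, __, output):
    feature_maps["value"] = output                                   # (1, C, h, w)
    output.register_hook(lambda grad: gradients.update(value=grad))  # same shape

model.layer4[-1].register_forward_hook(fwd_hook)

image = torch.randn(1, 3, 224, 224)        # placeholder input image
logits = model(image)
logits[0, logits.argmax()].backward()      # backprop the top-class score

weights = gradients["value"].mean(dim=(2, 3), keepdim=True)   # pooled gradients
cam = F.relu((weights * feature_maps["value"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=image.shape[-2:], mode="bilinear",
                    align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)      # heatmap in [0, 1]
```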

Research is increasingly focusing on personalized AI models that incorporate patient-specific factors such as genetic predisposition, hormonal status, and breast density. These models aim to deliver tailored diagnostic and prognostic insights, thereby enhancing the precision of breast cancer management.18,61 In addition, advancements in self-supervised learning are expected to reduce reliance on large annotated datasets further, thereby paving the way for scalable and efficient AI solutions for breast cancer diagnostics. The seamless integration of these models into clinical decision support systems (CDSS) is critical for optimizing diagnostic workflows and improving patient outcomes (Table 2 and Figure 1).

Table 2.

Generative Adversarial Networks (GANs) Related Studies. Representative Works Using GAN-Family Models for Data Synthesis, Augmentation, and Enhancement with Headline Performance Outcomes and Reference Numbers

| Key Contribution | Performance Metrics | Ref. |
| --- | --- | --- |
| DCGAN synthesizes mammograms, enhancing dataset diversity and clustering consistency | Clustering consistency enhanced | [62] |
| Proposed cGAN improves classification by addressing class imbalance | 3.9% improvement in classification accuracy | [63] |
| Integrated segmentation and GAN for high-quality mammograms, improving diagnostic accuracy | Higher diagnostic accuracy | [64] |
| Combines VAEs and GANs to improve cancer survival prediction on missing data | Significant (p < 0.001) improvement | [65] |
| TVAE and CTGAN improve diagnostic accuracy to 96.66% | 96.66% accuracy | [66] |
| Pix2Pix GANs augment mammography data, enhancing VGG-16 and ResNet-50 accuracy | Improved accuracy for VGG-16, ResNet-50 | [67] |
| Synthetic ultrasound images improve CNN classification accuracy by 15.3% | 15.3% improvement in CNN classification | [68] |
| WA-SRGAN enhances histopathology images; achieves AUC of 0.826 | AUC: 0.826 | [69] |
| SRGAN improves radiomic feature extraction for tumor grading with AUC of 0.826 | AUC: 0.826 | [69] |

Notes: Metrics were taken from the original studies. Accuracy is shown as two decimals; AUC is on the 0–1 scale (three decimals when available). “+x%” indicates absolute gain versus baseline unless stated otherwise. “Improved” denotes a qualitative/non-single-number gain; p-values are reported as in the sources. Reference numbers correspond to the primary bibliography.

Abbreviations: GAN, generative adversarial network; DCGAN, deep convolutional GAN; cGAN, conditional GAN; VAE, variational autoencoder; TVAE, tabular variational autoencoder; CTGAN, conditional tabular GAN; Pix2Pix, paired image-to-image GAN; CNN, convolutional neural network; VGG-16 and ResNet-50, CNN backbones.

Figure 1.

Imaging-to-AI pipeline for breast cancer diagnosis and prediction. Left: clinical imaging modalities (mammography, ultrasound, MRI, and histopathology). Center: representative AI families (CNNs, ViTs, GAN-based methods) and biomarker identification. Right: downstream tasks (tumor classification, tumor segmentation, and predictive analytics). Indicators: yellow square boxes = candidate regions of interest (ROIs) from the attention/region-proposal stage (screening cues, not diagnostic); red square boxes = final AI-flagged suspicious ROIs used for the reported output: classification in the mammography panel (top right) and segmentation guidance in the ultrasound panel (middle right).

In-Depth Exploration of AI Applications in Breast Cancer Diagnostics

Mammography and Digital Breast Tomosynthesis (DBT)

Recent studies have demonstrated the significant potential of AI in enhancing mammography interpretation, particularly digital breast tomosynthesis (DBT). AI algorithms have shown improved diagnostic performance, surpassing radiologists in terms of sensitivity and area under the curve (AUC) values.70 In DBT, AI reduced recall rates by 2–27% and reading times by up to 53%.71 AI has also been applied to assess parenchymal density, detect cancer, and predict breast cancer risk.72 Although AI has shown promise in retrospective studies, further prospective validation is necessary, especially in large-scale screening programs with low breast cancer prevalence.71 The integration of AI into breast cancer screening workflows could help reduce radiologists’ workload and improve interpretation efficiency, although further research is required to determine the most effective clinical implementation.73

Mammography is widely used for breast cancer screening and is particularly effective in women with fatty breast tissue. However, its diagnostic accuracy diminishes in patients with dense breast tissue, where overlapping structures can obscure malignancies.74 AI-enhanced mammography addresses these limitations by identifying subtle markers, such as microcalcifications and architectural distortions, with greater sensitivity. In DBT, which reconstructs the breast volume into thin slices, AI significantly improves the visibility of lesions that may be hidden in traditional 2D mammography.75,76 Additionally, AI algorithms help prioritize images with a higher likelihood of abnormalities, thereby reducing the cognitive load on radiologists. Incorporating AI into DBT has resulted in increased diagnostic accuracy, with sensitivity improvements of up to 15%, while reducing recall and false-positive rates.77

Ultrasound and Elastography

AI is revolutionizing breast ultrasound imaging by addressing key limitations, such as operator dependence and false-positive results.78 AI, particularly deep learning models, enhances the detection, diagnosis, and prognosis of breast cancer using ultrasonography.79 These technologies improve lesion segmentation, malignancy classification, and the prediction of molecular subtypes and treatment responses.77,80 AI-assisted ultrasound complements mammography and is especially valuable for women with dense breasts.79 Recent advancements also include AI applications in axillary lymph node assessment and the prediction of neoadjuvant chemotherapy outcomes.80 Although AI shows promise in improving diagnostic accuracy and workflow efficiency, challenges remain in data curation, model interpretability, and clinical implementation.77 Overall, the integration of AI with breast ultrasound has the potential to significantly enhance breast cancer detection, diagnosis, and management. Traditional ultrasound interpretation, however, is operator-dependent, leading to variability in diagnostic outcomes. AI technologies, particularly deep learning models such as U-Net and fully convolutional networks (FCNs), address this issue by automating and standardizing breast lesion segmentation. These algorithms accurately delineate lesion boundaries, assess their internal characteristics, and classify them according to the likelihood of malignancy.81
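
A compact U-Net-style network of the kind referenced above can be sketched as follows: a two-level encoder, a bottleneck, and a decoder with skip connections that outputs a per-pixel lesion logit map for an ultrasound frame. Channel counts, depth, and input size are illustrative assumptions.

```python
# Compact U-Net-style sketch for lesion segmentation in ultrasound images:
# an encoder-decoder with skip connections producing a per-pixel lesion mask.
import torch
import torch.nn as nn

def block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(True),
    )

class MiniUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2 = block(1, 32), block(32, 64)
        self.bottleneck = block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = block(128, 64)          # 64 upsampled + 64 skip channels
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = block(64, 32)           # 32 upsampled + 32 skip channels
        self.head = nn.Conv2d(32, 1, 1)     # 1-channel lesion logit map

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                # apply sigmoid for a probability mask

mask_logits = MiniUNet()(torch.randn(1, 1, 256, 256))   # -> (1, 1, 256, 256)
```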

Recent advances in AI have significantly enhanced the use of ultrasound elastography in breast cancer screening. AI-based techniques improve diagnostic accuracy by reducing operator dependence and variability in visual observations.82 Machine learning models, including deep learning approaches such as convolutional neural networks (CNNs), have been applied to both shear-wave elastography and strain elastography for automated lesion detection, segmentation, and classification.83 These AI tools have demonstrated high sensitivity (≥80%) in distinguishing benign from malignant lesions, potentially reducing unnecessary biopsies.83 A novel mobile AI solution for real-time breast ultrasound analysis has shown promising results, with 100% sensitivity for malignancy detection and an area under the ROC curve of 0.835–0.850.84 Despite these advancements, challenges remain in standardizing reporting methods and avoiding overfitting AI models.83
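
Reported figures such as sensitivity, specificity, and AUC come from a held-out evaluation step, which the hedged sketch below reproduces on synthetic labels and scores; it also adds a calibration curve, since well-calibrated probabilities matter as much as discrimination for clinical use. The data, threshold, and bin count are placeholders.

```python
# Hedged sketch of the evaluation step referred to throughout this section:
# sensitivity, specificity, ROC AUC, and a calibration curve for a binary
# lesion classifier on a held-out set. y_true/y_prob are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(42)
y_true = rng.integers(0, 2, 300)                 # 1 = malignant (placeholder)
y_prob = np.clip(y_true * 0.6 + rng.normal(0.3, 0.2, 300), 0, 1)  # model scores

auc = roc_auc_score(y_true, y_prob)

threshold = 0.5                                  # the operating point is a clinical choice
tn, fp, fn, tp = confusion_matrix(y_true, (y_prob >= threshold).astype(int)).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# Calibration: do predicted probabilities match observed malignancy rates?
frac_pos, mean_pred = calibration_curve(y_true, y_prob, n_bins=10)

print(f"AUC={auc:.3f}  sensitivity={sensitivity:.3f}  specificity={specificity:.3f}")
print("Calibration bins (mean predicted vs observed):",
      list(zip(mean_pred.round(2), frac_pos.round(2))))
```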

MRI and Dynamic Contrast-Enhanced MRI (DCE-MRI)

AI has revolutionized breast cancer imaging, diagnosis, and treatment. AI-enhanced techniques, particularly radiomics, are being applied to various imaging modalities, including mammography, ultrasound, and magnetic resonance imaging (MRI).85 Dynamic contrast-enhanced MRI (DCE-MRI) has shown promise in predicting the molecular subtypes of invasive ductal breast cancer using machine learning models, with high accuracy in distinguishing triple-negative and HER2-overexpressed subtypes.86 AI technologies are advancing image quality, lesion detection and segmentation, cancer characterization, and outcome prediction by integrating multi-omics data.77 Radiomics, a quantitative approach using AI for the advanced mathematical analysis of medical images, has demonstrated potential for enhancing clinical decision-making in breast cancer care. However, challenges remain regarding data curation, model interpretability, and clinical implementation.87 The integration of radiomics with clinical, histopathological, and genomic information is expected to enable more personalized management of patients with breast cancer.
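
A typical radiomics-plus-machine-learning pipeline of the kind cited here can be approximated as follows: quantitative features assumed to have been extracted from DCE-MRI lesions are used to train a cross-validated classifier for a binary subtype label. The feature names, label, and model choice are hypothetical stand-ins, not details from the referenced studies.

```python
# Illustrative radiomics-style pipeline: a synthetic feature table stands in
# for shape, texture, and kinetic features extracted from DCE-MRI lesions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 120                                             # hypothetical lesion count
features = pd.DataFrame({
    "shape_sphericity": rng.uniform(0.3, 1.0, n),
    "texture_glcm_contrast": rng.normal(5.0, 2.0, n),
    "kinetic_peak_enhancement": rng.normal(1.5, 0.5, n),
    "kinetic_washout_slope": rng.normal(-0.2, 0.1, n),
})
labels = rng.integers(0, 2, n)                      # eg, 1 = triple-negative

model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=300, random_state=0))

# Cross-validated AUC gives a first estimate of discriminative ability;
# independent external validation would still be required before clinical use.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(model, features, labels, cv=cv, scoring="roc_auc")
print(f"Mean cross-validated AUC: {auc.mean():.3f}")
```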

Histopathology and Digital Pathology

AI has revolutionized breast cancer histopathology by enhancing diagnostic accuracy, efficiency, and standardization. AI-powered systems can analyze whole-slide images to identify invasive tumors, detect lymph node metastases, and evaluate hormonal status with high precision.88 Convolutional neural networks are widely used to process complex visual data and achieve expert-level performance in some applications.89 AI tools have demonstrated remarkable ability to predict genetic alterations, identify prognostic biomarkers, and assess tumor-infiltrating lymphocytes.89 These advancements have contributed to improved patient stratification and treatment planning, potentially reducing delays between diagnosis and prognosis determination.90 Despite challenges such as preanalytical variables and annotation demands, the integration of AI in breast cancer pathology shows great promise for enhancing clinical outcomes. Continued research and innovation are crucial for overcoming the existing limitations and fully harnessing the potential of AI in precision oncology.90
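
Because whole-slide images are far too large to process at once, AI systems typically tile the slide, score each tile, and aggregate tile scores into a slide-level output. The sketch below shows this pattern with mean pooling and an untrained ResNet-18 tile classifier on placeholder tiles; published systems generally train the classifier on annotated tiles and often use attention-based aggregation instead of a simple mean.

```python
# Hedged sketch of patch-based whole-slide inference: tiles from one slide are
# scored by a CNN and the tile scores are pooled into a slide-level probability.
import torch
from torchvision import models

tile_classifier = models.resnet18(weights=None, num_classes=2).eval()  # untrained stand-in

def slide_probability(tiles: torch.Tensor, batch_size: int = 64) -> float:
    """tiles: (N, 3, 224, 224) tensor of tissue tiles from one slide."""
    probs = []
    with torch.no_grad():
        for start in range(0, tiles.size(0), batch_size):
            batch = tiles[start:start + batch_size]
            logits = tile_classifier(batch)
            probs.append(torch.softmax(logits, dim=1)[:, 1])  # P(tumour) per tile
    tile_probs = torch.cat(probs)
    return tile_probs.mean().item()   # slide-level score via mean pooling

# Placeholder stand-in for tiles extracted from a whole-slide image.
dummy_tiles = torch.randn(256, 3, 224, 224)
print(f"Slide-level tumour probability: {slide_probability(dummy_tiles):.3f}")
```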

Challenges

Despite these promising advancements, several challenges remain in the clinical implementation of AI in breast cancer diagnosis. One major hurdle is variability in imaging protocols and data quality across institutions, which can affect the performance and generalizability of AI models. Additionally, the “black box” nature of many AI algorithms raises concerns regarding interpretability and clinical trust. Efforts are underway to develop explainable AI (XAI) models that provide insights into decision-making processes and enhance their acceptance among clinicians. Regulatory and ethical considerations also play a critical role as AI systems must comply with stringent guidelines to ensure patient safety and data privacy. Future research should focus on integrating AI with other emerging technologies, such as liquid biopsies and multi-omics data, to create a comprehensive diagnostic ecosystem. The goal is to enable seamless, accurate, and personalized breast cancer care and improve the outcomes and quality of life of patients (Table 3 and Figure 2).

Table 3.

AI Applications in Breast Cancer Diagnostics (2018–2024). Representative Studies Across Imaging and Biosensing, Summarizing Datasets, Model Families, Headline Performance, and Key Findings

| Ref. | Year | Modality | Dataset | AI Model/Technique | Overall Accuracy / AUC-ROC | Key Findings |
| --- | --- | --- | --- | --- | --- | --- |
| [16] | 2020 | Mammography & digital breast tomosynthesis (DBT) | Large representative dataset (UK); large enriched dataset (USA) | DeepMind AI | +11.5% AUC | Improved sensitivity and specificity over radiologists (+11.5%, +5.7%) |
| [91] | 2020 | Screening mammography | 240 women (100 cancers, 40 false positives, 100 normal; 2013–2017) | AI decision support with lesion markers | AUC: 0.89 | AI support improved AUC from 0.87 to 0.89; increased sensitivity (86% vs 83%); specificity |
| [92] | 2018 | Ultrasound imaging | Dataset A: 306 images (60 malignant, 246 benign); Dataset B: 163 images (53 malignant, 110 benign) | Patch-based LeNet; U-Net; transfer learning (pretrained FCN-AlexNet) | Not specified | Deep learning outperforms state-of-the-art methods, with improved metrics; Dataset B made available |
| [93] | 2021 | Ultrasound imaging | Breast Ultrasound Images (BUSI); Mendeley Breast Ultrasound Dataset; Dataset A: 306 images; Dataset B: 163 images | Transfer learning (various CNNs) | 99% accuracy | Achieved accuracies of 97.8% (BUSI), 99% (Mendeley), and 98.7% (MT-Small) |
| [94] | 2022 | Multiparametric MRI | 93 women, 104 histopathologically verified lesions | Radiomics + machine learning (10 predictive models) | AUC: 0.93–0.96 | Radiomics with DWI and ADC boosts accuracy (88.5%) over radiologists (85.6%), aiding less experienced readers |
| [95] | 2023 | Breast MRI | Multiple datasets | CNN, RNN, RCNN, AE, ensemble models | AUC-ROC up to 0.98 | Deep learning aided breast cancer diagnosis, molecular classification, chemotherapy response, and lymph node prediction |
| [96] | 2023 | Pathology imaging | H&E and fluorescent-stained datasets | Color normalization, nucleus extraction, segmentation AI model | H&E: 89.6%; fluorescent: 80.5% | Cross-staining inference achieved high accuracy, demonstrating potential for extending current pathology AI models to different staining technologies |
| [97] | 2023 | Histopathological imaging | 95 TNBC and HER2+ BC H&E-stained WSIs (1037 regions of interest) | Pathomic feature extraction, random forest, other ML models | AUC-ROC: 0.86 | Enhanced consistency in tumor grading |
| [98] | 2024 | Electrochemical biosensors | NR | Signal-processing AI | 90.0% | Improved biomarker detection for early diagnosis |

Notes: Values are reported in the original studies. Accuracy is given in % and AUC-ROC on a 0–1 scale. “NR” = not reported; “pp” = percentage-point difference versus the comparator. Where both baseline and post-AI AUC are available, the change is shown (eg, 0.87 → 0.89).

Abbreviations: AE, autoencoder; ADC, apparent diffusion coefficient; AUC-ROC, area under the receiver operating characteristic curve; B, benign; BUSI, Breast Ultrasound Images dataset; DBT, digital breast tomosynthesis; DL, deep learning; DWI, diffusion-weighted imaging; FP, false positive; H&E, hematoxylin and eosin; M, malignant; ML, machine learning; RCNN, region-based convolutional neural network; RNN, recurrent neural network; SOTA, state of the art; TL, transfer learning; TNBC, triple-negative breast cancer; HER2+, human epidermal growth factor receptor 2–positive; WSI, whole-slide image; pp, percentage points.

Figure 2.

Multimodal AI framework for breast cancer detection and diagnosis. Imaging from mammography, ultrasound, magnetic resonance imaging (MRI), and histopathology is aggregated as data sources and processed by AI modules. The fused outputs support AI-assisted detection and diagnosis, enabling higher accuracy, faster diagnosis, and personalized treatment. Indicators: pink curved arrows = cross-modality fusion/feedback; black arrows = forward data flow (sources → AI → outputs); monitor icons = AI decision support endpoints.

Predictive Diagnostics and Early Intervention for Breast Cancer

Recent advances in artificial intelligence (AI) have significantly transformed breast cancer screening and diagnosis. Deep learning algorithms, particularly convolutional neural networks, have demonstrated remarkable success in analyzing complex medical images, including mammography, ultrasound, and MRI.13,29 AI-driven imaging techniques enhance diagnostic precision, enabling early detection of malignancies and improving patient outcomes. These systems identify subtle tumor features and often surpass human radiologists in detection and classification tasks.29 AI technologies also show promise in breast cancer segmentation, biological characterization, and prognosis prediction by integrating multi-omics data.82 However, challenges remain, including the need for rigorous validation, interpretability, and addressing technical considerations for widespread clinical adoption.13 Despite these hurdles, AI-assisted imaging is a promising strategy for improving the efficiency and accuracy of breast cancer screening and diagnosis.77

Recent advancements in AI have shown significant potential in the management of breast cancer. AI-based tools integrate multi-omics data, imaging biomarkers, and clinical information to develop comprehensive patient risk profiles, thereby enabling personalized screening and treatment strategies.99 These models assist with various aspects of breast cancer care, including risk stratification, lesion detection and classification, treatment planning, and prognosis prediction.77 AI algorithms have demonstrated promising results in automating diagnosis, segmenting relevant data, and predicting tumor responses to neoadjuvant chemotherapy.99 The integration of AI with radiomics, pathomics, and multi-omics data (genomics, transcriptomics, epigenomics, and proteomics) has led to significant advancements in cancer diagnosis and prognosis.100 However, challenges remain in data curation, model interpretability, and clinical implementation, necessitating ongoing research to ensure the safe and effective integration of AI in breast cancer management.77,100

Although AI shows promise in breast cancer management, it faces challenges in terms of clinical integration. Key issues include the “black-box” nature of AI algorithms, data privacy concerns, and the need for transparency.76 Explainable AI (XAI) has emerged as a solution for making AI models more interpretable and building trust among healthcare professionals.101 XAI techniques, such as SHAP, are increasingly used to explain model predictions in breast cancer detection and risk assessment.102 Federated learning enables AI models to be trained on decentralized datasets while protecting patient privacy and enhancing model performance and generalizability across diverse populations.103 Despite these advancements, the successful implementation of AI in clinical practice requires addressing challenges related to data quality, extensive validation, and regulation.101 Continued research and development in XAI and federated learning are crucial for improving the adoption and effectiveness of AI in breast cancer management.
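
To illustrate how SHAP-style explanations are used in risk assessment, the sketch below trains a gradient-boosting model on hypothetical clinical risk factors and computes per-patient SHAP attributions. The feature names and labels are synthetic placeholders, not data from the cited work.

```python
# Hedged sketch: SHAP explanations for a tabular risk model trained on
# hypothetical clinical risk factors. Positive attributions push an individual
# prediction above the model's average output; negative ones pull it below.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "age": rng.integers(40, 75, 500),
    "breast_density_birads": rng.integers(1, 5, 500),
    "family_history": rng.integers(0, 2, 500),
    "bmi": rng.normal(27, 4, 500),
})
y = rng.integers(0, 2, 500)          # placeholder labels, illustration only

model = GradientBoostingClassifier(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Per-patient attributions plus a global summary of feature importance.
print(pd.DataFrame(shap_values, columns=X.columns).head())
shap.summary_plot(shap_values, X, show=False)
```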

Real-Time Patient Monitoring

Recent advances in AI have significantly affected breast cancer management, particularly in the fields of imaging and pathology. AI has enhanced the accuracy of mammographic imaging, tomosynthesis, and other diagnostic modalities.77 In pathology, AI has improved the identification of invasive tumors, lymph node metastasis, and hormonal status evaluation.88 AI-based frameworks have shown promise in breast carcinoma classification and mitotic figure quantification, contributing to improved prognostication.88 AI-driven risk prediction models that integrate multiple data streams can aid in planning individualized screening protocols.104 However, challenges remain, including the need for large, high-quality datasets, external validation, and improved code and data availability to enhance reliability and clinical translation. Furthermore, AI has demonstrated considerable potential for predicting treatment responses and clinical outcomes through the analysis of patient data, imaging, and pathology.105

AI has revolutionized the management of breast cancer across multiple domains. AI enhances the detection accuracy of mammography, MRI, and ultrasound, rivaling that of expert radiologists.106 AI applications in pathology have improved biomarker detection and assessment, particularly for HER2 and Ki67.106 AI models also show promise in personalizing treatment strategies and predicting outcomes, although methodological weaknesses and biases remain a concern.104 The integration of AI into clinical practice faces several challenges, including the need for extensive validation, model generalizability, and practical implementation.107 Beyond AI, circulating tumor DNA (ctDNA) analysis is emerging as a complementary tool to imaging for monitoring breast cancer, offering potential advantages in the early detection of recurrence or progression.108 However, technical, strategic, and economic hurdles currently impede the widespread clinical adoption of ctDNA analysis for breast cancer monitoring.108

Recent advancements in AI have transformed breast cancer care across the clinical spectrum. AI applications in imaging and pathology enhance detection accuracy, rivaling expert radiologists in mammography, MRI, and ultrasound interpretation.106 These technologies also improve biomarker assessments and personalize treatment strategies.106 AI-powered decision support systems streamline clinical workflows and enable more precise diagnostics.106 However, the integration of AI into healthcare faces significant ethical, legal, and regulatory challenges, necessitating the establishment of robust governance frameworks.106 Current research focuses on addressing these challenges, including data variability and real-world validation.106 Despite these hurdles, AI shows immense potential for improving breast cancer diagnosis, prognosis, and treatment outcomes,109,110 with ongoing efforts aimed at fully realizing its capabilities in clinical practice and ultimately improving patient outcomes and survival rates.

Federated Learning and Decentralized Data Sharing

Federated learning (FL) has emerged as a promising approach in healthcare, facilitating collaborative machine learning model development while ensuring patient data privacy and security.111,112 This decentralized method enables multiple institutions to collaboratively train algorithms while keeping sensitive data localized within each institution.113 FL has been successfully applied in diverse healthcare areas, including medical imaging, COVID-19 research, and smart healthcare systems, demonstrating its versatility.114 By sharing only model updates instead of raw data, FL mitigates privacy concerns and reduces the risk of data breaches.115 Despite challenges, such as potential privacy leakage due to adversarial attacks, emerging privacy-preserving techniques have been developed to address these issues. Overall, FL offers substantial potential for improving the generalizability and external validity of AI models in healthcare, particularly when working with heterogeneous data from multiple centers.116
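
The core of federated learning can be illustrated with a single federated-averaging (FedAvg) round: each institution trains a local copy of the model and only the weights are returned and averaged, weighted by local dataset size. The sketch below omits the local training loops and uses placeholder site sizes.

```python
# Minimal sketch of the federated averaging (FedAvg) idea: each site trains a
# local copy of the model on its own data and only the resulting weights are
# shared and averaged; raw patient data never leave the institution.
import copy
import torch
import torch.nn as nn

def federated_average(global_model: nn.Module, site_models, site_sizes):
    """Weighted average of site model parameters, proportional to data size."""
    total = sum(site_sizes)
    avg_state = copy.deepcopy(global_model.state_dict())
    for key in avg_state:
        avg_state[key] = sum(
            m.state_dict()[key].float() * (n / total)
            for m, n in zip(site_models, site_sizes)
        )
    global_model.load_state_dict(avg_state)
    return global_model

# One communication round with three hypothetical hospitals.
global_model = nn.Linear(10, 2)                      # stand-in for a real model
site_models = [copy.deepcopy(global_model) for _ in range(3)]
# ... each site would fine-tune its copy locally on private data here ...
site_sizes = [1200, 800, 400]                        # placeholder dataset sizes
global_model = federated_average(global_model, site_models, site_sizes)
```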

Decentralized Clinical Trials

Decentralized clinical trials (DCTs) are transforming clinical research by leveraging digital technologies to overcome traditional limitations. DCTs utilize remote data capture, wearables, and digital biomarkers to enable continuous monitoring and data collection outside clinical settings.117 This approach addresses challenges such as limited participant diversity, geographical constraints, and logistical barriers. The COVID-19 pandemic has accelerated the adoption of DCTs, promoting the use of digital health technologies while maintaining privacy and security.118 Despite these benefits, challenges such as data security, remote patient monitoring, and regulatory variations remain. Future research should focus on developing standardized protocols, exploring long-term effects, and addressing challenges through multidisciplinary collaboration. As digital technologies evolve, DCTs are poised to become an increasingly essential component of clinical research, offering new opportunities to improve healthcare outcomes.119

AI-Driven Personalized Monitoring and Adaptive AI Treatment Plans

AI-Driven Personalized Monitoring

AI has transformed personalized monitoring in breast cancer care, offering greater precision and efficiency. By leveraging advanced machine-learning algorithms, AI enables the analysis of diverse data streams, including genomic, proteomic, and radiomic data. These systems support real-time, individualized patient monitoring and can detect subtle changes that may indicate disease progression or treatment resistance. AI also contributes to breast cancer care through advanced imaging analysis, personalized treatment planning, and remote monitoring.120 AI algorithms can detect subtle patterns in mammograms and other imaging modalities with high accuracy, potentially leading to earlier diagnosis.106 Machine learning and deep learning methods have shown promising results in predicting treatment response, prognosis, and patient survival, with average accuracy rates of 90–96%.106

AI has shown promising results in detecting tumor metastasis and in improving cancer diagnosis using medical imaging. A meta-analysis of AI algorithms for tumor metastasis detection reported a pooled sensitivity of 82% and specificity of 84%, with an AUC of 0.90.121 Deep learning models have demonstrated high diagnostic accuracy across various medical specialties, with AUCs ranging from 0.864 to 1.0 for conditions such as diabetic retinopathy, lung cancer, and breast cancer.122 In abdominopelvic malignancies, AI-based radiomics models outperformed radiologists in lymph node metastasis detection, with AUCs of 0.895 for radiomics and 0.912 for deep learning compared with 0.774 for radiologists.123 Applications of AI in oncology extend beyond diagnosis to molecular tumor characterization, drug discovery, and treatment outcome prediction, potentially revolutionizing personalized cancer care.123 Wearable devices integrated with AI can further enhance monitoring capabilities. These devices continuously collect data on vital signs, activity levels, and other physiological metrics, which are analyzed to predict potential complications. This approach fosters a proactive healthcare environment in which interventions can be deployed well before the manifestation of clinical symptoms. AI-driven monitoring not only enhances patient care but also reduces the burden on healthcare systems by prioritizing high-risk patients for more intensive surveillance.

Adaptive AI Treatment Plans

Adaptive AI treatment plans redefine cancer care by tailoring therapies to meet each patient's unique and evolving needs. By integrating pharmacogenomic, proteomic, and tumor evolution data, these systems can identify the most effective interventions. AI is revolutionizing cancer research and precision medicine by offering improved methods for predicting treatment response and optimizing drug selection.124 AI algorithms, including support vector machines (SVM), random forests (RF), and neural networks, have demonstrated high accuracy in predicting treatment outcomes, drug interactions, and resistance.125 These techniques analyze complex datasets by integrating patient-specific information with drug characteristics to enhance treatment.125 AI models have shown promise in predicting treatment duration, adverse reactions, and drug resistance.126 AI applications extend to various stages of drug discovery, including virtual screening, lead identification, and ADMET analysis.127 The integration of AI in oncology research spans cancer detection and classification to molecular tumor characterization and drug repurposing, potentially transforming cancer care paradigms.128 One key application of adaptive AI is drug resistance management. Tumor heterogeneity and genetic mutations often lead to resistance, which complicates treatment outcomes. AI-driven models anticipate these changes by analyzing longitudinal genetic and proteomic data, allowing oncologists to modify treatment regimens proactively. For example, reinforcement learning algorithms can simulate various therapeutic scenarios, helping clinicians choose strategies that maximize efficacy while minimizing toxicity.

Recent advancements in artificial intelligence (AI) have revolutionized personalized oncology by integrating various data streams to develop adaptive treatment plans. Circulating tumor DNA (ctDNA) kinetics has emerged as a promising biomarker for monitoring treatment efficacy and guiding therapy adjustments.129 AI algorithms can analyze complex transcriptomic profiles to predict drug responses and identify optimal treatment combinations for individual patients.130 These AI-driven approaches have enabled the development of dynamic and personalized treatment strategies that can adapt to the evolving nature of cancer progression.131 In lung cancer, AI models integrate molecular information, radiomics, and patient characteristics to optimize diagnosis and treatment.132

Transforming the Landscape of Oncology

AI is revolutionizing oncology, particularly breast cancer management, by enabling personalized monitoring and adaptive treatment planning. AI applications in oncology range from cancer detection and classification to predicting treatment outcomes and drug discovery.128 These technologies analyze patient-specific data, including genetic and molecular information, to tailor interventions and improve cancer care.133 Deep learning and machine learning methods have shown promising results in predicting treatment responses, prognosis, and patient survival, with average model accuracies of 90–96%.18 However, challenges remain, such as ensuring data quality, mitigating algorithmic biases, and addressing disparities in access to AI-driven care.134 Overcoming these hurdles requires collaborative efforts among policymakers, healthcare providers, and technology developers to ensure the equitable and effective implementation of AI in clinical oncology.134 Therefore, the fusion of AI-driven personalized monitoring and adaptive treatment planning is reshaping oncology into a field that is defined by precision and adaptability. By continuously analyzing patient-specific data and making real-time adjustments, AI systems enable care to evolve along with the disease. These technologies ensure that the interventions are timely, effective, and tailored to the unique molecular and clinical characteristics of each patient.

Despite these advancements, this review underscores challenges such as ensuring data quality, mitigating algorithmic biases, and addressing disparities in access to AI-driven care. Collaborative efforts between policymakers, healthcare providers, and technology developers are critical for overcoming these hurdles and paving the way for equitable and effective breast cancer management. AI-driven personalized monitoring and adaptive treatment plans therefore represent the future of oncology and offer a transformative approach to breast cancer management. These innovations hold the promise of revolutionizing cancer care by enhancing precision, reducing toxicity, and improving patient outcomes (Figure 3).

Figure 3.

Predictive diagnostics and early intervention for breast cancer. Concept map linking the central hub (“Breast Cancer & AI”) to application areas—tumor segmentation/classification, real-time therapy adjustment, prediction of drug resistance, AI-enhanced pathology, and wearables for continuous monitoring—and to cross-cutting issues (secure model training, federated learning for data privacy, external validation/data availability, interpretability, algorithmic bias). Indicators: Red curved arrows = iterative, closed-loop flow (data → modeling → deployment → evaluation → back to data); dashed callout bubbles = challenges/constraints; solid icon panels = application areas.

Ethical and Regulatory Innovations for AI in Breast Cancer

Ethical Implications of AI in Breast Cancer Diagnosis

AI use in breast cancer care has significant ethical, legal, and social implications. Key concerns include algorithmic bias, data ownership, consent, and professional responsibility.135,136 The “black box” nature of AI systems raises issues of transparency and accountability, particularly in clinical decision making.137 While AI has shown the potential to outperform human experts in breast cancer screening, reducing false positives and negatives,16 its implementation requires careful consideration of patient trust and healthcare equity.136,138 Addressing these challenges requires broad stakeholder engagement, robust oversight systems, and proactive roles for regulators and professional groups.135 Integrating social science perspectives into AI development is crucial for exploring under-examined ethical implications and potential discriminatory effects.137,139 Overall, the ethical use of AI in breast cancer care requires a balanced approach that maximizes benefits, while mitigating risks.

Regulatory and Legal Considerations in AI Implementation

The integration of AI into healthcare presents significant legal and ethical challenges. Key issues include privacy, data security, accountability, and liability,140,141 and there is a pressing need for tailored regulatory frameworks to address these concerns while fostering innovation.140,142 The lack of clear regulations raises questions about responsibility in cases of AI-related errors or harm.142 Data protection, informed consent, and algorithmic transparency are crucial considerations.143 The rapid advancement of AI technology has outpaced legal and regulatory development, creating a gap that must be addressed. Stakeholders must collaborate to develop comprehensive governance frameworks that balance innovation with ethical considerations and patient protection.143 This includes addressing issues related to data ownership, consent, and the role of technology giants in handling sensitive health information.141 Furthermore, the reliance of AI on large and diverse datasets raises concerns about data ownership, patient consent, and privacy. The use of sensitive health data by technology companies, often without explicit patient consent, raises ethical questions regarding who controls these data and how they are used. With the introduction of AI in medical decision making, a tailored regulatory framework that addresses these unique challenges is urgently required. Such a framework must not only safeguard patient rights and privacy but also foster innovation, ensuring that AI is integrated into healthcare systems ethically and effectively.

Patient Autonomy and Consent in AI-Driven Healthcare

Patient Autonomy in AI-Driven Healthcare

The integration of AI in healthcare raises concerns regarding patient autonomy and decision-making. Studies have shown that patients, physicians, and dieticians have varying levels of trust in AI-driven healthcare, with privacy and ethical issues significantly impacting patient autonomy.144,145 The use of unexplainable machine learning algorithms may undermine the trusted doctor-patient relationship and patient autonomy, necessitating improved explainability in healthcare AI.146 However, some argue that fears about AI threatening patient autonomy may be misplaced, as machine learning has the potential to enhance personalized medicine through big data analysis.147 Research indicates that patients’ preferences for AI autonomy in healthcare decision-making vary by individual and context, suggesting a need for patient involvement in defining appropriate levels of AI assistance.148,149 These findings highlight the complex interplay between AI, patient autonomy, and healthcare decision making. AI also has the potential to enhance patient autonomy by providing precise, personalized, and accessible information. Predictive models and data-driven tools can empower patients to make informed decisions regarding their condition and treatment options. For example, AI can offer predictive insights that help patients to understand the course of their illness, treatment choices, and potential outcomes. This information can enable individuals to make better decisions that align with their personal values and preferences, thereby strengthening their autonomy.

Informed Consent in AI-Driven Healthcare

The integration of AI into healthcare presents significant challenges for the informed consent process. Patients may struggle to understand AI-driven decision making, potentially compromising their ability to provide informed consent.150,151 This issue is exacerbated by the complexity and opacity of AI systems, which can make it difficult for healthcare providers to explain their use to patients adequately.152 While current legal doctrine may not generally require disclosure of AI involvement in treatment recommendations, there are situations in which such disclosure might be necessary, such as when patients inquire or when AI plays a significant role in decision-making.152 To address these challenges, suggestions include adopting patient-centric information disclosure standards, restructuring medical liability rules, enhancing professional training, and improving public education on the use of AI in healthcare.151,153 These efforts aim to build an AI-driven healthcare system that promotes trust and ensures equitable access to healthcare. Furthermore, AI tools may generate recommendations that patients feel pressured to follow, especially if they perceive these recommendations as “scientifically” grounded and therefore more reliable than human expertise. This creates a potential conflict in which patients may consent to treatment based on AI recommendations without fully understanding or agreeing with the underlying decision-making process. For healthcare professionals, ensuring that consent remains informed and voluntary, even in an AI-driven environment, is a key challenge that must be addressed.

Challenges to Patient Autonomy and Informed Consent in AI Healthcare

One of the major challenges in AI-driven healthcare is the opacity of AI models. Many machine-learning algorithms, particularly deep-learning models, function as “black boxes” that are difficult to interpret. This lack of transparency can make it difficult for patients to understand the reasoning behind AI recommendations and, therefore, to make informed decisions. For healthcare professionals, this raises concerns about the ethics of providing treatment recommendations based on systems that cannot fully explain their rationale. Another challenge is the potential for bias in the algorithms. AI systems learn from large datasets that may contain biases that could affect their treatment recommendations. For example, an AI system trained on data that underrepresents certain demographic groups might generate biased recommendations that do not adequately address the needs of those populations. This creates ethical dilemmas in maintaining equitable and personalized care while respecting the autonomy of diverse patient populations.

Additionally, there are concerns regarding data privacy and security. The use of AI in healthcare requires the collection and analysis of large amounts of sensitive patient data. Patients must be assured that their data are handled securely and that they retain control over how their data are used. Ensuring transparency and providing patients with clear and understandable options for consenting to data usage are essential to respecting their autonomy.

Ethical Frameworks for Ensuring Autonomy and Consent in AI-Driven Healthcare

To address the ethical challenges associated with AI in healthcare, new frameworks are needed to ensure that patient autonomy and informed consent remain central to medical practice (Figure 4). One potential framework is explainable AI (XAI), which aims to make AI systems more transparent by providing clear explanations of how decisions are made. XAI can help bridge the gap between complex AI models and patient understanding, ensuring that patients are better informed when making decisions. Furthermore, the informed consent process must evolve to accommodate the use of AI technology. Instead of relying solely on traditional methods of explaining treatment options, consent processes must be adapted to ensure that patients understand the role of AI in their care, how it functions, and its potential benefits and risks. This could involve simplified explanations, visual aids, and interactive tools to make AI-driven healthcare more accessible and comprehensible. Collaborative decision-making is equally important: AI should not replace human judgment or patient involvement in decision-making but should instead act as a tool that supports these processes. Healthcare professionals should engage patients in open discussions about the use of AI in their care to ensure that their values, preferences, and goals are prioritized.
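As one illustration of how XAI techniques can expose a model’s reasoning, the sketch below computes a simple gradient-based saliency map for an image classifier. The model, weights, and input are hypothetical stand-ins rather than any system described in the cited literature, and more sophisticated methods (eg, Grad-CAM or SHAP) are typically preferred in practice.

```python
# Minimal sketch of a gradient-based saliency explanation (vanilla saliency).
# The model and input below are hypothetical placeholders for a trained
# breast-imaging classifier and a preprocessed image patch.
import torch
import torchvision.models as models

model = models.resnet18(weights=None)  # stand-in; no trained weights loaded
model.eval()

# Hypothetical 224x224 RGB-converted image patch that tracks gradients.
image = torch.rand(1, 3, 224, 224, requires_grad=True)

# Forward pass, then backpropagate the predicted class score to the input.
scores = model(image)
predicted_class = scores.argmax(dim=1).item()
scores[0, predicted_class].backward()

# Pixel-wise saliency: larger values mark regions that most influenced the
# prediction; such heatmaps can be reviewed alongside the original image.
saliency = image.grad.abs().max(dim=1)[0].squeeze()
print(saliency.shape)  # torch.Size([224, 224])
```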

Figure 4. Ethical and regulatory innovations for AI in breast cancer. Parallel domains of ethical implications (left) and regulatory and legal considerations (right), with downstream outcomes. Ethical: key concerns (algorithmic bias, data ownership, transparency), potential benefits (AI may outperform humans and reduce false positives), and challenges (patient trust and healthcare equity). Regulatory: key issues (privacy, security, liability), challenges (AI evolving faster than regulation), and solutions (governance and data protection frameworks). The paths converge on the ethical implementation of AI and balanced regulations that support innovation. Indicators: purple rounded boxes = section headers; gray rectangles = subtopics; connectors/arrows = information flow towards outcomes.

Conclusion

Artificial intelligence (AI), particularly deep learning, has advanced breast cancer diagnostics across mammography/digital breast tomosynthesis (DBT), ultrasound, magnetic resonance imaging (MRI), and digital pathology, especially when embedded in clinician-centered workflows. Evidence is most mature in screening mammography; applications in other modalities and in multimodal fusion, combining imaging with clinicopathologic and, where available, genomic data, are promising but heterogeneous. Real-world performance remains sensitive to dataset quality, population and vendor differences, acquisition protocols, probability calibration, and workflow design. Responsible clinical deployment requires multi-site external validation, bias mitigation with representative datasets, interpretable outputs that support decision-making, and post-deployment monitoring for drift, safety, and equity, along with compliance with privacy, data governance, and evolving regulatory standards. Near-term priorities include second-reader and triage use cases, integration of routinely available clinical and pathological data to refine individualized risk estimates, and reliability measures such as robust training, cross-site harmonization, and predefined escalation pathways for uncertain cases. Overall, AI should augment, not replace, clinicians. When deployed within well-validated and transparently governed systems, it can enable earlier detection, more consistent diagnosis, and personalized care.

Acknowledgments

The authors would like to acknowledge the support of ILMM PTE LTD and the National Heart Institute (IJN), Malaysia, grant number R504-LR-GAL008-0006150291-I163. Naveed Ahmed would like to acknowledge the Applied College, University of Tabuk, for providing the facilities to conduct this review article.

Disclosure

The authors declare no conflicts of interest.

References

  • 1.Wilkinson L, Gathani T. Understanding breast cancer as a global health concern. Brit J Radiol. 2021;95(1130). doi: 10.1259/bjr.20211033 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Saha S, Dutta A, Choudhury S. A Deep-Learning-Based Novel Method to Classify Breast Cancer. 2024 2nd International Conference on Intelligent Data Communication Technologies and Internet of Things (IDCIoT). 2024:503–508. [Google Scholar]
  • 3.Aristokli N, Polycarpou I, Themistocleous S, Sophocleous D, Mamais I. Comparison of the diagnostic performance of magnetic resonance imaging (MRI), ultrasound and mammography for detection of breast cancer based on tumor type, breast density and patient’s history: a review. Radiography. 2022;28(3):848–856. doi: 10.1016/j.radi.2022.01.006 [DOI] [PubMed] [Google Scholar]
  • 4.Gegios AR, Peterson MS, Fowler AM. Breast cancer screening and diagnosis: recent advances in imaging and current limitations. PET Clin. 2023;18(4):459–471. doi: 10.1016/j.cpet.2023.04.003 [DOI] [PubMed] [Google Scholar]
  • 5.Gao Y, Moy L, Heller SL. Digital breast tomosynthesis: update on technology, evidence, and clinical practice. Radiographics. 2021;41(2):321–337. doi: 10.1148/rg.2021200101 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Stout NK, Miglioretti DL, Su Y-R, et al. Breast cancer screening using mammography, digital breast tomosynthesis, and magnetic resonance imaging by breast density. JAMA Intern Med. 2024;184(10):1222. doi: 10.1001/jamainternmed.2024.4224 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Libesman S, Li T, Marinovich ML, Seidler AL, Tagliafico AS, Houssami N. Interval breast cancer rates for tomosynthesis vs mammography population screening: a systematic review and meta-analysis of prospective studies. Eur Radiol. 2024;35:1478–1489. doi: 10.1007/s00330-024-11085-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Nissan N, Albiztegui REO, Fruchtman-Brot H, et al. Extremely dense breasts: a comprehensive review of increased cancer risk and supplementary screening methods. Eur J Radiol. 2025;182:111837. doi: 10.1016/j.ejrad.2024.111837 [DOI] [PubMed] [Google Scholar]
  • 9.Niell BL, Jochelson MS, Amir T, et al. ACR appropriateness Criteria® female breast cancer screening: 2023 update. J Am College Radiol. 2024;21(6):S126–S143. doi: 10.1016/j.jacr.2024.02.019 [DOI] [PubMed] [Google Scholar]
  • 10.Kim J, Harper A, McCormack V, et al. Global patterns and trends in breast cancer incidence and mortality across 185 countries. Nat Med. 2025;31(4):1154–1162. doi: 10.1038/s41591-025-03502-3 [DOI] [PubMed] [Google Scholar]
  • 11.Vaz-Gonçalves L, Marquart-Wilson L, Protani MM, et al. Capturing breast cancer subtypes in cancer registries: insights into real-world incidence and survival. J Cancer Policy. 2025;44:100567. doi: 10.1016/j.jcpo.2025.100567 [DOI] [PubMed] [Google Scholar]
  • 12.H-p Z, R-y J, J-y Z, et al. PI3K/AKT/mTOR signaling pathway: an important driver and therapeutic target in triple-negative breast cancer. Breast Cancer. 2024;31(4):539–551. doi: 10.1007/s12282-024-01567-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Carriero A, Groenhoff L, Vologina E, Basile P, Albera M. Deep learning in breast cancer imaging: state of the art and recent advancements in early 2024. Diagnostics. 2024;15:14. doi: 10.3390/diagnostics15010014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Hu Q, Giger ML. Clinical artificial intelligence applications: breast imaging. Radiol Clin North Am. 2021;59(6):1027–1043. doi: 10.1016/j.rcl.2021.07.010 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Yala A, Lehman C, Schuster T, Portnoi T, Barzilay R. A deep learning mammography-based model for improved breast cancer risk prediction. Radiology. 2019;292(1):60–66. doi: 10.1148/radiol.2019182716 [DOI] [PubMed] [Google Scholar]
  • 16.McKinney SM, Sieniek M, Godbole V, et al. International evaluation of an AI system for breast cancer screening. Nature. 2020;577:89–94. [DOI] [PubMed] [Google Scholar]
  • 17.Shao J, Ma J, Zhang Q, Li W, Wang C. Predicting gene mutation status via artificial intelligence technologies based on multimodal data integration to advance precision oncology. Semi Cancer Biol. 2023;91:1–15. doi: 10.1016/j.semcancer.2023.02.006 [DOI] [PubMed] [Google Scholar]
  • 18.Sohrabei S, Moghaddasi H, Hosseini A, Ehsanzadeh SJ. Investigating the effects of artificial intelligence on the personalization of breast cancer management: a systematic study. BMC Cancer. 2024;24:24. doi: 10.1186/s12885-023-11763-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Parikh D, Shah M. A comprehensive study on epigenetic signatures to monitor disease progression and the response to therapy in breast cancer. Biomed Anal. 2024;1(3):205–217. doi: 10.1016/j.bioana.2024.06.004 [DOI] [Google Scholar]
  • 20.Ciobanu-Caraus O, Aicher A, Kernbach JM, Regli L, Serra C, Staartjes VE. A critical moment in machine learning in medicine: on reproducible and interpretable learning. Acta Neurochirurgica. 2024;166:166. doi: 10.1007/s00701-024-06037-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Zhang J, Zhang Z-M. Ethics and governance of trustworthy medical artificial intelligence. BMC Med Inf Decis Making. 2023;23. doi: 10.1186/s12911-023-02103-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Dutta P, Upadhyay P, De M, Khalkar R. Medical image analysis using deep convolutional neural networks: CNN architectures and transfer learning. IEEE. 2020;2020:175–180. [Google Scholar]
  • 23.Shah D, Khan MAU, Abrar M, Tahir M. Optimizing breast cancer detection with an ensemble deep learning approach. Int J Intell Syst. 2024;2024. doi: 10.1155/2024/5564649 [DOI] [Google Scholar]
  • 24.Kanavos A, Mylonas P. Deep Learning Analysis of Histopathology Images for Breast Cancer Detection: a Comparative Study of ResNet and VGG Architectures. 2023 18th International Workshop on Semantic and Social Media Adaptation & Personalization (SMAP)18th International Workshop on Semantic and Social Media Adaptation & Personalization (SMAP 2023). 2023:1–6. [Google Scholar]
  • 25.Chakravarthy SRS, Bharanidharan N, Kumar VV, Mahesh TR, Alqahtani MS, Guluwadi S. Deep transfer learning with fuzzy ensemble approach for the early detection of breast cancer. BMC Med Imaging. 2024;24:24. doi: 10.1186/s12880-024-01207-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Shin H-C, Roth HR, Gao M, et al. Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning. Ieee Trans Med Imaging. 2016;35:1285–1298. doi: 10.1109/TMI.2016.2528162 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Ayana G, Park J, Jeong J-W, S-w C. A novel multistage transfer learning for ultrasound breast cancer image classification. Diagnostics. 2022;13:12. doi: 10.3390/diagnostics13010012 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Salehi AW, Khan S, Gupta G, et al. A study of CNN and transfer learning in medical imaging: advantages, challenges, future scope. Sustainability. 2023;15(7):5930. doi: 10.3390/su15075930 [DOI] [Google Scholar]
  • 29.Baughan NM, Douglas L, Past GML. Present, and future of machine learning and artificial intelligence for breast cancer screening. J Breast Imaging. 2022;4(5):451–459. doi: 10.1093/jbi/wbac052 [DOI] [PubMed] [Google Scholar]
  • 30.Abdelhafiz D, Yang C, Ammar R, Nabavi S. Deep convolutional neural networks for mammography: advances, challenges and applications. BMC Bioinf. 2019;20:1–20. doi: 10.1186/s12859-019-2823-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Katayama A, Aoki Y, Watanabe Y, Horiguchi J, Rakha EA, Oyama T. Current status and prospects of artificial intelligence in breast cancer pathology: convolutional neural networks to prospective Vision Transformers. Int J Clin Oncol. 2024;29:1–21. [DOI] [PubMed] [Google Scholar]
  • 32.VanBerlo B, Hoey J, Wong A. A survey of the impact of self-supervised pretraining for diagnostic tasks in medical X-ray, CT, MRI, and ultrasound. BMC Med Imaging. 2024;24(1):79. doi: 10.1186/s12880-024-01253-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Zhao X, Zhu Q, Wu J. AResNet-ViT: a hybrid CNN-transformer network for benign and malignant breast nodule classification in ultrasound images. arXiv preprint arXiv:240719316. 2024;2024:1. [Google Scholar]
  • 34.Abimouloud ML, Bensid K, Elleuch M, Aiadi O, Kherallah M. Vision transformer-convolution for breast cancer classification using mammography images: a comparative study. Int J Hybrid Intelligent Systems. 2024;20(2):67–83. [Google Scholar]
  • 35.Gheflati B, Rivaz H. Vision Transformers for Classification of Breast Ultrasound Images. 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). 2021:480–483. [DOI] [PubMed] [Google Scholar]
  • 36.Ayana G, S-w C, Dommerich S. BUViTNet: breast ultrasound detection via vision transformers. Diagnostics. 2022;13:12. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Baroni GL, Rasotto L, Roitero K, Siraj AH, Mea VD. Vision transformers for breast cancer histology image classification. International Conference on Image Analysis and Processing. Cham: Springer Nature Switzerland;2023:15–26. [Google Scholar]
  • 38.Karuppasamy A. Recent ViT based models for breast cancer histopathology Image Classification. 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT). 2023:1–5. [Google Scholar]
  • 39.Gella V. High-performance classification of breast cancer histopathological images using fine-tuned vision transformers on the breakhis dataset. bioRxiv. 2024;2024:1. [Google Scholar]
  • 40.Kumar M, Singh R, Mukherjee P. VTHSC-MIR: vision Transformer Hashing with Supervised Contrastive learning based medical image retrieval. Pattern Recognit Lett. 2024;184:28–36. doi: 10.1016/j.patrec.2024.06.003 [DOI] [Google Scholar]
  • 41.He C, Diao Y, Ma X, et al. A vision transformer network with wavelet-based features for breast ultrasound classification. Image Anal Stereol. 2024;43(2):185–194. doi: 10.5566/ias.3116 [DOI] [Google Scholar]
  • 42.Xue J, Zhou L, Chen Y, Zheng J-X, Liu J. A novel approach for breast tumor mri classification: vision transformers and majority integration. IEEE. 2024;2024:226–231. [Google Scholar]
  • 43.Sabry M, Balaha HM, Ali KM, et al. A vision transformer approach for breast cancer classification in histopathology. IEEE. 2024;2024:1–4. [Google Scholar]
  • 44.Baroni GL, Rasotto L, Roitero K, Tulisso A, Di Loreto C, Della Mea V. Optimizing vision transformers for histopathology: pretraining and normalization in breast cancer classification. J Imaging. 2024;10(5):108. doi: 10.3390/jimaging10050108 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Shiri M, Reddy MP, Sun J. Supervised contrastive vision transformer for breast histopathological image classification. IEEE. 2024;2024:296–301. [Google Scholar]
  • 46.Chaudhury S, Sau K, Shelke N. Transforming breast cancer image classification with vision transformers and LSTM integration. IEEE. 2024;2024:1–8. [Google Scholar]
  • 47.Xu M, Wang W, Wang K, et al. Vision transformers (ViT) pretraining on 3D ABUS image and dual-CapsViT: enhancing ViT decoding via dual-channel dynamic routing. IEEE. 2023;2023:1596–1603. [Google Scholar]
  • 48.Jiménez-Gaona Y, Carrión-Figueroa D, Lakshminarayanan V, Rodríguez-álvarez MJ. Gan-based data augmentation to improve breast ultrasound and mammography mass classification. Biomed Signal Process Control. 2024;94:106255. doi: 10.1016/j.bspc.2024.106255 [DOI] [Google Scholar]
  • 49.Deebani W, Aziz L, Aziz A, Basri WS, Alawad W, Althubiti SA. Synergistic transfer learning and adversarial networks for breast cancer diagnosis: benign vs. invasive classification. Sci Rep. 2025;15:7461. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Tripathi RP, Khatri SK, van Greunen D, Ather D. Enhancing breast cancer diagnosis through segmentation-driven generative adversarial networks for synthetic mammogram generation. 2023 3rd International Conference on Technological Advancements in Computational Sciences (ICTACS). 2023:1078–1082. [Google Scholar]
  • 51.Koshino K, Werner RA, Pomper MG, et al. Narrative review of generative adversarial networks in medical and molecular imaging. Ann Translat Med. 2021;9(9):821. doi: 10.21037/atm-20-6325 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Sorin V, Barash Y, Konen E, Klang E. Creating artificial images for radiology applications using generative adversarial networks (GANs) - A systematic review. Acad Radiol. 2020;27:1175–1185. doi: 10.1016/j.acra.2019.12.024 [DOI] [PubMed] [Google Scholar]
  • 53.Zakka C, Saheb G, Najem E, Berjawi G. MammoGANesis: controlled generation of high-resolution mammograms for radiology education. ArXiv. 2020;2020:1 [Google Scholar]
  • 54.Oyelade ON, Ezugwu AE-S. ArchGAN: a generative adversarial network for architectural distortion abnormalities in digital mammograms. 2021 International Conference on Electrical, Computer and Energy Technologies (ICECET). 2021:1–7. [Google Scholar]
  • 55.Zhang Y, Zeng B, Li J, Zheng Y, Chen X. A multi-task transformer with local-global feature interaction and multiple tumoral region guidance for breast cancer diagnosis. IEEE J Biomed Health Inform. 2024;28:6840–6853. doi: 10.1109/JBHI.2024.3454000 [DOI] [PubMed] [Google Scholar]
  • 56.Wei J, Zhang H, Xie J. A novel deep learning model for breast tumor ultrasound image classification with lesion region perception. Current Oncol. 2024;31(9):5057. doi: 10.3390/curroncol31090374 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Mishra AK, Roy P, Bandyopadhyay S, Das SK. A multi‐task learning based approach for efficient breast cancer detection and classification. Expert Systems. 2022;39:e13047. [Google Scholar]
  • 58.Verma DDK. Explainable AI in healthcare: interpretable deep learning models for disease diagnosis. Pharma Innovation. 2019;8:561–565. doi: 10.22271/tpi.2019.v8.i3j.25392 [DOI] [Google Scholar]
  • 59.Loh HW, Ooi CP, Seoni S, Barua PD, Molinari F, Acharya UR. Application of explainable artificial intelligence for healthcare: a systematic review of the last decade (2011-2022). Comput Methods Programs Biomed. 2022;226:107161. doi: 10.1016/j.cmpb.2022.107161 [DOI] [PubMed] [Google Scholar]
  • 60.Chaddad A, Peng J, Xu J, Bouridane A. Survey of explainable AI techniques in healthcare. Sensors. 2023;23(2):634. doi: 10.3390/s23020634 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Feng K, Yi Z, Xu B. Artificial intelligence and breast cancer management: from data to the clinic. Cancer Innovation. 2025;4(2):e159. doi: 10.1002/cai2.159 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Shah D, Ullah Khan MA, Abrar M. Reliable breast cancer diagnosis with deep learning: DCGAN‐driven mammogram synthesis and validity assessment. Appl Comput Intell Soft Comput. 2024;2024(1):1122109. doi: 10.1155/2024/1122109 [DOI] [Google Scholar]
  • 63.Joseph AJ, Dwivedi P, Joseph J, et al. Prior-guided generative adversarial network for mammogram synthesis. Biomed Signal Process Control. 2024;87:105456. doi: 10.1016/j.bspc.2023.105456 [DOI] [Google Scholar]
  • 64.Tripathi RP, Khatri SK, Van Greunen D, Ather D. Enhancing breast cancer diagnosis through segmentation-driven generative adversarial networks for synthetic mammogram generation. IEEE. 2023;2023:1078–1082. [Google Scholar]
  • 65.Rajaram S, Mitchell CS. Data augmentation with cross-modal variational autoencoders (DACMVA) for cancer survival prediction. Information. 2023;15(1):7. doi: 10.3390/info15010007 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Inan MSK, Hossain S, Uddin MN. Data augmentation guided breast cancer diagnosis and prognosis using an integrated deep-generative framework based on breast tumor’s morphological information. Inf Med Unlocked. 2023;37:101171. doi: 10.1016/j.imu.2023.101171 [DOI] [Google Scholar]
  • 67.Aizaz Z, Khare K, Khursheed A, Tirmizi A. Pix2Pix generative adversarial networks (GAN) for breast cancer detection. IEEE. 2022;2022:1–5. [Google Scholar]
  • 68.Maack L, Holstein L, Schlaefer A. GANs for generation of synthetic ultrasound images from small datasets. Curr Direct Biomed Eng. 2022;8(1):17–20. doi: 10.1515/cdbme-2022-0005 [DOI] [Google Scholar]
  • 69.Shahidi F. Breast cancer histopathology image super-resolution using wide-attention gan with improved wasserstein gradient penalty and perceptual loss. IEEE Access. 2021;9:32795–32809. doi: 10.1109/ACCESS.2021.3057497 [DOI] [Google Scholar]
  • 70.Yoon JH, Strand F, Baltzer PAT, et al. Standalone AI for breast cancer detection at screening digital mammography and digital breast tomosynthesis: a systematic review and meta-analysis. Radiology. 2023;307:222639. doi: 10.1148/radiol.222639 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Magni V, Cozzi A, Schiaffino S, Colarieti A, Sardanelli F. Artificial intelligence for digital breast tomosynthesis: impact on diagnostic performance, reading times, and workload in the era of personalized screening. Eur J Radiol. 2023;158:110631. doi: 10.1016/j.ejrad.2022.110631 [DOI] [PubMed] [Google Scholar]
  • 72.Yoon JH, Kim EK. Deep learning-based artificial intelligence for mammography. Korean J Radiol. 2021;22:1225–1239. doi: 10.3348/kjr.2020.1210 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Sechopoulos I, Teuwen J, Mann RM. Artificial intelligence for breast cancer detection in mammography and digital breast tomosynthesis: state of the art. Semi Cancer Biol. 2020;72:214–225. doi: 10.1016/j.semcancer.2020.06.002 [DOI] [PubMed] [Google Scholar]
  • 74.Vourtsis A, Berg WA. Breast density implications and supplemental screening. Eur Radiol. 2018;29:1762–1777. doi: 10.1007/s00330-018-5668-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Kumar A, Kumar P, Ranjan B, Naresh R, Narayan B. Artificial intelligence in breast cancer screening in inducing diagnostic accuracy with early detection. J Angiother. 2024;8:1–8. [Google Scholar]
  • 76.Ozcan BB, Patel BK, Banerjee I, Dogan BE. Artificial intelligence in breast imaging: challenges of integration into clinical practice. J Breast Imaging. 2023;5(3):248–257. doi: 10.1093/jbi/wbad007 [DOI] [PubMed] [Google Scholar]
  • 77.Zhang J, Wu J, Zhou X, Shi F, Shen D. Recent advancements in artificial intelligence for breast cancer: image augmentation, segmentation, diagnosis, and prognosis approaches. Semi Cancer Biol. 2023;96:11–25. doi: 10.1016/j.semcancer.2023.09.001 [DOI] [PubMed] [Google Scholar]
  • 78.Kim J, Kim HJ, Kim C, Kim WH. Artificial intelligence in breast ultrasonography. Ultrasonography. 2020;40:183–190. doi: 10.14366/usg.20117 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Fruchtman Brot H, Mango VL. Artificial intelligence in breast ultrasound: application in clinical practice. Ultrasonography. 2023;43:3–14. doi: 10.14366/usg.23116 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80.Trepanier C, Huang A, Liu MZ, Ha RS. Emerging uses of artificial intelligence in breast and axillary ultrasound. Clin Imaging. 2023;100:64–68. doi: 10.1016/j.clinimag.2023.05.007 [DOI] [PubMed] [Google Scholar]
  • 81.Wu T, Warren L. The added value of supplemental breast ultrasound screening for women with dense breasts: a single center Canadian experience. Can Assoc Radiol J. 2021;73:101–106. [DOI] [PubMed] [Google Scholar]
  • 82.Zhang X-Y, Wei Q, Wu -G-G, et al. Artificial intelligence - based ultrasound elastography for disease evaluation - a narrative review. Front Oncol. 2023;13:1197447. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 83.Mao Y-J, Lim H-J, Ni M, Yan W-H, DW-c W, Cheung JSW. Breast tumour classification using ultrasound elastography with machine learning: a systematic scoping review. Cancers. 2022;15:14. doi: 10.3390/cancers15010014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.Lee J, Cheng tsui-fen F. Clinical application of novel mobile AI solution for real-time detection and differential diagnosis in breast ultrasound: the first prospective feasibility study. J Clin Oncol. 2024;2024:1. [Google Scholar]
  • 85.Bitencourt AGV, Daimiel Naranjo I, Lo Gullo R, Rossi Saccarelli C, Pinker K. AI-enhanced breast imaging: where are we and where are we heading? Eur J Radiol. 2021;142:109882. doi: 10.1016/j.ejrad.2021.109882 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86.Sheng W, Xia S, Wang Y, et al. Invasive ductal breast cancer molecular subtype prediction by MRI radiomic and clinical features based on machine learning. Front Oncol. 2022;12:964605. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 87.Pesapane F, De Marco P, Rapino A, et al. How radiomics can improve breast cancer diagnosis and treatment. J Clin Med. 2023;12:1372. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88.Soliman A, Li Z, Parwani AV. Artificial intelligence’s impact on breast cancer pathology: a literature review. Diagn Pathol. 2024;19. doi: 10.1186/s13000-024-01453-w [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Cifci D, Veldhuizen GP, Foersch S, Kather JN. AI in Computational Pathology of Cancer: improving Diagnostic Workflows and Clinical Outcomes? Ann Rev Cancer Biol. 2023;7(1):57–71. doi: 10.1146/annurev-cancerbio-061521-092038 [DOI] [Google Scholar]
  • 90.McCaffrey C, Jahangir CA, Murphy C, Burke C, Gallagher WM, Rahman A. Artificial intelligence in digital histopathology for predicting patient prognosis and treatment efficacy in breast cancer. Expert Rev Mol Diagn. 2024;24:363–377. doi: 10.1080/14737159.2024.2346545 [DOI] [PubMed] [Google Scholar]
  • 91.Rodríguez-Ruiz A, Krupinski E, Mordang -J-J, et al. Detection of breast cancer with mammography: effect of an artificial intelligence support system. Radiology. 2019;290(2):305–314. doi: 10.1148/radiol.2018181371 [DOI] [PubMed] [Google Scholar]
  • 92.Yap MH, Pons G, Martí J, et al. Automated breast ultrasound lesions detection using convolutional neural networks. IEEE J Biomed Health Inform. 2018;22(4):1218–1226. doi: 10.1109/JBHI.2017.2731873 [DOI] [PubMed] [Google Scholar]
  • 93.Ayana G, Dese K, S-w C. Transfer learning in breast cancer diagnoses via ultrasound imaging. Cancers. 2021;13(4):738. doi: 10.3390/cancers13040738 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Daimiel Naranjo I, Gibbs P, Reiner JS, et al. Breast lesion classification with multiparametric breast MRI using radiomics and machine learning: a comparison with radiologists’ performance. Cancers. 2022;14(7):1743. doi: 10.3390/cancers14071743 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95.Zhao X, Bai J-W, Guo Q, Ren K, Zhang G-J. Clinical applications of deep learning in breast MRI. Biochimica Biophysica Acta. 2023;1878(2):188864. doi: 10.1016/j.bbcan.2023.188864 [DOI] [PubMed] [Google Scholar]
  • 96.Huang P-W, Ouyang H, Hsu B-Y, et al. Deep-learning based breast cancer detection for cross-staining histopathology images. Heliyon. 2023;9(2). doi: 10.1016/j.heliyon.2023.e13171. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Verdicchio M, Brancato V, Cavaliere C, Isgrò F, Salvatore M, Aiello M. A pathomic approach for tumor-infiltrating lymphocytes classification on breast cancer digital pathology images. Heliyon. 2023;9(3):e14371. doi: 10.1016/j.heliyon.2023.e14371 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 98.Vatankhahan H, Esteki F, Jabalameli MA, et al. Electrochemical biosensors for early diagnosis of glioblastoma. Clin Chim Acta. 2024;557:117878. doi: 10.1016/j.cca.2024.117878 [DOI] [PubMed] [Google Scholar]
  • 99.Saleh GA, Batouty NM, Gamal A, et al. Impact of imaging biomarkers and AI on breast cancer management: a brief review. Cancers. 2023;16:15. doi: 10.3390/cancers16010015 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100.Ozaki Y, Broughton P, Abdollahi H, Valafar H, Blenda AV. Integrating Omics Data and AI for Cancer Diagnosis and Prognosis. Cancers. 2024;16(13):2448. doi: 10.3390/cancers16132448 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 101.Markus AF, Kors JA, Rijnbeek PR. The role of explainability in creating trustworthy artificial intelligence for health care: a comprehensive survey of the terminology, design choices, and evaluation strategies. J Biomed Informat. 2020;113:103655. doi: 10.1016/j.jbi.2020.103655 [DOI] [PubMed] [Google Scholar]
  • 102.Ghasemi A, Hashtarkhani S, Schwartz DL, Shaban-Nejad A. Explainable artificial intelligence in breast cancer detection and risk prediction: a systematic scoping review. Cancer Innovation. 2024;3. doi: 10.1002/cai2.136 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Chaddad A, Lu Q, Li J, et al. Explainable, domain-adaptive, and federated artificial intelligence in medicine. IEEE/CAA J Automatica Sinica. 2022;10:859–876. doi: 10.1109/JAS.2023.123123 [DOI] [Google Scholar]
  • 104.Corti C, Cobanaj M, Marian F, et al. Artificial intelligence for prediction of treatment outcomes in breast cancer: systematic review of design, reporting standards, and bias. Cancer Treat Rev. 2022;108:102410. doi: 10.1016/j.ctrv.2022.102410 [DOI] [PubMed] [Google Scholar]
  • 105.Nyambura AM. The use of AI in predicting patient outcomes. Res Output J Public Health Med. 2024;3(2):38–41. doi: 10.59298/ROJPHM/2024/323841 [DOI] [Google Scholar]
  • 106.Uchikov PA, Khalid U, Dedaj-Salad GH, et al. Artificial intelligence in breast cancer diagnosis and treatment: advances in imaging, pathology, and personalized care. Life. 2024;15:14. doi: 10.3390/life15010014 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 107.Ahn JS, Shin S, Yang S-A, et al. Artificial intelligence in breast cancer diagnosis and personalized medicine. J Breast Cancer. 2023;26:405–435. doi: 10.4048/jbc.2023.26.e45 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 108.Foffano L, Vida R, Piacentini A, et al. Is ctDNA ready to outpace imaging in monitoring early and advanced breast cancer? Expert Rev Anticancer Ther. 2024;24(8):679–691. doi: 10.1080/14737140.2024.2362173 [DOI] [PubMed] [Google Scholar]
  • 109.Lotter W, Hassett MJ, Schultz N, Kehl KL, Van Allen EM, Cerami EG. Artificial intelligence in oncology: current landscape, challenges, and future directions. Cancer Discovery. 2024;14:OF1–OF16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 110.Díaz O, Rodríguez-Ruiz A, Sechopoulos I. Artificial Intelligence for breast cancer detection: technology, challenges, and prospects. Eur J Radiol. 2024;175:111457. doi: 10.1016/j.ejrad.2024.111457 [DOI] [PubMed] [Google Scholar]
  • 111.Chaddad A, Wu Y, Desrosiers C. Federated learning for healthcare applications. IEEE Int Things J. 2024;11:7339–7358. doi: 10.1109/JIOT.2023.3325822 [DOI] [Google Scholar]
  • 112.Gupta A, Kumar Maurya M, Dhere K, Kumar Chaurasiya V. Privacy-preserving hybrid federated learning framework for mental healthcare applications: clustered and quantum approaches. IEEE Access. 2024;12:145054–145068. doi: 10.1109/ACCESS.2024.3464240 [DOI] [Google Scholar]
  • 113.Babar M, Qureshi B, Koubaa A. Review on Federated Learning for digital transformation in healthcare through big data analytics. Future Gener Comput Syst. 2024;160:14–28. doi: 10.1016/j.future.2024.05.046 [DOI] [Google Scholar]
  • 114.Zwiers LC, Grobbee DE, Uijl A, Ong DSY. Federated learning as a smart tool for research on infectious diseases. BMC Infect Dis. 2024;24:24. doi: 10.1186/s12879-023-08819-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 115.Ferrag MA, Friha O, Maglaras L, Janicke H, Shu L. Federated deep learning for cyber security in the internet of things: concepts, applications, and experimental analysis. IEEE Access. 2021;9:138509–42. [Google Scholar]
  • 116.Mishra S, Tandon DR. Federated learning in healthcare: a path towards decentralized and secure medical insights. Int J Scient Res Eng Manag. 2024;08(10):1–15. doi: 10.55041/IJSREM37791 [DOI] [Google Scholar]
  • 117.Chodankar D, Raval TK, Jeyaraj J. The role of remote data capture, wearables, and digital biomarkers in decentralized clinical trials. Perspect Clin Res. 2023;15:38–41. doi: 10.4103/picr.picr_219_22 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 118.De Brouwer W, Patel CJ, Manrai AK, Rodriguez-Chavez IR, Shah NR. Empowering clinical research in a decentralized world. Npj Digital Med. 2021;4. doi: 10.1038/s41746-021-00473-w [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119.Narasimhan S. Decentralized clinical trials – systematic review of methods, awareness, and inclusiveness in clinical research. Texila Int J Acad Res. 2023;10(4):113–124. doi: 10.21522/TIJAR.2014.10.04.Art010 [DOI] [Google Scholar]
  • 120.Bhattacharya S, Saleem SM, Singh A, Singh S, Tripathi SK. Empowering precision medicine: regenerative AI in breast cancer. Front Oncol. 2024;14:1465720. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 121.Zheng Q, Yang L, Zeng B, et al. Artificial intelligence performance in detecting tumor metastasis from medical radiology imaging: a systematic review and meta-analysis. EClinicalMedicine. 2020;31:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 122.Aggarwal R, Sounderajah V, Martin G, et al. Diagnostic accuracy of deep learning in medical imaging: a systematic review and meta-analysis. Npj Digital Med. 2021;4. doi: 10.1038/s41746-021-00438-z [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 123.Bedrikovetski S, Dudi-Venkata NN, Maicas G, et al. Artificial intelligence for the diagnosis of lymph node metastases in patients with abdominopelvic malignancy: a systematic review and meta-analysis. Artif Intell Med. 2021;113:102022. doi: 10.1016/j.artmed.2021.102022 [DOI] [PubMed] [Google Scholar]
  • 124.Chang T, Park S, Schäffer AA, Jiang P, Ruppin E. Hallmarks of artificial intelligence contributions to precision oncology. Nat Cancer. 2025;6(3):417–431. doi: 10.1038/s43018-025-00917-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 125.Romm EL, Tsigelny IF. Artificial Intelligence in Drug Treatment. Annu Rev Pharmacol Toxicol. 2020;60(1):353–369. doi: 10.1146/annurev-pharmtox-010919-023746 [DOI] [PubMed] [Google Scholar]
  • 126.Zhang F, Zhang F, Li L, Pang Y. Clinical utilization of artificial intelligence in predicting therapeutic efficacy in pulmonary tuberculosis. J Infection Public Health. 2024;17(4):632–641. doi: 10.1016/j.jiph.2024.02.012 [DOI] [PubMed] [Google Scholar]
  • 127.Nayarisseri A, Khandelwal R, Tanwar P, et al. Artificial intelligence, big data and machine learning approaches in preci-sion medicine & drug discovery. Current Drug Targets. 2021;22(6):631–655. doi: 10.2174/18735592MTEzsMDMnz [DOI] [PubMed] [Google Scholar]
  • 128.Bhinder B, Gilvary C, Madhukar NS, Elemento O. Artificial Intelligence in Cancer Research and Precision Medicine. Cancer Discovery. 2021;11(4):900–915. doi: 10.1158/2159-8290.CD-21-0090 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 129.Yousaf A. DNA replication: causes of replication stress and DNA damage response pathways: a review on DNA replication. Electronic Journal of Medical Research. 2025;1(1):8–27. [Google Scholar]
  • 130.Shams A, Ozdemir T, Yildiz M. Leveraging state-of-the-art AI algorithms in personalized oncology: from transcriptomics to treatment. Diagnostics. 2024;15:14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 131.Engelhardt D, Michor F. A quantitative paradigm for decision-making in precision oncology. Trends Cancer. 2021;7(4):293–300. doi: 10.1016/j.trecan.2021.01.006 [DOI] [PubMed] [Google Scholar]
  • 132.Ladbury C, Amini A, Govindarajan A, et al. Integration of artificial intelligence in lung cancer: rise of the machine. Cell Rep Med. 2023;4(2):100933. doi: 10.1016/j.xcrm.2023.100933 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 133.Kann BH, Hosny A, Hjwl A. Artificial intelligence for clinical oncology. Cancer Cell. 2021;39(7):916–927. doi: 10.1016/j.ccell.2021.04.002 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 134.Corti C, Cobanaj M, Dee EC, et al. Artificial intelligence in cancer research and precision medicine: applications, limitations and priorities to drive transformation in the delivery of equitable and unbiased care. Cancer Treat Rev. 2022;112:102498. doi: 10.1016/j.ctrv.2022.102498 [DOI] [PubMed] [Google Scholar]
  • 135.Carter SM, Rogers WA, Win KT, Frazer HM, Richards BJ, Houssami N. The ethical, legal and social implications of using artificial intelligence systems in breast cancer care. Breast. 2019;49:25–32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 136.Napitupulu PA. Ethical dilemmas in the use of artificial intelligence in breast cancer diagnosis and treatment (addressing issues of bias, transferability, and patient trust in breast cancer AI). West Science Law and Human Rights. 2023. [Google Scholar]
  • 137.Goisauf M, Abadía MC. Ethics of AI in Radiology: a Review of Ethical and Societal Implications. Front Big Data. 2022;5. doi: 10.3389/fdata.2022.850383 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 138.Weiner EB, Dankwa-Mullan I, Nelson WA, Hassanpour S. Ethical challenges and evolving strategies in the integration of artificial intelligence into clinical practice. PLOS Digital Health. 2024;4:e0000810. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 139.Hantel A, Walsh TP, Marron JM, et al. Perspectives of oncologists on the ethical implications of using artificial intelligence for cancer care. JAMA Network Open. 2024;7(3):e244077. doi: 10.1001/jamanetworkopen.2024.4077 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 140.Naik N, Hameed BMZ, Shetty DK, et al. Legal and ethical consideration in artificial intelligence in healthcare: who takes responsibility? Front Surg. 2022;9. doi: 10.3389/fsurg.2022.862322 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 141.Ekaterina K. Legal considerations for harnessing artificial intelligence in healthcare. Uzbek J Law Digital Policy. 2023;2(1). doi: 10.59022/ujldp.137 [DOI] [Google Scholar]
  • 142.Ganapathy K. Artificial intelligence and healthcare regulatory and legal concerns. Telehealth Med Today. 2021. doi: 10.30953/tmt.v6.252 [DOI] [Google Scholar]
  • 143.Mennella C, Maniscalco U, De Pietro G, Esposito M. Ethical and regulatory challenges of AI technologies in healthcare: a narrative review. Heliyon. 2024;10:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 144.Alanzi TM, Alhajri AS, Almulhim S, et al. Artificial intelligence and patient autonomy in obesity treatment decisions: an empirical study of the challenges. Cureus. 2023;15:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 145.Funer F, Wiesing U. Physician’s autonomy in the face of AI support: walking the ethical tightrope. Front Med. 2024;11:1324963. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 146.Klincewicz M, Frank LE. Consequences of unexplainable machine learning for the notions of a trusted doctor and patient autonomy. Am J Bioethics. 2019;19(7):39–40. doi: 10.1080/15265161.2019.1618960 [DOI] [PubMed] [Google Scholar]
  • 147.Di Nucci E. Should we be afraid of medical AI? J Med Ethics. 2019;45:556–558. doi: 10.1136/medethics-2018-105281 [DOI] [PubMed] [Google Scholar]
  • 148.Kim D, Vegt NJH, Visch VT, Bos-de Vos M. How much decision power should (A)I have?: investigating patients’ preferences towards ai autonomy in healthcare decision making. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems. 2024:1–17. [Google Scholar]
  • 149.Pham TD. Ethical and legal considerations in healthcare AI: innovation and policy for safe and fair use. Royal Soc Open Sci. 2025;12:241873. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 150.Wadden JJ. Naming and diffusing the understanding objection in healthcare artificial intelligence. Canad J Bioethics. 2024;7(4):57–63. doi: 10.7202/1114958ar [DOI] [Google Scholar]
  • 151.Wang Y, Ma Z. Ethical and legal challenges of medical AI on informed consent: china as an example. Develop World Bioethics. 2024;25(1):46–54. doi: 10.1111/dewb.12442 [DOI] [PubMed] [Google Scholar]
  • 152.Cohen I. Informed consent and medical artificial intelligence: what to tell the patient? SSRN Electron J. 2020;108:1425. [Google Scholar]
  • 153.Gerke S, Minssen T, Cohen G. Ethical and legal challenges of artificial intelligence-driven healthcare. Artificial Intelligence Healthcare. 2020;2020:295–336. [Google Scholar]
