J Pain Res. 2023 Jul 26;16:2587–2594. doi: 10.2147/JPR.S409841

Table 2.

Details of Performance of Our Developed Deep Learning Model

Sample size (patients)
  Total 288: 230 (79.9%) for training, 58 (20.1%) for validation
Sample ratio (patients)
  Overall: Favor 148 (51.4%); Poor 140 (48.6%)
  Training: Favor 118 (51.3%); Poor 112 (48.7%)
  Validation: Favor 30 (51.7%); Poor 28 (48.3%)
Model details
  • EfficientNetV2B0 CNN model with transfer learning (all base layers frozen, top classifier excluded; see the configuration sketch after this list)

  • RMSProp optimizer, ReLU activation

  • Learning rate 1e-05, batch size 8, 26 training epochs

  • Batch Normalization and dropout for regularization

  • ROI images resized to 100 × 100 pixels

  • Training accuracy: 89.6%, AUC 0.981 with 95% CI [0.969–0.993]

  • Validation accuracy: 79.3%, AUC 0.802 with 95% CI [0.682–0.923]
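A minimal sketch of the configuration listed above, assuming a standard tf.keras transfer-learning setup; the custom classification head (pooling, dense width, dropout rate) and the data pipeline (train_ds, val_ds) are not specified in the table and are placeholders.

import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (100, 100)   # ROI images resized to 100 x 100
BATCH_SIZE = 8
LEARNING_RATE = 1e-05
EPOCHS = 26

# Pre-trained EfficientNetV2B0 backbone with its top classifier removed.
base = tf.keras.applications.EfficientNetV2B0(
    include_top=False,
    weights="imagenet",
    input_shape=IMG_SIZE + (3,),
)
base.trainable = False  # freeze all backbone layers (transfer learning)

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = base(inputs, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.BatchNormalization()(x)           # regularization, as listed above
x = layers.Dropout(0.3)(x)                   # dropout rate is an assumption
x = layers.Dense(128, activation="relu")(x)  # ReLU head; width is an assumption
outputs = layers.Dense(1, activation="sigmoid")(x)  # favor (1) vs poor (0)

model = models.Model(inputs, outputs)
model.compile(
    optimizer=tf.keras.optimizers.RMSprop(learning_rate=LEARNING_RATE),
    loss="binary_crossentropy",
    metrics=["accuracy", tf.keras.metrics.AUC(name="auc")],
)

# model.fit(train_ds, validation_data=val_ds, epochs=EPOCHS)
# (batch size 8 would be set when building train_ds / val_ds)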

Model performance (validation data)
  Class          Precision   Recall   F1-score   Support
  Poor (0)       0.767       0.821    0.793      28
  Favor (1)      0.821       0.767    0.793      30
  Macro average  0.794       0.794    0.793      58
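The precision, recall, F1-score, support, and macro-average values above follow the standard definitions (precision = TP / (TP + FP), recall = TP / (TP + FN), F1 = their harmonic mean). A minimal sketch of producing such a report with scikit-learn is shown below; the label arrays here are placeholder values for illustration only, not the study data.

from sklearn.metrics import classification_report

# Placeholder labels and thresholded predictions purely for illustration;
# in practice y_pred would come from (model.predict(val_ds) >= 0.5).
y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0]

print(classification_report(
    y_true, y_pred,
    target_names=["poor (0)", "favor (1)"],
    digits=3,
))
# Prints per-class precision, recall, F1-score, and support,
# plus macro and weighted averages, matching the layout of the table above.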

Abbreviations: CNN, convolutional neural network; RMSProp, root mean squared propagation; ReLU, rectified linear units; ROI, region of interest; AUC, area under the curve; CI, confidence interval.