Table 4.
Comparison of Adam versus stochastic gradient descent as the optimizer
| Optimizer | Accuracy | Precision | Recall | F1 score | ROC-AUC |
|---|---|---|---|---|---|
| Adam | 0.935 | 0.94 | 0.84 | 0.87 | 0.861 |
| SGD | 0.903 | 0.92 | 0.71 | 0.80 | 0.830 |
SGD – stochastic gradient descent; ROC-AUC – area under the receiver operating characteristic curve
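For reference, the accuracy, precision, recall, and F1 score reported above are all derived from confusion-matrix counts. A minimal sketch of those relationships, using hypothetical counts chosen for illustration only (not the study's data):

```python
def classification_metrics(tp: int, fp: int, fn: int, tn: int):
    """Compute accuracy, precision, recall, and F1 from confusion-matrix counts."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)          # fraction of positive predictions that are correct
    recall = tp / (tp + fn)             # fraction of actual positives that are found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return accuracy, precision, recall, f1

# Hypothetical counts for illustration only
acc, prec, rec, f1 = classification_metrics(tp=84, fp=5, fn=16, tn=95)
```

Note that ROC-AUC cannot be recovered from these counts alone; it requires the model's ranked prediction scores across all thresholds, which is why it is reported as a separate column.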