2024 Sep 11;13:e56729. doi: 10.2196/56729

Table 3.

Type and frequency of ML algorithms used by the included studies.

| ML^a algorithm | Studies, n | Authors (citations) |
| --- | --- | --- |
| Support vector machine | 8 | Anthony et al [26], Ceklic et al [27], Chin et al [28], Gatto et al [31], Inokuchi et al [32], Pacula et al [35], Spangler et al [3]^b, and Veladas et al [36] |
| Random forest | 6 | Anthony et al [26], Ferri et al [30], Inokuchi et al [32], Tollinton et al [5], Spangler et al [3], and Veladas et al [36] |
| Neural networks | 5 | Ceklic et al [27], Ferri et al [30], Gatto et al [31], Inokuchi et al [32], and Marchiori et al [25] |
| Extreme gradient boosting (XGBoost) | 4 | Ferri et al [30], Inokuchi et al [32], and Spangler et al [3] |
| Naïve Bayes | 4 | Ceklic et al [27], Chin et al [28], Ferri et al [30], and Veladas et al [36] |
| K-nearest neighbors | 4 | Anthony et al [26], Ceklic et al [27], Chin et al [28], and Gatto et al [31] |
| Logistic regression | 3 | Anthony et al [26], Ferri et al [30], and Spangler et al [3] |
| Bayesian network | 2 | Cotte et al [29] and Yunoki et al [4] |
| Decision tree^c | 2 | Chin et al [28] and Tollinton et al [5] |
| Ensemble^d | 2 | Ceklic et al [27] and Ferri et al [30] |
| LASSO^e regression | 1 | Inokuchi et al [32] |
| Multilayer perceptron | 1 | Chin et al [28] |
| Hidden Markov model | 1 | Pacula et al [35] |
| Hierarchical attention network | 1 | Gatto et al [31] |
| Transformer-based | 1 | Gatto et al [31] |
| Unspecified algorithms | 2 | Lai et al [33] and Morse et al [34] |

^aML: machine learning.

^bSpangler et al [3] did not include support vector machine, random forest, neural networks, or regression models in their "Methods" section but stated in their "Discussion" that they investigated these algorithms.

^cDecision trees here include methods such as gradient boosting but not random forest or extreme gradient boosting.

^dCeklic et al [27] did not provide information about the specific models incorporated into their ensemble. Ferri et al [30] built their ensemble from a collection of deep learning subnetworks.

^eLASSO: least absolute shrinkage and selection operator.