Table 1. Summary of machine learning models, their task types, and example applications.
Model | Type | Description | Example study |
---|---|---|---|
LR | C | Uses the logistic function to predict categorical outcomes | Chhatwal et al. [13] |
SVM | R, C | Constructs hyperplanes to maximise the margin between classes | Zhang et al. [14] |
NB | C | Applies Bayesian probability, including prior probabilities, for classification | Olatunji et al. [15] |
RF | R, C | Ensembles predictions of random decision trees | Xiao et al. [16] |
XGB | R, C | As RF, but trees are built sequentially, with each correcting prior errors via gradient boosting | Liew et al. [17] |
ANN | R, C | Multiplies inputs by weights, adds biases, and applies activation functions to predict outcomes | Muhammad [18] |
CNN | R, C | Uses kernels to detect image features | Suh [19] |
Abbreviations: R: regression, C: classification, LR: logistic regression, SVM: support vector machine, NB: naïve Bayes, RF: random forest, XGB: extreme gradient boosting, ANN: artificial neural network, CNN: convolutional neural network.
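The logistic-function mechanism underlying the LR row (and, at the level of a single neuron, the ANN row) can be sketched in a few lines. The weights and bias below are hypothetical values chosen for illustration only, not fitted to any data:

```python
import math

def sigmoid(z):
    # Logistic function: maps any real value into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def predict(features, weights, bias, threshold=0.5):
    # Linear combination of inputs and weights plus a bias term,
    # squashed by the logistic function into a class probability.
    z = sum(w * x for w, x in zip(weights, features)) + bias
    p = sigmoid(z)
    # Probabilities at or above the threshold map to the positive class.
    return (1 if p >= threshold else 0), p

# Hypothetical parameters for illustration; z = 0.8, p ≈ 0.69, class 1.
label, prob = predict([2.0, -1.0], weights=[0.8, 0.3], bias=-0.5)
```

An ANN stacks many such units in layers, replacing the single logistic output with nonlinear activations between layers; the models in the table differ mainly in how such decision functions are constructed and combined.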