PLoS One. 2022 Oct 14;17(10):e0276116. doi: 10.1371/journal.pone.0276116

Table 2. Algorithm tuning parameters.

Algorithm                  Tuning parameter(s)
Logistic regression        (none)
Random forest              mtry = 12
Support vector machine     C = 8; sigma = 4.69·10^-11
Naïve Bayes                fL = 0; adjust = 1
K-nearest neighbor         K = 5
Artificial neural network  size = 11; decay = 0.1
GLMNET                     alpha = 0.8; lambda = 0.21

Algorithm tuning parameters selected by repeated (5 times) 10-fold cross-validation over a grid. mtry: number of variables considered for splitting at each tree node in a random forest; C: regularization parameter that controls the trade-off between achieving a low training error and a low testing error; sigma: determines how fast the similarity metric goes to zero as points move further apart; fL: Laplace smoothing parameter; adjust: adjusts the bandwidth of the kernel density estimate; K: number of nearest neighbors; size: number of units in the hidden layer; decay: regularization parameter to avoid over-fitting; alpha: regularization parameter; lambda: penalty on the coefficients.
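As a minimal sketch of the tuning procedure described above, the following Python/scikit-learn code runs a repeated (5 times) 10-fold cross-validated grid search for the RBF support vector machine. The dataset and candidate grid are illustrative assumptions, not from the paper; scikit-learn's `gamma` plays the role of kernlab's `sigma` in the RBF kernel.

```python
# Hedged sketch: repeated 10-fold CV grid search, analogous to the
# tuning in Table 2. Data and grid values are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold
from sklearn.svm import SVC

# Synthetic stand-in for the study data (assumption)
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# 5 repeats of 10-fold cross-validation, matching the tuning procedure
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=5, random_state=0)

# Illustrative candidate grid; the paper reports the selected values
# C = 8 and sigma = 4.69e-11 for its own data
grid = {"C": [1, 8, 64], "gamma": [1e-3, 1e-2, 1e-1]}

search = GridSearchCV(SVC(kernel="rbf"), grid, cv=cv, scoring="accuracy")
search.fit(X, y)
print(search.best_params_)
```

The same `GridSearchCV` pattern applies to the other algorithms in the table by swapping the estimator and its parameter grid (e.g. `n_neighbors` for K-nearest neighbor, `alpha`/`l1_ratio` for an elastic-net model analogous to GLMNET).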