Diagnostics. 2023 Jul 17;13(14):2391. doi: 10.3390/diagnostics13142391

Table 3. Model parameters.

**XGBoost**

| Parameter | Value |
|---|---|
| Base learner | Gradient-boosted tree |
| Tree construction algorithm | Exact greedy |
| Learning rate (η) | 0.0991 |
| Lagrange multiplier (γ) | 0 |
| Number of gradient-boosted trees | 80 |
| Maximum depth of a tree | 6 |
| Minimum sum of instance weights in a child | 1 |
| Subsample ratio of the training instances | 1 |
| Sampling method | Uniform |
| L2 regularization term on weights | 1 |
| Tree growing policy | Depthwise |
| Evaluation metric for validation data | Negative log-likelihood |
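For readers reproducing the setup, the sketch below shows one plausible way to pass these settings to the xgboost scikit-learn wrapper. The mapping from table rows to keyword arguments is our assumption, not the authors' code, and `X_train`/`y_train` are placeholders.

```python
# Hypothetical mapping of Table 3's XGBoost row to XGBClassifier keywords.
from xgboost import XGBClassifier

xgb_model = XGBClassifier(
    booster="gbtree",         # base learner: gradient-boosted tree
    tree_method="exact",      # exact greedy tree construction
    learning_rate=0.0991,     # eta
    gamma=0,                  # Lagrange multiplier / minimum split loss
    n_estimators=80,          # number of gradient-boosted trees
    max_depth=6,
    min_child_weight=1,       # minimum sum of instance weights in a child
    subsample=1,              # subsample ratio of the training instances
    sampling_method="uniform",
    reg_lambda=1,             # L2 regularization term on weights
    grow_policy="depthwise",
    eval_metric="logloss",    # negative log-likelihood
)
# xgb_model.fit(X_train, y_train)  # placeholders for the study's data
```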
**SVM**

| Parameter | Value |
|---|---|
| Kernel | Linear |
| Degree of the polynomial kernel | 3 |
| Kernel coefficient (γ) | Scale |
| Maximum iterations | No constraint |
| Shrinking heuristic | True |
| Probability estimates | False |
| Tolerance for stopping criterion | 1 × 10⁻³ |
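Assuming a scikit-learn implementation, these values correspond one-to-one to `SVC` constructor arguments; the sketch below is illustrative, not the authors' code.

```python
# Hypothetical mapping of Table 3's SVM row to scikit-learn's SVC.
from sklearn.svm import SVC

svm_model = SVC(
    kernel="linear",
    degree=3,           # used only by the polynomial kernel
    gamma="scale",      # kernel coefficient
    max_iter=-1,        # -1 means no iteration limit
    shrinking=True,     # shrinking heuristic
    probability=False,  # no probability estimates
    tol=1e-3,           # tolerance for the stopping criterion
)
```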
**Random forest**

| Parameter | Value |
|---|---|
| Number of trees in the forest | 10 |
| Quality-of-split measure | Entropy |
| Minimum number of samples to split | 2 |
| Minimum number of samples at a leaf node | 1 |
| Use bootstrap samples for building trees | True |
| Number of jobs to run in parallel | 1 |
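Again assuming scikit-learn, the random-forest settings map directly onto `RandomForestClassifier`; this is a sketch under that assumption.

```python
# Hypothetical mapping of Table 3's random-forest row to scikit-learn.
from sklearn.ensemble import RandomForestClassifier

rf_model = RandomForestClassifier(
    n_estimators=10,      # number of trees in the forest
    criterion="entropy",  # quality-of-split measure
    min_samples_split=2,
    min_samples_leaf=1,
    bootstrap=True,       # bootstrap samples for building trees
    n_jobs=1,             # number of parallel jobs
)
```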
**CatBoost**

| Parameter | Value |
|---|---|
| Number of boosting rounds | 20 |
| Learning rate | 0.44 |
| Maximum depth of a tree | 5 |
| Maximum number of trees | 1000 |
| Random seed | 0 |
| Sample weight frequency | Per tree level |
| Tree growing policy | Symmetric tree |
| Maximum number of leaves | 31 |
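A sketch of these settings with `catboost.CatBoostClassifier` follows. The table does not say how the "number of boosting rounds" (20) relates to the 1000-tree cap, so we assume 1000 is the iteration limit; note also that CatBoost accepts `max_leaves` only under the Lossguide growing policy, so the listed value (its default) is left as a comment.

```python
# Hypothetical mapping of Table 3's CatBoost row to CatBoostClassifier.
from catboost import CatBoostClassifier

cb_model = CatBoostClassifier(
    iterations=1000,                    # maximum number of trees (assumed)
    learning_rate=0.44,
    depth=5,                            # maximum depth of a tree
    random_seed=0,
    sampling_frequency="PerTreeLevel",  # sample-weight frequency
    grow_policy="SymmetricTree",
    # max_leaves=31 is valid only with grow_policy="Lossguide";
    # the table value matches CatBoost's default.
)
```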
**LightGBM**

| Parameter | Value |
|---|---|
| Number of decision trees | 20 |
| Bagging fraction | 1 |
| Number of threads | 8 |
| Maximum depth of a tree | 6 |
| Number of boosting iterations | 100 |
| Learning rate | 0.1 |
| Tree learner type | Serial |
| Bagging random seed | 3 |
| Dropout rate | 0.1 |
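The sketch below maps these settings onto `lightgbm.LGBMClassifier`. The table lists both a "number of decision trees" (20) and a "number of boosting iterations" (100); we assume the latter is `n_estimators`. A non-zero dropout rate takes effect only under DART boosting, which the table does not state explicitly.

```python
# Hypothetical mapping of Table 3's LightGBM row to LGBMClassifier.
from lightgbm import LGBMClassifier

lgbm_model = LGBMClassifier(
    n_estimators=100,       # number of boosting iterations (assumed)
    learning_rate=0.1,
    max_depth=6,
    subsample=1.0,          # bagging fraction
    n_jobs=8,               # number of threads
    tree_learner="serial",  # passed through to the core booster
    bagging_seed=3,
    drop_rate=0.1,          # dropout rate; used only with DART boosting
)
```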