2024 Jun 17;14:13929. doi: 10.1038/s41598-024-62254-1

Table 5.

Hyperparameters evaluated in the algorithms.

Decision tree:

{"max_depth": [3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15],
 "min_samples_split": [2, 4, 8, 12, 16],
 "min_samples_leaf": [2, 3, 4, 5, 6, 7, 8],
 "criterion": ["gini", "entropy"]}

Random forest:

{"max_depth": [3, 4, 5, 6, 7, 8, 9, 10],
 "min_samples_split": [2, 4, 8, 12, 16],
 "n_estimators": [50, 100, 150],
 "criterion": ["gini", "entropy"],
 "max_features": ["auto", 2, 3, 4, 6, 8, 10, 11],
 "min_samples_leaf": [2, 3, 4, 5, 6, 7, 8]}

AdaBoost:

{"max_depth": [1, 2, 3],
 "learning_rate": [0.05, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1],
 "n_estimators": [50, 100, 150, 200],
 "min_samples_leaf": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]}
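The parameter names in Table 5 match scikit-learn's estimator API, so the grids were presumably evaluated with an exhaustive search. As a minimal sketch, assuming scikit-learn's `GridSearchCV` and a synthetic dataset (the actual data, cross-validation scheme, and scoring metric are not given in this excerpt), the decision-tree grid could be searched like this:

```python
# Hypothetical sketch: exhaustive search over the decision-tree grid
# from Table 5. The dataset (make_classification) and the 3-fold CV
# are illustrative assumptions, not taken from the paper.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

param_grid = {
    "max_depth": [3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15],
    "min_samples_split": [2, 4, 8, 12, 16],
    "min_samples_leaf": [2, 3, 4, 5, 6, 7, 8],
    "criterion": ["gini", "entropy"],
}

X, y = make_classification(n_samples=100, random_state=0)
search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      param_grid, cv=3, n_jobs=-1)
search.fit(X, y)
print(search.best_params_)
```

Note that in the AdaBoost row, `learning_rate` and `n_estimators` belong to the ensemble itself, while `max_depth` and `min_samples_leaf` would apply to its decision-tree base estimator, so in practice that grid spans two nested objects.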