Table 4. Random Forest (RF), Gradient Boosting (GB), and AdaBoost (AB) methods: hyperparameter domains and the corresponding tuned values for each data set under consideration. N_e: number of estimators; M_ss: minimum number of samples required to split a node; M_sl: minimum number of samples required at a leaf node; lr: learning rate.
method / data set | N_e | M_ss | M_sl | lr
---|---|---|---|---
AB | {10, 11, …, 10000} | - | - | [1e-3, 5e-1] |
RF | {10, 11, …, 10000} | {2, 3, …, 10} | {1, 2, …, 10} | - |
GB | {10, 11, …, 10000} | {2, 3, …, 10} | {1, 2, …, 10} | [1e-3, 5e-1] |
AB at Demo | 545 | - | - | 0.017 |
RF at Demo | 6197 | 5 | 9 | - |
GB at Demo | 257 | 4 | 1 | 0.005 |
AB at Fixation | 415 | - | - | 0.169 |
RF at Fixation | 2726 | 2 | 10 | - |
GB at Fixation | 3380 | 3 | 6 | 0.007 |
AB at IA | 4736 | - | - | 0.087 |
RF at IA | 1980 | 4 | 3 | - |
GB at IA | 165 | 2 | 10 | 0.160 |
AB at Demo-Fixation | 309 | - | - | 0.215 |
RF at Demo-Fixation | 9923 | 9 | 1 | - |
GB at Demo-Fixation | 2674 | 9 | 3 | 0.282 |
AB at Demo-IA | 7133 | - | - | 0.019 |
RF at Demo-IA | 163 | 2 | 1 | - |
GB at Demo-IA | 971 | 7 | 3 | 0.299 |
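
The table reports only the search domains and the tuned values, not the tuning procedure itself. Below is a minimal sketch of how these search spaces could be expressed, assuming a scikit-learn implementation of the three ensembles and a randomized search; the estimator classes, `RandomizedSearchCV`, `n_iter`, `cv`, and the log-uniform sampling of `lr` are illustrative assumptions, while the parameter domains are taken directly from Table 4.

```python
# Minimal sketch (not the paper's exact setup): expressing the Table 4 search
# spaces with scikit-learn. Only the parameter domains come from the table;
# the estimator classes, the randomized search, n_iter, cv, and the
# log-uniform sampling of the learning rate are illustrative assumptions.
from scipy.stats import randint, loguniform
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import RandomizedSearchCV

n_estimators = randint(10, 10001)        # N_e in {10, 11, ..., 10000}
learning_rate = loguniform(1e-3, 5e-1)   # lr in [1e-3, 5e-1] (sampling law assumed)
min_samples_split = randint(2, 11)       # M_ss in {2, 3, ..., 10}
min_samples_leaf = randint(1, 11)        # M_sl in {1, 2, ..., 10}

search_spaces = {
    "AB": (AdaBoostClassifier(),
           {"n_estimators": n_estimators,
            "learning_rate": learning_rate}),
    "RF": (RandomForestClassifier(),
           {"n_estimators": n_estimators,
            "min_samples_split": min_samples_split,
            "min_samples_leaf": min_samples_leaf}),
    "GB": (GradientBoostingClassifier(),
           {"n_estimators": n_estimators,
            "min_samples_split": min_samples_split,
            "min_samples_leaf": min_samples_leaf,
            "learning_rate": learning_rate}),
}

def tune(X, y, n_iter=100, cv=5, random_state=0):
    """Run one randomized search per method and return the best parameters."""
    best = {}
    for name, (estimator, space) in search_spaces.items():
        search = RandomizedSearchCV(estimator, space, n_iter=n_iter, cv=cv,
                                    n_jobs=-1, random_state=random_state)
        search.fit(X, y)
        best[name] = search.best_params_
    return best
```

With integer domains this wide (N_e up to 10000), a randomized or Bayesian search is typically preferred over an exhaustive grid, which is consistent with the non-round tuned values of N_e reported in the table.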