Sensors. 2023 Jun 23;23(13):5850. doi: 10.3390/s23135850

Table 9. Top results (with the accuracy threshold set at 80%) of the numerical experiments for the Wavelet approach with hyperparameter optimization.

| Model | Transformation | n_train | n_test | % for Training | Acc. [%] |
|---|---|---|---|---|---|
| Gradient Boosting | STD | 22 | 53 | 70 | 96.23 |
| Random Forest | STD | 52 | 23 | 30 | 95.65 |
| Gradient Boosting | STD | 30 | 45 | 60 | 95.56 |
| Gradient Boosting | MINMAX | 37 | 38 | 50 | 94.74 |
| Gradient Boosting | MINMAX | 30 | 45 | 60 | 93.33 |
| Gradient Boosting | NONE | 30 | 45 | 60 | 93.33 |
| XGB | MINMAX | 22 | 53 | 70 | 92.45 |
| Decision Tree | MINMAX | 22 | 53 | 70 | 92.45 |
| XGB | STD | 22 | 53 | 70 | 92.45 |
| XGB | NONE | 22 | 53 | 70 | 92.45 |
| Gradient Boosting | MINMAX | 22 | 53 | 70 | 90.57 |
| Gradient Boosting | NONE | 22 | 53 | 70 | 90.57 |
| LGBM | MINMAX | 37 | 38 | 50 | 89.47 |
| Gradient Boosting | STD | 37 | 38 | 50 | 89.47 |
| Decision Tree | STD | 37 | 38 | 50 | 89.47 |
| LGBM | STD | 37 | 38 | 50 | 89.47 |
| Gradient Boosting | NONE | 37 | 38 | 50 | 89.47 |
| LGBM | NONE | 37 | 38 | 50 | 89.47 |
| Decision Tree | NONE | 30 | 45 | 60 | 88.89 |
| Random Forest | NONE | 52 | 23 | 30 | 86.96 |
| Gradient Boosting | STD | 60 | 15 | 20 | 86.67 |
| Decision Tree | NONE | 60 | 15 | 20 | 86.67 |
| XGB | MINMAX | 30 | 45 | 60 | 84.44 |
| XGB | STD | 30 | 45 | 60 | 84.44 |
| Decision Tree | STD | 30 | 45 | 60 | 84.44 |
| XGB | NONE | 30 | 45 | 60 | 84.44 |
| LGBM | MINMAX | 22 | 53 | 70 | 83.02 |
| LGBM | STD | 22 | 53 | 70 | 83.02 |
| Decision Tree | NONE | 22 | 53 | 70 | 83.02 |
| LGBM | NONE | 22 | 53 | 70 | 83.02 |
| Random Forest | MINMAX | 52 | 23 | 30 | 82.61 |
| LGBM | MINMAX | 30 | 45 | 60 | 82.22 |
| LGBM | STD | 30 | 45 | 60 | 82.22 |
| LGBM | NONE | 30 | 45 | 60 | 82.22 |
| XGB | MINMAX | 37 | 38 | 50 | 81.58 |
| XGB | STD | 37 | 38 | 50 | 81.58 |
| XGB | NONE | 37 | 38 | 50 | 81.58 |
| Decision Tree | NONE | 37 | 38 | 50 | 81.58 |
| Random Forest | NONE | 37 | 38 | 50 | 81.58 |
| Random Forest | MINMAX | 30 | 45 | 60 | 80.00 |
| Random Forest | STD | 30 | 45 | 60 | 80.00 |
| LGBM | MINMAX | 45 | 30 | 40 | 80.00 |
| LGBM | STD | 45 | 30 | 40 | 80.00 |
| Gradient Boosting | NONE | 45 | 30 | 40 | 80.00 |
| LGBM | NONE | 45 | 30 | 40 | 80.00 |
| Random Forest | STD | 60 | 15 | 20 | 80.00 |
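The rows of Table 9 correspond to combinations of classifier, feature scaling (STD, MINMAX, or NONE), and train/test split, filtered to those reaching at least 80% accuracy. The sketch below illustrates how such an evaluation grid could be assembled with scikit-learn-style estimators; it is a minimal reconstruction under stated assumptions, not the paper's code. The wavelet feature matrix `X` and labels `y` are placeholders, the hyperparameter optimization step is omitted, and the exact split bookkeeping in the paper may differ.

```python
# Minimal sketch of the evaluation grid behind Table 9 (assumed reconstruction).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier      # assumes xgboost is installed
from lightgbm import LGBMClassifier    # assumes lightgbm is installed

rng = np.random.default_rng(0)
X = rng.random((75, 32))               # placeholder for the wavelet feature vectors
y = rng.integers(0, 2, 75)             # placeholder binary labels

transformations = {"STD": StandardScaler(), "MINMAX": MinMaxScaler(), "NONE": None}
models = {
    "Gradient Boosting": GradientBoostingClassifier(),
    "Random Forest": RandomForestClassifier(),
    "Decision Tree": DecisionTreeClassifier(),
    "XGB": XGBClassifier(),
    "LGBM": LGBMClassifier(),
}

results = []
for train_frac in (0.2, 0.3, 0.4, 0.5, 0.6, 0.7):   # "% for Training" column
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=train_frac, random_state=0, stratify=y)
    for t_name, scaler in transformations.items():
        for m_name, model in models.items():
            pipe = make_pipeline(scaler, model) if scaler else make_pipeline(model)
            pipe.fit(X_tr, y_tr)
            acc = 100 * accuracy_score(y_te, pipe.predict(X_te))
            results.append((m_name, t_name, len(X_tr), len(X_te),
                            int(train_frac * 100), acc))

# Keep only configurations at or above the 80% accuracy threshold, best first.
top = sorted((r for r in results if r[-1] >= 80.0), key=lambda r: -r[-1])
```

Each entry of `top` mirrors one table row: model, transformation, n_train, n_test, training percentage, and accuracy in percent.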