TABLE 7.
Test results for the remaining four participants after training on data from four participants (3).
| | DT | RF | ADB | LR | MLP | SVM | KNN | LDA | LR-CNN | CNN | LSTM |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Participant 2 | 0.750 | 0.801 | 0.805 | 0.820 | 0.808 | 0.824 | 0.862 | 0.821 | 0.933 | 0.812 | 0.801 |
| Participant 3 | 0.707 | 0.823 | 0.804 | 0.766 | 0.801 | 0.821 | 0.801 | 0.810 | 0.912 | 0.833 | 0.826 |
| Participant 4 | 0.724 | 0.804 | 0.833 | 0.793 | 0.823 | 0.809 | 0.796 | 0.803 | 0.929 | 0.815 | 0.833 |
| Participant 7 | 0.717 | 0.829 | 0.806 | 0.817 | 0.811 | 0.815 | 0.807 | 0.834 | 0.882 | 0.822 | 0.812 |
DT, decision tree; RF, random forest; ADB, AdaBoost; LR, logistic regression; MLP, multilayer perceptron; SVM, support vector machine; KNN, k-nearest neighbor; LDA, linear discriminant analysis; LR-CNN, logistic regression and convolutional neural network; CNN, convolutional neural network; LSTM, long short-term memory.
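The classical classifiers compared in Table 7 can be evaluated in a common loop. The sketch below is only illustrative: it assumes scikit-learn with default hyperparameters, synthetic feature arrays in place of the participants' data, and accuracy as the reported score (the metric and feature extraction are not specified here); the LR-CNN, CNN, and LSTM models are omitted.

```python
"""Minimal sketch of a cross-participant comparison like Table 7 (assumptions noted above)."""
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score

# Hypothetical stand-in data: train on four participants, test on one held-out participant.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(400, 16)), rng.integers(0, 2, 400)
X_test, y_test = rng.normal(size=(100, 16)), rng.integers(0, 2, 100)

classifiers = {
    "DT": DecisionTreeClassifier(),
    "RF": RandomForestClassifier(),
    "ADB": AdaBoostClassifier(),
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(max_iter=1000),
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "LDA": LinearDiscriminantAnalysis(),
}

for name, clf in classifiers.items():
    clf.fit(X_train, y_train)                             # fit on the source participants
    score = accuracy_score(y_test, clf.predict(X_test))   # score the held-out participant
    print(f"{name}: {score:.3f}")
```

Repeating the loop with each held-out participant in turn would produce one row of scores per participant, analogous to the rows of Table 7.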