Table 2. Performance measurement (10-fold cross-validation estimates) of the proposed algorithms based on HRV features.
Classifier | Parameters | Feature selection (# features) | AUC | ACC | SEN | SPE |
---|---|---|---|---|---|---|
AB | NI: 220; CF: 0.5; MI: 20 | None (33) | 94.5% | 91.8% | 93.2% | 90.4% |
AB | NI: 20; CF: 0.3; MI: 10 | CFS (8) | 92.2% | 85.6% | 86.3% | 84.9% |
AB | NI: 120; CF: 0.45; MI: 10 | Χ²-FS (10) | 94.7% | 89.0% | 90.4% | 87.7% |
C4.5 | CF: 0.3; MI: 5 | None (33) | 80.3% | 76.7% | 78.1% | 75.3% |
C4.5 | CF: 0.3; MI: 5 | CFS (8) | 82.8% | 80.8% | 87.7% | 74.0% |
C4.5 | CF: 0.1; MI: 5 | Χ²-FS (10) | 83.0% | 76.7% | 76.7% | 76.7% |
MLP | LR: 0.3; M: 0.6; NE: 200 | None (33) | 86.7% | 82.9% | 80.8% | 84.9% |
MLP | LR: 0.6; M: 0.4; NE: 200 | CFS (8) | 86.9% | 78.1% | 86.3% | 69.9% |
MLP | LR: 0.3; M: 0.2; NE: 1800 | Χ²-FS (10) | 86.1% | 78.8% | 82.2% | 75.3% |
NF | - | None (33) | 72.4% | 65.8% | 76.7% | 54.8% |
NF | - | CFS (8) | 80.1% | 70.5% | 78.1% | 63.0% |
NF | - | Χ²-FS (10) | 77.8% | 71.9% | 82.2% | 61.6% |
RF | NT: 300; NF: 5 | None (33) | 94.5% | 88.4% | 91.8% | 84.9% |
RF | NT: 20; NF: 5 | CFS (8) | 92.3% | 87.7% | 90.4% | 84.9% |
RF | NT: 400; NF: 4 | Χ²-FS (10) | 93.2% | 89.0% | 93.2% | 84.9% |
SVM | G: 1.4 | None (33) | 93.1% | 89.0% | 86.3% | 91.8% |
SVM | G: 2.3 | CFS (8) | 89.1% | 81.5% | 84.9% | 78.1% |
SVM | G: 1.6 | Χ²-FS (10) | 89.2% | 80.8% | 86.3% | 75.3% |
CFS: correlation-based feature selection algorithm (a subset of 8 HRV features)
Χ²-FS: chi-squared feature selection algorithm (a subset of 10 HRV features)
NI: number of iterations
MI: minimum number of instances per leaf
CF: confidence factor for pruning
LR: learning rate
M: momentum
NE: number of epochs
NT: number of trees
NF: number of randomly chosen features
G: gamma
AUC: area under the curve
ACC: accuracy
SEN: sensitivity
SPE: specificity
In bold: the best performance of each classifier.
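As a reference for reading the table, the ACC, SEN, and SPE columns follow directly from a classifier's confusion matrix over the cross-validated predictions; AUC additionally requires the classifier's predicted scores and is omitted here. The counts below are hypothetical examples, not values from the study. A minimal sketch:

```python
# Hypothetical confusion-matrix counts (not from the study) showing how the
# table's ACC, SEN, and SPE columns are derived.

def metrics(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float, float]:
    """Return (accuracy, sensitivity, specificity) from confusion-matrix counts."""
    acc = (tp + tn) / (tp + tn + fp + fn)  # ACC: fraction of correct predictions
    sen = tp / (tp + fn)                   # SEN: true-positive rate
    spe = tn / (tn + fp)                   # SPE: true-negative rate
    return acc, sen, spe

acc, sen, spe = metrics(tp=68, fn=5, tn=62, fp=11)
print(f"ACC={acc:.1%}  SEN={sen:.1%}  SPE={spe:.1%}")  # → ACC=89.0%  SEN=93.2%  SPE=84.9%
```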