Front Hum Neurosci. 2021 Jun 1;15:673955. doi: 10.3389/fnhum.2021.673955

TABLE 3.

Comparison with previous studies on attention recognition.

| Authors | Attention task (levels) | Subjects | Methods | Window (s) | Brain regions (channels) | Validation | Accuracy (%) |
|---|---|---|---|---|---|---|---|
| Chen et al. (2017) | Continuous performance task (high attention, low attention); 2 levels | 10 | Temporal and entropy features with SVM | Trial length | Prefrontal (1) | 3/4 train, 1/4 test | 91.60 |
| Hu et al. (2018) | Randomly selected learning task (high, neutral, low); 3 levels | 10 | Linear and nonlinear features with CFS + KNN | 180 | Central and temporal (6) | 10 times 3-fold CV | 80.84 |
| Gaume et al. (2019) | Continuous performance task (easy, medium, hard); 2 and 3 levels | 14 | Power features with LDA | 5 | Whole brain (16) | Leave-one-subject-out | 2 levels: 75; 3 levels: 51.8 |
| | | | | 30 | | Leave-one-subject-out | 2 levels: 85; 3 levels: 64.8 |
| Jeong and Jeong (2020) | Psychomotor vigilance tasks (attention, rest); 2 levels | 6 | Temporal and spectral features with ESN | 0.5 | In-ear (2), prefrontal (2) | Within-subject | In-ear: 81.16; prefrontal: 82.44 |
| | | | | | | Cross-subject | In-ear: 64.00; prefrontal: 65.70 |
| | | | | | | 10-fold CV | In-ear: 74.15; prefrontal: 73.73 |
| Our study | AX-CPT (rest, LA, MA, HA); 2, 3, and 4 levels | 42 | Complexity features with XGBoost | 3 | Frontal (13) | Leave-one-subject-out | 2 levels: 76.30; 3 levels: 70.49; 4 levels: 64.69 |
| | | | | | | 5-fold CV | 2 levels: 95.36; 3 levels: 80.42; 4 levels: 81.39 |

SVM, support vector machine; CFS, correlation-based feature selection; KNN, k-nearest neighbor; LDA, linear discriminant analysis; ESN, echo state network; CV, cross-validation.
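
The gap between the pooled 5-fold and leave-one-subject-out accuracies in Table 3 (e.g., 95.36% vs. 76.30% for the 2-level case in our study) reflects how the data are split: k-fold CV can place windows from the same subject in both training and test sets, whereas leave-one-subject-out tests on a subject never seen during training. The snippet below is a minimal sketch of the two validation schemes, assuming scikit-learn and the xgboost Python package; the data shapes, feature values, and hyperparameters are illustrative placeholders, not the pipeline used in the study.

```python
# Sketch of leave-one-subject-out vs. pooled 5-fold cross-validation with an
# XGBoost classifier on per-window features. All data here are synthetic.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, StratifiedKFold, cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)

# Assumed toy data: 42 subjects, 20 windows each, one feature per 13 frontal
# channels, and a binary attention label (the 2-level case in Table 3).
n_subjects, n_windows, n_channels = 42, 20, 13
X = rng.normal(size=(n_subjects * n_windows, n_channels))   # feature matrix
y = rng.integers(0, 2, size=n_subjects * n_windows)         # attention labels
groups = np.repeat(np.arange(n_subjects), n_windows)        # subject ID of each window

clf = XGBClassifier(n_estimators=100, max_depth=3, eval_metric="logloss")

# Leave-one-subject-out: each fold holds out every window of one subject.
loso = LeaveOneGroupOut()
loso_acc = cross_val_score(clf, X, y, cv=loso, groups=groups, scoring="accuracy")

# 5-fold CV: windows are pooled across subjects before splitting.
kfold = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
kfold_acc = cross_val_score(clf, X, y, cv=kfold, scoring="accuracy")

print(f"LOSO mean accuracy:   {loso_acc.mean():.3f}")
print(f"5-fold mean accuracy: {kfold_acc.mean():.3f}")
```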