2026 Jan 8;15:e76601. doi: 10.2196/76601

Table 2.

Task-specific performance comparison.

Task type and methods/model | Key features | Performance metrics (number of studies) | References
Classification (n=28)

RFa (n=13) Handcrafted features: time/frequency features (eg, mean, SD, percentiles, lag-1 autocorrelation), ensemble of decision trees
  • Lab (n=7): F1-score 91.9%, accuracy 94.0%

  • Free-living (n=4): F1-score 81.0%, accuracy 87.4%

  • Lab and free-living (n=2): F1-score 88.1%, accuracy 93.8%

[28,36-38,50-53,64,67,72,74,80]

ANNb (n=7) Handcrafted features: time/frequency features (eg, spectral entropy, signal power), multilayer perceptron
  • Lab (n=7): F1-score 88.0%, accuracy 93.1%

  • Free-living (n=1): F1-score 75.4%, accuracy 82.1%

[19,29,58,60,62,65,73]

SVMc (n=4) Kernel-based classification with an RBFd kernel, advanced cross-correlation metrics (xy, xz, yz)
  • Lab (n=1): F1-score 75.4%, accuracy 88.4%

  • Free-living (n=3): accuracy 86.5%

[28,50,55]

DTe (n=4) Tree-based splits, integrated with ANN outcomes
  • Lab (n=3): F1-score 86.6%, accuracy 87.8%

  • Free-living (n=1): F1-score 75.4%, accuracy 82.1%

[58,62,64,74]

Gradient boosting (n=3) Gradient boosting framework, handling missing data
  • Lab (n=2): F1-score 91.6%

[37,54,64]

HMMf (n=3) Temporal sequence modeling, Viterbi smoothing
  • Lab (n=1): F1-score 99.8%

  • Free-living (n=2): F1-score 73.5%, accuracy 94.0%

[71,75,76]

QDAg (n=1) Quadratic decision boundaries, probabilistic classification
  • Lab (n=1): F1-score 100%, accuracy 99.9%

[71]

LASSOh (n=1) L1 regularization, sparse solutions
  • Lab (n=1): F1-score 83.6%

[64]

CNNi (n=1) Automated feature extraction via convolutional filters on raw signals
  • Free-living (n=1): F1-score 73.4%, accuracy 96.8%

[68]
Estimation (n=10)

RF (n=6) Regression trees, bootstrapped subsets of ActiGraph data
  • Lab (n=5): F1-score 83.5%, accuracy 86.1%

  • Free-living (n=1): F1-score 80.0%, accuracy 91.4%

[56,57,59,70,77]

ANN (n=2) Nonlinear activation functions, raw signal processing
  • Lab (n=2): F1-score 91.1%, accuracy 85.7%

[61,66]

SVM (n=1) Kernel-based regression
  • Lab (n=1): F1-score 90.7%, accuracy 88.7%

[70]

k-NNj (n=2) Instance-based learning, Euclidean distance metrics
  • Lab (n=2): F1-score 96.4%, accuracy 95.8%

[37,70]

XGBoostk (n=1) Gradient boosting framework, handling missing data
  • Lab (n=1): F1-score 100%, accuracy 100%

[37]

Gradient boosting (n=1) Iterative error correction, additive regression trees
  • Lab (n=1): F1-score 93.2%, accuracy 92.1%

[70]
Deep learning (n=5)

BiLSTMl (n=3) Bidirectional temporal modeling, raw signal processing
  • Free-living (n=2): F1-score 73.6%, accuracy 93.6%

  • Lab and free-living (n=1): F1-score 53.3%, accuracy 53.7%

[31,33,79]

CNN (n=2) Automated feature extraction via convolutional filters on raw signals
  • Free-living (n=2): F1-score 71.9%, accuracy 94.4%

[33,68]

ViTm (n=1) Self-attention mechanisms for long-range dependencies
  • Free-living (n=1): F1-score 79.8%, accuracy 95.0%

[33]

CNN-LSTMn or CNN-BiLSTMo (n=2) Hybrid architecture, integrating spatial and temporal learning
  • Lab (n=1): F1-score 82.1%

  • Free-living (n=1): F1-score 91.4%, accuracy 97.7%

[33,34]

ViT-BiLSTMp (n=1) Vision transformer + BiLSTM, gravity-based acceleration analysis
  • Free-living (n=1): F1-score 98.4%, accuracy 99.0%

[33]

aRF: random forest.

bANN: artificial neural network.

cSVM: support vector machine.

dRBF: radial basis function.

eDT: decision tree.

fHMM: hidden Markov model.

gQDA: quadratic discriminant analysis.

hLASSO: least absolute shrinkage and selection operator.

iCNN: convolutional neural network.

jk-NN: k-nearest neighbor.

kXGBoost: extreme gradient boosting.

lBiLSTM: bidirectional long short-term memory.

mViT: vision transformer.

nCNN-LSTM: convolutional neural network and long short-term memory.

oCNN-BiLSTM: convolutional neural network and bidirectional long short-term memory.

pViT-BiLSTM: vision transformer and bidirectional long short-term memory.
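Several of the classification pipelines in the table (RF, ANN, DT) operate on handcrafted time-domain features rather than raw signals. As an illustrative sketch only (the function name and exact feature set are ours, not taken from any cited study), the features named in the RF row — mean, SD, percentiles, and lag-1 autocorrelation — can be computed per accelerometer window like this:

```python
import statistics

def window_features(signal):
    """Handcrafted time-domain features for one accelerometer window:
    mean, population SD, 10th/90th percentiles, lag-1 autocorrelation."""
    n = len(signal)
    mean = sum(signal) / n
    sd = statistics.pstdev(signal)
    ordered = sorted(signal)
    p10 = ordered[int(0.10 * (n - 1))]
    p90 = ordered[int(0.90 * (n - 1))]
    # lag-1 autocorrelation: similarity of the window to itself shifted by one sample
    num = sum((signal[i] - mean) * (signal[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in signal)
    lag1 = num / den if den else 0.0
    return {"mean": mean, "sd": sd, "p10": p10, "p90": p90, "lag1_ac": lag1}
```

Feature vectors like this, computed per axis and per window, would then be fed to an ensemble classifier; frequency-domain features (spectral entropy, signal power, as in the ANN row) would be derived analogously from the window's FFT.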
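The HMM row lists temporal sequence modeling with Viterbi smoothing. A minimal, self-contained sketch of the idea (state names, priors, and probabilities below are hypothetical, not drawn from the cited studies): noisy per-window class probabilities are combined with "sticky" transition probabilities so that isolated single-window label flips are smoothed away.

```python
import math

def viterbi_smooth(obs_probs, trans, init):
    """Most likely state sequence given per-window observation likelihoods.
    obs_probs: list of {state: P(obs | state)} dicts, one per window.
    trans: {(prev_state, next_state): probability}; init: {state: prior}.
    All probabilities are assumed strictly positive."""
    states = list(init)
    # dynamic programming in the log domain to avoid numeric underflow
    V = [{s: math.log(init[s]) + math.log(obs_probs[0][s]) for s in states}]
    back = []
    for t in range(1, len(obs_probs)):
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans[(p, s)]))
            col[s] = V[-1][prev] + math.log(trans[(prev, s)]) + math.log(obs_probs[t][s])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    # backtrace from the best final state
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

With a high self-transition probability (e.g., 0.9 stay vs 0.1 switch), a single window weakly favoring a different activity is overruled by its temporal context, which is exactly the smoothing effect the HMM studies exploit.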
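Since every row of the table reports F1-score and accuracy, a short reminder of how these metrics are defined may help when comparing rows. The sketch below (our own helper, not from the cited studies) computes both for a single positive class:

```python
def f1_and_accuracy(y_true, y_pred, positive):
    """Per-class F1-score and overall accuracy.
    F1 = 2 * precision * recall / (precision + recall)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return f1, accuracy
```

Note that in imbalanced free-living data, accuracy can be high while F1 is much lower (compare the CNN classification row: F1 73.4% vs accuracy 96.8%), because accuracy is dominated by the majority class while F1 is not.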