Sensors. 2022 Oct 21;22(20):8060. doi: 10.3390/s22208060

Table 1.

Summary of HAR-related work presented in this section.

| Work | Used Methods | Limitations |
| --- | --- | --- |
| Ignatov [22], 2018 | CNN + Statistical Features | Statistical feature extraction requires additional computational cost |
| Xia et al. [23], 2020 | CNN + LSTM | The model depth and layer diversity increase the model complexity |
| Nafea et al. [24], 2021 | CNN + BiLSTM | |
| Yin et al. [25], 2022 | CNN + BiLSTM + Attention | LSTM and GRU RNNs suffer from increased computation time, limiting their applicability to edge inference |
| Tan et al. [26], 2022 | Conv1D + GRU + Ensemble learning | |
| Pushpalatha and Math [27], 2022 | CNN + GRU + FC | Models tested on a single dataset do not establish the model's generalization capability |
| Sikder et al. [28], 2019 | CNN | Such a DNN increases the computational cost of the model |
| Luwe et al. [29], 2022 | CNN + BiLSTM | A DNN with hybrid layers increases the complexity and computational cost of the proposed classifier |
| Ronald et al. [30], 2021 | CNN + BiLSTM + Inception + ResNet | Such a deep model is a poor fit for edge inference, which requires smaller models with reduced computational cost |
| Sannara EK [32], 2022 | CNN + Transformer | The number of parameters exceeds 1 million |
| Tang et al. [33], 2021 | Teacher–Student CNN | |
| Rahimi Taghanaki et al. [34], 2021 | CNN + FC + Transfer Learning | Results achieved by self-supervised and semi-supervised models fall behind their supervised counterparts by a considerable margin |
| Taghanaki et al. [35], 2022 | CNN + STFT + Transfer Learning | |
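To make the complexity limitations in Table 1 concrete, the sketch below computes parameter counts for a hypothetical Conv1D + BiLSTM activity classifier. This is my own illustration, not code or layer sizes taken from any cited work; the standard parameter formulas for Conv1D and LSTM layers show how quickly even a small hybrid model grows.

```python
# Illustrative parameter-count arithmetic for the hybrid architectures in
# Table 1. All layer sizes below are hypothetical, chosen only to show the
# order of magnitude; they are not drawn from any cited paper.

def conv1d_params(in_ch: int, out_ch: int, kernel: int) -> int:
    """Weights (kernel * in_ch * out_ch) plus one bias per output channel."""
    return kernel * in_ch * out_ch + out_ch

def lstm_params(input_dim: int, hidden: int) -> int:
    """Four gates, each with input + recurrent weights and a bias vector."""
    return 4 * (hidden * (input_dim + hidden) + hidden)

def dense_params(in_dim: int, out_dim: int) -> int:
    """Fully connected layer: weight matrix plus biases."""
    return in_dim * out_dim + out_dim

# Hypothetical CNN + BiLSTM classifier: 9 inertial channels, 6 activities.
total = (
    conv1d_params(9, 64, 3)        # Conv1D block 1
    + conv1d_params(64, 128, 3)    # Conv1D block 2
    + 2 * lstm_params(128, 128)    # BiLSTM = forward + backward LSTM
    + dense_params(256, 6)         # FC head on concatenated final states
)
print(total)  # 291206 trainable parameters for even this small hybrid
```

Roughly 90% of the count here comes from the BiLSTM alone, which is consistent with the recurrent-layer cost noted in the table, and a Transformer encoder stack scales past the 1-million-parameter mark far faster still.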