
Table 1.

Abbreviations and symbols.

3D: Three dimensions
ABW: Activity-based windowing
ADL: Activities of daily living
AFE: Analogue front end
CCA: Canonical correlation analysis
CFS
CNN: Convolutional neural network
CPD: Change point detection
CSS: Contact switch sensors
DBN: Deep belief network
DFT: Discrete Fourier transform
DL: Deep learning
DT: Decision tree
HAR: Human activity recognition
HARS: Human activity recognition system
HMM: Hidden Markov model
IMU: Inertial measurement unit (gyroscope, accelerometer, and magnetic sensors)
KNN: K-nearest neighbor
LDA: Linear discriminant analysis
L-SSW: Last-state sensor windowing
LSTM: Long short-term memory
MEMS: Microelectromechanical systems
Mhealth: Mobile health
NN: Neural network
PCA: Principal component analysis
PI: Passive infrared
PN: Number of participants
PWM: Pulse width modulation
QDA: Quadratic discriminant analysis
REALDISP: REAListic sensor DISPlacement
RFID: Radio frequency identification
RNN: Recurrent neural network
STEW: Sensor dependency extension windowing
SDW: Sensor-dependent windowing
SEW: Sensor event-based windowing
SHCS: Smart healthcare system
SVM: Support vector machine
TBW: Time-based windowing
TP: Number of true positives
TSW: Time slice-based windowing
ANN: Artificial neural network
BSS: Blind source separation
CRF: Conditional random field
DLC: Deep learning-based classification
DLS: Deep learning-based semisupervised model
DLF: Deep learning-based features
DBN: Dynamic Bayesian network
EM: Expectation-maximization
FA: Factor analysis
FP: Number of false positives
FN: Number of false negatives
GMM: Gaussian mixture model
ICA: Independent component analysis
LS: Least squares
NB: Naïve Bayes
RF: Random forest
RBF: Radial basis function
RBM: Restricted Boltzmann machine
SBHAR: Smartphone-based HAR
TCM: Time complexity in modeling
TCR: Time complexity in recognition

w_i: Ratio of class i samples among all samples
i_t, o_t, f_t: Input, output, and forget gates at time t, respectively
F: Forget gate
I: Input gate
O: Output gate
C: Cell activation vectors
h (all): Hidden values
a_t: Input to the memory cell layer at time t
c_t: State of the memory cell at time t
c_(t−1): Cell output at the previous time step
b_i, b_f, b_c, b_o: Bias vectors
W_ai, W_hi, W_ci, W_af, W_hf, W_cf, W_ac, W_hc, W_ao, W_ho, W_co: Weight matrices; W_ai is the input-input gate matrix, W_hi is the hidden-input gate matrix, and the remaining matrices are named analogously
σ (all): Non-linear functions
K: Kernel function
N: Total number of samples
n_i: Number of samples in the ith class
Recall_i: Ratio of samples of class i correctly predicted among all actual samples of class i
Precision_i: Ratio of samples of class i correctly predicted among all samples predicted as class i
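
The LSTM-related symbols above (a_t, h, c_t, c_(t−1), the gate vectors i_t, f_t, o_t, the W weight matrices, the b bias vectors, and the non-linearities σ) fit together as in a standard peephole LSTM cell. The sketch below follows the common Graves-style formulation with tanh cell activations; this is an assumption about the exact variant, not a transcription from the paper.

```latex
% Standard peephole LSTM cell (Graves-style); \sigma denotes the gate
% non-linearities and \odot element-wise multiplication.
\begin{align*}
i_t &= \sigma\left(W_{ai} a_t + W_{hi} h_{t-1} + W_{ci} c_{t-1} + b_i\right) \\
f_t &= \sigma\left(W_{af} a_t + W_{hf} h_{t-1} + W_{cf} c_{t-1} + b_f\right) \\
% Cell state update: keep part of c_{t-1} via f_t, add new candidate content via i_t.
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh\left(W_{ac} a_t + W_{hc} h_{t-1} + b_c\right) \\
o_t &= \sigma\left(W_{ao} a_t + W_{ho} h_{t-1} + W_{co} c_t + b_o\right) \\
% Hidden output at time t.
h_t &= o_t \odot \tanh\left(c_t\right)
\end{align*}
```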
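The evaluation symbols (TP, FP, FN, n_i, N, w_i, Precision_i, Recall_i) relate in the usual way. A minimal LaTeX sketch follows; the per-class counts TP_i, FP_i, FN_i and the weighted averages are written from the standard definitions, not quoted from the paper.

```latex
\begin{align*}
% Per-class precision and recall, with TP_i, FP_i, FN_i counted for class i.
\mathrm{Precision}_i &= \frac{TP_i}{TP_i + FP_i}, \qquad
\mathrm{Recall}_i = \frac{TP_i}{TP_i + FN_i} \\
% Class weight: the ratio of class i samples among all N samples.
w_i &= \frac{n_i}{N} \\
% Support-weighted averages over classes (a common use of w_i; assumed here).
\overline{\mathrm{Precision}} &= \sum_i w_i\, \mathrm{Precision}_i , \qquad
\overline{\mathrm{Recall}} = \sum_i w_i\, \mathrm{Recall}_i
\end{align*}
```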