Sensors. 2021 Jul 1;21(13):4535. doi: 10.3390/s21134535

Table 1. Neural network architectures and hyperparameters.

Joint Angles

ANN    Layers      Learning Rate   Dropout   Activation
MLP    6000–4000   0.0003          0.5       relu
LSTM   32–32       0.0003          0.7       tanh
CNN    3000–6000   0.00003         0.4       relu

Joint Moments

ANN    Layers      Learning Rate   Dropout   Activation
MLP    3000–1000   0.0003          0.5       relu
LSTM   128–1024    0.0003          0.4       tanh
CNN    2000–4000   0.0001          0.4       relu
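As a rough illustration of the table, the hyperparameter sets above can be collected into plain configuration records; the helper names below (`NetConfig`, `dense_param_count`) are hypothetical and not from the paper, but the numbers match Table 1. The parameter-count helper also shows the scale these widths imply for the fully connected networks.

```python
from dataclasses import dataclass

# Hypothetical config record; values copied from Table 1.
@dataclass(frozen=True)
class NetConfig:
    layers: tuple          # hidden-layer widths (units per layer)
    learning_rate: float
    dropout: float
    activation: str

JOINT_ANGLES = {
    "MLP":  NetConfig((6000, 4000), 3e-4, 0.5, "relu"),
    "LSTM": NetConfig((32, 32),     3e-4, 0.7, "tanh"),
    "CNN":  NetConfig((3000, 6000), 3e-5, 0.4, "relu"),
}

JOINT_MOMENTS = {
    "MLP":  NetConfig((3000, 1000), 3e-4, 0.5, "relu"),
    "LSTM": NetConfig((128, 1024),  3e-4, 0.4, "tanh"),
    "CNN":  NetConfig((2000, 4000), 1e-4, 0.4, "relu"),
}

def dense_param_count(widths):
    """Weights + biases for a stack of fully connected layers."""
    return sum(n_in * n_out + n_out for n_in, n_out in zip(widths, widths[1:]))

# The 6000 -> 4000 stage of the joint-angle MLP alone holds
# 6000*4000 + 4000 trainable parameters.
print(dense_param_count((6000, 4000)))  # → 24004000
```

Such records could then be fed to whatever model-building code is in use (e.g. stacking `Dense`/`Dropout` layers in Keras or `nn.Linear` modules in PyTorch with the listed widths, dropout rates, and activations).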