Information Fusion. 2021 Mar 27;74:50–64. doi: 10.1016/j.inffus.2021.03.005

Table 3.

LSTM hyperparameters.

Parameter                 Value
Number of LSTM layers     5
Neurons per layer         128
Optimizer                 Adam
Dropout rate              0.2
Output activation         Softmax
Epochs                    300
Batch size                32
Training data             75%
Testing data              25%
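As a rough illustration, the configuration in Table 3 could be assembled in Keras as sketched below. The layer count, units, dropout, optimizer, and softmax output come from the table; the input shape (timesteps, features), the number of output classes, and the categorical cross-entropy loss are assumptions not stated in the table.

```python
# Hypothetical Keras sketch of the LSTM configuration in Table 3.
# Values in HPARAMS are taken from the table; everything passed to
# build_model (input shape, number of classes) is an assumption.

HPARAMS = {
    "layers": 5,        # Number of LSTM layers
    "units": 128,       # Neurons in each layer
    "optimizer": "adam",
    "dropout": 0.2,
    "epochs": 300,
    "batch_size": 32,
    "train_split": 0.75,  # 75% training / 25% testing
}

def build_model(timesteps, n_features, n_classes):
    """Stack 5 LSTM layers of 128 units with dropout 0.2,
    ending in a softmax layer, compiled with Adam."""
    from tensorflow import keras
    from tensorflow.keras import layers

    model = keras.Sequential()
    model.add(layers.Input(shape=(timesteps, n_features)))
    for i in range(HPARAMS["layers"]):
        # All but the last LSTM layer return full sequences
        # so the layers can be stacked.
        model.add(layers.LSTM(
            HPARAMS["units"],
            return_sequences=(i < HPARAMS["layers"] - 1)))
        model.add(layers.Dropout(HPARAMS["dropout"]))
    model.add(layers.Dense(n_classes, activation="softmax"))
    model.compile(optimizer=HPARAMS["optimizer"],
                  loss="categorical_crossentropy",  # assumed loss
                  metrics=["accuracy"])
    return model
```

Training would then use `model.fit(..., epochs=HPARAMS["epochs"], batch_size=HPARAMS["batch_size"])` on the 75/25 split from the table.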