Sensors. 2022 Jan 5;22(1):403. doi: 10.3390/s22010403

Figure 4. Model training graphs with (a) TanH activation functions in all layers of the network and (b) TanH activations in the LSTM sub-network and ReLU activation functions in the MLP sub-network.
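To make the two activation configurations concrete, the sketch below shows a hybrid LSTM + MLP model in PyTorch corresponding to variant (b): the LSTM sub-network keeps its default TanH activations while the MLP sub-network uses ReLU. All layer sizes, class names, and input dimensions are illustrative assumptions, not values taken from the paper; swapping the ReLU layers for TanH reproduces variant (a).

```python
import torch
import torch.nn as nn


class LSTMMLPModel(nn.Module):
    """LSTM sub-network (TanH) followed by an MLP sub-network (ReLU), as in variant (b)."""

    def __init__(self, n_features=8, hidden_size=64, mlp_hidden=32, n_outputs=1):
        super().__init__()
        # nn.LSTM applies TanH internally by default, matching the
        # "TanH activations in the LSTM sub-network" part of the caption.
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True)
        # MLP sub-network with ReLU activations (variant (b)); replacing
        # nn.ReLU() with nn.Tanh() would give variant (a).
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, mlp_hidden),
            nn.ReLU(),
            nn.Linear(mlp_hidden, n_outputs),
        )

    def forward(self, x):
        # x has shape (batch, seq_len, n_features)
        _, (h_n, _) = self.lstm(x)
        # Feed the final hidden state of the LSTM into the MLP head.
        return self.mlp(h_n[-1])


# Usage example: a batch of 4 sequences, 20 time steps, 8 features each.
model = LSTMMLPModel()
y = model(torch.randn(4, 20, 8))
print(y.shape)  # torch.Size([4, 1])
```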