
Table 1. Layer parameters of the proposed network model.

| Name of the Layer | Layer Parameters | Number of Layers |
| --- | --- | --- |
| Input | 40 × 1 | 1 |
| Convolutional | Filter size = 5 × 1, number of filters = 256, stride = 1, dropout = 20%, activation = ReLU | 3 |
| Recurrent | RNN type = LSTM, layer size = 256, batch size = 32, optimizer = Adam, learning rate = 0.001, dropout = 20%, activation = ReLU | 2 |
| Fully connected | Layer size = 256, dropout = 20%, activation = Softmax | 1 |
| Transcription | CTC loss function | 1 |
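For illustration, a minimal sketch of the layer stack in Table 1 is shown below, assuming a Keras/TensorFlow implementation. The output vocabulary size, input sequence handling, and the exact wiring of the CTC transcription layer are assumptions not specified in the table and may differ from the authors' implementation.

```python
# Sketch of the Table 1 architecture (assumed Keras/TensorFlow realization).
import tensorflow as tf
from tensorflow.keras import layers, models

inputs = layers.Input(shape=(None, 40))  # 40 x 1 feature vector per time step
x = inputs

# 3 convolutional layers: filter size 5, 256 filters, stride 1, ReLU, 20% dropout
for _ in range(3):
    x = layers.Conv1D(filters=256, kernel_size=5, strides=1,
                      padding="same", activation="relu")(x)
    x = layers.Dropout(0.2)(x)

# 2 recurrent LSTM layers of size 256 with 20% dropout (Table 1 lists ReLU activation)
for _ in range(2):
    x = layers.LSTM(256, return_sequences=True,
                    activation="relu", dropout=0.2)(x)

# Fully connected layer of size 256 with softmax and 20% dropout, as in Table 1
x = layers.Dropout(0.2)(x)
outputs = layers.Dense(256, activation="softmax")(x)

model = models.Model(inputs, outputs)

# Transcription layer: training would apply tf.nn.ctc_loss to these outputs,
# optimized with Adam (learning rate 0.001) and a batch size of 32.
model.summary()
```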