Table 1.
Name of the Layer | Layer Parameters | Number of Layers |
---|---|---|
Input | Input shape = 40 × 1 | 1 |
Convolutional | Filter size = 5 × 1, number of filters = 256, stride = 1, dropout = 20%, activation = ReLU | 3 |
Recurrent | RNN type = LSTM, layer size = 256, batch size = 32, optimizer = Adam, learning rate = 0.001, dropout = 20%, activation = ReLU | 2 |
Fully connected | Layer size = 256, dropout = 20%, activation = Softmax | 1 |
Transcription | CTC loss function | 1 |
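The architecture in Table 1 can be sketched as a convolutional-recurrent network, here in PyTorch. This is a minimal illustration assembled from the table alone: the exact wiring, padding, and the number of output classes (taken here to equal the fully connected layer size of 256) are assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class CRNN(nn.Module):
    """Hypothetical sketch of the Table 1 architecture (CNN + LSTM + CTC)."""

    def __init__(self, n_features=40, n_classes=256):
        super().__init__()
        convs = []
        in_ch = n_features
        for _ in range(3):  # 3 convolutional layers (Table 1)
            convs += [
                # filter size 5, 256 filters, stride 1; padding is an assumption
                nn.Conv1d(in_ch, 256, kernel_size=5, stride=1, padding=2),
                nn.ReLU(),
                nn.Dropout(0.2),  # 20% dropout
            ]
            in_ch = 256
        self.conv = nn.Sequential(*convs)
        # 2 LSTM layers, hidden size 256, 20% dropout between layers
        self.lstm = nn.LSTM(256, 256, num_layers=2,
                            batch_first=True, dropout=0.2)
        self.fc = nn.Linear(256, n_classes)  # fully connected layer, size 256

    def forward(self, x):
        # x: (batch, time, n_features); convolve along the time axis
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x, _ = self.lstm(x)
        # CTC training consumes log-probabilities over the label set
        return self.fc(x).log_softmax(dim=-1)
```

Training would then pair this model with `nn.CTCLoss` and the Adam optimizer at a learning rate of 0.001 and batch size 32, per the table.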