Front Genet. 2020 Aug 12;11:900. doi: 10.3389/fgene.2020.00900

TABLE 3.

Hyper-parameters of NanoReviser.

Layer            Feature source       Hyper-parameters        Value
CNN              Signal input         Kernel size             1 × 3
                                      Number of filters       8
                                      Stride                  1
                                      Activation function     ReLU^a
Bi-LSTM          Read input           State size              16
                                      Activation function     tanh^b
Bi-LSTM          Concatenated input   State size              64
                                      Activation function     tanh^b
Dropout          Concatenated input   Dropout rate            0.2
Center loss      Concatenated input   Proportion              0.2
Adam optimizer                        Initial learning rate   0.002
                                      Decay rate              0.05
                                      Beta_1                  0.9
                                      Beta_2                  0.999

^a ReLU is one of the most commonly used activation functions in CNN layers. ^b tanh is the default activation function in LSTM layers.
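For illustration, the sketch below wires a Keras/TensorFlow model with the hyper-parameters from Table 3: a CNN branch over the raw signal, a Bi-LSTM branch over the basecalled read, and a second Bi-LSTM plus dropout over the concatenated features. It is a minimal sketch, not the authors' NanoReviser implementation: the window length `WINDOW`, alphabet size `N_BASES`, padding choice, softmax output head, and the use of `InverseTimeDecay` to realize the tabulated decay rate are all assumptions, and the center-loss term (proportion 0.2) is noted but omitted.

```python
# Minimal sketch of a network using the Table 3 hyper-parameters.
# Input shapes, layer names, and the output head are illustrative
# assumptions; this is not the authors' exact NanoReviser code.
import tensorflow as tf
from tensorflow.keras import layers, optimizers, Model

WINDOW = 13   # assumed signal-window length (not given in Table 3)
N_BASES = 5   # assumed alphabet: A, C, G, T plus a blank/gap symbol

# CNN branch over the raw signal: 8 filters, 1 x 3 kernel, stride 1, ReLU.
signal_in = layers.Input(shape=(WINDOW, 1), name="signal_input")
conv = layers.Conv1D(filters=8, kernel_size=3, strides=1,
                     activation="relu", padding="same")(signal_in)

# Bi-LSTM branch over the basecalled read: state size 16, tanh (default).
read_in = layers.Input(shape=(WINDOW, N_BASES), name="read_input")
read_feat = layers.Bidirectional(
    layers.LSTM(16, return_sequences=True))(read_in)

# Concatenated input: a second Bi-LSTM (state size 64), then dropout 0.2.
merged = layers.Concatenate()([conv, read_feat])
merged = layers.Bidirectional(
    layers.LSTM(64, return_sequences=True))(merged)
merged = layers.Dropout(0.2)(merged)

# Assumed per-position softmax output head for the corrected bases.
out = layers.Dense(N_BASES, activation="softmax")(merged)
model = Model(inputs=[signal_in, read_in], outputs=out)

# Adam with the tabulated settings; InverseTimeDecay applies the
# decay rate of 0.05 to the initial learning rate of 0.002.
lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    initial_learning_rate=0.002, decay_steps=1, decay_rate=0.05)
opt = optimizers.Adam(learning_rate=lr_schedule, beta_1=0.9, beta_2=0.999)

# Table 3's center loss (proportion 0.2) would be added as an auxiliary
# loss term weighted 0.2 alongside cross-entropy; omitted here for brevity.
model.compile(optimizer=opt, loss="categorical_crossentropy")
model.summary()
```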