Table 4. Hyperparameters of the deep learning architectures.

| Deep Learning Architecture | Hyperparameters |
|---|---|
| Bi-LSTM | Number of layers: 2; Layer 1 units: 32; Layer 2 units: 16; Activation function: ReLU |
| DNN | Number of layers: 2; Layer 1 units: 64; Layer 2 units: 64; Activation function: Sigmoid |
| CNN1D | Number of layers: 1; Layer 1 units: 30; Activation function: ReLU; Filter size: 10 × 1 |
| Bi-GRU | Number of layers: 2; Layer 1 units: 32; Layer 2 units: 16; Activation function: ReLU |
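As a minimal sketch, the four architectures in Table 4 could be instantiated in Keras as below. The input shapes, the pooling layer in the CNN1D, and the final sigmoid output head are assumptions for illustration; the table specifies only the hidden layers, unit counts, activations, and filter size.

```python
# Hedged Keras sketch of the Table 4 architectures.
# Input shapes and output heads are assumptions, not given in the table.
from tensorflow.keras import layers, models


def build_bilstm(timesteps, features):
    """Bi-LSTM: 2 layers, 32 and 16 units, ReLU activation."""
    return models.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.Bidirectional(layers.LSTM(32, activation="relu",
                                         return_sequences=True)),
        layers.Bidirectional(layers.LSTM(16, activation="relu")),
        layers.Dense(1, activation="sigmoid"),  # assumed binary output head
    ])


def build_dnn(features):
    """DNN: 2 dense layers of 64 units each, Sigmoid activation."""
    return models.Sequential([
        layers.Input(shape=(features,)),
        layers.Dense(64, activation="sigmoid"),
        layers.Dense(64, activation="sigmoid"),
        layers.Dense(1, activation="sigmoid"),  # assumed output head
    ])


def build_cnn1d(timesteps, features):
    """CNN1D: 1 conv layer, 30 filters, 10 x 1 filter size, ReLU."""
    return models.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.Conv1D(filters=30, kernel_size=10, activation="relu"),
        layers.GlobalMaxPooling1D(),            # assumed pooling before head
        layers.Dense(1, activation="sigmoid"),  # assumed output head
    ])


def build_bigru(timesteps, features):
    """Bi-GRU: 2 layers, 32 and 16 units, ReLU activation."""
    return models.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.Bidirectional(layers.GRU(32, activation="relu",
                                        return_sequences=True)),
        layers.Bidirectional(layers.GRU(16, activation="relu")),
        layers.Dense(1, activation="sigmoid"),  # assumed output head
    ])
```

Each builder returns an uncompiled `Sequential` model; loss, optimizer, and training settings are not covered by Table 4 and would be set at compile time.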