Table 4. Neural network training hyperparameters.
Parameters | Values | Description |
---|---|---|
Activation function 1 | ReLU | Rectified linear unit |
Activation function 2 | ReLU | Rectified linear unit |
Loss function | CELoss | Cross-entropy loss |
Batch size | 5000 | Samples per training batch |
Learning rate | 0.001 | Step size for weight updates |
Epochs | 2000 | Full passes over the training data |
Number of hidden layers | 2 | |
Number of neurons in hidden layers | 25 | Per hidden layer |
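
The sketch below illustrates how a model matching these hyperparameters might be assembled in PyTorch. The input and output dimensions (`IN_FEATURES`, `NUM_CLASSES`) are assumptions not given in Table 4, and the choice of the Adam optimizer is likewise an assumption; only the layer count, layer width, activations, loss, learning rate, batch size, and epoch count come from the table.

```python
import torch
import torch.nn as nn

# Assumed dimensions -- not specified in Table 4; they depend on the dataset.
IN_FEATURES = 10   # assumed input dimension
NUM_CLASSES = 3    # assumed number of output classes
HIDDEN_UNITS = 25  # neurons per hidden layer (Table 4)

# Two hidden layers with ReLU activations, as listed in Table 4.
model = nn.Sequential(
    nn.Linear(IN_FEATURES, HIDDEN_UNITS),
    nn.ReLU(),                                  # activation function 1
    nn.Linear(HIDDEN_UNITS, HIDDEN_UNITS),
    nn.ReLU(),                                  # activation function 2
    nn.Linear(HIDDEN_UNITS, NUM_CLASSES),
)

loss_fn = nn.CrossEntropyLoss()                 # CELoss (Table 4)

# Adam is an assumed optimizer choice; Table 4 gives only the learning rate.
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# Training-loop skeleton: batch size 5000, 2000 epochs (Table 4).
# `train_loader` is a hypothetical DataLoader yielding (inputs, labels)
# batches of size 5000.
# for epoch in range(2000):
#     for inputs, labels in train_loader:
#         optimizer.zero_grad()
#         loss = loss_fn(model(inputs), labels)
#         loss.backward()
#         optimizer.step()
```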