Materials. 2024 Feb 5;17(3):764. doi: 10.3390/ma17030764

Table 4.

Network architecture and hyperparameters.

| Parameter                           | Value  | Description                       |
|-------------------------------------|--------|-----------------------------------|
| Activation function 1               | ReLU   | Rectified linear activation unit  |
| Activation function 2               | ReLU   |                                   |
| Loss function                       | CELoss | Cross-entropy loss                |
| Batch size                          | 5000   |                                   |
| Learning rate                       | 0.001  |                                   |
| Epochs                              | 2000   |                                   |
| Number of hidden layers             | 2      |                                   |
| Number of neurons per hidden layer  | 25     |                                   |
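The architecture in Table 4 can be sketched as a small feed-forward classifier: two hidden layers of 25 neurons with ReLU activations, cross-entropy loss, and a batch of 5000 samples. This is a minimal NumPy sketch, not the authors' implementation; the input dimension and number of classes (`n_in`, `n_classes`) are placeholders, since the table does not list them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_classes = 8, 25, 3  # n_in and n_classes are assumed values

def relu(x):
    # Rectified linear activation unit (activation functions 1 and 2)
    return np.maximum(x, 0.0)

def softmax(z):
    # Numerically stable softmax over the class axis
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Weights for input -> hidden 1 -> hidden 2 -> output
W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
W2 = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
W3 = rng.normal(0.0, 0.1, (n_hidden, n_classes))

def forward(X):
    h1 = relu(X @ W1)        # hidden layer 1, ReLU
    h2 = relu(h1 @ W2)       # hidden layer 2, ReLU
    return softmax(h2 @ W3)  # class probabilities

def cross_entropy(probs, y):
    # CELoss from the table, averaged over the batch
    return -np.log(probs[np.arange(len(y)), y]).mean()

X = rng.normal(size=(5000, n_in))       # batch size 5000, as in the table
y = rng.integers(0, n_classes, 5000)
probs = forward(X)
loss = cross_entropy(probs, y)
```

In a full training loop these weights would be updated by an optimizer with learning rate 0.001 for 2000 epochs, per the table.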