Table 3. Results of the ablation study with different activation functions.
Activation function | Epochs | Learning rate | Batch size | Optimizer | Backbone | Accuracy |
---|---|---|---|---|---|---|
Mish | 100 | 0.001 | 2 | Adam | EfficientNet | 0.92 |
Swish | 100 | 0.001 | 2 | Adam | EfficientNet | 0.91 |
GeLU | 100 | 0.001 | 2 | Adam | EfficientNet | 0.92 |
PReLU | 100 | 0.001 | 2 | Adam | EfficientNet | 0.94 |
ReLU | 100 | 0.001 | 2 | Adam | EfficientNet | **0.96** |
Note: the best result is shown in bold.
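The ablation keeps every training hyperparameter fixed (100 epochs, learning rate 0.001, batch size 2, Adam, EfficientNet backbone) and varies only the activation function. The sketch below is a minimal illustration of how such a trial could be assembled in PyTorch, not the authors' implementation: the EfficientNet variant (B0 from torchvision), the number of output classes, and the module-swapping approach are assumptions, and nn.SiLU is used as PyTorch's implementation of Swish.

```python
# Minimal sketch of one ablation trial from Table 3 (illustrative, not the authors' code).
import torch
import torch.nn as nn
from torchvision import models

# Candidate activations from Table 3; nn.SiLU is PyTorch's name for Swish.
ACTIVATIONS = {
    "Mish": nn.Mish,
    "Swish": nn.SiLU,
    "GeLU": nn.GELU,
    "PReLU": nn.PReLU,
    "ReLU": nn.ReLU,
}

def swap_activations(module: nn.Module, act_cls) -> nn.Module:
    """Recursively replace every SiLU (EfficientNet's default activation) with act_cls."""
    for name, child in module.named_children():
        if isinstance(child, nn.SiLU):
            setattr(module, name, act_cls())
        else:
            swap_activations(child, act_cls)
    return module

def build_trial(act_name: str, num_classes: int = 2):
    """Build one trial: EfficientNet backbone with the activation under test.

    The B0 variant and num_classes are assumptions; the table does not specify them.
    """
    model = models.efficientnet_b0(weights=None)
    model.classifier[1] = nn.Linear(model.classifier[1].in_features, num_classes)
    swap_activations(model, ACTIVATIONS[act_name])
    # Shared hyperparameters from Table 3: Adam with lr = 0.001.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    return model, optimizer

# Each trial would then be trained for 100 epochs with batch size 2.
model, optimizer = build_trial("ReLU")
```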