2022 Jun 9;13:808380. doi: 10.3389/fpls.2022.808380

TABLE 3.

Comparative analysis of different activation functions.

Activation function (improved EfficientNetV2 method)    Accuracy (%)
Sigmoid                                                 99.89
ReLU                                                    99.93
PReLU                                                   99.94
LeakyReLU                                               99.96
Swish                                                   99.99
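For reference, the five activation functions compared in the table can be sketched as scalar functions. This is a minimal illustration of their standard mathematical definitions, not the paper's implementation; the `alpha` and `beta` parameter names and default values are assumptions (in PReLU, alpha is learned during training rather than fixed).

```python
import math

def sigmoid(x: float) -> float:
    # Sigmoid: squashes input to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x: float) -> float:
    # ReLU: zero for negative inputs, identity otherwise
    return max(0.0, x)

def leaky_relu(x: float, alpha: float = 0.01) -> float:
    # LeakyReLU: small fixed slope (alpha assumed 0.01) for negative inputs
    return x if x > 0 else alpha * x

def prelu(x: float, alpha: float) -> float:
    # PReLU: same form as LeakyReLU, but alpha is a learned parameter
    return x if x > 0 else alpha * x

def swish(x: float, beta: float = 1.0) -> float:
    # Swish: x * sigmoid(beta * x); the best performer in the table above
    return x * sigmoid(beta * x)
```

Swish is smooth and non-monotonic near zero, which is often cited as a reason it can outperform ReLU-family activations in deep networks.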