Diagnostics. 2023 Jan 19;13(3):385. doi: 10.3390/diagnostics13030385

Table 13.

Comparison of various activation functions in terms of classification accuracy.

Model                          Sigmoid   Tanh   ReLU   Leaky-ReLU   GELU
ShuffleNet                     89%       89%    88%    92%          90%
ShuffleNet-Light+Inception-v3  96%       96%    97%    96%          99%
ShuffleNet-Light+AlexNet       90%       91%    89%    89%          90%
ShuffleNet-Light+MobileNet     90%       90%    88%    87%          89%

Tanh: hyperbolic tangent function, ReLU: rectified linear unit, Leaky-ReLU: leaky rectified linear unit, GELU: Gaussian error linear unit.
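For reference, the activation functions compared above can be sketched in plain Python (a minimal scalar-form illustration, not the implementation used in the paper; GELU is given in its exact erf formulation rather than the common tanh approximation):

```python
import math

def sigmoid(x: float) -> float:
    # Logistic sigmoid: squashes input to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x: float) -> float:
    # Hyperbolic tangent: squashes input to (-1, 1)
    return math.tanh(x)

def relu(x: float) -> float:
    # Rectified linear unit: zero for negative inputs, identity otherwise
    return max(0.0, x)

def leaky_relu(x: float, alpha: float = 0.01) -> float:
    # Leaky ReLU: small slope alpha for negative inputs avoids "dead" units
    # (alpha = 0.01 is a common default, not specified in the paper)
    return x if x > 0.0 else alpha * x

def gelu(x: float) -> float:
    # Gaussian error linear unit, exact form: x * Phi(x),
    # where Phi is the standard normal CDF expressed via erf
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```

Unlike ReLU, which zeroes all negative inputs, Leaky-ReLU and GELU pass a small signal for negative inputs, which can help gradient flow during training.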