Heliyon. 2024 Feb 9;10(5):e25757. doi: 10.1016/j.heliyon.2024.e25757

Table 4. Comparative analysis of Swish and other activation functions on the DFDC and FF++ datasets.

| Activation function | Accuracy (%) | Avg. training time (s) | Avg. classification time (s) | Remarks |
| --- | --- | --- | --- | --- |
| Sigmoid | 94.0 | 1110 | 2549 | Cannot simulate Boolean gates |
| Swish | 98.0 | 1166 | 3057 | Worth trying in very deep networks |
| Mish | 98.35 | 1155 | 3524 | Few implementations; not yet mature |
| Tanh | 90.0 | 1173 | 2950 | Commonly used in recurrent neural networks |
| ReLU | 97.0 | 1050 | 2405 | Prone to the "dying ReLU" problem |
| Leaky ReLU | 97.5 | 1231 | 2903 | Use only if expecting the dying-ReLU problem |
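For reference, the activation functions compared above each reduce to a one-line elementwise definition. The following is a minimal NumPy sketch of those definitions; the Swish scale beta = 1 and the leaky-ReLU slope alpha = 0.01 are common defaults, assumed here since the table does not state the hyperparameters used:

```python
import numpy as np

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x)); saturates for large |x|
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); smooth and non-monotonic
    return x * sigmoid(beta * x)

def mish(x):
    # Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x)
    return x * np.tanh(np.log1p(np.exp(x)))

def relu(x):
    # ReLU: max(0, x); gradient is zero for x < 0, which can
    # leave units permanently inactive (the "dying ReLU" problem)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for x < 0 keeps gradients alive
    return np.where(x >= 0, x, alpha * x)

# tanh is available directly as np.tanh
```

These definitions are consistent with the timing columns in the table: Swish and Mish each add an extra elementwise transcendental on top of a multiplication, which matches their longer classification times relative to plain ReLU.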