PeerJ Comput Sci. 2024 Feb 29;10:e1915. doi: 10.7717/peerj-cs.1915

Table 7. Results of activation functions on the training set.

Activation function    Accuracy    Loss
Sigmoid                0.6541      0.8521
tanh                   0.7551      0.6170
Leaky-ReLU             0.7615      0.5952
ELU                    0.7414      0.6412
SELU                   0.7299      0.6698
SoftPlus               0.7078      0.7197
ReLU                   0.7630      0.5904

Note: The optimum value in each column is achieved by ReLU (highest accuracy, 0.7630; lowest loss, 0.5904).
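The training script behind this comparison is not reproduced in the article, so the following is only a minimal sketch of how such a sweep over activation functions could be run in Keras. The network width and depth, optimizer, epoch count, and the synthetic data are illustrative assumptions, not the authors' actual setup; Leaky-ReLU is passed as the callable tf.nn.leaky_relu because a string alias for it is not available in every TensorFlow version.

```python
# Sketch: sweep activation functions and record final training
# accuracy/loss, analogous to the comparison in Table 7.
# Architecture, data, and hyperparameters below are placeholders.
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

ACTIVATIONS = {
    "Sigmoid":    "sigmoid",
    "tanh":       "tanh",
    "Leaky-ReLU": tf.nn.leaky_relu,  # callable; default alpha=0.2 (assumed)
    "ELU":        "elu",
    "SELU":       "selu",
    "SoftPlus":   "softplus",
    "ReLU":       "relu",
}

# Placeholder data: 1,000 samples, 32 features, 4 classes (assumed shapes).
rng = np.random.default_rng(0)
x_train = rng.normal(size=(1000, 32)).astype("float32")
y_train = rng.integers(0, 4, size=(1000,))

results = {}
for name, act in ACTIVATIONS.items():
    # Fresh model per activation so weights are re-initialized each run.
    model = keras.Sequential([
        keras.Input(shape=(32,)),
        layers.Dense(64, activation=act),
        layers.Dense(64, activation=act),
        layers.Dense(4, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train, epochs=10, batch_size=32, verbose=0)
    # Keep the final-epoch training accuracy and loss for the table.
    results[name] = (history.history["accuracy"][-1],
                     history.history["loss"][-1])

# Report runs sorted by training accuracy, best first.
for name, (acc, loss) in sorted(results.items(), key=lambda kv: -kv[1][0]):
    print(f"{name:>10}: accuracy={acc:.4f}, loss={loss:.4f}")
```

With the paper's real model and data substituted in, the printed summary would correspond row-for-row to Table 7.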