Sci Rep. 2024 Mar 4;14:5294. doi: 10.1038/s41598-024-52941-4

Figure 5.


A comparison between the Leaky ReLU, ReLU, softplus and ELU activation functions using an sk-DNN, for 4D with 9 blocks of width 36 (left panel) and for 8D with 9 blocks of width 50 (right panel). Leaky ReLU outperforms the other activation functions, and its advantage is more prominent at 8D than at 4D. We therefore use Leaky ReLU for all experiments with DNNs and sk-DNNs in this work.
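For reference, the four activation functions compared above can be sketched as follows. This is a minimal NumPy illustration; the slope and scale parameters (`alpha`) are common default values and are assumptions, since the caption does not state which values were used in the experiments.

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small negative slope (alpha, assumed 0.01 here)
    # keeps gradients nonzero for x < 0.
    return np.where(x >= 0, x, alpha * x)

def softplus(x):
    # Softplus: log(1 + e^x), a smooth approximation of ReLU,
    # computed here in a numerically stable way.
    return np.logaddexp(0.0, x)

def elu(x, alpha=1.0):
    # ELU: exponential decay toward -alpha for negative inputs
    # (alpha assumed 1.0 here).
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))
```

Unlike ReLU, which zeroes out negative inputs entirely, Leaky ReLU passes a scaled-down copy of them, which is one common explanation for its more stable training behavior in deep networks.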