iScience. 2020 Dec 13;24(1):101936. doi: 10.1016/j.isci.2020.101936

Table 2.

Commonly used activation functions in deep learning

Name (f_a)         Expression
Sigmoid function   f_a(x) = 1/(1 + exp(−x))
ReLU function      f_a(x) = max(0, x)
Tanh function      f_a(x) = (exp(x) − exp(−x))/(exp(x) + exp(−x))
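A minimal sketch of the three activation functions from Table 2, written with Python's standard library; the function names are illustrative and not taken from the source:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x)); squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Rectified linear unit: max(0, x); passes positives through, zeroes negatives
    return max(0.0, x)

def tanh(x):
    # Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x); squashes input into (-1, 1)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))
```

For example, sigmoid(0) evaluates to 0.5 and tanh(0) to 0, the midpoints of their respective output ranges, while relu simply clips negative inputs to zero.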