Table 9. Classification performance (accuracy, sensitivity, specificity, precision, and AUC) for the four activation functions evaluated.
Activation Function | Accuracy | Sensitivity | Specificity | Precision | AUC |
---|---|---|---|---|---|
ReLU | 0.739 | 0.687 | 0.765 | 0.819 | 0.742 |
LeakyReLU | 0.778 | 0.808 | 0.775 | 0.852 | 0.773 |
ELU | 0.765 | 0.784 | 0.809 | 0.822 | 0.764 |
ClippedReLU | 0.756 | 0.799 | 0.780 | 0.834 | 0.761 |
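For reference, a minimal sketch of the four activation functions compared above, using their standard definitions. The LeakyReLU negative slope (0.01), the ELU scale (1.0), and the ClippedReLU ceiling (6.0) are assumed common defaults, not values reported with this table.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # LeakyReLU: x for x > 0, negative_slope * x otherwise
    # (negative_slope = 0.01 is an assumed default, not reported here)
    return np.where(x > 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def clipped_relu(x, ceiling=6.0):
    # ClippedReLU: ReLU with an upper bound (ceiling = 6.0 is an assumed default)
    return np.minimum(np.maximum(0.0, x), ceiling)

if __name__ == "__main__":
    x = np.linspace(-3, 3, 7)
    for name, fn in [("ReLU", relu), ("LeakyReLU", leaky_relu),
                     ("ELU", elu), ("ClippedReLU", clipped_relu)]:
        print(name, np.round(fn(x), 3))
```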