
Figure 5.

VGG16 architecture [26]. ReLU (rectified linear unit) is the activation function used in the network's hidden layers; it passes positive inputs through unchanged and outputs zero for negative inputs. Softmax is the final layer of the network; it maps the raw outputs to values between 0 and 1 that sum to 1, which can be interpreted as class probabilities.
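As a minimal sketch (not the authors' code), a VGG16 classifier of this form can be assembled with TensorFlow/Keras: a pretrained convolutional base, ReLU-activated fully connected layers, and a softmax output. The input shape and num_classes below are assumptions for illustration.

import tensorflow as tf

num_classes = 2  # hypothetical; set to the number of target classes

# Pretrained VGG16 convolutional base, dropping the original 1000-class head.
base = tf.keras.applications.VGG16(
    weights="imagenet",
    include_top=False,
    input_shape=(224, 224, 3),
)

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Flatten(),
    # Fully connected layers with ReLU activations, as in the figure.
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dense(4096, activation="relu"),
    # Softmax output: values in (0, 1) that sum to 1 across classes.
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])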