Sensors. 2020 Jan 28;20(3):723. doi: 10.3390/s20030723

Figure 1.

The basic components of a perceptron comprise the input layer, which can take in an arbitrary number of inputs, s; the weights, w, that map the inputs to the subsequent layer; a bias, b; an activation function, H, that introduces non-linearity; and the output, Z.
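As a minimal sketch of the components named in the caption, the following Python snippet computes the perceptron output Z = H(w · s + b). The choice of NumPy and of a sigmoid as the activation H is an assumption for illustration; the figure does not specify a particular non-linearity or input values.

```python
import numpy as np

def sigmoid(x):
    # Illustrative choice of activation H; not specified in the figure.
    return 1.0 / (1.0 + np.exp(-x))

def perceptron(s, w, b, H=sigmoid):
    # Z = H(w . s + b): weighted sum of inputs s plus bias b, passed through H.
    return H(np.dot(w, s) + b)

# Hypothetical example with three inputs (values are illustrative only).
s = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])
b = 0.2
Z = perceptron(s, w, b)
print(Z)
```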