2019 Jul 5;39:14. doi: 10.1186/s41232-019-0103-3

Fig. 1

a Structure of a simple perceptron. x1, x2, x3, …, xi represent the output signals of the upstream neurons, and each signal is multiplied by its corresponding weight: w1, w2, w3, …, wi. The weighted signals are summed and passed through an activation function; y is the output of the perceptron. b A neural network consisting of multiple layers of perceptrons converts the input signal into the final output signal, which is called the predictive value. The predictive value is compared with the objective value, and the error is calculated by a loss function. Each neuron's signal weight is adjusted to minimize this error by the optimizer, which is based on the backpropagation method.
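
The following is a minimal sketch, not taken from the paper, of the computation the caption describes: the weighted sum and activation of panel a, and a single loss-and-update step in the spirit of panel b. The names (sigmoid activation, squared-error loss, the learning rate of 0.1) are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    """One common choice of activation function (assumed for illustration)."""
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(x, w, bias=0.0):
    """Fig. 1a: sum the weighted input signals and apply the activation function."""
    z = np.dot(w, x) + bias      # sum of x_i * w_i
    return sigmoid(z)            # output signal y

# Example upstream signals and their weights
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])

# Fig. 1b: compare the predictive value with the objective value via a loss
# function and adjust the weights with a gradient (optimizer) step.
target = 1.0                                 # objective value (assumed)
y = perceptron(x, w)                         # predictive value
loss = 0.5 * (y - target) ** 2               # squared-error loss (assumed)
grad_w = (y - target) * y * (1.0 - y) * x    # chain rule, i.e., backpropagation
w = w - 0.1 * grad_w                         # gradient-descent weight update
print(y, loss)
```

Repeating the forward pass and update over many examples is what drives the error toward a minimum, which is the training loop the caption summarizes.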