Table 1.
Pseudocode of the BP neural network algorithm.
Algorithm 1: The back propagation neural network
Definition: Input layer neurons xi; the number of input layer neurons n; hidden layer neurons Hj, H′j and H″j; the number of hidden layer neurons k; output layer neuron y (the deposition d).
Initialization:
    Initialize all weights and biases in the network;
    for i = 1 to n do
        Create the hierarchy model and assign values to the neurons:
        x = (fs, fh, p, ns, t, h, ws, v); y = (d)
        xi = X(xi)
    end
    for j = 1 to k do
        for t = 1 to 3 do
            Hj ← Σ Wij xi + bj
            H′j ← Σ W′ij Hj + b′j
            H″j ← Σ W″ij H′j + b″j
            yo ← g(H″j)
        end
    end
    for all j in k do
        E ← 1/2 Σ ej²
        if (E ∉ Error) do
            Wij ← Wij + α Hj ej
            bj ← bj + β ej
            return to i = 1
        else
            y = yo
        end
    end
End |
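The listing above is table-style pseudocode. The short Python sketch below shows one way the same structure could be realized: eight inputs (fs, fh, p, ns, t, h, ws, v), three hidden layers H, H′, H″, a single output for the deposition d, and the update rules Wij ← Wij + αHjej and bj ← bj + βej. The sigmoid activation, layer width k, learning rates, error tolerance, and all identifiers in the code are illustrative assumptions, not values given in the table.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class BPNetwork:
    def __init__(self, n=8, k=10, alpha=0.05, beta=0.05):
        # Three hidden layers (H, H', H'') of width k and a single output neuron y.
        sizes = [n, k, k, k, 1]
        self.W = [rng.normal(0.0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(b) for b in sizes[1:]]
        self.alpha = alpha   # learning rate for the weights (alpha in Algorithm 1)
        self.beta = beta     # learning rate for the biases (beta in Algorithm 1)

    def forward(self, x):
        # H_j = sum_i W_ij x_i + b_j, followed by the activation g, layer by layer.
        acts = [x]
        for W, b in zip(self.W, self.b):
            x = sigmoid(x @ W + b)
            acts.append(x)
        return acts

    def train(self, X, y, tol=1e-3, max_epochs=5000):
        E = np.inf
        for _ in range(max_epochs):
            acts = self.forward(X)
            e = y - acts[-1]                            # output error e_j
            E = 0.5 * np.mean(np.sum(e ** 2, axis=1))   # E = 1/2 * sum_j e_j^2
            if E < tol:                                 # error acceptable: keep y = y_o
                break
            # Otherwise backpropagate and update W_ij <- W_ij + alpha*H_j*e_j, b_j <- b_j + beta*e_j.
            delta = e * acts[-1] * (1.0 - acts[-1])
            for layer in range(len(self.W) - 1, -1, -1):
                prev_delta = (delta @ self.W[layer].T) * acts[layer] * (1.0 - acts[layer])
                self.W[layer] += self.alpha * acts[layer].T @ delta
                self.b[layer] += self.beta * delta.mean(axis=0)
                delta = prev_delta
        return E

# Usage with the eight inputs (fs, fh, p, ns, t, h, ws, v) and one output (d),
# filled here with random placeholder data.
X = rng.random((200, 8))
d = rng.random((200, 1))
net = BPNetwork()
print("training error:", net.train(X, d))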