Algorithm 1: Pseudocode of NN-BLMA.
-
Start of NN-BLMA
Construction: Construct the inputs and the reference data set using the RK-4 method in Mathematica.
Data selection: Input and target data must be supplied in matrix form.
Startup: Choose the number of hidden neurons and partition the reference data set into training, testing and validation subsets:
  60 hidden neurons
  data for training
  data for testing
  data for validation
-
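The construction and startup steps above can be sketched in Python. The actual ODE system, step size and split ratios are not specified in the text, so the model problem (y' = -2y), the 80/10/10 split and all parameter values below are illustrative assumptions only.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta (RK-4) step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Hypothetical ODE used only for illustration: y' = -2y, y(0) = 1.
f = lambda t, y: -2.0 * y
h, n = 0.01, 1000
ts = np.arange(n + 1) * h
ys = np.empty(n + 1)
ys[0] = 1.0
for i in range(n):
    ys[i + 1] = rk4_step(f, ts[i], ys[i], h)

# Inputs and targets arranged as matrices, as the data-selection step requires.
inputs = ts.reshape(-1, 1)
targets = ys.reshape(-1, 1)

# Random partition into training / testing / validation (ratios are assumed).
rng = np.random.default_rng(0)
idx = rng.permutation(n + 1)
n_train, n_test = int(0.8 * len(idx)), int(0.1 * len(idx))
train_idx = idx[:n_train]
test_idx = idx[n_train:n_train + n_test]
val_idx = idx[n_train + n_test:]
```

The reference solution plays the role of the target data; the network is then fit to reproduce it on the training subset while the validation subset guards against overfitting.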
Architecture: Each input is assigned a weight; the weighted inputs are summed, together with the bias, to form the input to the transfer function.
Stopping criteria: Training ends automatically as soon as any one of the following conditions is met:
  Mu reaches its maximum value
  The number of iterations reaches its maximum
  The performance value reaches its minimum (the performance goal)
  The number of validation failures exceeds the maximum allowed (max fail)
  The gradient drops below the minimum gradient
-
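A minimal sketch of the architecture and stopping test above, assuming the standard Levenberg-Marquardt convention that training halts when any single criterion fires; the function names and threshold values (mu_max, max_fail, min_grad, etc.) are illustrative assumptions, not values from the text.

```python
import numpy as np

def neuron_output(x, w, b, transfer=np.tanh):
    """Weighted inputs summed with the bias, then passed through the transfer function."""
    return transfer(np.dot(w, x) + b)

def should_stop(mu, epoch, perf, val_fails, grad, *,
                mu_max=1e10, max_epochs=1000, perf_goal=1e-8,
                max_fail=6, min_grad=1e-7):
    """Training stops as soon as any one criterion is met."""
    return (mu >= mu_max
            or epoch >= max_epochs
            or perf <= perf_goal
            or val_fails >= max_fail
            or grad <= min_grad)

# Example: one neuron with two inputs, illustrative weights and bias.
y = neuron_output(np.array([0.5, -1.0]), np.array([0.2, 0.4]), 0.1)
```

Each hidden neuron computes exactly this weighted sum plus bias; the stopping check is evaluated once per training epoch.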
Generalisation: The network is generalised using the training data. If the outputs are good, proceed to Saving outputs; otherwise, retrain the network.
Retraining: Change the startup parameters and train the network again.
Saving outputs: End the process by saving the results both graphically and numerically.
End of NN-BLMA
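The train-check-retrain control flow of the final steps can be summarized as below; `run_nn_blma`, `train_once`, the performance goal and the restart limit are all hypothetical names and values introduced only for illustration.

```python
def run_nn_blma(train_once, goal, max_restarts=10):
    """Train, check performance, and retrain with new startup parameters if needed."""
    for restart in range(max_restarts):
        perf = train_once(restart)   # restart index stands in for new startup parameters
        if perf <= goal:             # outputs are good: proceed to saving outputs
            return restart, perf
    return None                      # give up after max_restarts attempts

# Illustrative stand-in for a training run whose performance improves per restart.
result = run_nn_blma(lambda r: 1.0 / (r + 1), goal=0.25)
```

In practice `train_once` would run one full BLMA training pass on the 60-neuron network and return its mean-squared-error performance.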