Algorithm 1. The complete LMB-NN procedure is given in the following pseudocode (an illustrative Python sketch of these steps follows the list).
  • Start of LMB-NN

  • Step 1: Construction

  • Construct the input and target data sets.

  • Step 2: Selection of Data

  • The target data and input data are arranged in non-linear form, i.e., matrix form.

  • Step 3: Startup

  • Initialize the number of neurons and the ratios of training, validation, and testing data:

  • ▸ 90 percent for training

  • ▸ 5 percent for validation

  • ▸ 5 percent for testing

  • ▸ Number of hidden neurons: 40

  • ▸ Number of hidden layers: 4

  • Step 4: Training of weights

  • The network is trained on the selected data through the activation function in LMB-NN.

  • Step 5: Stopping criteria

  • Step 4 stops automatically if any of the following conditions is satisfied:

  • ⋇ Mu reaches its maximum value

  • ⋇ The performance value reaches its minimum (the performance goal)

  • ⋇ The maximum number of epochs is reached

  • ⋇ Validation performance fails to improve more than the maximum allowed number of times (maximum fail)

  • ⋇ The performance gradient falls below the minimum gradient

  • The testing data help determine whether the network generalizes well. If the outputs are satisfactory, proceed to Step 7; if not, retrain the network.

  • Step 6: Retraining

  • To retrain the hidden neurons, the ratio of training, validation, and testing data is changed. Then return to Step 4 and repeat the procedure.

  • Step 7: Output saving

  • The process ends by saving the simulation outputs both statistically and numerically.

  • End of LMB-NN
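
The steps above follow the standard Levenberg-Marquardt backpropagation (LMB) training workflow. The sketch below is a minimal, illustrative Python/NumPy reading of Algorithm 1, not the authors' implementation: it assumes a synthetic scalar target, a single tanh hidden layer of 40 neurons rather than the four hidden layers stated in Step 3, and a finite-difference Jacobian for the Levenberg-Marquardt update. The names mu_max, goal, min_grad, and max_fail are illustrative stand-ins for the stopping criteria of Step 5, and the test-error acceptance threshold used in Step 6 is arbitrary.

import numpy as np

rng = np.random.default_rng(0)

# Steps 1-2: construct input and target data in matrix form.
# (Synthetic data here; the paper's data come from its flow model.)
X = rng.uniform(-1.0, 1.0, size=(500, 3))        # 500 samples, 3 input features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]    # scalar target

# Step 3: 90/5/5 split and network size (40 hidden neurons, as in the pseudocode;
# a single hidden layer is used here instead of the four stated in Step 3).
def split(X, y, ratios=(0.90, 0.05, 0.05)):
    idx = rng.permutation(len(X))
    n_tr, n_va = int(ratios[0] * len(X)), int(ratios[1] * len(X))
    tr, va, te = idx[:n_tr], idx[n_tr:n_tr + n_va], idx[n_tr + n_va:]
    return (X[tr], y[tr]), (X[va], y[va]), (X[te], y[te])

H = 40  # number of hidden neurons

def init_weights(n_in, n_hidden):
    # Flat parameter vector: [W1 (H x n_in), b1 (H), W2 (H), b2 (1)].
    return rng.normal(0.0, 0.5, size=n_hidden * n_in + 2 * n_hidden + 1)

def forward(theta, X, n_hidden):
    n_in = X.shape[1]
    W1 = theta[:n_hidden * n_in].reshape(n_hidden, n_in)
    b1 = theta[n_hidden * n_in:n_hidden * (n_in + 1)]
    W2, b2 = theta[-n_hidden - 1:-1], theta[-1]
    return np.tanh(X @ W1.T + b1) @ W2 + b2       # tanh hidden layer, linear output

def residuals(theta, X, y, n_hidden):
    return forward(theta, X, n_hidden) - y

def mse(theta, X, y, n_hidden):
    return float(np.mean(residuals(theta, X, y, n_hidden) ** 2))

def jacobian(theta, X, y, n_hidden, eps=1e-6):
    # Finite-difference Jacobian of the residuals w.r.t. the weights
    # (kept simple for illustration; an analytic Jacobian would be used in practice).
    r0 = residuals(theta, X, y, n_hidden)
    J = np.empty((r0.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (residuals(t, X, y, n_hidden) - r0) / eps
    return J, r0

# Steps 4-5: Levenberg-Marquardt weight updates with the stopping criteria of Step 5.
def train_lmb(Xtr, ytr, Xva, yva, n_hidden, max_epochs=200, mu=1e-3,
              mu_max=1e10, goal=1e-6, min_grad=1e-7, max_fail=6):
    theta = init_weights(Xtr.shape[1], n_hidden)
    best_va, fails = np.inf, 0
    for _ in range(max_epochs):                       # maximum-epochs criterion
        J, r = jacobian(theta, Xtr, ytr, n_hidden)
        if np.linalg.norm(2.0 * J.T @ r) < min_grad:  # minimum-gradient criterion
            break
        # LM step: solve (J^T J + mu I) d = -J^T r, raising mu until the step helps.
        while mu < mu_max:
            d = np.linalg.solve(J.T @ J + mu * np.eye(theta.size), -J.T @ r)
            if mse(theta + d, Xtr, ytr, n_hidden) < mse(theta, Xtr, ytr, n_hidden):
                theta, mu = theta + d, mu * 0.1
                break
            mu *= 10.0
        if mu >= mu_max:                              # mu reached its maximum value
            break
        if mse(theta, Xtr, ytr, n_hidden) < goal:     # performance goal reached
            break
        va = mse(theta, Xva, yva, n_hidden)           # validation-failure criterion
        fails = 0 if va < best_va else fails + 1
        best_va = min(best_va, va)
        if fails >= max_fail:
            break
    return theta

# Steps 6-7: retrain with a changed split if the test error is unsatisfactory, then save.
tr, va, te = split(X, y)
theta = train_lmb(*tr, *va, H)
if mse(theta, *te, H) > 1e-2:                         # illustrative acceptance threshold
    tr, va, te = split(X, y, ratios=(0.80, 0.10, 0.10))
    theta = train_lmb(*tr, *va, H)
np.savez("lmb_nn_results.npz", weights=theta,
         mse_train=mse(theta, *tr, H), mse_test=mse(theta, *te, H))

The retraining branch here simply changes the data ratios, as Step 6 describes; in practice one could also vary the number of hidden neurons before returning to the training loop.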