Diagnostics. 2022 Dec 27;13(1):76. doi: 10.3390/diagnostics13010076

Table 1.

Hyperparameters related to the training of a neural network [47].

Learning Rate: The learning rate defines how quickly a network updates its parameters. For a classification problem, it is important to choose an optimal learning rate to minimize the loss function. A low learning rate slows the learning process but converges smoothly; a large learning rate speeds up learning but may fail to converge.

Momentum: Momentum uses knowledge of the previous steps to inform the direction of the next step. It helps to prevent oscillations.

Number of Epochs: The number of epochs is the number of times the whole training set is presented to the network. It is important to determine an ideal epoch number to prevent overfitting.

MiniBatch Size: A larger minibatch size keeps the weights constant for longer stretches of training, which can cause overall performance losses and increases the memory requirements. Carrying out experiments with small minibatch sizes can be more beneficial.
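To make the four hyperparameters concrete, the following is a minimal pure-Python sketch (not from the paper; the data, function name, and values are hypothetical for illustration) of a minibatch SGD loop with momentum fitting a one-parameter linear model:

```python
import random

def train(data, lr=0.05, momentum=0.9, epochs=200, batch_size=4):
    """Fit y = w * x by minibatch SGD with momentum on squared error."""
    w, velocity = 0.0, 0.0
    for _ in range(epochs):                 # one epoch = one full pass over the data
        random.shuffle(data)
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]  # smaller batches -> more frequent updates
            # gradient of the mean squared error with respect to w over the minibatch
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            # momentum carries over a fraction of the previous step, damping oscillations
            velocity = momentum * velocity - lr * grad
            w += velocity
    return w

random.seed(0)
# noiseless synthetic data with true slope 2 (y = 2x)
data = [(x, 2.0 * x) for x in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0)]
w = train(data)
```

With these (illustrative) settings the learned slope approaches the true value of 2; raising `lr` past the stable range makes the loop diverge, while lowering it slows convergence, matching the trade-offs described in the table.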