| Algorithm 2. RanAdam Hyper-parameter Tuning |
| --- |
| Step 1. Start Algorithm |
| Step 2. Initialise the iteration counter, t = 0 |
| Step 3. Initialise and assign values to the hyper-parameters β1, β2, and the remaining constants of the algorithm |
| Step 4. Initialise the parameters (weights) of the chosen classifier |
| Step 5. Define the loss function to be minimised |
| Step 6. For each iteration t: |
| Step 7. Compute the gradient of the loss function with respect to the hyper-parameters |
| Step 8. Update the exponential moving averages of the gradient and of its square using Equations (30) and (31) |
| Step 9. Compute the bias-corrected estimates of these averages using Equations (28) and (29) |
| Step 10. Update the parameters (weights) of the chosen classifier |
| Step 11. Calculate the error rate (ER) for the current iteration |
| Step 12. If the first branch condition holds, compute the gradient of the loss function with respect to the first hyper-parameter |
| Step 13. Else, compute the gradient of the loss function with respect to the second hyper-parameter |
| Step 14. Initialise the random numbers Rand1, Rand2, Rand3, Rand4 and specify the bandwidth |
| Step 15. If Rand1 < solution considering rate |
| Step 16. Retain the candidate solution drawn from the existing solution memory |
| Step 17. End if |
| Step 18. If Rand2 < solution adjusting rate |
| Step 19. Adjust the candidate solution by ± bandwidth × Rand3 |
| Step 20. End if |
| Step 21. If the candidate value < lower bound (LB) |
| Step 22. Set the candidate value = LB |
| Step 23. End if |
| Step 24. If the candidate value > upper bound (UB) |
| Step 25. Set the candidate value = UB |
| Step 26. End if |
| Step 27. If the candidate value < UB |
| Step 28. Set the candidate value = LB + Rand4 × bandwidth |
| Step 29. End if |
| Step 30. If ER = minimum ER |
| Step 31. Set the optimum weight equal to the current weight |
| Step 32. Else |
| Step 33. Go to Step 14 |
| Step 34. If t = ConvCrit |
| Step 35. Go to Step 38 |
| Step 36. Else |
| Step 37. Go to Step 7 |
| Step 38. End Algorithm |
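
For readers who prefer code, the sketch below traces the RanAdam loop in Python under stated assumptions: Equations (28)–(31) are taken to be the standard Adam moving-average and bias-correction updates, the loss is a hypothetical one-dimensional toy function, and the constants (learning rate, considering/adjusting rates, bandwidth, bounds) are illustrative defaults. The condition-dependent branch in Steps 12–13 and the re-sampling guard in Steps 27–29 are not fully specified in the listing and are omitted here; this is a minimal sketch, not the authors' implementation.

```python
import random

def loss(x):
    # Hypothetical 1-D loss over the tuned value x (assumption for illustration).
    return (x - 3.0) ** 2

def grad(x):
    # Analytic gradient of the toy loss with respect to x.
    return 2.0 * (x - 3.0)

def ranadam(lb=-10.0, ub=10.0, conv_crit=200,
            alpha=0.1, beta1=0.9, beta2=0.999, eps=1e-8,
            scr=0.9, sar=0.3, bandwidth=0.5):
    x = random.uniform(lb, ub)        # Step 4: initialise the tuned value
    m = v = 0.0                       # moving averages of gradient / square
    best_x, min_er = x, loss(x)       # best-so-far bookkeeping (Steps 30-31)

    for t in range(1, conv_crit + 1):              # Steps 6 and 34
        g = grad(x)                                # Step 7
        m = beta1 * m + (1.0 - beta1) * g          # Step 8, Eq. (30)
        v = beta2 * v + (1.0 - beta2) * g * g      # Step 8, Eq. (31)
        m_hat = m / (1.0 - beta1 ** t)             # Step 9, Eq. (28)
        v_hat = v / (1.0 - beta2 ** t)             # Step 9, Eq. (29)
        x -= alpha * m_hat / (v_hat ** 0.5 + eps)  # Step 10

        # Steps 14-20: random perturbation of the candidate value.
        r1, r2, r3 = random.random(), random.random(), random.random()
        if r1 < scr and r2 < sar:                  # Steps 15 and 18
            x += random.choice((-1.0, 1.0)) * bandwidth * r3   # Step 19

        # Steps 21-26: clamp the candidate back into [LB, UB].
        x = max(lb, min(ub, x))

        # Steps 11 and 30-33: keep the best value seen so far.
        er = loss(x)
        if er < min_er:
            min_er, best_x = er, x

    return best_x, min_er                          # Step 38

if __name__ == "__main__":
    best, er = ranadam()
    print(f"best tuned value: {best:.4f}, error: {er:.6f}")
```

The clamping for Steps 21–26 is written as a single min/max for brevity; the separate LB and UB checks in the listing are equivalent.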