Algorithm 1 MHA-Based Optimization for Each Training Fold Using RF Hyperparameters

1:  Initialize Metaheuristic Optimization
2:  Step 1: Define Search Space (S)
3:    Hyperparameter ranges for Random Forest:
4:      Number of estimators (n_estimators): 100 to 300
5:      Maximum tree depth (max_depth): 3 to 10
6:      Minimum samples split (min_samples_split): 2 to 20
7:      Minimum samples leaf (min_samples_leaf): 1 to 10
8:      Number of selected features: 1 to 38
9:    Set population size = 50 and number of iterations = 100
10: Step 2: Metaheuristic Search Process
11: for t = 1 to 100 do
12:   Select feature subset F_t
13:   Generate parameter set θ_t
14:   Train Random Forest using F_t and θ_t
15:   Compute fitness using F1-score: f_t = F1(RF(F_t, θ_t))
16: end for
17: Step 3: Identify Global Optimum
18: Find best fitness: f* = max_t f_t
19: Extract optimal feature subset F* and parameters θ*
20: End of Optimization
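The loop above can be sketched in Python with scikit-learn. Since the specific metaheuristic (MHA) is not detailed here, random sampling of the search space stands in for the population-based update; the synthetic dataset, seed, and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in data with 38 features, matching the search space in Step 1.
X, y = make_classification(n_samples=400, n_features=38, random_state=42)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=42)

def sample_candidate():
    """Draw one candidate (theta_t, F_t) from the Step 1 search space."""
    params = {
        "n_estimators": int(rng.integers(100, 301)),
        "max_depth": int(rng.integers(3, 11)),
        "min_samples_split": int(rng.integers(2, 21)),
        "min_samples_leaf": int(rng.integers(1, 11)),
    }
    k = int(rng.integers(1, 39))                      # number of selected features
    features = rng.choice(38, size=k, replace=False)  # feature subset F_t
    return params, features

def fitness(params, features):
    """Step 2 fitness f_t: F1-score of an RF trained on the candidate."""
    rf = RandomForestClassifier(random_state=0, **params)
    rf.fit(X_tr[:, features], y_tr)
    return f1_score(y_va, rf.predict(X_va[:, features]))

# Search loop; a real MHA would update a population of 50 candidates here.
best_fit, best_params, best_feats = -1.0, None, None
for t in range(20):  # 100 iterations in Algorithm 1; shortened for the sketch
    params, feats = sample_candidate()
    f_t = fitness(params, feats)
    if f_t > best_fit:  # Step 3: track the global optimum f*
        best_fit, best_params, best_feats = f_t, params, feats

print(best_fit)
```

Within each training fold, `best_params` and `best_feats` correspond to the optimal parameter set and feature subset extracted in Step 3.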