2025 Sep 3;25(17):5489. doi: 10.3390/s25175489
Algorithm 1 MHA-Based Optimization for Each Training Fold Using RF Hyperparameters

 1: Initialize Metaheuristic Optimization
 2: Step 1: Define Search Space (D)
 3:     Hyperparameter ranges for Random Forest:
 4:         ϕ1: Number of estimators (n_estimators): 100 to 300
 5:         ϕ2: Maximum tree depth (max_depth): 3 to 10
 6:         ϕ3: Minimum samples split (min_samples_split): 2 to 20
 7:         ϕ4: Minimum samples leaf (min_samples_leaf): 1 to 10
 8:         ϕ5: Number of selected features: 1 to 38
 9:     Set population size = 50 and number of iterations = 100
10: Step 2: Metaheuristic Search Process
11: for g = 1 to 100 do
12:     Select feature subset Xg
13:     Generate parameter set ϕg = {ϕ1g, ϕ2g, ϕ3g, ϕ4g}
14:     Train Random Forest using Xg and ϕg
15:     Compute fitness using F1-score: ηg
16: end for
17: Step 3: Identify Global Optimum
18:     Find best fitness: η* = max{η1, η2, …, η50}
19:     Extract optimal feature subset X* and parameters ϕ*
20: End of Optimization
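The loop above can be sketched in Python. This is a minimal, hedged illustration, not the paper's implementation: the specific metaheuristic is not named here, so plain random sampling of the search space stands in for the MHA update rule; scikit-learn's `RandomForestClassifier`, the synthetic data, and the reduced candidate budget are all assumptions for the sake of a runnable example. The search-space ranges are taken directly from Step 1.

```python
# Sketch of Algorithm 1 (assumptions: random sampling stands in for the
# unspecified metaheuristic; scikit-learn RF; synthetic data; small budget).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Step 1: define the search space D (ranges from Algorithm 1).
N_FEATURES = 38
SPACE = {
    "n_estimators": (100, 300),              # phi1
    "max_depth": (3, 10),                    # phi2
    "min_samples_split": (2, 20),            # phi3
    "min_samples_leaf": (1, 10),             # phi4
    "n_selected_features": (1, N_FEATURES),  # phi5
}

def sample_candidate():
    """Draw one parameter set phi_g and one feature subset X_g."""
    phi = {k: int(rng.integers(lo, hi + 1)) for k, (lo, hi) in SPACE.items()}
    k = phi.pop("n_selected_features")
    features = rng.choice(N_FEATURES, size=k, replace=False)
    return phi, features

def fitness(phi, features, X_tr, y_tr, X_va, y_va):
    """eta_g: F1-score of an RF trained on (X_g, phi_g)."""
    rf = RandomForestClassifier(random_state=0, **phi)
    rf.fit(X_tr[:, features], y_tr)
    return f1_score(y_va, rf.predict(X_va[:, features]))

# Synthetic stand-in for one training fold (not the paper's data).
X, y = make_classification(n_samples=300, n_features=N_FEATURES, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# Steps 2-3: evaluate candidates and track the global optimum eta*.
# Budget reduced to 10 candidates (the paper uses population 50 x 100 iterations).
best_eta, best_phi, best_features = -1.0, None, None
for g in range(10):
    phi_g, feats_g = sample_candidate()
    eta_g = fitness(phi_g, feats_g, X_tr, y_tr, X_va, y_va)
    if eta_g > best_eta:
        best_eta, best_phi, best_features = eta_g, phi_g, feats_g

print(f"eta* = {best_eta:.3f}, phi* = {best_phi}")
```

A real MHA (e.g. PSO or a genetic algorithm) would replace `sample_candidate` with population-based update rules, but the fitness evaluation and the tracking of (X*, ϕ*) are the same.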