Algorithm 3 Fitness Function
Input: N: The number of samples in the training set.
K: The number of feature subsets in an individual, which is also the number of weak classifiers to train.
I: An individual.
T: Training Set.
Output: V: Fitness value of I.
1:    For $k = 1, \ldots, K$ do
2:    Divide the features into binary features $F_B$ and continuous features $F_C$.
3:    Rotation: apply PCA to $F_C$ and MCA to $F_B$, then merge the two rotated parts.
4:    Initialize the weights $w^1 = (w_1^1, \ldots, w_N^1)$ with $w_j^1 = 1/N$, so that $\sum_{j=1}^{N} w_j^1 = 1$.
5:    For $k = 1, \ldots, K$ do
6:    Take a sample $S_k$ from $T$ using distribution $w^k$.
7:    Train a classifier $D_k$ using $S_k$ as the training set.
8:    Calculate the weighted ensemble error at step $k$ by $\epsilon_k = \sum_{j=1}^{N} w_j^k l_k^j$ ($l_k^j = 1$ if $D_k$ misclassifies $t_j$ and $l_k^j = 0$ otherwise).
9:    If $\epsilon_k = 0$ or $\epsilon_k \ge 0.5$, ignore $D_k$, reinitialize the weights $w_j^k$ to $1/N$, and continue.
10:     Else, calculate $\beta_k = \epsilon_k / (1 - \epsilon_k)$, where $\epsilon_k \in (0, 0.5)$.
11:     Update the $k$th part of the weights in $I$ by $w_j^{k+1} = \frac{w_j^k \beta_k^{1 - l_k^j}}{\sum_{i=1}^{N} w_i^k \beta_k^{1 - l_k^i}}$, $j = 1, \ldots, N$.
12:    Calculate the support for each class $\omega_t$ on the validation set by $\mu_t(x) = \sum_{k:\, D_k(x) = \omega_t} \ln(1/\beta_k)$.
13:     The class with the maximum support is chosen as the label for $x$.
14:     V is calculated as the F1-score, $F_1 = \frac{2 \times \mathrm{precision} \times \mathrm{recall}}{\mathrm{precision} + \mathrm{recall}}$, on the validation data.
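Steps 1–3 rotate each feature subset by applying PCA to the continuous features and MCA to the binary ones. The sketch below illustrates this split-and-rotate idea; the helper name `rotate_subset`, the use of scikit-learn's PCA, and the use of PCA on the 0/1 indicator block as a simple stand-in for MCA are assumptions, since the listing does not specify an implementation or library.

```python
# A minimal sketch of the rotation step (Algorithm 3, steps 1-3).
# Assumptions: features are columns of a NumPy array, binary columns are
# identified by a boolean mask, and PCA on the 0/1 indicator block is a
# crude stand-in for MCA (the listing does not name a library).
import numpy as np
from sklearn.decomposition import PCA


def rotate_subset(X, binary_mask, n_components=None):
    """Rotate one feature subset: PCA on continuous columns, an
    MCA-like projection on binary columns, then merge the two parts."""
    X_cont = X[:, ~binary_mask]          # continuous features F_C
    X_bin = X[:, binary_mask]            # binary features F_B

    parts = []
    if X_cont.shape[1] > 0:
        # PCA on the continuous block (all components kept by default).
        pca = PCA(n_components=n_components)
        parts.append(pca.fit_transform(X_cont))
    if X_bin.shape[1] > 0:
        # Stand-in for MCA: PCA applied to the 0/1 indicator block.
        mca_like = PCA(n_components=n_components)
        parts.append(mca_like.fit_transform(X_bin))

    # Merge the two rotated parts column-wise.
    return np.hstack(parts)


# Tiny usage example: 6 samples, 3 continuous and 2 binary features.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(6, 3)), rng.integers(0, 2, size=(6, 2))])
mask = np.array([False, False, False, True, True])
print(rotate_subset(X, mask).shape)
```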
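Steps 4–11 follow the AdaBoost.M1 training scheme: resample the training set according to the current weights, train a weak classifier, compute its weighted error, and either discard it (error equal to 0 or at least 0.5) or compute $\beta_k$ and reweight the samples. Below is a minimal sketch under the assumption that the data are NumPy arrays and that the weak learner is a scikit-learn decision stump; the listing itself does not fix the weak classifier.

```python
# A minimal sketch of steps 4-11 (AdaBoost.M1-style training).
# Assumptions: X_train/y_train are NumPy arrays; the weak learner is a
# scikit-learn decision stump, which the listing does not specify.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def train_boosted_ensemble(X_train, y_train, K, random_state=0):
    rng = np.random.default_rng(random_state)
    N = len(y_train)
    w = np.full(N, 1.0 / N)              # step 4: uniform initial weights
    classifiers, betas = [], []

    for _ in range(K):                   # step 5
        # Step 6: draw a sample S_k from T using distribution w^k.
        idx = rng.choice(N, size=N, replace=True, p=w)
        clf = DecisionTreeClassifier(max_depth=1, random_state=random_state)
        clf.fit(X_train[idx], y_train[idx])           # step 7

        # Step 8: weighted error of D_k over the full training set.
        miss = (clf.predict(X_train) != y_train).astype(float)
        eps = np.dot(w, miss)

        if eps == 0.0 or eps >= 0.5:
            # Step 9: ignore D_k and restart from uniform weights.
            w = np.full(N, 1.0 / N)
            continue

        beta = eps / (1.0 - eps)                      # step 10
        # Step 11: shrink the weights of correctly classified samples
        # (l = 0), then renormalize so the weights sum to one.
        w = w * beta ** (1.0 - miss)
        w = w / w.sum()

        classifiers.append(clf)
        betas.append(beta)

    return classifiers, betas
```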
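Steps 12–14 classify each validation point by weighted voting, where every retained classifier $D_k$ votes for its predicted class with weight $\ln(1/\beta_k)$, and the F1-score of the resulting predictions is the fitness value V. The sketch below reuses the classifiers and betas returned by the previous sketch; the macro-averaged F1 for multi-class problems is an assumption not stated in the listing.

```python
# A minimal sketch of steps 12-14: weighted-vote prediction and F1 fitness.
# Assumes `classifiers` and `betas` come from the training sketch above;
# macro-averaged F1 for multi-class labels is an assumption.
import numpy as np
from sklearn.metrics import f1_score


def fitness(classifiers, betas, X_val, y_val, classes):
    # Step 12: support mu_t(x) = sum of ln(1/beta_k) over classifiers
    # whose prediction for x equals class omega_t.
    support = np.zeros((len(X_val), len(classes)))
    for clf, beta in zip(classifiers, betas):
        pred = clf.predict(X_val)
        vote = np.log(1.0 / beta)
        for t, cls in enumerate(classes):
            support[pred == cls, t] += vote

    # Step 13: label each x with the class of maximum support.
    y_pred = np.asarray(classes)[support.argmax(axis=1)]

    # Step 14: the fitness value V is the F1-score on the validation set.
    return f1_score(y_val, y_pred, average="macro")
```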