2020 Apr 30;20(9):2559. doi: 10.3390/s20092559
Algorithm 3: AdaBoost classifier algorithm
Input: KDD99 and NSL-KDD datasets
Training:
  1. Select the base classifier C;

  2. Initialize the weights w_{1,i} ∈ [0, 1] with Σ_{i=1}^{N} w_{1,i} = 1; commonly w_{1,i} = 1/N;

  3. For n = 1 → k, draw a training sample D_n from D according to the distribution W_n;

  4. Train the base classifier C on the data subset D_n to obtain the classifier C_n.

  5. Compute the ensemble error e_n = Σ_i w_{n,i}, summed over the data points x_i in D that C_n misclassifies.

  6. If e_n ∈ (0, 0.5), calculate β_n = e_n / (1 − e_n) and update the weights for the next round:

  7. w_{n+1,i} = w_{n,i} × β_n if C_n classifies x_i correctly, and w_{n+1,i} = w_{n,i} otherwise;

  8. Normalize the distribution W_{n+1} so that Σ_{i=1}^{N} w_{n+1,i} = 1.

  9. Otherwise (e_n outside (0, 0.5)), reset the weights to the initial value w_{n,i} = 1/N and continue the process;

  10. Return the trained classifiers C_1, C_2, C_3, …, C_n and the corresponding β_1, β_2, β_3, …, β_n.

Testing:
  1. Each data object x in the testing dataset is passed to each of the classifiers C_1, C_2, C_3, …, C_n.

  2. For each label y that some C_n assigns to x, calculate μ_y(x) = Σ_{n: C_n(x) = y} ln(1/β_n). The class with the maximum value μ_y(x) is chosen as the class label of x.

  3. Repeat step 2 for all testing data and return the output.
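
The training and testing procedure above can be sketched in Python. This is a minimal sketch, not the paper's implementation: the base classifier C is a hypothetical one-feature threshold stump, labels are assumed binary {0, 1}, a small hand-made dataset stands in for KDD99/NSL-KDD, and the stump is fit directly on D using the weights W_n rather than by resampling D_n in step 3. The β computation, weight update, normalization, and ln(1/β) voting follow steps 6–8 and the testing rule.

```python
import numpy as np

def train_stump(X, y, w):
    # Hypothetical base classifier C: exhaustive search for the
    # one-feature threshold rule with minimum weighted error (step 4).
    best = None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for sign in (1, -1):
                pred = np.where(sign * (X[:, f] - t) >= 0, 1, 0)
                err = w[pred != y].sum()   # step 5: weighted error e_n
                if best is None or err < best[0]:
                    best = (err, (f, t, sign))
    return best[1], best[0]

def stump_predict(params, X):
    f, t, sign = params
    return np.where(sign * (X[:, f] - t) >= 0, 1, 0)

def adaboost_train(X, y, k):
    N = len(y)
    w = np.full(N, 1.0 / N)                  # step 2: w_{1,i} = 1/N
    classifiers, betas = [], []
    for _ in range(k):                        # step 3 (weighted fit, no resampling)
        params, e = train_stump(X, y, w)
        if not (0 < e < 0.5):                 # step 9: reset weights, continue
            w = np.full(N, 1.0 / N)
            continue
        beta = e / (1 - e)                    # step 6: beta_n = e_n / (1 - e_n)
        pred = stump_predict(params, X)
        w = np.where(pred == y, w * beta, w)  # step 7: shrink correct points' weights
        w /= w.sum()                          # step 8: normalize the distribution
        classifiers.append(params)
        betas.append(beta)
    return classifiers, betas                 # step 10

def adaboost_classify(classifiers, betas, X):
    # Testing step 2: each C_n votes ln(1/beta_n) for its predicted label.
    votes = np.zeros((len(X), 2))             # binary labels {0, 1} assumed
    for params, beta in zip(classifiers, betas):
        pred = stump_predict(params, X)
        votes[np.arange(len(X)), pred] += np.log(1.0 / beta)
    return votes.argmax(axis=1)               # label with maximum mu_y(x)

# Toy stand-in data (AND pattern), not the KDD99/NSL-KDD features.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
clfs, betas = adaboost_train(X, y, k=5)
preds = adaboost_classify(clfs, betas, X)
```

Because each round's error e_n is below 0.5, every β_n lies in (0, 1), so ln(1/β_n) is a positive vote weight and more accurate rounds vote more strongly.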