Algorithm 2 The algorithm for the boosting classifier
1: Form a large set of candidate features
2: Initialize the weights of the training samples
3: for T rounds do
4:     Normalize the sample weights
5:     For each available feature in the set, train a classifier on that single feature and evaluate its weighted training error
6:     Choose the classifier with the lowest error
7:     Update the training-sample weights: increase the weights of samples this classifier misclassifies, decrease the weights of those it classifies correctly
8: end for
9: Form the final strong classifier as a weighted linear combination of the T selected classifiers
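As a concrete illustration of the loop above, the sketch below realizes it as an AdaBoost-style procedure in which each weak classifier is a threshold ("stump") on a single feature. The feature matrix `X`, labels in {-1, +1}, the threshold search, the exponential weight update, and the alpha weighting of the final vote are assumptions made for this example rather than details stated in Algorithm 2.

```python
import numpy as np

def train_stump(X, y, w, feature):
    """Train a threshold classifier on a single feature, minimizing weighted error.

    Returns (threshold, polarity, error). Polarity +1 predicts +1 when the
    feature value exceeds the threshold; polarity -1 flips that decision.
    """
    values = X[:, feature]
    best = (0.0, 1, np.inf)
    for thr in np.unique(values):           # candidate thresholds
        for polarity in (1, -1):
            pred = np.where(values > thr, polarity, -polarity)
            err = np.sum(w[pred != y])       # weighted training error
            if err < best[2]:
                best = (thr, polarity, err)
    return best

def boost(X, y, T):
    """Boosting over single-feature stumps, following steps 2-8 of Algorithm 2.

    X: (n_samples, n_features) array; y: labels in {-1, +1}; T: number of rounds.
    Returns a list of (feature, threshold, polarity, alpha) weak classifiers.
    """
    n_samples, n_features = X.shape
    # Step 2: initialize the sample weights uniformly.
    w = np.full(n_samples, 1.0 / n_samples)
    ensemble = []
    for _ in range(T):                       # Step 3: for T rounds
        w = w / w.sum()                      # Step 4: normalize the weights
        # Step 5: train one single-feature classifier per feature.
        candidates = [(f, *train_stump(X, y, w, f)) for f in range(n_features)]
        # Step 6: choose the classifier with the lowest weighted error.
        feature, thr, polarity, err = min(candidates, key=lambda c: c[3])
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = np.where(X[:, feature] > thr, polarity, -polarity)
        # Step 7: increase weights of misclassified samples, decrease the rest.
        w = w * np.exp(-alpha * y * pred)
        ensemble.append((feature, thr, polarity, alpha))
    return ensemble

def strong_classify(ensemble, X):
    """Step 9: final strong classifier as a weighted vote of the T stumps."""
    score = np.zeros(X.shape[0])
    for feature, thr, polarity, alpha in ensemble:
        score += alpha * np.where(X[:, feature] > thr, polarity, -polarity)
    return np.sign(score)
```

On a toy dataset, calling `ensemble = boost(X, y, T=10)` and then `strong_classify(ensemble, X)` reproduces the loop above; the exponential reweighting and the alpha-weighted vote are the standard AdaBoost choices and are only one way to realize steps 7 and 9.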