2019 Feb 14;19(4):780. doi: 10.3390/s19040780
Algorithm 1 Group Cost-Sensitive AdaBoost Algorithm
Input: Training set {(x_i, y_i)}_{i=1}^{n}, where x_i is the feature vector of the i-th sample and y_i ∈ {−1, +1} is its class label; costs {C_fn^1, C_fn^2, …, C_fn^N, C_fp} for the different groups; the set of weak learners {g_k(x)}_{k=1}^{K}; and the number M of weak learners in the final classifier.
Output: Strong classifier h(x) for multi-resolution detectors.
 1: Initialization: Set of uniformly distributed weights for each group:
 2: ω_i^(0) = 1/(2|G_+^1|) for i ∈ G_+^1;  ω_i^(0) = 1/(2|G_+^2|) for i ∈ G_+^2;  …;  ω_i^(0) = 1/(2|G_+^N|) for i ∈ G_+^N;  ω_i^(0) = 1/(2|G_−|) for i ∈ G_−.
 3: for m = 1, …, M do
 4:     for k = 1, …, K do
 5:         Compute the parameter values as in Equations (16) and (17) with g(x) = g_k(x);
 6:         Obtain the value of α_k by solving Equation (15);
 7:         Calculate the loss of the weak learner (α_k, g_k(x)) as in Equation (18).
 8:     end for
 9:     Select the best weak learner (α_m, g_m(x)) with the minimum loss as in Equation (19);
10:     Update the weights ω_i according to Equation (20).
11: end for
12: return h(x) = sgn(Σ_{m=1}^{M} α_m g_m(x)).
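The structure of Algorithm 1 can be sketched in Python. Equations (15)–(20) are referenced but not reproduced in this excerpt, so the α computation, the loss, and the weight update below use standard cost-sensitive AdaBoost forms (cost-weighted error, exponential loss, exponential reweighting) purely as stand-ins; the function names and the decision-stump weak learner are hypothetical, not from the paper.

```python
import numpy as np

def stump(feat, thr):
    """Hypothetical weak learner: a decision stump on one feature."""
    return lambda X: np.where(X[:, feat] > thr, 1, -1)

def gcs_adaboost(X, y, group, C_fn, C_fp, weak_learners, M):
    """Sketch of Algorithm 1. `group[i]` is the positive-group index
    (0..N-1) of sample i; it is ignored for negatives. The err/alpha/loss/
    update formulas are stand-ins for Equations (15)-(20), which are not
    shown in this excerpt."""
    C_fn = np.asarray(C_fn, dtype=float)
    n, N = len(y), len(C_fn)
    pos = (y == 1)

    # Steps 1-2: uniform weights within each positive group G_+^j and within G_-.
    w = np.zeros(n)
    for j in range(N):
        idx = pos & (group == j)
        if idx.any():
            w[idx] = 1.0 / (2.0 * idx.sum())
    w[~pos] = 1.0 / (2.0 * (~pos).sum())
    w /= w.sum()                                  # normalize to a distribution

    # Per-sample misclassification cost: C_fn^j for positives, C_fp for negatives.
    c = np.empty(n)
    c[pos] = C_fn[group[pos]]
    c[~pos] = C_fp

    alphas, chosen = [], []
    for _ in range(M):                            # steps 3-11
        best = None
        for g in weak_learners:                   # steps 4-8: score each candidate
            pred = g(X)
            err = np.clip(np.sum(w * c * (pred != y)) / np.sum(w * c),
                          1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)           # stand-in for Eq. (15)
            loss = np.sum(w * np.exp(-alpha * c * y * pred))  # stand-in for Eq. (18)
            if best is None or loss < best[0]:
                best = (loss, alpha, g, pred)
        _, alpha, g, pred = best                  # step 9: minimum-loss learner
        alphas.append(alpha)
        chosen.append(g)
        w *= np.exp(-alpha * c * y * pred)        # stand-in for Eq. (20)
        w /= w.sum()

    # Step 12: sign of the weighted vote.
    return lambda Xq: np.sign(sum(a * g(Xq) for a, g in zip(alphas, chosen)))

# Toy usage: one positive group (N = 1), a doubled false-negative cost, and
# three candidate stumps on a 1-D separable set.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
group = np.array([0, 0, 0, 0])
h = gcs_adaboost(X, y, group, C_fn=[2.0], C_fp=1.0,
                 weak_learners=[stump(0, t) for t in (0.5, 1.5, 2.5)], M=3)
```

A higher C_fn^j for a group makes misses on that group dominate the weighted error, so the selected stumps shift toward recovering those positives first.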