Sensors. 2013 Jun 14;13(6):7714–7734. doi: 10.3390/s130607714
Algorithm 1. Multi-class AdaBoost learning algorithm. M hypotheses are constructed, each using a single feature vector. The final hypothesis is a weighted linear combination of the M hypotheses.

  1. Initialize the observation weights $w_{1,i} = 1/n$, $i = 1, 2, \ldots, n$.

  2. For m = 1 to M:

    1. Normalize the weights, $w_{m,i} \leftarrow w_{m,i} \big/ \sum_{j=1}^{n} w_{m,j}$.

    2. Select the best weak classifier with respect to the weighted error
      $err^{(m)} = \min_{f} \sum_{i=1}^{n} w_i \, I\big(c_i \neq T(x_i, x^{(p)}, f)\big) \big/ \sum_{i=1}^{n} w_i$.
    3. Define $T^{(m)}(x) = T(x, x^{(p)}, f_m)$, where $f_m$ is the minimizer of $err^{(m)}$.

    4. Compute
      $\alpha^{(m)} = \log\dfrac{1 - err^{(m)}}{err^{(m)}} + \log(K - 1)$. (13)
    5. Update the weights:
      $w_{m,i} \leftarrow w_{m,i} \exp\big(\alpha^{(m)} I(c_i \neq T^{(m)}(x_i))\big)$, $i = 1, \ldots, n$.
  3. The final strong classifier is:
    $C(x) = \arg\max_{k} \sum_{m=1}^{M} \alpha^{(m)} I\big(T^{(m)}(x) = k\big)$. (14)
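The steps of Algorithm 1 can be sketched in code. The snippet below is a minimal NumPy illustration, not the paper's implementation: it assumes the weak classifiers $T(x, x^{(p)}, f)$ are single-feature threshold stumps (each side of the threshold predicting its weighted-majority class), and the function names `train_samme` and `predict_samme` are invented for the example. The weight update and the $\log(K-1)$ term follow Equations (13) and (14).

```python
import numpy as np

def train_samme(X, y, K, M=10, n_thresholds=10):
    """Multi-class AdaBoost (SAMME-style) with single-feature stumps.

    X : (n, d) feature matrix, y : (n,) integer labels in {0..K-1}.
    Returns a list of weak learners (feature, threshold, class_map, alpha).
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # step 1: uniform observation weights
    learners = []
    for m in range(M):
        w /= w.sum()                             # step 2.1: normalize the weights
        best, best_err, best_preds = None, np.inf, None
        for f in range(d):                       # step 2.2: search weak classifiers
            for t in np.quantile(X[:, f], np.linspace(0.1, 0.9, n_thresholds)):
                side = X[:, f] > t
                preds = np.empty(n, dtype=int)
                cmap = [0, 0]                    # predicted class on each side
                for s, mask in enumerate((~side, side)):
                    if mask.any():               # weighted-majority class per side
                        counts = np.bincount(y[mask], weights=w[mask], minlength=K)
                        cmap[s] = int(counts.argmax())
                    preds[mask] = cmap[s]
                err = w[preds != y].sum()        # weighted misclassification error
                if err < best_err:
                    best_err, best, best_preds = err, (f, t, tuple(cmap)), preds
        err = max(best_err, 1e-10)               # guard against log(0)
        alpha = np.log((1 - err) / err) + np.log(K - 1)   # Equation (13)
        w *= np.exp(alpha * (best_preds != y))   # step 2.5: re-weight errors
        learners.append(best + (alpha,))
    return learners

def predict_samme(learners, X, K):
    """Final strong classifier: C(x) = argmax_k sum_m alpha_m I(T_m(x) = k)."""
    votes = np.zeros((X.shape[0], K))
    for f, t, cmap, alpha in learners:
        preds = np.where(X[:, f] > t, cmap[1], cmap[0])
        votes[np.arange(X.shape[0]), preds] += alpha     # Equation (14)
    return votes.argmax(axis=1)
```

Because each stump uses only one feature, no single round can separate all K classes; the $\log(K-1)$ term keeps $\alpha^{(m)} > 0$ as long as the weak classifier beats random guessing ($err^{(m)} < 1 - 1/K$), which is what makes the multi-class extension work.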