Front Public Health. 2020 May 19;8:178. doi: 10.3389/fpubh.2020.00178

Algorithm 1.

AdaBoost.M1

1: Initialize the boosting weights D_{1,n} = 1/N for each x_n ∈ S.
2: for t = 1, …, T do
3:     Train the t-th weak classifier f_t so as to minimize J_t.
4:     Get the estimate of each x_n ∈ S: h_{t,n} = f_t(x_n).
5:     Calculate the error ε_t of h_{t,n}: ε_t = Σ_{n=1}^{N} D_{t,n} I(h_{t,n} ≠ y_n).
6:     Set β_t = ε_t / (1 − ε_t).
7:     Update the boosting weights D_{t+1,n} using Eq. (2).
8: end for
9: return The final classifier H(x).
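Algorithm 1 can be sketched in Python as follows. This is a minimal illustration, not the authors' implementation: a one-level decision stump stands in for the weak classifier f_t (trained to minimize the weighted error, playing the role of J_t), and the standard AdaBoost.M1 multiplicative reweighting, in which correctly classified samples are scaled by β_t before normalization, stands in for Eq. (2), which is not reproduced here. All function and variable names are hypothetical.

```python
import numpy as np

def train_stump(X, y, D):
    """Fit a one-level decision stump minimizing the D-weighted error
    (this plays the role of minimizing J_t in step 3).
    Returns (feature index, threshold, polarity, weighted error)."""
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = D[pred != y].sum()
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost_m1(X, y, T=10):
    """AdaBoost.M1 for binary labels y in {-1, +1}, following Algorithm 1."""
    N = len(y)
    D = np.full(N, 1.0 / N)                 # step 1: D_{1,n} = 1/N
    stumps, alphas = [], []
    for t in range(T):                      # step 2
        j, thr, pol, eps = train_stump(X, y, D)   # steps 3-5
        eps = max(eps, 1e-10)               # avoid division by zero
        if eps >= 0.5:                      # M1 needs a better-than-chance learner
            break
        beta = eps / (1.0 - eps)            # step 6: beta_t
        h = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        D = D * np.where(h == y, beta, 1.0) # step 7: shrink correct samples' weights
        D /= D.sum()                        # renormalize to a distribution
        stumps.append((j, thr, pol))
        alphas.append(np.log(1.0 / beta))   # log(1/beta_t) vote weight
    def H(Xq):                              # step 9: weighted vote of weak learners
        votes = sum(a * np.where(p * (Xq[:, j] - th) >= 0, 1, -1)
                    for (j, th, p), a in zip(stumps, alphas))
        return np.where(votes >= 0, 1, -1)
    return H
```

On a small separable toy set, for example `X = np.array([[0.], [1.], [10.], [11.]])` with `y = np.array([-1, -1, 1, 1])`, the returned classifier `H` recovers the labels exactly after a few rounds.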