
Algorithm 1. AdaBoost

1. Initialize the weight distribution of the training samples:
   D_1 = (ω_11, ω_12, …, ω_1i, …, ω_1N),  ω_1i = 1/N,  i = 1, 2, …, N
2. For m = 1, 2, …, M iterations:
   (1) Training a base learner G_m(x) on the sample set weighted by D_m
   (2) Calculating the maximum error on the training set: E_m = max_i |y_i − G_m(x_i)|
   (3) Calculating the relative error of each sample:
   e_mi = |y_i − G_m(x_i)| / E_m
   (4) Calculating the regression error rate:
   e_m = Σ_{i=1}^N ω_mi · e_mi
   (5) Calculating the weight coefficients of weak learners:
   α_m = e_m / (1 − e_m)
   (6) Updating the weight distribution of the sample set:
   ω_{m+1,i} = (ω_mi / Z_m) · α_m^(1 − e_mi)
   Z_m = Σ_{i=1}^N ω_mi · α_m^(1 − e_mi)
   D_{m+1} = (ω_{m+1,1}, ω_{m+1,2}, …, ω_{m+1,i}, …, ω_{m+1,N})
3. Output the final strong learner:
   f(x) = Σ_{m=1}^M ln(1/α_m) · G_m(x)
end
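To make the steps of Algorithm 1 concrete, the following is a minimal NumPy/scikit-learn sketch of this AdaBoost.R2-style regressor. The depth-3 decision-tree base learner, the early stop when e_m ≥ 0.5, and the normalization of the final combination by the sum of the ln(1/α_m) coefficients are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_r2_fit(X, y, M=50):
    """Fit up to M base learners following Algorithm 1."""
    N = len(y)
    w = np.full(N, 1.0 / N)                    # step 1: D_1, uniform weights
    learners, alphas = [], []
    for _ in range(M):
        G = DecisionTreeRegressor(max_depth=3)  # assumed base learner
        G.fit(X, y, sample_weight=w)            # (1) train on weighted set
        pred = G.predict(X)
        E = np.abs(y - pred).max()              # (2) maximum error E_m
        if E == 0:                              # perfect fit: keep it and stop
            learners.append(G)
            alphas.append(1e-10)
            break
        e_i = np.abs(y - pred) / E              # (3) relative errors e_mi
        e = np.sum(w * e_i)                     # (4) regression error rate e_m
        if e >= 0.5:                            # assumed stopping rule
            break
        alpha = e / (1.0 - e)                   # (5) coefficient α_m
        w = w * alpha ** (1.0 - e_i)            # (6) reweight samples ...
        w /= w.sum()                            # ... normalizing by Z_m
        learners.append(G)
        alphas.append(alpha)
    return learners, np.log(1.0 / np.array(alphas))

def adaboost_r2_predict(learners, coefs, X):
    # step 3: combine predictions with weights ln(1/α_m); dividing by the
    # weight sum (an added normalization) keeps the output on the scale of y
    preds = np.array([g.predict(X) for g in learners])
    return coefs @ preds / coefs.sum()
```

Note that Drucker's classical AdaBoost.R2 combines the base learners with a weighted median rather than a weighted mean; the weighted-sum form above follows the output formula in the algorithm box.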