2022 May 20;2022:1668676. doi: 10.1155/2022/1668676

Table 2.

Summarized differences between Bagging and Boosting.

| Bagging | Boosting |
|---|---|
| Weak models are often trained independently, in parallel. | Weak models are often trained sequentially, in an adaptive way. |
| Bagging focuses on obtaining an ensemble model with lower variance. | Boosting focuses on producing a strong model with lower bias, although variance can also be reduced. |
| Different weak learners can be fitted independently and trained concurrently. | Weak learners cannot be fitted independently; models are fitted iteratively, and the training of each model depends on the previously fitted models. |
| The idea behind bagging is to construct an ensemble predictor by aggregating the results of independently trained base models. | The idea behind boosting is to construct a set of models sequentially and combine them to obtain a strong learner. |
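The contrast in the table can be illustrated with a minimal from-scratch sketch. This is an illustrative toy example, not the paper's method: it assumes decision stumps as the weak learners on a hypothetical 1-D dataset, with bagging aggregating independently bootstrapped stumps by majority vote and AdaBoost fitting stumps sequentially by reweighting the examples the previous stump misclassified.

```python
import random
import math

# Hypothetical toy 1-D dataset: label +1 if x > 0.5, else -1 (illustrative only)
random.seed(0)
X = [i / 20 for i in range(20)]
y = [1 if x > 0.5 else -1 for x in X]

def fit_stump(X, y, w):
    """Weak learner: pick the threshold/polarity with the lowest weighted error."""
    best = None
    for thr in X:
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if (pol if xi > thr else -pol) != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best  # (weighted_error, threshold, polarity)

def stump_predict(thr, pol, x):
    return pol if x > thr else -pol

def bagging(X, y, n_models=15):
    """Bagging: each stump is fitted independently on a bootstrap resample
    (the fits do not depend on each other, so they could run in parallel)."""
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = [random.randrange(n) for _ in range(n)]  # bootstrap sample
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        _, thr, pol = fit_stump(Xb, yb, [1.0] * n)
        models.append((thr, pol))
    # Aggregate by majority vote -> reduces variance
    return lambda x: 1 if sum(stump_predict(t, p, x) for t, p in models) >= 0 else -1

def adaboost(X, y, n_models=15):
    """Boosting (AdaBoost): stumps are fitted sequentially; each fit depends on
    the weights produced by the previous stump's errors."""
    n = len(X)
    w = [1.0 / n] * n
    models = []
    for _ in range(n_models):
        err, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-10)  # avoid division by zero for a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        models.append((alpha, thr, pol))
        # Upweight the examples this stump got wrong (the adaptive step)
        w = [wi * math.exp(-alpha * yi * stump_predict(thr, pol, xi))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    # Aggregate by weighted vote -> reduces bias
    return lambda x: 1 if sum(a * stump_predict(t, p, x) for a, t, p in models) >= 0 else -1

bag = bagging(X, y)
boost = adaboost(X, y)
bag_acc = sum(bag(xi) == yi for xi, yi in zip(X, y)) / len(X)
boost_acc = sum(boost(xi) == yi for xi, yi in zip(X, y)) / len(X)
```

The structural difference is visible in the loops: the bagging loop body never reads the previous iteration's result, while the boosting loop carries the weight vector `w` from one stump to the next.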