Bagging vs Boosting vs Stacking In Machine Learning

Bagging, boosting, and stacking are three ensemble learning techniques used to improve model performance. Bagging trains multiple models independently on bootstrap samples of the data and combines their predictions by majority vote (or averaging, for regression). Boosting trains weak models sequentially, each one correcting the errors of the previous models, to build a stronger model. Stacking combines multiple models by training a meta-model, which takes the base models' predictions as input and outputs the final prediction....
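The three techniques described above can be sketched side by side with scikit-learn's built-in ensemble classes. This is a minimal illustration on synthetic data, not code from the article itself; the dataset and hyperparameters are arbitrary choices for the sake of the example.

```python
# Minimal sketch of bagging, boosting, and stacking with scikit-learn.
# Dataset and hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    AdaBoostClassifier,
    BaggingClassifier,
    StackingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: independent trees on bootstrap samples, combined by majority vote.
bagging = BaggingClassifier(n_estimators=50, random_state=0)

# Boosting: trees trained sequentially, each reweighting the errors
# of the previous ones.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

# Stacking: base-model predictions become features for a meta-model.
stacking = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    final_estimator=LogisticRegression(),
)

for name, model in [
    ("bagging", bagging),
    ("boosting", boosting),
    ("stacking", stacking),
]:
    model.fit(X_train, y_train)
    print(f"{name}: {model.score(X_test, y_test):.3f}")
```

Each model exposes the same `fit`/`predict` interface, so comparing the three on a held-out set is a one-line loop.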

May 2, 2023 · 22 min · Mario Filho

The 4-Step Framework I Use To Build Powerful Machine Learning Ensembles

A lot of people find machine learning ensembles very interesting, probably because they offer an “easy” way to improve the performance of machine learning solutions. Ensembles show up everywhere in Kaggle competitions, but you don’t need to be a top Kaggler to build a good ensemble for your project. I have spent a lot of time building and thinking about ensembles, and here I will share my “4-step ensemble framework”....

May 30, 2022 · 7 min · Mario Filho