AdaBoost

Learn about the boosting technique for classification. We will discuss AdaBoost in detail.

Boosting

Boosting is a very popular machine learning technique, widely used both in online ML competitions and in industry. Boosting is the technique of combining weak classifiers to build a strong classifier. A weak learner can be any simple classifier, such as logistic regression or a shallow decision tree. A weak classifier like a decision tree with a depth of 2 has low variance error but high bias error. If we address the bias by increasing the model's complexity, for example by growing a deeper tree, the training error is reduced, but the testing error eventually starts to increase because the model overfits. Boosting helps in these scenarios: it keeps each individual learner simple while reducing the bias of the combined model.
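As a concrete starting point, here is a minimal sketch of this trade-off using scikit-learn's `AdaBoostClassifier` with decision stumps (depth-1 trees) as the weak learners. The synthetic dataset and the specific parameter values are assumptions chosen purely for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data (assumed here for illustration only).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single shallow tree: low variance, but high bias.
stump = DecisionTreeClassifier(max_depth=1).fit(X_train, y_train)
print("single stump test accuracy:", stump.score(X_test, y_test))

# Boosting 100 such stumps reduces bias while keeping each learner simple.
# Note: scikit-learn versions before 1.2 call this parameter `base_estimator`.
boosted = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=100,
    random_state=0,
).fit(X_train, y_train)
print("boosted ensemble test accuracy:", boosted.score(X_test, y_test))
```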

We combine the weak learners with weights. We start by training the first weak classifier with all data examples weighted equally. Before training the next classifier, we increase the weights of the examples the previous classifier got wrong, so the new classifier focuses on them. The idea behind boosting is to concentrate on the examples that are hard to predict correctly. We keep updating the example weights after each classifier's results. Finally, we combine the classifiers themselves in a weighted manner, giving more accurate classifiers a larger say in the final prediction.
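To make the reweighting concrete, here is a minimal from-scratch sketch of this scheme, under the usual convention that labels are encoded as -1/+1. The function names `adaboost_fit` and `adaboost_predict` are illustrative, not a library API.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Train boosted decision stumps. Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start with uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)     # weak learner sees current weights
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)   # weighted training error
        if err >= 0.5:                       # no better than chance: stop early
            break
        # Classifier weight: more accurate stumps get a larger say.
        alpha = 0.5 * np.log((1 - err) / (err + 1e-10))
        # Up-weight misclassified examples, down-weight correct ones.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                         # renormalize to a distribution
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    """Weighted vote of all stumps; the sign gives the predicted class."""
    score = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(score)
```

Note how the two kinds of weights appear: the example weights `w` steer each new stump toward the hard examples, while the classifier weights `alpha` control how much each stump contributes to the final weighted vote.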
