AdaBoost (Adaptive Boosting)

AdaBoost is a machine learning technique for boosting. It is similar to the Random Forest algorithm, with the following differences:

  • Rather than building a forest of full decision trees, AdaBoost builds a forest of decision stumps (one-level trees).
  • Each decision stump is assigned a different weight in the final decision.
  • It assigns higher weights to data points that are wrongly classified, so they are given more importance when building the next model.
  • It combines multiple weak classifiers into a single strong classifier.
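The pieces above can be sketched end to end. The following is a minimal NumPy illustration, not a production implementation: it assumes labels of ±1, uses an exhaustive one-feature decision stump as the weak learner, and the helper names (`train_stump`, `adaboost`, and so on) are invented for this example:

```python
import numpy as np

def train_stump(X, y, w):
    """Find the one-feature threshold split with the lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] < thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best  # (weighted error, feature, threshold, sign)

def stump_predict(X, j, thr, sign):
    return sign * np.where(X[:, j] < thr, 1, -1)

def adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        err, j, thr, sign = train_stump(X, y, w)
        err = max(err, 1e-10)                  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # this stump's say in the vote
        pred = stump_predict(X, j, thr, sign)
        w *= np.exp(-alpha * y * pred)         # up-weight the mistakes
        w /= w.sum()                           # renormalise to sum to 1
        stumps.append((j, thr, sign))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    scores = sum(a * stump_predict(X, j, thr, s)
                 for a, (j, thr, s) in zip(alphas, stumps))
    return np.sign(scores)

# Toy 1-D data: the positive class sits in the interval [2, 4], which no
# single stump can capture, but a few boosted stumps separate it.
X = np.arange(7, dtype=float).reshape(-1, 1)
y = np.array([-1, -1, 1, 1, 1, -1, -1])
stumps, alphas = adaboost(X, y, n_rounds=5)
print((predict(X, stumps, alphas) == y).all())  # the training set is fit perfectly
```

Note how the forest of stumps, the per-stump weights (`alpha`), and the re-weighting of misclassified points each correspond to one bullet above.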

It is called Adaptive Boosting because the instance weights are re-assigned in every round, with higher weights given to incorrectly classified instances. The learners grow sequentially: except for the first, each subsequent learner is grown from the previously grown ones. In other words, it follows the general boosting principle of converting weak learners into strong ones, with this adaptive re-weighting as the distinguishing detail.


In simple words: in each iteration, AdaBoost identifies misclassified data points and increases their weights so that the next classifier pays them extra attention and gets them right.
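This re-weighting can be followed with small numbers. The sketch below uses the discrete-AdaBoost update with ±1 labels; the data and weights are made up for illustration:

```python
import numpy as np

y    = np.array([ 1,  1, -1, -1, -1])   # true labels
pred = np.array([ 1, -1, -1, -1,  1])   # weak learner's output: 2 mistakes
w    = np.full(5, 0.2)                  # current sample weights (uniform)

err = w[pred != y].sum()                 # weighted error = 0.4
alpha = 0.5 * np.log((1 - err) / err)    # this learner's say ≈ 0.203

w = w * np.exp(-alpha * y * pred)        # mistakes grow, correct ones shrink
w = w / w.sum()                          # renormalise to sum to 1
print(np.round(w, 3))                    # → [0.167 0.25  0.167 0.167 0.25 ]
```

The two misclassified points end up with weight 0.25 each while the three correct ones drop to 1/6, so the next weak learner is pushed toward fixing the mistakes.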


  • It was originally designed for binary classification problems, though its use has been extended to many other applications.
  • Boosting reduces both bias and variance in supervised learning.


Common applications include:

  • Face detection
  • Spam filtering
  • Fraud detection
  • Medical diagnosis
  • Predicting customer churn