Decision Stump

A decision stump is a simple one-level decision tree, often used as a weak learner in ensemble learning methods, especially boosting algorithms. It consists of a single decision node and two leaf nodes, so every prediction is made by applying a threshold test to a single feature of the training data.
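The structure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: it assumes binary labels in {-1, +1} and finds the best split by exhaustively trying every (feature, threshold, polarity) combination, which is the simplest fitting strategy for a stump.

```python
import numpy as np

class DecisionStump:
    """One-level decision tree: a single threshold test on one feature."""

    def fit(self, X, y):
        # Exhaustively search every feature/threshold/polarity combination
        # for the split with the lowest misclassification rate.
        best_err = float("inf")
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = np.where(polarity * (X[:, j] - t) >= 0, 1, -1)
                    err = np.mean(pred != y)
                    if err < best_err:
                        best_err = err
                        self.feature, self.threshold, self.polarity = j, t, polarity
        return self

    def predict(self, X):
        # Apply the single learned threshold test.
        return np.where(
            self.polarity * (X[:, self.feature] - self.threshold) >= 0, 1, -1
        )
```

For example, on the one-dimensional data `X = [[1], [2], [3], [4]]` with labels `[-1, -1, 1, 1]`, the stump learns the threshold 3 on feature 0 and classifies all four points correctly.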


Notes:

  • Decision stumps are limited in expressive power due to their simple structure, yet they are effective when used in ensemble methods.
  • Well-known Ensemble Learning Methods like AdaBoost (Adaptive Boosting) use decision stumps as building blocks to create more powerful classifiers.
  • Decision stumps are commonly employed in binary classification tasks, where the ensemble combines their individual predictions into a single, more accurate classifier.