Gradient Boosting (GB)
Metadata
Aliases: Gradient Boosting, GB, Gradient Boosting Machine, GBM

Gradient Boosting builds decision trees sequentially, where each new tree learns from the mistakes of the trees before it: it is fit to the residual errors of the current ensemble. The whole aim of Gradient Boosting is to reduce this residual error as much as possible.
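
To make the residual-fitting loop concrete, here is a minimal from-scratch sketch for regression with squared-error loss. The function names and parameter values are illustrative assumptions, not a library API; only the tree learner comes from scikit-learn.

```python
# Illustrative sketch of gradient boosting for regression (squared-error loss).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    """Fit trees sequentially, each on the residuals of the ensemble so far."""
    base_prediction = y.mean()                      # initial model: a constant
    prediction = np.full_like(y, base_prediction, dtype=float)
    trees = []
    for _ in range(n_trees):
        residuals = y - prediction                  # mistakes of the current ensemble
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)                      # new tree learns the mistakes
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return base_prediction, trees

def predict(X, base_prediction, trees, learning_rate=0.1):
    prediction = np.full(X.shape[0], base_prediction)
    for tree in trees:
        prediction += learning_rate * tree.predict(X)
    return prediction
```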

Gradient Boosting is similar to AdaBoost (Adaptive Boosting). The difference between the two is that AdaBoost typically builds decision stumps (trees with a single split) and adapts by re-weighting misclassified samples, whereas Gradient Boosting builds decision trees with multiple leaves and fits each new tree to the residual errors.
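
A quick way to see this contrast in scikit-learn, assuming it is available: AdaBoostClassifier defaults to depth-1 stumps, while GradientBoostingClassifier grows trees with multiple leaves. The dataset and parameter values below are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: the default base estimator is a depth-1 decision stump.
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Gradient Boosting: deeper trees with multiple leaves (max_depth=3 here).
gbm = GradientBoostingClassifier(n_estimators=100, max_depth=3,
                                 random_state=0).fit(X_train, y_train)

print("AdaBoost accuracy:", ada.score(X_test, y_test))
print("GBM accuracy:     ", gbm.score(X_test, y_test))
```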

Notes:

  • Early stopping: to prevent overfitting and save computation, training can be stopped when the validation error stops improving (see the sketch after this list).
  • Regularization (e.g. shrinkage via the learning rate, subsampling, limiting tree depth) can be used to prevent overfitting and improve generalization.
  • The resulting ensemble of many trees is difficult to interpret compared to a single decision tree.
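
As a sketch of the early-stopping and regularization notes above, scikit-learn's GradientBoostingRegressor exposes both through constructor parameters; the specific values here are illustrative assumptions, not recommended defaults.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=1000, noise=10.0, random_state=0)

gbr = GradientBoostingRegressor(
    n_estimators=1000,        # upper bound; early stopping may fit fewer trees
    learning_rate=0.05,       # shrinkage: regularizes by scaling each tree's contribution
    max_depth=3,              # limits the complexity of each tree
    subsample=0.8,            # row subsampling (stochastic gradient boosting)
    validation_fraction=0.1,  # held-out split used to monitor validation error
    n_iter_no_change=10,      # stop if validation score stalls for 10 rounds
    tol=1e-4,
    random_state=0,
)
gbr.fit(X, y)
print("Trees actually fit:", gbr.n_estimators_)
```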