Gradient Boosted Regression Trees (GBRT)

Gradient Boosted Regression Trees (GBRT) is a machine learning technique that combines the strengths of decision trees and ensemble methods to create powerful regression models. It builds decision trees sequentially, with each new tree correcting the errors of the previous ones, ultimately yielding more accurate predictions of continuous target variables.
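The sequential error-correcting idea can be sketched from scratch. The snippet below is a minimal illustration, not a production implementation: it assumes squared-error loss (for which the negative gradient is simply the residual), a synthetic sine dataset, and arbitrarily chosen values for the number of trees, tree depth, and learning rate.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

# Start from a constant prediction: the mean of the targets.
pred = np.full_like(y, y.mean())
learning_rate = 0.1  # illustrative value
trees = []

for _ in range(50):
    # For squared-error loss, the negative gradient is just the residual.
    residuals = y - pred
    tree = DecisionTreeRegressor(max_depth=2)  # a shallow "weak learner"
    tree.fit(X, residuals)
    # Each tree's contribution is shrunk by the learning rate.
    pred += learning_rate * tree.predict(X)
    trees.append(tree)

mse = np.mean((y - pred) ** 2)
```

After 50 rounds the training error should be well below that of the initial constant prediction, showing how each tree nudges the ensemble toward the targets.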

GBRT is a versatile and powerful regression technique, but it is prone to overfitting if not properly regularized. Careful hyperparameter tuning is therefore crucial for achieving good results.
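In practice GBRT is usually applied through a library rather than written by hand. The following is a minimal sketch using scikit-learn's `GradientBoostingRegressor` on synthetic data; the dataset and hyperparameter values are illustrative, not recommendations.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression problem for demonstration purposes.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = GradientBoostingRegressor(
    n_estimators=200,   # number of sequential trees
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    max_depth=3,        # depth of each weak learner
    random_state=42,
)
model.fit(X_train, y_train)
mse = mean_squared_error(y_test, model.predict(X_test))
```

On held-out data the fitted ensemble should beat a constant (mean) predictor by a wide margin on this kind of dataset.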


  • GBRT leverages ensemble methods by combining predictions from multiple weak learners (shallow decision trees) into a single, stronger model.
  • Trees are built sequentially, each one improving the overall prediction by learning from the errors of the previous ones.
  • The gradients (errors) of the current model's predictions guide the training of the next tree, so each new tree concentrates on the regions where the model performs poorly.
  • By iteratively refining its predictions, GBRT often achieves higher accuracy than a single decision tree.
  • GBRT handles many types of data, including non-linear relationships and complex feature interactions.
  • GBRT exposes several hyperparameters that can be tuned to optimize performance on a specific dataset.
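The hyperparameter tuning mentioned above is commonly done with cross-validated grid search. The sketch below assumes scikit-learn's `GridSearchCV` and a small, illustrative grid over a few common GBRT hyperparameters; real grids are usually chosen based on the dataset and available compute.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

# Small synthetic dataset so the grid search runs quickly.
X, y = make_regression(n_samples=300, n_features=5, noise=5.0, random_state=0)

param_grid = {
    "n_estimators": [100, 200],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
    "subsample": [0.8, 1.0],  # < 1.0 gives stochastic gradient boosting
}
search = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid,
    cv=3,
    scoring="neg_mean_squared_error",
)
search.fit(X, y)
best_params = search.best_params_
```

`best_params` then holds the combination with the best cross-validated score, which can be used to refit a final model on the full training set.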