Introduction: XGBoost
Explore the core concepts of gradient boosting and the XGBoost algorithm to build efficient predictive models. Understand how to train XGBoost models with advanced techniques like early stopping and loss-guided tree growth. Discover how SHAP values provide detailed, individualized explanations of model predictions, helping you interpret predictions on new data, not just the training set.
Overview
After reading this chapter, you will be able to describe the concept of gradient boosting, the fundamental idea underlying the XGBoost package. You will then train XGBoost models on synthetic data, learning about early stopping and several XGBoost hyperparameters along the way. In addition to growing trees the way we have previously (by setting max_depth), you'll also discover a new way of growing trees offered by XGBoost: loss-guided tree growing. After learning about XGBoost, you'll be introduced to a powerful way of explaining model predictions, called SHAP (SHapley Additive exPlanations).