
Other Important Hyperparameters in XGBoost

Understand and apply key XGBoost hyperparameters to control model complexity and improve performance. Learn how settings like max_depth, gamma, and subsample influence gradient boosting, helping you optimize predictive accuracy and reduce overfitting.

XGBoost hyperparameters

We’ve seen that overfitting in XGBoost can be mitigated by adjusting the learning rate and by using early stopping. What other hyperparameters might be relevant? XGBoost has many hyperparameters, and we won’t list them all here; you’re encouraged to consult the XGBoost documentation for the full list.
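To make these settings concrete, here is a minimal sketch of a parameter dictionary of the kind typically passed to XGBoost's training API. The specific values are illustrative assumptions, not recommendations; the parameter names (`max_depth`, `gamma`, `subsample`, `colsample_bytree`, `learning_rate`) are standard XGBoost hyperparameters:

```python
# Illustrative XGBoost hyperparameter settings (values are assumptions,
# chosen only to show what each knob controls).
params = {
    "max_depth": 3,            # cap tree depth to limit model complexity
    "gamma": 1.0,              # minimum loss reduction required to make a split
    "subsample": 0.8,          # fraction of training rows sampled per tree
    "colsample_bytree": 0.8,   # fraction of features sampled per tree
    "learning_rate": 0.1,      # shrinkage applied to each tree's contribution
    "objective": "binary:logistic",  # binary classification objective
}

# With the xgboost package installed, these would typically be used as:
#   booster = xgboost.train(params, dtrain, num_boost_round=100)
```

Smaller `max_depth`, larger `gamma`, and `subsample`/`colsample_bytree` values below 1.0 all act as regularizers, trading some training-set fit for better generalization.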

In an upcoming exercise, we’ll do a grid search over ranges of six hyperparameters, ...