Exercise: Randomized Grid Search to Tune XGBoost Hyperparameters
Explore how to use randomized grid search to tune multiple XGBoost hyperparameters simultaneously. Learn to sample parameter ranges effectively, set model parameters, and evaluate validation AUC to optimize model performance without exhaustive searches.
XGBoost for randomized grid search
In this exercise, we’ll use a randomized grid search to explore the space of six hyperparameters. A randomized grid search is a good option when you have many hyperparameters, each with many candidate values, to search over. If, for example, there were five values for each of the six hyperparameters we’d like to test, we’d need 5^6 = 15,625 searches. Even if each model fit took only a second, we’d still need several hours to exhaustively search all possible combinations. A randomized grid search can achieve satisfactory results by searching only a random sample of these combinations. Here, we’ll show how to do this using scikit-learn and XGBoost.
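The combinatorics above are easy to verify directly. The short calculation below (an illustrative sketch, with the one-second-per-fit figure taken from the text) shows why an exhaustive search quickly becomes impractical:

```python
# Exhaustive grid size: five candidate values for each of six hyperparameters
n_values = 5
n_params = 6
n_combinations = n_values ** n_params  # 5**6 = 15,625 model fits

# Assuming roughly one second per model fit (as in the text)
total_hours = n_combinations * 1 / 3600
print(f"{n_combinations} combinations, about {total_hours:.1f} hours")  # ~4.3 hours
```

And this is optimistic: with cross-validation, each combination requires several fits, multiplying the total further.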
The first step in a randomized grid search is to specify the range of values you’d like to sample for each hyperparameter.