Evaluating XGBoost with tidymodels

Learn how to evaluate cross-validation results of the XGBoost algorithm using tidymodels.

Cross-validation is a best practice for evaluating the quality of machine learning models, and XGBoost ensembles are no exception. The following code performs these steps:

  1. Prepares the Titanic training data.

  2. Specifies the XGBoost classification ensemble with hyperparameter tuning.

  3. Creates a tuning grid.

  4. Performs five-fold cross-validation for each row of the tuning grid.

  5. Outputs the ten best hyperparameter combinations as measured by the accuracy, sensitivity, and specificity metrics.
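The steps above can be sketched in R with tidymodels as follows. This is a minimal illustration, not the lesson's exact code: it assumes a data frame named `titanic_train` with a factor outcome `Survived`, and the chosen predictors, grid ranges, and seed are illustrative assumptions.

```r
library(tidymodels)
library(xgboost)

# 1. Prepare the Titanic training data (assumed: `titanic_train` with a
#    factor outcome `Survived`; the predictor columns are illustrative).
titanic_recipe <- recipe(Survived ~ Pclass + Sex + Age + Fare,
                         data = titanic_train) %>%
  step_impute_median(all_numeric_predictors()) %>%
  step_dummy(all_nominal_predictors())

# 2. Specify an XGBoost classification ensemble with tuned hyperparameters.
xgb_spec <- boost_tree(
  trees = tune(),
  tree_depth = tune(),
  learn_rate = tune()
) %>%
  set_engine("xgboost") %>%
  set_mode("classification")

xgb_workflow <- workflow() %>%
  add_recipe(titanic_recipe) %>%
  add_model(xgb_spec)

# 3. Create a tuning grid (a small regular grid for illustration).
xgb_grid <- grid_regular(
  trees(range = c(100, 500)),
  tree_depth(range = c(3, 8)),
  learn_rate(range = c(-2, -1)),  # dials uses a log10 scale here
  levels = 3
)

# 4. Five-fold cross-validation for each row of the tuning grid.
set.seed(1234)
titanic_folds <- vfold_cv(titanic_train, v = 5, strata = Survived)

xgb_tuned <- tune_grid(
  xgb_workflow,
  resamples = titanic_folds,
  grid = xgb_grid,
  metrics = metric_set(accuracy, sensitivity, specificity)
)

# 5. Show the ten best hyperparameter combinations by accuracy;
#    collect_metrics(xgb_tuned) returns all three metrics per combination.
show_best(xgb_tuned, metric = "accuracy", n = 10)
```

Stratifying the folds on `Survived` keeps the class balance roughly constant across resamples, which matters for sensitivity and specificity estimates on a dataset this size.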
