Evaluating XGBoost with tidymodels
Understand how to implement and evaluate XGBoost machine learning models in R using tidymodels. Explore cross-validation methods and hyperparameter tuning to improve model accuracy and reliability, with practical application to the Titanic dataset.
We'll cover the following...
Cross-validation
Cross-validation is the standard best practice for evaluating the quality of machine learning models, and XGBoost ensembles are no exception. The following code performs these steps:
Prepares the Titanic training data.
Specifies the XGBoost classification ensemble with hyperparameter tuning.
Creates a tuning grid.
Performs five-fold cross-validation for each row of the tuning grid.
Outputs the best ten combinations of hyperparameters in terms of ...
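The steps above can be sketched with tidymodels along the following lines. This is a minimal illustration, not the lesson's definitive code: the predictor columns, the grid ranges, the random seed, and the choice of accuracy as the ranking metric are all assumptions.

```r
library(tidymodels)
library(titanic)  # assumption: Titanic data taken from the titanic package

# 1. Prepare the Titanic training data (column selection is an assumption)
train_df <- titanic::titanic_train %>%
  mutate(Survived = factor(Survived),
         Sex = factor(Sex),
         Pclass = factor(Pclass)) %>%
  select(Survived, Pclass, Sex, Age, SibSp, Parch, Fare) %>%
  drop_na()

# 2. Specify an XGBoost classifier with hyperparameters marked for tuning
xgb_spec <- boost_tree(
  trees = tune(),
  tree_depth = tune(),
  learn_rate = tune()
) %>%
  set_engine("xgboost") %>%
  set_mode("classification")

# 3. Create a tuning grid (ranges are illustrative;
#    learn_rate() works on the log10 scale by default)
xgb_grid <- grid_regular(
  trees(range = c(100, 500)),
  tree_depth(range = c(2, 6)),
  learn_rate(range = c(-2, -1)),
  levels = 3
)

# 4. Five-fold cross-validation, stratified on the outcome
set.seed(1234)
folds <- vfold_cv(train_df, v = 5, strata = Survived)

xgb_wf <- workflow() %>%
  add_formula(Survived ~ .) %>%
  add_model(xgb_spec)

xgb_results <- tune_grid(xgb_wf, resamples = folds, grid = xgb_grid)

# 5. Show the best ten hyperparameter combinations
#    (accuracy as the metric is an assumption)
show_best(xgb_results, metric = "accuracy", n = 10)
```

Each row of the grid is fit and assessed five times, once per fold, so `show_best()` ranks hyperparameter combinations by their mean out-of-fold performance rather than by a single train/test split.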