Hyper-Parameter Optimization and Kaggle Competition
Explore key hyper-parameter optimization methods including manual tuning, grid search, random search, Bayesian optimization, and genetic algorithms. Understand how these techniques improve machine learning model performance. Apply these concepts through a practical Kaggle competition for hands-on learning.
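Two of the methods listed above, grid search and random search, can be contrasted in a few lines. The sketch below is a minimal illustration, not any particular library's API: `score_fn` is a hypothetical stand-in for training a model with the given parameters and returning its validation score, and the parameter grid is made up for the example.

```python
import itertools
import random

def grid_search(score_fn, grid):
    """Exhaustively evaluate every combination in the grid; return (params, score) of the best."""
    best = None
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        score = score_fn(params)
        if best is None or score > best[1]:
            best = (params, score)
    return best

def random_search(score_fn, grid, n_iter=10, seed=0):
    """Sample n_iter random combinations from the grid; return (params, score) of the best."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_iter):
        params = {key: rng.choice(values) for key, values in grid.items()}
        score = score_fn(params)
        if best is None or score > best[1]:
            best = (params, score)
    return best
```

Grid search guarantees the best combination in the grid but its cost grows multiplicatively with each added hyper-parameter, while random search covers large spaces with a fixed budget at the price of possibly missing the optimum.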
Cross-Validation
In the chapter on Regression, we looked at cross-validation, the intuition behind it, and its benefits. We saw how k-fold cross-validation divides the training dataset into k folds, trains on k − 1 of them, and evaluates on the remaining fold.
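The k-fold procedure described above can be sketched directly. This is a minimal illustration under assumed names: `MeanRegressor` is a toy model invented for the example, and any model exposing `fit` and `score` methods could be swapped in.

```python
import numpy as np

class MeanRegressor:
    """Toy model for illustration: always predicts the mean of its training targets."""
    def fit(self, X, y):
        self.mean_ = float(np.mean(y))
        return self

    def score(self, X, y):
        # Negative mean squared error, so that higher is better
        return -float(np.mean((y - self.mean_) ** 2))

def cross_validate(make_model, X, y, k=5, seed=0):
    """k-fold cross-validation: train on k-1 folds, evaluate on the held-out fold, average the scores."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), k)
    scores = []
    for i in range(k):
        val_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        model = make_model().fit(X[train_idx], y[train_idx])
        scores.append(model.score(X[val_idx], y[val_idx]))
    return float(np.mean(scores))
```

Because every sample is held out exactly once, the averaged score is a less optimistic estimate of generalization than a single train/validation split.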