# Regularization

Learn what regularization is and how it affects validation and training error.

**Regularization** is a technique for reducing the variance of a model. One way to achieve this is to restrict the parameters to a subset of the parameter space. This reduction in variance helps prevent overfitting.
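The idea of restricting parameters to a subset of the parameter space can be sketched with L2 (ridge) regularization, where a penalty term shrinks the weights toward zero. This is a minimal illustration on a synthetic dataset; the data, penalty strength, and function names are assumptions, not part of the original text.

```python
import numpy as np

# Synthetic regression data (illustrative values only).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))                   # 30 samples, 5 features
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.5, size=30)

def fit_linear(X, y, alpha=0.0):
    """Closed-form least squares with an optional L2 penalty.

    Solves (X^T X + alpha * I) w = X^T y. With alpha = 0 this is
    ordinary least squares; larger alpha shrinks the weights,
    effectively restricting them to a smaller region of the
    parameter space and reducing the model's variance.
    """
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

w_ols = fit_linear(X, y, alpha=0.0)     # unregularized fit
w_ridge = fit_linear(X, y, alpha=10.0)  # regularized fit

# The penalty shrinks the weight vector's norm.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))
```

As the penalty strength `alpha` grows, the fitted weights shrink further, trading a little bias for a larger reduction in variance.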

## Why not choose a simple model?

One approach is to start with a model that is too simple and gradually increase its complexity while monitoring its performance on validation data. Regularization does the reverse: it starts with a complex model and decreases its effective complexity.