Gradient descent is a powerful optimization algorithm, but it is not perfect. Let's take a look at its issues and see how we can fix them to improve our model.

Problem 1: When the gradients are too large

Each update step is proportional to the gradient, so a steep slope produces a large step that can jump past the minimum. Instead of converging, successive updates land further and further from where we started, and the algorithm diverges.
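To make the overshooting concrete, here is a minimal sketch (the function, step sizes, and `gradient_descent` helper are illustrative assumptions, not from the course) that runs plain gradient descent on f(w) = w². With a small learning rate, the iterates shrink toward the minimum at w = 0; with a large one, every step overshoots to the other side of the minimum with a bigger magnitude, and the values blow up.

```python
# Illustrative sketch: gradient descent on f(w) = w**2, whose gradient is 2*w.
# The update is w <- w - lr * grad. With lr = 1.1, each step multiplies w
# by (1 - 2 * lr) = -1.2, so |w| grows at every step and the run diverges.

def gradient_descent(w, lr, steps):
    history = [w]
    for _ in range(steps):
        grad = 2 * w        # gradient of f(w) = w**2
        w = w - lr * grad   # standard gradient descent update
        history.append(w)
    return history

print(gradient_descent(w=1.0, lr=0.1, steps=5))
# [1.0, 0.8, 0.64, 0.512, 0.4096, 0.32768]  -> converging toward 0

print(gradient_descent(w=1.0, lr=1.1, steps=5))
# [1.0, -1.2, 1.44, -1.728, 2.0736, -2.48832]  -> overshooting and diverging
```

Note how the diverging run alternates in sign: each step crosses the minimum and lands on the opposite slope, further away than before.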
