
Optimizer

Explore how to implement PyTorch optimizers, such as SGD and Adam, to automate parameter updates during model training. Understand the step and zero_grad methods, and see how optimizers enhance the efficiency of gradient descent in regression problems.

Introduction to optimizers

So far, we have been manually updating the parameters using the computed gradients. That is probably fine for two parameters, but what if we had a whole lot of them? We need to use one of PyTorch’s optimizers like SGD, RMSprop, or Adam.
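For a sense of what that looks like in practice, here is a minimal sketch of handing the parameters over to an optimizer in a simple linear regression; the synthetic data, parameter names, and learning rate are illustrative choices, not taken from the original lesson:

```python
import torch
import torch.optim as optim

# Toy linear regression data: y = 1 + 2x plus a little noise
torch.manual_seed(42)
x = torch.rand(100, 1)
y = 1 + 2 * x + 0.1 * torch.randn(100, 1)

# Parameters we were previously updating by hand
b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

# Hand the parameters over to an optimizer instead of writing the update rule ourselves
optimizer = optim.SGD([b, w], lr=0.1)

for epoch in range(1000):
    yhat = b + w * x
    loss = ((yhat - y) ** 2).mean()

    loss.backward()        # compute gradients for b and w
    optimizer.step()       # update all parameters using their gradients
    optimizer.zero_grad()  # reset gradients before the next iteration

print(b, w)
```

The two method calls do exactly what we were doing manually: step applies the update rule to every parameter the optimizer was given, and zero_grad clears the accumulated gradients so they do not carry over into the next iteration.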

...

There are many optimizers; SGD is the most basic of them, and Adam is one of the most popular.
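One convenient consequence is that switching optimizers typically amounts to changing a single line, since the rest of the training loop stays the same; the learning rate below is an arbitrary example value:

```python
# Same parameters and training loop as before; only the optimizer line changes
optimizer = optim.Adam([b, w], lr=0.1)
```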