Optimizer

Learn about the different optimizers in PyTorch and some of their built-in methods.

Introduction to optimizers

So far, we have been manually updating the parameters using the computed gradients. That is manageable for two parameters, but what if we had many more? This is where PyTorch’s optimizers, such as SGD, RMSprop, or Adam, come in.
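As a minimal sketch of the idea (the toy linear model, the learning rate, and the loop length below are illustrative assumptions, not part of the lesson), an optimizer takes over the manual update step: we hand it the parameters once, and then call step() instead of editing each tensor by hand.

import torch

# Toy linear model for illustration: y = w * x + b
w = torch.randn(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# Assumed synthetic data for this sketch
x = torch.linspace(-1, 1, 50)
y = 3 * x + 0.5

# Register the parameters with an optimizer instead of updating them manually
optimizer = torch.optim.SGD([w, b], lr=0.1)

for _ in range(100):
    y_hat = w * x + b
    loss = torch.mean((y_hat - y) ** 2)

    optimizer.zero_grad()   # clear gradients from the previous iteration
    loss.backward()         # compute gradients for w and b
    optimizer.step()        # update all registered parameters in one call

Swapping the optimization algorithm is then a one-line change, for example torch.optim.Adam([w, b], lr=0.01) or torch.optim.RMSprop([w, b], lr=0.01), with the training loop left untouched.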
