Challenge: Optimizers

Test your understanding by training a model using multiple optimizers.

Problem statement

Using the JAX library, we have implemented a vanilla convolutional neural network for you. Your task is to initialize the optimizers and observe the change in the loss (if any). Perform the following tasks:

  1. Apply SGD as an optimizer.
  2. Apply AdaBelief as an optimizer.
  3. Compare the resulting loss values. Which optimizer performs better in this case? (See the sketch after this list for one way to set up the comparison.)
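The sketch below is one possible way to set up the comparison, assuming the Optax library, which is commonly used for optimizers in JAX. The course's CNN and notebook are not shown here, so loss_fn, params, and the toy regression data are hypothetical stand-ins; the point is the initialization of optax.sgd and optax.adabelief and the shared update loop.

import jax
import jax.numpy as jnp
import optax

def loss_fn(params, batch):
    # Placeholder loss: replace with the CNN's forward pass and loss.
    inputs, targets = batch
    preds = inputs @ params["w"] + params["b"]
    return jnp.mean((preds - targets) ** 2)

def train(optimizer, params, batch, num_steps=100):
    # Initialize the optimizer state for the given parameters.
    opt_state = optimizer.init(params)

    @jax.jit
    def step(params, opt_state, batch):
        # Compute the loss and its gradients w.r.t. the parameters.
        loss, grads = jax.value_and_grad(loss_fn)(params, batch)
        # Transform the gradients into updates and apply them.
        updates, opt_state = optimizer.update(grads, opt_state, params)
        params = optax.apply_updates(params, updates)
        return params, opt_state, loss

    for _ in range(num_steps):
        params, opt_state, loss = step(params, opt_state, batch)
    return loss

# Toy data and parameters, purely for illustration.
key = jax.random.PRNGKey(0)
inputs = jax.random.normal(key, (32, 4))
targets = jax.random.normal(key, (32, 1))
params = {"w": jnp.zeros((4, 1)), "b": jnp.zeros((1,))}
batch = (inputs, targets)

# Task 1: SGD. Task 2: AdaBelief. Same learning rate for a fair comparison.
sgd_loss = train(optax.sgd(learning_rate=1e-2), params, batch)
adabelief_loss = train(optax.adabelief(learning_rate=1e-2), params, batch)
print("SGD final loss:", sgd_loss)
print("AdaBelief final loss:", adabelief_loss)

Because the training loop is identical for both runs, the final losses are directly comparable; which optimizer wins depends on the problem, so record the loss from each run in your own notebook and compare.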

Try it yourself

Write your code in the notebook given below:
