Challenge: Optimizers
Test your understanding by training a model using multiple optimizers.
Problem statement
Using the JAX library, we have implemented a vanilla convolutional neural network for you. Your job is to initialize different optimizers and observe how the loss changes (if at all). Perform the following tasks (a reference sketch follows the list):
- Apply SGD as an optimizer.
- Apply AdaBelief as an optimizer.
- Compare the loss values. Which optimizer performs better in this case?
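The snippet below is a minimal sketch of how the two optimizers can be set up with Optax, the gradient-processing library commonly paired with JAX. The toy quadratic `loss_fn`, the learning rate of 0.1, and the step count are illustrative assumptions, not the course's actual CNN or hyperparameters; in the notebook you would swap in the provided network's forward pass and loss.

```python
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
    # Toy stand-in for the CNN's loss; replace with the real forward pass.
    return jnp.sum((params - 3.0) ** 2)

def train(optimizer, steps=100):
    """Run a few gradient steps and return the final loss."""
    params = jnp.zeros(4)
    opt_state = optimizer.init(params)
    for _ in range(steps):
        grads = jax.grad(loss_fn)(params)
        updates, opt_state = optimizer.update(grads, opt_state, params)
        params = optax.apply_updates(params, updates)
    return loss_fn(params)

# Learning rate chosen for illustration only; tune it for the real model.
sgd_loss = train(optax.sgd(learning_rate=0.1))
adabelief_loss = train(optax.adabelief(learning_rate=0.1))
print(f"SGD final loss:       {sgd_loss:.6f}")
print(f"AdaBelief final loss: {adabelief_loss:.6f}")
```

AdaBelief adapts its step size to how closely the observed gradient matches its exponential-moving-average prediction, so it often reduces the loss faster than plain SGD; the comparison is sensitive to the learning rate, though, so run both variants in the notebook to see which performs better on the provided CNN.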
Try it yourself
Write your code in the notebook provided below.