Challenge: Loss and Activation Functions
Explore how to implement and test various loss and activation functions using JAX. Understand their impact on neural network training by working with built-in and custom functions, including softmax cross-entropy, cosine similarity, ReLU, and ELU.
Challenge 1: Implementation of loss functions
You are required to apply the following built-in JAX functions to the given logits and labels:
- Softmax cross-entropy ...
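As a starting point, softmax cross-entropy can be computed with JAX's `jax.nn.log_softmax`. The sketch below uses hypothetical example `logits` and one-hot `labels` (not the challenge's given data, which is elided above):

```python
import jax.numpy as jnp
from jax import nn

# Hypothetical example inputs: 2 examples, 3 classes
logits = jnp.array([[2.0, 1.0, 0.1],
                    [0.5, 2.5, 0.3]])
labels = jnp.array([[1.0, 0.0, 0.0],   # one-hot targets
                    [0.0, 1.0, 0.0]])

def softmax_cross_entropy(logits, labels):
    # Per-example loss: -sum(labels * log_softmax(logits)) over the class axis
    return -jnp.sum(labels * nn.log_softmax(logits, axis=-1), axis=-1)

loss = softmax_cross_entropy(logits, labels)
print(loss)  # one loss value per example
```

Libraries such as Optax also provide this loss directly (`optax.softmax_cross_entropy`), which should agree with the manual version above.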