
Challenge: Loss and Activation Functions

Explore how to implement and test various loss and activation functions using JAX. Understand their impact on neural network training by working with built-in and custom functions, including softmax cross-entropy, cosine similarity, ReLU, and ELU.

Challenge 1: Implementation of loss functions

You are required to apply the following built-in JAX functions to the given logits and labels:

  1. Softmax cross-entropy ...
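
As a starting point, the softmax cross-entropy step can be sketched with plain JAX. The challenge's actual logits and labels are not shown here, so the arrays below are hypothetical placeholders; the loss itself is computed with `jax.nn.log_softmax`, which is the numerically stable building block for this function.

```python
import jax.numpy as jnp
from jax import nn

# Hypothetical inputs: the challenge's actual logits/labels are not shown.
logits = jnp.array([[2.0, 1.0, 0.1],
                    [0.5, 2.5, 0.3]])
labels = jnp.array([[1.0, 0.0, 0.0],   # one-hot targets
                    [0.0, 1.0, 0.0]])

def softmax_cross_entropy(logits, labels):
    # Cross-entropy between one-hot labels and softmax(logits),
    # computed stably via log_softmax (avoids overflow in exp).
    return -jnp.sum(labels * nn.log_softmax(logits, axis=-1), axis=-1)

losses = softmax_cross_entropy(logits, labels)  # one loss per example
```

Each row of `losses` is the cross-entropy for one example; averaging with `jnp.mean` would give the usual batch loss.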