
Jensen-Shannon Divergence and Cross-Entropy Loss

Explore how the PyTorch Image Models (timm) JsdCrossEntropy loss combines Jensen-Shannon divergence with cross-entropy to measure differences between probability distributions. Understand its parameters and its role in improving model robustness in image classification tasks.

The PyTorch Image Models (timm) library provides a loss called JsdCrossEntropy. As the name suggests, it combines the Jensen-Shannon divergence with cross-entropy to compute the total loss.
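As a minimal sketch of how the loss is typically used: timm exposes it as timm.loss.JsdCrossEntropy, and it expects the batch to contain num_splits views of each image (for example, one clean image plus augmented copies, AugMix-style) concatenated along the batch dimension. The shapes and parameter values below are illustrative, not prescriptive.

```python
import torch
from timm.loss import JsdCrossEntropy

# Illustrative batch: 4 images, each appearing as 1 clean + 2 augmented views,
# so the logits tensor holds 4 * 3 = 12 rows concatenated along dim 0.
batch_size, num_classes, num_splits = 4, 10, 3
logits = torch.randn(batch_size * num_splits, num_classes)
targets = torch.randint(0, num_classes, (batch_size,))

# Cross-entropy is computed on the clean split; the JSD consistency term
# (weighted by alpha) pulls the predictions for all splits together.
criterion = JsdCrossEntropy(num_splits=num_splits, alpha=12, smoothing=0.1)
loss = criterion(logits, targets)
print(loss)
```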

Jensen-Shannon divergence

The Jensen-Shannon divergence is a well-known quantity in probability theory and statistics. It is symmetric, always finite, and measures the similarity between two probability distributions.
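For reference, the standard definition for two distributions $P$ and $Q$ averages two Kullback-Leibler divergences against their mixture $M$:

$$\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\,\mathrm{KL}(P \parallel M) + \tfrac{1}{2}\,\mathrm{KL}(Q \parallel M), \qquad M = \tfrac{1}{2}(P + Q)$$

The short sketch below computes this directly in PyTorch for two small discrete distributions; the helper name and example values are ours, chosen purely for illustration. Note that F.kl_div takes log-probabilities as its first argument and probabilities as its second, so F.kl_div(m.log(), p) computes KL(P || M).

```python
import torch
import torch.nn.functional as F

def js_divergence(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    """Jensen-Shannon divergence between two discrete distributions."""
    m = 0.5 * (p + q)  # mixture distribution
    return 0.5 * F.kl_div(m.log(), p, reduction="sum") + \
           0.5 * F.kl_div(m.log(), q, reduction="sum")

p = torch.tensor([0.4, 0.6])
q = torch.tensor([0.5, 0.5])
print(js_divergence(p, q))  # finite, and symmetric: js(p, q) == js(q, p)
```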