
Loss

Explore the process of computing loss in PyTorch using Mean Squared Error for regression problems. Learn how to define and use loss functions, interpret reduction methods, and convert loss tensors for analysis. Gain practical skills to evaluate model performance effectively.

Introduction to loss functions

We will now tackle the loss computation process. As expected, PyTorch has us covered once again. There are many loss functions to choose from, depending on the task at hand. Since ours is a regression problem, we use the Mean Squared Error (MSE) as our loss, and thus we need PyTorch’s nn.MSELoss:

Python 3.5
import torch.nn as nn
# Defines an MSE loss function that averages the squared errors
loss_fn = nn.MSELoss(reduction='mean')
print(loss_fn)

Notice that nn.MSELoss is not the loss function itself. We do not pass predictions and labels to it! Instead, as you can see, it returns another function, which we called loss_fn. That is the actual loss function. So, we can pass a prediction and a label to it, and get the corresponding loss value:

Python 3.5
import torch
import torch.nn as nn
# Defines an MSE loss function
loss_fn = nn.MSELoss(reduction='mean')
# This is a random example to illustrate the loss function
predictions = torch.tensor([0.5, 1.0])
labels = torch.tensor([2.0, 1.3])
print(loss_fn(predictions, labels))  # tensor(1.1700)
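
The reduction argument controls how the individual squared errors are combined into a single loss value: 'mean' averages them, 'sum' adds them up, and 'none' keeps them element-wise. As a quick sanity check, here is a small sketch (reusing the example tensors above) that compares the built-in loss with a manual computation:

Python 3.5
import torch
import torch.nn as nn
predictions = torch.tensor([0.5, 1.0])
labels = torch.tensor([2.0, 1.3])
# Manual MSE: mean of the squared differences
# ((0.5 - 2.0)**2 + (1.0 - 1.3)**2) / 2 = (2.25 + 0.09) / 2 = 1.17
print(((predictions - labels) ** 2).mean())               # tensor(1.1700)
# Same value from nn.MSELoss with the default 'mean' reduction
print(nn.MSELoss(reduction='mean')(predictions, labels))  # tensor(1.1700)
# 'sum' adds the squared errors instead of averaging them
print(nn.MSELoss(reduction='sum')(predictions, labels))   # tensor(2.3400)
# 'none' keeps the element-wise squared errors
print(nn.MSELoss(reduction='none')(predictions, labels))  # tensor([2.2500, 0.0900])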

...

Moreover, you ca ...