# Reviewing the Steps of Gradient Descent

An overview of the steps involved in gradient descent.

## Simple linear regression

Most tutorials start with some nice and pretty image classification problems to illustrate how to use PyTorch. It may seem cool, but I believe it distracts you from learning how PyTorch works.

For this reason, in this first example, we will stick with a simple and familiar problem: a **linear regression** with a single feature `x`. It does not get much simpler than that. It has the following equation:

$y = b + w x + \epsilon$

It is also possible to think of it as the simplest neural network possible: one input, one output, and no activation function (that is, linear).
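A quick way to ground this model is to generate synthetic data from it. The sketch below (a minimal illustration, not from the lesson; the values of `true_b`, `true_w`, and the noise scale are arbitrary choices) draws random inputs and produces labels according to $y = b + wx + \epsilon$:

```python
import numpy as np

# Illustrative "true" parameters for the synthetic data
# (hypothetical values chosen for this example)
true_b = 1.0
true_w = 2.0
N = 100  # number of data points

np.random.seed(42)
x = np.random.rand(N, 1)               # single feature, uniform in [0, 1)
epsilon = 0.1 * np.random.randn(N, 1)  # Gaussian noise
y = true_b + true_w * x + epsilon      # labels: y = b + w*x + epsilon
```

Gradient descent's job is then to recover estimates of `b` and `w` from `x` and `y` alone.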
