Challenge II: Linear Regression
Understand how to build a linear regression model using JAX by implementing the regressor, defining a loss function, and optimizing parameters with gradient descent. This lesson helps consolidate skills in vectorization, auto-differentiation, and applying core linear algebra concepts in machine learning.
We can consolidate what we've learned so far with a working example.
Since this course leans toward machine and deep learning, we'll practice with the classic 101 example: linear regression. In this challenge, you'll build a linear regression model step by step.
Step 1: Regressor
Remember, linear regression fits the data points to a straight line:

$$\hat{y} = wx + b$$

where $w$ is the weight (slope) and $b$ is the bias (intercept).
As a starter, this part simply requires us to implement the regressor as a function of the parameters and an input, as in the sketch below.
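A minimal sketch in JAX, assuming the parameters are packed into a `(w, b)` tuple (the names `regressor` and `params` are illustrative, not prescribed by the lesson):

```python
import jax.numpy as jnp

def regressor(params, x):
    # Predict y_hat = w * x + b for a scalar or a batch of inputs x.
    w, b = params
    return w * x + b

# Example: with w = 2.0 and b = 1.0, the inputs [0., 1., 2.] map to [1., 3., 5.].
params = (2.0, 1.0)
print(regressor(params, jnp.array([0.0, 1.0, 2.0])))
```

Because the expression is written with `jax.numpy`, it vectorizes over a whole batch of inputs with no extra work.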
Step 2: Loss function
The loss function for a single sample is:
...
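Since the exact expression is elided above, here is a minimal sketch assuming the standard squared-error loss for a single sample, reusing the `regressor` from Step 1; `jax.grad` then provides the gradients needed for the gradient-descent step:

```python
import jax

def loss_fn(params, x, y):
    # Squared error for one sample (assuming the usual least-squares objective).
    y_hat = regressor(params, x)
    return (y_hat - y) ** 2

# Auto-differentiation: gradient of the loss with respect to (w, b).
grad_fn = jax.grad(loss_fn)

# Example of a single gradient-descent step on one sample (learning rate is illustrative).
params = (0.0, 0.0)
x, y = 1.0, 3.0
lr = 0.1
grads = grad_fn(params, x, y)
params = tuple(p - lr * g for p, g in zip(params, grads))
```

Looping this update over the training samples (or a vectorized batch of them) is the essence of the gradient-descent optimization mentioned at the start of the lesson.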