
Put It All Together

Explore multiple linear regression by extending single-variable models to handle multiple inputs, each with its own weight. Understand how to fold the bias term in as an additional weight with a constant input, which simplifies the prediction formula. Gain familiarity with the NumPy matrix operations that are essential for managing multi-dimensional data. This lesson equips you to build more accurate and flexible machine learning models that reflect real-world datasets.
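
As a concrete warm-up, here is a minimal sketch of the matrix operation at the heart of this lesson: np.matmul multiplies a matrix of examples by a column of weights and returns one prediction per example. The numbers here are made up for illustration; they are not from the pizza dataset.

import numpy as np

# Three examples, two input variables each (made-up numbers)
X = np.array([[13.0, 26.0],
              [2.0, 14.0],
              [14.0, 20.0]])

# One weight per input variable, arranged as a column
w = np.array([[0.5],
              [1.5]])

# A (3, 2) matrix times a (2, 1) matrix gives a (3, 1) column:
# one prediction per example (45.5, 22.0, and 37.0)
y_hat = np.matmul(X, w)
print(y_hat)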

Checklist

Let’s check if we have everything in order:

  • We wrote the code that prepares the data.
  • We upgraded predict().
  • We concluded that there was no need to upgrade loss().
  • We upgraded gradient().

We can finally apply all those changes to our learning program now:

Python 3.5
import numpy as np

# computing the predictions
def predict(X, w):
    return np.matmul(X, w)

# calculating the loss
def loss(X, Y, w):
    return np.average((predict(X, w) - Y) ** 2)

# evaluating the gradient
def gradient(X, Y, w):
    return 2 * np.matmul(X.T, (predict(X, w) - Y)) / X.shape[0]

# performing the training phase of our model
def train(X, Y, iterations, lr):
    w = np.zeros((X.shape[1], 1))
    for i in range(iterations):
        print("Iteration %4d => Loss: %.20f" % (i, loss(X, Y, w)))
        w -= gradient(X, Y, w) * lr
    return w
# loading the data, then training the model for 100,000 iterations
x1, x2, x3, y = np.loadtxt("pizza_3_vars.txt", skiprows=1, unpack=True)
X = np.column_stack((x1, x2, x3))
Y = y.reshape(-1, 1)
w = train(X, Y, iterations=100000, lr=0.001)

This code is very similar to the code from the previous chapter: aside from the part that loads and prepares the data, we changed just three lines. Also note that the functions are generic. They can process not only the three-variable pizza dataset, but any dataset with an arbitrary number of input variables.
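
That claim is easy to check. Here is a quick sketch that reuses the same train() on synthetic data with five input variables; the data is made up purely for illustration and is not part of the lesson's dataset.

import numpy as np

# Synthetic check: 100 examples, 5 input variables, noiseless labels
# generated from known weights (these numbers are invented)
np.random.seed(42)
X = np.random.rand(100, 5)
true_w = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
Y = np.matmul(X, true_w)

# train() runs unchanged, because w's size is derived from X.shape[1];
# note that it prints the loss at every iteration, so expect a lot of output
w = train(X, Y, iterations=10000, lr=0.01)
print(w)  # should end up close to true_w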

After running the program, here’s what we get:

Iteration 0 => Loss: 1333.56666666666660603369
Iteration 1 => Loss: 151.14311361881479456315
Iteration 2 => Loss: 64.99460808656147037254

Iteration 99999 => Loss: ...
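
With training done, the learned weights can be put to work. As a quick sanity check (not part of the original program), we can compare the model's predictions against the labels for the first few examples:

# comparing predictions with labels for the first five examples
predictions = predict(X, w)
for i in range(5):
    print("X[%d] -> prediction: %.4f, label: %.4f"
          % (i, predictions[i, 0], Y[i, 0]))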