
Multiple Regression

Explore multiple regression techniques to model relationships using multiple features, understand coefficient interpretation by fixing variables, and learn how to minimize errors with loss functions. This lesson also covers the gradient descent algorithm to optimize regression models by iteratively reducing loss.

Understanding multiple regression

Multiple regression is an extension of simple linear regression: a linear regression with multiple features. It is one of the most widely used machine learning techniques, useful across many domains because a single variable is rarely enough for effective modeling. Now, let's formulate an equation for multiple linear regression. We have a feature set X, which consists of p features.

We can present the equation as

$$y_i = w_0 + w_1 x_{i1} + w_2 x_{i2} + w_3 x_{i3} + \dots + w_p x_{ip}$$
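As a minimal sketch of this model, the weights $w_0, \dots, w_p$ can be fit by ordinary least squares. The toy data below (5 examples, p = 2 features) is hypothetical and only for illustration:

```python
import numpy as np

# Hypothetical toy dataset: n = 5 examples, p = 2 features.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([8.0, 7.0, 15.0, 14.0, 20.0])

# Prepend a column of ones so the intercept w_0 is learned
# jointly with the feature weights w_1, ..., w_p.
X_aug = np.hstack([np.ones((X.shape[0], 1)), X])

# Solve the least-squares problem min_w ||X_aug @ w - y||^2.
w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
print(w)  # [w_0, w_1, w_2]
```

Predictions for new examples are then `X_new_aug @ w`, matching the equation above term by term.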

Instead of using the features directly, we can use transformed features.

$$y_i = w_0 + w_1 T(x_{i1}) + w_2 T(x_{i2}) + w_3 T(x_{i3}) + \dots + w_k T(x_{ik})$$
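The same least-squares machinery applies to transformed features: we simply build the design matrix from the transforms instead of the raw values. In this sketch (with hypothetical data), a single raw feature `x` is expanded into two transforms, `x**2` and `log(x)`, chosen only as illustrative examples of T:

```python
import numpy as np

# Hypothetical data: one raw feature x and a target y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 4.8, 11.1, 19.5, 30.2])

# Design matrix of transformed features: [1, T1(x), T2(x)]
# with T1(x) = x^2 and T2(x) = log(x) as example transforms.
X_t = np.column_stack([np.ones_like(x), x**2, np.log(x)])

# Fit weights over the transformed features, exactly as before.
w, *_ = np.linalg.lstsq(X_t, y, rcond=None)
print(w)  # [w_0, w_1, w_2]
```

The model is still linear in the weights, which is why linear regression techniques keep working after the transformation.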