Linear Models
Learn the main linear models from scikit-learn, such as OLS and lasso regression.
Linear models are the bread and butter of many ML applications, providing a quick and easy way to make predictions on a wide range of datasets.
Linear models are statistical models that describe the target variable as a linear combination of a set of input variables. These models aim to find the set of parameters that most accurately predicts the target variable given the input variables.
Let’s dive into the world of linear models, exploring the different types of models available in scikit-learn and how they can be used to make predictions on different types of datasets. We’ll cover the key concepts behind linear regression models and explore their strengths and limitations. Additionally, we’ll learn how to deal with high-dimensional data by using techniques such as regularization.
Ordinary least squares
Ordinary least squares (OLS) regression is one of the simplest and most widely used linear regression techniques in ML. Its main objective is to find the line that best fits a given set of data points.
The technique works by minimizing the sum of the squared errors between the predicted values and the actual values. In other words, out of all possible lines in the input space, OLS regression tries to find the best line that is closest to all the data points.
To do this, OLS regression fits a straight line to the data by estimating two parameters: the intercept (the point where the line crosses the y-axis) and the slope (the rate at which the line changes as we move along the x-axis). The goal is to find the intercept and slope that minimize the sum of the squared errors between the predicted and actual values.
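In scikit-learn, OLS regression is provided by the `LinearRegression` estimator. The sketch below fits a line to synthetic data generated from a known relationship (the data and coefficients here are illustrative assumptions, not from the text); the fitted intercept and slope are exposed as `intercept_` and `coef_`.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: y = 2x + 1 plus a little Gaussian noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2 * X.ravel() + 1 + rng.normal(0, 0.5, size=50)

# LinearRegression minimizes the sum of squared errors (OLS)
model = LinearRegression()
model.fit(X, y)

print(model.intercept_)  # estimated intercept, close to 1
print(model.coef_)       # estimated slope, close to 2
```

Because the noise is small, the recovered intercept and slope land near the true values of 1 and 2.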
The OLS estimator is calculated as follows:
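In matrix notation, with $X$ the matrix of input variables (including a column of ones for the intercept) and $y$ the vector of target values, the standard closed-form OLS solution is:

$$
\hat{\beta} = (X^\top X)^{-1} X^\top y
$$

This $\hat{\beta}$ contains the intercept and slope that minimize the sum of squared errors described above.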