
Regression

Learn how regression predicts continuous numerical values from input features in supervised learning. Understand how to build approximation functions with linear regression models that adjust weights and a bias to minimize prediction error, measured by mean squared error.

Many real-world problems require us to predict a number, not a category. Whether we want to estimate the price of a house, forecast tomorrow’s temperature, or predict how much a customer will spend, the output is a continuous numerical value.

These kinds of tasks fall under regression, a core technique in supervised learning.
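To make the idea concrete before formalizing it, here is a minimal sketch of a regression task: fitting a model that maps an input feature (house size) to a continuous output (price). This assumes NumPy and scikit-learn are available; the sizes and prices are made-up illustrative numbers, not real data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Input features: house size in square metres (one feature per example)
X = np.array([[50.0], [80.0], [120.0], [200.0]])
# Target labels: sale price in thousands, a continuous numerical value
y = np.array([150.0, 220.0, 310.0, 500.0])

# Fit a linear model that maps the feature to a continuous prediction
model = LinearRegression()
model.fit(X, y)

# Predict the price of an unseen 100 m^2 house
print(model.predict(np.array([[100.0]])))  # a single continuous value
```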

Regression

Regression is a technique that builds a mapping function $f$ to approximate the relationship between the input features and a continuous target label (a real number).

In other words, given a dataset $D=\{(\mathbf{x}_1, y_1), (\mathbf{x}_2, y_2), \dots, (\mathbf{x}_n, y_n)\}$, where $\mathbf{x}_i$ is the input and $y_i$ is the actual target, the regression function is:

$$\hat{y}_i = f(\mathbf{x}_i) \quad \forall i$$

  • $\hat{y}_i$
...