
Least Squared Error Solution

Explore how to calculate and interpret the least squared error solution in linear regression problems. Learn to minimize the sum of squared errors of a linear system using vectorized notation and matrix operations. Gain hands-on experience applying these concepts with Python functions such as the pseudo-inverse and least-squares solvers to approximate solutions, even for inconsistent systems.
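As a preview of the tools mentioned above, here is a minimal sketch (using NumPy, with a made-up inconsistent system as the data) showing that the pseudo-inverse and a least-squares solver produce the same approximate solution:

```python
import numpy as np

# Hypothetical inconsistent system A w = b: 3 equations, 2 unknowns,
# with no exact solution.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Least-squares solution via the dedicated solver ...
w_lstsq, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)

# ... and via the pseudo-inverse; both minimize ||A w - b||^2.
w_pinv = np.linalg.pinv(A) @ b

print(w_lstsq, w_pinv)
```

Both routes minimize the sum of squared errors, so they agree whenever the system has a unique least-squares solution.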

Squared error

Squared distance is also known as squared error. Consider a linear equation in the $w_i$'s:

$$w_1a_1+w_2a_2+...+w_na_n=b$$

The squared error (squared distance) at a given point, $(\hat w_1,\hat w_2,...,\hat w_n)$, is defined as:

$$SE(\hat w_1,\hat w_2,...,\hat w_n)=(\hat w_1a_1+\hat w_2a_2+...+\hat w_na_n-b)^2$$