Generalized Linear Regression

Learn to implement closed-form solutions, vectorization, and visualization for generalized linear regression.

Single target

Consider a regression dataset $D=\{(\bold x_1,y_1),(\bold x_2,y_2),\dots,(\bold x_n,y_n)\}$, where $\bold x_i \in \R^d$ and $y_i \in \R$. A function $f_{\bold w}(\bold x) = \bold w^T\phi(\bold x)$ is a generalized linear model for regression for any given mapping $\phi$ of the input features $\bold x$.
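As a small sketch of what such a mapping $\phi$ might look like, the NumPy snippet below builds a polynomial feature map with a bias term and evaluates the model $f_{\bold w}(\bold x) = \bold w^T\phi(\bold x)$ on it. The function names `phi` and `f` and the weight values are illustrative, not part of the lesson:

```python
import numpy as np

def phi(x, degree=2):
    """Example feature map: a bias term followed by element-wise
    powers of x up to `degree`. For x in R^2 and degree=2 this
    yields [1, x1, x2, x1^2, x2^2]."""
    x = np.asarray(x, dtype=float)
    return np.concatenate([[1.0]] + [x**p for p in range(1, degree + 1)])

def f(w, x):
    """Generalized linear model: linear in w, possibly nonlinear in x
    through the feature map phi."""
    return w @ phi(x)

# One weight per feature produced by phi (here m = 5 for d = 2, degree = 2).
w = np.array([0.5, 1.0, -2.0, 0.3, 0.1])
print(f(w, np.array([1.0, 2.0])))  # 0.5 + 1.0 - 4.0 + 0.3 + 0.4 = -1.8
```

Note that the model is still *linear in the parameters* $\bold w$ even though it is nonlinear in $\bold x$; this is what makes the closed-form solution below possible.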

Try this quiz to review what you’ve learned so far.

1.

In the context of the function $f_{\bold w}(\bold x) = \bold w^T\phi(\bold x)$, if $\bold x \in \R^d$, $\phi(\bold x) \in \R^m$, and $\bold w \in \R^k$, then what is $k$?

A.

$k=d$

B.

$k=m$


The optimal parameters $\bold w^*$ can be determined by minimizing a regularized squared loss as follows:

$$\bold w^*=\argmin_{\bold w}\bigg\{\sum_{i=1}^n (\bold w^T\phi(\bold x_i)-y_i)^2 + \lambda \bold w^T\bold w\bigg\}$$

Here, $\sum_{i=1}^n (\bold w^T\phi(\bold x_i)-y_i)^2 + \lambda \bold w^T\bold w$ is the loss function, denoted by $L(\bold w)$.
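Setting the gradient of $L(\bold w)$ to zero gives the well-known closed-form solution $\bold w^* = (\Phi^T\Phi + \lambda I)^{-1}\Phi^T\bold y$, where $\Phi$ stacks $\phi(\bold x_i)^T$ as its rows. A minimal vectorized sketch in NumPy (the names `fit_ridge` and `Phi` are illustrative, and the synthetic data is made up for the sanity check):

```python
import numpy as np

def fit_ridge(Phi, y, lam):
    """Closed-form minimizer of sum_i (w^T phi(x_i) - y_i)^2 + lam * w^T w.

    Phi : (n, m) design matrix whose i-th row is phi(x_i)^T
    y   : (n,)   target vector
    lam : regularization strength lambda > 0
    """
    m = Phi.shape[1]
    # Solve (Phi^T Phi + lam I) w = Phi^T y rather than forming the inverse.
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)

# Sanity check on synthetic data: recover known weights when noise is zero
# and lambda is tiny.
rng = np.random.default_rng(0)
Phi = rng.normal(size=(100, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = Phi @ w_true
w_star = fit_ridge(Phi, y, lam=1e-8)
```

Using `np.linalg.solve` instead of explicitly inverting $\Phi^T\Phi + \lambda I$ is both faster and numerically more stable.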

Suppose we want to predict the price of a house based on its size, number of bedrooms, and age. We can use a generalized linear regression to model the relationship between these input features and the target variable (the house price). The model can be defined as:

$$f_{\bold w}(\bold x) = w_1 x_1 + w_2 x_2 + w_3 x_3 + w_0$$

Here, $x_1$ ...