Generalized Linear Regression for Multiple Targets
Explore techniques for extending generalized linear regression to handle multiple output variables at once. Understand how to formulate the multi-target regression task using matrix and vectorization methods, apply regularized ridge regression with the Frobenius norm, and derive a closed-form solution for optimal weights. This lesson helps you grasp multi-output prediction models and their efficient implementation in Python.
Building on generalized linear regression, this lesson introduces techniques for simultaneously modeling several output variables. Previously, the optimal parameters were a vector ($\mathbf{w}^*$); now, the optimal parameters are represented by a weight matrix ($W^*$). Each input vector $\mathbf{x}_i$ now corresponds to a target vector $\mathbf{y}_i \in \mathbb{R}^m$, where $m$ is the number of outcomes we are predicting. The model uses the form $f(\mathbf{x}) = W^\top \phi(\mathbf{x})$, where $\phi$ still allows for non-linear feature transformation. This lesson focuses on formulating this multi-output prediction task as a single, efficient matrix minimization problem, often termed multi-target ridge regression, to derive the corresponding closed-form solution.
Multiple targets
Consider a regression dataset $\mathcal{D} = \{(\mathbf{x}_i, \mathbf{y}_i)\}_{i=1}^{n}$, where $\mathbf{x}_i \in \mathbb{R}^d$ and $\mathbf{y}_i \in \mathbb{R}^m$. A function $f(\mathbf{x}) = W^\top \phi(\mathbf{x})$ is a generalized linear model for regression for any given mapping of the input features $\phi: \mathbb{R}^d \to \mathbb{R}^b$, where $W \in \mathbb{R}^{b \times m}$ is a matrix with $m$ columns, one for each target. Note that $f(\mathbf{x}) \in \mathbb{R}^m$, meaning the model produces a vector of $m$ predicted values for each input $\mathbf{x}$, with each component corresponding to one of the multiple target variables. This allows the generalized linear model to simultaneously predict all target outputs in a single evaluation.
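As a concrete illustration, here is a minimal NumPy sketch of such a model. The feature map `phi` (quadratic features plus a bias term) and the helper `predict` are assumptions chosen for this example, not part of the lesson:

```python
import numpy as np

def phi(x):
    """Example non-linear feature map (an illustrative assumption):
    maps a d-dimensional input to [1, x, x**2], so b = 2d + 1."""
    return np.concatenate(([1.0], x, x**2))

def predict(W, x):
    """Multi-target generalized linear model: f(x) = W^T phi(x).
    W has shape (b, m), so the prediction is a vector in R^m."""
    return W.T @ phi(x)

# One evaluation yields all m target predictions at once.
d, m = 3, 2                    # input dimension, number of targets
b = 2 * d + 1                  # feature dimension under this phi
rng = np.random.default_rng(0)
W = rng.normal(size=(b, m))    # weight matrix, one column per target
x = rng.normal(size=d)
print(predict(W, x).shape)     # (2,) -> one prediction per target
```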
The optimal parameters $W^*$ can be determined by minimizing a regularized squared loss (multi-target ridge regression) as follows:

$$W^* = \arg\min_{W} \; \|\Phi W - Y\|_F^2 + \lambda \|W\|_F^2,$$

where $\Phi \in \mathbb{R}^{n \times b}$ stacks the mapped inputs $\phi(\mathbf{x}_i)^\top$ as rows, $Y \in \mathbb{R}^{n \times m}$ stacks the target vectors $\mathbf{y}_i^\top$ as rows, $\|\cdot\|_F$ denotes the Frobenius norm, and $\lambda > 0$ controls the regularization strength. Setting the matrix gradient of this objective to zero yields the closed-form solution $W^* = (\Phi^\top \Phi + \lambda I)^{-1} \Phi^\top Y$.
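The closed-form solution translates directly into a few lines of NumPy. The sketch below follows the stacking conventions above; the function name `fit_multi_target_ridge` and the random shapes in the usage example are illustrative assumptions:

```python
import numpy as np

def fit_multi_target_ridge(Phi, Y, lam):
    """Closed-form multi-target ridge solution:
    W* = (Phi^T Phi + lam * I)^{-1} Phi^T Y.
    Phi: (n, b) design matrix of mapped features phi(x_i)^T as rows.
    Y:   (n, m) matrix of stacked target vectors y_i^T as rows.
    Returns W* of shape (b, m) -- all m targets solved at once."""
    b = Phi.shape[1]
    # Solve the regularized normal equations rather than forming the inverse.
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(b), Phi.T @ Y)

# Tiny usage example with random data (shapes only, not a real dataset).
rng = np.random.default_rng(0)
n, b, m = 50, 7, 2
Phi = rng.normal(size=(n, b))
Y = rng.normal(size=(n, m))
W_star = fit_multi_target_ridge(Phi, Y, lam=0.1)
print(W_star.shape)  # (7, 2)
```

Note that `np.linalg.solve` on the regularized normal equations is numerically more stable than computing the matrix inverse explicitly, and a single solve recovers all $m$ columns of $W^*$ simultaneously.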