Subgradient Descent—Lasso Regression

Learn about the subgradient descent algorithm and its application in lasso regression.

Gradient descent works well on objective functions that are convex and differentiable at every point. But what happens when the objective function is not differentiable at one or more points? Let's understand this with the example of lasso regression.
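A classic one-dimensional example of such a function is the absolute value $|x|$, which is convex but has a kink at $x = 0$. At that point there is no unique gradient; instead, any slope in $[-1, 1]$ is a valid *subgradient*, and the set of all such slopes is called the subdifferential:

$$
\partial |x| =
\begin{cases}
\{\operatorname{sign}(x)\} & \text{if } x \neq 0, \\
[-1, 1] & \text{if } x = 0.
\end{cases}
$$

Subgradient descent replaces the gradient in the usual update rule with any element of the subdifferential, which is exactly what lets it handle the non-differentiable $L_1$ penalty below.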

Lasso regression

Lasso regression, or $L_1$-regularized linear regression, is a type of linear regression problem in which model coefficients are shrunk toward zero (some exactly to zero) using the $L_1$ penalty. Consider the scenario where we want to predict the price of a house based on its size, location, number of rooms, and other features. Using lasso regression, we can identify the most important features affecting the price and discard the irrelevant ones.

Assume $X \in \mathbb{R}^{N \times d}$ denotes the set of $d$-dimensional input features (size, location, number of rooms, etc.) for $N$ houses, and $Y \in \mathbb{R}^N$ denotes their corresponding true labels (prices). The objective of lasso regression is then:

$$
\min_{w \in \mathbb{R}^d} \; \frac{1}{2N} \lVert Xw - Y \rVert_2^2 + \lambda \lVert w \rVert_1,
$$

where $w$ is the vector of model coefficients and $\lambda \geq 0$ controls the strength of the $L_1$ penalty. The penalty term $\lVert w \rVert_1 = \sum_{j=1}^{d} |w_j|$ is not differentiable wherever any coefficient is exactly zero, which is why plain gradient descent does not directly apply.
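The objective above can be minimized with subgradient descent: at each step, we use the gradient of the smooth squared-error term plus any valid subgradient of the $L_1$ penalty, for which $\lambda \cdot \operatorname{sign}(w)$ is a standard choice. The sketch below is a minimal illustration of this idea; the function name `lasso_subgradient_descent` and the parameters `lam`, `lr`, and `n_iters` are illustrative choices, not names from the lesson.

```python
import numpy as np

def lasso_subgradient_descent(X, Y, lam=0.1, lr=0.01, n_iters=1000):
    """Minimize (1/2N)||Xw - Y||^2_2 + lam * ||w||_1 via subgradient descent."""
    N, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        # Gradient of the smooth squared-error term.
        grad_smooth = X.T @ (X @ w - Y) / N
        # A valid subgradient of lam * ||w||_1 is lam * sign(w);
        # np.sign(0) == 0 is an allowed choice from the subdifferential [-1, 1].
        subgrad = grad_smooth + lam * np.sign(w)
        w -= lr * subgrad
    return w
```

On synthetic data generated from a sparse coefficient vector, the recovered weights land close to the true ones, with the irrelevant coefficients driven near zero; note that subgradient descent converges more slowly than gradient descent on smooth problems, so it typically needs more iterations.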
