
Coordinate Descent

Explore the coordinate descent algorithm, which optimizes multivariate functions by minimizing one coordinate at a time. Understand how it compares to gradient descent and discover applications in regression, image processing, and signal processing. Learn to implement coordinate descent for convex optimization problems using Python libraries.

The coordinate descent algorithm

Consider a multivariate function f(x) = f(x_1, x_2, ..., x_n) that we want to optimize. Using the gradient descent algorithm, the updates happen in all directions at once.
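As a point of comparison, here is a minimal sketch of a single gradient descent step, in which every coordinate is updated simultaneously. The objective f(x) = x_1^2 + 3 x_2^2 and the learning rate are illustrative choices, not part of the original text:

```python
import numpy as np

def gradient_descent_step(x, grad, lr=0.1):
    # Gradient descent moves all coordinates at once,
    # along the negative gradient direction.
    return x - lr * grad(x)

# Example objective: f(x) = x1^2 + 3*x2^2, with gradient (2*x1, 6*x2)
grad = lambda x: np.array([2.0 * x[0], 6.0 * x[1]])

x = np.array([1.0, 1.0])
x = gradient_descent_step(x, grad)
print(x)  # both coordinates changed in one update
```

Note that a single step adjusts x_1 and x_2 together; coordinate descent, described next, deliberately breaks this simultaneity.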

Coordinate descent is a variation of gradient descent that tries to find the minimum of a function by minimizing it (i.e., performing gradient descent) along one coordinate direction at a time. Starting from an initial point x^0, coordinate descent defines ...
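The one-coordinate-at-a-time idea can be sketched as a cyclic update loop. This is a minimal illustration, assuming a cyclic sweep order and gradient-style per-coordinate steps; the helper `partial_grad` and the quadratic objective are hypothetical choices for the example, not from the original text:

```python
import numpy as np

def coordinate_descent(partial_grad, x0, lr=0.1, n_sweeps=100):
    # Cyclic coordinate descent: in each sweep, update one
    # coordinate at a time while holding the others fixed.
    x = x0.copy()
    for _ in range(n_sweeps):
        for i in range(len(x)):
            x[i] -= lr * partial_grad(x, i)  # step along coordinate i only
    return x

# Example: minimize f(x) = x1^2 + 3*x2^2, whose minimum is the origin.
def partial_grad(x, i):
    # Hypothetical helper returning the partial derivative df/dx_i.
    return 2.0 * x[0] if i == 0 else 6.0 * x[1]

x_min = coordinate_descent(partial_grad, np.array([1.0, 1.0]))
print(x_min)  # converges toward (0, 0)
```

Because each inner step changes only x[i], every update is a one-dimensional minimization problem, which is what makes coordinate descent attractive when per-coordinate subproblems are cheap or have closed-form solutions.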