Quasi-Newton Methods

Explore quasi-Newton methods to understand how they efficiently approximate the inverse Hessian matrix using gradient information. Learn the BFGS algorithm, a popular quasi-Newton method, and implement it to solve optimization problems like the Rosenbrock function without computing the Hessian explicitly.

Quasi-Newton methods are a class of second-order optimization algorithms that are particularly useful when the function’s Hessian is difficult to compute or not available.

Consider the update rule of the Newton algorithm at time step t:

x_{t+1} = x_t - H(x_t)^{-1} \nabla f(x_t)

Here, H(x) is the Hessian of the function f(x) at the point x. Recall that the complexity of computing the Hessian is of the order O(m^2), as opposed to the gradient, which is of the order O(m). This means that as the dimensionality (m) of x ...
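To make the idea concrete, here is a minimal sketch of BFGS applied to the Rosenbrock function, assuming NumPy. It replaces the exact inverse Hessian H(x)^{-1} in the Newton update with an approximation B that is refreshed each iteration from gradient differences only. The function and variable names (`bfgs`, `B`, the Armijo constants) are illustrative choices, not a reference implementation.

```python
import numpy as np

def rosenbrock(x):
    # f(x) = (1 - x0)^2 + 100 (x1 - x0^2)^2, minimized at (1, 1)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    # Analytic gradient; no Hessian is ever computed.
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    n = x0.size
    x = x0.astype(float)
    B = np.eye(n)                 # approximation of the inverse Hessian
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -B @ g                # quasi-Newton search direction
        # Backtracking line search (Armijo sufficient-decrease condition)
        alpha, c, shrink = 1.0, 1e-4, 0.5
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= shrink
        s = alpha * p             # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g             # change in gradient
        sy = y @ s
        if sy > 1e-12:            # curvature condition keeps B positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            B = (I - rho * np.outer(s, y)) @ B @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

x_star = bfgs(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)  # converges close to the minimizer [1, 1]
```

Note that each iteration costs O(m^2) for the rank-two update of B, versus the O(m^2) Hessian evaluation plus O(m^3) linear solve that a full Newton step would require.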