
Quasi-Newton Methods

Discover how quasi-Newton methods provide computationally efficient alternatives to Newton’s method by approximating the inverse Hessian using gradient information. Learn the BFGS algorithm's steps, applications to complex functions, and how it optimizes machine learning models without costly Hessian computations.

Quasi-Newton methods are a class of second-order optimization algorithms that are particularly useful when the function’s Hessian is difficult to compute or not available.
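The core idea is to build an approximation of the inverse Hessian purely from successive gradient evaluations. A minimal sketch of one BFGS inverse-Hessian update in two dimensions is shown below; the quadratic function, starting point, and fixed step size are illustrative assumptions, not part of the text:

```python
# Hedged sketch: one BFGS inverse-Hessian update in 2-D, pure Python.
# The function f(x) = x0^2 + 10*x1^2 and the step size are illustrative choices.

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def bfgs_update(H, s, y):
    # BFGS update of the inverse-Hessian approximation H:
    # H+ = (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T,  rho = 1/(y^T s)
    rho = 1.0 / (y[0] * s[0] + y[1] * s[1])
    I = [[1.0, 0.0], [0.0, 1.0]]
    A = [[I[i][j] - rho * s[i] * y[j] for j in range(2)] for i in range(2)]
    B = [[I[i][j] - rho * y[i] * s[j] for j in range(2)] for i in range(2)]
    AH = [[sum(A[i][k] * H[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    AHB = [[sum(AH[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    return [[AHB[i][j] + rho * s[i] * s[j] for j in range(2)] for i in range(2)]

# One gradient step with a fixed (illustrative) step size, then update H.
x0 = [3.0, 1.0]
g0 = grad(x0)
H0 = [[1.0, 0.0], [0.0, 1.0]]            # initial inverse-Hessian guess: identity
alpha = 0.1
d = matvec(H0, g0)
x1 = [x0[i] - alpha * d[i] for i in range(2)]
s = [x1[i] - x0[i] for i in range(2)]    # step taken
g1 = grad(x1)
y = [g1[i] - g0[i] for i in range(2)]    # change in gradient
H1 = bfgs_update(H0, s, y)

# The updated matrix satisfies the secant condition H1 @ y == s,
# mimicking how the true inverse Hessian maps gradient changes to steps.
print(matvec(H1, y))  # approximately equal to s
```

Note that the update uses only gradients and vector outer products, so each iteration costs $O(m^2)$ rather than the $O(m^2)$ Hessian evaluation plus $O(m^3)$ inversion that Newton's method requires.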

Consider the update rule of the Newton algorithm at time step $t$:

$$x_{t+1} = x_t - H(x_t)^{-1}\,\nabla f(x_t)$$
Here, $H(x)$ is the Hessian of the function $f(x)$ at the point $x$. Recall that the complexity of computing the Hessian is of the order $O(m^2)$, as opposed to the gradient, which is of the order $O(m)$. This means that as the dimensionality $m$ of $x$ grows, computing (and inverting) the Hessian quickly becomes prohibitively expensive.
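For contrast, a single Newton step applies the exact inverse Hessian to the gradient. A minimal sketch on an assumed two-dimensional quadratic (the function and starting point are illustrative), where the Hessian is constant and trivially invertible:

```python
# Illustrative quadratic f(x) = x0^2 + 10*x1^2 (an assumption, not from the text).
def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

def newton_step(x):
    # The Hessian of f is the constant matrix diag(2, 20), so its inverse
    # is diag(0.5, 0.05). For a general f, forming H costs O(m^2) entries
    # and inverting it costs O(m^3), which is what quasi-Newton methods avoid.
    g = grad(x)
    return [x[0] - 0.5 * g[0], x[1] - 0.05 * g[1]]

print(newton_step([3.0, 1.0]))  # [0.0, 0.0]: one step minimizes a quadratic exactly
```

A single step lands exactly at the minimizer here because Newton's method is exact for quadratics; the cost argument above is about how expensive that exact step becomes in high dimensions.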