Quasi-Newton Methods
Discover how quasi-Newton methods provide computationally efficient alternatives to Newton's method by approximating the inverse Hessian from gradient information. Learn the steps of the BFGS algorithm, how to apply it to complex functions, and how it optimizes machine learning models without costly Hessian computations.
Quasi-Newton methods are a class of second-order optimization algorithms that are particularly useful when the function’s Hessian is difficult to compute or not available.
Consider the update rule of Newton's method at time step $t$:

$$x_{t+1} = x_t - H(x_t)^{-1} \nabla f(x_t)$$

Here, $\nabla f(x_t)$ is the gradient of the objective function $f$ at the current iterate $x_t$, and $H(x_t)^{-1}$ is the inverse of the Hessian matrix evaluated at $x_t$. Computing the Hessian and inverting it at every iteration is expensive for high-dimensional problems, which is exactly the cost quasi-Newton methods avoid: they replace $H(x_t)^{-1}$ with an approximation that is updated cheaply from successive gradient evaluations.
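To make this concrete, here is a minimal sketch of BFGS, the most widely used quasi-Newton method. It maintains an inverse-Hessian approximation `H`, computes a search direction from it and the current gradient, and updates `H` after each step using only gradient differences. The backtracking line search, tolerances, and the Rosenbrock test function are illustrative choices, not prescribed by the text.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f with BFGS, maintaining an inverse-Hessian approximation H."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian guess: identity
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:    # stop when the gradient is near zero
            break
        p = -H @ g                     # quasi-Newton search direction
        # Backtracking line search (Armijo sufficient-decrease condition)
        alpha, c, shrink = 1.0, 1e-4, 0.5
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= shrink
        x_new = x + alpha * p
        g_new = grad(x_new)
        s = x_new - x                  # step taken
        y = g_new - g                  # change in gradient
        sy = s @ y
        if sy > 1e-10:                 # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            # BFGS inverse-Hessian update (Sherman-Morrison form)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function, whose minimum is at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = bfgs(f, grad, np.array([-1.2, 1.0]))
```

Note that no Hessian of `f` is ever formed: `H` is built entirely from the step vectors `s` and gradient differences `y`, which is what makes the method "quasi"-Newton.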