Quasi-Newton Methods
Explore quasi-Newton methods to understand how they efficiently approximate the inverse Hessian matrix using gradient information. Learn the BFGS algorithm, a popular quasi-Newton method, and implement it to solve optimization problems like the Rosenbrock function without computing the Hessian explicitly.
Quasi-Newton methods are a class of second-order optimization algorithms that are particularly useful when the Hessian of the objective function is expensive to compute or unavailable.
Consider the update rule of the Newton algorithm at time step $t$:

$$x_{t+1} = x_t - \left[\nabla^2 f(x_t)\right]^{-1} \nabla f(x_t)$$

Here, $\nabla f(x_t)$ is the gradient and $\nabla^2 f(x_t)$ is the Hessian of $f$ evaluated at the current iterate $x_t$. Forming the Hessian and solving the associated linear system is costly in high dimensions, which is exactly what quasi-Newton methods avoid: they maintain an approximation of the inverse Hessian that is updated from successive gradient evaluations.
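To make this concrete, here is a minimal sketch of a BFGS loop that minimizes the Rosenbrock function using only gradient information. The helper names (`bfgs`, `rosenbrock`, `rosenbrock_grad`), the backtracking Armijo line search, and the numeric tolerances are illustrative choices, not part of the lesson's specification.

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock function: minimum at (1, 1)."""
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    """Analytic gradient of the Rosenbrock function."""
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    """Sketch of BFGS: maintains an inverse-Hessian approximation H,
    updated from gradient differences (no Hessian is ever formed)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    I = np.eye(n)
    H = np.eye(n)                      # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        # Simple backtracking line search (Armijo condition) -- an
        # illustrative choice; production code uses Wolfe conditions.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * p) > f(x) + c1 * alpha * (g @ p):
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s = x_new - x                  # step taken
        y = g_new - g                  # change in gradient
        sy = y @ s
        if sy > 1e-12:                 # curvature guard: skip unsafe updates
            rho = 1.0 / sy
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

x_star = bfgs(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)  # close to the minimizer [1., 1.]
```

The BFGS update applied when the curvature condition $y^\top s > 0$ holds keeps $H$ symmetric positive definite, so $p = -Hg$ is always a descent direction and the line search terminates.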