Quasi-Newton methods
Quasi-Newton methods are a class of iterative optimization algorithms that improve upon Newton's method by building an approximation to the Hessian (second-derivative matrix) or its inverse instead of computing it directly at every iteration. They are designed for unconstrained optimization and aim to achieve faster convergence than simple gradient descent without the full computational cost of evaluating and inverting the true Hessian.
At each iteration, the gradient g_k = ∇f(x_k) is computed. The method maintains an approximation B_k of the Hessian, updated after each step from the displacement s_k = x_{k+1} - x_k and the gradient change y_k = g_{k+1} - g_k. The BFGS update, for example, is
B_{k+1} = B_k + (y_k y_k^T)/(s_k^T y_k) - (B_k s_k s_k^T B_k)/(s_k^T B_k s_k),
where s_k^T y_k > 0 is ensured by an appropriate line search (e.g., one satisfying the Wolfe conditions). This update preserves symmetry and, with suitable step lengths, keeps B_{k+1} positive definite, so the resulting search direction p_k = -B_k^{-1} g_k is a descent direction.
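A minimal sketch of this update in Python with NumPy (the function name and the curvature-threshold guard are illustrative, not part of the original text):

import numpy as np

def bfgs_update(B, s, y, curvature_tol=1e-10):
    """Apply the BFGS update to the Hessian approximation B.

    B : current symmetric positive-definite approximation B_k
    s : step s_k = x_{k+1} - x_k
    y : gradient change y_k = g_{k+1} - g_k
    """
    sy = s @ y
    if sy <= curvature_tol:
        # Curvature condition s_k^T y_k > 0 not satisfied; skip the update
        # to preserve positive definiteness (a Wolfe line search avoids this case).
        return B
    Bs = B @ s
    return B + np.outer(y, y) / sy - np.outer(Bs, Bs) / (s @ Bs)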
Among quasi-Newton methods, BFGS is the most widely used due to its robustness and efficiency; DFP is an earlier, closely related update (the dual of BFGS, with the roles of s_k and y_k exchanged) that is generally less effective in practice.
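In practice, BFGS is usually invoked through a library implementation. A small usage sketch with SciPy's optimizer (the Rosenbrock test function and the starting point are illustrative choices):

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0])  # common starting point for the Rosenbrock function
result = minimize(rosen, x0, method="BFGS", jac=rosen_der)
print(result.x, result.nit)  # minimizer (close to [1, 1]) and iteration count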
Quasi-Newton methods are valued for their fast, typically superlinear, convergence and their favorable balance of computational cost and progress per iteration: only first derivatives are required, and maintaining the inverse approximation reduces the per-iteration linear algebra from the O(n^3) of solving Newton's system to O(n^2).
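As a rough end-to-end sketch, a quasi-Newton loop can maintain an approximation H_k to the inverse Hessian, so each step is a matrix-vector product. The following Python code is a simplified illustration (it assumes a basic backtracking Armijo line search rather than a full Wolfe search, and the tolerances, test problem, and function names are illustrative):

import numpy as np

def quasi_newton(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimize f using a BFGS-style update of the inverse Hessian approximation."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)              # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g             # quasi-Newton search direction
        # Backtracking (Armijo) line search; production codes use a Wolfe search.
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-10:         # curvature condition; skip the update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = quasi_newton(lambda x: 0.5 * x @ A @ x - b @ x,
                      lambda x: A @ x - b,
                      np.zeros(2))

Updating H_k directly (rather than B_k) avoids solving a linear system at every iteration, which is one common way the cost balance described above is realized.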