Gauss–Newton algorithm
Gauss–Newton is an iterative optimization algorithm for solving non‑linear least‑squares problems, where the objective is to minimize the sum of squared residuals between observed data and a model function. It is named after Carl Friedrich Gauss and Isaac Newton, and can be viewed as a modification of Newton's method that avoids computing second derivatives: it builds on the linear least‑squares solution by approximating the Hessian of the objective with the product of the Jacobian transpose and the Jacobian.
Given a vector‑valued model function f(θ) that depends on parameters θ, and observations y, the residual vector is r(θ) = y − f(θ), and the objective is S(θ) = ‖r(θ)‖² = Σᵢ rᵢ(θ)². Each iteration linearizes f around the current estimate θ, so the step δ solves the normal equations (JᵀJ)δ = Jᵀr(θ), where J is the Jacobian of f evaluated at θ, and the parameters are updated as θ ← θ + δ.
The method converges rapidly when the residuals are small and the Jacobian is well‑conditioned, but it may converge slowly, or fail to converge at all, when the residuals at the solution are large, the problem is strongly non‑linear, or the starting point is far from a minimum.
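As a concrete illustration, the following minimal NumPy sketch applies the update above to a hypothetical exponential model f(t; θ) = θ0·exp(θ1·t); the model, synthetic data, and starting point are assumptions made for this example, not part of the method itself.

    import numpy as np

    # Hypothetical model for illustration: f(t; θ) = θ0 * exp(θ1 * t).
    def model(theta, t):
        return theta[0] * np.exp(theta[1] * t)

    # Analytic Jacobian of the model with respect to θ (one row per data point).
    def jacobian(theta, t):
        J = np.empty((t.size, 2))
        J[:, 0] = np.exp(theta[1] * t)
        J[:, 1] = theta[0] * t * np.exp(theta[1] * t)
        return J

    def gauss_newton(theta, t, y, max_iters=50, tol=1e-10):
        for _ in range(max_iters):
            r = y - model(theta, t)            # residual vector r(θ) = y − f(θ)
            J = jacobian(theta, t)
            # Solve the normal equations (JᵀJ) δ = Jᵀ r for the step δ.
            delta = np.linalg.solve(J.T @ J, J.T @ r)
            theta = theta + delta
            if np.linalg.norm(delta) < tol:    # stop once the step is negligible
                break
        return theta

    # Synthetic data from known parameters θ = (2, −1), plus a little noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 2.0, 50)
    y = model(np.array([2.0, -1.0]), t) + 0.01 * rng.standard_normal(t.size)
    print(gauss_newton(np.array([1.0, 0.0]), t, y))  # ≈ [2.0, -1.0]

In practice the step is usually computed from a QR or SVD factorization of J (e.g., np.linalg.lstsq) rather than by explicitly forming JᵀJ, which squares the condition number of the problem.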