LevenbergMarquardt

The Levenberg-Marquardt algorithm, commonly abbreviated LM, is an iterative method for solving nonlinear least-squares problems. It combines aspects of the Gauss-Newton algorithm and gradient descent to provide a robust and efficient approach for minimizing the sum of squares of nonlinear residuals.

Let F(p) = 1/2 sum r_i(p)^2 be the objective, where the r_i(p) are residuals depending on the parameters p. At each iteration, LM forms the Jacobian J of the residuals with respect to p and computes a step Δp by solving (J^T J + λ I) Δp = -J^T r, where λ is a damping parameter and r is the vector of residuals. The damping parameter controls the trust in the Gauss-Newton direction.

When λ is large, the method behaves like gradient descent, favoring safer, smaller steps. When λ is small, it behaves like Gauss-Newton, taking more aggressive steps along the approximate Newton direction. After a trial update, λ is adapted: if the new parameters produce a substantial reduction in F, λ is decreased to allow larger Gauss-Newton steps; if the reduction is insufficient, λ is increased to make the step more conservative.

LM is widely used for curve fitting and nonlinear data fitting, especially when residuals are poorly scaled or the problem is strongly nonlinear. It requires evaluation of the Jacobian, or a suitable approximation, and a reasonable initial guess. While efficient in many applications, it can be sensitive to parameter scaling, local minima, and ill-conditioning, and it may not guarantee convergence to a global optimum.
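The iteration described above can be sketched in NumPy as follows. This is a minimal illustration, not a production solver: the 0.1/10 damping adjustment factors, the stopping tolerance, the function names, and the exponential test model are all illustrative assumptions.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, p0, lam=1e-3, tol=1e-10, max_iter=200):
    """Minimize F(p) = 1/2 * sum(r_i(p)^2) via the damped normal equations."""
    p = np.asarray(p0, dtype=float)
    r = residual(p)
    F = 0.5 * r @ r
    for _ in range(max_iter):
        J = jacobian(p)                      # m x n Jacobian of the residuals
        A = J.T @ J + lam * np.eye(p.size)   # J^T J + lambda * I
        dp = np.linalg.solve(A, -J.T @ r)    # step from (J^T J + lambda I) dp = -J^T r
        if np.linalg.norm(dp) < tol:
            break
        r_trial = residual(p + dp)
        F_trial = 0.5 * r_trial @ r_trial
        if F_trial < F:
            p, r, F = p + dp, r_trial, F_trial
            lam *= 0.1                       # good step: trust the Gauss-Newton direction more
        else:
            lam *= 10.0                      # poor step: damp toward gradient descent
    return p

# Illustrative use: fit y = a * exp(b * x) to noiseless synthetic data.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-1.5 * x)

def residual(p):
    a, b = p
    return a * np.exp(b * x) - y

def jacobian(p):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])   # d r / d a, d r / d b

p_hat = levenberg_marquardt(residual, jacobian, [1.0, 0.0])  # p_hat close to [2.0, -1.5]
```

Note how the two failure modes mentioned above show up here: a poor initial guess can send the linear solve toward a bad local minimum, and badly scaled parameters make J^T J ill-conditioned, which is why practical implementations often replace the λ I term with λ diag(J^T J) to make the damping scale-aware.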
LM is related to trust-region methods and is historically rooted in work by Levenberg and Marquardt in the mid-20th century.