
Least mean squares

Least mean squares (LMS) is an adaptive filter algorithm used to iteratively minimize the mean squared error between a desired signal and the filter output. It is a stochastic gradient descent method that updates filter weights with a simple rule, offering real-time operation and low computational complexity.
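As a concrete illustration of the update rule described below, here is a minimal NumPy sketch of LMS applied to system identification. The function name `lms`, the filter order, the step size, and the "unknown" system `h` are all arbitrary choices for this example, not part of any standard API.

```python
import numpy as np

def lms(x, d, num_taps=4, mu=0.05):
    """Basic LMS adaptive filter.

    x : input signal (1-D array)
    d : desired signal, same length as x
    Returns the final weight vector and the error history.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        x_n = x[n - num_taps + 1:n + 1][::-1]  # input vector x(n), newest sample first
        y_n = w @ x_n                          # filter output y(n) = w(n)^T x(n)
        e[n] = d[n] - y_n                      # error e(n) = d(n) - y(n)
        w = w + mu * e[n] * x_n                # LMS update: w(n+1) = w(n) + mu e(n) x(n)
    return w, e

# Demo: identify a hypothetical FIR system h from its input/output data.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])            # "unknown" system (assumed for the demo)
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]                 # desired signal = system output
w, e = lms(x, d, num_taps=4, mu=0.05)
print(np.round(w, 2))                          # weights approach the true h
```

Because the desired signal here is noiseless and the filter order matches the system, the error decays toward zero and the weights converge to `h`; with measurement noise the weights would instead fluctuate around `h` with a variance set by μ.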

Let x(n) be the input vector at time n, d(n) the desired signal, y(n) = w(n)^T x(n) the filter output, and e(n) = d(n) − y(n) the error. The weight vector is updated by w(n+1) = w(n) + μ e(n) x(n), where μ > 0 is the step size that governs convergence speed and steady-state error.

Convergence: In the mean, the algorithm converges if 0 < μ < 2 / λ_max, where λ_max is the largest eigenvalue of the input autocorrelation matrix R = E[x(n) x(n)^T]. A larger μ yields faster adaptation but greater misadjustment. For nonstationary inputs, LMS tracks changes, but with a trade-off between speed and accuracy.

Variants: Normalized LMS (NLMS) divides the update by the input power to stabilize performance: w(n+1) = w(n) + μ e(n) x(n) / (ε + ||x(n)||^2). Other variants include leaky LMS, sign LMS, and affine projection algorithms, each offering different stability and convergence properties.

Applications: LMS is widely used for adaptive equalization, echo cancellation, system identification, noise reduction, and adaptive control in communication and signal-processing systems.

History: The algorithm was introduced in the 1960s by Widrow and his collaborators as part of the broader LMS family, which has since become a foundational tool in adaptive signal processing.
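To make the convergence bound 0 < μ < 2 / λ_max concrete, one can estimate R from data and compute the bound numerically. The sketch below assumes a sample-average estimator of R; the function name and signal parameters are illustrative, not standard.

```python
import numpy as np

def max_step_size(x, num_taps):
    """Estimate the LMS stability bound 2 / lambda_max from data.

    Forms the sample autocorrelation matrix R = E[x(n) x(n)^T]
    by averaging outer products of length-num_taps input vectors.
    """
    X = np.lib.stride_tricks.sliding_window_view(x, num_taps)
    R = (X.T @ X) / len(X)                  # sample estimate of E[x x^T]
    lam_max = np.linalg.eigvalsh(R).max()   # largest eigenvalue of R
    return 2.0 / lam_max

rng = np.random.default_rng(1)
x = rng.standard_normal(10000)              # unit-variance white input
mu_bound = max_step_size(x, num_taps=4)
print(mu_bound)                             # near 2, since lambda_max is near 1 here
```

For unit-variance white noise, R is approximately the identity, so λ_max ≈ 1 and the bound is close to 2; a correlated input would raise λ_max and shrink the usable step-size range.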
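The NLMS variant differs from plain LMS only in the per-sample normalization of the update. A minimal sketch, again with arbitrary parameter values and a hypothetical system `h`:

```python
import numpy as np

def nlms(x, d, num_taps=4, mu=0.5, eps=1e-8):
    """Normalized LMS: w(n+1) = w(n) + mu * e(n) * x(n) / (eps + ||x(n)||^2)."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        x_n = x[n - num_taps + 1:n + 1][::-1]        # input vector x(n)
        e[n] = d[n] - w @ x_n                        # error e(n)
        w = w + mu * e[n] * x_n / (eps + x_n @ x_n)  # power-normalized update
    return w, e

rng = np.random.default_rng(2)
h = np.array([0.4, 0.25, -0.1, 0.05])       # hypothetical system to identify
x = 3.0 * rng.standard_normal(4000)          # larger input power, absorbed by normalization
d = np.convolve(x, h)[:len(x)]
w, _ = nlms(x, d)
```

The normalization makes the effective step size independent of the input power, which is why the same μ works here even though the input variance is 9 rather than 1.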