Convergence

Convergence is the idea that a process, sequence, function, or sequence of random variables approaches a limiting value as its index or input tends to a specified limit, usually infinity. It is a foundational concept across mathematics, statistics, and numerical analysis, capturing the notion of stabilization or predictability in the limit.

For a sequence {a_n} in a metric space with limit L, convergence means that the distance d(a_n, L) tends to zero as n grows: for every eps > 0 there is an index beyond which all terms lie within eps of L. In the real numbers this is the familiar epsilon-N definition, and a limit, if one exists, is unique.

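As a concrete check, here is a minimal Python sketch (an assumed illustration, not from the original text) that finds, for a_n = 1/n with limit L = 0, the first index beyond which the distance |a_n - L| drops below a tolerance eps:

    def first_index_within(eps, limit=0.0):
        """Smallest n with |1/n - limit| < eps, for the sequence a_n = 1/n."""
        n = 1
        while abs(1.0 / n - limit) >= eps:
            n += 1
        return n

    for eps in (0.1, 0.01, 0.001):
        # assumed demo values: prints 11, 101, 1001, matching n > 1/eps
        print(eps, first_index_within(eps))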
In analysis, functions can converge in several senses. Pointwise convergence f_n(x) -> f(x) means the convergence holds separately for every x in the domain. Uniform convergence strengthens this by requiring sup_x |f_n(x) - f(x)| to go to zero; it preserves continuity and justifies interchanging limits with integration, and, under suitable conditions, with differentiation.

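To illustrate the distinction, a short Python sketch (assumed example: f_n(x) = x/n on [0,1], not taken from the text) estimates the sup-norm distance on a grid; here sup_x |f_n(x) - 0| = 1/n, so the convergence is uniform:

    import numpy as np

    xs = np.linspace(0.0, 1.0, 1001)        # grid on [0, 1]

    for n in (1, 10, 100, 1000):
        sup_err = np.abs(xs / n).max()      # grid estimate of sup_x |f_n(x) - 0|
        print(n, sup_err)                   # equals 1/n, which -> 0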
Series converge when the sequence of partial sums S_N = sum_{n=1}^N a_n approaches a finite limit. Absolute convergence, meaning sum |a_n| converges, implies convergence of the series regardless of the signs of the terms, while a conditionally convergent series may change its sum, or even diverge, when its terms are reordered (Riemann's rearrangement theorem).

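A numerical sketch of the distinction (the series chosen here, the alternating harmonic series, is an assumed illustration): its signed partial sums settle near ln 2, while the partial sums of the absolute values, the harmonic series, grow without bound:

    import math

    def partial_sum(N, signed=True):
        # S_N = sum_{n=1}^N a_n with a_n = (-1)**(n+1)/n, or 1/n if unsigned
        return sum(((-1) ** (n + 1) if signed else 1) / n for n in range(1, N + 1))

    for N in (100, 10_000, 1_000_000):
        print(N, partial_sum(N), partial_sum(N, signed=False))
    print("ln 2 =", math.log(2))
    # the signed sums approach ln 2 ~ 0.6931; the unsigned sums grow like ln N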
In probability and measure theory, convergence concepts include convergence in probability, almost sure convergence, convergence in distribution, and L^p convergence, each with its own implications and relationships (for example, almost sure convergence implies convergence in probability, which in turn implies convergence in distribution).

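As an assumed simulation (coin flips, eps, and the run counts are illustrative choices), the law of large numbers gives convergence in probability of the sample mean of fair coin flips to 0.5: the estimated probability of a deviation larger than eps shrinks as n grows:

    import random

    random.seed(0)                  # fixed seed for a reproducible run
    eps, runs = 0.05, 2000

    for n in (10, 100, 1000):
        bad = sum(
            abs(sum(random.random() < 0.5 for _ in range(n)) / n - 0.5) > eps
            for _ in range(runs)
        )
        print(n, bad / runs)        # estimate of P(|mean - 0.5| > eps), tending to 0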
Numerical analysis considers convergence of iterative methods to a solution, with rates such as linear or quadratic convergence describing how quickly the iterates approach the limit: linear convergence shrinks the error by a constant factor per step, while quadratic convergence roughly squares it.

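As an illustrative sketch (Newton's method applied to f(x) = x^2 - 2 is an assumed choice, not prescribed by the text), quadratic convergence appears as the error being roughly squared at each step, so the number of correct digits about doubles:

    import math

    x, root = 1.0, math.sqrt(2.0)
    for step in range(6):
        print(step, x, abs(x - root))   # the error is roughly squared each step
        x -= (x * x - 2.0) / (2.0 * x)  # Newton update: x - f(x)/f'(x)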
Example: a_n = 1/n converges to 0. The function f_n(x) = x^n on [0,1] converges pointwise to 0 for x in [0,1) and to 1 at x = 1; since each f_n is continuous but the pointwise limit is not, the convergence cannot be uniform on [0,1].

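A minimal check of the non-uniformity (the witness point below is an assumed construction): for every n the point x_n = 0.9^(1/n) lies in [0,1) with f_n(x_n) = 0.9, so sup |f_n - 0| on [0,1) never drops below 0.9, whereas on [0, 0.9] the sup is 0.9^n, which does tend to zero:

    for n in (10, 100, 1000):
        x_n = 0.9 ** (1.0 / n)             # witness point inside [0, 1)
        print(n, x_n, x_n ** n, 0.9 ** n)  # x_n**n stays at 0.9; 0.9**n -> 0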