Limits

Limits are a foundational concept in calculus and analysis. The limit of a function f as x approaches a is the value that f(x) approaches when x gets arbitrarily close to a (with x not necessarily equal to a). If such a value L exists, we write lim_{x→a} f(x) = L. If x tends to infinity, we speak of limits at infinity; if f(x) grows without bound, the limit is ±∞; if f oscillates without settling, the limit does not exist.
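As an illustrative sketch (an addition, not part of the original article), the idea that f(x) can approach a value at a point where f itself is undefined can be seen numerically with f(x) = (x² − 1)/(x − 1), which approaches 2 as x approaches 1 from either side:

```python
# Numerically illustrate lim_{x→1} (x² − 1)/(x − 1) = 2.
# f is undefined at x = 1 itself, but its values approach 2
# as x gets arbitrarily close to 1 from either side.

def f(x):
    return (x**2 - 1) / (x - 1)

for h in [0.1, 0.01, 0.001, 0.0001]:
    print(f"f(1 + {h}) = {f(1 + h):.6f}   f(1 - {h}) = {f(1 - h):.6f}")
```

Algebraically f(1 + h) simplifies to 2 + h for h ≠ 0, so the printed values close in on 2 as h shrinks.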

These ideas are made precise by the epsilon-delta definition: lim_{x→a} f(x) = L means that for every ε>0 there exists δ>0 such that 0<|x−a|<δ implies |f(x)−L|<ε. For sequences, a_n → L means that for every ε>0 there exists N such that n≥N implies |a_n−L|<ε. The two definitions are equivalent in the sense that lim_{x→a} f(x) = L exactly when f(x_n) → L for every sequence x_n → a with x_n ≠ a.

Examples and laws: Common limits include lim_{x→0} (sin x)/x = 1, and lim_{x→a} (f(x) + g(x)) = lim_{x→a} f(x) + lim_{x→a} g(x) when both limits exist. Limits respect addition, multiplication, and division by nonzero limits. Limits underpin continuity: a function is continuous at a if lim_{x→a} f(x) = f(a).

Applications and history: Limits are used to define derivatives, integrals, and power series, and to describe asymptotic behavior. The formal notion emerged in 19th-century analysis, with contributions from Cauchy and Weierstrass.
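The ε–N definition for sequences can be checked directly in a small sketch (an illustrative addition, not from the original article) for a_n = 1/n, whose limit is 0: given ε, the witness N = ⌈1/ε⌉ + 1 works, since n ≥ N > 1/ε gives 1/n < ε.

```python
import math

# For a_n = 1/n with limit L = 0, the ε–N definition says:
# for every ε > 0 there exists N such that n ≥ N implies |a_n − 0| < ε.
# N = ceil(1/ε) + 1 suffices, since n ≥ N > 1/ε gives 1/n < ε.

def witness_N(eps):
    return math.ceil(1 / eps) + 1

for eps in [0.1, 0.01, 0.001]:
    N = witness_N(eps)
    # Spot-check the definition on a range of n ≥ N.
    assert all(1 / n < eps for n in range(N, N + 1000))
    print(f"eps = {eps}: N = {N} works")
```

The choice of N here is one valid witness, not the smallest one; the definition only requires that some N exists.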
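The example limit lim_{x→0} (sin x)/x = 1 mentioned above can likewise be observed numerically (a sketch added here for illustration):

```python
import math

# Numerically illustrate lim_{x→0} (sin x)/x = 1.
# The quotient is undefined at x = 0 but approaches 1 as x → 0.
# Since sin is odd, sin(−x)/(−x) = sin(x)/x, so one side suffices.
for x in [0.5, 0.1, 0.01, 0.001]:
    print(f"x = ±{x}: sin(x)/x = {math.sin(x) / x:.8f}")
```

The Taylor expansion sin x = x − x³/6 + … explains the rate of approach: the quotient differs from 1 by roughly x²/6.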