
Infinitesimal

An infinitesimal is a quantity with magnitude smaller than any positive real number, yet not equal to zero. Historically used in the development of calculus by Newton and Leibniz, infinitesimals allowed informal reasoning about quantities that change by infinitely small amounts. Although appealing, their rigorous status was questioned, leading to the development of limits as the foundation of calculus in the 19th century and the rejection of actual infinitesimals in standard analysis. Nevertheless, the intuition persists in the notation of differentials (dx, dy) and in the idea of an infinitesimal change in a function.
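For instance, the classic Leibniz-style computation of the derivative of f(x) = x^2 treats dx as a nonzero but negligible increment:

    \frac{(x + dx)^2 - x^2}{dx} = \frac{2x\,dx + (dx)^2}{dx} = 2x + dx \approx 2x,

where the leftover infinitesimal dx is discarded at the end. It was exactly this discarding step, treating dx as nonzero when dividing but as zero afterward, whose rigor was disputed.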

Nonstandard analysis, formulated by Abraham Robinson in the 1960s, provides a rigorous framework in which infinitesimals exist as actual numbers within the hyperreal extension of the real numbers. The transfer principle ensures that many real-number properties extend to the hyperreals, and the standard part map associates each finite hyperreal with the unique real number infinitely close to it.
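In this framework, for example, the derivative can be defined without limits: for any nonzero infinitesimal \varepsilon,

    f'(x) = \operatorname{st}\!\left( \frac{f(x + \varepsilon) - f(x)}{\varepsilon} \right),

so for f(x) = x^2 the difference quotient is 2x + \varepsilon, and its standard part is exactly 2x.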

In geometry and algebraic geometry, infinitesimals appear in the study of tangent spaces and in infinitesimal thickening, where one considers nilpotent elements, as in the ring of dual numbers, to capture first-order behavior.
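As a concrete sketch of this idea (illustrative Python, not from the text; the class name Dual and the choice of operations are assumptions), dual numbers a + b·ε with ε^2 = 0 propagate exact first-order derivative information through ordinary arithmetic:

    from dataclasses import dataclass

    # Dual numbers a + b*eps with eps**2 == 0: the eps coefficient
    # carries exact first-order (derivative) information.
    @dataclass
    class Dual:
        real: float       # ordinary value
        eps: float = 0.0  # infinitesimal (first-order) coefficient

        def _coerce(self, other):
            return other if isinstance(other, Dual) else Dual(float(other))

        def __add__(self, other):
            other = self._coerce(other)
            return Dual(self.real + other.real, self.eps + other.eps)

        __radd__ = __add__

        def __mul__(self, other):
            other = self._coerce(other)
            # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 == 0
            return Dual(self.real * other.real,
                        self.real * other.eps + self.eps * other.real)

        __rmul__ = __mul__

    # Differentiate f(x) = x**3 + 2*x at x = 4 by seeding eps = 1:
    x = Dual(4.0, 1.0)
    y = x * x * x + 2 * x
    print(y.real, y.eps)  # 72.0 50.0, i.e. f(4) = 72 and f'(4) = 50

Seeding eps = 1 makes the eps coefficient of the result the exact derivative; this nilpotent bookkeeping is the mechanism behind forward-mode automatic differentiation.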

In physics and applied mathematics, infinitesimals appear in differential equations, in perturbation theory, and in the heuristic notation of differentials, though precise justification often relies on limits or nonstandard frameworks.
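A standard example of this heuristic manipulation is separation of variables, where the differentials are treated as detachable quantities:

    \frac{dy}{dx} = ky \quad\Longrightarrow\quad \frac{dy}{y} = k\,dx
    \quad\Longrightarrow\quad \ln|y| = kx + C \quad\Longrightarrow\quad y = Ae^{kx}.

The middle step has no literal meaning in standard analysis, but the conclusion is justified by the chain rule and integration, or directly within a nonstandard framework.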

Infinitesimals remain a foundational and interpretive tool across schools of mathematics, serving both as a heuristic device and, in nonstandard analysis, as a rigorous element of number systems.
