infinitesimally

Infinitesimal refers to a quantity with magnitude smaller than any positive real number, yet not zero. In historical calculus, infinitesimals allowed intuitive reasoning about derivatives and integrals. In modern standard analysis, there are no real infinitesimals; calculus is built on limits. The adverbial form infinitesimally is used in ordinary language to mean "to an extremely small degree."

The concept originated with 17th-century thinkers such as Leibniz and Newton, who used infinitesimals as a formal tool in developing differential and integral calculus. Their notation, including dx and dy, helped describe instantaneous rates of change and accumulation. Philosophical objections and the subsequent development of epsilon-delta rigor by Cauchy, Bolzano, Weierstrass, and others led to a limit-based foundation that does not require actual infinitesimals.

In the 20th century, nonstandard analysis provided a rigorous framework in which infinitesimals exist as hyperreal numbers. In this setting, an infinitesimal is a nonzero hyperreal whose absolute value is smaller than 1/n for every standard natural number n. Derivatives and integrals acquire intuitive definitions via infinitesimal changes, for example f'(x) = st(Δy/Δx) for an infinitesimal Δx, where st denotes the standard part.
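As a brief worked illustration of the standard-part definition, under the usual convention Δy = f(x + Δx) − f(x) (a minimal sketch; the choice f(x) = x² is ours, not from the text), take a nonzero infinitesimal Δx:

\[
f'(x) \;=\; \operatorname{st}\!\left(\frac{\Delta y}{\Delta x}\right)
\;=\; \operatorname{st}\!\left(\frac{(x+\Delta x)^2 - x^2}{\Delta x}\right)
\;=\; \operatorname{st}(2x + \Delta x)
\;=\; 2x,
\]

since 2x + Δx differs from the standard real 2x only by an infinitesimal, which st discards.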
The term infinitesimally remains in everyday language to describe very small quantities or distinctions.