stepsize

Stepsize, or step size, is the increment by which a variable is advanced in an iterative or discrete process. It is a key parameter in numerical analysis, optimization, and simulation, governing how far the state is updated at each iteration or time step.

In numerical time integration, h denotes the time step. Euler's method, for example, updates y_{n+1} = y_n + h f(t_n, y_n). A smaller h generally increases local accuracy but requires more steps to cover a given interval, while too large an h can lead to instability or significant error.
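
As a concrete illustration, here is a minimal sketch of the forward Euler update in Python; the test problem y' = -2y and all parameter values are illustrative assumptions, not part of the definition above.

    import math

    def euler(f, t0, y0, h, n_steps):
        """Advance y' = f(t, y) from (t0, y0) using a fixed stepsize h."""
        t, y = t0, y0
        for _ in range(n_steps):
            y = y + h * f(t, y)  # y_{n+1} = y_n + h f(t_n, y_n)
            t = t + h
        return t, y

    # Hypothetical test problem: y' = -2y, exact solution y(t) = exp(-2t).
    f = lambda t, y: -2.0 * y
    t_end, y_end = euler(f, 0.0, 1.0, h=0.01, n_steps=100)
    print(y_end, math.exp(-2.0 * t_end))  # numerical vs. exact value at t = 1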

In optimization and machine learning, the stepsize (often called the learning rate) determines how far along a chosen search direction the algorithm moves at each iteration. A large stepsize can speed up progress but may overshoot or diverge; a small stepsize yields robust but slow convergence. Some algorithms employ line searches or adaptive schemes to adjust the stepsize to satisfy certain conditions (e.g., Armijo or Wolfe) or to balance progress and accuracy.
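
A minimal sketch of steepest descent with an Armijo backtracking line search, assuming NumPy and a hypothetical quadratic objective f(x) = ||x||^2:

    import numpy as np

    def armijo_step(f, grad, x, alpha0=1.0, c=1e-4, shrink=0.5):
        """Shrink the stepsize until the Armijo sufficient-decrease
        condition f(x + a*d) <= f(x) + c*a*grad(x).d is satisfied."""
        g = grad(x)
        d = -g                       # steepest-descent search direction
        alpha = alpha0
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= shrink          # stepsize too large: backtrack
        return x + alpha * d

    # Hypothetical example: minimize f(x) = ||x||^2.
    f = lambda x: x.dot(x)
    grad = lambda x: 2.0 * x
    x = np.array([3.0, -4.0])
    for _ in range(20):
        x = armijo_step(f, grad, x)
    print(x)  # approaches the minimizer at the origin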

Adaptive stepsize methods adjust h during computation based on estimated error, allowing larger steps when the solution behaves smoothly and smaller steps near rapid changes. This is common in Runge–Kutta and other integrators, and stability constraints such as the CFL condition influence allowable stepsizes in stiff or wave-dominated problems.
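
One simple way to realize this is step doubling: compare one step of size h with two steps of size h/2 and use the difference as an error estimate. The sketch below applies this to Euler's method; the tolerance and the growth and shrink factors are illustrative choices, not a production-quality controller.

    def adaptive_euler(f, t0, y0, t_end, h=0.1, tol=1e-5):
        """Toy adaptive-stepsize Euler integrator using step doubling."""
        t, y = t0, y0
        while t < t_end:
            h = min(h, t_end - t)           # do not step past t_end
            y_full = y + h * f(t, y)        # one step of size h
            y_half = y + (h / 2) * f(t, y)  # two steps of size h/2
            y_two = y_half + (h / 2) * f(t + h / 2, y_half)
            err = abs(y_two - y_full)       # local error estimate
            if err <= tol:
                t, y = t + h, y_two         # accept the step
                if err < tol / 4:
                    h *= 2.0                # smooth region: take larger steps
            else:
                h /= 2.0                    # rapid change: take smaller steps
        return y

    # Hypothetical test problem y' = -2y on [0, 1]:
    print(adaptive_euler(lambda t, y: -2.0 * y, 0.0, 1.0, 1.0))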

In linear algebra and iterative solvers, the stepsize concept may appear as a scalar multiplier applied to a search direction in gradient methods; in fixed-point iterations, it can affect the convergence rate.
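
For the fixed-point case, a minimal sketch of a damped (relaxed) iteration, where the relaxation parameter alpha plays the role of the stepsize; the example equation x = cos(x) and the value of alpha are illustrative assumptions:

    import math

    def relaxed_fixed_point(g, x0, alpha=0.5, tol=1e-10, max_iter=1000):
        """Damped fixed-point iteration x <- (1 - alpha)*x + alpha*g(x)."""
        x = x0
        for _ in range(max_iter):
            x_new = (1.0 - alpha) * x + alpha * g(x)
            if abs(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    # Hypothetical example: solve x = cos(x); changing alpha changes
    # how quickly the iteration settles near the fixed point (~0.739).
    print(relaxed_fixed_point(math.cos, 1.0, alpha=0.7))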

Choosing an appropriate stepsize involves considering truncation error, stability, and computational cost, often through analysis or adaptive strategies.
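
For instance, because forward Euler is first order, its global error shrinks roughly in proportion to h while the number of steps (and hence the cost) grows as 1/h; the sketch below measures this trade-off on the hypothetical test problem y' = -y:

    import math

    def euler_error(h):
        """Global error of forward Euler on y' = -y, y(0) = 1, at t = 1."""
        y = 1.0
        for _ in range(round(1.0 / h)):
            y += h * (-y)
        return abs(y - math.exp(-1.0))

    for h in (0.1, 0.05, 0.025):
        print(h, euler_error(h))  # error roughly halves as h halves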

See also: numerical integration, optimization, line search, and learning rate.