Rungeverschijnsel

Rungeverschijnsel, Dutch for the Runge phenomenon, is a problem in polynomial interpolation in which high-degree polynomials fitted to a function at equally spaced nodes on a finite interval exhibit large oscillations near the interval ends. Even though the function is smooth, the interpolants can become wildly inaccurate at the boundaries as the degree increases, while the approximation improves in the interior. The phenomenon was demonstrated by Carl Runge in 1901 and is a fundamental example of the instability of high-degree polynomial interpolation at equispaced nodes.

The underlying cause is linked to the growth of the Lagrange basis polynomials for many equally spaced nodes on a finite interval: with equidistant nodes, the Lebesgue constant, which measures how much the interpolation process can amplify errors, grows exponentially with the polynomial degree. The interpolation error and the oscillations near the endpoints therefore grow with the degree, leading to poor convergence at the boundaries despite good interior behavior. The effect is most pronounced for functions that are smooth on the interval but have singularities nearby in the complex plane, and so cannot be captured well by a single global polynomial.
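
This growth can be checked numerically by evaluating the Lebesgue function, the sum of the absolute values of the Lagrange basis polynomials, on a fine grid and taking its maximum. The following is a minimal sketch assuming NumPy; the 5001-point sample grid only approximates the true maximum, and the degrees shown are arbitrary illustrative choices.

```python
import numpy as np

def lebesgue_constant(nodes, n_eval=5001):
    """Approximate the max over [-1, 1] of sum_i |ell_i(x)|,
    where the ell_i are the Lagrange basis polynomials for `nodes`."""
    xs = np.linspace(-1.0, 1.0, n_eval)
    lam = np.zeros_like(xs)
    for i, xi in enumerate(nodes):
        others = np.delete(nodes, i)
        # ell_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j)
        ell = np.prod((xs[:, None] - others) / (xi - others), axis=1)
        lam += np.abs(ell)
    return lam.max()

for n in (5, 10, 20, 40):
    equi = np.linspace(-1.0, 1.0, n + 1)  # equally spaced nodes
    print(f"degree {n:2d}: Lebesgue constant ~ {lebesgue_constant(equi):.3e}")
```

The printed constants grow roughly exponentially with the degree, which is exactly the amplification behind the boundary oscillations.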

A classic illustration uses the Runge function f(x) = 1/(1 + 25x^2) on the interval [-1, 1]. Interpolating f with polynomials of increasing degree at equally spaced nodes produces increasingly large oscillations near x = ±1, showing that more nodes do not guarantee a better approximation.
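
The divergence is easy to reproduce. The sketch below, assuming NumPy and SciPy, interpolates the Runge function at equally spaced nodes with scipy.interpolate.BarycentricInterpolator (a numerically stable way to evaluate the polynomial interpolant) and prints the maximum error on a dense grid; the degrees are arbitrary illustrative choices.

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    """Runge's function: smooth on [-1, 1], poles at x = ±i/5."""
    return 1.0 / (1.0 + 25.0 * x**2)

xs = np.linspace(-1.0, 1.0, 2001)  # dense grid for measuring the error

for n in (5, 10, 15, 20):
    nodes = np.linspace(-1.0, 1.0, n + 1)             # equally spaced nodes
    p = BarycentricInterpolator(nodes, runge(nodes))  # degree-n interpolant
    err = np.max(np.abs(p(xs) - runge(xs)))
    print(f"degree {n:2d}: max error on [-1, 1] = {err:.3e}")
```

The error grows rapidly with the degree, and plotting the interpolants shows the oscillations concentrated near x = ±1.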

Mitigations include using nonuniform node distributions such as Chebyshev or Chebyshev-Gauss-Lobatto nodes, which cluster near the endpoints and keep the Lebesgue constant growing only logarithmically with the degree, making the interpolation error nearly optimal. Alternative approaches include piecewise polynomial interpolation (splines), rational interpolation, or spectral methods with appropriate basis functions. These strategies reduce or eliminate the Runge phenomenon and are standard in numerical analysis and scientific computing.
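
As an illustration of the first mitigation, the sketch below (again assuming NumPy and SciPy, with arbitrary degrees) compares equally spaced nodes with Chebyshev nodes of the first kind for the same Runge function.

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

def max_error(nodes, xs):
    """Maximum error on the grid xs of the interpolant through `nodes`."""
    p = BarycentricInterpolator(nodes, runge(nodes))
    return np.max(np.abs(p(xs) - runge(xs)))

xs = np.linspace(-1.0, 1.0, 2001)

for n in (10, 20, 40):
    equi = np.linspace(-1.0, 1.0, n + 1)
    k = np.arange(n + 1)
    cheb = np.cos((2 * k + 1) * np.pi / (2 * (n + 1)))  # Chebyshev (first-kind) nodes
    print(f"degree {n:2d}: equispaced {max_error(equi, xs):.2e}, "
          f"Chebyshev {max_error(cheb, xs):.2e}")
```

With the equally spaced nodes the error grows, while with the Chebyshev nodes it decreases steadily as the degree increases.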
