Runge's phenomenon
Runge's phenomenon (Dutch: Rungeverschijnsel) is a problem in polynomial interpolation where high-degree polynomials fitted to a function at equally spaced nodes on a finite interval exhibit large oscillations near the interval ends. Although the function is smooth, the interpolants can become wildly inaccurate at the boundaries as the degree increases, even while the approximation improves in the interior. The phenomenon was demonstrated by Carl Runge in 1901 and is a fundamental example of numerical instability in interpolation.
The underlying cause is linked to the growth of the Lagrange basis polynomials for many equally spaced nodes on a finite interval. Their sum of absolute values, the Lebesgue function, peaks sharply near the endpoints, and its maximum over the interval, the Lebesgue constant, grows exponentially with the degree for equispaced nodes. Since the interpolation error is bounded by the best-approximation error multiplied by one plus the Lebesgue constant, this exponential growth can overwhelm the improvement in the best approximation.
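The growth of the Lebesgue constant can be estimated numerically by evaluating the Lagrange basis on a fine grid. The following is a minimal NumPy sketch; the function name, grid resolution, and node count are illustrative choices, not a standard API:

```python
import numpy as np

def lebesgue_constant(nodes, grid):
    """Maximum over the grid of the Lebesgue function sum_j |l_j(x)|,
    where l_j are the Lagrange basis polynomials for the given nodes."""
    total = np.zeros_like(grid)
    for j, xj in enumerate(nodes):
        others = np.delete(nodes, j)
        # l_j(x) = prod_{k != j} (x - x_k) / (x_j - x_k)
        lj = np.prod((grid[:, None] - others) / (xj - others), axis=1)
        total += np.abs(lj)
    return total.max()

n = 20
grid = np.linspace(-1, 1, 4001)
equi = np.linspace(-1, 1, n + 1)
cheb = np.cos(np.pi * np.arange(n + 1) / n)   # Chebyshev-Gauss-Lobatto nodes

print(lebesgue_constant(equi, grid))   # equispaced: on the order of 1e4 at n = 20
print(lebesgue_constant(cheb, grid))   # Chebyshev: around 3 at n = 20
```

At degree 20 the two node families already differ by more than three orders of magnitude, which is why the same data can be harmless or catastrophic depending on where it is sampled.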
A classic illustration uses the Runge function f(x) = 1/(1 + 25x^2) on the interval [-1, 1]. Interpolating f at n + 1 equally spaced nodes yields interpolants whose maximum error does not shrink but grows without bound as n increases, with the largest oscillations appearing near the endpoints x = ±1.
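The divergence is easy to reproduce. Below is a small sketch that interpolates the Runge function at equispaced nodes via the barycentric formula (used here only for numerical stability; any correct polynomial interpolation scheme shows the same behavior) and reports the maximum error on a fine grid:

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

def interpolate(nodes, values, x):
    """Evaluate the unique interpolating polynomial at the points x
    using the second (true) barycentric formula."""
    # Barycentric weights w_j = 1 / prod_{k != j} (x_j - x_k)
    w = np.array([1.0 / np.prod(nodes[j] - np.delete(nodes, j))
                  for j in range(len(nodes))])
    diff = x[:, None] - nodes
    exact = np.isclose(diff, 0.0)
    diff[exact] = 1.0                  # avoid division by zero at the nodes
    terms = w / diff
    p = (terms @ values) / terms.sum(axis=1)
    hit = exact.any(axis=1)            # grid points coinciding with nodes
    p[hit] = values[exact.argmax(axis=1)[hit]]
    return p

x = np.linspace(-1, 1, 2001)
for n in (5, 10, 20):
    nodes = np.linspace(-1, 1, n + 1)
    err = np.abs(interpolate(nodes, runge(nodes), x) - runge(x)).max()
    print(n, err)   # the maximum error grows with n instead of shrinking
```

Raising the degree makes the fit worse: by degree 20 the interpolant overshoots the function by more than an order of magnitude near the endpoints.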
Mitigations include using nonuniform node distributions such as Chebyshev or Chebyshev-Gauss-Lobatto nodes, which minimize the maximum of the nodal polynomial and keep the Lebesgue constant growing only logarithmically, so the interpolants converge for functions like the Runge function. Other remedies are piecewise polynomial (spline) interpolation, which keeps each piece at low degree, and least-squares fitting with a polynomial of lower degree than the number of samples.
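Repeating the experiment at Chebyshev-Gauss-Lobatto nodes cos(jπ/n) shows the cure. The sketch below uses the simplified barycentric weights (-1)^j, halved at the two endpoints, which are the known weights for these nodes up to a common factor; the helper name is illustrative:

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

def cheb_interp_error(n, x):
    """Max error on the grid x of the degree-n interpolant of the
    Runge function at Chebyshev-Gauss-Lobatto nodes cos(j*pi/n)."""
    nodes = np.cos(np.pi * np.arange(n + 1) / n)
    # Simplified barycentric weights for these nodes: (-1)^j, halved at the ends.
    w = (-1.0) ** np.arange(n + 1)
    w[0] *= 0.5
    w[-1] *= 0.5
    diff = x[:, None] - nodes
    exact = np.isclose(diff, 0.0)
    diff[exact] = 1.0                  # avoid division by zero at the nodes
    terms = w / diff
    p = (terms @ runge(nodes)) / terms.sum(axis=1)
    hit = exact.any(axis=1)            # grid points coinciding with nodes
    p[hit] = runge(nodes)[exact.argmax(axis=1)[hit]]
    return np.abs(p - runge(x)).max()

x = np.linspace(-1, 1, 2001)
for n in (10, 20, 40):
    print(n, cheb_interp_error(n, x))   # the error now decreases as n grows
```

With the nodes clustered toward ±1, the same Runge function is approximated with geometrically decreasing error as the degree rises, in sharp contrast to the equispaced case.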