Undersmoothing

Undersmoothing is a practice in nonparametric smoothing where the bandwidth or smoothing parameter is chosen smaller than the value that minimizes the mean squared error (MSE). In kernel-based regression and density estimation, smoothing reduces variance but introduces bias: bias typically grows as the bandwidth increases, while variance shrinks. The MSE-optimal bandwidth balances these two sources of error; undersmoothing deliberately selects a bandwidth below that balance point.

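To make the tradeoff concrete, the following is a textbook sketch for a kernel density estimator with a second-order kernel K; the expansion is standard, and the notation μ₂(K) = ∫u²K(u)du and R(K) = ∫K(u)²du is supplied here for illustration rather than taken from the original text.

```latex
% Estimator and its leading bias/variance terms (second-order kernel)
\hat f_h(x) = \frac{1}{nh}\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right),
\qquad
\operatorname{Bias}\hat f_h(x) \approx \tfrac{1}{2}\,h^2 f''(x)\,\mu_2(K),
\qquad
\operatorname{Var}\hat f_h(x) \approx \frac{f(x)\,R(K)}{nh}.
% Squared bias grows like h^4, variance like 1/(nh):
\operatorname{MSE}(h) \approx \tfrac{1}{4}\,h^4 f''(x)^2 \mu_2(K)^2
  + \frac{f(x)\,R(K)}{nh}
\quad\Longrightarrow\quad
h_{\mathrm{MSE}} \propto n^{-1/5}.
```

Undersmoothing then means choosing h = o(n^{-1/5}), a bandwidth that shrinks strictly faster than the MSE-optimal rate.
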
The primary motivation for undersmoothing is statistical inference. With a smaller-than-optimal bandwidth, the bias of the estimator becomes negligible relative to the sampling error, making confidence intervals and hypothesis tests for the function itself (or its derivatives) more reliable without bias corrections. This is particularly relevant for local polynomial estimators and for estimating derivatives, where bias can otherwise contaminate inference.

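The reasoning behind this claim can be sketched with the same density estimator (again a standard argument, supplied here for illustration). Centering the estimator at the truth splits the error into a stochastic part and a bias part:

```latex
\sqrt{nh}\,\bigl(\hat f_h(x) - f(x)\bigr)
= \underbrace{\sqrt{nh}\,\bigl(\hat f_h(x) - \mathbb{E}\hat f_h(x)\bigr)}_{\to\;\mathcal N(0,\; f(x)\,R(K))}
+ \underbrace{\sqrt{nh}\,\operatorname{Bias}\hat f_h(x)}_{O(\sqrt{n h^5})}.
```

At the MSE-optimal rate h ∝ n^{-1/5}, nh⁵ stays bounded away from zero, so the limiting distribution is not centered at zero and naive confidence intervals undercover. Undersmoothing enforces nh⁵ → 0, which is the sense in which the bias becomes negligible relative to the sampling error.
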
Practically, undersmoothing involves reducing the smoothing parameter by a fixed factor or selecting a target bandwidth smaller than what cross-validation or other standard criteria would recommend. However, this comes at the cost of higher variance and more erratic estimates, especially in finite samples. The approach is most appropriate when the goal is inference rather than prediction, and when no bias correction or higher-order smoothing is applied.

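As one concrete illustration of this recipe, here is a minimal NumPy sketch; the helper names are made up for this example, Silverman's rule stands in for "standard criteria", and the extra n^(-1/10) factor is just one possible shrinkage choice (any factor that vanishes as n grows gives nh⁵ → 0 here).

```python
import numpy as np

def silverman_bandwidth(x):
    """Rule-of-thumb bandwidth, h = 1.06 * sigma * n**(-1/5)."""
    return 1.06 * x.std(ddof=1) * x.size ** (-1 / 5)

def gaussian_kde(x, grid, h):
    """Gaussian kernel density estimate evaluated at each grid point."""
    u = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (x.size * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
grid = np.linspace(-3.0, 3.0, 201)

h_opt = silverman_bandwidth(x)       # roughly MSE-optimal rate, h ~ n^(-1/5)
h_us = h_opt * x.size ** (-1 / 10)   # undersmoothed: h ~ n^(-3/10), n*h^5 -> 0

f_opt = gaussian_kde(x, grid, h_opt)  # smoother, more biased
f_us = gaussian_kde(x, grid, h_us)    # less biased, visibly noisier
```
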
Alternatives to undersmoothing include bias correction with explicit adjustment terms, bootstrap methods for improved coverage, and the use of higher-order local polynomials or kernels to reduce bias without sacrificing too much variance.

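As a sketch of the last of these alternatives: a fourth-order kernel cancels the leading h² bias term instead of shrinking the bandwidth. The kernel below, K(u) = ½(3 − u²)φ(u) with φ the standard normal density, is a standard fourth-order construction; the function name is made up for this example.

```python
import numpy as np

def kde_fourth_order(x, grid, h):
    """Density estimate with the fourth-order Gaussian kernel
    K(u) = 0.5 * (3 - u**2) * phi(u).

    The kernel's second moment is zero, so the leading O(h^2) bias
    term cancels and the bias falls to O(h^4); the price is a kernel
    that takes negative values, so the estimate need not be nonnegative.
    """
    u = (grid[:, None] - x[None, :]) / h
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return (0.5 * (3.0 - u**2) * phi).sum(axis=1) / (x.size * h)
```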