Nonparametrics

Nonparametrics, in statistics, refers to methods that do not assume a specific parametric form for the population distribution or for the underlying relationship between variables. These approaches aim to be robust to model misspecification and to be applicable when the true distribution is unknown or difficult to specify. Nonparametric methods can be distribution-free or rely on minimal assumptions about functional form, such as monotonicity or continuity.

Nonparametric inference includes hypothesis tests and confidence intervals that do not depend on a known distribution. Classic examples include sign tests, Wilcoxon signed-rank tests, Mann-Whitney U tests, Kruskal-Wallis tests, and Friedman tests. Correlation measures such as Spearman's rho and Kendall's tau are also rank-based. In many settings, permutation and bootstrap resampling are used to obtain p-values or intervals without strong distributional assumptions.
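
As an illustrative sketch (not part of the original text), the following Python snippet runs a simple two-sample permutation test for a difference in means; the simulated data, group sizes, and number of resamples are arbitrary assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical samples for two groups (assumed data, for illustration only).
    x = rng.normal(loc=0.0, scale=1.0, size=30)
    y = rng.normal(loc=0.5, scale=1.0, size=35)
    observed = x.mean() - y.mean()

    # Pool the observations and repeatedly reshuffle the group labels.
    pooled = np.concatenate([x, y])
    n_x = len(x)
    n_resamples = 10_000
    count = 0
    for _ in range(n_resamples):
        perm = rng.permutation(pooled)
        diff = perm[:n_x].mean() - perm[n_x:].mean()
        if abs(diff) >= abs(observed):
            count += 1

    # Two-sided p-value; the +1 terms include the observed statistic itself.
    p_value = (count + 1) / (n_resamples + 1)
    print(f"observed difference: {observed:.3f}, permutation p-value: {p_value:.4f}")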

Nonparametric estimation covers density estimation, regression, and related function estimation. Kernel density estimation and histograms estimate a distribution's shape without assuming a parametric form; kernel regression, locally weighted regression (LOESS), and spline-based methods estimate smooth relationships. Isotonic regression and other shape-constrained techniques impose qualitative features such as monotonicity. These methods often require tuning choices (for example, bandwidths) and can be computationally intensive.
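
As a hedged illustration of kernel smoothing, here is a minimal Gaussian kernel density estimator in Python; the sample, evaluation grid, and Silverman-style bandwidth are illustrative choices rather than recommendations from the text.

    import numpy as np

    def gaussian_kde(data, grid, bandwidth):
        """Kernel density estimate with a Gaussian kernel and a fixed bandwidth."""
        # Scaled distances between every grid point and every observation.
        z = (grid[:, None] - data[None, :]) / bandwidth
        kernel = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
        # Average the kernels over observations and rescale by the bandwidth.
        return kernel.mean(axis=1) / bandwidth

    rng = np.random.default_rng(1)
    data = rng.normal(size=200)                          # hypothetical sample
    grid = np.linspace(data.min() - 1.0, data.max() + 1.0, 400)

    # Silverman's rule of thumb is one common default bandwidth choice.
    bandwidth = 1.06 * data.std() * len(data) ** (-1 / 5)
    density = gaussian_kde(data, grid, bandwidth)

    # The estimated density should integrate to roughly one over the grid.
    print(f"bandwidth: {bandwidth:.3f}, approximate integral: "
          f"{(density * (grid[1] - grid[0])).sum():.3f}")

The bandwidth plays the role of the tuning choices mentioned above: small values track the data closely, while large values oversmooth.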

Compared with parametric methods, nonparametric techniques offer flexibility and robustness at the cost of potentially lower efficiency when a parametric model is correct, and they typically require larger sample sizes. They may also be less interpretable and more sensitive to data quality and tuning parameters.

Nonparametric Bayesian methods are a related area that combines flexible models with probabilistic inference, while bootstrap and permutation tools are common companions used to assess variability in nonparametric procedures.
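
As one further hedged sketch, the snippet below uses the nonparametric bootstrap to form a percentile confidence interval for a median in Python; the skewed sample and resample count are hypothetical.

    import numpy as np

    rng = np.random.default_rng(2)
    data = rng.exponential(scale=2.0, size=80)   # hypothetical skewed sample

    # Nonparametric bootstrap: resample with replacement, recompute the median.
    n_resamples = 5_000
    medians = np.empty(n_resamples)
    for i in range(n_resamples):
        resample = rng.choice(data, size=len(data), replace=True)
        medians[i] = np.median(resample)

    # Percentile interval from the empirical distribution of bootstrap medians.
    lower, upper = np.percentile(medians, [2.5, 97.5])
    print(f"sample median: {np.median(data):.3f}, "
          f"95% bootstrap interval: ({lower:.3f}, {upper:.3f})")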