KSTests

Kolmogorov-Smirnov tests (KSTests) are nonparametric methods used to assess whether a sample comes from a specified distribution or whether two samples come from the same distribution. They are based on the empirical distribution function (EDF) of the data and are widely used for goodness-of-fit testing and for comparing distributions without assuming a particular parametric form.

There are two main variants. The one-sample KS test evaluates whether the EDF of a sample matches a specified distribution F0 by computing Dn = sup_x |Fn(x) - F0(x)|, where Fn is the empirical CDF. The two-sample KS test compares two independent samples by computing D = sup_x |F1,n(x) - F2,m(x)|, where F1,n and F2,m are the two EDFs and n, m are their sample sizes. Under the null hypothesis, the distribution of D asymptotically follows the Kolmogorov distribution, allowing p-values to be obtained from tables or software; exact critical values exist for small samples.

Assumptions and considerations include independent observations and, for the standard critical values, continuous distributions (ties can affect accuracy). If parameters of the reference distribution are estimated from the data, standard KS critical values are not valid, and adjustments (such as the Lilliefors correction) or resampling methods should be used. The KS test is most sensitive to differences near the center of the distribution and may have limited power against alternatives that differ mainly in the tails; conversely, with very large samples even small, practically unimportant deviations can be detected as statistically significant.

Applications cover goodness-of-fit testing and equality of distributions between groups, with implementations commonly available in statistical software (for example, ks.test in R and equivalent functions in Python's SciPy).
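
As an illustration of the one-sample statistic, the following sketch (assuming NumPy and SciPy, with a hypothetical standard-normal sample) computes Dn directly from the order statistics and cross-checks it against scipy.stats.kstest:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=200)  # hypothetical sample; N(0, 1) is the null here

# One-sample KS statistic: D_n = sup_x |F_n(x) - F_0(x)|.
# For a continuous F_0 the supremum is attained at the order statistics,
# checking the EDF just before and just after each jump.
xs = np.sort(x)
n = len(xs)
cdf = stats.norm.cdf(xs)                          # F_0 evaluated at the order statistics
d_plus = np.max(np.arange(1, n + 1) / n - cdf)    # EDF just after each jump
d_minus = np.max(cdf - np.arange(n) / n)          # EDF just before each jump
d_n = max(d_plus, d_minus)

# SciPy's one-sample test should report the same statistic.
res = stats.kstest(x, "norm")
print(round(d_n, 6), round(res.statistic, 6), round(res.pvalue, 4))
```

The split into d_plus and d_minus is needed because the EDF is a step function: the largest gap to F0 can occur on either side of a jump.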
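
The two-sample statistic can be sketched the same way. This example (again assuming NumPy and SciPy, with two hypothetical normal samples of sizes n = 150 and m = 120) evaluates both EDFs over the pooled sample and compares against scipy.stats.ks_2samp:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, size=150)  # hypothetical sample 1 (n = 150)
b = rng.normal(0.5, 1.0, size=120)  # hypothetical sample 2 (m = 120), shifted mean

# Two-sample KS statistic: D = sup_x |F_{1,n}(x) - F_{2,m}(x)|.
# Both EDFs are step functions that only change at pooled data points,
# so the supremum is attained over the pooled sample.
pooled = np.concatenate([a, b])
f1 = np.searchsorted(np.sort(a), pooled, side="right") / len(a)
f2 = np.searchsorted(np.sort(b), pooled, side="right") / len(b)
d = np.max(np.abs(f1 - f2))

# SciPy's two-sample test should report the same statistic.
res = stats.ks_2samp(a, b)
print(round(d, 6), round(res.statistic, 6), round(res.pvalue, 4))
```

With side="right", each searchsorted call counts observations less than or equal to a point, which is exactly the EDF value there.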
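
The caveat about estimated parameters can also be illustrated. One resampling approach is a parametric bootstrap: re-estimate the parameters on each simulated sample so the null distribution of D reflects the estimation step, which standard KS tables do not. A minimal sketch, assuming hypothetical normal data and SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(10.0, 2.0, size=100)  # hypothetical data with unknown mean and sd

def ks_stat_fitted(sample):
    # KS statistic against a normal whose parameters are fitted to the sample.
    mu, sigma = sample.mean(), sample.std(ddof=0)
    return stats.kstest(sample, "norm", args=(mu, sigma)).statistic

d_obs = ks_stat_fitted(x)

# Parametric bootstrap: simulate from the fitted null, refit on each replicate,
# and use the resulting statistics as the reference distribution for D.
n_boot = 500
mu_hat, sigma_hat = x.mean(), x.std(ddof=0)
d_boot = np.array([
    ks_stat_fitted(rng.normal(mu_hat, sigma_hat, size=len(x)))
    for _ in range(n_boot)
])
p_value = np.mean(d_boot >= d_obs)
print(round(d_obs, 4), round(p_value, 3))
```

Comparing d_obs against standard KS critical values instead would be anti-conservative, since fitting the parameters pulls the reference distribution toward the data.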