Kullback

Kullback is a surname associated with Solomon Kullback, a statistician who co-developed the Kullback–Leibler divergence. In 1951, together with Richard Leibler, he introduced a measure of how one probability distribution diverges from a second, reference distribution in the paper "On Information and Sufficiency". The concept is a foundational tool in information theory and statistics and has influenced numerous theoretical and applied methods.

The divergence is commonly denoted D_KL(P||Q). For discrete distributions, it is defined as

D_KL(P||Q) = Σ_x P(x) log[P(x)/Q(x)],

where the sum runs over all outcomes x. For continuous distributions, the sum is replaced by an integral. D_KL is non-negative and equals zero only when P and Q are identical almost everywhere, but it is not symmetric: D_KL(P||Q) generally does not equal D_KL(Q||P). This asymmetry reflects the fact that the measure quantifies the information lost when Q is used to approximate P.

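As a concrete illustration of the discrete definition, here is a minimal Python sketch; the function name kl_divergence and the example distributions are chosen for this sketch rather than taken from the text:

import math

def kl_divergence(p, q):
    """D_KL(P||Q) = sum over outcomes x of P(x) log[P(x)/Q(x)].

    p and q are sequences of probabilities over the same outcomes.
    Standard conventions: a term with p_i == 0 contributes 0, and
    p_i > 0 paired with q_i == 0 makes the divergence infinite.
    """
    total = 0.0
    for p_i, q_i in zip(p, q):
        if p_i == 0.0:
            continue          # 0 * log(0 / q) = 0 by convention
        if q_i == 0.0:
            return math.inf   # P puts mass where Q has none
        total += p_i * math.log(p_i / q_i)
    return total

# Illustrative distributions (invented for this sketch):
P = [0.5, 0.4, 0.1]
Q = [0.3, 0.3, 0.4]
print(kl_divergence(P, Q))  # ~0.232 nats; always >= 0
print(kl_divergence(Q, P))  # ~0.315 nats; differs, showing the asymmetry
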
Applications and impact include its central role in statistical inference and machine learning. D_KL serves as a criterion for model fitting, a measure of information loss when using an approximate distribution, and a component of algorithms in variational inference, maximum likelihood estimation, and other learning frameworks. The Kullback–Leibler divergence remains a core concept across statistics, information theory, and data analysis.
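
To make the model-fitting role concrete, the following sketch (reusing kl_divergence from above, with invented sample data and candidate models) picks the candidate distribution with the smallest divergence from the empirical distribution; this is the same candidate a maximum-likelihood comparison would select, since D_KL(P_emp||Q) and the average log-likelihood differ only by the entropy of P_emp, which does not depend on Q:

from collections import Counter

# Hypothetical sample and candidate models, invented for illustration.
data = ["a", "a", "b", "a", "c", "b", "a", "b"]
outcomes = ["a", "b", "c"]
counts = Counter(data)
p_emp = [counts[x] / len(data) for x in outcomes]  # empirical distribution P

# Candidate model distributions Q over the same outcomes:
candidates = {
    "uniform": [1/3, 1/3, 1/3],
    "skewed":  [0.5, 0.375, 0.125],
}

# Uses kl_divergence defined in the earlier sketch. Minimizing
# D_KL(P_emp || Q) over candidates is equivalent to maximizing the
# sample log-likelihood, because the entropy of P_emp is a constant.
best = min(candidates, key=lambda name: kl_divergence(p_emp, candidates[name]))
print(best)  # "skewed", the candidate closest to the data in KL terms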