pmf

PMF stands for probability mass function. In probability theory and statistics, it describes the distribution of a discrete random variable by specifying the probability that the variable takes each possible value. The PMF fully characterizes the distribution of the variable and is defined on its support, the set of values the variable can assume.

Formally, if X is a discrete random variable with possible values {x1, x2, ...}, the PMF is pX(x) = P(X = x) for x in the support, and pX(x) = 0 for values not in the support. The probabilities are nonnegative and sum to 1: the sum over all x of pX(x) equals 1. The cumulative distribution function F(x) = P(X ≤ x) can be obtained as F(x) = sum_{t ≤ x} pX(t). The PMF thus determines all probabilities and, by extension, the moments of X, such as the expectation E[X] = sum_x x pX(x) and the variance Var(X) = E[X^2] − (E[X])^2.

Common discrete distributions have standard PMFs. For example, a fair six-sided die has pX(k) = 1/6 for k = 1, ..., 6. A Bernoulli(p) variable has pX(1) = p and pX(0) = 1 − p. The Binomial(n, p) and Poisson(λ) distributions are likewise defined by their respective PMFs and are used to model counts and rare-event occurrences.

PMFs are distinguished from probability density functions, which describe continuous random variables. Outside its support, a PMF assigns zero probability to all values.
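The defining properties above can be checked concretely. A minimal Python sketch (the names `pmf` and `cdf` are illustrative) that builds the PMF of a fair six-sided die, verifies it is nonnegative and sums to 1, and computes the CDF, expectation, and variance:

```python
from fractions import Fraction

# PMF of a fair six-sided die: pX(k) = 1/6 for k = 1, ..., 6
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# Probabilities are nonnegative and sum to 1
assert all(p >= 0 for p in pmf.values())
assert sum(pmf.values()) == 1

def cdf(x, pmf):
    """F(x) = P(X <= x): sum pX(t) over support points t <= x."""
    return sum(p for t, p in pmf.items() if t <= x)

# Expectation E[X] = sum_x x * pX(x)
mean = sum(x * p for x, p in pmf.items())

# Variance Var(X) = E[X^2] - (E[X])^2
var = sum(x**2 * p for x, p in pmf.items()) - mean**2

print(cdf(3, pmf))   # 1/2
print(mean)          # 7/2
print(var)           # 35/12
```

Using exact fractions rather than floats keeps the sum-to-1 check exact; with floats, a tolerance comparison would be needed instead.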
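The standard PMFs named above can also be written out directly. A short sketch (function names are illustrative) of the Binomial and Poisson mass functions, with a check that each sums to 1 over its support:

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam): lam^k * e^(-lam) / k!."""
    return lam**k * exp(-lam) / factorial(k)

# Each PMF sums to 1 over its support (the Poisson support is infinite,
# so the sum is truncated and only approximately 1)
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
assert abs(sum(poisson_pmf(k, 4.0) for k in range(100)) - 1) < 1e-12
```

A Bernoulli(p) variable is the special case n = 1 of the binomial, so binomial_pmf(1, 1, p) returns p and binomial_pmf(0, 1, p) returns 1 − p.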