PMFs

PMF stands for probability mass function. In probability theory, a PMF describes the distribution of a discrete random variable X by assigning a probability to each value that X can take. Formally, p(x) = P(X = x) for every x in the support of X. The function is defined on a countable set of values, satisfies p(x) ≥ 0 for all x, and the sum of p(x) over all x in the support equals 1. Values outside the support have p(x) = 0.
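
As a concrete illustration (a minimal sketch, not from the text; the function name is_valid_pmf and the dict representation are my own choices), a finite-support PMF can be stored as a Python dict and checked against the two defining conditions:

    # Represent a PMF as a dict mapping each support value to its probability.
    # is_valid_pmf and tol are illustrative choices, not from the text.
    def is_valid_pmf(pmf, tol=1e-9):
        nonnegative = all(p >= 0 for p in pmf.values())   # p(x) >= 0 for all x
        normalized = abs(sum(pmf.values()) - 1.0) <= tol  # probabilities sum to 1
        return nonnegative and normalized

    die = {x: 1/6 for x in range(1, 7)}  # fair six-sided die
    print(is_valid_pmf(die))             # True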

The PMF uniquely determines the distribution of X. From it one can derive the cumulative distribution function F(x) = P(X ≤ x) by summing p(y) over all y ≤ x. Moments are computed as E[X] = sum_x x p(x) and Var(X) = E[X^2] − (E[X])^2, with sums taken over the support.
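These formulas translate directly into code. A minimal sketch, assuming the same dict representation as above (the helper names cdf, mean, and variance are hypothetical, not from the text):

    def cdf(pmf, x):
        # F(x) = P(X <= x): sum p(y) over all y <= x in the support
        return sum(p for y, p in pmf.items() if y <= x)

    def mean(pmf):
        # E[X] = sum_x x p(x) over the support
        return sum(x * p for x, p in pmf.items())

    def variance(pmf):
        # Var(X) = E[X^2] - (E[X])^2
        e_x2 = sum(x ** 2 * p for x, p in pmf.items())
        return e_x2 - mean(pmf) ** 2

    die = {x: 1/6 for x in range(1, 7)}
    print(cdf(die, 3))    # 0.5
    print(mean(die))      # 3.5
    print(variance(die))  # ~2.9167
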
Common examples include: a fair six-sided die, where p(1) = … = p(6) = 1/6 and p(x) = 0 otherwise; a Bernoulli variable X ∈ {0,1} with P(X=1) = p; and distributions such as the Binomial, Geometric, and Poisson, each with its own PMF.
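For illustration, the standard formulas for these PMFs can be written out as follows (a sketch; the parameter names n, p, and lam follow common convention rather than the text, and the Geometric is parameterized as trials until the first success):

    from math import comb, exp, factorial

    def bernoulli_pmf(x, p):
        # p(1) = p, p(0) = 1 - p, 0 elsewhere
        return p if x == 1 else (1 - p if x == 0 else 0.0)

    def binomial_pmf(k, n, p):
        # P(X = k) = C(n, k) p^k (1 - p)^(n - k)
        return comb(n, k) * p ** k * (1 - p) ** (n - k)

    def geometric_pmf(k, p):
        # P(X = k) = (1 - p)^(k - 1) p, for k = 1, 2, ...
        return (1 - p) ** (k - 1) * p

    def poisson_pmf(k, lam):
        # P(X = k) = e^(-lam) lam^k / k!
        return exp(-lam) * lam ** k / factorial(k)

    print(binomial_pmf(2, 5, 0.3))  # ~0.3087
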
PMFs apply only to discrete random variables. For continuous variables, the analogous concept is the probability density function (PDF), not the PMF. In general, a PMF must satisfy sum_x p(x) = 1, while a PDF must integrate to 1 over its domain.
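A quick numerical check of the two conditions (an illustrative sketch, not from the text): the Poisson PMF summed over its support and the standard normal PDF integrated over its domain both come out to approximately 1:

    from math import exp, factorial, pi, sqrt

    lam = 4.0
    pmf_total = sum(exp(-lam) * lam ** k / factorial(k) for k in range(100))
    print(pmf_total)  # ~1.0 (sum over the support)

    def std_normal_pdf(x):
        return exp(-x ** 2 / 2) / sqrt(2 * pi)

    dx = 0.001  # crude Riemann sum over [-10, 10]
    pdf_total = sum(std_normal_pdf(-10 + i * dx) * dx for i in range(20000))
    print(pdf_total)  # ~1.0 (integral over the domain)
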
Estimation of PMFs from data often uses frequency counts or likelihood-based approaches.
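
A minimal sketch of the frequency-count approach, which for a finite support coincides with the maximum-likelihood estimate (the function estimate_pmf is a hypothetical helper, not from the text):

    from collections import Counter

    def estimate_pmf(samples):
        # Relative frequencies: p_hat(x) = count(x) / n
        counts = Counter(samples)
        n = len(samples)
        return {x: c / n for x, c in counts.items()}

    data = [1, 2, 2, 3, 3, 3, 6, 6]
    print(estimate_pmf(data))  # {1: 0.125, 2: 0.25, 3: 0.375, 6: 0.25}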