Expected value

Expected value, also called expectation or mean, is a central concept in probability and statistics that summarizes the average outcome of a random variable if the experiment could be repeated many times. For a discrete random variable X with possible values x_i and probabilities p_i, the expected value is E[X] = ∑ x_i p_i. For a continuous variable with density f, E[X] = ∫ x f(x) dx, provided the integral converges absolutely. More generally, for any function g, E[g(X)] = ∑ g(x_i) p_i or ∫ g(x) f(x) dx.
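
As a minimal sketch of the discrete formula, the Python helper below (the name expectation and the example probability table are illustrative, not taken from any library) sums x_i p_i over a probability table; the same loop gives E[g(X)] by applying g to each outcome first.

```python
# Sketch: expected value of a discrete random variable from its probability table.
# The helper name `expectation` and the example pmf are illustrative, not from any library.

def expectation(pmf, g=lambda x: x):
    """Return E[g(X)] = sum of g(x) * p(x) over a discrete pmf given as a dict."""
    return sum(g(x) * p for x, p in pmf.items())

# X takes the values 0, 1, 2 with probabilities 0.25, 0.5, 0.25.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
print(expectation(pmf))                    # E[X]   = 0*0.25 + 1*0.5 + 2*0.25 = 1.0
print(expectation(pmf, g=lambda x: x**2))  # E[X^2] = 0*0.25 + 1*0.5 + 4*0.25 = 1.5
```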

The expected value is a linear operator: E[aX + b] = a E[X] + b, and E[X + Y] = E[X] + E[Y] for any random variables X and Y, regardless of independence. If X is nonnegative, E[X] ≥ 0; the expectation may be finite, infinite, or undefined if the sum or integral diverges.
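
A small numerical check of both properties, using a made-up joint distribution in which X and Y are deliberately dependent, illustrates that additivity does not require independence:

```python
# Sketch: checking linearity of expectation on a small, hand-made joint pmf.
# X and Y below are deliberately dependent (they tend to equal 1 together), yet
# E[X + Y] = E[X] + E[Y] still holds. Probabilities are dyadic, so float sums are exact.

joint = {  # (x, y): P(X = x, Y = y)
    (0, 0): 0.25,
    (0, 1): 0.125,
    (1, 0): 0.125,
    (1, 1): 0.5,
}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_sum = sum((x + y) * p for (x, y), p in joint.items())

a, b = 3.0, 2.0
E_aXb = sum((a * x + b) * p for (x, y), p in joint.items())

print(E_sum, E_X + E_Y)    # 1.25 1.25   -- additivity, no independence needed
print(E_aXb, a * E_X + b)  # 3.875 3.875 -- E[aX + b] = a E[X] + b
```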

Common examples illustrate its interpretation. A fair die has E[X] = (1+2+3+4+5+6)/6 = 3.5. A fair coin that pays +1 for heads and −1 for tails has E[X] = 0. If X is an indicator of an event A, E[X] = P(A). The law of iterated expectation states that E[X] = E[ E[X|Y] ].
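
These can be checked directly. The sketch below uses toy probability tables chosen for illustration (exact fractions avoid rounding noise) to reproduce the die, the coin, an indicator variable, and the law of iterated expectation:

```python
# Sketch: the worked examples above, computed from small illustrative probability tables.
from fractions import Fraction

# Fair die: E[X] = (1 + 2 + ... + 6) / 6 = 3.5
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(float(sum(x * p for x, p in die.items())))   # 3.5

# Fair coin paying +1 for heads and -1 for tails: E[X] = 0
coin = {+1: Fraction(1, 2), -1: Fraction(1, 2)}
print(sum(x * p for x, p in coin.items()))          # 0

# Indicator of an event A with P(A) = 3/10: E[1_A] = P(A)
p_A = Fraction(3, 10)
indicator = {1: p_A, 0: 1 - p_A}
print(sum(x * p for x, p in indicator.items()))     # 3/10

# Law of iterated expectation on a toy joint pmf for (X, Y):
joint = {(0, 0): Fraction(1, 4), (1, 0): Fraction(1, 4),
         (0, 1): Fraction(1, 8), (2, 1): Fraction(3, 8)}
E_X = sum(x * p for (x, y), p in joint.items())
# Averaging E[X | Y = y] over the distribution of Y recovers E[X]:
E_of_conditional = Fraction(0)
for y in {yy for (_, yy) in joint}:
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    E_X_given_y = sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y
    E_of_conditional += E_X_given_y * p_y
print(E_X, E_of_conditional)                        # 1 1 -- equal, as the law states
```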

In practice, the expected value represents the long-run average outcome, not necessarily a typical single observation. It is a basic descriptor of a distribution and underpins many methods in statistics and decision making. The terms mean, expectation, and average are often used interchangeably, and the empirical mean estimates the expected value from data.

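As a closing illustration of that last point, the short simulation below (the seed and sample sizes are arbitrary choices) shows the empirical mean of fair-die rolls settling near E[X] = 3.5 as the sample grows:

```python
# Sketch: the empirical mean of simulated fair-die rolls approaching E[X] = 3.5.
# Seed and sample sizes are arbitrary, chosen only for illustration.
import random

random.seed(0)
true_mean = 3.5

for n in (10, 1_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    empirical = sum(rolls) / n
    print(f"n = {n:>6}: empirical mean = {empirical:.4f}  (E[X] = {true_mean})")
```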