Log probability

Log probability is the logarithm of a probability value. It is widely used in statistics and machine learning because taking logarithms converts products of probabilities into sums, which reduces numerical underflow when dealing with many events or long data sequences.
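To make the underflow point concrete, here is a minimal Python sketch; the event count (1,000) and per-event probability (0.05) are arbitrary illustrative choices:

```python
import math

# Probability of 1,000 independent events, each with probability 0.05.
probs = [0.05] * 1000

# The naive product underflows to 0.0 in double precision:
# 0.05**1000 is far below the smallest positive float (~5e-324).
product = 1.0
for p in probs:
    product *= p
print(product)  # 0.0

# Summing log probabilities keeps the result representable.
log_product = sum(math.log(p) for p in probs)
print(log_product)  # about -2995.73 (i.e., 1000 * log(0.05))
```

The log-domain result can still be compared, added to other log probabilities, or exponentiated later if a probability is actually needed.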

Mathematical foundations: For independent events A and B, the log probability satisfies log P(A and B) = log P(A) + log P(B). For a sequence of observations, the joint log probability is the sum of the log probabilities: log P(x1, x2, ..., xn) = Σ_i log P(xi | x1, ..., xi−1). For continuous distributions, log probability refers to the log of a probability density or mass, e.g., the log-likelihood of observed data given a model.

Log-likelihood and log-posterior: The log-likelihood is the log of the joint probability (or density) of the observed data under a set of parameters. Maximizing the log-likelihood yields maximum likelihood estimates. The log-posterior combines the log-likelihood with a log prior: log p(θ | D) = log p(D | θ) + log p(θ) − log p(D).

Base and interpretation: The base of the logarithm changes the value only by a constant factor; natural logarithms (base e) are common in statistics, while base 2 is common in information theory. Since probabilities lie in [0, 1], their logarithms are non-positive, with log(1) = 0 and log(0) = −∞. The log-probability value itself is not a probability, but it can be exponentiated to recover a probability when needed.

Numerical considerations and applications: Using log probabilities improves numerical stability, especially in models with many components or long sequences, such as hidden Markov models, conditional random fields, and language models. Techniques like the log-sum-exp trick help compute sums of exponentials in the log domain.
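As an illustration of maximum likelihood via the log-likelihood, here is a small sketch; the Bernoulli coin-flip model, the data, and the grid search are hypothetical choices for demonstration (for this model the maximizer is known in closed form to be the sample mean k/n):

```python
import math

def bernoulli_log_likelihood(theta, data):
    """Log-likelihood of i.i.d. Bernoulli(theta) observations (0s and 1s):
    k * log(theta) + (n - k) * log(1 - theta), where k is the success count."""
    k = sum(data)
    n = len(data)
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

# Hypothetical coin flips: 6 heads out of 10.
data = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]

# Scan a grid of parameter values; the log-likelihood peaks at the
# sample mean k/n, which is the maximum likelihood estimate.
grid = [i / 100 for i in range(1, 100)]
mle = max(grid, key=lambda t: bernoulli_log_likelihood(t, data))
print(mle)  # 0.6, the sample mean 6/10
```

Working with the log-likelihood rather than the likelihood leaves the maximizer unchanged (the log is monotone) while turning the product over observations into a sum.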
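The log-sum-exp trick mentioned above can be sketched in a few lines of Python; this is a minimal version, whereas library implementations (e.g., scipy.special.logsumexp) handle more edge cases:

```python
import math

def log_sum_exp(log_probs):
    """Compute log(sum(exp(x) for x in log_probs)) stably.

    Shifting every term by the maximum keeps each exponent <= 0, so
    exp() never overflows and the largest term contributes exactly 1.
    """
    m = max(log_probs)
    return m + math.log(sum(math.exp(x - m) for x in log_probs))

# Naively exponentiating these log probabilities would underflow to 0,
# giving log(0) = -inf; the shifted computation recovers the answer.
log_probs = [-1000.0, -1000.0]
print(log_sum_exp(log_probs))  # -1000 + log(2), about -999.307
```

This operation is what lets hidden Markov model forward recursions and softmax normalizers be evaluated entirely in the log domain.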