LikelihoodRatio

The likelihood ratio is a statistic used in statistical inference to compare how well two competing hypotheses explain observed data. It is defined from the likelihood function, which measures the probability (or probability density) of the data given a set of model parameters. Given data x and a parameter value theta, the likelihood is L(theta; x). The likelihood ratio compares two scenarios, typically two hypotheses or two parameter values, via the ratio of their likelihoods. In hypothesis testing, the generalized form is Lambda(x) = sup{L(theta; x): theta in Theta0} / sup{L(theta; x): theta in Theta}, where the numerator is the maximum likelihood under the null hypothesis (theta restricted to Theta0) and the denominator is the maximum likelihood over the full parameter space Theta. A related statistic is the log-likelihood ratio, often used in the transformed form -2 log Lambda(x).

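As a concrete illustration of the definition, here is a minimal sketch in plain Python (standard library only; the Bernoulli model, the data, and the two candidate values of theta are all invented for illustration). It evaluates L(theta; x) at each candidate value and forms the ratio.

    import math

    def bernoulli_likelihood(theta, x):
        # L(theta; x) for i.i.d. Bernoulli(theta) observations x.
        return math.prod(theta if xi == 1 else 1.0 - theta for xi in x)

    # Made-up data: 10 trials, 7 successes.
    x = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]

    theta0, theta1 = 0.5, 0.7  # two competing parameter values
    ratio = bernoulli_likelihood(theta0, x) / bernoulli_likelihood(theta1, x)
    print(ratio)  # < 1 here: the data are more likely under theta1 than theta0
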
In the framework of hypothesis testing, the likelihood ratio test (LRT) is used to decide whether to reject the null hypothesis. For simple hypotheses, the Neyman-Pearson lemma shows that the likelihood ratio test is the most powerful test for distinguishing the two hypotheses. For composite hypotheses, the generalized likelihood ratio test uses the maximum likelihood under the null divided by the maximum likelihood overall. Under regularity conditions and in large samples, -2 log Lambda(x) approximately follows a chi-square distribution, with degrees of freedom equal to the number of free parameters constrained under the null (Wilks’ theorem).
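
Here is a minimal sketch of the generalized test and its chi-square approximation, under assumptions chosen purely for illustration: i.i.d. Normal(mu, 1) data, H0: mu = 0 against an unrestricted mu, and made-up sample values. Under the alternative the MLE is the sample mean, so -2 log Lambda reduces to n * xbar^2, which is referred to a chi-square distribution with one degree of freedom (one parameter is constrained under the null).

    import math

    def log_lik_normal(mu, x):
        # Log-likelihood of i.i.d. Normal(mu, 1) data, constants included.
        return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (xi - mu) ** 2
                   for xi in x)

    x = [0.8, -0.3, 1.2, 0.5, 0.9, 1.4, -0.1, 0.7]  # made-up sample
    xbar = sum(x) / len(x)  # MLE of mu over the full parameter space

    # -2 log Lambda: null maximum (mu = 0) versus overall maximum (mu = xbar).
    stat = -2.0 * (log_lik_normal(0.0, x) - log_lik_normal(xbar, x))
    # In this model the statistic simplifies to len(x) * xbar**2.

    # Survival function of chi-square with 1 df, P(X > stat), via erfc.
    p_value = math.erfc(math.sqrt(stat / 2.0))
    print(stat, p_value)
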
Interpretation and caveats: a small likelihood ratio indicates the data are much more likely under the alternative than under the null, providing evidence against the null hypothesis. The method assumes the model is correctly specified and relies on asymptotic approximations; results can be sensitive to model misspecification or nuisance parameters. In Bayesian contexts, likelihood ratios relate to Bayes factors, which compare marginal likelihoods over parameter spaces.
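
To make the Bayes-factor connection concrete, here is a hedged sketch under assumptions chosen only for illustration: Bernoulli data, a point null p = 0.5, and a uniform Beta(1, 1) prior on p under the alternative. The alternative's marginal likelihood integrates the likelihood over the prior, which for this choice equals the Beta function B(k + 1, n - k + 1).

    import math

    n, k = 10, 7  # made-up counts: n trials, k successes

    # H0: point null p = 0.5; likelihood of the observed sequence.
    lik_h0 = 0.5 ** n

    # H1: marginal likelihood with a uniform prior on p,
    # integral over [0, 1] of p^k (1 - p)^(n - k) dp = B(k + 1, n - k + 1).
    marg_h1 = math.exp(math.lgamma(k + 1) + math.lgamma(n - k + 1)
                       - math.lgamma(n + 2))

    bayes_factor_01 = lik_h0 / marg_h1  # > 1 favors H0, < 1 favors H1
    print(bayes_factor_01)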