Quasi-likelihoods

Quasi-likelihoods, or quasi-likelihood methods, are inference tools that provide likelihood-like procedures without requiring a full probability model for the data. Introduced by Wedderburn in 1974, the approach defines a quasi-likelihood function Q(y; mu) through its derivative with respect to the mean: dQ/dmu = (y − mu) / V(mu), where mu = E[Y] and V(mu) is a specified variance function. Choosing V(mu) determines Q up to an additive constant, and Q need not correspond to the log-likelihood of any actual distribution.
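
Integrating the defining relation gives an explicit construction,

    Q(y; mu) = integral from y to mu of (y − t) / V(t) dt,

which shows directly that Q is pinned down, up to terms depending on y alone, once V(mu) is chosen.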

In regression settings, one specifies a mean-variance relationship Var(Y) = phi V(mu), where phi is a dispersion parameter, and links mu to a linear predictor via mu = g^{-1}(X beta). Parameter estimates are obtained by solving the quasi-likelihood estimating equations, often through iteratively reweighted least squares (IRLS) or generalized estimating equations (GEE) with a working variance function; a sketch of the IRLS route appears below. Under a correctly specified mean model, quasi-likelihood methods yield consistent estimates of the mean parameters even when the variance function is only a working approximation, which makes them robust to certain distributional misspecifications.
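
As an illustration of the IRLS route, here is a minimal Python/NumPy sketch for the quasi-Poisson case (log link, working variance V(mu) = mu). The function name quasi_poisson_irls and the convention that the first column of X is an intercept are assumptions of this sketch, not a reference implementation; phi cancels from the estimating equations, so it is recovered afterwards from the Pearson statistic.

    import numpy as np

    def quasi_poisson_irls(X, y, tol=1e-8, max_iter=100):
        # Solve the quasi-score equations sum_i x_i (y_i - mu_i) = 0
        # (log link, working variance V(mu) = mu) by iteratively
        # reweighted least squares; phi drops out of the equations.
        n, p = X.shape
        beta = np.zeros(p)
        beta[0] = np.log(y.mean() + 1e-8)      # crude start; column 0 = intercept
        for _ in range(max_iter):
            eta = X @ beta
            mu = np.exp(eta)                   # inverse log link
            W = mu                             # IRLS weight: (dmu/deta)^2 / V(mu) = mu
            z = eta + (y - mu) / mu            # working response
            beta_new = np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (W * z))
            if np.max(np.abs(beta_new - beta)) < tol:
                beta = beta_new
                break
            beta = beta_new
        mu = np.exp(X @ beta)
        phi = np.sum((y - mu) ** 2 / mu) / (n - p)   # Pearson dispersion estimate
        return beta, phi

Because phi merely rescales the estimating equations, the point estimates coincide with those of ordinary Poisson regression; the dispersion matters only for inference, inflating standard errors by a factor of sqrt(phi).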
The relation to true likelihoods, and hence to inference, is nuanced: a quasi-likelihood is not in general a true likelihood, so standard likelihood-based tests and information criteria may not be valid unless the variance function corresponds to an actual exponential-family model or robust adjustments are used. Despite this, quasi-likelihood techniques underpin many generalized linear model approaches, especially for overdispersed or heteroscedastic data, and they form the basis of the GEE framework, which uses working variance functions to obtain efficient estimates together with robust standard errors.
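
To make the robust-standard-error point concrete, here is a hypothetical helper (sandwich_se, continuing the NumPy sketch above) that computes the usual A^{-1} B A^{-1} sandwich covariance for the quasi-Poisson fit. The resulting standard errors remain valid even if the working variance V(mu) = mu is wrong, which is the core GEE rationale.

    def sandwich_se(X, y, beta):
        # Sandwich covariance A^{-1} B A^{-1} for the log-link fit:
        #   A = X' diag(mu) X           (negative derivative of the quasi-score)
        #   B = X' diag((y - mu)^2) X   (empirical variance of the score terms)
        mu = np.exp(X @ beta)
        A_inv = np.linalg.inv(X.T @ (X * mu[:, None]))
        B = X.T @ (X * ((y - mu) ** 2)[:, None])
        cov = A_inv @ B @ A_inv
        return np.sqrt(np.diag(cov))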

Examples include the Poisson and binomial cases, with the familiar variance functions V(mu) = mu and V(mu) = mu(1 − mu); in the normal case with constant variance, V(mu) = 1, the quasi-likelihood reduces to the Gaussian log-likelihood up to an additive constant.
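
Carrying out the integral Q(y; mu) = integral from y to mu of (y − t) / V(t) dt makes these examples explicit; up to terms in y alone:

    V(mu) = mu:           Q(y; mu) = y log(mu) − mu                       (Poisson kernel)
    V(mu) = mu(1 − mu):   Q(y; mu) = y log(mu / (1 − mu)) + log(1 − mu)   (binomial kernel, per trial)
    V(mu) = 1:            Q(y; mu) = −(y − mu)^2 / 2                      (Gaussian kernel)

so the normal-case reduction stated above is immediate.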