
ECME

ECME stands for the Expectation Conditional Maximization Either algorithm, a class of iterative methods for maximum likelihood estimation with incomplete data. It was introduced by Liu and Rubin in 1994 as an extension of the EM algorithm, designed to increase efficiency by allowing some conditional maximization steps to maximize the observed-data likelihood directly rather than the expected complete-data log-likelihood.

Like EM, ECME begins with an expectation (E) step that computes the expected complete-data log-likelihood (the Q function) given the current parameter values. It then performs conditional maximization (CM) steps that update blocks of parameters in turn. What distinguishes ECME is that some CM steps are replaced by direct maximization of the observed-data log-likelihood, while the remaining steps maximize the Q function as in ECM. Each CM step therefore maximizes either the Q function or the observed-data log-likelihood, which is the "Either" of the name.
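
As a concrete illustration, the sketch below applies one common ECME scheme to a univariate Student-t model with unknown location, scale, and degrees of freedom, a frequently cited example for the algorithm. The first CM step has a closed-form update derived from the expected complete-data log-likelihood; the second CM step is an "Either" step that maximizes the observed-data log-likelihood over the degrees of freedom directly. This is a minimal sketch, not the article's own material: the function name, starting values, and optimizer settings are illustrative assumptions, and it assumes NumPy and SciPy are available.

    import numpy as np
    from scipy import optimize, stats

    def ecme_student_t(y, n_iter=200, tol=1e-8):
        """Illustrative ECME iteration for a univariate Student-t model.

        E-step: expected latent precisions given the data and current parameters.
        CM-step 1: closed-form update of (mu, sigma2) from the expected
                   complete-data log-likelihood.
        CM-step 2 ("Either" step): numerical maximization of the observed-data
                   log-likelihood over the degrees of freedom nu.
        Returns (mu, sigma2, nu) and the observed-data log-likelihood trace.
        """
        y = np.asarray(y, dtype=float)
        n = y.size
        mu, sigma2, nu = y.mean(), y.var(), 10.0  # crude starting values (assumed)

        def obs_loglik(mu, sigma2, nu):
            return stats.t.logpdf(y, df=nu, loc=mu, scale=np.sqrt(sigma2)).sum()

        trace = [obs_loglik(mu, sigma2, nu)]
        for _ in range(n_iter):
            # E-step: E[tau_i | y_i, theta] under the normal scale-mixture
            # representation of the t distribution
            delta = (y - mu) ** 2 / sigma2
            w = (nu + 1.0) / (nu + delta)

            # CM-step 1: weighted closed-form updates of mu and sigma2
            mu = np.sum(w * y) / np.sum(w)
            sigma2 = np.sum(w * (y - mu) ** 2) / n

            # CM-step 2 ("Either" step): maximize the observed-data
            # log-likelihood over nu with (mu, sigma2) held fixed
            res = optimize.minimize_scalar(
                lambda v: -obs_loglik(mu, sigma2, v),
                bounds=(0.1, 200.0), method="bounded")
            nu = res.x

            trace.append(obs_loglik(mu, sigma2, nu))
            if abs(trace[-1] - trace[-2]) < tol:
                break
        return mu, sigma2, nu, np.array(trace)

    # Example usage with simulated heavy-tailed data
    rng = np.random.default_rng(0)
    data = stats.t.rvs(df=4, loc=2.0, scale=1.5, size=500, random_state=rng)
    mu_hat, s2_hat, nu_hat, loglik_trace = ecme_student_t(data)
    print(mu_hat, np.sqrt(s2_hat), nu_hat)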

ECME is used in a variety of incomplete-data problems, including finite mixture models, factor analysis, censoring and survival models, and other latent-variable frameworks. It is particularly advantageous when certain updates have closed-form solutions or when direct maximization of the observed-data log-likelihood accelerates convergence compared with standard EM.

Convergence properties of ECME parallel those of EM: under regularity conditions the observed-data likelihood is guaranteed not to decrease at each iteration, and the algorithm converges to a stationary point, which need not be the global maximum. However, like EM, ECME can be sensitive to initialization and problem structure, and its convergence speed is problem-dependent.
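
This monotonicity can be checked empirically by recording the observed-data log-likelihood at each iteration; the short check below continues the illustrative Student-t sketch above and assumes its returned loglik_trace is in scope.

    # Continuing the illustrative example: the observed-data log-likelihood
    # should be non-decreasing across iterations, up to numerical tolerance.
    diffs = np.diff(loglik_trace)
    assert np.all(diffs >= -1e-8), "observed-data log-likelihood decreased"
    print(f"iterations: {loglik_trace.size - 1}, final log-likelihood: {loglik_trace[-1]:.3f}")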

ECME is closely related to ECM (Expectation Conditional Maximization) and is often viewed as a more flexible variant that can exploit direct likelihood maximization to improve performance.
