EM algorithm
The EM algorithm, short for Expectation-Maximization, is an iterative method for maximum likelihood or maximum a posteriori estimation in statistical models that involve latent variables or missing data. The method alternates between two steps: the Expectation (E) step and the Maximization (M) step. In the E-step, given the current parameter values θ^(t), it computes the expected value of the complete-data log-likelihood log p(X, Z | θ) with respect to the conditional distribution p(Z | X, θ^(t)); this yields a function Q(θ | θ^(t)). In the M-step, it maximizes Q with respect to θ to obtain the updated parameters θ^(t+1).
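As a concrete illustration, the sketch below applies these two steps to a two-component, one-dimensional Gaussian mixture, a setting where both steps have closed forms. The initialization and stopping rule are assumptions made for the example, not part of the algorithm itself.

```python
# A minimal sketch of the EM loop for a two-component 1D Gaussian mixture,
# assuming NumPy; initialization and stopping rule are illustrative choices.
import numpy as np

def em_gaussian_mixture_1d(x, n_iter=100, tol=1e-6):
    # Initial parameter values theta^(0): mixing weights, means, variances.
    weights = np.array([0.5, 0.5])
    means = np.array([x.min(), x.max()])
    variances = np.array([x.var(), x.var()])
    prev_ll = -np.inf

    for _ in range(n_iter):
        # E-step: responsibilities resp[n, k] = p(z_n = k | x_n, theta^(t)),
        # the conditional distribution used to take the expectation in Q.
        dens = np.stack(
            [weights[k]
             * np.exp(-0.5 * (x - means[k]) ** 2 / variances[k])
             / np.sqrt(2.0 * np.pi * variances[k])
             for k in range(2)],
            axis=1,
        )
        resp = dens / dens.sum(axis=1, keepdims=True)

        # M-step: closed-form maximizers of Q(theta | theta^(t)).
        nk = resp.sum(axis=0)
        weights = nk / len(x)
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk

        # Monitor the observed-data log-likelihood; across iterations it
        # is non-decreasing, so a small change signals convergence.
        ll = np.log(dens.sum(axis=1)).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll

    return weights, means, variances
```

Each pass through the loop performs one θ^(t) → θ^(t+1) update, and the monitored log-likelihood does not decrease from one iteration to the next.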
The EM algorithm is grounded in the idea of treating the latent variables Z as missing data and working with the complete-data log-likelihood log p(X, Z | θ), which is typically easier to handle than the observed-data log-likelihood log p(X | θ). A standard property of the algorithm is that each iteration does not decrease the observed-data log-likelihood, so the procedure improves the likelihood monotonically.
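Restated in display form, using the same symbols as above, the Q function and the ascent property that underlies this guarantee read:

```latex
% Q is the expected complete-data log-likelihood under p(Z | X, theta^(t))
Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \sim p(Z \mid X,\, \theta^{(t)})}\big[\log p(X, Z \mid \theta)\big]

% Ascent property: any theta that improves Q also improves the observed-data log-likelihood
Q(\theta \mid \theta^{(t)}) \ge Q(\theta^{(t)} \mid \theta^{(t)})
\;\Longrightarrow\;
\log p(X \mid \theta) \ge \log p(X \mid \theta^{(t)})
```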
EM requires that the complete-data likelihood be tractable to compute and maximize. When the E-step involves an expectation that cannot be evaluated in closed form, it is typically approximated, for example by sampling from p(Z | X, θ^(t)).
Extensions of EM include hard EM (classification EM), stochastic EM, online EM, and Monte Carlo EM. The generalized EM (GEM) algorithm relaxes the M-step, requiring only that Q(θ | θ^(t)) be increased rather than fully maximized, which is enough to preserve the monotone improvement of the likelihood.
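To illustrate how two of these variants modify only the E-step of the mixture-model example above, the sketch below derives hard assignments (hard EM) and sampled assignments (Monte Carlo EM) from a given responsibility matrix; the function names are illustrative.

```python
# Illustrative alternative E-steps for a mixture model, operating on the
# responsibility matrix resp[n, k] = p(z_n = k | x_n, theta^(t)) computed as above.
import numpy as np

def hard_e_step(resp):
    # Hard (classification) EM: replace each responsibility row with a one-hot
    # vector at its argmax, i.e. commit to a single assignment per data point.
    hard = np.zeros_like(resp)
    hard[np.arange(resp.shape[0]), resp.argmax(axis=1)] = 1.0
    return hard

def monte_carlo_e_step(resp, n_samples=50, rng=None):
    # Monte Carlo EM: approximate the expectation over Z by averaging one-hot
    # assignments sampled from p(Z | X, theta^(t)) rather than using it exactly.
    rng = np.random.default_rng(0) if rng is None else rng
    draws = np.stack([rng.multinomial(1, p, size=n_samples) for p in resp])
    return draws.mean(axis=1)  # shape (N, K): sampled stand-in for exact responsibilities
```

In both cases the M-step is unchanged: it consumes the hard or sampled assignment matrix in place of the exact responsibilities.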