minKL
minKL is a term used in statistics and machine learning for optimization procedures that minimize the Kullback-Leibler (KL) divergence between probability distributions. It typically arises when an intractable target distribution must be approximated by a member of a more tractable family, or when model parameters are fitted to data through a divergence-based objective.
The KL divergence D_KL(P || Q) between two probability distributions P and Q over a common domain is defined as D_KL(P || Q) = Σ_x P(x) log(P(x) / Q(x)) in the discrete case, with the sum replaced by an integral over densities in the continuous case. It is non-negative and equals zero exactly when P = Q, but it is not symmetric: minimizing D_KL(P || Q) over Q and minimizing D_KL(Q || P) over Q generally produce different approximations.
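To make the definition concrete, here is a minimal Python sketch that evaluates the discrete KL divergence and demonstrates its asymmetry; the function name and example probability vectors are illustrative choices, not part of any particular library:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability vectors, in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                        # terms with P(x) = 0 contribute zero
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.0253 nats
print(kl_divergence(q, p))  # ~0.0258 nats: the divergence is not symmetric
```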
Common applications include variational inference, density estimation, and training certain generative models. In variational inference, a tractable approximating distribution q is fitted to an intractable posterior p by minimizing the reverse divergence D_KL(q || p); in practice this is done by maximizing the evidence lower bound (ELBO), since the log marginal likelihood equals the ELBO plus this KL term and is constant in q. Likewise, fitting a density model by maximum likelihood is equivalent to minimizing D_KL(p_data || p_model), where p_data is the empirical data distribution.
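As a toy illustration of a minKL procedure, the following sketch fits a categorical distribution, parameterized by softmax logits, to a fixed target by gradient descent on the reverse divergence D_KL(q || p); the target values, step size, and iteration count are arbitrary choices for this example, not a reference implementation:

```python
import numpy as np

def softmax(x):
    z = np.exp(x - x.max())             # subtract max for numerical stability
    return z / z.sum()

p = np.array([0.1, 0.6, 0.3])           # hypothetical target distribution
theta = np.zeros(3)                     # logits parameterizing the approximation q

for step in range(500):
    q = softmax(theta)
    f = np.log(q / p)                   # per-outcome log ratio log q(x) - log p(x)
    kl = float(np.dot(q, f))            # current value of D_KL(q || p)
    grad = q * (f - kl)                 # exact gradient of the KL w.r.t. the logits
    theta -= 0.5 * grad                 # plain gradient-descent step

print(softmax(theta))                   # approaches the target [0.1, 0.6, 0.3]
```

In this small discrete case the expectation under q is computed exactly; in realistic settings it is intractable, and the gradient is instead estimated by Monte Carlo sampling or the reparameterization trick.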
See also: Kullback-Leibler divergence, variational inference, ELBO, probability density estimation.