Probability divergence
Probability divergence refers to a class of mathematical measures used to quantify the difference between two probability distributions. These measures are fundamental in various fields, including machine learning, statistics, and information theory, for tasks such as model comparison, clustering, and anomaly detection. Unlike distance metrics, probability divergences are not necessarily symmetric (the divergence from distribution P to distribution Q may not be the same as from Q to P) and do not always satisfy the triangle inequality.
A common example of a probability divergence is the Kullback–Leibler divergence (KL divergence), often denoted as $D_{\mathrm{KL}}(P \parallel Q)$. For discrete distributions $P$ and $Q$ defined over the same sample space $\mathcal{X}$, it is given by

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)},$$

which is well defined whenever $Q(x) > 0$ for every $x$ with $P(x) > 0$.
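As a concrete illustration, the following minimal Python sketch computes the KL divergence between two discrete distributions and shows that the two directions generally differ, reflecting the asymmetry noted above. The helper `kl_divergence` and the example probability vectors are illustrative assumptions, not part of the original text:

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as arrays.

    Assumes p and q are valid probability vectors over the same support,
    and that q(x) > 0 wherever p(x) > 0 (otherwise the divergence is infinite).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]

# The two directions disagree, so KL divergence is not a distance metric.
print(kl_divergence(p, q))  # D_KL(P || Q) ~ 0.232
print(kl_divergence(q, p))  # D_KL(Q || P) ~ 0.315
```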