KL divergence
KL divergence, also known as Kullback-Leibler divergence or relative entropy, is a measure of how one probability distribution differs from a second, reference probability distribution. It is not a true metric because it is not symmetric: the KL divergence from distribution P to distribution Q is not, in general, equal to the KL divergence from Q to P. It is nonetheless widely used in information theory, statistics, and machine learning.
Mathematically, the KL divergence of a distribution Q from a distribution P is defined, for discrete distributions over a common support, as

DKL(P || Q) = Σx P(x) log( P(x) / Q(x) ),

where terms with P(x) = 0 contribute zero by convention; for continuous distributions the sum is replaced by an integral over the corresponding densities.
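As a minimal sketch of the discrete case, the definition can be evaluated directly with NumPy. The function name kl_divergence and the two example distributions below are illustrative choices, not part of the definition itself; the two print statements also show the asymmetry noted above, since swapping the arguments generally changes the value.

import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence DKL(P || Q) in nats (natural logarithm)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with P(x) = 0 contribute nothing by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Illustrative distributions over three outcomes (assumed values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # DKL(P || Q)
print(kl_divergence(q, p))  # DKL(Q || P): generally a different value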
The KL divergence quantifies the information lost when Q is used to approximate P. It can be interpreted as the expected number of extra bits (with base-2 logarithms) or nats (with natural logarithms) needed to encode samples drawn from P using a code optimized for Q rather than one optimized for P.
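To connect the definition with this coding interpretation, the same divergence can be expressed in bits by dividing the natural-log value by ln 2. A quick check, assuming SciPy is available, uses scipy.special.rel_entr (elementwise p * log(p / q)) on the same illustrative distributions as above.

import numpy as np
from scipy.special import rel_entr  # elementwise p * log(p / q)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

nats = rel_entr(p, q).sum()  # DKL(P || Q) using natural logarithms
bits = nats / np.log(2)      # the same divergence expressed in bits
print(nats, bits)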