Relative entropy
Relative entropy, also known as Kullback-Leibler divergence or information gain, is a measure of how one probability distribution differs from a second, reference probability distribution. It quantifies the "distance" between two probability distributions, although it is not a true metric, since it is not symmetric and does not satisfy the triangle inequality. For two discrete probability distributions P and Q defined over the same sample space, the relative entropy of P with respect to Q, written D_KL(P ‖ Q), is the sum over all possible outcomes i of P(i) times the logarithm of the ratio P(i)/Q(i):

    D_KL(P ‖ Q) = Σ_i P(i) log( P(i) / Q(i) )
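As a concrete illustration, the short Python sketch below evaluates this sum directly for two small discrete distributions; the particular probability values are assumptions chosen for the example, not taken from any specific source. It also evaluates the sum with the roles of P and Q swapped, which makes the asymmetry mentioned above visible.

    # Minimal sketch: relative entropy D_KL(P || Q) for two discrete
    # distributions over the same sample space. The values of p and q
    # below are illustrative assumptions.
    import math

    def kl_divergence(p, q):
        """Return D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)).

        Uses the convention 0 * log(0 / q) = 0 and requires Q(i) > 0
        wherever P(i) > 0.
        """
        total = 0.0
        for pi, qi in zip(p, q):
            if pi > 0:
                total += pi * math.log(pi / qi)
        return total

    p = [0.5, 0.3, 0.2]   # "true" distribution P
    q = [0.4, 0.4, 0.2]   # approximating distribution Q

    print(kl_divergence(p, q))  # D_KL(P || Q)
    print(kl_divergence(q, p))  # D_KL(Q || P) -- a different value, showing the asymmetry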
In simpler terms, relative entropy tells us how much information is lost, on average, when we use distribution Q to approximate the true distribution P.
The concept can be extended to continuous probability distributions, where the summation is replaced by an integral over the corresponding probability density functions.
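A minimal sketch of the continuous case follows, assuming two Gaussian densities chosen purely for illustration: it approximates the integral of p(x) log(p(x)/q(x)) with a Riemann sum and compares the result against the known closed-form expression for the relative entropy between two univariate normal distributions.

    # Minimal sketch of the continuous case:
    # D_KL(P || Q) = integral of p(x) * log(p(x) / q(x)) dx,
    # approximated by a Riemann sum for two Gaussian densities whose
    # parameters are illustrative assumptions.
    import numpy as np

    def gaussian_pdf(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    x = np.linspace(-10, 10, 200001)        # integration grid
    dx = x[1] - x[0]

    p = gaussian_pdf(x, mu=0.0, sigma=1.0)  # P = N(0, 1)
    q = gaussian_pdf(x, mu=1.0, sigma=2.0)  # Q = N(1, 4)

    kl_numeric = np.sum(p * np.log(p / q)) * dx

    # Closed form for two univariate Gaussians, for comparison:
    # log(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 * s2^2) - 1/2
    kl_exact = np.log(2.0 / 1.0) + (1.0 + (0.0 - 1.0) ** 2) / (2 * 2.0 ** 2) - 0.5

    print(kl_numeric, kl_exact)  # the two values agree closely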