JSD
JSD is an acronym that most commonly refers to the Jensen-Shannon divergence in information theory and statistics. The Jensen-Shannon divergence (JSD) is a method for measuring the similarity between two probability distributions.
For discrete distributions P and Q over the same domain, with mixture M = (P+Q)/2, JSD(P||Q) = (1/2) KL(P||M) + (1/2) KL(Q||M), where KL denotes the Kullback-Leibler divergence.
Because each KL term compares a distribution to the averaged mixture M rather than directly to the other distribution, JSD remains finite even when one distribution assigns zero probability to an outcome the other does not, making it less sensitive to zero probabilities than KL divergence. It is symmetric in P and Q, and when computed with base-2 logarithms its value lies between 0 and 1.
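The definition above can be sketched directly in code. This is a minimal illustration (the function names `kl` and `jsd` are chosen here for clarity, not taken from any library), using base-2 logarithms so the result is measured in bits:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence in bits; terms with p_i == 0 contribute 0
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen-Shannon divergence: average KL of each distribution to the
    # mixture M = (P+Q)/2
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two non-overlapping distributions: JSD reaches its maximum of 1 bit,
# whereas KL(P||Q) would be infinite.
print(jsd([1.0, 0.0], [0.0, 1.0]))  # → 1.0
```

Note that the mixture M is nonzero wherever either input is, which is why the zero-probability case that breaks KL divergence stays well defined here.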
JSD also serves as an acronym in other domains, including computing.