Ristiinentropiaa
Ristiinentropiaa is a Finnish term that translates to "cross-entropy" in English. It is a fundamental concept in information theory and machine learning. Cross-entropy measures the dissimilarity between two probability distributions: it quantifies the average number of bits (or nats) needed to encode events drawn from one distribution when using a code optimized for another.
In machine learning, cross-entropy is commonly used as a loss function. When training a classification model, the loss compares the true label distribution with the model's predicted probabilities; minimizing it pushes the predictions toward the true labels.
The formula for cross-entropy for discrete probability distributions P and Q is given by H(P, Q) = -Σ_x P(x) log Q(x), where the sum runs over all possible outcomes x. The result is in bits when the logarithm has base 2, or in nats for the natural logarithm.
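The discrete formula can be sketched directly in a few lines of Python; the distributions below are illustrative examples, and terms with P(x) = 0 are skipped by convention (0 · log 0 is taken to be 0):

```python
import math

def cross_entropy(p, q):
    """H(P, Q) = -sum_x P(x) * log(Q(x)) over a shared discrete outcome space."""
    return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)

# Example: a one-hot "true" distribution P and a model's predicted distribution Q.
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]
print(cross_entropy(p, q))  # -log(0.7) ≈ 0.357 nats
```

With a one-hot P, the sum collapses to -log Q(correct class), which is exactly the per-example loss used for classification.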
Cross-entropy is closely related to Kullback-Leibler (KL) divergence, another measure of the difference between probability distributions: H(P, Q) = H(P) + D_KL(P ‖ Q), where H(P) is the entropy of P. Since H(P) does not depend on Q, minimizing cross-entropy with respect to Q is equivalent to minimizing the KL divergence from P to Q.