Cross-entropy
Cross-entropy (Danish: krydsentropi) is a measure used in information theory and machine learning to quantify the difference between two probability distributions. It is a fundamental concept in statistics, data compression, and predictive modeling.
In information theory, cross-entropy measures the inefficiency of encoding a random variable using an estimated distribution rather than the true one: it equals the average number of bits needed to encode outcomes drawn from the true distribution when the code is optimized for the estimate.
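As an illustration (the four-symbol alphabet and the specific distributions below are assumptions chosen for the example, not from the original text), the following Python sketch compares the entropy of a true distribution with the cross-entropy under a mismatched model; the gap between the two is the coding overhead, known as the Kullback–Leibler divergence.

```python
import numpy as np

# True source distribution p over four symbols, and a mismatched model q.
p = np.array([0.5, 0.25, 0.125, 0.125])
q = np.array([0.25, 0.25, 0.25, 0.25])

# Entropy of p: the minimum achievable average code length, in bits.
entropy = -np.sum(p * np.log2(p))

# Cross-entropy H(p, q): the average code length when the code is
# optimized for q but symbols are actually drawn from p.
cross_entropy = -np.sum(p * np.log2(q))

print(f"H(p)     = {entropy:.3f} bits")                   # 1.750
print(f"H(p, q)  = {cross_entropy:.3f} bits")             # 2.000
print(f"overhead = {cross_entropy - entropy:.3f} bits")   # 0.250 (KL divergence)
```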
In machine learning, cross-entropy is commonly used as a loss function in classification tasks. It measures how closely the model's predicted class probabilities match the true labels, penalizing predictions that assign low probability to the correct class.
The formula for cross-entropy loss between a true distribution \( p \) and an estimated distribution \( q \) is
\[ H(p, q) = -\sum_{i} p(i) \log q(i) \]
where \( p(i) \) is the true probability of class \( i \) and \( q(i) \) is the predicted probability of class \( i \).
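A minimal Python sketch of this formula follows (the helper name cross_entropy_loss and the example probabilities are illustrative assumptions, not from the original text); with a one-hot true distribution, the sum collapses to the negative log-probability assigned to the correct class.

```python
import numpy as np

def cross_entropy_loss(p, q, eps=1e-12):
    """H(p, q) = -sum_i p(i) * log q(i), using the natural log.

    q is clipped away from 0 so that log(0) never occurs.
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# One-hot true distribution for a 3-class problem: the true class is index 1.
p = np.array([0.0, 1.0, 0.0])

# Two hypothetical model outputs.
confident_right = np.array([0.05, 0.90, 0.05])
confident_wrong = np.array([0.90, 0.05, 0.05])

print(cross_entropy_loss(p, confident_right))  # ~0.105 (= -log 0.90)
print(cross_entropy_loss(p, confident_wrong))  # ~2.996 (= -log 0.05)
```

The clipping step is a common practical safeguard: a model that assigns exactly zero probability to the true class would otherwise produce an infinite loss.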
Cross-entropy is a versatile and widely used metric because it provides a clear, differentiable measure of the discrepancy between predicted and true distributions, which makes it well suited to gradient-based training.