Informatsioonihulk
Informatsioonihulk (Estonian for "amount of information"), also known as information content or entropy, is a measure of the uncertainty associated with a random variable: it quantifies the average amount of information produced when the value of the variable is revealed. The concept is fundamental in information theory and has applications in fields such as data compression, cryptography, and machine learning.
The informatsioonihulk of a discrete random variable X, denoted H(X), is defined as the expected value of the self-information:

H(X) = -Σₓ P(x) log₂ P(x)

where P(x) is the probability that X takes the value x, and the base-2 logarithm gives the result in bits (other bases yield units such as nats or hartleys). Terms with P(x) = 0 are taken to contribute zero, following the convention 0 log₂ 0 = 0.
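The definition can be computed directly from a probability distribution. The following is a minimal Python sketch; the `entropy` helper is written here for illustration and is not taken from any particular library:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits for a discrete distribution.

    Terms with P(x) = 0 contribute nothing, matching the
    convention 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # biased coin: about 0.469 bits
```

A fair coin yields exactly one bit per flip, while a biased coin is more predictable and therefore carries less information per outcome.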
The informatsioonihulk has several important properties. It is always non-negative, with H(X) = 0 if and only if X is deterministic, i.e. takes a single value with probability 1. It is maximized when X is uniformly distributed over n outcomes, in which case H(X) = log₂ n.
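These properties can be checked numerically. The sketch below reuses a small self-contained `entropy` helper (an illustration, not a library function):

```python
from math import log2, isclose

def entropy(probs):
    # H(X) = -sum P(x) * log2 P(x), skipping zero-probability terms
    return -sum(p * log2(p) for p in probs if p > 0)

# Deterministic variable: no uncertainty, so H(X) = 0
assert entropy([1.0, 0.0, 0.0]) == 0.0

# Uniform distribution over 4 outcomes: maximal entropy log2(4) = 2 bits
assert isclose(entropy([0.25] * 4), log2(4))

# Any distribution: entropy is non-negative
assert entropy([0.7, 0.2, 0.1]) >= 0.0
```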
In the context of data compression, informatsioonihulk provides a lower bound on the average number of bits needed to encode the outcomes of X losslessly: by Shannon's source coding theorem, no uniquely decodable code can achieve an expected codeword length below H(X) bits per symbol.
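This bound can be illustrated by comparing H(X) against the average codeword length of a Huffman code, which is known to satisfy H(X) ≤ L < H(X) + 1. The Huffman-length helper below is a sketch written for this example, not a library routine:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given distribution."""
    # Heap entries: (probability, tiebreak counter, member symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol inside them.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = -sum(p * log2(p) for p in probs)                 # entropy: 1.75 bits
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
assert H <= L < H + 1                                # source coding bound
```

For this dyadic distribution (all probabilities are powers of 1/2) the Huffman code meets the entropy bound exactly, with L = H = 1.75 bits per symbol.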