Shannon information content
Shannon information content, also known as surprisal, is a concept introduced by Claude Shannon in his seminal work on information theory. It quantifies the amount of information conveyed by a single event or outcome of a random variable. The information content of an event grows as its probability shrinks: a less probable event carries more information because it is more surprising.
Mathematically, the Shannon information content of an event with probability P is defined as -log₂(P). The base-2 logarithm means the result is measured in bits; using the natural logarithm instead yields units of nats. For example, a fair coin flip (P = 0.5) carries exactly 1 bit of information, while an outcome with P = 1/8 carries 3 bits.
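The formula above can be sketched in a few lines of Python (the function name here is illustrative, not a standard library API):

```python
import math

def information_content(p: float) -> float:
    """Shannon information content (surprisal) of an event
    with probability p, measured in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip: p = 0.5 carries exactly 1 bit.
print(information_content(0.5))    # → 1.0
# A rarer event (p = 1/8) is more surprising: 3 bits.
print(information_content(0.125))  # → 3.0
```

Note that as p approaches 1, the surprisal approaches 0: a certain event conveys no information.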
This concept is fundamental to understanding various aspects of information theory, including entropy, channel capacity, and data compression. In particular, the entropy of a random variable is the expected value of the information content taken over all of its possible outcomes.
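As a minimal illustration of the link to entropy (the function name is my own, not a library API), entropy can be computed as the probability-weighted average of the surprisal of each outcome:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the expected information content
    over all outcomes, i.e. sum of p * (-log2 p)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin has the maximum entropy for two outcomes: 1 bit.
print(entropy([0.5, 0.5]))  # → 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))
```

The second call prints roughly 0.469 bits, reflecting that a heavily biased coin's outcomes are, on average, less surprising.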