Shannon entropy
Shannon entropy, also known as information entropy, is a fundamental concept in information theory and probability theory. It quantifies the uncertainty or randomness associated with a random variable. Introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication," entropy provides a measure of the average amount of information produced by a stochastic source of data.
In simpler terms, entropy can be thought of as the average number of bits required to encode the outcomes of a random variable using an optimal code: highly predictable sources need few bits on average, while highly uncertain sources need many.
Mathematically, for a discrete random variable X with possible outcomes x1, x2, ..., xn and corresponding probabilities p(x1), p(x2), ..., p(xn), the entropy is defined as H(X) = −Σ p(xi) log2 p(xi), where the sum runs over all n outcomes and the base-2 logarithm gives the result in bits. Outcomes with probability zero contribute nothing to the sum, since p log2 p tends to 0 as p approaches 0.
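As a minimal sketch of this definition, the following Python snippet computes H(X) directly from a list of probabilities; the function name shannon_entropy and the example distributions are illustrative assumptions, not part of any standard library.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) in bits for a discrete probability distribution.

    Terms with probability 0 are skipped, matching the convention
    that 0 * log2(0) is treated as 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))    # ~0.469

# A certain outcome carries no information.
print(shannon_entropy([1.0]))         # 0.0
```

The fair coin reaching exactly 1 bit and the deterministic case reaching 0 illustrate the general bounds: entropy is maximized by the uniform distribution and minimized when one outcome has probability 1.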
The concept of entropy has wide-ranging applications, including data compression, cryptography, statistical inference, and machine learning.