Entropy
Entropy is a measure of the number of microscopic configurations corresponding to a macroscopic state, or more generally a measure of uncertainty or disorder in a system. The concept originated in 19th-century thermodynamics with Clausius and gained a statistical interpretation through Boltzmann, linking macroscopic observables to microscopic arrangements.
In thermodynamics, entropy S is a state function of a system's macroscopic variables. For a reversible transfer of heat δQ_rev at absolute temperature T, the change in entropy is dS = δQ_rev / T.
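As a worked illustration of dS = δQ_rev / T, the following Python sketch (function and variable names are illustrative, not from any standard library) integrates the relation for reversibly heating a body of constant heat capacity C from T1 to T2, giving ΔS = C ln(T2/T1).

    import math

    def entropy_change_constant_c(heat_capacity, t_initial, t_final):
        # Integrate dS = dQ_rev / T with dQ_rev = C dT:
        # delta_S = C * ln(T_final / T_initial)
        return heat_capacity * math.log(t_final / t_initial)

    # Example: reversibly heating ~1 kg of water (C ~ 4184 J/K)
    # from 300 K to 350 K yields roughly 645 J/K of entropy change.
    print(entropy_change_constant_c(4184.0, 300.0, 350.0))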
Statistical mechanics provides microscopic foundations for entropy. Boltzmann's formula S = k_B ln W relates entropy to the number W of microscopic arrangements (microstates) consistent with a given macrostate, where k_B is the Boltzmann constant.
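A minimal sketch of S = k_B ln W, assuming a toy two-state model: for N independent spins with n pointing up, the multiplicity is the binomial coefficient W = C(N, n). The function name below is illustrative.

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(n_total, n_up):
        # Multiplicity of a two-state (spin) system: W = C(N, n)
        w = math.comb(n_total, n_up)
        # Boltzmann's formula: S = k_B ln W
        return K_B * math.log(w)

    # Entropy is largest when up and down spins are split evenly,
    # since that macrostate has the most microstates.
    print(boltzmann_entropy(100, 50))
    print(boltzmann_entropy(100, 10))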
In information theory, entropy measures uncertainty or information content. Shannon entropy H(X) = -∑ p(x) log p(x) quantifies the average information gained by observing a random variable X; with logarithms to base 2 it is measured in bits.
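The sketch below (illustrative names) computes H(X) in bits for a discrete distribution; a fair coin yields exactly 1 bit, while a biased coin yields less, reflecting reduced uncertainty.

    import math

    def shannon_entropy(probabilities):
        # H(X) = -sum over x of p(x) * log2 p(x);
        # terms with p = 0 contribute nothing by convention.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits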
Quantum generalizations include von Neumann entropy, S(ρ) = -Tr(ρ log ρ), where ρ is the density matrix describing a quantum state. Entropy production quantifies the irreversibility of a process: by the second law of thermodynamics, the total entropy of an isolated system never decreases.
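Since ρ is Hermitian, S(ρ) = -Tr(ρ log ρ) can be evaluated by diagonalizing ρ and summing -λ ln λ over its eigenvalues. The NumPy sketch below (assuming NumPy is available; names are illustrative) gives ln 2 for a maximally mixed qubit and 0 for a pure state.

    import numpy as np

    def von_neumann_entropy(rho):
        # S(rho) = -Tr(rho log rho) = -sum of lambda * ln(lambda)
        # over the eigenvalues lambda of rho.
        eigenvalues = np.linalg.eigvalsh(rho)      # rho is Hermitian
        eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop zeros
        return -np.sum(eigenvalues * np.log(eigenvalues))

    # Maximally mixed qubit: S = ln 2 (~0.693 nats)
    rho_mixed = np.eye(2) / 2
    print(von_neumann_entropy(rho_mixed))

    # Pure state: S = 0
    rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])
    print(von_neumann_entropy(rho_pure))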