Entropy
Entropy is a property of a physical system that quantifies the number of microscopic configurations that correspond to its macroscopic state. In classical thermodynamics, entropy is a state function with units of energy per temperature (joules per kelvin) and is denoted by S. The second law of thermodynamics states that the entropy of an isolated system never decreases over time; spontaneous processes drive the system toward thermodynamic equilibrium, the state of maximum entropy.
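As a minimal numerical sketch of the Clausius definition dS = δQ_rev / T, the Python snippet below computes the entropy change for heat absorbed reversibly at a fixed temperature. The helper name entropy_change and the example inputs are illustrative choices, not taken from the text above.

```python
# Clausius entropy change for a reversible heat transfer: dS = dQ_rev / T.
# Illustrative values only: Q and T below are arbitrary example inputs.

def entropy_change(q_joules: float, temperature_kelvin: float) -> float:
    """Entropy change (J/K) when q_joules of heat is absorbed reversibly
    at a constant temperature_kelvin."""
    return q_joules / temperature_kelvin

if __name__ == "__main__":
    # 1000 J absorbed reversibly at 300 K -> about 3.33 J/K.
    print(entropy_change(1000.0, 300.0))
```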
In statistical mechanics, entropy relates to the multiplicity of microstates. Boltzmann's relation S = k_B ln W gives the entropy of a macrostate in terms of W, the number of microstates consistent with that macrostate, where k_B ≈ 1.380649 × 10^-23 J/K is the Boltzmann constant.
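A short Python sketch of Boltzmann's relation follows; the function name boltzmann_entropy and the toy two-state system are illustrative assumptions, not part of the article.

```python
import math

# Boltzmann's relation S = k_B * ln(W), with W the multiplicity
# (number of microstates) of the macrostate.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(multiplicity: float) -> float:
    """Entropy (J/K) of a macrostate with the given multiplicity W."""
    return K_B * math.log(multiplicity)

if __name__ == "__main__":
    # Toy example: 100 independent two-state particles have W = 2**100
    # equally likely microstates, giving S = k_B * 100 * ln(2).
    print(boltzmann_entropy(2**100))  # ~ 9.57e-22 J/K
```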
In information theory, entropy measures the uncertainty or average information content of a random variable. For a discrete random variable X taking values x_i with probabilities p(x_i), the Shannon entropy is H(X) = -Σ p(x_i) log2 p(x_i), measured in bits when the logarithm is taken base 2.
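To make the definition concrete, here is a small Python sketch of Shannon entropy for a finite distribution; shannon_entropy is a hypothetical helper name and the example distributions are arbitrary.

```python
import math

# Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits, for a discrete
# distribution given as a list of probabilities.

def shannon_entropy(probs: list[float]) -> float:
    """Entropy in bits; terms with p == 0 contribute nothing by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

if __name__ == "__main__":
    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
    print(shannon_entropy([0.25] * 4))   # fair four-sided die: 2.0 bits
```

A uniform distribution maximizes the entropy, reflecting maximum uncertainty about the outcome, while a distribution concentrated on one outcome has zero entropy.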
Entropy also features in other domains, such as black hole thermodynamics, where the Bekenstein–Hawking entropy is proportional to the area A of the black hole's event horizon, S_BH = k_B c^3 A / (4 G ħ), rather than to the volume it encloses.
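For a sense of scale, the sketch below plugs standard CODATA SI constants into the Bekenstein–Hawking formula for a Schwarzschild black hole; the function name bh_entropy and the one-solar-mass example are illustrative assumptions.

```python
import math

# Bekenstein–Hawking entropy S = k_B * c**3 * A / (4 * G * hbar), with A the
# horizon area. For a Schwarzschild black hole, A = 4*pi*r_s**2 with
# r_s = 2*G*M/c**2. Constants are CODATA SI values.
K_B = 1.380649e-23      # Boltzmann constant, J/K
C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.054571817e-34  # reduced Planck constant, J s

def bh_entropy(mass_kg: float) -> float:
    """Bekenstein–Hawking entropy (J/K) of a Schwarzschild black hole."""
    r_s = 2 * G * mass_kg / C**2   # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # horizon area
    return K_B * C**3 * area / (4 * G * HBAR)

if __name__ == "__main__":
    # One solar mass (~1.989e30 kg) gives roughly 1.4e54 J/K, vastly more
    # entropy than the ordinary matter that formed the black hole.
    print(bh_entropy(1.989e30))
```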