Entropy

Entropy is a property of a physical system that quantifies the number of microscopic configurations that correspond to its macroscopic state. In classical thermodynamics, entropy is a state function with units of energy per temperature (joules per kelvin), denoted by S. The second law of thermodynamics states that the entropy of an isolated system never decreases: it increases in irreversible processes and remains constant in reversible ones, driving the system toward thermodynamic equilibrium.

In statistical mechanics, entropy relates to the multiplicity of microstates. Boltzmann's relation S = k_B ln W expresses entropy as proportional to the logarithm of the number W of distinct microstates compatible with the macrostate. An alternative form uses probabilities p_i of microstate i: S = -k_B sum p_i ln p_i. At fixed energy, volume, and particle number, the entropy is maximized at the equilibrium distribution.
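
As a rough illustration of these two forms, the short Python sketch below evaluates the Gibbs sum -k_B sum p_i ln p_i for a uniform distribution over W microstates and checks that it reproduces the Boltzmann value k_B ln W. The microstate count W = 1000 is an arbitrary choice for the example, not a value from the text.

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K

    def gibbs_entropy(probs):
        """Entropy S = -k_B * sum_i p_i ln p_i, in J/K, skipping zero-probability terms."""
        return -K_B * sum(p * math.log(p) for p in probs if p > 0)

    W = 1000                       # number of equally likely microstates (arbitrary choice)
    uniform = [1.0 / W] * W
    print(gibbs_entropy(uniform))  # ~9.54e-23 J/K
    print(K_B * math.log(W))       # Boltzmann form k_B ln W gives the same value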

In information theory, entropy measures the uncertainty or average information content of a random variable. For outcomes i with probabilities p_i, H = - sum p_i log p_i. The base of the logarithm sets the unit: bits for base 2, nats for natural log. Although conceptually distinct, thermodynamic and information entropies share mathematical structure, and operations such as information erasure have physical consequences (Landauer's principle).
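
The minimal Python sketch below computes H for a small distribution in bits and in nats, and evaluates the Landauer bound k_B T ln 2 for erasing one bit at room temperature. The three probabilities and the temperature of 300 K are made-up example values, not figures from the text.

    import math

    def shannon_entropy(probs, base=2.0):
        """H = -sum_i p_i log p_i; base 2 gives bits, base e gives nats."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    p = [0.5, 0.25, 0.25]              # example distribution (made-up values)
    print(shannon_entropy(p, 2.0))     # 1.5 bits
    print(shannon_entropy(p, math.e))  # ~1.0397 nats (= 1.5 * ln 2)

    # Landauer's principle: erasing one bit dissipates at least k_B * T * ln 2 of heat.
    K_B = 1.380649e-23                 # Boltzmann constant in J/K
    T = 300.0                          # assumed room temperature in kelvin
    print(K_B * T * math.log(2))       # ~2.87e-21 J per erased bit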

Entropy also features in other domains, such as black hole thermodynamics, where the Bekenstein–Hawking entropy is proportional to the area of the event horizon, linking gravity and thermodynamics. In practice, entropy provides a unifying language for irreversibility, equilibrium, and statistical likelihood across physics, chemistry, and information science.