Metaentropy

Metaentropy is a term used in some theoretical discussions for a second-order measure of uncertainty: the uncertainty attached to entropy itself. It can be understood in two related ways: as the entropy of a distribution over possible entropies for a system, and as a Bayesian measure of uncertainty about a system's entropy given data.

In a modeling scenario where several models or hypotheses imply different entropies for a process, metaentropy is the Shannon entropy of the distribution of those entropy values. It reflects the level of disagreement, or uncertainty, about how random the process is. A second interpretation arises in Bayesian inference: metaentropy can refer to the expected entropy of predictive distributions under the posterior over models, or, relatedly, to the uncertainty about the entropy parameter itself. Either way, it depends on the set of considered models, priors, or hypotheses, and is therefore sensitive to specification. When the model class is well specified and the entropy is well constrained, metaentropy tends to be low; with sparse data or divergent models, it increases.
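
To make the two readings concrete, here is a minimal Python sketch for a small discrete model class; the models, posterior weights, and function names are illustrative assumptions, not part of any standard library.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # treat 0 * log 0 as 0
    return float(-np.sum(p * np.log2(p)))

# Three hypothetical models of a binary process and their posterior weights.
models = [np.array([0.5, 0.5]),        # maximally random
          np.array([0.9, 0.1]),        # nearly deterministic
          np.array([0.7, 0.3])]
weights = np.array([0.2, 0.5, 0.3])    # P(model | data), assumed given

# Each model implies an entropy for the process.
entropies = np.array([shannon_entropy(m) for m in models])

# Reading 1: metaentropy as the Shannon entropy of the induced
# distribution over entropy values (models with equal entropies
# are grouped, up to rounding).
induced = {}
for h, w in zip(np.round(entropies, 6), weights):
    induced[h] = induced.get(h, 0.0) + w
metaentropy = shannon_entropy(list(induced.values()))

# Reading 2: the expected predictive entropy under the posterior,
# E[H] = sum_k w_k * H_k.
expected_entropy = float(weights @ entropies)

print(f"model entropies (bits): {entropies}")
print(f"metaentropy, reading 1: {metaentropy:.3f} bits")
print(f"expected entropy, reading 2: {expected_entropy:.3f} bits")
```

When all model entropies are distinct, reading 1 reduces to the entropy of the posterior weights themselves; the grouping step matters only when different models happen to imply the same entropy.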

Metaentropy is not a standard physical quantity; it is a higher-order construct that complements ordinary entropy.

Potential applications include meta-analysis of uncertainty, model comparison, and meta-learning, where one wants to quantify confidence about the degree of randomness across alternatives.

Criticisms emphasize that, as a higher-order measure, metaentropy can be ill-defined without clear priors or model spaces, and that it may conflate epistemic with ontological uncertainty.
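
To probe the sensitivity claim above, the sketch below takes a Bernoulli process with a Beta posterior over its bias and tracks how the induced distribution over entropy values narrows as observations accumulate. The standard deviation of the sampled entropies is used as a simple proxy for metaentropy in this continuous setting; the prior, sample sizes, and function names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def bernoulli_entropy(p):
    """Entropy in bits of a Bernoulli(p) variable, elementwise."""
    p = np.clip(p, 1e-12, 1 - 1e-12)   # guard against log(0)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def entropy_spread(heads, tails, n_samples=100_000):
    """Spread of the posterior-induced distribution over entropies."""
    bias_samples = rng.beta(1 + heads, 1 + tails, size=n_samples)
    return float(np.std(bernoulli_entropy(bias_samples)))

# Same empirical frequency (70% heads), increasing amounts of data.
for n in (10, 100, 1000):
    heads = int(0.7 * n)
    spread = entropy_spread(heads=heads, tails=n - heads)
    print(f"n = {n:4d}: spread of entropy posterior ~ {spread:.4f} bits")
```

The spread shrinks roughly as 1/sqrt(n), mirroring the concentration of the posterior over the bias: exactly the pattern described above for sparse versus plentiful data.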

See also: entropy, information theory, Bayesian inference, uncertainty quantification, metacognition.
