MCMC

MCMC, or Markov chain Monte Carlo, is a class of algorithms for sampling from complex probability distributions by constructing a Markov chain whose stationary distribution is the target distribution. The Monte Carlo aspect uses random sampling to estimate expectations, while the Markov chain aspect provides a mechanism to explore high‑dimensional state spaces when direct sampling is impractical.
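The idea can be sketched with a minimal random-walk Metropolis sampler. This is a toy illustration, not a production implementation: the standard-normal target and the step size are arbitrary choices, and all names are illustrative.

```python
import math
import random

def log_target(x):
    # Unnormalized log-density of the target: a standard normal.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, x0=0.0, seed=42):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept
    with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        log_accept = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_accept:
            x = proposal          # accept the move
        samples.append(x)         # on rejection, the current state repeats

    return samples

samples = metropolis(50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # both should be near the target's 0 and 1
```

Because the acceptance ratio is computed on log densities, the normalizing constant of the target cancels, which is exactly why MCMC works when only an unnormalized density is known.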

In Bayesian statistics, MCMC is widely used to approximate posterior distributions that arise when closed-form solutions are unavailable. The typical approach is to design a transition kernel that preserves the target distribution, often via detailed balance. After a burn-in period, the chain is assumed to have reached its stationary distribution, and the collected samples are used to estimate quantities such as means, variances, credible intervals, and predictive distributions.

Common algorithms include Metropolis-Hastings, which proposes new states and accepts them with a probability ensuring reversibility, and Gibbs sampling, which draws successive values from full conditional distributions. Variants such as Hamiltonian Monte Carlo (HMC) use gradient information to improve efficiency, with the No-U-Turn Sampler (NUTS) automating tuning. Other methods include slice sampling and adaptive MCMC, which tunes proposals on the fly. Convergence and efficiency are assessed with diagnostics such as trace plots, autocorrelation, effective sample size, and the Gelman-Rubin statistic (R-hat).

Key considerations in practice include choosing proposal distributions, diagnosing convergence and mixing, determining burn-in and thinning, and balancing computational cost with accuracy. MCMC supports posterior inference and high-dimensional integration across fields such as statistics, physics, machine learning, and computational biology.
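One of the diagnostics mentioned above, the Gelman-Rubin statistic, can be sketched as follows. The simulated i.i.d. chains below are stand-ins for real sampler output, and the function name and constants are illustrative assumptions.

```python
import random
import statistics

def gelman_rubin(chains):
    """Gelman-Rubin potential scale reduction factor (R-hat) for a
    scalar quantity, computed from several equal-length chains."""
    m = len(chains)        # number of chains
    n = len(chains[0])     # draws per chain
    means = [statistics.fmean(c) for c in chains]
    grand = statistics.fmean(means)
    # B: between-chain variance; W: mean within-chain variance.
    b = n * sum((mu - grand) ** 2 for mu in means) / (m - 1)
    w = statistics.fmean(statistics.variance(c) for c in chains)
    # Pooled estimate of the marginal posterior variance.
    var_plus = (n - 1) / n * w + b / n
    return (var_plus / w) ** 0.5

# Four well-mixed chains: i.i.d. draws mimic converged sampler output.
rng = random.Random(0)
chains = [[rng.gauss(0.0, 1.0) for _ in range(2000)] for _ in range(4)]
rhat = gelman_rubin(chains)
print(rhat)  # near 1 when chains agree
```

Values of R-hat well above 1 indicate that the chains have not yet mixed into the same distribution, so more iterations (or a better sampler) are needed.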