
Markovchain

Markovchain refers to a Markov chain, a stochastic process with discrete time steps {X_t} taking values in a countable state space S and satisfying the Markov property: the conditional distribution of the next state depends only on the present state, not on the past. Formally, P(X_{t+1}=j | X_t=i, X_{t-1}=i_{t-1}, ..., X_0=i_0) = P(X_{t+1}=j | X_t=i). The distribution of X_0 is the initial distribution.
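The one-step dependence in the Markov property can be illustrated with a small simulation; a minimal sketch in Python, assuming a hypothetical two-state chain with a made-up transition matrix:

```python
import random

# Hypothetical two-state chain: states 0 and 1 (illustrative values only).
# P[i][j] is the probability of moving from state i to state j in one step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample the next state using only the current state (the Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(x0, n_steps, seed=0):
    """Generate a path X_0, X_1, ..., X_{n_steps} from initial state x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate(x0=0, n_steps=10)
```

Note that `step` receives only the current state; the rest of the history is irrelevant to the sampling, which is exactly the conditional-independence statement above.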

The future evolution is governed by the transition probabilities p_{ij} = P(X_{t+1}=j | X_t=i), which for a homogeneous Markov chain do not depend on t. The collection {p_{ij}} forms the transition matrix P, with row sums equal to 1 (P is row-stochastic). The state space can be finite or countably infinite. In a continuous-time setting, the analogue is a Markov process with a generator matrix Q, where q_{ij} gives the rate of transitioning from i to j.

Key equations include the Chapman-Kolmogorov relations, which express multi-step transition probabilities in terms of one-step transitions, p_{ij}^{(m+n)} = sum_k p_{ik}^{(m)} p_{kj}^{(n)}, and, in the long run, the existence of a stationary distribution π satisfying πP = π (or πQ = 0 in continuous time). If the chain is irreducible and aperiodic (and, on a countably infinite state space, positive recurrent), it converges to a unique stationary distribution regardless of the initial state.

Common examples include simple random walks on finite graphs, birth-death processes, and queues such as the M/M/1 model. Markov chains are used in a wide range of applications, including economics, genetics, computer science, finance, and algorithms such as PageRank and various stochastic simulations.
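The Chapman-Kolmogorov relations and the stationary-distribution condition πP = π mentioned above can both be checked numerically; a minimal sketch in Python with NumPy, assuming a made-up three-state transition matrix:

```python
import numpy as np

# Hypothetical row-stochastic transition matrix for a three-state chain
# (values invented for illustration; all entries positive, so the chain
# is irreducible and aperiodic).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.3, 0.5]])

# Chapman-Kolmogorov: the (m+n)-step transition matrix equals the product
# of the m-step and n-step matrices, here with m = 2 and n = 3.
assert np.allclose(np.linalg.matrix_power(P, 5),
                   np.linalg.matrix_power(P, 2) @ np.linalg.matrix_power(P, 3))

# Power iteration for the stationary distribution: repeatedly apply
# pi <- pi P; for an irreducible, aperiodic finite chain this converges
# to the unique pi with pi P = pi, from any initial distribution.
pi = np.full(3, 1.0 / 3.0)
for _ in range(1000):
    pi = pi @ P

assert np.allclose(pi @ P, pi)  # pi is (numerically) stationary
```

Power iteration is only one way to obtain π; solving the linear system πP = π, Σπ_i = 1 directly works equally well for small state spaces.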
Parameter estimation for Markov chains typically uses observed transitions together with maximum-likelihood or Bayesian methods.
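The maximum-likelihood estimate of a transition matrix from an observed path simply normalizes the transition counts: p̂_{ij} is the number of observed i→j transitions divided by the number of departures from i. A minimal sketch in Python, with a made-up observation sequence:

```python
from collections import Counter

def estimate_transition_matrix(path, n_states):
    """Maximum-likelihood estimate of P from one observed path:
    p_hat[i][j] = (# of i -> j transitions) / (# of transitions out of i)."""
    counts = Counter(zip(path, path[1:]))  # count consecutive pairs
    p_hat = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states):
        total = sum(counts[(i, j)] for j in range(n_states))
        if total:  # leave the row at zero if state i was never left
            for j in range(n_states):
                p_hat[i][j] = counts[(i, j)] / total
    return p_hat

# Made-up observed path over two states.
observed = [0, 0, 1, 0, 1, 1, 0, 0, 1]
p_hat = estimate_transition_matrix(observed, n_states=2)
```

A Bayesian treatment typically places a Dirichlet prior on each row, which turns the same counts into posterior parameters rather than point estimates.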