Markov chain
A Markov chain is a stochastic process with discrete time steps {X_t} taking values in a countable state space S and satisfying the Markov property: the conditional distribution of the next state depends only on the present state, not on the past. Formally, P(X_{t+1}=j | X_t=i, X_{t-1}=i_{t-1}, ..., X_0=i_0) = P(X_{t+1}=j | X_t=i). The distribution of X_0 is called the initial distribution.
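The Markov property means a trajectory can be sampled step by step using only the current state. A minimal sketch, using a hypothetical two-state chain (the states, matrix entries, and function names below are illustrative, not from the source):

```python
import random

# Hypothetical two-state chain: row i of P is the distribution of the
# next state given that the current state is i.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(P, x0, steps, rng=random.Random(0)):
    """Sample X_0, ..., X_steps. Each draw uses only the current state,
    which is exactly the Markov property."""
    path = [x0]
    for _ in range(steps):
        state = path[-1]
        path.append(rng.choices(range(len(P)), weights=P[state])[0])
    return path

print(simulate(P, 0, 10))
```

Note that the sampler never inspects `path[:-1]`; forgetting the history is what makes the process Markovian.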
The future evolution is governed by the transition probabilities p_{ij} = P(X_{t+1}=j | X_t=i), which for a (time-)homogeneous chain do not depend on t. Collected into a matrix P = (p_{ij}), they form a stochastic matrix: every entry is nonnegative and every row sums to 1.
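One step of the chain pushes a distribution mu over states forward to mu P, with (mu P)_j = sum_i mu_i p_{ij}. A small sketch, reusing an illustrative two-state matrix (the values are assumptions, not from the source):

```python
# Illustrative row-stochastic matrix; any matrix with nonnegative rows
# summing to 1 works.
P = [[0.9, 0.1],
     [0.5, 0.5]]
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)  # rows are distributions

def step(mu, P):
    """Distribution after one transition: (mu P)_j = sum_i mu_i * p_ij."""
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

mu0 = [1.0, 0.0]     # start in state 0 with probability 1
print(step(mu0, P))  # -> [0.9, 0.1]
```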
Key equations include the Chapman-Kolmogorov relations, which express multi-step transition probabilities in terms of one-step transitions: p_{ij}^{(m+n)} = sum_{k in S} p_{ik}^{(m)} p_{kj}^{(n)}. In matrix form this says the n-step transition matrix is simply the matrix power P^n.
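The matrix form of Chapman-Kolmogorov, P^{m+n} = P^m P^n, can be checked numerically. A sketch with plain-Python matrix arithmetic and an illustrative matrix (the helper names and values are assumptions):

```python
def matmul(A, B):
    """Product of two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, n):
    """n-step transition matrix P^n (identity for n = 0)."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = matmul(result, P)
    return result

P = [[0.9, 0.1],
     [0.5, 0.5]]
m, n = 2, 3
lhs = matpow(P, m + n)                  # 5-step transitions directly
rhs = matmul(matpow(P, m), matpow(P, n))  # via intermediate states at time m
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

The sum over the intermediate state k in the scalar identity is precisely the inner sum of the matrix product.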
Common examples include simple random walks on finite graphs, birth-death processes, and queueing models such as the M/M/1 queue (whose embedded jump chain is a discrete-time Markov chain).
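As a concrete instance of a random walk on a finite graph, the simple walk on an n-cycle can be written down directly (the function name and parameters below are illustrative):

```python
def cycle_walk_matrix(n, p=0.5):
    """Transition matrix of a simple random walk on the n-cycle:
    from state i, move to (i+1) mod n with probability p and to
    (i-1) mod n with probability 1 - p."""
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        P[i][(i + 1) % n] = p
        P[i][(i - 1) % n] = 1 - p
    return P

P = cycle_walk_matrix(5)
print(P[0])  # from state 0: step to 1 or 4 with probability 1/2 each
```

Birth-death processes have the same banded structure on {0, 1, 2, ...}, with boundary behavior at state 0.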