Markovian
Markovian refers to systems or models that obey the Markov property: the conditional probability distribution of future states depends only on the present state, not on the sequence of events that preceded it. In other words, given the current state, the past is irrelevant for predicting the future.
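One common way to state this formally, writing X_0, X_1, X_2, ... for the states of a discrete-time process (the notation is chosen here only for illustration), is

    P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n),

for all n and all states for which the conditioning event has positive probability.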
In discrete time, a Markov chain consists of a finite or countable state space, a transition matrix whose entry in row i and column j gives the probability of moving from state i to state j in one step, and an initial distribution over the states.
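As an illustration (the two-state matrix and function names below are chosen only for the example), such a chain can be simulated by repeatedly sampling the next state from the row of the transition matrix indexed by the current state; a minimal Python sketch:

    import numpy as np

    # Transition matrix: entry P[i, j] is the probability of moving
    # from state i to state j in one step; each row sums to 1.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    rng = np.random.default_rng(0)

    def simulate(P, start, n_steps):
        """Sample a path of the chain: only the current state is needed
        to draw the next one (the Markov property)."""
        states = [start]
        for _ in range(n_steps):
            current = states[-1]
            nxt = rng.choice(len(P), p=P[current])
            states.append(int(nxt))
        return states

    print(simulate(P, start=0, n_steps=10))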
Key properties include the Chapman–Kolmogorov equations and, for irreducible chains (with positive recurrence, which is automatic on a finite state space), the existence of a unique stationary distribution, which gives the long-run fraction of time the chain spends in each state.
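A minimal Python sketch of both properties, reusing the small two-state matrix from the example above (itself assumed only for illustration): the Chapman–Kolmogorov equations amount to the statement that the (m+n)-step transition matrix is the product of the m-step and n-step matrices, and a stationary distribution pi satisfies pi P = pi.

    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    # Chapman-Kolmogorov: the (m+n)-step transition probabilities are the
    # matrix product of the m-step and n-step transition matrices.
    m, n = 2, 3
    lhs = np.linalg.matrix_power(P, m + n)
    rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)
    assert np.allclose(lhs, rhs)

    # Stationary distribution: left eigenvector of P for eigenvalue 1,
    # normalised to sum to 1, so that pi @ P == pi.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.isclose(eigvals, 1)][:, 0])
    pi = pi / pi.sum()
    assert np.allclose(pi @ P, pi)
    print(pi)  # approximately [0.833, 0.167] for this matrix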
Not all stochastic processes are Markovian; some exhibit memory that requires additional history to predict the future.
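For instance (the update rule below is invented purely for illustration), a binary sequence whose next value depends on its last two values is not Markovian when only the current value is taken as the state, but it becomes Markovian if the state is enlarged to the pair of the two most recent values:

    import random

    random.seed(0)

    # A second-order process: the next value depends on the last TWO
    # values, so the current value alone does not determine the law of
    # the next step.  (Rule invented for illustration.)
    def next_value(prev2, prev1):
        if prev2 == prev1:
            # If the last two values agree, repeat them with prob. 0.9.
            return prev1 if random.random() < 0.9 else 1 - prev1
        return random.randint(0, 1)

    # Taking the pair (prev2, prev1) as the state restores the Markov
    # property: that pair alone determines the next transition.
    x = [0, 1]
    for _ in range(10):
        x.append(next_value(x[-2], x[-1]))
    print(x)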
Examples include the M/M/1 queue, simple random walks, and many stochastic models in finance and physics.
See also: Markov chain, Markov process, Markov decision process.