Definitions that contain "Markov process"
- Markovian: relating to or generated by a Markov process
- Markoff chain: a Markov process for which the parameter is discrete time values
- Markov chain: a Markov process for which the parameter is discrete time values (see the sketch after this list)
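To make the "discrete time values" part of the definition concrete, here is a minimal sketch of a Markov chain in Python. The two-state weather model, the transition probabilities, and the function names are illustrative assumptions, not part of the definitions above; the point is only that the chain advances in whole time steps and the next state depends solely on the current one.

```python
import random

# Hypothetical two-state chain; states and probabilities are illustrative.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state; it depends only on the current state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # fallback for floating-point rounding

def simulate(start, n_steps):
    """Run the chain for n_steps discrete time steps (the discrete 'parameter' in the definition)."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```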