Definitions that contain stochastic process
- Markov process (also spelled Markoff process) a simple stochastic process in which the distribution of future states depends only on the present state and not on how the process arrived at the present state (see the first sketch below)
- random walk a stochastic process consisting of a sequence of changes, each of whose characteristics (such as magnitude or direction) is determined by chance (see the second sketch below)
- stationary stochastic process a stochastic process in which the distribution of the random variables is the same for every value of the index parameter, such as time (see the third sketch below)
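
To make the Markov property concrete, here is a minimal Python sketch; the two-state chain, its state labels, and its transition probabilities are invented for illustration. The next state is drawn using only the current state, never the earlier history.

```python
# A minimal sketch of a Markov process: a two-state chain.
# The transition matrix P and state labels are made up for illustration.
import random

# P[i][j] = probability of moving from state i to state j;
# the next state is drawn using only the current state.
P = [[0.9, 0.1],   # from state 0 ("sunny")
     [0.5, 0.5]]   # from state 1 ("rainy")

def step(state):
    # The distribution of the next state depends only on `state`,
    # not on any earlier history -- the Markov property.
    return 0 if random.random() < P[state][0] else 1

state, path = 0, [0]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)
```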
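The random walk definition can be illustrated just as directly. In this sketch each change has magnitude 1 and a direction chosen by chance; the step count and seed are arbitrary choices, not part of the definition.

```python
# A minimal sketch of a simple symmetric random walk: each change
# has magnitude 1 and a direction (+1 or -1) determined by chance.
import random

def random_walk(n_steps, seed=0):
    rng = random.Random(seed)
    position, trajectory = 0, [0]
    for _ in range(n_steps):
        position += rng.choice([-1, 1])  # chance picks the direction
        trajectory.append(position)
    return trajectory

print(random_walk(20))
```

Note that a simple random walk is itself a Markov process: the distribution of the next position depends only on the current position.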
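Finally, a sketch contrasting a stationary process with a non-stationary one, under the assumption that i.i.d. Gaussian noise stands in for a stationary process. The noise has the same distribution (here, the same spread) at every index, while the random walk built by accumulating that noise spreads out as the index grows.

```python
# A minimal sketch of stationarity: i.i.d. Gaussian noise has the
# same distribution at every index t, whereas a random walk built
# from it has a spread that grows with t (non-stationary).
import random
import statistics

rng = random.Random(0)
n_paths, n_steps = 2000, 50

# Collect samples of both processes at two widely separated indices.
noise_at = {t: [] for t in (5, 45)}
walk_at = {t: [] for t in (5, 45)}
for _ in range(n_paths):
    walk = 0
    for t in range(n_steps):
        x = rng.gauss(0, 1)   # white-noise value at index t
        walk += x             # the random walk accumulates the noise
        if t in noise_at:
            noise_at[t].append(x)
            walk_at[t].append(walk)

for t in (5, 45):
    print(f"t={t}: noise stdev ~ {statistics.stdev(noise_at[t]):.2f}, "
          f"walk stdev ~ {statistics.stdev(walk_at[t]):.2f}")
```

The noise spread stays near 1 at both indices, while the walk's spread grows roughly like the square root of the index, which is one simple empirical signature of the stationary/non-stationary distinction.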