Markov processes (Topical Term)
Used for/see from:
- Analysis, Markov
- Chains, Markov
- Markoff processes
- Markov analysis
- Markov chains
- Markov models
- Models, Markov
- Processes, Markov
See also:
- Broader heading: Stochastic processes
UMI business vocab. (Markov processes, use Markov analysis)
Wikipedia, Jan. 3, 2007: Markov chain (in mathematics, a Markov chain, named after Andrey Markov, is a discrete-time stochastic process with the Markov property; Markov models); Markov process (in probability theory, a Markov process is a stochastic process that has the Markov property; often, the term Markov chain is used to mean a discrete-time Markov process)
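The Wikipedia source note above defines a Markov chain as a discrete-time stochastic process with the Markov property: the next state depends only on the current state, not on the earlier history. A minimal sketch in Python illustrates this; the state names and transition probabilities are invented purely for illustration and are not part of the authority record:

```python
import random

# Hypothetical two-state weather chain. Each row of TRANSITIONS gives the
# probability of moving to each next state given ONLY the current state --
# this "memorylessness" is the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state from the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps and return the visited states."""
    random.seed(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Each call to `step` consults only the current state, so the simulated sequence is a discrete-time Markov chain in the sense of the definition quoted above.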