Markoff chain

Thesaurus

Noun 1. Markoff chain - a Markov process for which the parameter is discrete time values
Synonyms: Markov chain
Related words: Markoff process, Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state

Encyclopedia: see Markov chain
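The defining property above can be written out formally. The following is a standard formulation added for illustration, not part of the source entry; the symbols X_0, X_1, ... denote the successive states of a discrete-time chain:

\[
\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n)
\]

That is, the distribution of the next state depends only on the present state x_n, not on how the chain arrived there, and the time parameter n ranges over discrete values.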