Word: Markoff chain
Definition

Markoff chain


Thesaurus
Noun 1. Markoff chain - a Markov process for which the parameter is discrete time values
Synonym: Markov chain
Related: Markoff process, Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Encyclopedia: see Markov chain
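The definition above can be made concrete with a short simulation: in a discrete-time Markov chain, the next state is drawn from a distribution that depends only on the current state, not on the path taken to reach it. The two-state weather model and its probabilities below are a hypothetical illustration, not part of the dictionary entry.

```python
import random

# Hypothetical two-state transition table: each row gives the
# distribution of the next state given only the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state's row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps from a given start state."""
    random.seed(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `step` never inspects earlier states, which is exactly the Markov property named in the definition; the "discrete time values" of the entry are the integer step indices of the loop.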

Related to Markoff chain: Markov process

Synonyms for Markoff chain

noun a Markov process for which the parameter is discrete time values

Synonyms

  • Markov chain

Related Words

  • Markoff process
  • Markov process

This English dictionary contains 2,567,994 English-English definition entries with online translation, covering essentially all common words' English-English definitions and usage; it is a useful tool for English learning.

 

Copyright © 2004-2022 Newdu.com All Rights Reserved
Updated: 2024/11/14 5:37:47