Word | markov chain |
Definition | Markov chain, n. [compare German Markoffsche Kette (1933), French chaînes de A. Markoff (1927)] A Markov process in which there is a denumerable number of possible states, or in which transitions between states occur at discrete time intervals; (also) one in which, in addition, the transition probabilities are constant (independent of time). Extracted from Markov, n. |
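A minimal illustrative sketch (not part of the entry) of the second sense: a two-state, time-homogeneous Markov chain simulated over discrete time steps. The states and transition probabilities here are invented for illustration.

```python
import random

# Constant (time-independent) transition probabilities, matching the
# "(also)" sense of the definition above. States and values are made up.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state at one discrete time step."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)  # e.g. ['sunny', 'sunny', 'rainy', ...]
```

Because the chain is time-homogeneous, the same dictionary `P` is consulted at every step; a time-inhomogeneous chain would instead use a different transition table at each step.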