Word | Markov chain |
Definition | Markov chain /mahr"kawf/, Statistics. a Markov process restricted to discrete random events or to discontinuous time sequences. Also, Markoff chain. [1940-45; see MARKOV PROCESS] |
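Note | The definition describes a process that moves between discrete states at discrete time steps, with the next state depending only on the current one. A minimal illustrative sketch in Python, using a made-up two-state "weather" chain and invented transition probabilities (the states and numbers are assumptions, not part of the dictionary entry):

```python
import random

# Hypothetical two-state chain; each row gives P(next state | current state)
# and the probabilities in each row sum to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Draw the next state from the current state's transition distribution."""
    probs = TRANSITIONS[state]
    return random.choices(list(probs.keys()), weights=list(probs.values()), k=1)[0]

def simulate(start: str, n_steps: int) -> list[str]:
    """Simulate a discrete-time trajectory of n_steps transitions."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    # Example run: a short sample path starting from "sunny".
    print(simulate("sunny", 10))
``` |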