
 

Word: Markov chain
Definition: Markov chain
 /mahr"kawf/, Statistics.
 a Markov process restricted to discrete random events or to discontinuous time sequences.
 Also, Markoff chain.
 [1940-45; see MARKOV PROCESS]
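The definition above describes a process over discrete states in which each step depends only on the current state. As a minimal sketch (the two-state "weather" model, its states, and its transition probabilities are illustrative assumptions, not part of the dictionary entry), a discrete-time Markov chain can be simulated like this:

```python
import random

# Hypothetical two-state chain; states and probabilities are illustrative only.
STATES = ["sunny", "rainy"]
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Pick the next state; the choice depends only on the current state."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def walk(start, n, seed=0):
    """Simulate n steps of the chain starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(walk("sunny", 5))
```

Because the next state is drawn using only the current state (the "memoryless" property), the full history of the walk never enters the computation.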

 

This English dictionary contains 168,451 English-English definition entries, covering essentially all common words' English-English definitions and usage, making it a useful tool for English learners.

 

Copyright © 2004-2022 Newdu.com All Rights Reserved
Last updated: 2024/12/23 17:33:54