Word: markov chain
Definition: mar·kov chain
noun
or mar·koff chain \ˈmärˌkȯf-\
Usage: usually capitalized M
Etymology: after Andrei Andreevich Markov, died 1922, Russian mathematician
: a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
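The defining property above (the next state depends only on the current state, not on the path taken to reach it) can be sketched in a few lines of Python. The two-state "weather" chain and its transition probabilities below are hypothetical, chosen only to illustrate the idea:

```python
import random

# Hypothetical transition table: for each state, the possible next
# states and their probabilities. Only the CURRENT state is consulted,
# which is exactly the Markov property described in the definition.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Draw the next state from the distribution for the current state."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs)[0]

def walk(start, n, seed=0):
    """Generate a path of n steps starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(walk("sunny", 5))
```

Because each call to `step` looks only at `path[-1]`, the history of earlier states never influences the next draw, matching the "memoryless" behavior the definition describes.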

This English dictionary contains 332,784 English-English definition entries, covering essentially all common words and their usage, making it a useful tool for English learners.

 

Copyright © 2004-2022 Newdu.com All Rights Reserved
Last updated: 2024/9/22 15:41:13