Word: Markov chain
Definition

Markov chain /ˈmɑːkɒf/ noun

in statistics, a random sequence of states in which the probability of occurrence of a future state depends only on the present state and not on the path by which it was reached

[named after A. A. Markov, d. 1922, Russian mathematician]
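
The "memoryless" behaviour described in the definition can be illustrated with a short simulation. The sketch below is not part of the dictionary entry; it assumes a hypothetical two-state weather model whose transition probabilities are chosen purely for illustration.

import random

# A minimal sketch (illustrative only): a two-state weather model.
# The Markov property is visible in next_state(): the next state is
# sampled using only the current state, never the earlier history.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current: str) -> str:
    """Sample the next state from the transition row for the current state."""
    states = list(TRANSITIONS[current])
    weights = list(TRANSITIONS[current].values())
    return random.choices(states, weights=weights)[0]

def simulate(start: str, steps: int) -> list[str]:
    """Generate a random sequence of states starting from `start`."""
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1]))  # depends only on the present state
    return chain

if __name__ == "__main__":
    print(simulate("sunny", 10))

Running the script prints one possible sequence, e.g. ['sunny', 'sunny', 'rainy', ...]; the exact output varies from run to run because each step is a random draw conditioned only on the present state.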