Word: Markov chain
Definition

Markov chain
Markov chain, n. [compare German Markoffsche Kette (1933), French chaînes de A. Markoff (1927)] a Markov process in which there are a denumerable number of possible states, or in which transitions between states occur at discrete time intervals; (also) one for which, in addition, the transition probabilities are constant (independent of time). Extracted from Markov, n.
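
As a sketch of what the definition says (the notation is assumed here, not part of the dictionary entry: X_n denotes the state after n steps and p_{ij} a transition probability), the Markov property in discrete time, and the constant-transition-probability case the entry singles out, can be written as:

    P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)

and, for the time-homogeneous case,

    P(X_{n+1} = j \mid X_n = i) = p_{ij}  for every step n.

The first equation states that the next state depends only on the current state; the second adds that this dependence does not change over time.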