Word | Markov chain |
Definition | Markov chain /märˈkof chān/ noun. A series of events, the probability of each of which depends on the outcome of the event immediately preceding it. ORIGIN: A. A. Markov (1856–1922), Russian mathematician. |
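To make the definition concrete: formally, a Markov chain satisfies P(Xₙ₊₁ | Xₙ, …, X₁) = P(Xₙ₊₁ | Xₙ), i.e. each step is conditioned only on the state immediately preceding it. Below is a minimal illustrative sketch in Python; the two weather states and their transition probabilities are made up for the example, not part of the entry.

```python
import random

# Hypothetical two-state Markov chain. Each row gives the
# probability of the next state conditioned on the current one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using probabilities conditioned only on `state`."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # fallback in case of floating-point rounding

state = "sunny"
chain = [state]
for _ in range(10):
    state = step(state)
    chain.append(state)
print(" -> ".join(chain))  # e.g. sunny -> sunny -> rainy -> ...
```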