Word | Markov process |
Definition | Markov process, noun: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also: Markov chain. Also called Markoff process. |
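The definition above rests on the Markov property: the next state depends only on the current state, not on the path taken to reach it. A minimal sketch of the discrete-state case (a Markov chain, which the continuous-state Markov process generalizes), using a hypothetical two-state weather model with illustrative transition probabilities:

```python
import random

def simulate_markov_chain(transition, start, steps, seed=0):
    """Simulate a discrete-state Markov chain.

    The next state is drawn using only the current state's
    transition probabilities -- the Markov property.
    """
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(steps):
        next_states, probs = zip(*transition[state].items())
        state = rng.choices(next_states, weights=probs, k=1)[0]
        path.append(state)
    return path

# Hypothetical two-state chain; the numbers are illustrative only.
transition = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}
path = simulate_markov_chain(transition, "sunny", 10)
```

In a Markov process proper, the state space would be continuous (as in Brownian motion) rather than the finite set used here.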