Word: Markov process
Definition: Noun: Markov process
  1. A simple stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at that state
    - Markoff process

Derived forms: Markov processes

See also: Markovian

Type of: stochastic process

Encyclopedia: Markov process
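The defining property above (the "Markov property") can be illustrated with a minimal sketch, not part of the dictionary entry itself: a hypothetical two-state chain over "sunny" and "rainy", where the next state is sampled from a distribution that depends only on the current state.

```python
import random

# Hypothetical transition probabilities, chosen only for illustration.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Sample the next state given ONLY the current state (Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain; note that the history beyond path[-1] is never consulted."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 10)
print(path)
```

Because `step` receives only the current state, how the chain reached that state has no effect on the next draw, which is exactly the property the definition describes.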


This English dictionary contains 157,790 English-English definition entries, covering nearly all common words and their usage, making it a useful tool for English learners.

 

Copyright © 2004-2022 Newdu.com All Rights Reserved
Updated: 2024/12/21 16:19:03