Word: Markov
Definition: Markov Math. |ˈmɑːkɒf|
Also Markoff.
[The name of Andrei Andreevich Markov (1856–1922), Russian mathematician, who investigated such processes.]
Markov process: any stochastic process for which the probabilities, at any one time, of the different future states depend only on the existing state and not on how that state was arrived at.
Markov chain: a Markov process in which there is a finite or countably infinite number of possible states, or in which transitions between states occur at discrete intervals of time; also, one for which in addition the transition probabilities are constant (independent of time).
Also Markov property, the characteristic property of Markov processes.
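In modern textbook notation (an editorial addition, not part of the dictionary entry itself), the Markov property for a discrete-time process X_0, X_1, X_2, … is usually written:

    P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i)

If these conditional probabilities also do not depend on n, they define constant transition probabilities p_ij = P(X_{n+1} = j | X_n = i), which is the "independent of time" case mentioned in the definition above.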
1939 Jap. Jrnl. Math. XVI. 47 (heading) Markoff process with an enumerable infinite number of possible states.
1942 Trans. Amer. Math. Soc. LII. 37 Then p_ij(t) can be considered a transition probability of a Markoff chain: A system is supposed which can assume various numbered states, and p_ij(t) is the probability that the system is in the jth state at the end of a time interval of length t, if it was in the ith state at the beginning of the interval.
1950 W. Feller Introd. Probability Theory I. xv. 337 A Markov process is the probabilistic analogue of the processes of classical mechanics, where the future development is completely determined by the present state and is independent of the way in which the present state has developed.
Ibid. 337 A definition of the Markov property.
1953 J. L. Doob Stochastic Processes v. 170 A Markov chain is defined as a Markov process..whose random variables can (with probability 1) only assume values in a certain finite or denumerably infinite set. The set is usually taken, for convenience, to be the integers 1, …, N (finite case) or the integers 1, 2, … (infinite case).
Ibid. 186 The problem of card mixing is a good example of the application of Markov chains.
1953 J. B. Carroll Study of Lang. iii. 85 A Markoff process has to do with the different ‘states’ into which a phenomenon can get, and the statistical probabilities which govern the transition of the phenomenon from one state to another.
1956 Nature 4 Feb. 207/1 The most simple case is when all the atoms of the assembly are supposed to have no volume and no interactions (such as in an ideal gas). In that case it can be treated as a Markov process.
1960 Kemeny & Snell Finite Markov Chains ii. 25 A finite Markov chain is a finite Markov process such that the transition probabilities p_ij(n) do not depend on n.
1962 J. Riordan Stochastic Service Syst. iii. 28 The simplest infinite-server system is unique among its fellows in the possession of the Markov property that future changes are independent of the past.
1966 S. Karlin First Course in Stochastic Processes ii. 27 A discrete time Markov chain {X_n} is a Markov stochastic process whose state space is a countable or finite set, and for which T = (0, 1, 2, …).
Ibid., The vast majority of Markov chains that we shall encounter have stationary transition probabilities.
1968 P. A. P. Moran Introd. Probability Theory iii. 140 Thus a Markov chain observed in the reverse direction of time will be a Markov process. However, it will not in general be a Markov chain because the observed transition probabilities will not be independent of t.
1973 Manch. Sch. Econ. & Social Stud. XLI. 401 (heading) A Markov chain model of the benefits of participating in government training schemes.
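To make the stationary transition probabilities of the 1960 Kemeny & Snell and 1966 Karlin quotations concrete, here is a minimal Python sketch (an editorial illustration; the two-state transition matrix is invented for the example) that simulates a finite Markov chain and estimates its long-run state frequencies:

    import random

    # Hypothetical two-state chain; P[i][j] is the constant (stationary)
    # transition probability p_ij of moving from state i to state j.
    P = [[0.9, 0.1],
         [0.5, 0.5]]

    def step(state):
        # Markov property: the next state depends only on the current state,
        # not on how that state was arrived at.
        return 0 if random.random() < P[state][0] else 1

    state = 0
    counts = [0, 0]
    for _ in range(100_000):
        state = step(state)
        counts[state] += 1

    # For this matrix the long-run fractions approach 5/6 and 1/6.
    print(counts[0] / sum(counts), counts[1] / sum(counts))

Because the matrix does not change with the step index, the chain is a finite Markov chain in exactly the sense of the 1960 quotation: p_ij(n) does not depend on n.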