Word | markov |
Definition | Markov, n. Mathematics.

1. Markov process, n. [after German Markoffsche Prozess (A. Khintchine 1933, in Math. Ann. 109 604)] Any stochastic process for which the probabilities, at any one time, of the different future states depend only on the existing state and not on how that state was arrived at.

Thesaurus: the world > relative properties > number > probability or statistics > [noun] > involving random generation: random number (1926), stochastic process (1934), Markov chain (1938), Markov process (1938), Markov property (1944), Monte Carlo method (1949), Monte Carlo (1951), stochasticity (1972).

Quotations:
1938 Trans. Amer. Math. Soc. 44 102: Markoff processes are sometimes carelessly discussed in the literature as if they were the general case.
1939 Japanese Jrnl. Math. 16 47 (heading): Markoff process with an enumerable infinite number of possible states.
1950 W. Feller, Introd. Probability Theory I. xv. 337: A Markov process is the probabilistic analogue of the processes of classical mechanics, where the future development is completely determined by the present state and is independent of the way in which the present state has developed.
1956 Nature 4 Feb. 207/1: The most simple case is when all the atoms of the assembly are supposed to have no volume and no interactions (such as in an ideal gas). In that case it can be treated as a Markov process.
1992 C. J. Wells, Use Orthographic & Lexical Information (BNC): They model English text as a Markov process which allows transition probabilities to be assigned to various letter combinations and n-grams.

2. Markov chain, n. [compare German Markoffsche Kette (1933), French chaînes de A. Markoff (1927)] A Markov process in which there is a denumerable number of possible states or in which transitions between states occur at discrete time intervals; (also) one for which in addition the transition probabilities are constant (independent of time).
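The definitions above (a process whose future depends only on the present state, with transitions at discrete time steps) can be illustrated with a minimal simulation. The two states and their probabilities below are invented for this sketch; they are not taken from the entry.

```python
import random

# Hypothetical two-state chain ("sunny"/"rainy"), purely illustrative.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Pick the next state using only the current state (the Markov property)."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps; note that earlier history is never consulted."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because `step` receives only the current state, the simulation is memoryless by construction, matching Feller's 1950 characterisation that the future development "is independent of the way in which the present state has developed".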
Quotations:
1938 Amer. Math. Monthly 45 410: Professor Allen considered probabilities which depend on the outcome or on the probability of previous events. Certain classical problems are of this type, but the Markoff chain was the first to be intensively studied.
1942 Trans. Amer. Math. Soc. 52 37: Then pij(t) can be considered a transition probability of a Markoff chain: a system is supposed which can assume various numbered states, and pij(t) is the probability that the system is in the jth state at the end of a time interval of length t, if it was in the ith state at the beginning of the interval.
1953 J. L. Doob, Stochastic Processes v. 170: A Markov chain is defined as a Markov process..whose random variables can (with probability 1) only assume values in a certain finite or denumerably infinite set. The set is usually taken, for convenience, to be the integers 1,…, N (finite case) or the integers 1, 2,… (infinite case).
1953 J. L. Doob, Stochastic Processes v. 186: The problem of card mixing is a good example of the application of Markov chains.
1960 J. G. Kemeny & J. L. Snell, Finite Markov Chains ii. 25: A finite Markov chain is a finite Markov process such that the transition probabilities pij(n) do not depend on n.
1968 P. A. P. Moran, Introd. Probability Theory iii. 140: Thus a Markov chain observed in the reverse direction of time will be a Markov process. However, it will not in general be a Markov chain because the observed transition probabilities will not be independent of t.
1973 Manch. Sch. Econ. & Social Stud. 41 401 (heading): A Markov chain model of the benefits of participating in government training schemes.
1991 Acta Metallurgica et Materialia 39 2547/1: The Monte Carlo algorithm..produces a series of snapshot atomic configurations appearing along the simulated Markov chain.

3. Markov property, n. The characteristic property of Markov processes.

Quotations:
1944 Ann. Math. Statistics 15 237: The precise definition of this (Markoff) property is the following.
1950 W. Feller, Introd. Probability Theory I. xv. 337: A definition of the Markov property.
1962 J. Riordan, Stochastic Service Syst. iii. 28: The simplest infinite-server system is unique among its fellows in the possession of the Markov property that future changes are independent of the past.
1990 Q. Jrnl. Math. 41 109: The Markov property which plays such a large role in the theory of subfactors.

This entry has been updated (OED Third Edition, December 2000; most recently modified version published online March 2022). < n. 1938 |
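The n-step transition probabilities pij(t) that appear in the 1942 and 1960 quotations can be made concrete: for a time-homogeneous chain, pij(n) is the (i, j) entry of the n-th power of the one-step transition matrix. The 2×2 matrix below is invented for illustration.

```python
def mat_mul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(p, n):
    """n-step transition matrix: p_ij(n) = (P^n)_ij."""
    result = p
    for _ in range(n - 1):
        result = mat_mul(result, p)
    return result

# Hypothetical one-step transition matrix (rows sum to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

P3 = n_step(P, 3)
# Each row of P^3 still sums to 1: it is again a stochastic matrix.
print(P3)
```

This also shows why, as in Kemeny and Snell's 1960 definition, constancy of the pij(n) in n matters: only then do the multi-step probabilities factor through powers of a single matrix.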