Markov, Andrei. See Andrei Markov.
Markov (Markoff), Andrei, Russian mathematician, 1865-1922.
Markov chain - number of steps or events in sequence.
Markov chaining - a theory used in psychiatry.
Markov process - a process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
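The Markov process definition above is often stated as the "Markov property." A minimal sketch in LaTeX, assuming a discrete-time chain with states X_0, X_1, ... (the discrete-time formulation is an assumption for illustration; the entry's wording also covers continuous-time processes):

% Markov property (discrete-time form assumed): the conditional distribution
% of the next state depends only on the present state, not on the earlier history.
\[
  P\bigl(X_{n+1} = x \mid X_n = x_n,\, X_{n-1} = x_{n-1},\, \dots,\, X_0 = x_0\bigr)
  = P\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]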