Processes, Markov
English MeSH Dictionary: Processes, Markov
[Entry Term] Processes, Markov
[MeSH Heading] Markov Chains
[English Definition] A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
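The memoryless property in the definition above can be sketched with a minimal simulation. The two-state "weather" chain and its transition probabilities below are hypothetical illustrations, not part of the MeSH entry; the point is that `next_state` looks only at the current state, never at the path that led to it.

```python
import random

# Hypothetical two-state chain: each state maps to (next_state, probability)
# pairs. The probabilities out of each state sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def next_state(current, rng=random):
    # The Markov property: the sample depends only on `current`,
    # with no reference to any earlier states.
    states, weights = zip(*TRANSITIONS[current])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, rng=random):
    # Generate a trajectory of `steps` transitions from `start`.
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because each step conditions only on the present state, knowing the full history of the trajectory adds nothing to the distribution of the next state, exactly as the definition states.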