Markov chain
Collins Dictionary
1. N a sequence of events, the probability of each of which depends only on the event immediately preceding it 马尔可夫链 [statistics]
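The defining property above (each event's probability depends only on the immediately preceding event) can be illustrated with a minimal simulation. This is a sketch, not part of the dictionary entry: the two weather states and their transition probabilities are invented for illustration.

```python
import random

# Illustrative two-state Markov chain (assumed example states/probabilities).
# Each state maps to a list of (next_state, probability) pairs.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state (the Markov property)."""
    states = [s for s, _ in TRANSITIONS[state]]
    weights = [p for _, p in TRANSITIONS[state]]
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Generate a chain of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `step` never looks at earlier history, only at its single `state` argument; that restriction is exactly what makes the sequence a Markov chain.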