Main Text
A Markov chain is a sequence of probability vectors $x_0, x_1, x_2, \dots$, together with a stochastic matrix $P$, such that

$$x_1 = Px_0, \quad x_2 = Px_1, \quad x_3 = Px_2, \quad \dots$$

Thus the Markov chain is described by the first-order difference equation

$$x_{k+1} = Px_k, \qquad k = 0, 1, 2, \dots$$

When a Markov chain of vectors in $\mathbb{R}^n$ describes a system or a sequence of experiments, the entries in $x_k$ list, respectively, the probabilities that the system is in each of $n$ possible states, or the probabilities that the outcome of the experiment is one of $n$ possible outcomes. For this reason, $x_k$ is often called a state vector. So the population distribution in 2021 could be described by the vector $x_1 = Px_0$. Similarly, the distribution in 2022 is described by a vector $x_2$, where $x_2 = Px_1$.

What is a Markov matrix? An $n \times n$ matrix that satisfies the two properties below is called a Markov matrix:

- all of its entries are $\geq 0$;
- each of its columns adds to 1.

For example,

$$P = \begin{pmatrix} 0.95 & 0.03 \\ 0.05 & 0.97 \end{pmatrix}$$

is a Markov matrix: its entries are nonnegative and each column sums to 1. As you can see, the definition of the Markov matrix is closely related to Markov chains and probability theory. We can also derive the following result.

Lemma: every power of a Markov matrix is still a Markov matrix.

Proof of the lemma: we can use mathematical induction to complete the proof. In fact, we can prove a stronger result: the product of any two Markov matrices is still a Markov matrix (a short numerical check of this fact is sketched below).

Proof: Consider two Markov
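To make the definitions above concrete, here is a minimal numerical sketch in Python (using NumPy). It reuses the illustrative $2 \times 2$ matrix $P$ from above together with a hypothetical initial state vector $x_0$, both chosen only for illustration: the snippet iterates the difference equation $x_{k+1} = Px_k$ and then checks numerically that $P^2$ is still a Markov matrix, as the lemma claims.

```python
import numpy as np

# Illustrative 2-state Markov matrix: entries >= 0, each column sums to 1.
P = np.array([[0.95, 0.03],
              [0.05, 0.97]])

# Hypothetical initial state vector: probabilities of the two states.
x = np.array([0.60, 0.40])

# Iterate the first-order difference equation x_{k+1} = P x_k.
for k in range(3):
    x = P @ x
    print(f"x_{k + 1} = {x}")             # each x_k is still a probability vector

# Numerical check of the lemma: P^2 is still a Markov matrix.
P2 = P @ P
assert np.all(P2 >= 0)                     # entries remain nonnegative
assert np.allclose(P2.sum(axis=0), 1.0)    # each column still sums to 1
print("P^2 =", P2, sep="\n")
```

The column check reflects the key fact that multiplying a probability vector (here, each column of $P$) by a Markov matrix yields another probability vector.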