Markov Chain

A Markov chain is a Markov process whose evolution is governed by a transition matrix.

Consider a state vector $\pi$ and a transition matrix $P$,

where

$$
P = \begin{pmatrix}
 & i & j & k \\
i & p_{ii} & p_{ij} & p_{ik} \\
j & p_{ji} & p_{jj} & p_{jk} \\
k & p_{ki} & p_{kj} & p_{kk}
\end{pmatrix}
$$

Each entry $p_{xy}$ is the probability of transitioning from state $x$ to state $y$, so every row sums to 1.

Let the current state vector be $\pi_0$. The next state vector is $\pi_1 = \pi_0 P$. After many transitions the chain may settle into a stationary distribution $\pi$ satisfying $\pi = \pi P$.
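The update $\pi_1 = \pi_0 P$ and the iteration toward a stationary distribution can be sketched with NumPy; the 3-state transition matrix below is a made-up example, not one from the post:

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

pi0 = np.array([1.0, 0.0, 0.0])  # start with all mass in the first state

# One transition: pi_1 = pi_0 P
pi1 = pi0 @ P  # equals the first row of P: [0.5, 0.3, 0.2]

# Iterate until the distribution stops changing (stationary state).
pi = pi0
for _ in range(1000):
    nxt = pi @ P
    if np.allclose(nxt, pi):
        break
    pi = nxt

# pi now (approximately) satisfies pi = pi P.
```

Repeated right-multiplication by $P$ is just power iteration on the row vector, which converges for an irreducible, aperiodic chain like the one above.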

Detailed balance, $\pi_i p_{ij} = \pi_j p_{ji}$ for all states $i$ and $j$, can hold only when the Markov chain is reversible. Detailed balance implies that $\pi$ is stationary, so a reversible chain that is also irreducible and aperiodic converges to $\pi$.
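A quick numerical check of detailed balance, using an assumed symmetric transition matrix (a symmetric $P$ is reversible with the uniform distribution as its stationary distribution):

```python
import numpy as np

# Symmetric transition matrix -> reversible chain, uniform stationary distribution.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.3, 0.4, 0.3],
    [0.1, 0.3, 0.6],
])
pi = np.full(3, 1.0 / 3.0)  # uniform stationary distribution

# Detailed balance: pi_i * p_ij == pi_j * p_ji for all i, j,
# i.e. the matrix of probability flows is symmetric.
flow = pi[:, None] * P
assert np.allclose(flow, flow.T)

# Detailed balance implies stationarity: pi P = pi.
assert np.allclose(pi @ P, pi)
```

Summing the balance condition over $i$ gives $\sum_i \pi_i p_{ij} = \pi_j \sum_i p_{ji} = \pi_j$, which is exactly the stationarity equation $\pi P = \pi$.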

Author: shixuan liu
Link: http://tedlsx.github.io/2020/06/09/markov/
Copyright Notice: All articles in this blog are licensed under CC BY-NC-SA 4.0 unless stated otherwise.
