3.1 From States to Markov Chains

Markov Chain

  • A Markov chain describes a discrete-time stochastic process observed at successive time steps. The transitions from one state to any state, including itself, are governed by a probability distribution.

  • The probability distribution of the state at time t is determined by, and only by, the states at the preceding m time steps

    This is an m-th order Markov chain (the formulas after this list make it precise)

  • A chain of random variables in which the next one depends (only) on the current one

    The current state depends on, and only on, the previous state: this is a first-order Markov chain

    Transition Probability: the probability of moving from one state to another (or staying in the same state) in a single step (see the formulas and the sketch after this list)

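To make the bullets above precise, the Markov property can be written as conditional distributions. The notation here (X_t for the state at time t, a_ij for transition probabilities) is assumed for illustration rather than taken from the original notes:

$$
P(X_t \mid X_{t-1}, X_{t-2}, \dots, X_1) = P(X_t \mid X_{t-1}, \dots, X_{t-m}) \quad \text{(m-th order Markov chain)}
$$

$$
P(X_t \mid X_{t-1}, \dots, X_1) = P(X_t \mid X_{t-1}) \quad \text{(first-order Markov chain)}
$$

$$
a_{ij} = P(X_{t+1} = j \mid X_t = i), \qquad \sum_{j} a_{ij} = 1 \quad \text{(transition probability)}
$$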
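As a rough sketch of how transition probabilities drive a first-order chain, the Python snippet below samples a state sequence from a transition matrix. The two weather states and the numbers are made up for the example and are not part of the notes; each row of the matrix is the distribution over next states given the current state.

```python
import numpy as np

# Illustrative states and transition matrix (not from the original notes).
states = ["sunny", "rainy"]

# transition_matrix[i][j] = P(next state = j | current state = i);
# each row is a probability distribution and sums to 1.
transition_matrix = np.array([
    [0.8, 0.2],   # from "sunny"
    [0.4, 0.6],   # from "rainy"
])

def sample_chain(start: int, length: int, rng: np.random.Generator) -> list[str]:
    """Sample a state sequence of the given length, starting from state index `start`."""
    current = start
    sequence = [states[current]]
    for _ in range(length - 1):
        # The next state depends only on the current state (first-order Markov property).
        current = rng.choice(len(states), p=transition_matrix[current])
        sequence.append(states[current])
    return sequence

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(sample_chain(start=0, length=10, rng=rng))
```

The design point to notice is that the sampling loop never looks further back than `current`; an m-th order chain would instead condition on the last m states (for example, by keeping a tuple of the previous m state indices as the key into a larger transition table).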