  • 2018 10-708 (CMU) Probabilistic Graphical Models {Lecture 10} [HMM and CRF]

     

     

     

     

     

    Between tags and words there is table 1 (the emission probabilities).

    Between tags there is table 2 (the transition probabilities).

    Combine the two tables to get p(x, y), from which the tagging results follow.
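A minimal sketch of combining the two tables (all numbers here are hypothetical toy values, not from the lecture): the transition table gives p(tag_t | tag_{t-1}), the emission table gives p(word_t | tag_t), and the HMM joint is their product along the sequence.

```python
# Toy HMM for tagging (hypothetical numbers): transition table (tag -> tag)
# and emission table (tag -> word); the joint p(x, y) is their product.

trans = {  # table 2: p(next_tag | tag), each row sums to 1
    "<s>": {"N": 0.7, "V": 0.3},
    "N":   {"N": 0.4, "V": 0.6},
    "V":   {"N": 0.8, "V": 0.2},
}
emit = {   # table 1: p(word | tag)
    "N": {"dogs": 0.5, "bark": 0.5},
    "V": {"dogs": 0.1, "bark": 0.9},
}

def joint(words, tags):
    """p(x, y) = prod_t p(y_t | y_{t-1}) * p(x_t | y_t)."""
    p, prev = 1.0, "<s>"
    for w, t in zip(words, tags):
        p *= trans[prev][t] * emit[t][w]
        prev = t
    return p

print(joint(["dogs", "bark"], ["N", "V"]))  # 0.7*0.5 * 0.6*0.9 = 0.189
```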

     

    MRF: the factors in the tables are not necessarily probabilities.

    BN: the factors must be (conditional) probabilities. => A BN is easier to learn than an MRF, since its factors are locally normalized and no global partition function is needed.

     

     Maximum-Entropy Markov Model (MEMM)

     

     

     

     

    Marginals:

    1) forward: alpha_t(j) = p(x_1, ..., x_t, y_t = j), computed by a left-to-right recursion.

     

     2) Belief: p(y_t = j | x) ∝ alpha_t(j) * beta_t(j), combining the forward and backward messages.
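The forward recursion and the belief (marginal) computation can be sketched as follows, on a toy two-state HMM with hypothetical numbers:

```python
import numpy as np

# Forward-backward on a toy HMM (all parameters hypothetical):
#   alpha_t(j) = p(x_1..x_t, y_t = j)          (forward)
#   beta_t(j)  = p(x_{t+1}..x_T | y_t = j)     (backward)
#   belief:      p(y_t = j | x) ∝ alpha_t(j) * beta_t(j)

A  = np.array([[0.7, 0.3],    # transition p(y_t | y_{t-1})
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],    # emission p(x_t | y_t) over 2 symbols
               [0.2, 0.8]])
pi = np.array([0.5, 0.5])     # initial distribution
obs = [0, 1, 0]               # observed symbol indices

T, S = len(obs), len(pi)
alpha = np.zeros((T, S))
beta  = np.zeros((T, S))

alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):                       # forward pass
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

beta[-1] = 1.0
for t in range(T - 2, -1, -1):              # backward pass
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

gamma = alpha * beta                        # unnormalized beliefs
gamma /= gamma.sum(axis=1, keepdims=True)   # p(y_t | x_{1:T}), rows sum to 1
print(gamma)
```

Note that alpha[-1].sum() gives the likelihood p(x), so the same pass that yields the marginals also evaluates the observation sequence.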

     

    The HMM is generative: it models the joint probability P(x, y),

    but tagging only needs the conditional P(y|x).


    https://cedar.buffalo.edu/~srihari/CSE574/Discriminative-Generative.pdf

     

     


     

    Full observation!

    (like offline SLAM?)

     

     

    Biased! Because each step is normalized locally over its own observation — the label bias problem of the MEMM.
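A toy illustration of this bias (all scores hypothetical): in an MEMM each step is a locally normalized distribution p(y_t | y_{t-1}, x_t), so a state with a single outgoing transition assigns it probability 1 no matter what is observed — the evidence at that step is simply ignored.

```python
# Label bias sketch (hypothetical scores): per-state normalization means a
# state with one successor passes all its mass on, regardless of x.

def memm_step(prev_state, x, scores):
    """Locally normalized distribution over next states given observation x."""
    s = scores[prev_state]                 # unnormalized score functions of x
    z = sum(f(x) for f in s.values())
    return {k: f(x) / z for k, f in s.items()}

scores = {
    # State "A" has a single successor: the observation cannot matter.
    "A": {"A2": lambda x: 1.0},
    # State "B" branches, so the observation shifts the distribution.
    "B": {"B2": lambda x: 5.0 if x == "hot" else 1.0,
          "C2": lambda x: 1.0},
}

print(memm_step("A", "hot", scores))   # {'A2': 1.0} -- evidence ignored
print(memm_step("A", "cold", scores))  # {'A2': 1.0} -- identical
print(memm_step("B", "hot", scores))   # mass shifts toward 'B2'
```

A CRF avoids this by normalizing once, globally over whole sequences, instead of per step.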

     

     

     

     

    P(x_2|x_1) can be relabeled as a potential Psi(x_1, x_2) — the factor then no longer needs to be a conditional probability.

    If Y_1 and Y_{n-2} are somehow connected (a long-range edge), what should be changed?
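Since the potentials Psi need not be probabilities, a linear-chain model built from them must be normalized globally by a partition function Z. A small sketch (hypothetical potentials) checks that the forward (matrix-product) recursion yields the same Z as brute-force enumeration:

```python
import itertools
import numpy as np

# Linear-chain CRF sketch (hypothetical potentials): unnormalized score of a
# sequence is prod_t Psi(y_{t-1}, y_t); a single global Z normalizes it:
#   p(y | x) = (1/Z) * prod_t Psi(y_{t-1}, y_t)

Psi = np.array([[2.0, 0.5],   # arbitrary nonnegative potentials, rows need
                [1.0, 3.0]])  # NOT sum to 1 (unlike HMM transition rows)
T, S = 4, 2                   # sequence length, number of states

# Brute force: sum the unnormalized score over all S**T state sequences.
Z_brute = sum(np.prod([Psi[y[t - 1], y[t]] for t in range(1, T)])
              for y in itertools.product(range(S), repeat=T))

# Forward recursion: the same Z in O(T * S^2) via repeated matrix products.
alpha = np.ones(S)
for _ in range(T - 1):
    alpha = alpha @ Psi
Z_forward = alpha.sum()

print(Z_brute, Z_forward)     # the two values agree
```

The same forward machinery from the HMM thus carries over; only the local normalization constraint on the tables is dropped.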

     

     

     

    How close the model is to the truth.

     

     

     

     

     


     

     

  • Original link: https://www.cnblogs.com/ecoflex/p/10231319.html