  • KL Divergence

    https://blog.csdn.net/guolindonggld/article/details/79736508

    https://www.jianshu.com/p/43318a3dc715?from=timeline&isappinstalled=0

    KL divergence (Kullback-Leibler divergence), also called relative entropy, measures how much one probability distribution differs from another.

    Discrete case
    D_{KL}(P \| Q) = \sum_{i=1}^{n} P_i \log \frac{P_i}{Q_i}
    For example, suppose the random variable X ∼ P takes the values 1, 2, 3 with probabilities [0.2, 0.4, 0.4], and the random variable Y ∼ Q takes the values 1, 2, 3 with probabilities [0.4, 0.2, 0.4]. Then:

    D(P \| Q) = 0.2 \times \log\frac{0.2}{0.4} + 0.4 \times \log\frac{0.4}{0.2} + 0.4 \times \log\frac{0.4}{0.4} = 0.2 \times (-0.69) + 0.4 \times 0.69 + 0.4 \times 0 = 0.138
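    To double-check the arithmetic, the sum can also be evaluated directly from the discrete formula; kl_divergence below is just an illustrative helper, not a library function:

    import math

    # Illustrative helper: evaluate D_KL(P || Q) = sum_i P_i * log(P_i / Q_i)
    # using the natural logarithm; terms with P_i = 0 contribute nothing.
    def kl_divergence(P, Q):
        return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

    P = [0.2, 0.4, 0.4]
    Q = [0.4, 0.2, 0.4]
    print(kl_divergence(P, Q))  # ≈ 0.1386, matching the hand calculation above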
    In Python, the discrete KL divergence can be computed with SciPy:

    from scipy import stats

    # stats.entropy(pk, qk) with two arguments returns the KL divergence
    # sum(pk * log(pk / qk)), using the natural logarithm by default.
    P = [0.2, 0.4, 0.4]
    Q = [0.4, 0.2, 0.4]
    stats.entropy(P, Q)  # 0.13862943611198905

    P = [0.2, 0.4, 0.4]
    Q = [0.5, 0.1, 0.4]
    stats.entropy(P, Q)  # ≈ 0.3713 (reversing the order, stats.entropy(Q, P), gives ≈ 0.3195)

    P = [0.2, 0.4, 0.4]
    Q = [0.3, 0.3, 0.4]
    stats.entropy(P, Q)  # ≈ 0.0340 (stats.entropy(Q, P) gives ≈ 0.0353)
    Properties of KL divergence:

    D_{KL}(P \| Q) \ge 0, i.e. non-negativity.
    D_{KL}(P \| Q) \ne D_{KL}(Q \| P), i.e. asymmetry (a quick numerical check follows below).
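    As a minimal check of the asymmetry, reusing SciPy's stats.entropy with the second P/Q pair from the example above:

    from scipy import stats

    # Reversing the arguments changes the value, so the divergence is not symmetric.
    P = [0.2, 0.4, 0.4]
    Q = [0.5, 0.1, 0.4]
    print(stats.entropy(P, Q))  # D_KL(P || Q) ≈ 0.3713
    print(stats.entropy(Q, P))  # D_KL(Q || P) ≈ 0.3195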
    Continuous case
    D_{KL}(P \| Q) = \int_{-\infty}^{+\infty} p(x) \log \frac{p(x)}{q(x)} \, dx
    (I have not used this much yet; details to be filled in later.)
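    As a rough sketch of the continuous case, assuming SciPy's stats.norm and scipy.integrate.quad, the integral can be approximated numerically and checked against the known closed form for two univariate Gaussians:

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    # Example: D_KL between two univariate Gaussians, P = N(0, 1) and Q = N(1, 2^2).
    p = stats.norm(loc=0.0, scale=1.0)
    q = stats.norm(loc=1.0, scale=2.0)

    # Numerically evaluate the integral of p(x) * log(p(x) / q(x)) over the real line.
    integrand = lambda x: p.pdf(x) * (p.logpdf(x) - q.logpdf(x))
    kl_numeric, _ = quad(integrand, -np.inf, np.inf)

    # Closed form for Gaussians: log(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 * s2^2) - 1/2.
    kl_closed = np.log(2.0) + (1.0 + 1.0) / (2 * 4.0) - 0.5

    print(kl_numeric, kl_closed)  # both ≈ 0.4431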
    ---------------------
    Author: 加勒比海鲜
    Source: CSDN
    Original: https://blog.csdn.net/guolindonggld/article/details/79736508
    Copyright notice: this is the blogger's original article; please include a link to it when reposting.

  • Original post: https://www.cnblogs.com/AIBigTruth/p/10481991.html