  • If a probability density p is known, then image information content can be estimated, regardless of its interpretation, using the entropy H. The concept of entropy has roots in thermodynamics and statistical mechanics, but it took many years before entropy was related to information. The information-theoretic formulation of entropy comes from Shannon [Shannon, 1948] and is often called information entropy.

    An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution. The entropy can serve as a measure of 'disorder'. As the level of disorder rises, entropy increases and events are less predictable.

    The entropy is defined formally assuming a discrete random variable X with possible outcomes (also called states) x1, ..., xn. Let p(xk) be the probability of the outcome xk, k = 1, ..., n. Then the entropy is defined as

    H(X) = sum_{k=1..n} p(xk) log2(1/p(xk)) = - sum_{k=1..n} p(xk) log2 p(xk)

    The entropy of the random variable X is the sum, over all possible outcomes k of X, of the product of the probability of outcome xk with the logarithm of the inverse of the probability of xk. The quantity log2(1/p(xk)) is also called the surprisal of the outcome xk. The entropy of the discrete random variable X is thus the expected value of its outcomes' surprisal.
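    The definition above can be computed directly. The following is a minimal sketch (the function name `entropy` and the example distributions are illustrative, not from the text); note that outcomes with zero probability contribute nothing, since p log2(1/p) tends to 0 as p tends to 0:

    ```python
    import math

    def entropy(probs):
        """Shannon entropy, in bits, of a discrete distribution.

        probs is an iterable of outcome probabilities p(xk).
        Zero-probability outcomes are skipped: lim p->0 of p*log2(1/p) = 0.
        """
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    # A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
    print(entropy([0.5, 0.5]))   # -> 1.0
    # A biased coin is more predictable, so its entropy is lower than 1 bit.
    print(entropy([0.9, 0.1]))
    # A certain outcome carries no surprisal at all: 0 bits.
    print(entropy([1.0]))        # -> 0.0
    ```

    The biased coin illustrates the 'disorder' reading: the less uniform the distribution, the lower the entropy and the more predictable the outcome.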

    The base of the logarithm in this formula determines the unit in which entropy is measured. If this base is two then the entropy is given in bits. Recall that the probability density p(xk) needed to calculate the entropy is often estimated using a gray-level histogram in image analysis, Section 2.3.2.
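    As a sketch of the histogram-based estimate mentioned above (the function name `image_entropy` and the sample pixel lists are illustrative assumptions, not from the text): the gray-level histogram is normalized into relative frequencies, which then stand in for p(xk) in the entropy formula.

    ```python
    from collections import Counter
    import math

    def image_entropy(pixels):
        """Estimate entropy in bits per pixel from the normalized
        gray-level histogram of a flat iterable of gray levels (e.g. 0..255)."""
        counts = Counter(pixels)          # gray-level histogram
        n = sum(counts.values())          # total number of pixels
        # relative frequency c/n estimates p(xk); log2(n/c) is its surprisal
        return sum((c / n) * math.log2(n / c) for c in counts.values())

    # A constant image is fully predictable: entropy 0 bits/pixel.
    print(image_entropy([128] * 64))              # -> 0.0
    # Four gray levels used equally often need log2(4) = 2 bits/pixel.
    print(image_entropy([0, 64, 128, 192] * 16))  # -> 2.0
    ```

    This is the sense in which a low-entropy image is redundant: its pixels could, in principle, be coded with fewer bits per pixel than the raw gray-level depth.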

    Entropy measures the uncertainty about the realization of a random variable. For Shannon, it served as a proxy capturing the concept of information contained in a message, as opposed to the portion of the message that is strictly determined and predictable by inherent structures. For example, we shall use entropy to assess redundancy in an image for image compression (Chapter 14).

  • Original source: https://www.cnblogs.com/2008nmj/p/9218729.html