Binary entropy function

[Figure: Entropy of a Bernoulli trial as a function of binary outcome probability, called the binary entropy function.]

In information theory, the binary entropy function, denoted $\operatorname{H}(p)$ or $\operatorname{H}_{\text{b}}(p)$, is defined as the entropy of a Bernoulli process with probability $p$ of one of two values. It is a special case of $\mathrm{H}(X)$, the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable $X$ that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive.


If $\Pr(X=1) = p$, then $\Pr(X=0) = 1-p$ and the entropy of $X$ (in shannons) is given by



$$\operatorname{H}(X) = \operatorname{H}_{\text{b}}(p) = -p\log_2 p - (1-p)\log_2(1-p),$$

where $0\log_2 0$ is taken to be 0. The logarithms in this formula are usually taken (as shown in the graph) to the base 2. See binary logarithm.
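As a minimal sketch (not from the original article; the function name is my own), the definition translates directly into Python, with the $0\log_2 0 = 0$ convention handled explicitly:

    import math

    def binary_entropy(p: float) -> float:
        """Binary entropy H_b(p) in shannons (bits); 0*log2(0) is taken as 0."""
        if p <= 0.0 or p >= 1.0:
            return 0.0  # covers the 0*log2(0) = 0 convention at p = 0 and p = 1
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    print(binary_entropy(0.5))   # 1.0 bit: a fair coin
    print(binary_entropy(0.25))  # ~0.811 bits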


When $p = \tfrac{1}{2}$, the binary entropy function attains its maximum value. This is the case of an unbiased coin flip.


$\operatorname{H}(p)$ is distinguished from the entropy function $\mathrm{H}(X)$ in that the former takes a single real number as a parameter whereas the latter takes a distribution or random variable as a parameter.
Sometimes the binary entropy function is also written as $\operatorname{H}_2(p)$.
However, it should not be confused with the Rényi entropy, which is denoted $\mathrm{H}_2(X)$.




Contents

  • 1 Explanation
  • 2 Derivative
  • 3 Taylor series
  • 4 See also





Explanation


In terms of information theory, entropy is considered to be a measure of the uncertainty in a message. To put it intuitively, suppose $p=0$. At this probability, the event is certain never to occur, so there is no uncertainty at all, and the entropy is 0. If $p=1$, the result is again certain, so the entropy is 0 here as well. When $p=1/2$, the uncertainty is at a maximum; if one were to place a fair bet on the outcome in this case, there is no advantage to be gained from prior knowledge of the probabilities. In this case, the entropy attains its maximum value of 1 bit. Intermediate values fall between these cases; for instance, if $p=1/4$, there is still some uncertainty about the outcome, but one can still predict it correctly more often than not, so the uncertainty measure, or entropy, is less than 1 full bit.
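Evaluating the formula at $p = 1/4$ makes this concrete (a quick check, not spelled out in the original text):

$$\operatorname{H}_{\text{b}}(\tfrac{1}{4}) = -\tfrac{1}{4}\log_2\tfrac{1}{4} - \tfrac{3}{4}\log_2\tfrac{3}{4} = \tfrac{1}{2} + \tfrac{3}{4}\log_2\tfrac{4}{3} \approx 0.811 \text{ bits}.$$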



Derivative


The derivative of the binary entropy function may be expressed as the negative of the logit function:



$$\frac{d}{dp}\operatorname{H}_{\text{b}}(p) = -\operatorname{logit}_2(p) = -\log_2\left(\frac{p}{1-p}\right).$$
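This follows by differentiating the definition term by term, using $\tfrac{d}{dp}\log_2 p = \tfrac{1}{p\ln 2}$ (a one-line check, not spelled out in the original):

$$\frac{d}{dp}\operatorname{H}_{\text{b}}(p) = -\log_2 p - \frac{1}{\ln 2} + \log_2(1-p) + \frac{1}{\ln 2} = -\log_2\left(\frac{p}{1-p}\right).$$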


Taylor series


The Taylor series of the binary entropy function in a neighborhood of 1/2 is


$$\operatorname{H}_{\text{b}}(p) = 1 - \frac{1}{2\ln 2}\sum_{n=1}^{\infty}\frac{(1-2p)^{2n}}{n(2n-1)}$$

for $0 \leq p \leq 1$.
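As a quick numerical sanity check (my own sketch, not part of the original article; function names are hypothetical), the truncated series can be compared against the closed form in Python:

    import math

    def binary_entropy(p: float) -> float:
        """Closed-form binary entropy in bits, with H_b(0) = H_b(1) = 0."""
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    def taylor_binary_entropy(p: float, terms: int = 25) -> float:
        """Partial sum of 1 - (1/(2 ln 2)) * sum_{n>=1} (1-2p)^(2n) / (n(2n-1))."""
        s = sum((1 - 2 * p) ** (2 * n) / (n * (2 * n - 1))
                for n in range(1, terms + 1))
        return 1.0 - s / (2.0 * math.log(2))

    print(binary_entropy(0.3))         # ~0.88129
    print(taylor_binary_entropy(0.3))  # agrees to many decimal places

Near $p = 1/2$ the expansion variable $(1-2p)^2$ is small, so only a few terms are needed; convergence is slowest near the endpoints.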



See also



  • Metric entropy

  • Information theory

  • Information entropy

  • Quantities of information



