
Exercise 3.4: Entropy for Different PMFs

From LNTwww

Probability functions, each with  M=4  elements

The first row of the adjacent table shows the probability mass function denoted by  (a).

For this PMF  PX(X)=[0.1, 0.2, 0.3, 0.4]  the entropy is to be calculated in subtask  (1) :

$$H_{\rm a}(X)={\rm E}\left[\log_2 \frac{1}{P_X(X)}\right]=-{\rm E}\big[\log_2 P_X(X)\big].$$

Since the logarithm to the base  2  is used here, the pseudo-unit  "bit"  is to be added.
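This expectation can also be evaluated numerically. The following is a minimal Python sketch; the helper name `entropy` is ours, not part of the exercise:

```python
from math import log2

def entropy(pmf):
    """H(X) = E[log2(1/P_X(X))] in bits; terms with p = 0 contribute 0."""
    return sum(p * log2(1 / p) for p in pmf if p > 0)

print(round(entropy([0.1, 0.2, 0.3, 0.4]), 3))  # → 1.846
```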

In the further tasks, some probabilities are to be varied in each case in such a way that the greatest possible entropy results:

  • By suitably varying  p3  and  p4,  one arrives at the maximum entropy  Hb(X)  under the condition  p1=0.1  and  p2=0.2   ⇒   subtask  (2).
  • By varying  p2  and  p3 appropriately, one arrives at the maximum entropy  Hc(X)  under the condition  p1=0.1  and  p4=0.4   ⇒   subtask  (3).
  • In subtask  (4)  all four parameters are released for variation; they are to be chosen such that the entropy becomes maximum   ⇒   Hmax(X).





Hints:


Questions

1

To which entropy does the probability mass function  PX(X)=[0.1, 0.2, 0.3, 0.4] lead?

H_a(X) = ______ bit

2

Now let  P_X(X)=[0.1, 0.2, p3, p4]  in general.  What entropy is obtained if  p3  and  p4  are chosen as favorably as possible?

H_b(X) = ______ bit

3

Now let  P_X(X)=[0.1, p2, p3, 0.4].  What entropy is obtained if  p2  and  p3  are chosen as favorably as possible?

H_c(X) = ______ bit

4

What entropy is obtained if all probabilities  (p1, p2, p3, p4)  can be chosen as favorably as possible?

H_max(X) = ______ bit


Solution

(1)  With  PX(X)=[0.1, 0.2, 0.3, 0.4]  we get for the entropy:

$$H_{\rm a}(X)=0.1\cdot\log_2\frac{1}{0.1}+0.2\cdot\log_2\frac{1}{0.2}+0.3\cdot\log_2\frac{1}{0.3}+0.4\cdot\log_2\frac{1}{0.4}\;\underline{=1.846}.$$

Here (and in the other tasks) the pseudo-unit  "bit"  is to be added in each case.


(2)  The entropy  Hb(X)  can be represented as the sum of two parts  Hb1(X)  and  Hb2(X),  with:

$$H_{\rm b1}(X)=0.1\cdot\log_2\frac{1}{0.1}+0.2\cdot\log_2\frac{1}{0.2}=0.797,$$
$$H_{\rm b2}(X)=p_3\cdot\log_2\frac{1}{p_3}+(0.7-p_3)\cdot\log_2\frac{1}{0.7-p_3}.$$
  • The second contribution is maximal for  p3=p4=0.35,  in analogy to the maximum of the binary entropy function at equal probabilities.  
  • Thus one obtains:
$$H_{\rm b2}(X)=2\cdot p_3\cdot\log_2\frac{1}{p_3}=0.7\cdot\log_2\frac{1}{0.35}=1.060$$
$$\Rightarrow\; H_{\rm b}(X)=H_{\rm b1}(X)+H_{\rm b2}(X)=0.797+1.060\;\underline{=1.857}.$$
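The claim that  p3=p4=0.35  is optimal can be confirmed by a simple grid search; a Python sketch, where `entropy` is a hypothetical helper computing the entropy of a PMF in bits:

```python
from math import log2

def entropy(pmf):
    """Entropy of a PMF in bits (assumed helper, not from the exercise)."""
    return sum(p * log2(1 / p) for p in pmf if p > 0)

# Scan p3 over a fine grid in (0, 0.7); p4 = 0.7 - p3 keeps the sum at 1.
candidates = [i / 1000 for i in range(1, 700)]
best_p3 = max(candidates, key=lambda p3: entropy([0.1, 0.2, p3, 0.7 - p3]))

print(best_p3)                                   # → 0.35
print(round(entropy([0.1, 0.2, 0.35, 0.35]), 3)) # → 1.857
```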


(3)  Analogous to subtask  (2),  the conditions  p1=0.1  and  p4=0.4  yield the maximum for  p2=p3=0.25:

$$H_{\rm c}(X)=0.1\cdot\log_2\frac{1}{0.1}+2\cdot 0.25\cdot\log_2\frac{1}{0.25}+0.4\cdot\log_2\frac{1}{0.4}\;\underline{=1.861}.$$
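This optimum, too, can be checked with a small grid search over  p2  (with  p3=0.5−p2,  since  p1=0.1  and  p4=0.4  are fixed). A Python sketch, where `entropy` is a hypothetical helper computing the entropy of a PMF in bits:

```python
from math import log2

def entropy(pmf):
    """Entropy of a PMF in bits (assumed helper, not from the exercise)."""
    return sum(p * log2(1 / p) for p in pmf if p > 0)

# p1 = 0.1 and p4 = 0.4 are fixed, so p2 + p3 = 0.5.
candidates = [i / 1000 for i in range(1, 500)]
best_p2 = max(candidates, key=lambda p2: entropy([0.1, p2, 0.5 - p2, 0.4]))

print(best_p2)                                   # → 0.25
print(round(entropy([0.1, 0.25, 0.25, 0.4]), 3)) # → 1.861
```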


(4)  The maximum entropy for the symbol range  M=4  is obtained for equal probabilities, i.e. for  p1=p2=p3=p4=0.25:

$$H_{\rm max}(X)=\log_2 M\;\underline{=2}.$$
  • The difference of the entropies from  (4)  and  (3)  is  ΔH(X)=0.139 bit,  where
$$\Delta H(X)=1-0.1\cdot\log_2\frac{1}{0.1}-0.4\cdot\log_2\frac{1}{0.4}.$$
  • With the binary entropy function
$$H_{\rm bin}(p)=p\cdot\log_2\frac{1}{p}+(1-p)\cdot\log_2\frac{1}{1-p},$$
this can also be written as
$$\Delta H(X)=0.5\cdot\big[1-H_{\rm bin}(0.2)\big]=0.5\cdot[1-0.722]=0.139.$$
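As a quick numerical cross-check of this identity, the following Python sketch compares both ways of computing  ΔH(X);  the helper names `entropy` and `H_bin` are ours:

```python
from math import log2

def entropy(pmf):
    """Entropy of a PMF in bits (assumed helper, not from the exercise)."""
    return sum(p * log2(1 / p) for p in pmf if p > 0)

def H_bin(p):
    """Binary entropy function in bits."""
    return p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))

H_c   = entropy([0.1, 0.25, 0.25, 0.4])  # subtask (3)
H_max = entropy([0.25] * 4)              # subtask (4): log2(4) = 2

print(round(H_max - H_c, 3))             # → 0.139
print(round(0.5 * (1 - H_bin(0.2)), 3))  # → 0.139
```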