Exercise 3.10: Mutual Information at the BSC

[Figure: BSC model considered]

We consider the  Binary Symmetric Channel  (BSC).  The following parameter values apply throughout the exercise:

  •  Crossover probability:   $\varepsilon = 0.1$,
  •  Probability for  $0$:   $p_0 = 0.2$,
  •  Probability for  $1$:   $p_1 = 0.8$.


Thus the probability mass function of the source is   $P_X(X) = (0.2,\ 0.8)$,   and the source entropy is:

$$H(X) = p_0 \cdot \log_2 \frac{1}{p_0} + p_1 \cdot \log_2 \frac{1}{p_1} = H_{\rm bin}(0.2) = 0.7219\ {\rm bit}.$$
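To make the numbers reproducible, here is a minimal Python sketch of the binary entropy function (the helper name h_bin is our choice, not part of the exercise):

```python
import math

def h_bin(p: float) -> float:
    """Binary entropy function in bit, defined for 0 < p < 1."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p0 = 0.2
print(f"H(X) = H_bin({p0}) = {h_bin(p0):.4f} bit")  # -> 0.7219 bit
```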

The task is to determine:

  •  the probability mass function of the sink:
$$P_Y(Y) = \big(P_Y(0),\ P_Y(1)\big),$$
  •  the joint probability mass function:
$$P_{XY}(X,Y) = \begin{pmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{pmatrix},$$
  •  the mutual information:
$$I(X;Y) = {\rm E}\left[\log_2 \frac{P_{XY}(X,Y)}{P_X(X) \cdot P_Y(Y)}\right],$$
  •  the equivocation:
$$H(X|Y) = {\rm E}\left[\log_2 \frac{1}{P_{X|Y}(X|Y)}\right],$$
  •  the irrelevance:
$$H(Y|X) = {\rm E}\left[\log_2 \frac{1}{P_{Y|X}(Y|X)}\right].$$






Questions

(1)   Calculate the joint probabilities  $P_{XY}(X,Y)$:

$P_{XY}(0,0) \ = \ $

$P_{XY}(0,1) \ = \ $

$P_{XY}(1,0) \ = \ $

$P_{XY}(1,1) \ = \ $

(2)   What is the probability mass function  $P_Y(Y)$  of the sink?

$P_Y(0) \ = \ $

$P_Y(1) \ = \ $

(3)   What is the value of the mutual information  $I(X;Y)$?

$I(X;Y) \ = \ $  bit

(4)   Which value results for the equivocation  $H(X|Y)$?

$H(X|Y) \ = \ $  bit

(5)   Which statement is true for the sink entropy  $H(Y)$?

  •  $H(Y)$  is never greater than  $H(X)$.
  •  $H(Y)$  is never smaller than  $H(X)$.

(6)   Which statement is true for the irrelevance  $H(Y|X)$?

  •  $H(Y|X)$  is never larger than the equivocation  $H(X|Y)$.
  •  $H(Y|X)$  is never smaller than the equivocation  $H(X|Y)$.


Solution

(1)   In general, and with the numerical values  $p_0 = 0.2$  and  $\varepsilon = 0.1$,  the quantities sought are:

$$P_{XY}(0,0) = p_0 \cdot (1 - \varepsilon) \ \underline{= 0.18}, \qquad P_{XY}(0,1) = p_0 \cdot \varepsilon \ \underline{= 0.02},$$
$$P_{XY}(1,0) = p_1 \cdot \varepsilon \ \underline{= 0.08}, \qquad P_{XY}(1,1) = p_1 \cdot (1 - \varepsilon) \ \underline{= 0.72}.$$
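As a quick cross-check, a small Python sketch of this computation (variable names are our own, not LNTwww code):

```python
# Joint probabilities of the BSC as derived in (1).
p0, p1, eps = 0.2, 0.8, 0.1

P_XY = [[p0 * (1 - eps), p0 * eps],        # P_XY(0,0), P_XY(0,1)
        [p1 * eps,       p1 * (1 - eps)]]  # P_XY(1,0), P_XY(1,1)

print([[round(p, 2) for p in row] for row in P_XY])  # [[0.18, 0.02], [0.08, 0.72]]
```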


(2)   In general:

$$P_Y(Y) = \big[{\rm Pr}(Y=0),\ {\rm Pr}(Y=1)\big] = (p_0,\ p_1) \cdot \begin{pmatrix} 1-\varepsilon & \varepsilon \\ \varepsilon & 1-\varepsilon \end{pmatrix}.$$

This gives the following numerical values:

$${\rm Pr}(Y=0) = p_0 \cdot (1-\varepsilon) + p_1 \cdot \varepsilon = 0.2 \cdot 0.9 + 0.8 \cdot 0.1 \ \underline{= 0.26},$$
$${\rm Pr}(Y=1) = p_0 \cdot \varepsilon + p_1 \cdot (1-\varepsilon) = 0.2 \cdot 0.1 + 0.8 \cdot 0.9 \ \underline{= 0.74}.$$
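The same vector-matrix product, sketched in Python as a cross-check:

```python
# Sink PMF: (p0, p1) multiplied by the BSC transition matrix P(Y|X).
p0, p1, eps = 0.2, 0.8, 0.1
T = [[1 - eps, eps],
     [eps,     1 - eps]]

P_Y = [p0 * T[0][j] + p1 * T[1][j] for j in range(2)]
print([round(p, 2) for p in P_Y])  # [0.26, 0.74]
```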


(3)   For the mutual information, according to the definition, with  $p_0 = 0.2$,  $p_1 = 0.8$  and  $\varepsilon = 0.1$:

$$I(X;Y) = {\rm E}\left[\log_2 \frac{P_{XY}(X,Y)}{P_X(X) \cdot P_Y(Y)}\right]$$

$$\Rightarrow \quad I(X;Y) = 0.18 \cdot \log_2 \frac{0.18}{0.2 \cdot 0.26} + 0.02 \cdot \log_2 \frac{0.02}{0.2 \cdot 0.74} + 0.08 \cdot \log_2 \frac{0.08}{0.8 \cdot 0.26} + 0.72 \cdot \log_2 \frac{0.72}{0.8 \cdot 0.74} \ \underline{= 0.3578\ {\rm bit}}.$$
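The expectation can be evaluated directly from the joint PMF; a minimal Python sketch (our own variable names):

```python
import math

# Mutual information as the expectation over all four (x, y) pairs.
P_XY = [[0.18, 0.02], [0.08, 0.72]]
P_X = [sum(row) for row in P_XY]                             # (0.2, 0.8)
P_Y = [sum(P_XY[i][j] for i in range(2)) for j in range(2)]  # (0.26, 0.74)

I_XY = sum(P_XY[i][j] * math.log2(P_XY[i][j] / (P_X[i] * P_Y[j]))
           for i in range(2) for j in range(2))
print(f"I(X;Y) = {I_XY:.4f} bit")  # -> 0.3578 bit
```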


(4)   With the source entropy  $H(X)$  given, we obtain for the equivocation:

$$H(X|Y) = H(X) - I(X;Y) = 0.7219 - 0.3578 \ \underline{= 0.3642\ {\rm bit}}.$$

  •  However, one could also apply the general definition with the inference probabilities  $P_{X|Y}(\cdot)$:

$$H(X|Y) = {\rm E}\left[\log_2 \frac{1}{P_{X|Y}(X|Y)}\right] = {\rm E}\left[\log_2 \frac{P_Y(Y)}{P_{XY}(X,Y)}\right].$$

  •  In the example, this calculation rule yields the same result  $H(X|Y) = 0.3642\ {\rm bit}$:

$$H(X|Y) = 0.18 \cdot \log_2 \frac{0.26}{0.18} + 0.02 \cdot \log_2 \frac{0.74}{0.02} + 0.08 \cdot \log_2 \frac{0.26}{0.08} + 0.72 \cdot \log_2 \frac{0.74}{0.72}.$$
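Sketched in Python, the general calculation rule gives the same value:

```python
import math

# Equivocation H(X|Y) = E[log2(P_Y(y) / P_XY(x, y))].
P_XY = [[0.18, 0.02], [0.08, 0.72]]
P_Y = [0.26, 0.74]

H_XgY = sum(P_XY[i][j] * math.log2(P_Y[j] / P_XY[i][j])
            for i in range(2) for j in range(2))
print(f"H(X|Y) = {H_XgY:.4f} bit")  # -> 0.3642 bit
```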


(5)   The second proposed solution is correct:

  •  With disturbed transmission  $(\varepsilon > 0)$,  the uncertainty regarding the sink is always greater than the uncertainty regarding the source.  Numerically, one obtains here:

$$H(Y) = H_{\rm bin}(0.26) = 0.8268\ {\rm bit}.$$

  •  With error-free transmission  $(\varepsilon = 0)$,  on the other hand,  $P_Y(\cdot) = P_X(\cdot)$  and thus  $H(Y) = H(X)$  would apply.
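A short numerical check with the binary entropy helper from above (deviations in the last digit are rounding effects):

```python
import math

def h_bin(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(f"H(Y) = {h_bin(0.26):.4f} bit")  # -> 0.8267 bit (quoted above as 0.8268)
print(f"H(X) = {h_bin(0.20):.4f} bit")  # -> 0.7219 bit, hence H(Y) > H(X)
```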


(6)   Here, too, the second proposed solution is correct:

  •  Because of  $I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$,  $H(Y|X)$  exceeds  $H(X|Y)$  by exactly the amount by which  $H(Y)$  exceeds  $H(X)$:

$$H(Y|X) = H(Y) - I(X;Y) = 0.8268 - 0.3578 = 0.4690\ {\rm bit}.$$

  •  Direct calculation gives the same result  $H(Y|X) = 0.469\ {\rm bit}$:

$$H(Y|X) = {\rm E}\left[\log_2 \frac{1}{P_{Y|X}(Y|X)}\right] = 0.18 \cdot \log_2 \frac{1}{0.9} + 0.02 \cdot \log_2 \frac{1}{0.1} + 0.08 \cdot \log_2 \frac{1}{0.1} + 0.72 \cdot \log_2 \frac{1}{0.9}.$$

  •  For the BSC, this is just the binary entropy of the crossover probability:  $H(Y|X) = H_{\rm bin}(\varepsilon) = H_{\rm bin}(0.1) = 0.469\ {\rm bit}$.
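The direct calculation, sketched in Python; the result coincides with $H_{\rm bin}(\varepsilon)$:

```python
import math

# Irrelevance H(Y|X): expectation of log2(1 / P_YX(y|x)) over the joint PMF.
P_XY  = [[0.18, 0.02], [0.08, 0.72]]
P_YgX = [[0.9, 0.1], [0.1, 0.9]]  # BSC transition probabilities

H_YgX = sum(P_XY[i][j] * math.log2(1 / P_YgX[i][j])
            for i in range(2) for j in range(2))
print(f"H(Y|X) = {H_YgX:.4f} bit")  # -> 0.4690 bit = H_bin(0.1)
```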