Exercise 4.5Z: Again Mutual Information

{{quiz-Header|Buchseite=Information_Theory/AWGN_Channel_Capacity_for_Continuous_Input
}}
[[File:EN_Inf_Z_4_5.png|right|frame|Given joint PDF and <br>graph of differential entropies]]
The graph above shows the joint PDF&nbsp; $f_{XY}(x, y)$&nbsp; to be considered in this exercise,&nbsp; which is identical to the "green" constellation in&nbsp; [[Aufgaben:Exercise_4.5:_Mutual_Information_from_2D-PDF|Exercise 4.5]].
* In this sketch,&nbsp; $f_{XY}(x, y)$&nbsp; is enlarged by a factor of&nbsp; $3$&nbsp; in the&nbsp; $y$&ndash;direction.
*In the domain of definition highlighted in green, the joint PDF has the constant value&nbsp; $C  = 1/F$,&nbsp; where&nbsp; $F$&nbsp; denotes the area of the parallelogram.
  
  
In Exercise 4.5 the following differential entropies were calculated:
 
:$$h(X) \  =  \  {\rm log} \hspace{0.1cm} (\hspace{0.05cm}A\hspace{0.05cm})\hspace{0.05cm},$$
:$$h(Y)  =    {\rm log} \hspace{0.1cm} (\hspace{0.05cm}B \cdot \sqrt{ {\rm e } } \hspace{0.05cm})\hspace{0.05cm},$$
:$$h(XY)  =    {\rm log} \hspace{0.1cm} (\hspace{0.05cm}F \hspace{0.05cm}) =  {\rm log} \hspace{0.1cm} (\hspace{0.05cm}A \cdot B \hspace{0.05cm})\hspace{0.05cm}.$$
In this exercise, the parameter values&nbsp; $A = {\rm e}^{-2}$&nbsp; and&nbsp; $B = {\rm e}^{0.5}$&nbsp; are now to be used.
  
According to the above diagram, the conditional differential entropies&nbsp; $h(Y|X)$&nbsp; and&nbsp; $h(X|Y)$&nbsp; are now also to be determined, and their relation to the mutual information&nbsp; $I(X; Y)$&nbsp; is to be stated.
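As a reminder,&nbsp; the general relations linking these quantities&nbsp; (they are used again in the solution below)&nbsp; read:

:$$I(X;Y) = h(X) + h(Y) - h(XY) = h(Y) - h(Y \hspace{-0.05cm}\mid \hspace{-0.05cm} X) = h(X) - h(X \hspace{-0.05cm}\mid \hspace{-0.05cm} Y)\hspace{0.05cm}.$$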
  
  
  
  
 
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/AWGN–Kanalkapazität_bei_wertkontinuierlichem_Eingang|AWGN channel capacity with continuous input]].
*If the results are to be given in "nat", this is achieved with "log" &nbsp;&#8658;&nbsp; "ln".  
*If the results are to be given in "bit", this is achieved with "log" &nbsp;&#8658;&nbsp; "log<sub>2</sub>".  
 
 
   
 
   
  
  
  
===Questions===
  
 
<quiz display=simple>
  
{State the following information-theoretic quantities in&nbsp; "nat":
 
|type="{}"}
 
$h(X) \ = \ $  { -2.06--1.94 } $\ \rm nat$
  
  
{What are the same quantities with the pseudo&ndash;unit&nbsp; "bit"?
 
|type="{}"}
 
$h(X) \ = \ $ { -2.986--2.786  } $\ \rm bit$
  
  
{Calculate the conditional differential entropy&nbsp; $h(Y|X)$.
 
|type="{}"}
 
$h(Y|X) \ = \ $ { 0.5 3% } $\ \rm nat$
  
  
{Calculate the conditional differential entropy&nbsp; $h(X|Y)$.
 
|type="{}"}
 
$h(X|Y) \ = \ $ { -2.6--2.4 } $\ \rm nat$
  
  
{Which of the following quantities are never negative?
 
|type="[]"}
+ Both &nbsp;$H(X)$&nbsp; and &nbsp;$H(Y)$&nbsp; in the discrete case.
+ The mutual information &nbsp;$I(X; Y)$&nbsp; in the discrete case.
+ The mutual information &nbsp;$I(X; Y)$&nbsp; in the continuous case.
- Both &nbsp;$h(X)$&nbsp; and &nbsp;$h(Y)$&nbsp;  in the continuous case.
- Both &nbsp;$h(X|Y)$&nbsp; and &nbsp;$h(Y|X)$&nbsp; in the continuous case.
- The joint entropy &nbsp;$h(XY)$&nbsp; in the continuous case.
 
  
 
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; Since the results are required in&nbsp; "nat",&nbsp; it is convenient to use the natural logarithm:
*The random variable&nbsp; $X$&nbsp; is uniformly distributed between&nbsp; $0$&nbsp; and&nbsp; $1/{\rm e}^2={\rm e}^{-2}$:
 
:$$h(X) =  {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}{\rm e}^{-2}\hspace{0.05cm})\hspace{0.15cm}\underline{= -2\,{\rm nat}}\hspace{0.05cm}. $$
*The random variable&nbsp; $Y$&nbsp; is triangularly distributed between&nbsp; $\pm{\rm e}^{0.5}=\pm\sqrt{\rm e}$:
 
:$$h(Y) =  {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}\sqrt{ {\rm e} } \cdot \sqrt{ {\rm e} } ) =  {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}{\rm e}\hspace{0.05cm})\hspace{0.15cm}\underline{= +1\,{\rm nat}}\hspace{0.05cm}.$$
* The area of the parallelogram is given by
 
:$$F = A \cdot B = {\rm e}^{-2} \cdot {\rm e}^{0.5} = {\rm e}^{-1.5}\hspace{0.05cm}.$$
*Thus, the 2D-PDF in the area highlighted in green has constant height&nbsp; $C = 1/F ={\rm e}^{1.5}$&nbsp; and we obtain for the joint entropy:
 
:$$h(XY) =  {\rm ln} \hspace{0.1cm} (F) =  {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}{\rm e}^{-1.5}\hspace{0.05cm})\hspace{0.15cm}\underline{= -1.5\,{\rm nat}}\hspace{0.05cm}.$$
*From this we obtain for the mutual information:
 
:$$I(X;Y) = h(X) + h(Y) - h(XY) = -2 \,{\rm nat} + 1 \,{\rm nat} - (-1.5 \,{\rm nat} ) \hspace{0.15cm}\underline{= 0.5\,{\rm nat}}\hspace{0.05cm}.$$
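The closed-form results of this subtask can be checked with a few lines of Python&nbsp; (only a sketch, not part of the original exercise;&nbsp; it assumes NumPy and simply evaluates the formulas quoted above):

<pre>
import numpy as np

A = np.exp(-2.0)          # width of the uniform PDF of X
B = np.exp(0.5)           # half-width of the triangular PDF of Y
F = A * B                 # area of the parallelogram

h_X  = np.log(A)          # uniform on (0, A):       h(X)  = ln(A)        = -2.0 nat
h_Y  = np.log(B) + 0.5    # triangular on (-B, +B):  h(Y)  = ln(B) + 1/2  = +1.0 nat
h_XY = np.log(F)          # constant 1/F on area F:  h(XY) = ln(F)        = -1.5 nat
I_XY = h_X + h_Y - h_XY   # mutual information                            = +0.5 nat

print(h_X, h_Y, h_XY, I_XY)
</pre>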
  
  
  
'''(2)'''&nbsp; In general, the relation&nbsp; $\log_2(x) = \ln(x)/\ln(2)$ holds.&nbsp; Thus, using the results of subtask&nbsp; '''(1)''', we obtain:
 
:$$h(X) \  =  \  \frac{-2\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= -2.886\,{\rm bit}}\hspace{0.05cm},$$
:$$h(Y) \  =  \  \frac{+1\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= +1.443\,{\rm bit}}\hspace{0.05cm},$$
:$$h(XY) \  =  \  \frac{-1.5\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= -2.164\,{\rm bit}}\hspace{0.05cm},$$
:$$I(X;Y) \  =  \  \frac{0.5\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= 0.721\,{\rm bit}}\hspace{0.05cm}.$$
*Or equivalently:
 
:$$I(X;Y) = -2.886 \,{\rm bit} + 1.443 \,{\rm bit}+ 2.164 \,{\rm bit}\hspace{0.15cm}\underline{= 0.721\,{\rm bit}}\hspace{0.05cm}.$$
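The conversion into&nbsp; "bit"&nbsp; is just a division by&nbsp; $\ln(2)$.&nbsp; A short check&nbsp; (again only a sketch;&nbsp; the nat values are taken from subtask&nbsp; '''(1)'''):

<pre>
import numpy as np

ln2 = np.log(2.0)         # 1 bit = ln(2) ≈ 0.693 nat
for name, nat in [("h(X)", -2.0), ("h(Y)", 1.0), ("h(XY)", -1.5), ("I(X;Y)", 0.5)]:
    print(f"{name:7s} = {nat / ln2:+.3f} bit")
# h(X)    = -2.885 bit   (the solution above divides by the rounded 0.693 and states -2.886)
# h(Y)    = +1.443 bit
# h(XY)   = -2.164 bit
# I(X;Y)  = +0.721 bit
</pre>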
  
  
  
'''(3)'''&nbsp; The mutual information can also be written in the form &nbsp;$I(X;Y) = h(Y)-h(Y \hspace{-0.05cm}\mid \hspace{-0.05cm} X) $&nbsp;:
 
:$$h(Y \hspace{-0.05cm}\mid \hspace{-0.05cm} X) = h(Y) - I(X;Y) = 1 \,{\rm nat} - 0.5 \,{\rm nat} \hspace{0.15cm}\underline{= 0.5\,{\rm nat}= 0.721\,{\rm bit}}\hspace{0.05cm}.$$
  
  
  
'''(4)'''&nbsp; Correspondingly, for the differential inference entropy we obtain:
 
:$$h(X \hspace{-0.05cm}\mid \hspace{-0.05cm} Y) = h(X) - I(X;Y) = -2 \,{\rm nat} - 0.5 \,{\rm nat} \hspace{0.15cm}\underline{= -2.5\,{\rm nat}= -3.607\,{\rm bit}}\hspace{0.05cm}.$$
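The two conditional differential entropies follow from the same bookkeeping.&nbsp; Continuing the sketch from subtask&nbsp; '''(1)'''&nbsp; (variable names chosen freely):

<pre>
import numpy as np

h_X, h_Y, I_XY = -2.0, 1.0, 0.5        # results of subtask (1), in nat

h_Y_given_X = h_Y - I_XY               # h(Y|X) = h(Y) - I(X;Y) =  0.5 nat
h_X_given_Y = h_X - I_XY               # h(X|Y) = h(X) - I(X;Y) = -2.5 nat

ln2 = np.log(2.0)
print(h_Y_given_X, h_Y_given_X / ln2)  #  0.5 nat ≈ +0.721 bit
print(h_X_given_Y, h_X_given_Y / ln2)  # -2.5 nat ≈ -3.607 bit
</pre>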
  
[[File: P_ID2898__Inf_Z_4_5d.png |right|frame|Summary of all results of this exercise]]
*All quantities calculated here are summarized in the graph.&nbsp;  
*Arrows pointing up indicate a positive contribution, arrows pointing down indicate a negative contribution.
  
  
  
'''(5)'''&nbsp; <u>Proposed solutions 1 to 3</u>&nbsp; are correct.
 
   
 
   
To clarify once again:
* The mutual information always satisfies&nbsp; $I(X;Y) \ge 0$.
* In the discrete case there are no negative entropies, but in the continuous case differential entropies can certainly be negative.
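This difference can also be illustrated numerically.&nbsp; The following sketch&nbsp; (assumptions:&nbsp; NumPy;&nbsp; sample size, seed and bin count are arbitrary)&nbsp; estimates the differential entropy of the narrow uniform density of&nbsp; $X$&nbsp; from samples; it comes out clearly negative, while the ordinary entropy of the binned&nbsp; (i.e. discrete)&nbsp; variable is non-negative:

<pre>
import numpy as np

rng = np.random.default_rng(0)
A = np.exp(-2.0)                                         # X is uniform on (0, A), A < 1

x = rng.uniform(0.0, A, size=200_000)
f_hat, edges = np.histogram(x, bins=50, density=True)    # histogram estimate of the PDF
widths = np.diff(edges)

h_diff = -np.sum(f_hat * np.log(f_hat) * widths)         # ≈ -∫ f(x) ln f(x) dx ≈ ln(A) = -2 nat
p = f_hat * widths                                       # bin probabilities (a discrete variable)
H_disc = -np.sum(p * np.log(p))                          # ≈ ln(50) ≈ 3.9 nat, never negative

print(h_diff, H_disc)
</pre>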
  
 
{{ML-Fuß}}
  
  
[[Category:Information Theory: Exercises|^4.2 AWGN and Value-Continuous Input^]]
