Exercise 4.5Z: Again Mutual Information


[Figure: Given joint PDF and graph of differential entropies]

The graph above shows the joint PDF $f_{XY}(x, y)$ to be considered in this exercise, which is identical to the "green" constellation in Exercise 4.5.

  • In this sketch, $f_{XY}(x, y)$ is enlarged by a factor of $3$ in the $y$-direction.
  • In the definition region highlighted in green, the joint PDF is constant and equal to $C = 1/F$, where $F$ denotes the area of the parallelogram.


In Exercise 4.5 the following differential entropies were calculated:

$$h(X) \ = \ {\rm log} \hspace{0.1cm} (\hspace{0.05cm}A\hspace{0.05cm})\hspace{0.05cm},$$
$$h(Y) = {\rm log} \hspace{0.1cm} (\hspace{0.05cm}B \cdot \sqrt{ {\rm e } } \hspace{0.05cm})\hspace{0.05cm},$$
$$h(XY) = {\rm log} \hspace{0.1cm} (\hspace{0.05cm}F \hspace{0.05cm}) = {\rm log} \hspace{0.1cm} (\hspace{0.05cm}A \cdot B \hspace{0.05cm})\hspace{0.05cm}.$$
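These closed-form results can be verified numerically. The following minimal Python sketch (not part of the original exercise) integrates $-f \cdot \ln f$ for a uniform density of width $A$ and a symmetric triangular density on $(-B, +B)$, the PDF shapes assumed for $X$ and $Y$ here; the example values of $A$ and $B$ are arbitrary:

# Minimal numerical check of the closed-form differential entropies quoted above
# (assumes X uniform over an interval of width A, Y symmetric-triangular on (-B, +B)).
import numpy as np

A, B = 0.5, 2.0          # arbitrary example values; any positive A, B will do

# h(X) = -∫ f·ln(f) dx for the uniform density f(x) = 1/A on (0, A)
x = np.linspace(0.0, A, 100_001)
fx = np.full_like(x, 1.0 / A)
h_X = -np.trapz(fx * np.log(fx), x)

# h(Y) = -∫ f·ln(f) dy for the triangular density f(y) = (1 - |y|/B)/B on (-B, +B)
y = np.linspace(-B + 1e-9, B - 1e-9, 200_001)
fy = (1.0 - np.abs(y) / B) / B
h_Y = -np.trapz(fy * np.log(fy), y)

print(h_X, np.log(A))          # both ≈ ln(A)
print(h_Y, np.log(B) + 0.5)    # both ≈ ln(B·√e) = ln(B) + 1/2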

In this exercise, the parameter values  $A = {\rm e}^{-2}$  and  $B = {\rm e}^{0.5}$  are now to be used.

According to the diagram above, the conditional differential entropies  $h(Y|X)$  and  $h(X|Y)$  are now also to be determined, and their relation to the mutual information  $I(X; Y)$  is to be stated.




Hints:

  • The exercise belongs to the chapter  AWGN channel capacity with continuous input.
  • If the results are to be given in "nat", this is achieved with "log"  ⇒  "ln".
  • If the results are to be given in "bit", this is achieved with "log"  ⇒  "log2".



Questions

(1)  State the following information-theoretic quantities in "nat":

$h(X) \ = \ $ ________ $\ \rm nat$
$h(Y) \ \hspace{0.03cm} = \ $ ________ $\ \rm nat$
$h(XY)\ \hspace{0.17cm} = \ $ ________ $\ \rm nat$
$I(X;Y)\ = \ $ ________ $\ \rm nat$

(2)  What are the same quantities with the pseudo-unit "bit"?

$h(X) \ = \ $ ________ $\ \rm bit$
$h(Y) \ \hspace{0.03cm} = \ $ ________ $\ \rm bit$
$h(XY)\ \hspace{0.17cm} = \ $ ________ $\ \rm bit$
$I(X;Y)\ = \ $ ________ $\ \rm bit$

(3)  Calculate the conditional differential entropy  $h(Y|X)$.

$h(Y|X) \ = \ $ ________ $\ \rm nat$
$h(Y|X) \ = \ $ ________ $\ \rm bit$

(4)  Calculate the conditional differential entropy  $h(X|Y)$.

$h(X|Y) \ = \ $ ________ $\ \rm nat$
$h(X|Y) \ = \ $ ________ $\ \rm bit$

(5)  Which of the following quantities are never negative?

  • Both $H(X)$ and $H(Y)$ in the discrete case.
  • The mutual information $I(X; Y)$ in the discrete case.
  • The mutual information $I(X; Y)$ in the continuous case.
  • Both $h(X)$ and $h(Y)$ in the continuous case.
  • Both $h(X|Y)$ and $h(Y|X)$ in the continuous case.
  • The joint entropy $h(XY)$ in the continuous case.


Solution

(1)  Since the results are required in  "nat",  it is convenient to use the natural logarithm:

  • The random variable  $X$  is uniformly distributed between  $0$  and  $1/{\rm e}^2={\rm e}^{-2}$:
$$h(X) = {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}{\rm e}^{-2}\hspace{0.05cm}) \hspace{0.15cm}\underline{= -2\,{\rm nat}}\hspace{0.05cm}. $$
  • The random variable  $Y$  is triangularly distributed between  $\pm{\rm e}^{0.5}$:
$$h(Y) = {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}\sqrt{ {\rm e} } \cdot \sqrt{ {\rm e} } ) = {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}{ { \rm e } } \hspace{0.05cm}) \hspace{0.15cm}\underline{= +1\,{\rm nat}}\hspace{0.05cm}.$$
  • The area of the parallelogram is given by
$$F = A \cdot B = {\rm e}^{-2} \cdot {\rm e}^{0.5} = {\rm e}^{-1.5}\hspace{0.05cm}.$$
  • Thus, the 2D-PDF in the area highlighted in green has constant height  $C = 1/F ={\rm e}^{1.5}$  and we obtain for the joint entropy:
$$h(XY) = {\rm ln} \hspace{0.1cm} (F) = {\rm ln} \hspace{0.1cm} (\hspace{0.05cm}{\rm e}^{-1.5}\hspace{0.05cm}) \hspace{0.15cm}\underline{= -1.5\,{\rm nat}}\hspace{0.05cm}.$$
  • From this we obtain for the mutual information:
$$I(X;Y) = h(X) + h(Y) - h(XY) = -2 \,{\rm nat} + 1 \,{\rm nat} - (-1.5 \,{\rm nat} ) \hspace{0.15cm}\underline{= 0.5\,{\rm nat}}\hspace{0.05cm}.$$
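The numbers of subtask (1) can be reproduced with a few lines of Python; this is only an illustrative sketch using the given parameters $A = {\rm e}^{-2}$ and $B = {\rm e}^{0.5}$:

# Closed-form evaluation of subtask (1) in "nat" (natural logarithm).
import math

A, B = math.exp(-2), math.exp(0.5)
F = A * B                                  # area of the parallelogram

h_X  = math.log(A)                         # = -2.0 nat
h_Y  = math.log(B * math.sqrt(math.e))     # = +1.0 nat
h_XY = math.log(F)                         # = -1.5 nat
I_XY = h_X + h_Y - h_XY                    # = +0.5 nat
print(h_X, h_Y, h_XY, I_XY)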


(2)  In general, the relation  $\log_2(x) = \ln(x)/\ln(2)$ holds.  Thus, using the results of subtask  (1), we obtain:

$$h(X) \ = \ \frac{-2\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= -2.886\,{\rm bit}}\hspace{0.05cm},$$
$$h(Y) \ = \ \frac{+1\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= +1.443\,{\rm bit}}\hspace{0.05cm},$$
$$h(XY) \ = \ \frac{-1.5\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= -2.164\,{\rm bit}}\hspace{0.05cm},$$
$$I(X;Y) \ = \ \frac{0.5\,{\rm nat}}{0.693\,{\rm nat/bit}}\hspace{0.35cm}\underline{= 0.721\,{\rm bit}}\hspace{0.05cm}.$$
  • Equivalently, directly from the "bit" values:
$$I(X;Y) = -2.886 \,{\rm bit} + 1.443 \,{\rm bit} + 2.164 \,{\rm bit} \hspace{0.15cm}\underline{= 0.721\,{\rm bit}}\hspace{0.05cm}.$$
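The nat-to-bit conversion can likewise be scripted; the following short sketch (not part of the original solution) simply divides each value by $\ln(2)$:

# Conversion of the "nat" results of subtask (1) to "bit": log2(x) = ln(x)/ln(2),
# i.e. divide each value by ln(2) ≈ 0.693 nat/bit.
import math

nat_values = {"h(X)": -2.0, "h(Y)": 1.0, "h(XY)": -1.5, "I(X;Y)": 0.5}
for name, value_nat in nat_values.items():
    print(f"{name} = {value_nat / math.log(2):+.3f} bit")
# Agrees with the values quoted above up to the rounding of ln(2) to 0.693.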


(3)  The mutual information can also be written in the form  $I(X;Y) = h(Y)-h(Y \hspace{-0.05cm}\mid \hspace{-0.05cm} X)$:

$$h(Y \hspace{-0.05cm}\mid \hspace{-0.05cm} X) = h(Y) - I(X;Y) = 1 \,{\rm nat} - 0.5 \,{\rm nat} \hspace{0.15cm}\underline{= 0.5\,{\rm nat}= 0.721\,{\rm bit}}\hspace{0.05cm}.$$
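As a cross-check (not part of the original solution), $h(Y|X)$ also follows from the chain rule $h(XY) = h(X) + h(Y|X)$; a short sketch using the "nat" values from subtask (1):

# Cross-check of subtask (3): h(Y|X) = h(XY) - h(X), values in nat from subtask (1).
import math

h_X, h_XY, I_XY = -2.0, -1.5, 0.5
h_Y_given_X = h_XY - h_X                        # = 0.5 nat, same as h(Y) - I(X;Y)
print(h_Y_given_X, h_Y_given_X / math.log(2))   # 0.5 nat ≈ 0.721 bit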


(4)  Correspondingly, using  $I(X;Y) = h(X)-h(X \hspace{-0.05cm}\mid \hspace{-0.05cm} Y)$,  the differential inference entropy follows as:

$$h(X \hspace{-0.05cm}\mid \hspace{-0.05cm} Y) = h(X) - I(X;Y) = -2 \,{\rm nat} - 0.5 \,{\rm nat} \hspace{0.15cm}\underline{= -2.5\,{\rm nat}= -3.607\,{\rm bit}}\hspace{0.05cm}.$$
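Both conditional entropies can be checked against the chain rules in a short sketch (again only an illustrative consistency check, all values in nat):

# Consistency check for subtasks (3) and (4):
# h(XY) = h(X) + h(Y|X) = h(Y) + h(X|Y) must hold.
h_X, h_Y, h_XY, I_XY = -2.0, 1.0, -1.5, 0.5

h_Y_given_X = h_Y - I_XY      # = 0.5 nat
h_X_given_Y = h_X - I_XY      # = -2.5 nat

assert abs(h_XY - (h_X + h_Y_given_X)) < 1e-12
assert abs(h_XY - (h_Y + h_X_given_Y)) < 1e-12
print(h_X_given_Y)            # -2.5 nat  (≈ -3.607 bit)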
[Figure: Summary of all results of this exercise]
  • All quantities calculated here are summarized in the graph. 
  • Arrows pointing up indicate a positive contribution, arrows pointing down indicate a negative contribution.


(5)  Proposed solutions 1 to 3 are correct.

Again, for clarification:

  • The mutual information always satisfies  $I(X;Y) \ge 0$.
  • In the discrete case no entropy can be negative, whereas in the continuous case differential entropies can indeed be negative.
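A small sketch illustrating this last point (the probability values are hypothetical example values, not taken from the exercise):

# Illustration for subtask (5): a discrete entropy H(X) is never negative,
# whereas a differential entropy h(X) can be.
import math

# Discrete source: H(X) = -sum p·ln(p) >= 0 for every probability mass function.
pmf = [0.7, 0.2, 0.1]
H_X = -sum(p * math.log(p) for p in pmf)
print(H_X)                     # ≈ 0.802 nat  (always >= 0)

# Continuous source: uniform PDF of width A < 1  =>  h(X) = ln(A) < 0.
A = math.exp(-2)
print(math.log(A))             # = -2 nat  (negative differential entropy)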