Exercise 3.7: Some Entropy Calculations

From LNTwww
Latest revision as of 10:15, 24 September 2021

Diagram: Entropies and information

We consider the two random variables  $XY$  and  $UV$  with the following two-dimensional probability mass functions:

$$P_{XY}(X, Y) = \begin{pmatrix} 0.18 & 0.16\\ 0.02 & 0.64 \end{pmatrix}\hspace{0.05cm} \hspace{0.05cm}$$
$$P_{UV}(U, V) \hspace{0.05cm}= \begin{pmatrix} 0.068 & 0.132\\ 0.272 & 0.528 \end{pmatrix}\hspace{0.05cm}$$

For the random variable  $XY$  the following are to be calculated in this exercise:

  • the joint entropy:
$$H(XY) = -{\rm E}\big [\log_2 P_{ XY }( X,Y) \big ],$$
  • the two individual entropies:
$$H(X) = -{\rm E}\big [\log_2 P_X( X)\big ],$$
$$H(Y) = -{\rm E}\big [\log_2 P_Y( Y)\big ].$$

From this, according to the scheme in the diagram above (shown for the random variable  $XY$), the following quantities can also be determined very easily:

  • the conditional entropies:
$$H(X \hspace{0.05cm}|\hspace{0.05cm} Y) = -{\rm E}\big [\log_2 P_{ X \hspace{0.05cm}|\hspace{0.05cm}Y }( X \hspace{0.05cm}|\hspace{0.05cm} Y)\big ],$$
$$H(Y \hspace{0.05cm}|\hspace{0.05cm} X) = -{\rm E}\big [\log_2 P_{ Y \hspace{0.05cm}|\hspace{0.05cm} X }( Y \hspace{0.05cm}|\hspace{0.05cm} X)\big ],$$
  • the mutual information between $X$ and $Y$:
$$I(X;Y) = {\rm E} \hspace{-0.08cm}\left [ \hspace{0.02cm}{\rm log}_2 \hspace{0.1cm} \frac{P_{XY}(X, Y)} {P_{X}(X) \cdot P_{Y}(Y) }\right ] \hspace{0.05cm}.$$

Finally, qualitative statements regarding the second random variable  $UV$  are to be verified.




Hints:

  • The exercise belongs to the chapter Different Entropy Measures of Two-Dimensional Random Variables.
  • In particular, reference is made to the pages Conditional Probability and Conditional Entropy as well as Mutual Information between Two Random Variables.


Questions

1.  Calculate the joint entropy.

$H(XY) \ = \ $ __________ $\ \rm bit$

2.  What are the entropies of the one-dimensional random variables  $X$  and  $Y$?

$H(X) \ = \ $ __________ $\ \rm bit$

$H(Y) \ = \ $ __________ $\ \rm bit$

3.  How large is the mutual information between the random variables  $X$  and  $Y$?

$I(X; Y) \ = \ $ __________ $\ \rm bit$

4.  Calculate the two conditional entropies.

$H(X|Y) \ = \ $ __________ $\ \rm bit$

$H(Y|X) \ = \ $ __________ $\ \rm bit$

5.  Which of the following statements are true for the two-dimensional random variable $UV$?

  • The one-dimensional random variables  $U$  and  $V$  are statistically independent.
  • The mutual information of  $U$  and  $V$  is  $I(U; V) = 0$.
  • The joint entropy satisfies  $H(UV) = H(XY)$.
  • The relations  $H(U|V) = H(U)$  and  $H(V|U) = H(V)$  hold.


Solution

(1)  From the given joint probability mass function we obtain

$$H(XY) = 0.18 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.18} + 0.16\cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.16}+ 0.02\cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.02}+ 0.64\cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.64} \hspace{0.15cm} \underline {= 1.393\,{\rm (bit)}} \hspace{0.05cm}.$$
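This sum can be checked numerically; a minimal sketch in Python (not part of the original exercise):

```python
import math

# Joint PMF P_XY from the exercise, flattened over all four cells
p_xy = [0.18, 0.16, 0.02, 0.64]

# H(XY) = -E[log2 P_XY(X,Y)] = sum over cells of p * log2(1/p)
h_xy = sum(p * math.log2(1 / p) for p in p_xy)
print(round(h_xy, 3))  # 1.393
```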


(2)  The one-dimensional probability mass functions are  $P_X(X) = \big [0.2, \ 0.8 \big ]$  and  $P_Y(Y) = \big [0.34, \ 0.66 \big ]$.  From these it follows:

$$H(X) = 0.2 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.2} + 0.8\cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.8}\hspace{0.15cm} \underline {= 0.722\,{\rm (bit)}} \hspace{0.05cm},$$
$$H(Y) =0.34 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.34} + 0.66\cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.66}\hspace{0.15cm} \underline {= 0.925\,{\rm (bit)}} \hspace{0.05cm}.$$
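The marginals and their entropies can likewise be verified numerically. A small sketch; note that in this exercise's convention the column sums of $P_{XY}$ yield $P_X$ and the row sums yield $P_Y$ (inferred from the values quoted above):

```python
import math

P_XY = [[0.18, 0.16],
        [0.02, 0.64]]

# Column sums give P_X = [0.2, 0.8], row sums give P_Y = [0.34, 0.66],
# matching the marginals quoted in the solution.
P_X = [P_XY[0][0] + P_XY[1][0], P_XY[0][1] + P_XY[1][1]]
P_Y = [sum(row) for row in P_XY]

def entropy(pmf):
    """Entropy in bit; zero-probability entries contribute nothing."""
    return sum(p * math.log2(1 / p) for p in pmf if p > 0)

print(round(entropy(P_X), 3), round(entropy(P_Y), 3))  # 0.722 0.925
```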


(3)  From the diagram on the exercise page one can see the relationship:

$$I(X;Y) = H(X) + H(Y) - H(XY) = 0.722\,{\rm (bit)} + 0.925\,{\rm (bit)}- 1.393\,{\rm (bit)}\hspace{0.15cm} \underline {= 0.254\,{\rm (bit)}} \hspace{0.05cm}.$$
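Both routes to $I(X;Y)$, the entropy identity used above and the direct expectation from the exercise statement, can be sketched as follows (the tiny discrepancy in the last digit comes from rounding the entropies to three decimals):

```python
import math

P_XY = [[0.18, 0.16],
        [0.02, 0.64]]   # rows indexed by y, columns by x, as in subtask (2)
P_X = [0.2, 0.8]
P_Y = [0.34, 0.66]

# Route 1: identity I(X;Y) = H(X) + H(Y) - H(XY), with the rounded values
i_identity = 0.722 + 0.925 - 1.393
print(round(i_identity, 3))  # 0.254

# Route 2: direct definition I(X;Y) = E[log2(P_XY / (P_X * P_Y))]
i_direct = sum(P_XY[y][x] * math.log2(P_XY[y][x] / (P_X[x] * P_Y[y]))
               for y in range(2) for x in range(2))
# i_direct is approximately 0.2535, i.e. 0.254 bit within the exercise's rounding
```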


(4)  Similarly, according to the diagram on the exercise page:

$$H(X \hspace{-0.1cm}\mid \hspace{-0.08cm} Y) = H(XY) - H(Y) = 1.393- 0.925\hspace{0.15cm} \underline {= 0.468\,{\rm (bit)}} \hspace{0.05cm},$$
$$H(Y \hspace{-0.1cm}\mid \hspace{-0.08cm} X) = H(XY) - H(X) = 1.393- 0.722\hspace{0.15cm} \underline {= 0.671\,{\rm (bit)}} \hspace{0.05cm}.$$
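The chain rule behind these two differences can be checked the same way:

```python
import math

def entropy(pmf):
    return sum(p * math.log2(1 / p) for p in pmf if p > 0)

# Entropies from subtasks (1) and (2)
H_XY = entropy([0.18, 0.16, 0.02, 0.64])
H_X = entropy([0.2, 0.8])
H_Y = entropy([0.34, 0.66])

# Chain rule: H(XY) = H(Y) + H(X|Y) = H(X) + H(Y|X)
print(round(H_XY - H_Y, 3))  # 0.468 -> H(X|Y)
print(round(H_XY - H_X, 3))  # 0.671 -> H(Y|X)
```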
Entropy values for the random variables $XY$ and $UV$
  • The left diagram summarises the results of subtasks  (1), ... ,  (4)  true to scale.
  • The joint entropy is highlighted in grey and the mutual information in yellow.
  • A red background refers to the random variable  $X$,  and a green one to  $Y$.  Hatched fields indicate a conditional entropy.


The right graph describes the same situation for the random variable  $UV$   ⇒   subtask  (5).


(5)  According to the diagram on the right, statements 1, 2 and 4 are correct:

  • One recognises the validity of  $P_{ UV } = P_U \cdot P_V$   ⇒   mutual information $I(U; V) = 0$  from the fact that the second row of the  $P_{ UV }$ matrix differs from the first row only by the constant factor  $4$.
  • This results in the same one-dimensional probability mass functions as for the random variable  $XY$   ⇒   $P_U(U) = \big [0.2, \ 0.8 \big ]$  and  $P_V(V) = \big [0.34, \ 0.66 \big ]$.
  • Therefore  $H(U) = H(X) = 0.722\ \rm bit$  and  $H(V) = H(Y) = 0.925 \ \rm bit$.
  • Here, however, the following now applies for the joint entropy:   $H(UV) = H(U) + H(V) ≠ H(XY)$.
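The independence argument (rows proportional, hence the joint PMF factors into the product of its marginals) can also be checked cell by cell; a small sketch:

```python
import math

P_UV = [[0.068, 0.132],
        [0.272, 0.528]]

P_U = [sum(row) for row in P_UV]                          # [0.2, 0.8]
P_V = [P_UV[0][0] + P_UV[1][0], P_UV[0][1] + P_UV[1][1]]  # [0.34, 0.66]

# U and V are independent iff P_UV(u,v) = P_U(u) * P_V(v) in every cell
independent = all(math.isclose(P_UV[u][v], P_U[u] * P_V[v])
                  for u in range(2) for v in range(2))
print(independent)  # True

def entropy(pmf):
    return sum(p * math.log2(1 / p) for p in pmf if p > 0)

# Independence makes the joint entropy additive: H(UV) = H(U) + H(V)
H_UV = entropy([p for row in P_UV for p in row])
print(round(H_UV, 3))  # 1.647  (larger than H(XY) = 1.393)
```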