Exercise 4.5: Mutual Information from 2D-PDF

From LNTwww
{{quiz-Header|Buchseite=Information_Theory/AWGN_Channel_Capacity_for_Continuous_Input
}}
[[File:P_ID2886__Inf_A_4_5_neu.png|right|frame|Given joint PDF]]
Given here are three different two-dimensional regions for the joint PDF&nbsp; $f_{XY}(x, y)$,&nbsp; which in this exercise are referred to by their fill colors as the
* red joint PDF,
* blue joint PDF,
* green joint PDF,
  
respectively.&nbsp; Within each of the regions shown, let&nbsp; $f_{XY}(x, y) = C = \rm const.$

The mutual information between the continuous random variables&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; can be calculated, for example, by the following equation:
:$$I(X;Y) = h(X) + h(Y) - h(XY)\hspace{0.05cm}.$$

For the differential entropies used here, the following equations apply:
:$$h(X) = -\hspace{-0.7cm}  \int\limits_{x \hspace{0.05cm}\in \hspace{0.05cm}{\rm supp}(f_X)} \hspace{-0.55cm}  f_X(x) \cdot {\rm log} \hspace{0.1cm} \big[f_X(x)\big] \hspace{0.1cm}{\rm d}x
\hspace{0.05cm},$$
:$$h(Y) = -\hspace{-0.7cm}  \int\limits_{y \hspace{0.05cm}\in \hspace{0.05cm}{\rm supp}(f_Y)} \hspace{-0.55cm}  f_Y(y) \cdot {\rm log} \hspace{0.1cm} \big[f_Y(y)\big] \hspace{0.1cm}{\rm d}y
\hspace{0.05cm},$$
:$$h(XY) = \hspace{0.1cm}-\hspace{0.2cm} \int \hspace{-0.9cm} \int\limits_{\hspace{-0.5cm}(x, y) \hspace{0.1cm}\in \hspace{0.1cm}{\rm supp} (f_{XY}\hspace{-0.08cm})}  
  \hspace{-0.6cm} f_{XY}(x, y) \cdot {\rm log} \hspace{0.1cm} \big[ f_{XY}(x, y) \big]
  \hspace{0.15cm}{\rm d}x\hspace{0.15cm}{\rm d}y\hspace{0.05cm}.$$
For the two marginal probability density functions, the following holds:
:$$f_X(x) = \hspace{-0.5cm}  \int\limits_{\hspace{-0.2cm}y \hspace{0.1cm}\in \hspace{0.1cm}{\rm supp} (f_{Y}\hspace{-0.08cm})} \hspace{-0.4cm} f_{XY}(x, y)  
  \hspace{0.15cm}{\rm d}y\hspace{0.05cm},$$
:$$f_Y(y) = \hspace{-0.5cm}  \int\limits_{\hspace{-0.2cm}x \hspace{0.1cm}\in \hspace{0.1cm}{\rm supp} (f_{X}\hspace{-0.08cm})} \hspace{-0.4cm} f_{XY}(x, y)  
  \hspace{0.15cm}{\rm d}x\hspace{0.05cm}.$$
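For a piecewise constant joint PDF, all of these integrals reduce to sums over a fine grid.&nbsp; The following Python/NumPy sketch evaluates them numerically; the concrete support chosen here&nbsp; $($the square&nbsp; $0 \le x, y \le 2$&nbsp; with height&nbsp; $C = 1/4)$&nbsp; is merely an assumed example for illustration:

```python
import numpy as np

# Numerical evaluation of h(X), h(Y), h(XY) and I(X;Y) on a fine grid.
# Assumed example: f_XY(x, y) = C = 1/4 on the square 0 <= x, y <= 2.
d = 0.001                                  # grid spacing
x = np.arange(0, 2, d) + d / 2             # midpoints of the grid cells
y = np.arange(0, 2, d) + d / 2
f_xy = np.full((x.size, y.size), 1 / 4)    # constant joint PDF of height C

f_x = f_xy.sum(axis=1) * d                 # marginal PDFs: integrate over y resp. x
f_y = f_xy.sum(axis=0) * d

h_x = -np.sum(f_x * np.log2(f_x)) * d      # differential entropies in "bit"
h_y = -np.sum(f_y * np.log2(f_y)) * d
h_xy = -np.sum(f_xy * np.log2(f_xy)) * d * d

mi = h_x + h_y - h_xy                      # I(X;Y) = h(X) + h(Y) - h(XY)
print(h_x, h_y, h_xy, mi)                  # ≈ 1.0, 1.0, 2.0, 0.0
```

For this rectangular support the sketch reproduces the values&nbsp; $h(X) = h(Y) = 1$&nbsp; bit,&nbsp; $h(XY) = 2$&nbsp; bit and hence&nbsp; $I(X;Y) = 0$.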


Hints:
*The exercise belongs to the chapter&nbsp;  [[Information_Theory/AWGN_Channel_Capacity_for_Continuous_Input#Mutual_information_between_continuous_random_variables|Mutual information with continuous input]].
*Let the following differential entropies also be given:
:* If&nbsp; $X$&nbsp; is triangularly distributed between&nbsp; $x_{\rm min}$&nbsp; and&nbsp; $x_{\rm max}$,&nbsp; then:
::$$h(X) = {\rm log} \hspace{0.1cm} \big[\hspace{0.05cm}\sqrt{\rm e} \cdot (x_{\rm max} - x_{\rm min})/2\hspace{0.05cm}\big]\hspace{0.05cm}.$$
:* If&nbsp; $Y$&nbsp; is uniformly distributed between&nbsp; $y_{\rm min}$&nbsp; and&nbsp; $y_{\rm max}$,&nbsp; then:
::$$h(Y) = {\rm log} \hspace{0.1cm} \big[\hspace{0.05cm}y_{\rm max} - y_{\rm min}\hspace{0.05cm}\big]\hspace{0.05cm}.$$
*All results are to be given in&nbsp; "bit".&nbsp; This is achieved with&nbsp; $\log$ &nbsp;&#8658;&nbsp; $\log_2$.


===Questions===
  
 
<quiz display=simple>

{What is the mutual information of&nbsp; <u>the red joint PDF</u>?
|type="{}"}
$I(X; Y) \ = \ $ { 0. } $\ \rm bit$
  
{What is the mutual information of&nbsp; <u>the blue joint PDF</u>?
|type="{}"}
$I(X; Y) \ = \ $ { 0.721 3% } $\ \rm bit$
  
  
{What is the mutual information of&nbsp; <u>the green joint PDF</u>?
|type="{}"}
$I(X; Y) \ = \ $ { 0.721 3% } $\ \rm bit$
  
{What conditions must the random variables&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; satisfy simultaneously for &nbsp;$I(X;Y) = 1/2 \cdot \log (\rm e)$&nbsp; to hold in general?
|type="[]"}
+ The two-dimensional PDF &nbsp;$f_{XY}(x, y)$&nbsp; results in a parallelogram.
+ One of the random variables&nbsp; $(X$ &nbsp;or&nbsp; $Y)$&nbsp; is uniformly distributed.
+ The other random variable&nbsp; $(Y$&nbsp; or&nbsp; $X)$&nbsp; is triangularly distributed.
 
</quiz>

===Solution===
{{ML-Kopf}}
[[File:P_ID2887__Inf_A_4_5a.png|right|frame|"Red"&nbsp; probability density functions.<br>Note:&nbsp; the ordinate of&nbsp; $f_{Y}(y)$&nbsp; points to the left.]]
'''(1)'''&nbsp; For the rectangular two-dimensional PDF &nbsp;$f_{XY}(x, y)$&nbsp; there are no statistical dependences between&nbsp; $X$&nbsp; and&nbsp; $Y$ &nbsp; &#8658; &nbsp; $\underline{I(X;Y) = 0}$.

Formally, this result can be proved with the following equation:
:$$I(X;Y) = h(X) \hspace{-0.05cm}+\hspace{-0.05cm} h(Y) \hspace{-0.05cm}- \hspace{-0.05cm}h(XY)\hspace{0.02cm}.$$
*The red region of the two-dimensional PDF &nbsp;$f_{XY}(x, y)$&nbsp; has area&nbsp; $F = 4$.
*Since &nbsp;$f_{XY}(x, y)$&nbsp; is constant within this region and the volume under &nbsp;$f_{XY}(x, y)$&nbsp; must be equal to&nbsp; $1$,&nbsp; the height is&nbsp; $C = 1/F = 1/4$.
*From this follows for the differential joint entropy in&nbsp; "bit":
:$$h(XY) \  =  \  \hspace{0.1cm}-\hspace{0.2cm} \int \hspace{-0.9cm} \int\limits_{\hspace{-0.5cm}(x, y) \hspace{0.1cm}\in \hspace{0.1cm}{\rm supp} \hspace{0.03cm}(\hspace{-0.03cm}f_{XY}\hspace{-0.08cm})}  
  \hspace{-0.6cm} f_{XY}(x, y) \cdot {\rm log}_2 \hspace{0.1cm} \big[ f_{XY}(x, y) \big]
  \hspace{0.15cm}{\rm d}x\hspace{0.15cm}{\rm d}y$$
:$$\Rightarrow \hspace{0.3cm} h(XY) \  = \ \  {\rm log}_2 \hspace{0.1cm} (4) \cdot \hspace{0.02cm} \int \hspace{-0.9cm} \int\limits_{\hspace{-0.5cm}(x, y) \hspace{0.1cm}\in \hspace{0.1cm}{\rm supp} \hspace{0.03cm}(\hspace{-0.03cm}f_{XY}\hspace{-0.08cm})}  
  \hspace{-0.6cm} f_{XY}(x, y)  
  \hspace{0.15cm}{\rm d}x\hspace{0.15cm}{\rm d}y = 2 \,{\rm bit}\hspace{0.05cm}.$$
*Here it is taken into account that the double integral equals&nbsp; $1$.&nbsp; The pseudo-unit&nbsp; "bit"&nbsp; corresponds to the binary logarithm &nbsp;&#8658;&nbsp; "log<sub>2</sub>".


Furthermore:
*The marginal probability density functions &nbsp;$f_{X}(x)$&nbsp; and &nbsp;$f_{Y}(y)$&nbsp; are both rectangular &nbsp; &#8658; &nbsp; uniform distribution between&nbsp; $0$&nbsp; and&nbsp; $2$:
:$$h(X) = h(Y) = {\rm log}_2 \hspace{0.1cm} (2) = 1 \,{\rm bit}\hspace{0.05cm}.$$
[[File:P_ID2888__Inf_A_4_5b_neu.png|right|frame|"Blue"&nbsp; probability density functions]]
*Substituting these results into the above equation, we obtain:
:$$I(X;Y) = h(X) + h(Y) - h(XY) = 1 \,{\rm bit} + 1 \,{\rm bit} - 2 \,{\rm bit} = 0 \,{\rm (bit)}
\hspace{0.05cm}.$$

'''(2)'''&nbsp; For this parallelogram, too, we get&nbsp; $F = 4, \ C = 1/4$&nbsp; as well as&nbsp; $h(XY) = 2 \ \rm bit$.
*Here, as in subtask&nbsp; '''(1)''',&nbsp; the random variable&nbsp; $Y$&nbsp; is uniformly distributed between&nbsp; $0$&nbsp; and&nbsp; $2$ &nbsp; &rArr; &nbsp; $h(Y) = 1 \ \rm bit$.
*In contrast,&nbsp; $X$&nbsp; is triangularly distributed between&nbsp; $0$&nbsp; and&nbsp; $4$&nbsp; $($with maximum at&nbsp; $2)$.
*This gives the same differential entropy&nbsp; $h(X)$&nbsp; as for a symmetric triangular distribution in the range between&nbsp; $\pm 2$&nbsp; (see the exercise description):
:$$h(X) = {\rm log}_2 \hspace{0.1cm} \big[\hspace{0.05cm}2 \cdot \sqrt{\rm e} \hspace{0.05cm}\big ]
= 1.721 \,{\rm bit}$$
:$$\Rightarrow \hspace{0.3cm} I(X;Y) =  1.721 \,{\rm bit} + 1 \,{\rm bit} - 2 \,{\rm bit}\hspace{0.05cm}\underline{ = 0.721 \,{\rm (bit)}}
\hspace{0.05cm}.$$
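The triangular entropy value&nbsp; $h(X) = 1.721$&nbsp; bit used here can be checked by numerical integration.&nbsp; A small Python sketch (the midpoint rule and the grid spacing are our own choices, not part of the exercise):

```python
import numpy as np

# Numerical check of h(X) for X triangularly distributed between 0 and 4
# (maximum at 2; peak height 1/2 so that the area under the PDF equals 1).
d = 1e-5
x = np.arange(0, 4, d) + d / 2              # midpoints avoid f_X(x) = 0 exactly
f_x = np.where(x <= 2, x / 4, (4 - x) / 4)  # triangular PDF

h_x = -np.sum(f_x * np.log2(f_x)) * d       # differential entropy in "bit"
print(h_x, np.log2(2 * np.sqrt(np.e)))      # both ≈ 1.7213
```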
<br clear=all>
[[File:P_ID2889__Inf_A_4_5c_neu.png|right|frame|"Green"&nbsp; probability density functions]]
'''(3)'''&nbsp; For the green joint PDF, one obtains:
:$$F = A \cdot B \hspace{0.3cm}  \Rightarrow \hspace{0.3cm} C = \frac{1}{A \cdot B}
\hspace{0.05cm}\hspace{0.3cm}
\Rightarrow \hspace{0.3cm} h(XY)  =  {\rm log}_2 \hspace{0.1cm} (A \cdot B)  
\hspace{0.05cm}.$$
*The random variable&nbsp; $Y$&nbsp; is now uniformly distributed between&nbsp; $0$&nbsp; and&nbsp; $A$,&nbsp; and the random variable&nbsp; $X$&nbsp; is triangularly distributed between&nbsp; $0$&nbsp; and&nbsp; $2B$&nbsp; $($with maximum at&nbsp; $B)$:
:$$h(X)  \ =  \  {\rm log}_2 \hspace{0.1cm} (B \cdot \sqrt{\rm e})  
\hspace{0.05cm},\hspace{0.8cm}
h(Y)  \  =  \  {\rm log}_2 \hspace{0.1cm} (A)\hspace{0.05cm}.$$
*Thus, for the mutual information between&nbsp; $X$&nbsp; and&nbsp; $Y$:
:$$I(X;Y)  \  =      {\rm log}_2 \hspace{0.1cm} (B \cdot \sqrt{ {\rm e}}) + {\rm log}_2 \hspace{0.1cm} (A) - {\rm log}_2 \hspace{0.1cm} (A \cdot B)$$
:$$\Rightarrow \hspace{0.3cm} I(X;Y) =  \ {\rm log}_2 \hspace{0.1cm} \frac{B \cdot \sqrt{ {\rm e}} \cdot A}{A \cdot B} = {\rm log}_2 \hspace{0.1cm} (\sqrt{ {\rm e}})\hspace{0.15cm}\underline{= 0.721\,{\rm bit}}
\hspace{0.05cm}.$$
*$I(X;Y)$&nbsp; is thus independent of the PDF parameters&nbsp; $A$&nbsp; and&nbsp; $B$.
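This independence from&nbsp; $A$&nbsp; and&nbsp; $B$&nbsp; follows directly from the logarithm rules, and can also be confirmed numerically for a few (arbitrarily chosen) parameter pairs:

```python
import numpy as np

# I(X;Y) = log2(B*sqrt(e)) + log2(A) - log2(A*B) for several parameter pairs.
for A, B in [(1.0, 1.0), (2.0, 5.0), (0.5, 3.0)]:
    mi = np.log2(B * np.sqrt(np.e)) + np.log2(A) - np.log2(A * B)
    print(A, B, mi)                  # always 0.5 * log2(e) ≈ 0.7213
```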
[[File: P_ID2890__Inf_A_4_5d.png |right|frame|Other examples of two-dimensional PDFs&nbsp; $f_{XY}(x, y)$]]
'''(4)'''&nbsp; <u>All of the above conditions</u> are required.&nbsp; However, the second and third conditions are not satisfied for every parallelogram.

*The adjacent graph shows two such constellations; in each case the random variable&nbsp; $X$&nbsp; is uniformly distributed between&nbsp; $0$&nbsp; and&nbsp; $1$.
*In the upper graph, the two marked points lie at the same height &nbsp; &#8658; &nbsp; $f_{Y}(y)$&nbsp; is triangularly distributed &nbsp; &#8658; &nbsp; $I(X;Y) = 0.721 \ \rm bit$.
*The lower joint PDF has a different mutual information, since the two marked points do not lie at the same height &nbsp; &#8658; &nbsp; here the PDF&nbsp; $f_{Y}(y)$&nbsp; has a trapezoidal shape.
*Intuitively, one would guess&nbsp; $I(X;Y) < 0.721 \ \rm bit$,&nbsp; since the two-dimensional region is closer to a rectangle.&nbsp; If you feel like it, check this statement.
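The suggested check can be sketched numerically.&nbsp; The setup below is our own assumption, not part of the exercise:&nbsp; the joint PDF has height&nbsp; $1$&nbsp; over the sheared unit square&nbsp; $0 \le x \le 1$,&nbsp; $s \cdot x \le y \le s \cdot x + 1$,&nbsp; so that&nbsp; $h(X) = 0$&nbsp; and&nbsp; $h(XY) = \log_2(1) = 0$,&nbsp; and therefore&nbsp; $I(X;Y) = h(Y)$:

```python
import numpy as np

# Parallelogram of height 1 over 0 <= x <= 1, s*x <= y <= s*x + 1 (shear s).
# Then h(X) = 0 and h(XY) = log2(area) = log2(1) = 0, so I(X;Y) = h(Y).
def mutual_info(s, d=1e-4):
    y = np.arange(0, 1 + s, d) + d / 2
    # f_Y(y) = length of the x-interval for which (x, y) lies in the region:
    f_y = np.clip(np.minimum(1, y / s) - np.maximum(0, (y - 1) / s), 0, None)
    mask = f_y > 0
    return -np.sum(f_y[mask] * np.log2(f_y[mask])) * d

print(mutual_info(1.0))   # triangular f_Y:  ≈ 0.721 bit
print(mutual_info(0.5))   # trapezoidal f_Y: ≈ 0.361 bit  ⇒  indeed < 0.721 bit
```

For the assumed shear&nbsp; $s = 0.5$&nbsp; the trapezoidal case gives&nbsp; $I(X;Y) = 1/(4 \cdot \ln 2) \approx 0.361$&nbsp; bit,&nbsp; which confirms the guess&nbsp; $I(X;Y) < 0.721$&nbsp; bit for this constellation.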
{{ML-Fuß}}

[[Category:Information Theory: Exercises|^4.2 AWGN and Value-Continuous Input^]]

Latest revision as of 09:27, 11 October 2021