Exercise 4.2Z: Mixed Random Variables

From LNTwww
{{quiz-Header|Buchseite=Information_Theory/Differential_Entropy
}}
[[File:P_ID2868__Inf_Z_4_2_neu.png|right|frame|PDF of&nbsp; $X$&nbsp; (top),&nbsp; and <br>CDF of&nbsp; $Y$&nbsp; (bottom)]]
One speaks of a&nbsp; "mixed random variable"&nbsp; if the random variable contains discrete components in addition to a continuous component.
  
*For example, the random variable&nbsp; $Y$&nbsp; with&nbsp; [[Theory_of_Stochastic_Signals/Cumulative_Distribution_Function_(CDF)|cumulative distribution function]]&nbsp; $F_Y(y)$&nbsp;  as shown in the sketch below has both a continuous and a discrete component.
*The&nbsp; [[Theory_of_Stochastic_Signals/Wahrscheinlichkeitsdichtefunktion|probability density function]]&nbsp; $f_Y(y)$&nbsp; is obtained from&nbsp; $F_Y(y)$&nbsp; by differentiation.
*The jump at&nbsp; $y= 1$&nbsp; in the CDF thus becomes a&nbsp; "Dirac delta"&nbsp; in the probability density function.
  
*In subtask&nbsp; '''(4)'''&nbsp; the differential entropy&nbsp; $h(Y)$&nbsp; of&nbsp; $Y$&nbsp;  is to be determined (in bit), assuming the following equation:
 
:$$h(Y) = 
\hspace{0.1cm} - \hspace{-0.45cm} \int\limits_{{\rm supp}\hspace{0.03cm}(\hspace{-0.03cm}f_Y)} \hspace{-0.35cm}  f_Y(y) \cdot {\rm log}_2 \hspace{0.1cm} \big[ f_Y(y) \big] \hspace{0.1cm}{\rm d}y 
\hspace{0.05cm}.$$
  
*In subtask&nbsp; '''(2)''',&nbsp; calculate the differential entropy&nbsp; $h(X)$&nbsp; of the random variable&nbsp; $X$&nbsp; whose PDF&nbsp; $f_X(x)$&nbsp; is sketched above.&nbsp; With a suitable limit process, the random variable&nbsp; $X$&nbsp; also becomes a mixed random variable.
  
  
Hints:
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Differentielle_Entropie|Differential Entropy]].
*Further information on mixed random variables can be found in the chapter&nbsp; [[Theory_of_Stochastic_Signals/Cumulative_Distribution_Function_(CDF)|Cumulative Distribution Function]]&nbsp; of the book&nbsp; "Theory of Stochastic Signals".


===Questions===
  
 
<quiz display=simple>
{What is the PDF height&nbsp; $A$&nbsp; of &nbsp;$f_X(x)$&nbsp; around &nbsp;$x = 1$?
 
|type="[]"}
 
- $A = 0.5/\varepsilon$,
+ $A = 0.5/\varepsilon+0.25$,
- $A = 1/\varepsilon$.
  
{Calculate the differential entropy for different &nbsp;$\varepsilon$&ndash;values.
 
|type="{}"}
$ε = 10^{-1}\text{:} \ \    h(X) \ = \ $ { 0.644 3% } $\ \rm bit$
$ε = 10^{-2}\text{:} \ \    h(X) \ = \ $ { -0.87--0.83 } $\ \rm bit$
$ε = 10^{-3}\text{:} \ \    h(X) \ = \ $ { -7.2--6.8 } $\ \rm bit$
  
{What is the result of the limit &nbsp;$ε \to 0$?
 
|type="[]"}
+ $f_X(x)$&nbsp; now has a continuous and a discrete component.
+ The differential entropy &nbsp;$h(X)$&nbsp; is negative.
+ The magnitude &nbsp;$|h(X)|$&nbsp; is infinite.
  
  
{Which statements are true for the random variable&nbsp; $Y$?
 
|type="[]"}
- The CDF value at the point &nbsp;$y = 1$&nbsp; is&nbsp; $0.5$.
+ $Y$&nbsp; contains a discrete and a continuous component.
+ The discrete component at&nbsp; $Y = 1$&nbsp; occurs with &nbsp;$10\%$&nbsp; probability.
- The continuous component of&nbsp; $Y$&nbsp; is uniformly distributed.
+ The differential entropies of&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; are equal.
  
  
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; <u>Proposed solution 2</u>&nbsp; is correct, since the integral over the PDF must yield&nbsp; $1$:
:$$\int_{0}^{2} f_X(x) \hspace{0.1cm}{\rm d}x =
0.25 \cdot 2 + (A - 0.25) \cdot \varepsilon \stackrel{!}{=} 1 \hspace{0.3cm}
\Rightarrow\hspace{0.3cm}(A - 0.25) \cdot \varepsilon \stackrel{!}{=} 0.5
\hspace{0.3cm}\Rightarrow\hspace{0.3cm} A = 0.5/\varepsilon +0.25\hspace{0.05cm}.$$
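The normalization can be verified with a short check. A sketch (not part of the original solution), assuming the rectangle approximation of the Dirac used in this exercise (width&nbsp; $\varepsilon$,&nbsp; height&nbsp; $A$&nbsp; around&nbsp; $x = 1$,&nbsp; baseline&nbsp; $0.25$&nbsp; on&nbsp; $[0, 2]$):

```python
# Check the normalization from (1): baseline 0.25 on [0, 2], with the
# middle piece of width eps raised to A = 0.5/eps + 0.25 (sketch only).
def pdf_area(eps):
    A = 0.5 / eps + 0.25
    return 0.25 * (2 - eps) + A * eps

for eps in (0.1, 0.01, 0.001):
    print(eps, pdf_area(eps))   # area stays 1 for every eps
```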
 
  
'''(2)'''&nbsp; The differential entropy (in "bit") is given as follows:
:$$h(X) = 
\hspace{0.1cm}  \hspace{-0.45cm} \int\limits_{{\rm supp}(f_X)} \hspace{-0.35cm}  f_X(x) \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{f_X(x)} \hspace{0.1cm}{\rm d}x 
\hspace{0.05cm}.$$
We now divide the integral into three partial integrals:
:$$h(X) = 
\hspace{-0.25cm} \int\limits_{0}^{1-\varepsilon/2} \hspace{-0.15cm}  0.25 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.25} \hspace{0.1cm}{\rm d}x +
\hspace{-0.25cm}\int\limits_{1+\varepsilon/2}^{2} \hspace{-0.15cm}  0.25 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.25} \hspace{0.1cm}{\rm d}x
 +  \hspace{-0.25cm}\int\limits_{1-\varepsilon/2}^{1+\varepsilon/2} \hspace{-0.15cm}  \big [0.5/\varepsilon + 0.25 \big ] \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.5/\varepsilon + 0.25} \hspace{0.1cm}{\rm d}x $$ 
:$$   \Rightarrow \hspace{0.3cm} h(X) = 2 \cdot 0.25 \cdot 2 \cdot (2-\varepsilon) - (0.5 + 0.25 \cdot \varepsilon) \cdot {\rm log}_2 \hspace{0.1cm}(0.5/\varepsilon +0.25)
\hspace{0.05cm}.$$
In particular, one obtains
* for&nbsp; $\varepsilon = 0.1$:
:$$h(X) =1.9 - 0.525 \cdot {\rm log}_2 \hspace{0.1cm}(5.25) = 1.9 - 1.256
\hspace{0.15cm}\underline{= 0.644\,{\rm bit}}
\hspace{0.05cm},$$
* for&nbsp; $\varepsilon = 0.01$:
:$$h(X) =1.99 - 0.5025 \cdot {\rm log}_2 \hspace{0.1cm}(50.25)= 1.99 - 2.84 
\hspace{0.15cm}\underline{= -0.850\,{\rm bit}}
\hspace{0.05cm},$$
* for&nbsp; $\varepsilon = 0.001$:
:$$h(X) =1.999 - 0.50025 \cdot {\rm log}_2 \hspace{0.1cm}(500.25) = 1.999 - 8.967
\hspace{0.15cm}\underline{= -6.968\,{\rm bit}}
\hspace{0.05cm}.$$
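The closed-form expression for&nbsp; $h(X)$&nbsp; can also be evaluated directly. A Python sketch (an addition, not part of the original solution) for the first two&nbsp; $\varepsilon$&nbsp; values:

```python
import math

# Evaluate h(X) = (2 - eps) - (0.5 + 0.25*eps) * log2(0.5/eps + 0.25)
# from the solution above, in bit (sketch only).
def h_X(eps):
    return (2 - eps) - (0.5 + 0.25 * eps) * math.log2(0.5 / eps + 0.25)

print(round(h_X(0.1), 3))   # approx. 0.644 bit
print(round(h_X(0.01), 2))  # approx. -0.85 bit
```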
'''(3)'''&nbsp; <u>All proposed solutions</u>&nbsp; are correct:
*After the limit process&nbsp; $\varepsilon \to 0$&nbsp; we obtain for the differential entropy:
:$$h(X) = \lim\limits_{\varepsilon \hspace{0.05cm}\rightarrow \hspace{0.05cm} 0} \hspace{0.1cm}\big[(2-\varepsilon) - (0.5 + 0.25 \cdot \varepsilon) \cdot {\rm log}_2 \hspace{0.1cm}(0.5/\varepsilon +0.25)\big] 
 = 2\,{\rm bit} - 0.5 \cdot \lim\limits_{\varepsilon \hspace{0.05cm}\rightarrow \hspace{0.05cm} 0}\hspace{0.1cm}{\rm log}_2 \hspace{0.1cm}(0.5/\varepsilon)
\hspace{0.3cm}\Rightarrow\hspace{0.3cm} h(X) = - \infty
\hspace{0.05cm}.$$
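The divergence can also be illustrated numerically. A sketch (not part of the original solution) evaluating the limit expression&nbsp; $2 - 0.5 \cdot {\rm log}_2(0.5/\varepsilon)$&nbsp; from above for shrinking&nbsp; $\varepsilon$:

```python
import math

# h(X) decreases without bound as eps -> 0: evaluate the limit
# expression from the solution for a few shrinking eps values.
vals = [2 - 0.5 * math.log2(0.5 / eps) for eps in (1e-2, 1e-4, 1e-6)]
print(vals)   # strictly decreasing, with no lower bound
```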
[[File:P_ID2871__Inf_Z_4_2c_neu.png|right|frame|PDF and CDF of the mixed random variable&nbsp; $X$]]
*The probability density function in this case is:
:$$f_X(x) = \left\{ \begin{array}{c} 0.25 + 0.5 \cdot \delta (x-1) \\  0 \\  \end{array} \right. \begin{array}{*{20}c}  {\rm for} \hspace{0.1cm} 0 \le x \le 2, \\    {\rm else} \\ \end{array}
\hspace{0.05cm}.$$
Consequently, it is a&nbsp; "mixed"&nbsp; random variable with
* a continuous, uniformly distributed component in the range&nbsp; $0 \le x \le 2$,&nbsp; and
* a discrete component at&nbsp; $x = 1$&nbsp; with probability&nbsp; $0.5$.
The graph shows the PDF&nbsp; $f_X(x)$&nbsp; on the left and the CDF&nbsp; $F_X(x)$&nbsp; on the right.
<br clear=all>
'''(4)'''&nbsp; <u>The correct solutions are 2, 3 and 5</u>.&nbsp; The graph below shows the PDF and the CDF of the random variable&nbsp; $Y$.&nbsp; One can see:
[[File:P_ID2872__Inf_Z_4_2d_neu.png|right|frame|PDF and CDF of the mixed random variable&nbsp; $Y$]]
* Like&nbsp; $X$,&nbsp; $Y$&nbsp; contains both a continuous and a discrete component.
* The discrete component occurs with probability&nbsp; ${\rm Pr}(Y = 1) = 0.1$.
* Since&nbsp; $F_Y(y)= {\rm Pr}(Y \le y)$&nbsp; holds, the right-sided limit applies:
:$$F_Y(y = 1) = 0.55.$$
* The continuous component is not uniformly distributed;&nbsp; rather, the PDF is triangular.
* The last proposition is also correct: &nbsp; $h(Y) = h(X) = - \infty$.
<br clear=all>
This is because: &nbsp; '''For every random variable with a discrete component, however small it may be, the differential entropy equals minus infinity.'''
  
 
{{ML-Fuß}}
  
  
[[Category:Information Theory: Exercises|^4.1  Differential Entropy^]]

Latest revision as of 14:57, 1 October 2021
