Exercise 4.2Z: Mixed Random Variables

[Figure: PDF of  $X$  (top)  and CDF of  $Y$  (bottom)]

One speaks of a  "mixed random variable"  if the random variable contains discrete components in addition to a continuous component.

  • For example, the random variable  $Y$  with  cumulative distribution function  $F_Y(y)$  as shown in the lower sketch has both a continuous and a discrete component.
  • The  probability density function  $f_Y(y)$  is obtained from  $F_Y(y)$  by differentiation.
  • The jump at  $y= 1$  in the CDF thus becomes a "Dirac" in the probability density function.
  • In subtask  (4)  the differential entropy  $h(Y)$  of  $Y$  is to be determined  (in bit),  assuming the following equation  (a numerical sketch follows this list):
$$h(Y) = \hspace{0.1cm} - \hspace{-0.45cm} \int\limits_{{\rm supp}\hspace{0.03cm}(\hspace{-0.03cm}f_Y)} \hspace{-0.35cm} f_Y(y) \cdot {\rm log}_2 \hspace{0.1cm} \big[ f_Y(y) \big] \hspace{0.1cm}{\rm d}y \hspace{0.05cm}.$$
  • In subtask  (2),  calculate the differential entropy  $h(X)$  of the random variable  $X$  whose PDF  $f_X(x)$  is sketched above.  After a suitable passage to the limit, the random variable  $X$  also becomes a mixed random variable.
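
The differential-entropy integral above can be checked numerically. The following is a minimal Python sketch (our own illustration; the function name is made up), assuming a purely continuous density – Dirac components require the limit treatment of subtasks (3) and (4). As a test case it uses the uniform PDF on  $[0, 2]$,  whose differential entropy is known to be  ${\rm log}_2(2) = 1$  bit:

```python
import numpy as np

# Approximate h = -∫ f(y)·log2 f(y) dy over supp(f) by a Riemann sum.
# Valid only for purely continuous densities (no Dirac components).
def diff_entropy_bits(f, a, b, n=100_000):
    y = np.linspace(a, b, n)
    fy = f(y)
    mask = fy > 0                      # restrict to the support supp(f)
    dy = (b - a) / (n - 1)
    return -np.sum(fy[mask] * np.log2(fy[mask])) * dy

# Uniform PDF on [0, 2]: f(y) = 0.5 there, 0 elsewhere
uniform_pdf = lambda y: np.where((y >= 0) & (y <= 2), 0.5, 0.0)
print(diff_entropy_bits(uniform_pdf, 0, 2))   # ≈ 1.0 bit
```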



Hints:

  • The exercise belongs to the chapter  Differential Entropy.
  • Further information on mixed random variables can be found in the chapter  Cumulative Distribution Function  of the book  "Theory of Stochastic Signals".



Questions

(1)   What is the PDF height  $A$  of  $f_X(x)$  around  $x = 1$?

  • $A = 0.5/\varepsilon$,
  • $A = 0.5/\varepsilon+0.25$,
  • $A = 1/\varepsilon$.

(2)   Calculate the differential entropy for different  $\varepsilon$–values.

  • $ε = 10^{-1}\text{:} \ \ h(X) \ = \ $ ____ $\ \rm bit$
  • $ε = 10^{-2}\text{:} \ \ h(X) \ = \ $ ____ $\ \rm bit$
  • $ε = 10^{-3}\text{:} \ \ h(X) \ = \ $ ____ $\ \rm bit$

(3)   What is the result of the limit  $ε \to 0$?

  • $f_X(x)$  now has a continuous and a discrete component.
  • The differential entropy  $h(X)$  is negative.
  • The magnitude  $|h(X)|$  is infinite.

(4)   Which statements are true for the random variable  $Y$?

  • The CDF value at the point  $y = 1$  is  $0.5$.
  • $Y$  contains a discrete and a continuous component.
  • The discrete component at  $Y = 1$  occurs with  $10\%$  probability.
  • The continuous component of  $Y$  is uniformly distributed.
  • The differential entropies of  $X$  and  $Y$  are equal.


Solution

(1)  Proposed solution 2  is correct, because the integral over the PDF must equal  $1$:

$$\int_{0}^{2} f_X(x) \hspace{0.1cm}{\rm d}x = 0.25 \cdot 2 + (A - 0.25) \cdot \varepsilon \stackrel{!}{=} 1 \hspace{0.3cm} \Rightarrow\hspace{0.3cm}(A - 0.25) \cdot \varepsilon \stackrel{!}{=} 0.5 \hspace{0.3cm}\Rightarrow\hspace{0.3cm} A = 0.5/\varepsilon +0.25\hspace{0.05cm}.$$
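
As a quick plausibility check (a sketch of our own;  $\varepsilon = 0.1$  is chosen arbitrarily), the normalization can be evaluated directly:

```python
# Check the normalization condition for one arbitrary eps:
eps = 0.1
A = 0.5 / eps + 0.25                 # proposed solution 2
area = 0.25 * 2 + (A - 0.25) * eps   # uniform part plus extra area around x = 1
print(A, area)                       # 5.25 1.0
```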


(2)  The differential entropy (in "bit") is given as follows:

$$h(X) = \hspace{0.1cm} \hspace{-0.45cm} \int\limits_{{\rm supp}(f_X)} \hspace{-0.35cm} f_X(x) \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{f_X(x)} \hspace{0.1cm}{\rm d}x \hspace{0.05cm}.$$

We now divide the integral into three partial integrals:

$$h(X) = \hspace{-0.25cm} \int\limits_{0}^{1-\varepsilon/2} \hspace{-0.15cm} 0.25 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.25} \hspace{0.1cm}{\rm d}x + \hspace{-0.25cm}\int\limits_{1+\varepsilon/2}^{2} \hspace{-0.15cm} 0.25 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.25} \hspace{0.1cm}{\rm d}x + \hspace{-0.25cm}\int\limits_{1-\varepsilon/2}^{1+\varepsilon/2} \hspace{-0.15cm} \big [0.5/\varepsilon + 0.25 \big ] \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{0.5/\varepsilon + 0.25} \hspace{0.1cm}{\rm d}x $$
$$ \Rightarrow \hspace{0.3cm} h(X) = 2 \cdot 0.25 \cdot 2 \cdot (2-\varepsilon) - (0.5 + 0.25 \cdot \varepsilon) \cdot {\rm log}_2 \hspace{0.1cm}(0.5/\varepsilon +0.25) \hspace{0.05cm}.$$

In particular, one obtains

  • for  $\varepsilon = 0.1$:
$$h(X) =1.9 - 0.525 \cdot {\rm log}_2 \hspace{0.1cm}(5.25) = 1.9 - 1.256 \hspace{0.15cm}\underline{= 0.644\,{\rm bit}} \hspace{0.05cm},$$
  • for  $\varepsilon = 0.01$:
$$h(X) =1.99 - 0.5025 \cdot {\rm log}_2 \hspace{0.1cm}(50.25)= 1.99 - 2.84 \hspace{0.15cm}\underline{= -0.850\,{\rm bit}} \hspace{0.05cm},$$
  • for  $\varepsilon = 0.001$:
$$h(X) =1.999 - 0.50025 \cdot {\rm log}_2 \hspace{0.1cm}(500.25) = 1.999 - 8.967 \hspace{0.15cm}\underline{= -6.968\,{\rm bit}} \hspace{0.05cm}.$$


(3)  All the proposed solutions are correct:

  • After the limit transition   $\varepsilon → 0$   we obtain for the differential entropy
$$h(X) = \lim\limits_{\varepsilon \hspace{0.05cm}\rightarrow \hspace{0.05cm} 0} \hspace{0.1cm}\big[(2-\varepsilon) - (0.5 + 0.25 \cdot \varepsilon) \cdot {\rm log}_2 \hspace{0.1cm}(0.5/\varepsilon +0.25)\big] = 2\,{\rm bit} - 0.5 \cdot \lim\limits_{\varepsilon \hspace{0.05cm}\rightarrow \hspace{0.05cm} 0}\hspace{0.1cm}{\rm log}_2 \hspace{0.1cm}(0.5/\varepsilon) = - \infty \hspace{0.05cm}.$$
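
The divergence can also be observed numerically. This minimal sketch (our own addition) evaluates the bracketed expression above for ever smaller  $\varepsilon$;  the values decrease without bound:

```python
import numpy as np

# Evaluate (2 - eps) - (0.5 + 0.25*eps) * log2(0.5/eps + 0.25)
# for shrinking eps; the sequence diverges toward -infinity.
for eps in [1e-1, 1e-2, 1e-4, 1e-6, 1e-8]:
    h = (2 - eps) - (0.5 + 0.25 * eps) * np.log2(0.5 / eps + 0.25)
    print(f"eps = {eps:.0e}:  h(X) = {h:8.3f} bit")
```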
[Figure: PDF and CDF of the mixed random variable  $X$]
  • The probability density function (PDF) in this case is given by:
$$f_X(x) = \left\{ \begin{array}{c} 0.25 + 0.5 \cdot \delta (x-1) \\ 0 \\ \end{array} \right. \begin{array}{*{20}c} {\rm for} \hspace{0.1cm} 0 \le x \le 2, \\ {\rm otherwise} \\ \end{array} \hspace{0.05cm}.$$

Consequently, it is a  "mixed"  random variable with

  • a continuous, uniformly distributed component in the range  $0 \le x \le 2$, and
  • a discrete component at  $x = 1$  with probability  $0.5$.


The graph shows the PDF  $f_X(x)$  on the left and the CDF  $F_X(x)$ on the right.
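
For intuition, such a mixed random variable is easy to simulate. A minimal sketch (our own illustration; sampler name and seed are arbitrary choices): with probability  $0.5$  the discrete value  $x = 1$  is drawn, otherwise a uniform value from  $[0, 2]$:

```python
import numpy as np
rng = np.random.default_rng(0)

# Sample the mixed random variable X: discrete mass 0.5 at x = 1,
# continuous uniform component on [0, 2] otherwise.
def sample_X(n):
    discrete = rng.random(n) < 0.5
    return np.where(discrete, 1.0, rng.uniform(0.0, 2.0, size=n))

x = sample_X(100_000)
print(np.mean(x == 1.0))   # ≈ 0.5: probability of the discrete component
```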
(4)  The correct solutions are 2, 3 and 5.  The lower graph shows the PDF and the CDF of the random variable  $Y$.  You can see:

[Figure: PDF and CDF of the mixed random variable  $Y$]
  • Like  $X$,  $Y$  contains a continuous and a discrete component.
  • The discrete part occurs with probability  ${\rm Pr}(Y = 1) = 0.1$.
  • Since  $F_Y(y)= {\rm Pr}(Y \le y)$  holds, the CDF value at the jump is given by the right-sided limit:
$$F_Y(y = 1) = 0.55.$$
  • The continuous component is not uniformly distributed;  rather, there is a triangular PDF.
  • The last proposed solution is also correct:   $h(Y) = h(X) = - \infty$.


Because:   For every random variable with a discrete component – no matter how small it is – the differential entropy equals minus infinity.
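
This can be made plausible by a short sketch of the general argument (our own addition, mirroring the limit in subtask (3)):  approximate a discrete component  $p \cdot \delta(y-y_0)$  by a rectangle of width  $\varepsilon$  and height  $p/\varepsilon$;  its contribution to  $h(Y)$  is then

$$- \int_{y_0 - \varepsilon/2}^{y_0 + \varepsilon/2} \frac{p}{\varepsilon} \cdot {\rm log}_2 \hspace{0.1cm} \frac{p}{\varepsilon} \hspace{0.1cm}{\rm d}y = - p \cdot {\rm log}_2 \hspace{0.1cm} \frac{p}{\varepsilon} \hspace{0.3cm}\rightarrow\hspace{0.3cm} -\infty \hspace{0.5cm} {\rm for} \hspace{0.2cm} \varepsilon \to 0 \hspace{0.05cm}.$$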