Exercise 4.4: Conventional Entropy and Differential Entropy

Latest revision as of 13:44, 17 November 2022

[Figure: Two uniform distributions]

We consider the two continuous random variables  $X$  and  $Y$  with probability density functions  $f_X(x)$  and  $f_Y(y)$.  For these random variables,

  • the conventional entropies  $H(X)$  and  $H(Y)$  cannot be specified,
  • but the differential entropies  $h(X)$  and  $h(Y)$  can.


We also consider two discrete random variables:

  • The variable  $Z_{X,\hspace{0.05cm}M}$  is obtained by (suitably) quantizing the random variable  $X$  with quantization level number  $M$
    ⇒   quantization interval width  ${\it \Delta} = 0.5/M$.
  • The variable  $Z_{Y,\hspace{0.05cm}M}$  is obtained by quantizing the random variable  $Y$  with quantization level number  $M$
    ⇒   quantization interval width  ${\it \Delta} = 2/M$.


The probability density functions  $\rm (PDF)$  of these discrete random variables are each composed of  $M$  Dirac delta functions, whose weights are given by the corresponding interval areas of the associated continuous PDF.

From this, the entropies  $H(Z_{X,\hspace{0.05cm}M})$  and  $H(Z_{Y,\hspace{0.05cm}M})$  can be determined in the conventional way according to the section  Probability mass function and entropy.
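The construction of the discrete PMF from interval areas can be sketched in a few lines (a minimal illustration under the exercise's assumptions; the helper name `quantized_pmf` is ours):

```python
# Quantize a uniform PDF on [a, b] into M levels: each interval of width
# delta = (b - a) / M carries the PDF area over that interval as its
# Dirac weight. For a uniform PDF f(x) = 1/(b - a) every area equals 1/M.
def quantized_pmf(a, b, M):
    delta = (b - a) / M
    return [delta / (b - a) for _ in range(M)]

print(quantized_pmf(0.0, 0.5, 4))   # Z_{X,M=4}: [0.25, 0.25, 0.25, 0.25]
print(quantized_pmf(-1.0, 1.0, 4))  # Z_{Y,M=4}: [0.25, 0.25, 0.25, 0.25]
```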

In the section  Entropy of continuous random variables after quantization,  an approximation was also given.  For example:

$$H(Z_{X, \hspace{0.05cm}M}) \approx -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(X)\hspace{0.05cm}. $$
  • In the course of the exercise it will become clear that for a rectangular PDF   ⇒   uniform distribution, this  "approximation"  gives exactly the same result as the direct calculation.
  • In the general case, however – e.g. for a triangular PDF as in  $\text{Example 2}$ – this equation is indeed only an approximation, which agrees with the actual entropy  $H(Z_{X,\hspace{0.05cm}M})$  only in the limiting case  ${\it \Delta} \to 0$.
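Both statements can be checked numerically. The sketch below is our own illustration: the triangular PDF  $f(y) = 1 - |y|$  on  $[-1, +1]$, whose differential entropy is  $h = 0.5/\ln 2 \approx 0.721$ bit, is an assumed stand-in for the triangular case; it compares the direct entropy of the quantized variable with the approximation  $-\log_2({\it \Delta}) + h$:

```python
import math

def entropy(pmf):
    """Conventional entropy H in bit of a discrete PMF."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def triangle_pmf(M):
    """Quantize the triangular PDF f(y) = 1 - |y| on [-1, +1] into M levels;
    each Dirac weight is the exact PDF area over its interval (via the CDF)."""
    def F(y):  # CDF of the triangular PDF
        return (y + 1) ** 2 / 2 if y <= 0 else 1 - (1 - y) ** 2 / 2
    delta = 2 / M
    return [F(-1 + (i + 1) * delta) - F(-1 + i * delta) for i in range(M)]

# Uniform PDF: approximation and direct calculation agree exactly (M = 4).
print(entropy([0.25] * 4))            # direct: 2.0 bit
print(-math.log2(0.5 / 4) + (-1.0))   # approximation: 2.0 bit

# Triangular PDF: only an approximation; the gap shrinks as delta -> 0.
h_tri = 0.5 / math.log(2)  # differential entropy in bit (analytical value)
for M in (4, 16, 64, 256):
    gap = entropy(triangle_pmf(M)) - (-math.log2(2 / M) + h_tri)
    print(M, round(gap, 5))
```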





Hints:

  • The exercise belongs to the chapter  Differential Entropy.
  • Useful hints for solving this exercise can be found in particular in the section  Entropy of continuous random variables after quantization.


Questions

1.  Calculate the differential entropy  $h(X)$.

$h(X) \ = \ $ __________ $\ \rm bit$

2.  Calculate the differential entropy  $h(Y)$.

$h(Y) \ = \ $ __________ $\ \rm bit$

3.  Calculate the entropy of the discrete random variable  $Z_{X,\hspace{0.05cm}M=4}$  using the direct method.

$H(Z_{X,\hspace{0.05cm}M=4})\ = \ $ __________ $\ \rm bit$

4.  Calculate the entropy of the discrete random variable  $Z_{Y,\hspace{0.05cm}M=4}$  using the given approximation.

$H(Z_{Y,\hspace{0.05cm}M=4})\ = \ $ __________ $\ \rm bit$

5.  Calculate the entropy of the discrete random variable  $Z_{Y,\hspace{0.05cm}M=8}$  using the given approximation.

$H(Z_{Y,\hspace{0.05cm}M=8})\ = \ $ __________ $\ \rm bit$

6.  Which of the following statements are true?

  • The entropy of a discrete random variable  $Z$  is always  $H(Z) \ge 0$.
  • The differential entropy of a continuous random variable  $X$  is always  $h(X) \ge 0$.


Solution

(1)  According to the corresponding theory section, with  $x_{\rm min} = 0$  and  $x_{\rm max} = 1/2$:

$$h(X) = {\rm log}_2 \hspace{0.1cm} (x_{\rm max} - x_{\rm min}) = {\rm log}_2 \hspace{0.1cm} (1/2) \hspace{0.15cm}\underline{= - 1\,{\rm bit}}\hspace{0.05cm}.$$


(2)  With  $y_{\rm min} = -1$  and  $y_{\rm max} = +1$, on the other hand, the differential entropy of the random variable  $Y$  is given by:

$$h(Y) = {\rm log}_2 \hspace{0.1cm} (y_{\rm max} - y_{\rm min}) = {\rm log}_2 \hspace{0.1cm} (2) \hspace{0.15cm}\underline{= + 1\,{\rm bit}}\hspace{0.05cm}. $$
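Both results can be reproduced by numerically evaluating the defining integral  $h = -\int f(x) \cdot \log_2 f(x) \, {\rm d}x$  (a minimal sketch for the uniform case only; the function and parameter names are ours):

```python
import math

def diff_entropy_uniform(x_min, x_max, n=100_000):
    """Riemann-sum evaluation of -f(x)*log2(f(x)) for a uniform PDF,
    which must reproduce h = log2(x_max - x_min)."""
    f = 1.0 / (x_max - x_min)        # constant PDF value
    dx = (x_max - x_min) / n
    return sum(-f * math.log2(f) * dx for _ in range(n))

print(round(diff_entropy_uniform(0.0, 0.5), 6))   # -1.0 bit -> h(X)
print(round(diff_entropy_uniform(-1.0, 1.0), 6))  #  1.0 bit -> h(Y)
```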


[Figure: Quantized random variable  $Z_{X, \ M = 4}$]

(3)  The adjacent graph illustrates the best possible quantization of random variable  $X$  with quantization level number  $M = 4$    ⇒   random variable  $Z_{X, \ M = 4}$:

  • The interval width here is equal to  ${\it \Delta} = 0.5/4 = 1/8$.
  • The possible values  (each at the center of its interval)  are  $z \in \{0.0625,\ 0.1875,\ 0.3125,\ 0.4375\}$.


Using the probability mass function  $P_Z(Z) = \big [1/4,\ \text{...} , \ 1/4 \big]$,  the direct entropy calculation gives:

$$H(Z_{X, \ M = 4}) = {\rm log}_2 \hspace{0.1cm} (4) \hspace{0.15cm}\underline{= 2\,{\rm bit}} \hspace{0.05cm}.$$
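This direct calculation is a one-liner to verify (our own trivial numeric check):

```python
import math

# Four equally likely quantization levels -> H = log2(4) = 2 bit.
pmf = [0.25, 0.25, 0.25, 0.25]
H = -sum(p * math.log2(p) for p in pmf)
print(H)  # 2.0
```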

With the approximation, considering the result of  (1), we obtain:

$$H(Z_{X,\hspace{0.05cm} M = 4}) \approx -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(X) = 3\,{\rm bit} +(- 1\,{\rm bit})\hspace{0.15cm}\underline{= 2\,{\rm bit}}\hspace{0.05cm}. $$

Note:  Only for the uniform distribution does the approximation give exactly the same result as the direct calculation, i.e. the actual entropy.

[Figure: Quantized random variable  $Z_{Y, \ M = 4}$]


(4)  The second graph shows the similarities to and differences from subtask  (3):

  • The quantization parameter is now  ${\it \Delta} = 2/4 = 1/2$.
  • The possible values are now  $z \in \{\pm 0.75,\ \pm 0.25\}$.
  • Thus, here the  "approximation"  (as well as the direct calculation)  gives the result:
$$H(Z_{Y,\hspace{0.05cm} M = 4}) \approx -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(Y) = 1\,{\rm bit} + 1\,{\rm bit}\hspace{0.15cm}\underline{= 2\,{\rm bit}}\hspace{0.05cm}.$$


[Figure: Quantized random variable  $Z_{Y, \ M = 8}$]

(5)  In contrast to subtask  (4),  now  ${\it \Delta} = 2/8 = 1/4$  holds.  From this follows for the  "approximation":

$$H(Z_{Y,\hspace{0.05cm} M = 8}) \approx -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(Y) = 2\,{\rm bit} + 1\,{\rm bit}\hspace{0.15cm}\underline{= 3\,{\rm bit}}\hspace{0.05cm}.$$

Again, one gets the same result as in the direct calculation.


(6)  Only statement 1  is correct:

  • The entropy  $H(Z)$  of a discrete random variable  $Z = \{z_1, \ \text{...} \ , z_M\}$  is never negative.
  • For example, the limiting case  $H(Z) = 0$  results for  ${\rm Pr}(Z = z_1) = 1$  and  ${\rm Pr}(Z = z_\mu) = 0$  for  $2 \le \mu \le M$.
  • In contrast, the differential entropy  $h(X)$  of a continuous random variable  $X$  can be
    • $h(X) < 0$  $($subtask 1$)$,
    • $h(X) > 0$  $($subtask 2$)$,  or even
    • $h(X) = 0$  $($for example, for  $x_{\rm min} = 0$  and  $x_{\rm max} = 1)$.