Exercise 4.4: Conventional Entropy and Differential Entropy

{{quiz-Header|Buchseite=Information_Theory/Differential_Entropy
}}

[[File:P_ID2878__Inf_A_4_4.png|right|frame|Two uniform distributions]]
We consider the two continuous random variables&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; with probability density functions&nbsp; $f_X(x)$&nbsp; and&nbsp; $f_Y(y)$.&nbsp; For these random variables one can
* not specify the conventional entropies&nbsp; $H(X)$&nbsp; and&nbsp; $H(Y)$,
* but specify the differential entropies&nbsp; $h(X)$&nbsp; and&nbsp; $h(Y)$.
We also consider two discrete random variables:
*The variable&nbsp; $Z_{X,\hspace{0.05cm}M}$&nbsp; is obtained by (suitably) quantizing the random variable&nbsp; $X$&nbsp; with quantization level number&nbsp; $M$ <br>&#8658; &nbsp; quantization interval width&nbsp; ${\it \Delta} = 0.5/M$.
*The variable&nbsp; $Z_{Y,\hspace{0.05cm}M}$&nbsp; is obtained by quantizing the continuous random variable&nbsp; $Y$&nbsp; with quantization level number&nbsp; $M$ <br>&#8658; &nbsp; quantization interval width&nbsp; ${\it \Delta} = 2/M$.


The probability density functions&nbsp; (PDF)&nbsp; of these discrete random variables are each composed of&nbsp; $M$&nbsp; Dirac functions whose impulse weights are given by the interval areas of the associated continuous random variables.

From this, the entropies&nbsp; $H(Z_{X,\hspace{0.05cm}M})$&nbsp; and&nbsp; $H(Z_{Y,\hspace{0.05cm}M})$&nbsp; can be determined in the conventional way according to the section&nbsp; [[Information_Theory/Einige_Vorbemerkungen_zu_zweidimensionalen_Zufallsgrößen#Probability_mass_function_and_entropy|Probability mass function and entropy]].

In the section&nbsp; [[Information_Theory/Differentielle_Entropie#Entropy_of_continuous_random_variables_after_quantization|Entropy of continuous random variables after quantization]],&nbsp; an approximation was also given.&nbsp; For example:
:$$H(Z_{X, \hspace{0.05cm}M}) \approx -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(X)\hspace{0.05cm}. $$

*In the course of the exercise it will be shown that for a rectangular PDF &nbsp; &#8658; &nbsp; uniform distribution, this&nbsp; "approximation"&nbsp; gives exactly the same result as the direct calculation.
*But in the general case &ndash; for example in&nbsp; [[Information_Theory/Differentielle_Entropie#Entropy_of_continuous_random_variables_after_quantization|Example 2]]&nbsp; with a triangular PDF &ndash; this equation is in fact only an approximation, which agrees with the actual entropy&nbsp; $H(Z_{X,\hspace{0.05cm}M})$&nbsp; only in the limiting case &nbsp;${\it \Delta} \to 0$.
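These two statements can be checked numerically. The following sketch assumes, purely for illustration, a triangular PDF $f_X(x) = 2x$ on $[0, 1]$ with CDF $F_X(x) = x^2$ (the PDF of Example 2 may differ); it compares the exact entropy after quantization with the approximation and shows that the gap vanishes only for ${\it \Delta} \to 0$:

```python
import numpy as np

def quantized_entropy(cdf, a, b, M):
    # Exact entropy H(Z) after quantizing into M equal intervals:
    # each interval probability is the PDF area over that interval,
    # obtained here from the CDF.
    edges = np.linspace(a, b, M + 1)
    p = np.diff([cdf(e) for e in edges])
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def differential_entropy(pdf, a, b, n=100000):
    # h(X) = -integral of f(x) * log2 f(x) dx, midpoint rule.
    dx = (b - a) / n
    x = a + (np.arange(n) + 0.5) * dx
    f = pdf(x)
    f = f[f > 0]                      # f * log2(f) -> 0 where f -> 0
    return float(-np.sum(f * np.log2(f)) * dx)

# Hypothetical triangular PDF f(x) = 2x on [0, 1], CDF F(x) = x^2.
h = differential_entropy(lambda x: 2 * x, 0.0, 1.0)
for M in (4, 16, 64, 256):
    delta = 1.0 / M
    exact = quantized_entropy(lambda x: x ** 2, 0.0, 1.0, M)
    approx = -np.log2(delta) + h
    print(M, round(exact, 4), round(approx, 4))  # gap shrinks as Delta -> 0
```

For the rectangular PDFs of this exercise the two numbers coincide exactly for every $M$, since all interval probabilities are equal.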
Hints:
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Differentielle_Entropie|Differential Entropy]].
*Useful hints for solving this exercise can be found in particular in the section&nbsp; [[Information_Theory/Differentielle_Entropie#Entropy_of_continuous_random_variables_after_quantization|Entropy of continuous random variables after quantization]].

===Questions===
  
 
<quiz display=simple>
  
{Calculate the differential entropy&nbsp; $h(X)$.
|type="{}"}
$ h(X) \ = \ $ { -1.03--0.97 }  bit
  
{Calculate the differential entropy&nbsp; $h(Y)$.
|type="{}"}
$ h(Y) \ = \ $ { 1 3% }  bit
  
{Calculate the entropy of the discrete random variable&nbsp; $Z_{X,\hspace{0.05cm}M=4}$&nbsp; using the direct method.
|type="{}"}
$H(Z_{X,\hspace{0.05cm}M=4})\ = \ $ { 2 3% }  bit
 
  
{Calculate the entropy of the discrete random variable&nbsp; $Z_{X,\hspace{0.05cm}M=4}$&nbsp; using the given approximation.
|type="{}"}
$H(Z_{X,\hspace{0.05cm}M=4})\ = \ $ { 2 3% }  bit
  
{Calculate the entropy of the discrete random variable&nbsp; $Z_{Y,\hspace{0.05cm}M=8}$&nbsp; with the given approximation.
|type="{}"}
$H(Z_{Y,\hspace{0.05cm}M=8})\ = \ $ { 3 3% }  bit
  
{Which of the following statements are true?
|type="[]"}
+ The entropy of a discrete random variable&nbsp; $Z$&nbsp; is always&nbsp; $H(Z) \ge 0$.
- The differential entropy of a continuous random variable&nbsp; $X$&nbsp; is always&nbsp; $h(X) \ge 0$.
  
  
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; According to the corresponding theory section, with &nbsp;$x_{\rm min} = 0$&nbsp; and &nbsp;$x_{\rm max} = 1/2$:
:$$h(X) = {\rm log}_2 \hspace{0.1cm} (x_{\rm max} - x_{\rm min}) = {\rm log}_2 \hspace{0.1cm} (1/2) \hspace{0.15cm}\underline{= -1\,{\rm bit}}\hspace{0.05cm}.$$

  
'''(2)'''&nbsp; On the other hand, with &nbsp;$y_{\rm min} = -1$&nbsp; and &nbsp;$y_{\rm max} = +1$,&nbsp; the differential entropy of the random variable&nbsp; $Y$&nbsp; is given by:
:$$h(Y) = {\rm log}_2 \hspace{0.1cm} (y_{\rm max} - y_{\rm min}) = {\rm log}_2 \hspace{0.1cm} (2) \hspace{0.15cm}\underline{= + 1\,{\rm bit}}\hspace{0.05cm}. $$
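Both results follow from&nbsp; $h(X) = {\rm log}_2(x_{\rm max} - x_{\rm min})$&nbsp; for a uniform PDF. A minimal numerical sketch of the defining integral, which collapses to this closed form because the integrand is constant:

```python
import numpy as np

# For a uniform PDF f = 1/(b-a) on [a, b] the defining integral
# h(X) = -integral f * log2(f) dx reduces to -(b-a) * f * log2(f) = log2(b-a).
def h_uniform(a, b):
    f = 1.0 / (b - a)
    return -(b - a) * f * np.log2(f)

print(h_uniform(0.0, 0.5))    # subtask (1): -1.0 bit
print(h_uniform(-1.0, 1.0))   # subtask (2): +1.0 bit
```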
  
[[File:P_ID2879__Inf_A_4_4c.png|right|frame|Quantized random variable&nbsp; $Z_{X,\hspace{0.05cm}M=4}$]]
'''(3)'''&nbsp; The adjacent graph illustrates the best possible quantization of the random variable&nbsp; $X$&nbsp; with quantization level number&nbsp; $M=4$ &nbsp; &#8658; &nbsp; random variable&nbsp; $Z_{X,\hspace{0.05cm}M=4}$:
*The interval width here is equal to &nbsp;${\it \Delta} = 0.5/4 = 1/8$.
*The possible values&nbsp; (each at the center of its interval)&nbsp; are &nbsp;$z \in \{0.0625,\ 0.1875,\ 0.3125,\ 0.4375\}$.


With the probability mass function&nbsp; $P_Z(Z) = \big [1/4,\ \text{...} , \ 1/4 \big]$,&nbsp; the <u>direct entropy calculation</u> gives:
:$$H(Z_{X,\hspace{0.05cm} M = 4}) = {\rm log}_2 \hspace{0.1cm} (4) \hspace{0.15cm}\underline{= 2\,{\rm bit}}\hspace{0.05cm}.$$

With the <u>approximation</u>, considering the result of&nbsp; '''(1)''', we obtain:
:$$H(Z_{X,\hspace{0.05cm} M = 4}) \approx  -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(X) = 3\,{\rm bit} +(- 1\,{\rm bit})\hspace{0.15cm}\underline{= 2\,{\rm bit}}\hspace{0.05cm}. $$

<i>Note:</i>&nbsp; Only in the case of the uniform distribution does the approximation give exactly the same result as the direct calculation, i.e. the actual entropy.
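Both routes of subtask&nbsp; '''(3)'''&nbsp; can be reproduced in a few lines (a sketch using the values derived above):

```python
import numpy as np

# X is uniform on [0, 0.5]; quantizing with M = 4 gives Delta = 1/8 and
# four equally probable intervals, p = 1/4 each.
M, span = 4, 0.5
delta = span / M                                    # 0.125
p = np.full(M, 1.0 / M)                             # PMF [1/4, 1/4, 1/4, 1/4]
H_direct = float(-np.sum(p * np.log2(p)))           # log2(4) = 2 bit
H_approx = float(-np.log2(delta) + np.log2(span))   # 3 bit + (-1 bit) = 2 bit
print(H_direct, H_approx)   # 2.0 2.0
```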

[[File:P_ID2880__Inf_A_4_4d.png|right|frame|Quantized random variable&nbsp; $Z_{Y,\hspace{0.05cm}M=4}$]]
<br>
'''(4)'''&nbsp; From the second graph, one can see the similarities to and differences from subtask&nbsp; '''(3)''':
* The quantization parameter is now &nbsp;${\it \Delta} = 2/4 = 1/2$.
* The possible values are now &nbsp;$z \in \{\pm 0.75,\ \pm 0.25\}$.
* Thus, here the&nbsp; "approximation"&nbsp; (as well as the direct calculation)&nbsp; gives the result:
:$$H(Z_{Y,\hspace{0.05cm} M = 4})  \approx    -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(Y) = 1\,{\rm bit} + 1\,{\rm bit}\hspace{0.15cm}\underline{= 2\,{\rm bit}}\hspace{0.05cm}.$$

[[File:P_ID2881__Inf_A_4_4e.png|right|frame|Quantized random variable&nbsp; $Z_{Y,\hspace{0.05cm}M=8}$]]
'''(5)'''&nbsp; In contrast to subtask&nbsp; '''(4)''',&nbsp; now &nbsp;${\it \Delta} = 1/4$&nbsp; holds.&nbsp; From this follows for the&nbsp; "approximation":
:$$H(Z_{Y,\hspace{0.05cm} M = 8})  \approx    -{\rm log}_2 \hspace{0.1cm} ({\it \Delta}) + h(Y) = 2\,{\rm bit} + 1\,{\rm bit}\hspace{0.15cm}\underline{= 3\,{\rm bit}}\hspace{0.05cm}.$$
Again, one gets the same result as with the direct calculation.
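Subtasks&nbsp; '''(4)'''&nbsp; and&nbsp; '''(5)'''&nbsp; evaluated with the same approximation (a minimal sketch):

```python
import numpy as np

# Y is uniform on [-1, +1]  ->  h(Y) = log2(2) = 1 bit and Delta = 2/M,
# so H(Z_{Y,M}) ~ -log2(Delta) + h(Y).
h_Y = np.log2(2.0)
for M in (4, 8):
    delta = 2.0 / M
    H = -np.log2(delta) + h_Y
    print(M, H)   # M=4 -> 2.0 bit,  M=8 -> 3.0 bit
```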
  
 
  
'''(6)'''&nbsp; Only <u>statement 1</u>&nbsp; is correct:
* The entropy&nbsp; $H(Z)$&nbsp; of a discrete random variable&nbsp; $Z = \{z_1, \ \text{...} \ , z_M\}$&nbsp; is never negative.
*For example, the limiting case&nbsp; $H(Z) = 0$&nbsp; results for &nbsp;${\rm Pr}(Z = z_1) = 1$&nbsp; and &nbsp;${\rm Pr}(Z = z_\mu) = 0$&nbsp; for &nbsp;$2 \le \mu \le M$.
* In contrast, the differential entropy&nbsp; $h(X)$&nbsp; of a continuous random variable&nbsp; $X$&nbsp; can be:
** $h(X) < 0$&nbsp; (subtask 1),
** $h(X) > 0$&nbsp; (subtask 2), or even
** $h(X) = 0$&nbsp; (for example for &nbsp;$x_{\rm min} = 0$&nbsp; and &nbsp;$x_{\rm max} = 1$).
  
 
{{ML-Fuß}}
[[Category:Information Theory: Exercises|^4.1  Differential Entropy^]]

Latest revision as of 14:44, 17 November 2022
