Exercise 4.2: Triangular PDF
{{quiz-Header|Buchseite=Information_Theory/Differential_Entropy
}}

[[File:P_ID2865__Inf_A_4_2.png|right|frame|Two triangular PDFs]]
Two probability density functions  $\rm (PDF)$  with triangular shapes are considered.
* The random variable&nbsp; $X$&nbsp; is limited to the range from&nbsp; $0$&nbsp; to&nbsp; $1$,&nbsp; and the PDF (upper sketch) is:
:$$f_X(x) = \left\{ \begin{array}{cl} 2x & {\rm for} \hspace{0.1cm} 0 \le x \le 1 \\ 0 & {\rm else} \\ \end{array} \right. \hspace{0.05cm}.$$
* According to the lower sketch, the random variable&nbsp; $Y$&nbsp; has the following PDF:
:$$f_Y(y) = \left\{ \begin{array}{cl} 1 - |\hspace{0.03cm}y\hspace{0.03cm}| & {\rm for} \hspace{0.1cm} |\hspace{0.03cm}y\hspace{0.03cm}| \le 1 \\ 0 & {\rm else} \\ \end{array} \right. \hspace{0.05cm}.$$
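As a quick plausibility check (not part of the original exercise), both densities can be written as short Python functions and integrated numerically; SciPy is assumed to be available:

<syntaxhighlight lang="python">
# Sketch: the two triangular densities as Python functions,
# with a numerical check that each integrates to 1.
from scipy.integrate import quad

def f_X(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0      # upper sketch

def f_Y(y):
    return 1.0 - abs(y) if abs(y) <= 1.0 else 0.0   # lower sketch

print(quad(f_X, 0.0, 1.0)[0])    # -> 1.0
print(quad(f_Y, -1.0, 1.0)[0])   # -> 1.0
</syntaxhighlight>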
  
For both random variables, the&nbsp; [[Information_Theory/Differentielle_Entropie|differential entropy]]&nbsp; is to be determined in each case.
  
For example, the corresponding equation for the random variable&nbsp; $X$&nbsp; is:
 
:$$h(X) = \hspace{0.1cm} - \hspace{-0.45cm} \int\limits_{{\rm supp}\hspace{0.03cm}(\hspace{-0.03cm}f_X)} \hspace{-0.35cm}  f_X(x) \cdot {\rm log} \hspace{0.1cm} \big [ f_X(x) \big ] \hspace{0.1cm}{\rm d}x \hspace{0.6cm}{\rm with}\hspace{0.6cm} {\rm supp}(f_X) = \{ x\text{:} \  f_X(x) > 0 \} \hspace{0.05cm}.$$
*If the&nbsp; ''natural logarithm''&nbsp; is used, the pseudo-unit&nbsp; "nat"&nbsp; must be appended. 
*If, on the other hand, the result is asked for in&nbsp; "bit",&nbsp; the&nbsp; ''binary logarithm'' &nbsp; &#8658; &nbsp; "log<sub>2</sub>"&nbsp; is to be used.
  
  
In the fourth subtask, the new random variable&nbsp; $Z = A \cdot Y$&nbsp; is considered.&nbsp; Here, the PDF parameter&nbsp; $A$&nbsp; is to be determined such that the differential entropy of the new random variable&nbsp; $Z$&nbsp; yields exactly&nbsp; $1\,{\rm bit}$:<br>
:$$h(Z) = h (A \cdot Y) =  h (Y)  + {\rm log}_2 \hspace{0.1cm} (A) = 1\,{\rm bit} \hspace{0.05cm}.$$
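The integral definition above translates directly into a small numerical helper. The following sketch (an addition of this rewrite, not part of the original exercise; NumPy/SciPy assumed) evaluates the differential entropy in "bit" by quadrature over the support, and can be used to cross-check the results derived in the solution:

<syntaxhighlight lang="python">
# Sketch: differential entropy in "bit" by numerical quadrature.
import numpy as np
from scipy.integrate import quad

def h_bit(f, a, b):
    """h = -Integral of f(x)*log2(f(x)) over the support [a, b]."""
    integrand = lambda x: -f(x) * np.log2(f(x)) if f(x) > 0.0 else 0.0
    return quad(integrand, a, b)[0]

print(h_bit(lambda x: 2.0 * x, 0.0, 1.0))          # h(X) ~ -0.279 bit
print(h_bit(lambda y: 1.0 - abs(y), -1.0, 1.0))    # h(Y) ~ +0.721 bit
</syntaxhighlight>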
  
  
  
  
''Hints:''
*The task belongs to the chapter&nbsp; [[Information_Theory/Differentielle_Entropie|Differential Entropy]].
*Useful hints for solving this task and further information on continuous random variables can be found in the third chapter&nbsp; "Continuous Random Variables"&nbsp; of the book&nbsp; [[Theory of Stochastic Signals]].
 
   
 
   
*The following indefinite integral is given:
:$$\int  \xi \cdot {\rm ln} \hspace{0.1cm} (\xi)\hspace{0.1cm}{\rm d}\xi = \xi^2 \cdot \big [1/2 \cdot {\rm ln} \hspace{0.1cm} (\xi) - 1/4 \big ] \hspace{0.05cm}.$$
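This antiderivative can also be verified symbolically, e.g. with SymPy (a sketch, assuming the library is installed):

<syntaxhighlight lang="python">
# Sketch: symbolic check of the given indefinite integral.
import sympy as sp

xi = sp.symbols('xi', positive=True)
lhs = sp.integrate(xi * sp.ln(xi), xi)                 # SymPy's antiderivative
rhs = xi**2 * (sp.ln(xi) / 2 - sp.Rational(1, 4))      # form given above
print(sp.simplify(lhs - rhs))                          # -> 0
</syntaxhighlight>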
  
  
===Questions===

<quiz display=simple>
{Calculate the differential entropy of the random variable&nbsp; $X$&nbsp; in&nbsp; "nat".
|type="{}"}
h(X) =  { -0.199--0.187 }  nat

{What result is obtained with the pseudo-unit&nbsp; "bit"?
|type="{}"}
h(X) =  { -0.288--0.270 }  bit

{Calculate the differential entropy of the random variable&nbsp; $Y$.
|type="{}"}
h(Y) =  { 0.721 3% }  bit

{Determine the PDF parameter&nbsp; $A$&nbsp; such that&nbsp; $h(Z) = h(A \cdot Y) = 1\,{\rm bit}$&nbsp; holds.
|type="{}"}
A = { 1.213 3% }
</quiz>
  
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; For the probability density function, the following holds by definition in the range&nbsp; $0 \le x \le 1$:
:$$f_X(x) = 2x = C \cdot x
\hspace{0.05cm}.$$ 
*Here we have replaced&nbsp; "2"&nbsp; by&nbsp; $C$ &nbsp; &#8658; &nbsp; a generalization that allows the following calculation to be reused in subtask&nbsp; '''(3)'''.
  
*Since the differential entropy is sought in&nbsp; "nat",&nbsp; we use the natural logarithm.&nbsp; With the substitution&nbsp; $\xi = C \cdot x$&nbsp; we obtain:
 
:$$h_{\rm nat}(X) = \hspace{0.1cm} - \int_{0}^{1} \hspace{0.1cm}  C \cdot x \cdot {\rm ln} \hspace{0.1cm} \big[ C \cdot x \big] \hspace{0.1cm}{\rm d}x =  
\hspace{0.1cm} - \hspace{0.1cm}\frac{1}{C} \cdot \int_{0}^{C} \hspace{0.1cm}  \xi \cdot {\rm ln} \hspace{0.1cm} [ \xi ] \hspace{0.1cm}{\rm d}\xi = - \frac{\xi^2}{C} \cdot \left [ \frac{{\rm ln} \hspace{0.1cm} (\xi)}{2} - \frac{1}{4}\right ]_{\xi = 0}^{\xi = C} 
\hspace{0.05cm}.$$
*Here the indefinite integral given above was used.&nbsp; After inserting the limits and considering&nbsp; $C = 2$,&nbsp; we obtain:
:$$h_{\rm nat}(X) = - C/2 \cdot \big [ {\rm ln} \hspace{0.1cm} (C) - 1/2 \big ] = - {\rm ln} \hspace{0.1cm} (2) + 1/2 = - {\rm ln} \hspace{0.1cm} (2) + 1/2 \cdot {\rm ln} \hspace{0.1cm} ({\rm e}) = {\rm ln} \hspace{0.1cm} (\sqrt{\rm e}/2) = - 0.193
\hspace{0.3cm} \Rightarrow\hspace{0.3cm} h(X) \hspace{0.15cm}\underline {= - 0.193\,{\rm nat}} \hspace{0.05cm}.$$
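A numerical cross-check of this value (sketch, NumPy/SciPy assumed; not part of the original solution):

<syntaxhighlight lang="python">
# Sketch: h(X) in "nat" by quadrature vs. the closed form ln(sqrt(e)/2).
import numpy as np
from scipy.integrate import quad

h_nat_X = quad(lambda x: -2.0*x * np.log(2.0*x) if x > 0.0 else 0.0, 0.0, 1.0)[0]
print(h_nat_X)                          # -> -0.193...
print(np.log(np.sqrt(np.e) / 2.0))      # -> -0.193... (closed form)
</syntaxhighlight>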
  
  
'''(2)'''&nbsp; In general:
[[File:P_ID2866__Inf_A_4_2c.png|right|frame|To calculate&nbsp; $h(Y)$]]
:$$h_{\rm bit}(X) = \frac{h_{\rm nat}(X)}{{\rm ln} \hspace{0.1cm} (2)\,{\rm nat/bit}} = - 0.279
\hspace{0.3cm} \Rightarrow\hspace{0.3cm}
h(X) \hspace{0.15cm}\underline {= - 0.279\,{\rm bit}} 
\hspace{0.05cm}.$$
*This conversion can be skipped if&nbsp; "ln"&nbsp; is replaced by&nbsp; "log<sub>2</sub>"&nbsp; directly in the analytical result of subtask&nbsp; '''(1)''':
:$$h(X) = \  {\rm log}_2 \hspace{0.1cm} (\sqrt{\rm e}/2)\hspace{0.05cm}, \hspace{1.3cm}
{\rm pseudo-unit\hspace{-0.1cm}:\hspace{0.15cm} bit} 
\hspace{0.05cm}.$$
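In Python, both routes give the same number (a trivial sketch, NumPy assumed):

<syntaxhighlight lang="python">
# Sketch: nat -> bit conversion vs. evaluating log2 directly.
import numpy as np

print(np.log(np.sqrt(np.e) / 2.0) / np.log(2.0))   # -0.193 nat / ln(2) -> -0.279 bit
print(np.log2(np.sqrt(np.e) / 2.0))                # direct log2 form   -> -0.279 bit
</syntaxhighlight>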
  
  
  
'''(3)'''&nbsp; We again use the natural logarithm and divide the integral into two partial integrals:
:$$h(Y) = \hspace{0.1cm} - \hspace{-0.45cm} \int\limits_{{\rm supp}\hspace{0.03cm}(\hspace{-0.03cm}f_Y)} \hspace{-0.35cm}  f_Y(y) \cdot {\rm ln} \hspace{0.1cm} \big [ f_Y(y) \big ] \hspace{0.1cm}{\rm d}y = I_{\rm neg} + I_{\rm pos}
\hspace{0.05cm}.$$
  
*The first integral, for the range&nbsp; $-1 \le y \le 0$,&nbsp; has the same form as that of subtask&nbsp; '''(1)'''&nbsp; and is merely shifted with respect to it, which does not affect the result.
*Now the height&nbsp; $C = 1$&nbsp; instead of&nbsp; $C = 2$&nbsp; has to be considered:
 
:$$I_{\rm neg} =- C/2 \cdot \big [ {\rm ln} \hspace{0.1cm} (C) - 1/2 \big ] = - 1/2 \cdot \big [ {\rm ln} \hspace{0.1cm} (1) - 1/2 \cdot {\rm ln} \hspace{0.1cm} ({\rm e}) \big ] = 1/4 \cdot {\rm ln} \hspace{0.1cm} ({\rm e})
\hspace{0.05cm}.$$
  
*The second integrand is identical to the first except for a shift and a reflection.&nbsp; Moreover, the integration intervals do not overlap &nbsp; &#8658; &nbsp; $I_{\rm pos} = I_{\rm neg}$:
 
:$$h_{\rm nat}(Y)  = 2 \cdot I_{\rm neg} =  1/2 \cdot {\rm ln} \hspace{0.1cm} ({\rm e}) = {\rm ln} \hspace{0.1cm} (\sqrt{\rm e}) \hspace{0.3cm}
\Rightarrow\hspace{0.3cm} h_{\rm bit}(Y) = {\rm log}_2 \hspace{0.1cm} (\sqrt{\rm e}) \hspace{0.3cm}\Rightarrow\hspace{0.3cm} h(Y) = {\rm log}_2 \hspace{0.1cm} (1.649) \hspace{0.15cm}\underline {= 0.721\,{\rm bit}}
\hspace{0.05cm}.$$
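Again, a short numerical cross-check (sketch, NumPy/SciPy assumed; not part of the original solution):

<syntaxhighlight lang="python">
# Sketch: h(Y) in "bit" by quadrature vs. the closed form log2(sqrt(e)).
import numpy as np
from scipy.integrate import quad

f_Y = lambda y: 1.0 - abs(y)
h_bit_Y = quad(lambda y: -f_Y(y) * np.log2(f_Y(y)) if f_Y(y) > 0.0 else 0.0,
               -1.0, 1.0)[0]
print(h_bit_Y)                 # -> 0.721...
print(np.log2(np.sqrt(np.e)))  # -> 0.721... (closed form)
</syntaxhighlight>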
  
  
'''(4)'''&nbsp; For the differential entropy of the random variable&nbsp; $Z = A \cdot Y$,&nbsp; the following holds in general:
:$$h(Z) = h(A \cdot Y) = h(Y) + {\rm log}_2 \hspace{0.1cm} (A)\hspace{0.05cm}.$$
*Thus, from the requirement&nbsp; $h(Z) = 1\,{\rm bit}$&nbsp; and the result of subtask&nbsp; '''(3)''',&nbsp; it follows that:
 
:$${\rm log}_2 \hspace{0.1cm} (A) = 1\,{\rm bit} - 0.721 \,{\rm bit} = 0.279 \,{\rm bit}
\hspace{0.3cm} \Rightarrow\hspace{0.3cm} A = 2^{0.279}\hspace{0.15cm}\underline
{= 1.213} \hspace{0.05cm}.$$
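Since&nbsp; $h(Y) = {\rm log}_2(\sqrt{\rm e})$&nbsp; bit holds exactly, the exact solution is&nbsp; $A = 2/\sqrt{\rm e}$; a one-line check (sketch, NumPy assumed):

<syntaxhighlight lang="python">
# Sketch: A = 2^(1 - h(Y)) with h(Y) = log2(sqrt(e)) bit, i.e. A = 2/sqrt(e).
import numpy as np

print(2.0 ** (1.0 - np.log2(np.sqrt(np.e))))   # -> 1.2131...
print(2.0 / np.sqrt(np.e))                     # -> 1.2131... (exact form)
</syntaxhighlight>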
  
  
[[Category:Information Theory: Exercises|^4.1  Differential Entropy^]]
