Exercise 3.3: Entropy of Ternary Quantities

{{quiz-Header|Buchseite=Information_Theory/Some_Preliminary_Remarks_on_Two-Dimensional_Random_Variables
}}

[[File:P_ID2754__Inf_A_3_3.png|right|frame|Given entropy functions]]
On the right you see the entropy functions  $H_{\rm R}(p)$,  $H_{\rm B}(p)$  and  $H_{\rm G}(p)$, where  $\rm R$  stands for "red",  $\rm B$  for "blue" and  $\rm G$  for "green".  For all three random variables, the probability function has the form:
 
:$$P_X(X) = [\hspace{0.05cm}p_1\hspace{0.05cm},\ p_2\hspace{0.05cm},\ p_3\hspace{0.05cm}]\hspace{0.3cm}\hspace{0.3cm} \Rightarrow \hspace{0.3cm} |X| = 3\hspace{0.05cm}.$$
For the questions below,  $p_1 = p$  and  $p_2 = 1 - p_3- p$  hold.
  
The probability function of a random variable
 
:$$X = \big \{\hspace{0.05cm}x_1\hspace{0.05cm}, \hspace{0.15cm} x_2\hspace{0.05cm},\hspace{0.15cm} \text{...}\hspace{0.1cm} ,\hspace{0.15cm} x_{\mu}\hspace{0.05cm}, \hspace{0.05cm}\text{...}\hspace{0.1cm} , \hspace{0.15cm} x_{M}\hspace{0.05cm}\big  \}$$
with symbol set size  $|X| = M$  is in general:
 
:$$P_X(X) = [\hspace{0.05cm}p_1\hspace{0.05cm}, \hspace{0.15cm} p_2\hspace{0.05cm},\hspace{0.05cm} ...\hspace{0.1cm} ,\hspace{0.15cm} p_{\mu}\hspace{0.05cm}, \hspace{0.05cm}...\hspace{0.1cm} , \hspace{0.15cm} p_{M}\hspace{0.05cm}]\hspace{0.05cm}.$$
The entropy (uncertainty) of this random variable is calculated according to the equation
 
:$$H(X) = {\rm E} \big [\log_2 \hspace{0.05cm} {1}/{P_X(X)} \big ]\hspace{0.05cm},$$
and always lies in the range  $0 \le H(X)  \le  \log_2 \hspace{0.05cm}  |X|$.  
  
*The lower bound  $H(X) = 0$  results if one probability  $p_\mu = 1$  while all others are zero.  
  
*The upper bound is to be derived here as in the lecture "Information Theory" by  [[Biographies_and_Bibliographies/Lehrstuhlinhaber_des_LNT#Prof._Dr._sc._techn._Gerhard_Kramer_.28seit_2010.29|Gerhard Kramer]]  at the TU Munich:
[[File:P_ID2755__Inf_A_3_3_B_neu.png|right|frame|Upper bound estimate for the natural logarithm]]
:* Expanding the above equation by  $|X|$  in numerator and denominator and using  $\log_2 \hspace{0.05cm}x= \ln(x)/\ln(2)$, one obtains:
 
::$$H(X) = \frac{1}{{\rm ln}(2)}\cdot {\rm E} \left [{\rm ln} \hspace{0.1cm} \frac{1}{|X| \cdot P_X(X)} \right ] + {\rm log}_2 \hspace{0.1cm}|X| \hspace{0.05cm}.$$
:* As the graph opposite shows, the bound  $\ln(x) \le x-1$  holds, with equality for  $x=1$.  Thus one can write:
 
::$$H(X) \le \frac{1}{{\rm ln}(2)}\cdot {\rm E} \left [\frac{1}{|X| \cdot P_X(X)} -1 \right ] + {\rm log}_2 \hspace{0.1cm}|X| \hspace{0.05cm}.$$
:* In  [[Aufgaben:3.2_Erwartungswertberechnungen|Exercise 3.2]],  the expected value  ${\rm E} \big [{1}/{P_X(X)} \big ] =|X|$  was calculated for the case  $p_\mu \ne 0$  for all  $\mu$.  The first term therefore vanishes and the well-known result is obtained:
 
::$$H(X) \le {\rm log}_2 \hspace{0.1cm}|X| \hspace{0.05cm}.$$
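The definition of  $H(X)$  and the bounds  $0 \le H(X) \le \log_2 |X|$  can also be checked numerically. A minimal sketch in Python (the helper name `entropy` is chosen here for illustration and is not part of the exercise):

```python
import math

def entropy(pmf):
    """H(X) = E[log2(1/P_X(X))] in bit; terms with p = 0 contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in pmf if p > 0)

print(entropy([0.5, 0.25, 0.25]))   # 1.5 bit for an example PMF with |X| = 3
print(entropy([1.0, 0.0, 0.0]))     # 0.0 → lower bound (one certain outcome)
print(entropy([1/3, 1/3, 1/3]))     # log2(3) ≈ 1.585 → upper bound (uniform)
```

The three printed values illustrate the general rule: the entropy vanishes for a deterministic variable and is maximal, namely  $\log_2 |X|$, for the uniform distribution.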
  
  
  
''Hints:''
*The exercise belongs to the chapter  [[Information_Theory/Einige_Vorbemerkungen_zu_zweidimensionalen_Zufallsgrößen|Some preliminary remarks on 2D random variables]].
*In particular, reference is made to the page  [[Information_Theory/Einige_Vorbemerkungen_zu_zweidimensionalen_Zufallsgrößen#Wahrscheinlichkeitsfunktion_und_Entropie|Probability function and entropy]].
*The same constellation is assumed here as in  [[Aufgaben:Aufgabe_3.2:_Erwartungswertberechnungen|Exercise 3.2]].
 
   
 
   
*The equation of the binary entropy function is:
 
:$$H_{\rm bin}(p) =  p \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{p} +  
  (1-p) \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{1-p} \hspace{0.05cm}.$$
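The binary entropy function is easy to evaluate directly; a small sketch (Python, the name `h_bin` is chosen here for illustration):

```python
import math

def h_bin(p):
    """Binary entropy function in bit, with H_bin(0) = H_bin(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

print(h_bin(0.5))                   # 1.0 → the maximum, reached at p = 0.5
print(h_bin(0.25) == h_bin(0.75))   # True → symmetric about p = 0.5
```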
  
  
===Questions===
  
 
<quiz display=simple>
{Which statements are true for the red entropy function&nbsp; $H_{\rm R}(p)$?
 
|type="[]"}
+ $H_{\rm R}(p)$&nbsp; results, for example, with &nbsp;$p_1 = p$, &nbsp;$p_2 = 1- p$  &nbsp;and&nbsp; $p_3 = 0$.
+ $H_{\rm R}(p)$&nbsp; is identical to the binary entropy function&nbsp; $H_{\rm bin}(p)$.
  
  
{Which properties does the binary entropy function&nbsp; $H_{\rm bin}(p)$&nbsp; have?
 
|type="[]"}
+ $H_{\rm bin}(p)$&nbsp; is concave with respect to the parameter&nbsp; $p$.
- It holds that&nbsp; $\text {Max }  [H_{\rm bin}(p)] = 2$&nbsp; bit.
  
  
{Which statements are true for the blue entropy function&nbsp; $H_{\rm B}(p)$?
 
|type="[]"}
+ $H_{\rm B}(p)$&nbsp; results, for example, with &nbsp;$p_1 = p$, &nbsp;$p_2 = 1/2- p$  &nbsp;and&nbsp; $p_3 = 1/2$.
+ It holds that&nbsp; $H_{\rm B}(p = 0)= 1$&nbsp; bit.
- It holds that&nbsp; $\text {Max } [H_{\rm B}(p)] = \log_2 \hspace{0.1cm} (3)$&nbsp; bit.
  
  
{Which statements are true for the green entropy function&nbsp; $H_{\rm G}(p)$?
 
|type="[]"}
+ $H_{\rm G}(p)$&nbsp; results, for example, with &nbsp;$p_1 = p$, &nbsp;$p_2 = 2/3- p$  &nbsp;and&nbsp; $p_3 = 1/3$.
- It holds that&nbsp; $H_{\rm G}(p = 0)= 1$&nbsp; bit.
+ It holds that&nbsp; $\text {Max } [H_{\rm G}(p)] =  \log_2 \hspace{0.1cm} (3)$&nbsp; bit.
  
  
 
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; <u>Both statements are correct:</u>  
*If we set &nbsp;$p_3 = 0$ and formally &nbsp;$p_1 = p$ &nbsp; &#8658; &nbsp; &nbsp;$p_2 = 1- p$, we get the binary entropy function
 
:$$H_{\rm bin}(p) =  p \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{p} +  
  (1-p) \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{1-p} \hspace{0.05cm}.$$
  
  
'''(2)'''&nbsp; Only the <u>proposed solution 1</u> is correct:
*Using&nbsp; $\log_2 (x) = \ln(x)/\ln(2)$,&nbsp; the binary entropy function can also be written as:
 
:$$H_{\rm bin}(p) = \frac{-1}{{\rm ln}(2)} \cdot \big [  p \cdot {\rm ln}(p)  +  
  (1-p) \cdot {\rm ln}(1-p) \big ] \hspace{0.05cm}.$$
*The first derivative leads to the result
 
:$$\frac {{\rm d}H_{\rm bin}(p)}{{\rm d}p} = \frac{-1}{{\rm ln}(2)} \cdot \big [  {\rm ln}(p)  + p \cdot \frac{1}{p} -  
   {\rm ln}(1-p) - (1-p) \cdot \frac{1}{1-p} \big ] =
\frac{1}{{\rm ln}(2)} \cdot \big [ {\rm ln}(1-p) - {\rm ln}(p)  \big ] = {\rm log}_2 \hspace{0.1cm} \frac{1-p}{p} \hspace{0.05cm}.$$
*By setting this derivative to zero, we obtain the abscissa value&nbsp; $p = 0.5$, which leads to the maximum of the entropy function: &nbsp; $H_{\rm bin}(p =0.5) = 1$ bit <br>&#8658; &nbsp; the proposed solution 2 is wrong.
*Differentiating again, one obtains the second derivative:
 
:$$\frac {{\rm d}^2H_{\rm bin}(p)}{{\rm d}p^2} = \frac{1}{{\rm ln}(2)} \cdot \left
  [  \frac{-1}{1-p}  - \frac{1}{p}    \right ] =
\frac{-1}{{\rm ln}(2) \cdot p \cdot (1-p)}  \hspace{0.05cm}.$$
*This function is negative in the entire definition domain&nbsp; $0 &#8804; p &#8804; 1$&nbsp; &nbsp; &#8658; &nbsp; $H_{\rm bin}(p)$ is concave &nbsp; &#8658; &nbsp; the proposed solution 1 is correct.
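The closed-form derivatives obtained above can be cross-checked numerically. A sketch (Python, helper names chosen here for illustration):

```python
import math

def h_bin(p):
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

def dh(p):
    # first derivative derived above: log2((1-p)/p)
    return math.log2((1 - p) / p)

def d2h(p):
    # second derivative derived above: -1 / (ln(2) * p * (1-p))
    return -1.0 / (math.log(2) * p * (1 - p))

print(dh(0.5))                                          # 0.0 → extremum at p = 0.5
print(h_bin(0.5))                                       # 1.0 bit, not 2 bit
print(all(d2h(p) < 0 for p in (0.05, 0.3, 0.5, 0.95)))  # True → concave
```

The second derivative stays negative on the whole interval, which is exactly the concavity statement of proposed solution 1.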
  
  
  
[[File:P_ID2756__Inf_A_3_3_ML.png|right|frame|Three entropy functions with&nbsp; $M = 3$]]
'''(3)'''&nbsp; <u>Statements 1 and 2</u> are correct here:
* For&nbsp; $p = 0$&nbsp; one obtains the probability function&nbsp; $P_X(X) = \big  [\hspace{0.05cm}0\hspace{0.05cm}, \hspace{0.15cm} 1/2\hspace{0.05cm},\hspace{0.15cm} 1/2 \hspace{0.05cm} \big ]$ &nbsp; &#8658; &nbsp; $H(X) = 1$&nbsp; bit.
* The maximum under the condition&nbsp; $p_3 = 1/2$&nbsp; is obtained for&nbsp; $p_1 = p_2 = 1/4$:
 
:$$P_X(X) = \big  [\hspace{0.05cm}1/4\hspace{0.05cm}, \hspace{0.05cm} 1/4\hspace{0.05cm},\hspace{0.05cm} 1/2 \hspace{0.05cm} \big ]
\hspace{0.3cm} \Rightarrow \hspace{0.3cm}
{\rm Max} \ [H_{\rm B}(p)] = 1.5 \ \rm bit
\hspace{0.05cm}.$$
*In compact form,&nbsp; $H_{\rm B}(p)$&nbsp; with the constraint&nbsp; $0 &#8804; p &#8804; 1/2$&nbsp; can be represented as follows:
 
:$$H_{\rm B}(p) = 1.0\,{\rm bit} + {1}/{2} \cdot H_{\rm bin}(2p)  
\hspace{0.05cm}.$$
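The compact representation can be verified against a direct entropy computation over the PMF  $[p,\ 1/2-p,\ 1/2]$. A sketch (Python, helper names chosen here for illustration):

```python
import math

def entropy(pmf):
    return sum(p * math.log2(1 / p) for p in pmf if p > 0)

def h_bin(p):
    return entropy([p, 1 - p])

for p in (0.0, 0.1, 0.25, 0.4):
    direct  = entropy([p, 0.5 - p, 0.5])   # blue PMF [p, 1/2 - p, 1/2]
    compact = 1.0 + 0.5 * h_bin(2 * p)     # 1.0 bit + 1/2 * H_bin(2p)
    print(p, round(direct, 6), round(compact, 6))
```

Both columns agree for every  $p$; at  $p = 0.25$  they reach the maximum of  $1.5$  bit, as computed above.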
  
  
'''(4)'''&nbsp; <u>The first and last statements</u> are correct here:
* For&nbsp; $p = 1/3$,&nbsp; the green curve also includes the uniform distribution of all probabilities, so that
 
:$$ {\rm Max} \ [H_{\rm G}(p)] = \log_2 (3)\ \text{bit}.$$  
*In general, the entire curve in the range&nbsp; $0 &#8804; p &#8804; 2/3$&nbsp; can be expressed as follows:
 
:$$H_{\rm G}(p) = H_{\rm G}(p= 0) + {2}/{3} \cdot H_{\rm bin}(3p/2)  
\hspace{0.05cm}.$$
*From the graph on the exercise page, one can also see that the following condition must be fulfilled:
 
:$$H_{\rm G}(p = 0) + {2}/{3}= {\rm log}_2 \hspace{0.01cm} (3)
\hspace{0.3cm} \Rightarrow \hspace{0.3cm}
H_{\rm G}(p= 0) = 1.585 - 0.667 = 0.918 \,{\rm bit}
\hspace{0.05cm}.$$
*The second suggested solution is therefore wrong.&nbsp; The same result can be obtained with the equation
 
:$$H_{\rm G}(p = 0) = {1}/{3} \cdot  {\rm log}_2 \hspace{0.01cm} (3)
+{2}/{3} \cdot  {\rm log}_2 \hspace{0.01cm} (3/2) = {\rm log}_2 \hspace{0.01cm} (3) -2/3 \cdot {\rm log}_2 \hspace{0.01cm} (2)
\hspace{0.05cm}.$$
  
  
[[Category:Information Theory: Exercises|^3.1 General Information on 2D Random Variables^]]
Latest revision as of 10:13, 24 September 2021