Exercise 1.2: Entropy of Ternary Sources

{{quiz-Header|Buchseite=Information_Theory/Discrete_Memoryless_Sources}}
 
  
[[File:Inf_A_1_2_vers2.png|right|frame|Probabilities of two ternary sources]]
The entropy of a discrete memoryless source with&nbsp; $M$&nbsp; possible symbols is:
 
:$$H =  \sum_{\mu = 1}^M p_{\mu} \cdot {\rm log}_2\hspace{0.1cm}\frac{1}{p_\mu}\hspace{0.05cm},\hspace{0.3cm}
  \text{pseudo-unit: bit}\hspace{0.05cm}.$$
Here, the&nbsp; $p_\mu$&nbsp; denote the occurrence probabilities of the individual symbols or events.&nbsp; In the present example, the events are denoted by&nbsp; $\rm R$(ed),&nbsp; $\rm G$(reen)&nbsp; and&nbsp; $\rm S$(chwarz)&nbsp; with&nbsp; "Schwarz"&nbsp; being the German word for&nbsp; "Black".
  
*For a binary source with the occurrence probabilities&nbsp; $p$ &nbsp;and&nbsp; $1-p$&nbsp; this can be written:
 
:$$H = H_{\rm bin}(p) = p \cdot {\rm log}_2\hspace{0.1cm}\frac{1}{p}+ (1-p) \cdot 
{\rm log}_2\hspace{0.1cm}\frac{1}{1-p}\hspace{0.05cm},\hspace{0.3cm}
\text{pseudo-unit: bit}\hspace{0.05cm}.$$
*The entropy of a multilevel source can often be expressed with this&nbsp; "binary entropy function"&nbsp; $H_{\rm bin}(p)$.
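The two formulas can easily be checked numerically.&nbsp; The following Python sketch is not part of the original exercise; the helper names&nbsp; <code>entropy</code>&nbsp; and&nbsp; <code>h_bin</code>&nbsp; are chosen here purely for illustration:
<pre>
import math

def entropy(probs):
    # Entropy H (in "bit") of a memoryless source with the given symbol probabilities
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def h_bin(p):
    # Binary entropy function H_bin(p) (in "bit")
    return entropy([p, 1.0 - p])

# Illustrative checks (not part of the original exercise):
print(h_bin(0.5))                  # 1.0 bit for a binary source with equally probable symbols
print(entropy([1/3, 1/3, 1/3]))    # approx. 1.585 bit for three equally probable symbols
</pre>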
  
In this exercise, two ternary sources are considered, with symbol probabilities according to the graph above:
  
# source&nbsp; $\rm Q_1$ with&nbsp; $p_{\rm G }= 1/2$, &nbsp;$p_{\rm S }= 1/3$ &nbsp;and&nbsp; $p_{\rm R }= 1/6$,
# source&nbsp; $\rm Q_2$ with&nbsp; $p_{\rm G }= p$ &nbsp;and&nbsp;  $p_{\rm S } = p_{\rm R } = (1-p)/2$.
  
 
  
*The ternary source&nbsp; $\rm Q_2$&nbsp; can also be applied to&nbsp; "Roulette"&nbsp; when a player bets only on the squares&nbsp; $\rm R$(ed),&nbsp; $\rm S$(chwarz)&nbsp; and $\rm G$(reen)&nbsp; (the "zero").&nbsp; This type of game is referred to as&nbsp; $\text{Roulette 1}$&nbsp; in the question section.
  
*In contrast,&nbsp; $\text{Roulette 2}$&nbsp; indicates that the player bets on single numbers&nbsp; $(0$, ... , $36)$.
  
''Hint:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Gedächtnislose_Nachrichtenquellen|Discrete Memoryless Sources]].

===Questions===
  
 
<quiz display=simple>
 
{What is the entropy&nbsp; $H$&nbsp; of the source&nbsp; $\rm \underline{Q_1}$?
|type="{}"}
$H \ = \ $ { 1.46 3% } $\ \rm bit$
  
  
{Which of the following statements are true if&nbsp; $\rm R$,&nbsp; $\rm G$&nbsp; and&nbsp; $\rm S$&nbsp; are represented by the numerical values&nbsp; $-1$, &nbsp;$0$ &nbsp;and&nbsp; $+1$?
|type="()"}
- The result is a smaller entropy.
+ The entropy remains the same.
- The result is a greater entropy.
  
  
{Determine the entropy of the source&nbsp; $\rm \underline{Q_2}$&nbsp; using the binary entropy function&nbsp; $H_{\rm bin}(p)$.&nbsp; What value results for&nbsp; $\underline{p = 0.5}$?
|type="{}"}
$H \ = \ $ { 1.5 3% } $\ \rm bit$
  
  
{For which&nbsp; $p$&ndash;value of the source&nbsp; $\rm \underline{Q_2}$&nbsp; does the maximum entropy result:&nbsp; $H &#8594; H_\text{max}$?
|type="{}"}
$p \ = \ $ { 0.333 3% }
  
  
{What is the entropy of the source model&nbsp; $\text{Roulette 1}$,&nbsp; i.e. with respect to the events&nbsp; $\rm R$(ed),&nbsp; $\rm S$(chwarz)&nbsp; and&nbsp; $\rm G$(reen)&nbsp; (the "zero")?
|type="{}"}
$H \ = \ $ { 1.152 3% } $\ \rm bit$
  
  
{What is the entropy of&nbsp; $\text{Roulette 2}$,&nbsp; i.e. with regard to the numbers&nbsp; $0$, ... , $36$?
|type="{}"}
$H \ = \ $ { 5.209 3% } $\ \rm bit$
  
  
 
</quiz>
 
  
===Solution===
 
{{ML-Kopf}}
 
'''(1)'''&nbsp; With the "symbol" probabilities&nbsp; $1/2$,&nbsp; $1/3$&nbsp; and&nbsp; $1/6$&nbsp; we get the following entropy value:
:$$H \hspace{0.1cm}  =  \hspace{0.1cm}  1/2 \cdot {\rm log}_2\hspace{0.1cm}(2) +1/3 \cdot {\rm log}_2\hspace{0.1cm}(3) +1/6 \cdot {\rm log}_2\hspace{0.1cm}(6) =(1/2 + 1/6)\cdot {\rm log}_2\hspace{0.1cm}(2) +  (1/3 + 1/6)\cdot {\rm log}_2\hspace{0.1cm}(3) = 2/3 \cdot 1\,{\rm bit} + 1/2 \cdot 1.585\,{\rm bit}\hspace{0.15cm}\underline {\approx 1.46 \, {\rm bit}} \hspace{0.05cm}.$$
 
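A quick numerical cross-check of this value (our own sketch, not part of the original solution):
<pre>
import math

probs = [1/2, 1/3, 1/6]                      # p_G, p_S, p_R of source Q1
H = sum(p * math.log2(1/p) for p in probs)   # entropy in bit
print(round(H, 2))                           # 1.46
</pre>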
'''(2)'''&nbsp; <u>Proposed solution 2</u> is correct:
*The entropy depends only on the occurrence probabilities.
*It does not matter which numerical values or physical quantities one assigns to the individual symbols.
*This is different for mean values or the auto-correlation function (ACF):&nbsp; if only symbols are given, no moments can be calculated from them.
*Moreover, mean values, auto-correlation, etc. depend on whether the assignment is agreed to be bipolar&nbsp; $(-1, \hspace{0.10cm}0, \hspace{0.05cm}+1)$&nbsp; or unipolar&nbsp; $(0, \hspace{0.05cm}1, \hspace{0.05cm}2)$&nbsp; &ndash; see the sketch below.
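The following sketch (our own illustration, not part of the original solution) computes the entropy of source&nbsp; $\rm Q_1$&nbsp; together with the linear mean for a bipolar and a unipolar value assignment; the entropy is identical in both cases, the mean is not:
<pre>
import math

probs    = [1/6, 1/2, 1/3]    # p_R, p_G, p_S of source Q1
bipolar  = [-1, 0, +1]        # one possible value assignment for R, G, S
unipolar = [0, 1, 2]          # another possible value assignment

H        = sum(p * math.log2(1/p) for p in probs)
mean_bi  = sum(p * x for p, x in zip(probs, bipolar))
mean_uni = sum(p * x for p, x in zip(probs, unipolar))

print(round(H, 2))         # 1.46 bit - independent of the assignment
print(round(mean_bi, 3))   # 0.167   - depends on the assignment
print(round(mean_uni, 3))  # 1.167   - depends on the assignment
</pre>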
'''(3)'''&nbsp; The entropy of source&nbsp; $\rm Q_2$&nbsp; can be expressed as follows:
:$$H \hspace{0.1cm} =  \hspace{0.1cm} p \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{p}+ 2 \cdot \frac{1-p}{2}  \cdot {\rm log}_2\hspace{0.1cm}\frac {2}{1-p}= p \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{p}+ (1-p)  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{1-p} + (1-p)\cdot {\rm log}_2\hspace{0.1cm}(2)= H_{\rm bin}(p) + 1-p \hspace{0.05cm}.$$
*For&nbsp; $p = 0.5$ &nbsp;&nbsp;&#8658;&nbsp;&nbsp; $H_{\rm bin}(p) = 1$,&nbsp; we get&nbsp; $\underline{H = 1.5\hspace{0.05cm}\rm  bit}$&nbsp; (see the numerical check below).
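A short numerical check (our own sketch; the helpers&nbsp; <code>entropy</code>&nbsp; and&nbsp; <code>h_bin</code>&nbsp; are defined only for this illustration) confirms that the direct evaluation and the closed form&nbsp; $H_{\rm bin}(p) + 1 - p$&nbsp; agree:
<pre>
import math

def entropy(probs):
    return sum(q * math.log2(1/q) for q in probs if q > 0)

def h_bin(p):
    return entropy([p, 1 - p])

p = 0.5
direct      = entropy([p, (1 - p)/2, (1 - p)/2])   # source Q2 evaluated directly
closed_form = h_bin(p) + 1 - p                     # H_bin(p) + 1 - p
print(direct, closed_form)                         # 1.5 1.5  (bit)
</pre>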
  
  
'''(4)'''&nbsp; The maximum entropy of a memoryless source with symbol set size&nbsp; $M$&nbsp; is obtained when all&nbsp; $M$&nbsp; symbols are equally probable.
*For the special case&nbsp; $M=3$&nbsp; it follows:
 
:$$p_{\rm R} + p_{\rm G} + p_{\rm S} = 1 \hspace{0.3cm} \Rightarrow \hspace{0.3cm}
  \underline {p = 1/3 \approx 0.333}\hspace{0.05cm}.$$
*Thus, using the result of sub-task&nbsp; '''(3)''',&nbsp; we obtain the following entropy:
:$$H = H_{\rm bin}(1/3) + 1-1/3 = 1/3 \cdot {\rm log}_2\hspace{0.1cm}(3) + 2/3 \cdot {\rm log}_2\hspace{0.1cm}(3/2) + 2/3 $$
:$$\Rightarrow \hspace{0.3cm}H = 1/3 \cdot {\rm log}_2\hspace{0.1cm}(3) + 2/3 \cdot {\rm log}_2\hspace{0.1cm}(3) - 2/3 \cdot {\rm log}_2\hspace{0.1cm}(2)+ 2/3 = {\rm log}_2\hspace{0.1cm}(3) = {1.585 \, {\rm bit}}
  \hspace{0.05cm}.$$
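The location of the maximum can also be found numerically.&nbsp; This is our own sketch, under the assumption that a simple grid search over&nbsp; $p$&nbsp; with step size&nbsp; $0.001$&nbsp; is sufficient here:
<pre>
import math

def h_ternary(p):
    # Entropy of source Q2 in bit:  H = H_bin(p) + 1 - p
    if p in (0.0, 1.0):
        return 1.0 - p
    return p * math.log2(1/p) + (1 - p) * math.log2(1/(1 - p)) + 1 - p

# Grid search over p (illustrative check, not part of the original solution)
best_p = max((i / 1000 for i in range(1001)), key=h_ternary)
print(best_p, round(h_ternary(best_p), 3))   # approx. 0.333 and 1.585 bit
</pre>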
  
'''(5)'''&nbsp; In information-theoretic terms, the source model&nbsp; $\text{Roulette 1}$&nbsp; is equivalent to the configuration&nbsp; $\rm Q_2$&nbsp; with&nbsp; $p = 1/37$:
 
:$$p_{\rm G} = p =  \frac{1}{37}\hspace{0.05cm},\hspace{0.2cm} p_{\rm R} = p_{\rm S} = \frac{1-p}{2} = \frac{18}{37} \hspace{0.05cm}.$$
 
*Thus, using the result of subtask&nbsp; '''(3)''', we obtain:
:$$H = H_{\rm bin}(1/37) + \frac{36}{37} = \frac{1}{37} \cdot {\rm log}_2\hspace{0.1cm}(37) + \frac{36}{37} \cdot {\rm log}_2\hspace{0.1cm}(37) - \frac{36}{37} \cdot {\rm log}_2\hspace{0.1cm}(36) + \frac{36}{37} =
  {\rm log}_2\hspace{0.1cm}(37) + \frac{36}{37} \cdot ( 1- {\rm log}_2\hspace{0.1cm}(36)) = 5.209 - 4.057  \hspace{0.15cm} \underline { = 1.152 \, {\rm bit}}
  \hspace{0.05cm}.$$
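Numerically (our own sketch, not part of the original solution):
<pre>
import math

p_green = 1/37
p_red = p_black = 18/37
probs = [p_green, p_red, p_black]

H = sum(p * math.log2(1/p) for p in probs)   # entropy in bit
print(round(H, 3))                           # 1.152
</pre>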
  
'''(6)'''&nbsp; If we bet on single numbers in roulette &nbsp; &#8658; &nbsp; source model&nbsp; $\text{Roulette 2}$, all numbers from&nbsp; $0$&nbsp; to&nbsp; $36$&nbsp;  are equally probable and we get:
 
:$$H = {\rm log}_2\hspace{0.1cm}(37)  \hspace{0.15cm} \underline { = 5.209 \, {\rm bit}}
  \hspace{0.05cm}.$$
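The same value follows from a one-line numerical check (our own sketch):
<pre>
import math

M = 37                          # numbers 0 ... 36, all equally probable
print(round(math.log2(M), 3))   # 5.209 (bit)
</pre>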
  
  
[[Category:Information Theory: Exercises|^1.1 Memoryless Sources^]]
