Exercise 1.2: Entropy of Ternary Sources

From LNTwww
{{quiz-Header|Buchseite=Information_Theory/Discrete_Memoryless_Sources}}
 
  
[[File:Inf_A_1_2_vers2.png|right|frame|Probabilities of two ternary sources]]
The entropy of a discrete memoryless source with  $M$  possible symbols is:
 
:$$H =  \sum_{\mu = 1}^M p_{\mu} \cdot {\rm log}_2\hspace{0.1cm}\frac{1}{p_\mu}\hspace{0.05cm},\hspace{0.3cm}
  \text{pseudo-unit: bit}\hspace{0.05cm}.$$
Here, the  $p_\mu$  denote the occurrence probabilities of the individual symbols or events.  In the present example, the events are denoted by  $\rm R$(ed),  $\rm G$(reen)  and  $\rm S$(chwarz)  with  "Schwarz"  being the German word for  "Black".
  
*For a binary source with the occurrence probabilities  $p$  and  $1-p$  this can be written:
 
:$$H = H_{\rm bin}(p) = p \cdot {\rm log}_2\hspace{0.1cm}\frac{1}{p}+ (1-p) \cdot 
{\rm log}_2\hspace{0.1cm}\frac{1}{1-p}\hspace{0.05cm},\hspace{0.3cm}
  \text{pseudo-unit: bit}\hspace{0.05cm}.$$
*The entropy of a multilevel source can often be expressed with this  "binary entropy function"  $H_{\rm bin}(p)$.
 
   
 
   
  
In this task, two ternary sources with the symbol probabilities according to the above graph are considered:
  
# source  $\rm Q_1$ with  $p_{\rm G }= 1/2$,  $p_{\rm S }= 1/3$  and  $p_{\rm R }= 1/6$,
# source  $\rm Q_2$ with  $p_{\rm G }= p$  and   $p_{\rm S } = p_{\rm R } = (1-p)/2$. 
  
  
*The ternary source  $\rm Q_2$  can also be applied to  "Roulette"  when a player bets only on the squares  $\rm R$(ed),  $\rm S$(chwarz)  and  $\rm G$(reen)  (the "zero").  This type of game is referred to as  $\text{Roulette 1}$  in the question section.
  
*In contrast,  $\text{Roulette 2}$  indicates that the player bets on single numbers  $(0$, ... , $36)$.
  
  
  
  
''Hint:'' 
*The task belongs to the chapter  [[Information_Theory/Gedächtnislose_Nachrichtenquellen|Discrete Memoryless Sources]].
 
   
 
   
  
  
===Questions===
  
 
<quiz display=simple>
{What is the entropy&nbsp; $H$&nbsp; of the source&nbsp; $\rm \underline{Q_1}$?
|type="{}"}
$H \ = \ $ { 1.46 3% } $\ \rm bit$
  
  
{Which of the following statements are true if&nbsp; $\rm R$,&nbsp; $\rm G$&nbsp; and&nbsp; $\rm S$&nbsp; are represented by the numerical values&nbsp; $-1$, &nbsp;$0$ &nbsp;and&nbsp; $+1$?
|type="()"}
- The result is a smaller entropy.
+ The entropy remains the same.
- The result is a greater entropy.
  
  
{Determine the entropy of the source&nbsp; $\rm \underline{Q_2}$&nbsp; using the binary entropy function&nbsp; $H_{\rm bin}(p)$.&nbsp; What value results for&nbsp; $\underline{p = 0.5}$?
|type="{}"}
$H \ =  \ $ { 1.5 3% } $\ \rm bit$
  
  
{For which&nbsp; $p$&ndash;value of the source&nbsp; $\rm \underline{Q_2}$&nbsp; does the maximum entropy result:&nbsp; $H &#8594; H_\text{max}$?
|type="{}"}
$p \ =  \ $ { 0.333 3% } 
  
  
{What is the entropy of the source model&nbsp; $\text{Roulette 1}$,&nbsp; i.e. with respect to the events&nbsp; $\rm R$(ed),&nbsp; $\rm S$(chwarz)&nbsp; and&nbsp; $\rm G$(reen)&nbsp; (the "zero")?
|type="{}"}
$H \ = \ $ { 1.152 3% } $\ \rm bit$
  
  
{What is the entropy of&nbsp; $\text{Roulette 2}$,&nbsp; i.e. with regard to the numbers&nbsp; $0$, ... , $36$?
|type="{}"}
$H \ =  \ $ { 5.209 3% } $\ \rm bit$
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; With the occurrence probabilities&nbsp; $1/2$,&nbsp; $1/3$&nbsp; and&nbsp; $1/6$&nbsp; we get the following entropy value:
 
:$$H \hspace{0.1cm}  =  \hspace{0.1cm}  1/2 \cdot {\rm log}_2\hspace{0.1cm}(2) +1/3 \cdot {\rm log}_2\hspace{0.1cm}(3) +1/6 \cdot {\rm log}_2\hspace{0.1cm}(6) =(1/2 + 1/6)\cdot {\rm log}_2\hspace{0.1cm}(2) +  (1/3 + 1/6)\cdot {\rm log}_2\hspace{0.1cm}(3) \hspace{0.15cm}\underline {\approx 1.46 \, {\rm bit}} \hspace{0.05cm}.$$
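The result can be reproduced numerically; this is a quick sanity check, not part of the original solution:

```python
import math

# Entropy of Q1 with p_G = 1/2, p_S = 1/3, p_R = 1/6 (in bit)
H = sum(p * math.log2(1 / p) for p in (1/2, 1/3, 1/6))
print(round(H, 2))   # → 1.46
```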
  
'''(2)'''&nbsp; <u>Proposed solution 2</u> is correct:
*The entropy depends only on the probabilities of occurrence.
*It does not matter which numerical values or physical quantities one assigns to the individual symbols.
*It is different with mean values or the calculation of the auto-correlation function (ACF).&nbsp; If only symbols are given, no moments can be calculated for them.
*Moreover, the mean values, auto-correlation, etc. depend on whether one agrees on the bipolar assignment&nbsp; $(-1, \hspace{0.05cm}0, \hspace{0.05cm}+1)$&nbsp; or a unipolar one, for example&nbsp; $(0, \hspace{0.05cm}1, \hspace{0.05cm}2)$.


'''(3)'''&nbsp; The entropy of source&nbsp; $\rm Q_2$&nbsp; can be expressed as follows:
 
:$$H \hspace{0.1cm} =  \hspace{0.1cm} p \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{p}+ 2 \cdot \frac{1-p}{2}  \cdot {\rm log}_2\hspace{0.1cm}\frac {2}{1-p}= p \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{p}+ (1-p)  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{1-p} + (1-p)\cdot {\rm log}_2\hspace{0.1cm}(2)= H_{\rm bin}(p) + 1-p \hspace{0.05cm}.$$
*For&nbsp; $p = 0.5$ &nbsp;&nbsp;&#8658;&nbsp;&nbsp; $H_{\rm bin}(p) = 1$,&nbsp; we get&nbsp; $\underline{H = 1.5\hspace{0.05cm}\rm  bit}$.
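The identity&nbsp; $H = H_{\rm bin}(p) + 1-p$&nbsp; can be checked against the direct three-term sum; a sketch with our own helper names, not part of the original solution:

```python
import math

def h_bin(p):
    # binary entropy function in bit
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

def h_q2(p):
    # direct three-term entropy of Q2: p_G = p, p_S = p_R = (1-p)/2
    q = (1 - p) / 2
    return p * math.log2(1 / p) + 2 * q * math.log2(1 / q)

print(h_q2(0.5))          # direct computation → 1.5
print(h_bin(0.5) + 0.5)   # via the identity   → 1.5
```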
  
  
'''(4)'''&nbsp; The maximum entropy of a memoryless source with symbol set size&nbsp; $M$&nbsp; is obtained when all&nbsp; $M$&nbsp; symbols are equally probable.
*For the special case&nbsp; $M=3$&nbsp; it follows that:
 
:$$p_{\rm R} + p_{\rm G} + p_{\rm S} = 1 \hspace{0.3cm} \Rightarrow \hspace{0.3cm}
  \underline {p = 1/3 \approx 0.333}\hspace{0.05cm}.$$
*Thus, using the result of subtask&nbsp; '''(3)''', we obtain the following entropy:
 
:$$H = H_{\rm bin}(1/3) + 1-1/3 = 1/3 \cdot 
{\rm log}_2\hspace{0.1cm}(3) + 2/3 \cdot {\rm log}_2\hspace{0.1cm}(3/2) + 2/3 $$
:$$\Rightarrow \hspace{0.3cm}H = 1/3 \cdot {\rm log}_2\hspace{0.1cm}(3) + 2/3 \cdot {\rm log}_2\hspace{0.1cm}(3) - 2/3 \cdot {\rm log}_2\hspace{0.1cm}(2)+ 2/3 = {\rm log}_2\hspace{0.1cm}(3) = {1.585 \, {\rm bit}}
  \hspace{0.05cm}.$$
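The maximum can also be located by a brute-force grid search over&nbsp; $0 < p < 1$; an illustrative check, not part of the original solution:

```python
import math

def h_q2(p):
    # entropy of Q2 in bit, using H(p) = H_bin(p) + 1 - p from subtask (3)
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p)) + 1 - p

# coarse grid search for the maximizing p
best_p = max((i / 10000 for i in range(1, 10000)), key=h_q2)
print(round(best_p, 3))      # → 0.333
print(round(h_q2(1/3), 3))   # → 1.585, i.e. log2(3)
```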
  
'''(5)'''&nbsp; The source model&nbsp; $\text{Roulette 1}$&nbsp; is information-theoretically equivalent to the configuration&nbsp; $\rm Q_2$&nbsp; with&nbsp; $p = 1/37$:
 
:$$p_{\rm G} = p =  \frac{1}{37}\hspace{0.05cm},\hspace{0.2cm} p_{\rm R} = p_{\rm S} = \frac{1-p}{2} = \frac{18}{37} \hspace{0.05cm}.$$
*Thus, using the result of subtask&nbsp; '''(3)''', we obtain:
:$$H = H_{\rm bin}(1/37) + \frac{36}{37} = \frac{1}{37} \cdot {\rm log}_2\hspace{0.1cm}(37) + \frac{36}{37} \cdot {\rm log}_2\hspace{0.1cm}(37) - \frac{36}{37} \cdot {\rm log}_2\hspace{0.1cm}(36) + \frac{36}{37} =
  {\rm log}_2\hspace{0.1cm}(37) + \frac{36}{37} \cdot ( 1- {\rm log}_2\hspace{0.1cm}(36)) = 5.209 - 4.057  \hspace{0.15cm} \underline { = 1.152 \, {\rm bit}}
  \hspace{0.05cm}.$$
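A numerical check of the&nbsp; $\text{Roulette 1}$&nbsp; value; a sketch, not part of the original solution:

```python
import math

# Roulette 1: p_G = 1/37, p_R = p_S = 18/37
p = 1 / 37
h_bin_p = p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))
H = h_bin_p + 36 / 37   # identity from subtask (3): H = H_bin(p) + 1 - p
print(round(H, 3))      # → 1.152
```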
  
'''(6)'''&nbsp; If we bet on single numbers in roulette &nbsp;  &#8658; &nbsp; source model&nbsp; $\text{Roulette 2}$, all numbers from&nbsp; $0$&nbsp; to&nbsp; $36$&nbsp;  are equally probable and we get:
 
:$$H = {\rm log}_2\hspace{0.1cm}(37)  \hspace{0.15cm} \underline { = 5.209 \, {\rm bit}}
  \hspace{0.05cm}.$$
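For a uniform source the closed form&nbsp; $H = {\rm log}_2\hspace{0.05cm}(M)$&nbsp; agrees with the term-by-term entropy sum; a quick check, not part of the original solution:

```python
import math

M = 37                                                  # Roulette 2: numbers 0, ..., 36
H_uniform = math.log2(M)                                # closed form for equally probable symbols
H_sum = sum((1 / M) * math.log2(M) for _ in range(M))   # term-by-term sum
print(round(H_uniform, 3))   # → 5.209
```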
  
  
[[Category:Information Theory: Exercises|^1.1 Memoryless Sources^]]

Latest revision as of 12:29, 17 February 2022
