Exercise 1.5Z: Symmetrical Markov Source

From LNTwww
 
{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
}}

[[File:Inf_Z_1_5_vers2.png|right|frame|Binary symmetrical Markov diagram]]
[[Aufgaben:Exercise_1.5:_Binary_Markov_Source|Exercise 1.5]]&nbsp; dealt with a binary Markov source in which the transition probabilities from&nbsp; $\rm A$&nbsp; to&nbsp; $\rm B$&nbsp; and from&nbsp; $\rm B$&nbsp; to&nbsp; $\rm A$&nbsp; were different.

In this exercise, the following shall now apply:
:$$p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = q \hspace{0.8cm} ( 0 \le q \le 1)
  \hspace{0.05cm}.$$
All equations given in Exercise 1.5 also apply here:

* <b>Entropy:</b>
:$$H = p_{\rm AA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}} + p_{\rm AB}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} +  p_{\rm BA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}} + p_{\rm BB}  \cdot  {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}}
  \hspace{0.05cm}.$$
  
* <b>First entropy approximation:</b>
:$$H_{\rm 1}  =  p_{\rm A} \cdot {\rm log_2}\hspace{0.1cm} \frac{1}{p_{\rm A}} + p_{\rm B} \cdot {\rm log_2}\hspace{0.1cm} \frac{1}{p_{\rm B}}
  \hspace{0.05cm},$$
  
* <b><i>k</i>&ndash;th entropy approximation</b> $(k = 2, 3, \ \text{...})$:
:$$H_k =  {1}/{k} \cdot \big [ H_{\rm 1} + (k-1) \cdot H \big]
  \hspace{0.05cm},\hspace{0.5cm}H  =  \lim_{k \rightarrow \infty } H_k  \hspace{0.05cm}.$$
  
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp;  [[Information_Theory/Discrete_Sources_with_Memory#Binary_sources_with_Markov_properties|"Binary sources with Markov properties"]].
*For all entropies, add the pseudo-unit "bit/symbol".


===Questions===

<quiz display=simple>
{Calculate the symbol probabilities for the transition probability&nbsp; $q = 1/4$.
|type="{}"}
$p_{\rm A} \ = \ $ { 0.5 1% }
$p_{\rm B} \ = \ $ { 0.5 1% }
  
  
{Calculate the source entropy&nbsp; $H$&nbsp; for&nbsp; $q = 1/4$.
|type="{}"}
$H \ = \ $ { 0.811 3% } $\ \rm bit/symbol$
  
  
{What entropy approximations are obtained for&nbsp; $q = 1/4$?
|type="{}"}
$H_1 \ = \ $ { 1 1% } $\ \rm bit/symbol$
$H_2 \ = \ $ { 0.906 1% } $\ \rm bit/symbol$
$H_3 \ = \ $ { 0.874 1% } $\ \rm bit/symbol$
  
  
{Determine&nbsp; $q$&nbsp; such that&nbsp; $H$&nbsp; becomes maximal, and interpret the result.
|type="{}"}
$q \ = \ $ { 0.5 3% }
  
  
{Which symbol sequences are possible for&nbsp; $q = 0$?
|type="[]"}
+ $\rm AAAAAA$ ...
+ $\rm BBBBBB$ ...
- $\rm ABABAB$ ...
  
  
{Which symbol sequences are possible for&nbsp; $q = 1$?
|type="[]"}
- $\rm AAAAAA$ ...
- $\rm BBBBBB$ ...
+ $\rm ABABAB$ ...
  
  
</quiz>

===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; For a stationary first-order binary Markov source, the following holds:
 
:$$p_{\rm A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} \cdot p_{\rm A} + p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} \cdot p_{\rm B}
  = (1-q) \cdot p_{\rm A} + q \cdot p_{\rm B}$$
:$$\Rightarrow \hspace{0.3cm}q \cdot p_{\rm A} = q \cdot p_{\rm B} \hspace{0.3cm} \Rightarrow \hspace{0.3cm}p_{\rm A} = p_{\rm B}\hspace{0.15cm} \underline {= 0.5}
  \hspace{0.05cm}.$$
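The stationary solution can also be checked numerically by iterating the state distribution of the chain; the following sketch is illustrative (the sample value $q = 0.25$ and the variable names are my own choices, not part of the exercise):

```python
q = 0.25  # sample value of the transition probability (illustrative choice)

# State distribution (pA, pB); start deterministically in state A.
pA, pB = 1.0, 0.0

# One step of the symmetric Markov chain: the symbol changes with probability q.
# Iterating pA <- (1-q)*pA + q*pB converges to pA = pB = 0.5 for any 0 < q < 1.
for _ in range(1000):
    pA, pB = (1 - q) * pA + q * pB, q * pA + (1 - q) * pB

print(pA, pB)  # both values are (numerically) 0.5
```

The same result is obtained for any starting distribution, which is exactly the stationarity argument used above.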
  
'''(2)'''&nbsp; To calculate the entropy&nbsp; $H$,&nbsp; one needs all four joint probabilities:
:$$p_{\rm AA} \hspace{0.1cm} =  \hspace{0.1cm}  p_{\rm A} \cdot  p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} = 1/2 \cdot(1-q) = p_{\rm BB}\hspace{0.05cm},\hspace{1cm}
  p_{\rm AB} \hspace{0.1cm} =  \hspace{0.1cm}  p_{\rm A} \cdot  p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = 1/2 \cdot q = p_{\rm BA}\hspace{0.05cm}.$$
*Substituting these values into the given entropy equation, we get
:$$H  = 2 \cdot \frac{1}{2} \cdot(1-q) \cdot
{\rm log}_2\hspace{0.1cm} \frac{1}{1-q} + 2 \cdot \frac{1}{2} \cdot q \cdot
{\rm log}_2\hspace{0.1cm} \frac{1}{q} =  q \cdot {\rm log}_2\hspace{0.1cm} \frac{1}{q} + (1-q) \cdot {\rm log}_2\hspace{0.1cm} \frac{1}{1-q} = H_{\rm bin}(q) \hspace{0.05cm}.$$
*The numerical value sought is&nbsp; $H = H_{\rm bin} (0.25) \hspace{0.15cm}\underline{= 0.811 \, \rm bit/symbol}$.
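The numerical value is easy to verify in Python; `H_bin` below is simply my name for the binary entropy function used in the text:

```python
from math import log2

def H_bin(q: float) -> float:
    """Binary entropy function in bit/symbol."""
    if q in (0.0, 1.0):                 # limit value: q * log2(1/q) -> 0
        return 0.0
    return q * log2(1 / q) + (1 - q) * log2(1 / (1 - q))

print(round(H_bin(0.25), 3))  # -> 0.811
```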
  
'''(3)'''&nbsp; For equally probable binary symbols,&nbsp; $H_1 \hspace{0.15cm}\underline{= 1 \, \rm bit/symbol}$.
*Using the equation valid for Markov sources, it further holds:
:$$H_2 \hspace{0.1cm} =  \hspace{0.1cm}  {1}/{2} \cdot \big[ H_1 +  H \big] \hspace{0.15cm} \underline {= 0.906 \,{\rm bit/symbol}}
  \hspace{0.05cm},$$
:$$ H_3 \hspace{0.1cm} =  \hspace{0.1cm} {1}/{3} \cdot \big[ H_1 + 2  H \big] \hspace{0.15cm} \underline {= 0.874 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
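These approximations can be reproduced with a few lines of Python; this is a sketch for the sample value $q = 1/4$ (variable names are my own):

```python
from math import log2

q = 0.25
H = q * log2(1 / q) + (1 - q) * log2(1 / (1 - q))  # source entropy H = H_bin(q)
H1 = 1.0                                           # first approximation (p_A = p_B = 0.5)

# k-th entropy approximation H_k = (1/k) * [H_1 + (k-1) * H], in bit/symbol
for k in (2, 3):
    Hk = (H1 + (k - 1) * H) / k
    print(f"H_{k} = {Hk:.3f}")  # -> H_2 = 0.906, H_3 = 0.874
```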
  
'''(4)'''&nbsp; The maximum of the binary entropy function is obtained for&nbsp; $q\hspace{0.15cm}\underline{= 0.5}$.
*Thus the maximum entropy is&nbsp; $H = 1 \, \rm bit/symbol$.
*It can be seen from the relationship&nbsp; $H = H_1$&nbsp; and from the transition diagram shown above that&nbsp; $q = 0.5$&nbsp; results in statistically independent symbols:
:$$p_{\rm A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} = 0.5
  \hspace{0.05cm}, \hspace{0.2cm} p_{\rm B} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}= 0.5
  \hspace{0.05cm}.$$
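That $H = H_{\rm bin}(q)$ peaks at $q = 0.5$ can also be confirmed by a coarse numerical scan (the grid size is an arbitrary choice of this sketch):

```python
from math import log2

def H_bin(q):
    """Binary entropy function in bit/symbol."""
    return 0.0 if q in (0, 1) else q * log2(1 / q) + (1 - q) * log2(1 / (1 - q))

# Scan q over a fine grid and pick the argument of the maximum.
qs = [i / 1000 for i in range(1001)]
q_best = max(qs, key=H_bin)

print(q_best, H_bin(q_best))  # -> 0.5 1.0
```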
  
'''(5)'''&nbsp; <u>Proposed solutions 1 and 2</u> are correct:
*The symbol sequence is either &nbsp;$\rm AAAAAA$ ... &nbsp;or &nbsp;$\rm BBBBBB$ ..., depending on which symbol was given as the starting value.
*The entropy of such a source is&nbsp; $H = H_{\rm bin}(0) = 0$.


'''(6)'''&nbsp; Only <u>proposed solution 3</u> is correct:
*Now neither&nbsp; $\rm A$&nbsp; can directly follow&nbsp; $\rm A$,&nbsp; nor&nbsp; $\rm B$&nbsp; directly follow&nbsp; $\rm B$.
*The result is an alternating sequence: depending on the starting value, either &nbsp;$\rm ABABAB$ ... &nbsp;or&nbsp; $\rm BABABA$ ...
*In both cases, this source also has the entropy&nbsp; $H = H_{\rm bin}(1) = 0$.
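Both degenerate cases can be illustrated by generating sequences from the symmetric Markov model; the generator below is a sketch (function name, sequence length and seed handling are my own choices):

```python
import random

def markov_sequence(q, n=12, start="A", seed=0):
    """Generate n symbols of the symmetric binary Markov source:
    the current symbol changes with probability q, else it is repeated."""
    rng = random.Random(seed)
    sym, out = start, []
    for _ in range(n):
        out.append(sym)
        if rng.random() < q:                  # transition A <-> B
            sym = "B" if sym == "A" else "A"
    return "".join(out)

print(markov_sequence(q=0.0))  # -> AAAAAAAAAAAA (symbol never changes)
print(markov_sequence(q=1.0))  # -> ABABABABABAB (symbol changes every step)
```

For $q = 0$ and $q = 1$ the output is deterministic apart from the starting symbol, which is exactly why both sources have entropy zero.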
 
{{ML-Fuß}}


[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]

Latest revision as of 13:04, 10 August 2021