Exercise 1.5Z: Symmetrical Markov Source

{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
}}
  
[[File:Inf_Z_1_5_vers2.png|right|frame|Binary symmetrical Markov diagram]]
[[Aufgaben:Exercise_1.5:_Binary_Markov_Source|Exercise 1.5]]  dealt with a binary Markov source in which the transition probabilities from  $\rm A$  to  $\rm B$  and from  $\rm B$  to  $\rm A$  were different.

In this exercise, the following now applies:
 
:$$p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = q \hspace{0.8cm} ( 0 \le q \le 1)
  \hspace{0.05cm}.$$
  
All equations given in Exercise 1.5 also apply here:
  
* <b>Entropy:</b>
 
:$$H = p_{\rm AA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}} + p_{\rm AB}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} +  p_{\rm BA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}} + p_{\rm BB}  \cdot  {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}}
  \hspace{0.05cm}.$$
  
* <b>First entropy approximation</b>:
 
:$$H_{\rm 1}  =  p_{\rm A} \cdot {\rm log_2}\hspace{0.1cm} \frac{1}{p_{\rm A}} + p_{\rm B} \cdot {\rm log_2}\hspace{0.1cm} \frac{1}{p_{\rm B}} 
  \hspace{0.05cm}.$$
  
* <b><i>k</i>&ndash;th entropy approximation</b> $(k = 2, 3, \ \text{...})$:
:$$H_k =  {1}/{k} \cdot \big [ H_{\rm 1} + (k-1) \cdot H \big] 
  \hspace{0.05cm},\hspace{0.5cm}H  =  \lim_{k \rightarrow \infty } H_k  \hspace{0.05cm}.$$
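For example, for&nbsp; $k = 2$&nbsp; and&nbsp; $k = 3$&nbsp; this general relation reduces to
:$$H_2 =  {1}/{2} \cdot \big [ H_{\rm 1} +  H \big] \hspace{0.05cm},\hspace{0.5cm}H_3 =  {1}/{3} \cdot \big [ H_{\rm 1} + 2 \cdot H \big]  \hspace{0.05cm}.$$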
  
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp;  [[Information_Theory/Discrete_Sources_with_Memory#Binary_sources_with_Markov_properties|"Binary sources with Markov properties"]].
*For all entropies, add the pseudo-unit "bit/symbol".


===Questions===
  
 
<quiz display=simple>
{Calculate the symbol probabilities for the transition probability&nbsp; $q = 1/4$.
 
|type="{}"}
 
|type="{}"}
$p_{\rm A} \ = \ $ { 0.5 1% }
$p_{\rm B} \ = \ $ { 0.5 1% }
  
  
{Calculate the source entropy&nbsp; $H$&nbsp; for&nbsp; $q = 1/4$.
 
|type="{}"}
 
|type="{}"}
$H \ = \ $ { 0.811 3% } $\ \rm bit/symbol$
  
  
{What entropy approximations are obtained for&nbsp; $q = 1/4$?
 
|type="{}"}
 
|type="{}"}
$H_1 \ = \ $ { 1 1% } $\ \rm bit/symbol$
$H_2 \ = \ $ { 0.906 1% } $\ \rm bit/symbol$
$H_3 \ = \ $ { 0.874 1% } $\ \rm bit/symbol$
  
  
{Determine&nbsp; $q$&nbsp; such that&nbsp; $H$&nbsp; is maximized. Interpretation.
 
|type="{}"}
 
|type="{}"}
$q \ = \ $ { 0.5 3% }
  
  
{Which symbol sequences are possible with&nbsp; $q = 0$&nbsp;?
 
|type="[]"}
 
|type="[]"}
 
+ $\rm AAAAAA$ ...  
+ $\rm BBBBBB$ ...
- $\rm ABABAB$ ...
  
  
{Which symbol sequences are possible with&nbsp; $q = 1$&nbsp;?
 
|type="[]"}
 
|type="[]"}
 
- $\rm AAAAAA$ ...  
- $\rm BBBBBB$ ...
+ $\rm ABABAB$ ...
 
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; For a stationary first-order binary Markov source, the following holds:
 
:$$p_{\rm A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} \cdot p_{\rm A} + p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} \cdot p_{\rm B}
  = (1-q) \cdot p_{\rm A} + q \cdot p_{\rm B}$$
:$$\Rightarrow \hspace{0.3cm}q \cdot p_{\rm A} = q \cdot p_{\rm B} \hspace{0.3cm} \Rightarrow \hspace{0.3cm}p_{\rm A} = p_{\rm B}\hspace{0.15cm} \underline {= 0.5}
  \hspace{0.05cm}.$$
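*Note that the last step also uses the normalization condition&nbsp; $p_{\rm A} + p_{\rm B} = 1$&nbsp; $($and&nbsp; $q \ne 0)$:
:$$p_{\rm A} = p_{\rm B}\hspace{0.05cm}, \hspace{0.3cm} p_{\rm A} + p_{\rm B} = 1 \hspace{0.3cm} \Rightarrow \hspace{0.3cm} p_{\rm A} = p_{\rm B} = 0.5 \hspace{0.05cm}.$$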
  

'''(2)'''&nbsp; To calculate the entropy&nbsp; $H$,&nbsp; one needs all four joint probabilities:
 
:$$p_{\rm AA} \hspace{0.1cm} =  \hspace{0.1cm}  p_{\rm A} \cdot  p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} = 1/2 \cdot(1-q) = p_{\rm BB}\hspace{0.05cm},\hspace{1cm} 
  p_{\rm AB} \hspace{0.1cm} =  \hspace{0.1cm}  p_{\rm A} \cdot  p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = 1/2 \cdot q = p_{\rm BA}\hspace{0.05cm}.$$
*Substituting these values into the given entropy equation, we get
 
:$$H  = 2 \cdot \frac{1}{2} \cdot(1-q) \cdot 
{\rm log}_2\hspace{0.1cm} \frac{1}{1-q} + 2 \cdot \frac{1}{2} \cdot q \cdot 
{\rm log}_2\hspace{0.1cm} \frac{1}{q} =  q \cdot {\rm log}_2\hspace{0.1cm} \frac{1}{q} + (1-q) \cdot {\rm log}_2\hspace{0.1cm} \frac{1}{1-q} = H_{\rm bin}(q) \hspace{0.05cm}.$$
*The numerical value sought is&nbsp; $H = H_{\rm bin} (0.25) \hspace{0.15cm}\underline{= 0.811 \, \rm bit/symbol}$.
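*For checking, the numerical evaluation is:
:$$H_{\rm bin}(0.25) =  0.25 \cdot {\rm log}_2\hspace{0.1cm}4 + 0.75 \cdot {\rm log}_2\hspace{0.1cm}\frac{4}{3} =  0.5 + 0.75 \cdot 0.415 \approx 0.811\,{\rm bit/symbol} \hspace{0.05cm}.$$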
  
  
'''(3)'''&nbsp; For equally probable binary symbols,&nbsp; $H_1 \hspace{0.15cm}\underline{= 1 \, \rm bit/symbol}$.
*Using the equation valid for Markov sources, we further obtain:
:$$H_2 \hspace{0.1cm} =  \hspace{0.1cm}  {1}/{2} \cdot \big[ H_1 +  H \big] \hspace{0.15cm} \underline {= 0.906 \,{\rm bit/symbol}} 
  \hspace{0.05cm},$$
:$$ H_3 \hspace{0.1cm} =  \hspace{0.1cm} {1}/{3} \cdot \big[ H_1 + 2  H \big] \hspace{0.15cm} \underline {= 0.874 \,{\rm bit/symbol}} 
  \hspace{0.05cm}.$$
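*Numerically, with&nbsp; $H_1 = 1 \, \rm bit/symbol$&nbsp; and&nbsp; $H = 0.811 \, \rm bit/symbol$&nbsp; this gives:
:$$H_2 = {1}/{2} \cdot \big[ 1 + 0.811 \big] \approx 0.906\hspace{0.05cm},\hspace{0.5cm} H_3 = {1}/{3} \cdot \big[ 1 + 2 \cdot 0.811 \big] \approx 0.874 \hspace{0.05cm}.$$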
  
  
'''(4)'''&nbsp; The maximum of the binary entropy function is obtained for&nbsp; $q\hspace{0.15cm}\underline{= 0.5}$. 
*Thus the maximum entropy is&nbsp; $H =  1 \, \rm bit/symbol$. 
*It can be seen from the relationship&nbsp; $H = H_1$&nbsp; and from the transition diagram shown above that&nbsp; $q = 0.5$&nbsp; results in statistically independent symbols:
 
:$$p_{\rm A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} = 0.5
  \hspace{0.05cm}, \hspace{0.2cm} p_{\rm B} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}= 0.5
  \hspace{0.05cm}.$$
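*The maximum position can also be confirmed by setting the derivative of the binary entropy function to zero:
:$$\frac{{\rm d}H_{\rm bin}(q)}{{\rm d}q} = {\rm log}_2\hspace{0.1cm} \frac{1-q}{q} = 0\hspace{0.3cm} \Rightarrow \hspace{0.3cm} q = 0.5 \hspace{0.05cm}.$$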
  
  
'''(5)'''&nbsp; <u>Proposed solutions 1 and 2</u> are correct:
*The symbol sequence is either&nbsp; $\rm AAAAAA$ ...&nbsp; or&nbsp; $\rm BBBBBB$ ... ,&nbsp; depending on which symbol was specified as the starting value.
*The entropy of such a source is always&nbsp; $H = H_{\rm bin}(0) = 0$.
  
  
'''(6)'''&nbsp; Only <u>proposed solution 3</u> is correct:
*Now&nbsp; $\rm A$&nbsp; can never directly follow&nbsp; $\rm A$,&nbsp; nor&nbsp; $\rm B$&nbsp; directly follow&nbsp; $\rm B$.
*The result is always an alternating sequence: depending on the starting value, either&nbsp; $\rm ABABAB$ ...&nbsp; or&nbsp; $\rm BABABA$ ... .
*In both cases the entropy of this source is&nbsp; $H = H_{\rm bin}(1) = 0$.
 
{{ML-Fuß}}
  
  
  
[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]
