Exercise 1.6Z: Ternary Markov Source

{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
}}

[[File:Inf_Z_1_6_vers2.png|right|frame|Ternary Markov source]]
The diagram on the right shows a Markov source with&nbsp; $M = 3$&nbsp; states&nbsp; $\rm A$,&nbsp; $\rm B$&nbsp; and&nbsp; $\rm C$.

Let the two parameters of this Markov process be:

:$$0 \le p \le 0.5 \hspace{0.05cm},\hspace{0.2cm}0 \le q \le 1 \hspace{0.05cm}.$$
Due to the Markov property of this source, the entropy can be determined in different ways:

*One calculates the first two entropy approximations&nbsp; $H_1$&nbsp; and&nbsp; $H_2$.&nbsp; Then the following applies to the actual entropy:
:$$H= H_{k \to \infty}  = 2 \cdot H_{\rm 2} - H_{\rm 1}  \hspace{0.05cm}.$$
  
*However, according to the&nbsp; "direct calculation method", the entropy can also be written as follows&nbsp; (nine terms in total):
:$$H = p_{\rm AA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}} + p_{\rm AB}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} +  \ \text{...}
\hspace{0.05cm}, \hspace{0.4cm}
\text{where} \hspace{0.2cm} p_{\rm AA} = p_{\rm A} \cdot p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} \hspace{0.05cm},\hspace{0.2cm}
p_{\rm AB} = p_{\rm A} \cdot p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} \hspace{0.05cm}, \ \text{...}$$
  
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp; [[Information_Theory/Discrete_Sources_with_Memory#Non-binary_Markov_sources|Non-binary Markov sources]].
*For all entropies, add the pseudo-unit&nbsp; "bit/symbol".



===Questions===
  
 
<quiz display=simple>
{For which parameters &nbsp;$p$ &nbsp;and&nbsp; $q$&nbsp; does the maximum entropy per symbol result?
|type="{}"}
$p \ = \ $ { 0.333 1% }
$q \ = \ $ { 1 1% }
$H_\text{max} \ = \ $ { 1.585 1% } $\ \rm bit/symbol$
  
  
{Let &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 1$.&nbsp; What is the value of the first entropy approximation in this case?
|type="{}"}
$H_1 = \ $ { 1.585 3% } $\ \rm bit/symbol$
  
  
{Furthermore, let &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 1$.&nbsp; What value results in this case for the second entropy approximation?
|type="{}"}
$H_2 = \ $ { 1.5425 1% } $\ \rm bit/symbol$
  
  
{What is the actual source entropy&nbsp; $H= H_{k \to \infty}$&nbsp; with &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 1$?
|type="{}"}
$H = \ $ { 1.5 1% } $\ \rm bit/symbol$
  
  
{What is the actual source entropy&nbsp; $H= H_{k \to \infty}$&nbsp; with &nbsp;$p = 1/2$ &nbsp;and&nbsp; $q = 0$?
|type="{}"}
$H = \ $ { 0.667 1% } $\ \rm bit/symbol$
  
  
</quiz>

===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; The maximum entropy results when the symbols&nbsp; $\rm A$,&nbsp; $\rm B$&nbsp; and&nbsp; $\rm C$&nbsp; are equally probable and the symbols within the sequence are statistically independent of each other.&nbsp; Then the following must apply:
[[File:Inf_Z_1_6_vers2.png|right|frame|Transition diagram for&nbsp; $p = 1/4$,&nbsp; $q = 1$]]
* $p_{\rm A} = p_{\rm A|A} =  p_{\rm A|B} = p_{\rm A|C}  = 1/3$,
* $p_{\rm B} = p_{\rm B|A} =  p_{\rm B|B} = p_{\rm B|C}  = 1/3$,
* $p_{\rm C} = p_{\rm C|A} =  p_{\rm C|B} = p_{\rm C|C}  = 1/3$.
  
From this, the values we are looking for can be determined:
*For example, from&nbsp; $p_{\rm C|C}  = 1/3$&nbsp; one obtains the value &nbsp;$p \hspace{0.15cm}\underline{= 1/3}$.
*If one also takes into account the relationship &nbsp;$p_{\rm A|A} = p \cdot q$,&nbsp; then &nbsp;$q \hspace{0.15cm}\underline{= 1}$.
*This gives the maximum entropy &nbsp;$H_\text{max} ={\rm  log_2} \ 3\hspace{0.15cm}\underline{= 1.585\  \rm bit/symbol}$&nbsp; (see the numeric check below).
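A quick numeric cross-check of this value (an addition to the original solution, assuming three equally probable, statistically independent symbols):
<pre>
import math

# entropy of M = 3 equally probable, independent symbols
H_max = -3 * (1/3) * math.log2(1/3)    # = log2(3)
print(round(H_max, 3))                 # 1.585  (bit/symbol)
</pre>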

'''(2)'''&nbsp; With the parameter values&nbsp; $p = 1/4$&nbsp; and&nbsp; $q = 1$, we obtain the adjacent transition diagram, which has the following symmetries:
* $p_{\rm A|A} =  p_{\rm B|B} = p_{\rm C|C}  = 1/4$&nbsp; (marked in red),
* $p_{\rm A|B} =  p_{\rm B|C} = p_{\rm C|A}  = 1/2$&nbsp; (marked in green),
* $p_{\rm A|C} =  p_{\rm B|A} = p_{\rm C|B}  = 1/4$&nbsp; (marked in blue).


It is obvious that the symbol probabilities are all equal:
:$$p_{\rm A} = p_{\rm B} = p_{\rm C} = 1/3 \hspace{0.3cm}
\Rightarrow \hspace{0.3cm} H_1 =   {\rm log_2}\hspace{0.1cm} 3  \hspace{0.15cm} \underline {= 1.585 \,{\rm bit/symbol}}
\hspace{0.05cm}.$$
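This can be verified numerically. A minimal sketch, with the transition probabilities taken from the three bullets above (state order $\rm A$, $\rm B$, $\rm C$; entry <code>P[i][j]</code> is the probability of going from state <code>i</code> to state <code>j</code>):
<pre>
import numpy as np

# transition matrix for p = 1/4, q = 1
P = np.array([[1/4, 1/4, 1/2],    # from A: p_{A|A}, p_{B|A}, p_{C|A}
              [1/2, 1/4, 1/4],    # from B: p_{A|B}, p_{B|B}, p_{C|B}
              [1/4, 1/2, 1/4]])   # from C: p_{A|C}, p_{B|C}, p_{C|C}

# The columns also sum to 1 (doubly stochastic matrix),
# so the stationary distribution is uniform.
pi = np.array([1/3, 1/3, 1/3])
print(np.allclose(pi @ P, pi))    # True  -> p_A = p_B = p_C = 1/3
print(-np.sum(pi * np.log2(pi)))  # 1.58496... = log2(3) -> H1
</pre>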

'''(3)'''&nbsp; For the second entropy approximation one needs&nbsp; $3^2 = 9$&nbsp; joint probabilities.
*Using the result from&nbsp; '''(2)''', one obtains for these:
:$$p_{\rm AA} = p_{\rm BB}= p_{\rm CC}= p_{\rm AC}=p_{\rm BA}=p_{\rm CB}=1/12  \hspace{0.05cm},\hspace{0.5cm}
p_{\rm AB} = p_{\rm BC}=p_{\rm CA}=1/6$$
:$$\Rightarrow \hspace{0.2cm} H_2  = \frac{1}{2} \cdot \left [ 6 \cdot \frac{1}{12} \cdot {\rm log_2}\hspace{0.1cm} 12  +
3 \cdot \frac{1}{6} \cdot {\rm log_2}\hspace{0.1cm} 6 \right ] = \frac{1}{4} \cdot {\rm log_2}\hspace{0.1cm} 4  + \frac{1}{4} \cdot {\rm log_2}\hspace{0.1cm} 3 +  \frac{1}{4} \cdot {\rm log_2}\hspace{0.1cm} 2 +  \frac{1}{4} \cdot {\rm log_2}\hspace{0.1cm} 3
= \frac{3}{4} + \frac{{\rm log_2}\hspace{0.1cm} 3}{2} \hspace{0.15cm} \underline {= 1.5425 \,{\rm bit/symbol}}
\hspace{0.05cm}.$$
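The same computation as a short Python check (a sketch; <code>P</code> and the uniform symbol probabilities are carried over from part '''(2)'''):
<pre>
import numpy as np

P  = np.array([[1/4, 1/4, 1/2],
               [1/2, 1/4, 1/4],
               [1/4, 1/2, 1/4]])
pi = np.array([1/3, 1/3, 1/3])

joint = pi[:, None] * P                        # the nine joint probabilities
H2 = -0.5 * np.sum(joint * np.log2(joint))     # no zero entries here
print(round(H2, 4))                            # 1.5425  (bit/symbol)
</pre>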
 
  

'''(4)'''&nbsp; Due to the Markov property of the source, the following holds true:
:$$H = H_{k \to \infty}= 2 \cdot H_2 - H_1 = \big [ {3}/{2} + {\rm log_2}\hspace{0.1cm} 3 \big ] -  {\rm log_2}\hspace{0.1cm} 3\hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}}
\hspace{0.05cm}.$$
*One would arrive at the same result with the following calculation:
:$$H= p_{\rm AA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}} + p_{\rm AB}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} +  \text{...} \hspace{0.1cm}= 6 \cdot \frac{1}{12} \cdot {\rm log_2}\hspace{0.1cm} 4 + 3 \cdot \frac{1}{6} \cdot {\rm log_2}\hspace{0.1cm} 2
\hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}}
\hspace{0.05cm}.$$
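Both routes to $H = 1.5 \ \rm bit/symbol$ can be reproduced numerically (a sketch, same <code>P</code> and symbol probabilities as above):
<pre>
import numpy as np

P  = np.array([[1/4, 1/4, 1/2],
               [1/2, 1/4, 1/4],
               [1/4, 1/2, 1/4]])
pi = np.array([1/3, 1/3, 1/3])
joint = pi[:, None] * P

H1 = -np.sum(pi * np.log2(pi))
H2 = -0.5 * np.sum(joint * np.log2(joint))
H_direct = -np.sum(joint * np.log2(P))   # nine terms p_XY * log2(1/p_{Y|X})
print(round(2*H2 - H1, 4))               # 1.5
print(round(H_direct, 4))                # 1.5
</pre>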

[[File:Inf_Z_1_6e_vers2.png|right|frame|Transition diagram for&nbsp; $p = 1/2$,&nbsp; $q = 0$]]
'''(5)'''&nbsp; From the adjacent transition diagram with the current parameters, one can see that in the case of stationarity&nbsp; $p_{\rm B} = 0$&nbsp; will hold, since&nbsp; $\rm B$&nbsp; can occur at most once, namely at the starting time.
*So there is a binary Markov chain with the symbols&nbsp; $\rm A$&nbsp; and&nbsp; $\rm C$.
*The symbol probabilities are therefore given by:
:$$p_{\rm A} = 0.5 \cdot p_{\rm C} \hspace{0.05cm}, \hspace{0.2cm}p_{\rm A} + p_{\rm C} = 1 \hspace{0.3cm}
\Rightarrow \hspace{0.3cm} p_{\rm A} = 1/3 \hspace{0.05cm}, \hspace{0.2cm} p_{\rm C} = 2/3\hspace{0.05cm}.$$
*This gives the following probabilities:
:$$p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} = 0\hspace{0.7cm} \Rightarrow \hspace{0.3cm} p_{\rm AA} = 0 \hspace{0.05cm},$$
:$$ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}C} =1/2\hspace{0.3cm} \Rightarrow \hspace{0.3cm} p_{\rm CA} =
 p_{\rm C} \cdot p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}C} = 2/3 \cdot 1/2 = 1/3 \hspace{0.05cm},\hspace{0.2cm}{\rm log_2}\hspace{0.1cm}(1/p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}C} )= 1\hspace{0.05cm},$$
:$$ p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}A} =1\hspace{0.7cm} \Rightarrow \hspace{0.3cm} p_{\rm AC} =
 p_{\rm A} \cdot p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}A} = 1/3 \cdot 1 = 1/3 \hspace{0.05cm},\hspace{0.61cm}{\rm log_2}\hspace{0.1cm}(1/p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}A} )= 0\hspace{0.05cm},$$
:$$ p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}C} =1/2\hspace{0.3cm} \Rightarrow \hspace{0.3cm} p_{\rm CC} =
 p_{\rm C} \cdot p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}C} = 2/3 \cdot 1/2 = 1/3\hspace{0.05cm},\hspace{0.2cm}{\rm log_2}\hspace{0.1cm}(1/p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}C} )= 1 $$
:$$\Rightarrow \hspace{0.25cm} H  = p_{\rm AA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}}  +p_{\rm CA}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}C}}+ p_{\rm AC} \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}A}} +
 p_{\rm CC}  \cdot {\rm log_2}\hspace{0.1cm}\frac {1}{ p_{\rm C\hspace{0.01cm}|\hspace{0.01cm}C}}=
0 + 1/3 \cdot 1 + 1/3 \cdot 0 + 1/3 \cdot 1
\hspace{0.15cm} \underline {= 0.667 \,{\rm bit/symbol}}
\hspace{0.05cm}.$$
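A short numeric confirmation (a sketch; the $2 \times 2$ transition matrix of the reduced binary chain follows from the probabilities listed above):
<pre>
import numpy as np

# reduced chain on {A, C} for p = 1/2, q = 0  (state B dies out)
P  = np.array([[0.0, 1.0],      # from A: p_{A|A} = 0,   p_{C|A} = 1
               [0.5, 0.5]])     # from C: p_{A|C} = 1/2, p_{C|C} = 1/2
pi = np.array([1/3, 2/3])       # stationary: p_A = 1/3, p_C = 2/3
print(np.allclose(pi @ P, pi))  # True

joint = pi[:, None] * P
mask  = joint > 0                              # skip the 0 * log(0) term
H = -np.sum(joint[mask] * np.log2(P[mask]))
print(round(H, 3))              # 0.667  (bit/symbol)
</pre>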
 
{{ML-Fuß}}


[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]