Exercise 1.6: Non-Binary Markov Sources

{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
}}
  
[[File:EN_Inf_A_1_6.png|right|frame|Markov sources with <br>$M = 3$&nbsp; and&nbsp; $M = 4$]]
The graph shows two ergodic Markov sources&nbsp; $($German:&nbsp; "Markovquellen" &nbsp; &rArr; &nbsp; "$\rm MQ$"$)$:
  
* The source&nbsp; $\rm MQ3$&nbsp; is characterized by&nbsp; $M = 3$&nbsp; states&nbsp; (symbols)&nbsp; $\rm N$,&nbsp; $\rm M$,&nbsp; $\rm P$.&nbsp; Due to stationarity, the probabilities have the following values:
:$$p_{\rm N} = 1/2\hspace{0.05cm},\hspace{0.2cm}p_{\rm M} = p_{\rm P} = 1/4\hspace{0.05cm}.$$
  
* For the source&nbsp; $\rm MQ4$&nbsp; the state&nbsp; $\rm O$&nbsp; is additionally possible &nbsp; &#8658; &nbsp; $M = 4$.&nbsp; Due to the symmetric transitions, the stationary probabilities are all equal:
:$$p_{\rm N} = p_{\rm M} = p_{\rm O} = p_{\rm P} = 1/4\hspace{0.05cm}.$$
  
From an information-theoretic point of view, Markov sources are of particular importance, since for these &ndash; and only for these &ndash; the two quantities
* $H_1$&nbsp; (first entropy approximation, based only on the symbol probabilities), and
* $H_2$&nbsp; (second entropy approximation, computable from the joint probabilities of all two-tuples)
  
  
simultaneously determine:
  
* the further entropy approximations&nbsp; $H_k$&nbsp; with&nbsp; $k = 3, \ 4$,&nbsp; ..., and
* the actual (final) source entropy&nbsp; $H$.
  
  
The following determining equations apply:
:$$H= H_{k \to \infty} = 2 \cdot H_2 - H_1\hspace{0.05cm},$$
:$$H_k =  {1}/{k} \cdot \big [ H_{\rm 1} + (k-1) \cdot H \big ]
  \hspace{0.05cm}.$$
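''Note:''&nbsp; The two determining equations translate directly into a short numerical check. The following is a minimal Python sketch (not part of the original exercise; the function names are illustrative):
<pre>
def source_entropy(h1, h2):
    # Markov property: the actual source entropy H follows from H1 and H2 alone.
    return 2 * h2 - h1

def entropy_approximation(k, h1, h):
    # k-th entropy approximation: H_k = (H1 + (k-1)*H) / k
    return (h1 + (k - 1) * h) / k
</pre>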
  
  
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp; [[Information_Theory/Discrete_Sources_with_Memory#Non-binary_Markov_sources|Non-binary Markov Sources]].
*For all entropies, add the pseudo-unit&nbsp; "bit/symbol".
 
 
  
  
  
 
<quiz display=simple>
{Calculate the entropy approximation&nbsp; $H_1$&nbsp; of the Markov source&nbsp; $\rm MQ3$.
|type="{}"}
$H_1 \ = \ $ { 1.5 3% } $\ \rm bit/symbol$
  
  
{Calculate the entropy approximation&nbsp; $H_2$&nbsp; of the Markov source&nbsp; $\rm MQ3$.
|type="{}"}
$H_2 \ = \ $ { 1.375 3% } $\ \rm bit/symbol$
  
  
{What are the actual source entropy&nbsp; $H= H_{k \to \infty}$&nbsp; and the approximations&nbsp; $H_3$&nbsp; and&nbsp; $H_4$&nbsp; for&nbsp; $\rm MQ3$?
|type="{}"}
$H \ = \ $ { 1.25 3% } $\ \rm bit/symbol$
$H_3 \ = \ $ { 1.333 3% } $\ \rm bit/symbol$
$H_4 \ = \ $ { 1.3125 3% } $\ \rm bit/symbol$
  
 
{Calculate the entropy approximation&nbsp; $H_1$&nbsp; of the Markov source&nbsp; $\rm MQ4$.
|type="{}"}
$H_1 \ = \ $ { 2 3% } $\ \rm bit/symbol$
  
  
{Calculate the entropy approximation&nbsp; $H_2$&nbsp; of the Markov source&nbsp; $\rm MQ4$.
|type="{}"}
$H_2 \ = \ $ { 1.5 3% } $\ \rm bit/symbol$
  
  
{What are the actual source entropy&nbsp; $H= H_{k \to \infty}$&nbsp; and the approximations&nbsp; $H_3$&nbsp; and&nbsp; $H_4$&nbsp; for&nbsp; $\rm MQ4$?
|type="{}"}
$H \ = \ $ { 1 3% } $\ \rm bit/symbol$
$H_3 \ = \ $ { 1.333 3% } $\ \rm bit/symbol$
$H_4 \ = \ $ { 1.25 3% } $\ \rm bit/symbol$
 
 
  
 
</quiz>
 
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; The symbol probabilities of the ternary Markov source are given.
*From this, the entropy approximation&nbsp; $H_1$&nbsp; can be calculated:
:$$H_{\rm 1} = 1/2 \cdot {\rm log}_2\hspace{0.1cm} (2) + 2 \cdot 1/4 \cdot {\rm log}_2\hspace{0.1cm}(4 ) \hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
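''Note:''&nbsp; This value is quickly reproduced numerically; a minimal Python sketch (variable names are ours):
<pre>
from math import log2

probs = [0.5, 0.25, 0.25]                 # p_N, p_M, p_P of MQ3
h1 = sum(p * log2(1 / p) for p in probs)  # H1 = sum of p * log2(1/p)
print(h1)                                 # 1.5  (bit/symbol)
</pre>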
  
  
  
'''(2)'''&nbsp; The joint probability is&nbsp; $p_{\rm XY} = p_{\rm X} \cdot p_{\rm Y|X}$,&nbsp; where&nbsp; $p_{\rm X}$&nbsp; denotes the symbol probability of&nbsp; $\rm X$&nbsp; and&nbsp; $p_{\rm Y|X}$&nbsp; the conditional probability of&nbsp; $\rm Y$, given that&nbsp; $\rm X$&nbsp; occurred before.
*$\rm X$&nbsp; and&nbsp; $\rm Y$&nbsp; are here placeholders for the symbols&nbsp; $\rm N$,&nbsp; $\rm P$&nbsp; and&nbsp; $\rm M$.&nbsp; Then the following holds:
:$$p_{\rm NN} = 1/2 \cdot 1/2 = 1/4\hspace{0.05cm},\hspace{0.2cm}p_{\rm PP} =  1/4 \cdot 0 = 0\hspace{0.05cm},\hspace{0.2cm}p_{\rm MM} =  1/4 \cdot 0 = 0 \hspace{0.05cm},$$
:$$ p_{\rm NP} =  1/2 \cdot 1/4 = 1/8\hspace{0.05cm},\hspace{0.2cm} p_{\rm PM} = 1/4 \cdot 1/2 = 1/8\hspace{0.05cm},\hspace{0.2cm}p_{\rm MN} =  1/4 \cdot 1/2 = 1/8 \hspace{0.05cm},$$
:$$ p_{\rm NM} =  1/2 \cdot 1/4 = 1/8\hspace{0.05cm},\hspace{0.2cm} p_{\rm MP} = 1/4 \cdot 1/2 = 1/8\hspace{0.05cm},\hspace{0.2cm}p_{\rm PN} =  1/4 \cdot 1/2 = 1/8$$
:$$\Rightarrow \hspace{0.3cm} H_{\rm 2} = {1}/{2} \cdot \big [ 1/4 \cdot {\rm log}_2\hspace{0.1cm}( 4) + 6 \cdot 1/8 \cdot {\rm log}_2\hspace{0.1cm} (8) \big ] \hspace{0.15cm} \underline {= 1.375 \,{\rm bit/symbol}} \hspace{0.05cm}.$$
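''Note:''&nbsp; As a cross-check,&nbsp; $H_2$&nbsp; can also be computed programmatically. The transition probabilities used below follow from the joint probabilities above via&nbsp; $p_{\rm Y|X} = p_{\rm XY}/p_{\rm X}$,&nbsp; e.g.&nbsp; $p_{\rm N|N} = (1/4)/(1/2) = 1/2$. A minimal Python sketch:
<pre>
from math import log2

# MQ3: stationary probabilities and transition probabilities p(Y|X);
# two-tuples with probability 0 (such as PP, MM) are simply omitted.
p = {'N': 0.5, 'M': 0.25, 'P': 0.25}
trans = {'N': {'N': 0.5, 'M': 0.25, 'P': 0.25},
         'M': {'N': 0.5, 'P': 0.5},
         'P': {'N': 0.5, 'M': 0.5}}

# H2 = 1/2 * sum over all two-tuples XY of p_XY * log2(1/p_XY)
h2 = sum(0.5 * (px * pyx) * log2(1 / (px * pyx))
         for x, px in p.items()
         for pyx in trans[x].values())
print(h2)   # 1.375  (bit/symbol)
</pre>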
  
  
'''(3)'''&nbsp; $\rm MQ3$&nbsp; has Markov properties.
*Therefore, from&nbsp; $H_1$&nbsp; and&nbsp; $H_2$&nbsp; all further approximations&nbsp; $H_3$,&nbsp; $H_4$,&nbsp; ... and also the limit&nbsp; $H =H_\infty$&nbsp; for&nbsp; $k \to \infty$&nbsp; can be specified:
:$$H = 2 \cdot H_2 - H_1 = 2\cdot 1.375 - 1.5 \hspace{0.15cm} \underline {= 1.250 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$ H_3 = (H_1 + 2 \cdot H)/3 = (1.5 + 2 \cdot 1.25)/3 \hspace{0.15cm} \underline {= 1.333 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$ H_4 = (H_1 + 3 \cdot H)/4 = (1.5 + 3 \cdot 1.25)/4 \hspace{0.15cm} \underline {= 1.3125 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
*The tenth entropy approximation still differs from the final value&nbsp; $H = 1.25 \, \rm bit/symbol$,&nbsp; albeit only slightly&nbsp; $($by $2\%)$:
:$$H_{10} = (H_1 + 9 \cdot H)/10 = (1.5 + 9 \cdot 1.25)/10 {= 1.275 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
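''Note:''&nbsp; These numbers follow directly from the determining equations, e.g. in Python:
<pre>
h1, h2 = 1.5, 1.375
h = 2 * h2 - h1                       # actual source entropy: 1.25
for k in (3, 4, 10):
    print(k, (h1 + (k - 1) * h) / k)  # 1.3333..., 1.3125, 1.275
</pre>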
  
  
  
'''(4)'''&nbsp; According to the specification, the&nbsp; $M = 4$&nbsp; symbols of&nbsp; $\rm MQ4$&nbsp; are equally probable.
*From this it follows:
:$$H_{\rm 1} = H_{\rm 0} = {\rm log}_2\hspace{0.1cm} (4) \hspace{0.15cm} \underline {= 2 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
  
  
'''(5)'''&nbsp; Of the&nbsp; $M^2 = 16$&nbsp; possible two-tuples, eight combinations are now not possible:
:$$\rm NP, NO, PP, PO, OM, ON, MM, MN.$$
*The other eight combinations (two-tuples) each have the joint probability&nbsp; $1/8$, as shown by two examples:
:$$p_{\rm NN} = p_{\rm N} \cdot p_{\rm N\hspace{0.01cm}|\hspace{0.01cm}N} = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm},\hspace{0.5cm}
p_{\rm MP} = p_{\rm M} \cdot p_{\rm P\hspace{0.01cm}|\hspace{0.01cm}M} = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm}.$$
:$$\Rightarrow \hspace{0.3cm} H_2 = {1}/{2} \cdot \big [ 8 \cdot 1/8 \cdot {\rm log}_2\hspace{0.1cm} (8) \big ] \hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
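''Note:''&nbsp; The same programmatic cross-check as for&nbsp; $\rm MQ3$&nbsp; works here as well; each state has exactly two allowed successors (the complement of the impossible two-tuples listed above). A minimal Python sketch:
<pre>
from math import log2

# MQ4: all four symbols equally probable, eight allowed two-tuples.
p = {'N': 0.25, 'M': 0.25, 'O': 0.25, 'P': 0.25}
trans = {'N': {'N': 0.5, 'M': 0.5},
         'P': {'N': 0.5, 'M': 0.5},
         'O': {'O': 0.5, 'P': 0.5},
         'M': {'O': 0.5, 'P': 0.5}}

# Each allowed two-tuple has joint probability 1/4 * 1/2 = 1/8.
h2 = sum(0.5 * (px * pyx) * log2(1 / (px * pyx))
         for x, px in p.items()
         for pyx in trans[x].values())
print(h2)   # 1.5  (bit/symbol)
</pre>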
 
  
  
'''(6)'''&nbsp; Because of the Markov property, the following holds here:
:$$H = 2 \cdot H_2 - H_1 = 2\cdot 1.5 - 2 \hspace{0.15cm} \underline {= 1 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$ H_3 = (H_1 + 2 \cdot H)/3 = (2 + 2 \cdot 1)/3 \hspace{0.15cm} \underline {= 1.333 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$ H_4 = (H_1 + 3 \cdot H)/4 = (2 + 3 \cdot 1)/4 \hspace{0.15cm} \underline {= 1.250 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
*Here, too, the tenth approximation still differs significantly from the final value, namely by&nbsp; $10\%$:
:$$H_{10} = (H_1 + 9 \cdot H)/10 = (2 + 9 \cdot 1)/10 {= 1.1 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
*A deviation of only&nbsp; $2\%$&nbsp; is reached here only for&nbsp; $k = 50$.&nbsp; For comparison: &nbsp; for the Markov source&nbsp; $\rm MQ3$&nbsp; this accuracy was already achieved with&nbsp; $k = 10$.
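''Note:''&nbsp; Since&nbsp; $H_k - H = (H_1 - H)/k$,&nbsp; the relative deviation is&nbsp; $(H_1 - H)/(k \cdot H)$&nbsp; and decreases monotonically with&nbsp; $k$. The statements about&nbsp; $k = 50$&nbsp; and&nbsp; $k = 10$&nbsp; can thus be verified with a small search (the function name is ours):
<pre>
def first_k_within(h1, h, tol=0.02):
    # Smallest k for which H_k deviates from H by at most tol (relative).
    k = 1
    while ((h1 + (k - 1) * h) / k - h) / h > tol:
        k += 1
    return k

print(first_k_within(2.0, 1.0))    # MQ4: 50
print(first_k_within(1.5, 1.25))   # MQ3: 10
</pre>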
 
 
{{ML-Fuß}}
  
