
Exercise 1.6: Non-Binary Markov Sources

  
{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
}}
  
[[File:EN_Inf_A_1_6.png|right|frame|Markov sources with <br>$M = 3$&nbsp; and&nbsp; $M = 4$]]
The graph shows two ergodic Markov sources &nbsp; (German:&nbsp; "Markovquellen" &nbsp; &rArr; &nbsp; "$\rm MQ$"):
  
* The source&nbsp; $\rm MQ3$&nbsp; is characterized by&nbsp; $M = 3$&nbsp; states&nbsp; (symbols)&nbsp; $\rm N$,&nbsp; $\rm M$,&nbsp; $\rm P$.&nbsp; Due to stationarity, the probabilities have the following values:
:$$p_{\rm N} = 1/2\hspace{0.05cm},\hspace{0.2cm}p_{\rm M} = p_{\rm P} = 1/4\hspace{0.05cm}.$$
  
* For the source&nbsp; $\rm MQ4$&nbsp; the state&nbsp; $\rm O$&nbsp; is additionally possible &nbsp; &#8658; &nbsp; $M = 4$.&nbsp; Due to the symmetric transitions, the stationary probabilities are all equal:
:$$p_{\rm N} = p_{\rm M} = p_{\rm O} = p_{\rm P} = 1/4\hspace{0.05cm}.$$
  
In terms of information theory, Markov sources are of particular importance, since for these &ndash; and only for these &ndash; the two quantities
* $H_1$&nbsp; (first entropy approximation, based only on the symbol probabilities), and
* $H_2$&nbsp; (second entropy approximation, computable from the joint probabilities of all two-tuples)


simultaneously determine
* the further entropy approximations&nbsp; $H_k$&nbsp; with&nbsp; $k = 3, \ 4,$ ...,&nbsp; and
* the actual (final) source entropy&nbsp; $H$.
  
The following determining equations apply:

:$$H= H_{k \to \infty} = 2 \cdot H_2 - H_1\hspace{0.05cm},$$
:$$H_k =  {1}/{k} \cdot \big [ H_{\rm 1} + (k-1) \cdot H \big ]
  \hspace{0.05cm}.$$
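For illustration only&nbsp; (not part of the original exercise):&nbsp; a minimal Python sketch that simply evaluates these two determining equations for given values of&nbsp; $H_1$&nbsp; and&nbsp; $H_2$;&nbsp; all variable names are chosen freely.
<pre>
# Minimal sketch: evaluate the determining equations of a Markov source,
# assuming the first two entropy approximations H1 and H2 are already known.

def source_entropy(H1, H2):
    """Final entropy H = H_(k -> infinity) = 2*H2 - H1 (valid for Markov sources only)."""
    return 2 * H2 - H1

def entropy_approximation(k, H1, H):
    """k-th entropy approximation H_k = (1/k) * (H1 + (k-1)*H)."""
    return (H1 + (k - 1) * H) / k

H1, H2 = 1.5, 1.375                 # example values (those of MQ3)
H = source_entropy(H1, H2)          # -> 1.25 bit/symbol
print(H, entropy_approximation(3, H1, H), entropy_approximation(4, H1, H))
</pre>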
  
  
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp; [[Information_Theory/Discrete_Sources_with_Memory#Non-binary_Markov_sources|Non-binary Markov Sources]].
*For all entropies, add the pseudo-unit&nbsp; "bit/symbol".
  
  
<quiz display=simple>
{Calculate the entropy approximation&nbsp; $H_1$&nbsp; of the Markov source&nbsp; $\rm MQ3$.
|type="{}"}
$H_1 \ = \ $ { 1.5 3% } $\ \rm bit/symbol$
  
  
{Calculate the entropy approximation&nbsp; $H_2$&nbsp; of the Markov source&nbsp; $\rm MQ3$.
|type="{}"}
$H_2 \ = \ $ { 1.375 3% } $\ \rm bit/symbol$
  
  
{What are the actual source entropy&nbsp; $H= H_{k \to \infty}$&nbsp; and the approximations&nbsp; $H_3$&nbsp; and&nbsp; $H_4$&nbsp; for&nbsp; $\rm MQ3$?
|type="{}"}
$H \ = \ $ { 1.25 3% } $\ \rm bit/symbol$
$H_3 \ = \ $ { 1.333 3% } $\ \rm bit/symbol$
$H_4 \ = \ $ { 1.3125 3% } $\ \rm bit/symbol$
 
 
  
{Calculate the entropy approximation&nbsp; $H_1$&nbsp; of the Markov source&nbsp; $\rm MQ4$.
|type="{}"}
$H_1 \ = \ $ { 2 3% } $\ \rm bit/symbol$
  
  
{Calculate the entropy approximation&nbsp; $H_2$&nbsp; of the Markov source&nbsp; $\rm MQ4$.
|type="{}"}
$H_2 \ = \ $ { 1.5 3% } $\ \rm bit/symbol$
  
  
{What are the actual source entropy&nbsp; $H= H_{k \to \infty}$&nbsp; and the approximations&nbsp; $H_3$&nbsp; and&nbsp; $H_4$&nbsp; for&nbsp; $\rm MQ4$?
|type="{}"}
$H \ = \ $ { 1 3% } $\ \rm bit/symbol$
$H_3 \ = \ $ { 1.333 3% } $\ \rm bit/symbol$
$H_4 \ = \ $ { 1.25 3% } $\ \rm bit/symbol$
 
 
  
 
</quiz>
 
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; The symbol probabilities of the ternary Markov source are given.
*From these, the entropy approximation&nbsp; $H_1$&nbsp; can be calculated:
:$$H_{\rm 1} = 1/2 \cdot {\rm log}_2\hspace{0.1cm} (2) + 2 \cdot 1/4 \cdot {\rm log}_2\hspace{0.1cm}(4 ) \hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
  
'''(2)'''&nbsp; The joint probability is&nbsp; $p_{\rm XY} = p_{\rm X} \cdot p_{\rm Y|X}$,&nbsp; where&nbsp; $p_{\rm X}$&nbsp; denotes the symbol probability of&nbsp; $\rm X$&nbsp; and&nbsp; $p_{\rm Y|X}$&nbsp; the conditional probability of&nbsp; $\rm Y$,&nbsp; given that&nbsp; $\rm X$&nbsp; occurred before.
*$\rm X$&nbsp; and&nbsp; $\rm Y$&nbsp; are placeholders here for the symbols&nbsp; $\rm N$,&nbsp; $\rm P$&nbsp; and&nbsp; $\rm M$.&nbsp; The following then holds:
:$$p_{\rm NN} = 1/2 \cdot 1/2 = 1/4\hspace{0.05cm},\hspace{0.2cm}p_{\rm PP} =  1/4 \cdot 0 = 0\hspace{0.05cm},\hspace{0.2cm}p_{\rm MM} =  1/4 \cdot 0 = 0 \hspace{0.05cm},$$
:$$ p_{\rm NP} =  1/2 \cdot 1/4 = 1/8\hspace{0.05cm},\hspace{0.2cm} p_{\rm PM} = 1/4 \cdot 1/2 = 1/8\hspace{0.05cm},\hspace{0.2cm}p_{\rm MN} =  1/4 \cdot 1/2 = 1/8 \hspace{0.05cm},$$
:$$ p_{\rm NM} =  1/2 \cdot 1/4 = 1/8\hspace{0.05cm},\hspace{0.2cm} p_{\rm MP} = 1/4 \cdot 1/2 = 1/8\hspace{0.05cm},\hspace{0.2cm}p_{\rm PN} =  1/4 \cdot 1/2 = 1/8\hspace{0.05cm},$$
:$$\Rightarrow \hspace{0.3cm} H_{\rm 2}  = {1}/{2} \cdot \big [ 1/4 \cdot {\rm log}_2\hspace{0.1cm}( 4) + 6 \cdot 1/8 \cdot {\rm log}_2\hspace{0.1cm} (8) \big ] \hspace{0.15cm} \underline {= 1.375 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
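The result can also be checked numerically. The following minimal Python sketch&nbsp; (not part of the original solution)&nbsp; uses the stationary probabilities and the transition probabilities implied by the joint probabilities listed above:
<pre>
from math import log2

# Cross-check of H2 for MQ3 (sketch only).
# Stationary probabilities and transition probabilities p(y|x), as implied
# by the joint probabilities p_XY = p_X * p_(Y|X) given above.
p_stat  = {'N': 0.5, 'M': 0.25, 'P': 0.25}
p_trans = {'N': {'N': 0.5, 'M': 0.25, 'P': 0.25},
           'M': {'N': 0.5, 'P': 0.5},
           'P': {'N': 0.5, 'M': 0.5}}

H2 = 0.0
for x, px in p_stat.items():
    for y, pyx in p_trans[x].items():
        pxy = px * pyx                       # joint probability p_XY
        H2 += 0.5 * pxy * log2(1.0 / pxy)    # factor 1/2: entropy per symbol

print(H2)   # -> 1.375 bit/symbol
</pre>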
  
'''(3)'''&nbsp; $\rm MQ3$&nbsp; has Markov properties.
* Therefore, from&nbsp; $H_1$&nbsp; and&nbsp; $H_2$&nbsp; all further approximations&nbsp; $H_3$,&nbsp; $H_4$, ... and also the limit value&nbsp; $H =H_\infty$&nbsp; for&nbsp; $k \to \infty$&nbsp; can be given:
:$$H  =  2 \cdot H_2 - H_1 = 2\cdot 1.375 - 1.5 \hspace{0.15cm} \underline {= 1.250 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$H_3 =  (H_1 + 2 \cdot H)/3 = (1.5 + 2 \cdot 1.25)/3 \hspace{0.15cm} \underline {= 1.333 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$H_4 = (H_1 + 3 \cdot H)/4 = (1.5 + 3 \cdot 1.25)/4 \hspace{0.15cm} \underline {= 1.3125 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
*The tenth entropy approximation still differs from the final value&nbsp; $H = 1.25 \,\rm bit/symbol$,&nbsp; albeit only slightly&nbsp; (by&nbsp; $2\%$):
:$$H_{10} = (H_1 + 9 \cdot H)/10 = (1.5 + 9 \cdot 1.25)/10 \ {= 1.275 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
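As a further plausibility check&nbsp; (not required for the exercise):&nbsp; for a stationary first-order Markov source the final entropy also equals the conditional entropy of the next symbol, given the current state. A short sketch with the same probabilities as in subtask&nbsp; '''(2)''':
<pre>
from math import log2

# Direct check of H for MQ3 as conditional entropy H(next symbol | current state).
p_stat  = {'N': 0.5, 'M': 0.25, 'P': 0.25}
p_trans = {'N': {'N': 0.5, 'M': 0.25, 'P': 0.25},
           'M': {'N': 0.5, 'P': 0.5},
           'P': {'N': 0.5, 'M': 0.5}}

H = sum(px * sum(pyx * log2(1.0 / pyx) for pyx in p_trans[x].values())
        for x, px in p_stat.items())

print(H)                        # -> 1.25 bit/symbol, the same as 2*H2 - H1
print((1.5 + 9 * H) / 10)       # tenth approximation H_10 -> 1.275 bit/symbol
</pre>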
'''(4)'''&nbsp; According to the specification, for&nbsp; $\rm MQ4$&nbsp; the&nbsp; $M = 4$&nbsp; symbols are equally probable.
*From this it follows:
:$$H_{\rm 1} = H_{\rm 0} = {\rm log}_2\hspace{0.1cm} (4) \hspace{0.15cm} \underline {= 2 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
  
'''(5)'''&nbsp; Of the&nbsp; $M^2 = 16$&nbsp; possible two-tuples, eight combinations are not possible:
:$$\rm NP, \ NO, \ PP, \ PO, \ OM, \ ON, \ MM, \ MN.$$
*The eight other combinations (two-tuples) each yield the joint probability value&nbsp; $1/8$,&nbsp; as shown here for two examples:
:$$p_{\rm NN} = p_{\rm N} \cdot p_{\rm N\hspace{0.01cm}|\hspace{0.01cm}N} = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm},\hspace{0.5cm}
p_{\rm MP} = p_{\rm M} \cdot p_{\rm P\hspace{0.01cm}|\hspace{0.01cm}M} = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm}.$$
:$$\Rightarrow \hspace{0.3cm} H_2 = {1}/{2} \cdot \big [ 8 \cdot 1/8 \cdot {\rm log}_2\hspace{0.1cm} (8) \big ] \hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
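A corresponding numerical cross-check for&nbsp; $\rm MQ4$&nbsp; (sketch only;&nbsp; every state has exactly two equally probable successor states, as follows from the eight permitted two-tuples):
<pre>
from math import log2

# Cross-check of H2 for MQ4: all four states are equally probable (1/4) and
# each state has two successors with probability 1/2, so every permitted
# two-tuple has the joint probability 1/8.
successors = {'N': 'NM', 'M': 'OP', 'O': 'OP', 'P': 'NM'}

H2 = 0.0
for x, ys in successors.items():
    for y in ys:
        pxy = 0.25 * 0.5                     # p_X * p_(Y|X) = 1/8
        H2 += 0.5 * pxy * log2(1.0 / pxy)

print(H2)   # -> 1.5 bit/symbol
</pre>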
  
'''(6)'''&nbsp; Because of the Markov property, the following holds here:
:$$H = 2 \cdot H_2 - H_1 = 2\cdot 1.5 - 2 \hspace{0.15cm} \underline {= 1 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$ H_3 = (H_1 + 2 \cdot H)/3 = (2 + 2 \cdot 1)/3 \hspace{0.15cm} \underline {= 1.333 \,{\rm bit/symbol}}\hspace{0.05cm},$$
:$$ H_4 = (H_1 + 3 \cdot H)/4 = (2 + 3 \cdot 1)/4 \hspace{0.15cm} \underline {= 1.250 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
*Here, too, the tenth approximation still differs significantly from the final value, namely by&nbsp; $10\%$:
:$$H_{10} = (H_1 + 9 \cdot H)/10 = (2 + 9 \cdot 1)/10 \ {= 1.1 \,{\rm bit/symbol}}\hspace{0.05cm}.$$
*A deviation of only&nbsp; $2\%$&nbsp; is not reached here until&nbsp; $k = 50$.&nbsp; For comparison: &nbsp; for the Markov source&nbsp; $\rm MQ3$,&nbsp; this accuracy was already reached at&nbsp; $k = 10$.
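The values&nbsp; $k = 10$&nbsp; and&nbsp; $k = 50$&nbsp; can be verified with a small search&nbsp; (illustrative sketch):
<pre>
# Smallest k for which the entropy approximation H_k deviates from the final
# value H by at most 2 percent (relative); uses H_k - H = (H1 - H)/k.
def min_k(H1, H, tol=0.02):
    k = 1
    while (H1 - H) / (k * H) > tol + 1e-12:
        k += 1
    return k

print(min_k(H1=1.5, H=1.25))   # MQ3 -> k = 10
print(min_k(H1=2.0, H=1.0))    # MQ4 -> k = 50
</pre>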
 
 
{{ML-Fuß}}
 
  
  
  
[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]
