Exercise 1.5: Binary Markov Source

{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
 
}}
 
  
[[File:P_ID2250__Inf_A_1_5.png|right|frame|Binary Markov diagram]]
[[Aufgaben:Exercise_1.4:_Entropy_Approximations_for_the_AMI_Code|Exercise 1.4]]&nbsp; has shown that the calculation of the entropy of a source with memory can be very laborious.&nbsp; One must first calculate (very many) entropy approximations&nbsp; $H_k$&nbsp; for&nbsp; $k$&ndash;tuples; only then can the source entropy be determined via the limit&nbsp; $k \to \infty$:
 
:$$H  =  \lim_{k \rightarrow \infty } H_k  \hspace{0.05cm}.$$
Often,&nbsp; $H_k$&nbsp; tends only very slowly towards the limit&nbsp; $H$.
  
The computational effort is reduced drastically if the source has Markov properties.&nbsp; The graphic shows the transition diagram for a binary Markov source with the two states (symbols)&nbsp; $\rm A$&nbsp; and&nbsp; $\rm B$.
  
*The diagram is uniquely determined by the two conditional probabilities&nbsp; $p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} = p$ &nbsp;and&nbsp; $p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = q$.
*The other conditional probabilities&nbsp; $p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}$ &nbsp;and&nbsp; $p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}$&nbsp; as well as the&nbsp; (unconditional)&nbsp; symbol probabilities&nbsp; $p_{\rm A}$ &nbsp;and&nbsp; $p_{\rm B}$&nbsp; can be determined from these.


The entropy of the binary Markov chain&nbsp; (with the unit "bit/symbol")&nbsp; is then:
:$$H = H_{\rm M} =  p_{\rm AA}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}} + p_{\rm AB}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} +  p_{\rm BA}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}} + p_{\rm BB}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}}
  \hspace{0.05cm}.$$
It should be noted that in the argument of the&nbsp; "binary logarithm"&nbsp; the&nbsp; [[Theory_of_Stochastic_Signals/Statistical_Dependence_and_Independence#Conditional_Probability|conditional probabilities]]&nbsp; $p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}$,&nbsp; $p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}$, ... &nbsp;are to be used, while the&nbsp; [[Theory_of_Stochastic_Signals/Set_Theory_Basics#Intersection|joint probabilities]]&nbsp; $p_{\rm AA}$,&nbsp; $p_{\rm AB}$, ... &nbsp;are used for the weighting.
  
Using the first order entropy approximation,
 
:$$H_1 = p_{\rm A}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm A}} + p_{\rm B}  \cdot 
{\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm B}}
  \hspace{0.5cm}({\rm unit\hspace{-0.1cm}: \hspace{0.1cm}bit/symbol})\hspace{0.05cm},$$
as well as the (actual) entropy&nbsp; $H = H_{\rm M}$&nbsp; given above, all further entropy approximations&nbsp; $(k = 2, 3, \text{...})$&nbsp; can also be calculated directly for a Markov source:
:$$H_k =  \frac{1}{k} \cdot \big [ H_{\rm 1} + (k-1) \cdot H_{\rm M} \big ] 
  \hspace{0.05cm}.$$
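These formulas can be evaluated numerically.&nbsp; The following minimal Python sketch&nbsp; (function and variable names chosen freely)&nbsp; uses the stationary probabilities&nbsp; $p_{\rm A} = p/(p+q)$&nbsp; and&nbsp; $p_{\rm B} = q/(p+q)$&nbsp; quoted in the hints below:
<syntaxhighlight lang="python">
from math import log2

def markov_entropies(p, q, k_max=4):
    """Return H_M, H_1 and the approximations H_1 ... H_k_max (in bit/symbol)
    for a binary Markov source with p_A|B = p and p_B|A = q."""
    p_A, p_B = p / (p + q), q / (p + q)      # stationary (ergodic) probabilities
    # per symbol pair: joint probability p_XY = p_Y|X * p_X and conditional p_Y|X
    pairs = [((1 - q) * p_A, 1 - q),         # AA
             (q * p_A,       q),             # AB
             (p * p_B,       p),             # BA
             ((1 - p) * p_B, 1 - p)]         # BB
    H_M = sum(p_joint * log2(1 / p_cond) for p_joint, p_cond in pairs)
    H_1 = p_A * log2(1 / p_A) + p_B * log2(1 / p_B)
    H_k = [(H_1 + (k - 1) * H_M) / k for k in range(1, k_max + 1)]
    return H_M, H_1, H_k
</syntaxhighlight>
A call with the parameter values of the exercise, e.g.&nbsp; <code>markov_entropies(0.25, 0.5)</code>, reproduces (up to rounding) the values derived in the solution.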
  
''Hints:''
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is also made in particular to the two pages&nbsp;  [[Theory_of_Stochastic_Signals/Set_Theory_Basics#Intersection|"Intersection"]]&nbsp; and&nbsp; [[Theory_of_Stochastic_Signals/Statistical_Dependence_and_Independence#Conditional_Probability|"Conditional Probability"]].
*With the exception of subtask&nbsp; '''(6)''',&nbsp; $p = 1/4$ &nbsp;and&nbsp; $q = 1/2$ apply throughout.
*For the&nbsp; (ergodic)&nbsp; symbol probabilities of a first-order Markov chain, the following holds:
 
:$$ p_{\rm A}  = \frac {p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}}
{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} + p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}}  \hspace{0.05cm},  \hspace{0.3cm}
p_{\rm B}  = \frac {p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}}
{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} + p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} \hspace{0.05cm}.$$
  
  
===Questions===
  
 
<quiz display=simple>
{Give the transition probabilities for &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 1/2$.
|type="{}"}
$p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} \ = \ $ { 0.5 3% }
$p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B} \ =  \  $ { 0.75 3% }
  
  
{What are the&nbsp; (unconditional)&nbsp; symbol probabilities?&nbsp; Let &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 1/2$ still hold.
|type="{}"}
$p_{\rm A} \ = \ $ { 0.333 3% }
$p_{\rm B} \ =  \  $ { 0.667 3% }
  
  
{Give the corresponding first-order entropy approximation.
|type="{}"}
$H_1  \ = \ $ { 0.918 1% } $\ \rm bit/symbol$
  
  
{What is the entropy&nbsp; $H = H_{\rm M}$&nbsp; of this Markov source with &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 1/2$?
|type="{}"}
$H \ = \ $ { 0.875 1% } $\ \rm bit/symbol$
  
  
{Which entropy approximations&nbsp; $H_k$&nbsp; result from the Markov properties?
|type="{}"}
$H_2 \ = \ $ { 0.897 0.5% } $\ \rm bit/symbol$
$H_3 \  =  \ $ { 0.889 0.5% } $\ \rm bit/symbol$
$H_4 \  =  \ $ { 0.886 0.5% } $\ \rm bit/symbol$
  
  
{What is the entropy&nbsp; $H = H_{\rm M}$&nbsp; of the Markov source with &nbsp;$p = 1/4$ &nbsp;and&nbsp; $q = 3/4$?
|type="{}"}
$H \ = \ $ { 0.811 1% } $\ \rm bit/symbol$
  
  
 
</quiz>
  
===Solution===
{{ML-Kopf}}
[[File:Inf_A_1_5a_vers2.png|right|frame|Markov diagram for subtasks&nbsp; '''(1)''', ... ,&nbsp; '''(5)''']]
'''(1)'''&nbsp; After&nbsp; $\rm A$,&nbsp; the symbols&nbsp; $\rm A$&nbsp; and&nbsp; $\rm B$&nbsp; are equally probable.&nbsp; After&nbsp; $\rm B$,&nbsp; the symbol&nbsp; $\rm B$&nbsp; occurs much more frequently than&nbsp; $\rm A$.&nbsp; The following applies to the transition probabilities:
:$$p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} \hspace{0.1cm} =  \hspace{0.1cm} 1 - p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}= 1 - q \hspace{0.15cm} \underline {= 0.5} \hspace{0.05cm},$$
:$$ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B} \hspace{0.1cm} =  \hspace{0.1cm} 1 - p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}= 1 - p \hspace{0.15cm} \underline {= 0.75} \hspace{0.05cm}.$$


'''(2)'''&nbsp; According to the equations given in the hints:
:$$p_{\rm A}= \frac{p}{p+q} = \frac{0.25}{0.25 + 0.50} \hspace{0.15cm} \underline {= 0.333} \hspace{0.05cm}, \hspace{0.5cm}
p_{\rm B} = \frac{q}{p+q} = \frac{0.50}{0.25 + 0.50} \hspace{0.15cm} \underline {= 0.667} \hspace{0.05cm}.$$
 
  
'''(3)'''&nbsp; With the probabilities calculated in subtask&nbsp; '''(2)''':
:$$H_{\rm 1}  =  H_{\rm bin}(p_{\rm A})  =  1/3 \cdot {\rm log}_2\hspace{0.01cm} (3) + 2/3 \cdot {\rm log}_2\hspace{0.01cm} (1.5) =
  1.585 - 2/3\hspace{0.15cm} \underline {= 0.918 \,{\rm bit/symbol}} 
  \hspace{0.05cm}.$$
  
'''(4)'''&nbsp; According to the formula given above, the entropy of the Markov source is:
:$$H = p_{\rm AA}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A}} + p_{\rm AB}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A}} +  p_{\rm BA}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}} + p_{\rm BB}  \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{ p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}}
  \hspace{0.05cm}.$$
*The following holds for the joint probabilities:
:$$p_{\rm AA} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} \cdot p_{\rm A} = (1-q) \cdot \frac{p}{p+q} =
  \frac{1/2 \cdot 1/4}{3/4} =  {1}/{6} \hspace{0.05cm},$$
:$$ p_{\rm AB} =  p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} \cdot p_{\rm A} = q \cdot \frac{p}{p+q} =
  \frac{1/2 \cdot 1/4}{3/4} =  {1}/{6} \hspace{0.05cm},$$
:$$ p_{\rm BA} =  p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B} \cdot p_{\rm B} = p \cdot \frac{q}{p+q} =
  p_{\rm AB} =  {1}/{6} \hspace{0.05cm},$$
:$$ p_{\rm BB} =  p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B} \cdot p_{\rm B} = (1-p) \cdot \frac{q}{p+q} =
  \frac{3/4 \cdot 1/2}{3/4} =  {1}/{2} $$
:$$\Rightarrow\hspace{0.3cm} H  = 1/6 \cdot {\rm log}_2\hspace{0.01cm} (2) + 1/6 \cdot {\rm log}_2\hspace{0.01cm} (2) + 1/6 \cdot 
{\rm log}_2\hspace{0.01cm} (4) + 1/2 \cdot {\rm log}_2\hspace{0.1cm} (4/3) = 
  10/6 - 1/2 \cdot {\rm log}_2\hspace{0.01cm} (3) \hspace{0.15cm} \underline {= 0.875 \,{\rm bit/symbol}} 
  \hspace{0.05cm}.$$
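This result can be cross-checked numerically, for example with the following Python lines&nbsp; (variable names chosen freely):
<syntaxhighlight lang="python">
from math import log2

p, q = 0.25, 0.50                     # p_A|B and p_B|A of the exercise
p_A, p_B = p / (p + q), q / (p + q)   # stationary probabilities 1/3 and 2/3

p_AA, p_AB = (1 - q) * p_A, q * p_A   # joint probabilities, each 1/6
p_BA, p_BB = p * p_B, (1 - p) * p_B   # 1/6 and 1/2

H = (p_AA * log2(1 / (1 - q)) + p_AB * log2(1 / q)
     + p_BA * log2(1 / p) + p_BB * log2(1 / (1 - p)))
print(round(H, 3))                    # 0.874, i.e. the 0.875 bit/symbol above within rounding
</syntaxhighlight>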
  
'''(5)'''&nbsp; In general, the&nbsp; $k$&ndash;th entropy approximation of a Markov source, with&nbsp; $H_{\rm M} = H$,&nbsp; is
:$$H_k =  {1}/{k} \cdot \big [ H_{\rm 1} + (k-1) \cdot H_{\rm M} \big ] \hspace{0.05cm}.$$
*It follows that:
:$$H_2 =  {1}/{2} \cdot [ 0.918 + 1  \cdot 0.875] \hspace{0.15cm} \underline {= 0.897 \,{\rm bit/symbol}}
\hspace{0.05cm},$$
:$$ H_3 = {1}/{3} \cdot [ 0.918 + 2  \cdot 0.875] \hspace{0.15cm} \underline {= 0.889 \,{\rm bit/symbol}}
\hspace{0.05cm},$$
:$$ H_4 =  {1}/{4} \cdot [ 0.918 + 3  \cdot 0.875] \hspace{0.15cm} \underline {= 0.886 \,{\rm bit/symbol}}
  \hspace{0.05cm}.$$
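The same three values follow from a short Python loop, assuming the rounded results&nbsp; $H_1 = 0.918$&nbsp; and&nbsp; $H_{\rm M} = 0.875$&nbsp; of the previous subtasks:
<syntaxhighlight lang="python">
H_1, H_M = 0.918, 0.875            # results of subtasks (3) and (4)

for k in (2, 3, 4):
    H_k = (H_1 + (k - 1) * H_M) / k
    print(k, f"{H_k:.3f}")         # approx. 0.897, 0.889 and 0.886 bit/symbol
</syntaxhighlight>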
 
  
[[File:Inf_A_1_5f_vers2.png|right|frame|Markov diagram for subtask&nbsp; '''(6)''']]
'''(6)'''&nbsp; With the new set of parameters&nbsp; $(p = 1/4, q = 3/4)$,&nbsp; we obtain for the symbol probabilities:
:$$ p_{\rm A} = 1/4, \ p_{\rm B} = 3/4.$$
*This special case thus leads to statistically independent symbols:
:$$ p_{\rm A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}A} = p_{\rm A\hspace{0.01cm}|\hspace{0.01cm}B}
  \hspace{0.05cm}, \hspace{0.2cm} p_{\rm B} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}A} = p_{\rm B\hspace{0.01cm}|\hspace{0.01cm}B}
  \hspace{0.05cm}.$$
*Thus the entropy&nbsp; $H$&nbsp; is identical to the entropy approximation&nbsp; $H_1$:
:$$H = H_{\rm 1}  =  1/4 \cdot {\rm log}_2\hspace{0.01cm} (4) + 3/4 \cdot {\rm log}_2\hspace{0.01cm} (4/3) =
  2 - 0.75 \cdot {\rm log}_2\hspace{0.01cm} (3) \hspace{0.15cm} \underline {= 0.811 \,{\rm bit/symbol}} 
  \hspace{0.05cm}.$$
*The entropy approximations&nbsp; $H_2$,&nbsp; $H_3$,&nbsp; $H_4$,&nbsp; ... &nbsp;also yield the result&nbsp; $0.811 \, \rm bit/symbol$.
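Since&nbsp; $p + q = 1$&nbsp; holds here, a short numerical check&nbsp; (e.g. in Python, names chosen freely)&nbsp; confirms that the first-order approximation already equals the entropy:
<syntaxhighlight lang="python">
from math import log2

p, q = 0.25, 0.75                    # parameter set of subtask (6)
p_A = p / (p + q)                    # = 0.25 = p_A|B = p_A|A  (since p + q = 1)
p_B = q / (p + q)                    # = 0.75

H_1 = p_A * log2(1 / p_A) + p_B * log2(1 / p_B)
print(f"{H_1:.3f}")                  # 0.811 bit/symbol = H
</syntaxhighlight>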
 
{{ML-Fuß}}
  
  
  
[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]
