Exercise 1.4: Entropy Approximations for the AMI Code

{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
 
}}
 
[[File:P_ID2248__Inf_A_1_4.png|right|frame|Binary source signal (top) and <br>ternary encoder signal (bottom)]]
The graphic shows at the top the binary source signal&nbsp; $q(t)$, which can also be described by the symbol sequence&nbsp; $\langle q_\nu \rangle$&nbsp; with&nbsp; $q_\nu \in \{ {\rm L}, {\rm H} \}$.&nbsp; Throughout this task,&nbsp; $p_{\rm L} = p_{\rm H} = 0.5$&nbsp; applies.
  
The coded signal&nbsp; $c(t)$&nbsp; and the corresponding symbol sequence&nbsp; $\langle c_\nu \rangle$&nbsp; with&nbsp; $c_\nu \in \{{\rm P}, {\rm N}, {\rm M} \}$&nbsp; result from the AMI coding ("Alternate Mark Inversion") according to the following rule:
  
* The binary symbol&nbsp; $\rm L$ &nbsp;&rArr;&nbsp; "Low"&nbsp; is always represented by the ternary symbol&nbsp; $\rm N$ &nbsp;&rArr;&nbsp; "Zero"&nbsp; (German: "Null").
* The binary symbol&nbsp; $\rm H$ &nbsp;&rArr;&nbsp; "High"&nbsp; is also encoded deterministically, but alternately (hence the name "Alternate Mark Inversion") by the symbols&nbsp; $\rm P$ &nbsp;&rArr;&nbsp; "Plus"&nbsp; and&nbsp; $\rm M$ &nbsp;&rArr;&nbsp; "Minus", as illustrated by the sketch below.
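A minimal Python sketch of this mapping (for illustration only; the function name <code>ami_encode</code> and the choice that the first mark is&nbsp; $\rm P$&nbsp; are our assumptions, not part of the exercise):

<syntaxhighlight lang="python">
def ami_encode(source, first_mark='P'):
    """Map a binary sequence over {'L','H'} to the ternary AMI
    encoded sequence over {'P','N','M'}."""
    out, mark = [], first_mark
    for q in source:
        if q == 'L':
            out.append('N')                      # L is always mapped to N
        else:
            out.append(mark)                     # H gets the pending mark ...
            mark = 'M' if mark == 'P' else 'P'   # ... which then alternates
    return ''.join(out)

print(ami_encode('LHHLHLLH'))   # -> NPMNPNNM
</syntaxhighlight>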
  
In this task, the entropy approximations for the AMI-coded signal are to be calculated:
  
* Approximation&nbsp; $H_1$&nbsp; refers only to the symbol probabilities&nbsp; $p_{\rm P}$,&nbsp; $p_{\rm N}$&nbsp; and&nbsp; $p_{\rm M}$.
  
* The&nbsp; $k$&ndash;th entropy approximation&nbsp; $(k = 2, 3, \text{...} \ )$&nbsp; can be determined according to the following equation:
 
:$$H_k = \frac{1}{k} \cdot \sum_{i=1}^{3^k} p_i^{(k)} \cdot {\rm log}_2\hspace{0.1cm}\frac {1}{p_i^{(k)}} \hspace{0.5cm}({\rm unit\hspace{-0.1cm}: \hspace{0.1cm}bit/symbol})
  \hspace{0.05cm}.$$
:Here,&nbsp; $p_i^{(k)}$&nbsp; denotes the&nbsp; $i$&ndash;th composite probability of a&nbsp; $k$&ndash;tuple.
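The defining equation can also be evaluated numerically. The following minimal Python sketch (the helper name <code>h_k</code> and the example probabilities are chosen here for illustration) computes&nbsp; $H_k$&nbsp; from given&nbsp; $k$&ndash;tuple probabilities:

<syntaxhighlight lang="python">
from math import log2

def h_k(tuple_probs, k):
    """Entropy approximation H_k in bit/symbol, given the composite
    probabilities p_i^(k) of the k-tuples (zero entries may be omitted)."""
    return sum(p * log2(1.0 / p) for p in tuple_probs.values() if p > 0) / k

# Generic example: four equally likely two-tuples  ->  H_2 = 1 bit/symbol
print(h_k({'AB': 0.25, 'BA': 0.25, 'AA': 0.25, 'BB': 0.25}, k=2))
</syntaxhighlight>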
  
''Hints:''
*This task belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp; [[Information_Theory/Discrete_Sources_with_Memory#The_entropy_of_the_AMI_code|The Entropy of the AMI code]].
*In&nbsp; [[Aufgaben:Exercise_1.4Z:_Entropy_of_the_AMI_Code|Exercise 1.4Z]]&nbsp; the actual entropy of the encoded sequence&nbsp; $\langle c_\nu \rangle$&nbsp; is calculated as&nbsp; $H = 1 \; \rm bit/symbol$.
*The following size relations are to be expected: &nbsp; $H \le \text{...} \le H_3 \le H_2 \le H_1 \le H_0 \hspace{0.05cm}.$

===Questions===
  
 
<quiz display=simple>
 
{What is the decision content of the AMI code?
|type="{}"}
$H_0 \ = \ $ { 1.585 3% } $\ \rm bit/symbol$
  
  
{Calculate the first entropy approximation of the AMI code.
|type="{}"}
$H_1 \ = \ $ { 1.5 3% } $\ \rm bit/symbol$
  
  
{What is the entropy approximation&nbsp; $H_2$, based on two-tuples?
|type="{}"}
$H_2 \ = \ $ { 1.375 3% } $\ \rm bit/symbol$
  
  
{What is the value of the entropy approximation&nbsp; $H_3$, based on three-tuples?
|type="{}"}
$H_3 \ = \ $ { 1.292 3% } $\ \rm bit/symbol$
  
  
{Which statements apply to the entropy approximation&nbsp; $H_4$?
|type="[]"}
+ It must be averaged over&nbsp; $3^4 = 81$&nbsp; summands.
+ &nbsp; $1 \; {\rm bit/symbol} < H_4 < H_3$&nbsp; holds.
- After a long calculation you get&nbsp; $H_4 = 1.333 \; {\rm bit/symbol}$.
  
  
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; The symbol set size is&nbsp; $M = 3$.&nbsp; This gives the decision content, using the logarithm to base&nbsp; $2$ &nbsp; &rArr; &nbsp; $\log_2$:
:$$H_0  = {\rm log}_2\hspace{0.1cm} M = {\rm log}_2\hspace{0.1cm} (3)  \hspace{0.15cm} \underline { = 1.585 \,{\rm bit/symbol}}  
 
  \hspace{0.05cm}.$$
  
'''(2)'''&nbsp; The first-order entropy approximation takes into account only the symbol probabilities&nbsp; $p_{\rm P}$,&nbsp; $p_{\rm N}$&nbsp; and&nbsp; $p_{\rm M}$&nbsp; <br>and not the statistical bindings within the code sequence&nbsp; $\langle c_\nu \rangle$.&nbsp; Thus one obtains:
:$$p_{\rm N} = p_{\rm L} = 1/2\hspace{0.05cm},\hspace{0.2cm}p_{\rm P} = p_{\rm M} = p_{\rm H}/2 = 1/4 \hspace{0.3cm}
\Rightarrow\hspace{0.3cm} H_1  = \frac{1}{2} \cdot {\rm log}_2\hspace{0.1cm} (2) + 
  2 \cdot \frac{1}{4} \cdot {\rm log}_2\hspace{0.1cm}(4)  \hspace{0.15cm} \underline {= 1.5 \,{\rm bit/symbol}} 
  \hspace{0.05cm}.$$
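This value can also be reproduced directly, in the spirit of the sketch&nbsp; <code>h_k()</code>&nbsp; from the problem statement:

<syntaxhighlight lang="python">
from math import log2

p1 = {'N': 1/2, 'P': 1/4, 'M': 1/4}             # symbol probabilities from above
print(sum(p * log2(1/p) for p in p1.values()))  # -> 1.5 (bit/symbol)
</syntaxhighlight>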
  
'''(3)'''&nbsp; First, the&nbsp; $M^2 = 9$&nbsp; composite probabilities of two-tuples have to be determined, denoted in the following by the first two code symbols&nbsp; $c_1$&nbsp; and&nbsp; $c_2$:
* Since in the AMI code neither&nbsp; $\rm P$&nbsp; can follow&nbsp; $\rm P$&nbsp; nor&nbsp; $\rm M$&nbsp; can follow&nbsp; $\rm M$, we have&nbsp; $p_{\rm PP} = p_{\rm MM} = 0$.
* For the composite probabilities under the condition&nbsp; $c_2 = \rm N$, the following applies:
:$$p_{\rm NN} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{N}) \cdot {\rm Pr}(c_2 = \mathbf{N}\hspace{0.05cm} | c_1 = \mathbf{N}) = 1/2 \cdot 1/2 = 1/4 \hspace{0.05cm},$$
:$$ p_{\rm MN} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{M}) \cdot {\rm Pr}(c_2 = \mathbf{N}\hspace{0.05cm} | c_1 = \mathbf{M}) = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm},$$
:$$ p_{\rm PN} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{P}) \cdot {\rm Pr}(c_2 = \mathbf{N}\hspace{0.05cm} | c_1 = \mathbf{P}) = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm}.$$
* The composite probabilities of the two-tuples&nbsp; $\rm PM$&nbsp; and&nbsp; $\rm MP$&nbsp; are:
:$$p_{\rm PM} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{P}) \cdot {\rm Pr}(c_2 = \mathbf{M}\hspace{0.05cm} | c_1 = \mathbf{P}) = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm},$$
:$$ p_{\rm MP} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{M}) \cdot {\rm Pr}(c_2 = \mathbf{P}\hspace{0.05cm} | c_1 = \mathbf{M}) = 1/4 \cdot 1/2 = 1/8 \hspace{0.05cm}.$$
* For the remaining probabilities, it must also be taken into account whether the binary symbol&nbsp; $\rm H$&nbsp; was encoded with&nbsp; $\rm P$&nbsp; or with&nbsp; $\rm M$&nbsp; the last time &nbsp;&#8658;&nbsp; further factor&nbsp; $1/2$:
:$$p_{\rm NM} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{N}) \cdot {\rm Pr}(c_2 = \mathbf{M}\hspace{0.05cm} | c_1 = \mathbf{N}) = 1/2 \cdot 1/2 \cdot 1/2= 1/8 \hspace{0.05cm},$$
:$$ p_{\rm NP} \hspace{0.1cm} =  \hspace{0.1cm} {\rm Pr}( c_1 = \mathbf{N}) \cdot {\rm Pr}(c_2 = \mathbf{P}\hspace{0.05cm} | c_1 = \mathbf{N}) = 1/2 \cdot 1/2 \cdot 1/2 = 1/8 \hspace{0.05cm}.$$
  
This gives the entropy&nbsp; $H_2\hspace{0.01cm}'$&nbsp; of a two-tuple and from it the entropy&nbsp; $H_2$&nbsp; per code symbol:
:$$H_2\hspace{0.01cm}' = \frac{1}{4} \cdot {\rm log}_2\hspace{0.1cm} (4) + 
6 \cdot \frac{1}{8} \cdot {\rm log}_2\hspace{0.1cm}(8)  \hspace{0.15cm} {= 2.75 \,{\text{bit/two-tuple} }}\hspace{0.3cm}
\Rightarrow\hspace{0.3cm} H_2  = \frac{H_2\hspace{0.01cm}'}{2}   \hspace{0.15cm} \underline {= 1.375 \,{\rm bit/symbol}}
\hspace{0.05cm}.$$
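This result can be cross-checked numerically. The following Python sketch (our illustration; the two-state description and the helper name <code>ami_tuple_probs</code> are assumptions) models the AMI coder by the polarity of the next mark and enumerates all two-tuples:

<syntaxhighlight lang="python">
from itertools import product
from math import log2

def ami_tuple_probs(k):
    """Composite probabilities of all k-tuples of the AMI encoded sequence.
    State = polarity of the next mark ('P' or 'M'); for a stationary
    source each starting polarity has probability 1/2."""
    probs = {}
    for start in ('P', 'M'):
        for word in product('LH', repeat=k):         # all 2^k source words
            state, out = start, []
            for bit in word:
                if bit == 'L':
                    out.append('N')
                else:
                    out.append(state)
                    state = 'M' if state == 'P' else 'P'
            t = ''.join(out)
            probs[t] = probs.get(t, 0.0) + 0.5 * 0.5**k
    return probs

p2 = ami_tuple_probs(2)        # {'NN': 0.25, 'NP': 0.125, ..., 'MP': 0.125}
print(sum(p * log2(1/p) for p in p2.values()) / 2)   # -> 1.375 (bit/symbol)
</syntaxhighlight>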
  
'''(4)'''&nbsp; The calculation of&nbsp; $H_3$&nbsp; is similar to the last subtask for&nbsp; $H_2$, except that now&nbsp; $3^3 = 27$&nbsp; composite probabilities must be determined:
:$$p_{\rm NNN} = 1/8\hspace{0.4cm}\text{(only once)}
  \hspace{0.05cm},$$
:$$p_{\rm NMM} = p_{\rm NPP} = p_{\rm MNM}  = \text{...} = 0 \hspace{0.4cm}\text{(in total 12)}
  \hspace{0.05cm},$$
:$$p_{\rm NNM} = p_{\rm NNP} = p_{\rm PMP}  = \text{...} = 1/16 \hspace{0.4cm}\text{(in total 14)}$$
 
:$$\Rightarrow\hspace{0.3cm} H_3  = \frac{1}{3} \cdot \left [ 
  \frac{1}{8} \cdot {\rm log}_2\hspace{0.1cm} (8) + 
  14 \cdot \frac{1}{16} \cdot {\rm log}_2\hspace{0.1cm}(16)
  \right ]  \hspace{0.15cm} \underline {= 1.292 \,{\rm bit/symbol}} 
  \hspace{0.05cm}.$$
  
'''(5)'''&nbsp; Correct are the <u>proposed solutions 1 and 2</u>.  
*On the other hand, statement 3 is wrong, because&nbsp; $H_4$&nbsp; must in any case be smaller than&nbsp; $H_3 = 1.292 \; \rm bit/symbol$.
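Reusing&nbsp; <code>ami_tuple_probs()</code>&nbsp; from the sketch in subtask&nbsp; '''(3)''', the enumeration confirms&nbsp; $H_3$&nbsp; and the statements on&nbsp; $H_4$&nbsp; (the numeric values below are from this sketch, not from the original exercise):

<syntaxhighlight lang="python">
from math import log2

# requires ami_tuple_probs() from the sketch in subtask (3)
for k in (3, 4):
    p = ami_tuple_probs(k)               # dict holds only the nonzero tuples
    print(k, sum(q * log2(1/q) for q in p.values()) / k)
# k = 3:  H_3 ~ 1.2917 bit/symbol  (15 nonzero of 3^3 = 27 tuples)
# k = 4:  H_4 ~ 1.2344 bit/symbol  ->  1 < H_4 < H_3, and H_4 != 1.333
</syntaxhighlight>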
 
{{ML-Fuß}}
  
  
  
[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]
