Exercise 1.4Z: Entropy of the AMI Code

{{quiz-Header|Buchseite=Information_Theory/Discrete_Sources_with_Memory
}}
[[File:P_ID2249__Inf_A_1_4.png|right|frame|Binary source signal (top) and <br>ternary encoder signal (bottom)]]
We assume similar prerequisites as in&nbsp; [[Aufgaben:Exercise_1.4:_Entropy_Approximations_for_the_AMI_Code|Exercise 1.4]]:
  
A binary source provides the source symbol sequence&nbsp; $\langle q_\nu \rangle$&nbsp;  with&nbsp; $q_\nu \in \{ {\rm L}, {\rm H} \}$, where there are no statistical bindings between the individual sequence elements.
  
For the symbol probabilities, let:
* $p_{\rm L} =p_{\rm H} = 1/2$&nbsp; (subtasks 1 and 2),
* $p_{\rm L} = 1/4, \, p_{\rm H} = 3/4$&nbsp; (subtasks 3, 4 and 5),
* $p_{\rm L} = 3/4, \, p_{\rm H} = 1/4$&nbsp; (subtask 6).
  
The coded signal&nbsp; $c(t)$&nbsp; shown in the graphic and the corresponding encoded sequence&nbsp; $\langle c_\nu \rangle$&nbsp; with&nbsp; $c_\nu \in \{{\rm P}, {\rm N}, {\rm M}  \}$&nbsp; result from the AMI coding&nbsp; ("Alternate Mark Inversion")&nbsp; according to the following rule&nbsp; (illustrated by the code sketch after the list):
  
* The binary symbol&nbsp; $\rm L$ &nbsp; &rArr; &nbsp; "Low"&nbsp; is always represented by the ternary symbol&nbsp; $\rm N$ &nbsp; &rArr; &nbsp; "German: Null"&nbsp; &rArr; &nbsp;"Zero".
* The binary symbol&nbsp; $\rm H$ &nbsp; &rArr; &nbsp; "High"&nbsp; is also encoded deterministically but alternately&nbsp; (hence the name "Alternate Mark Inversion")&nbsp; by the symbols&nbsp; $\rm P$ &nbsp;&rArr;&nbsp; "Plus"&nbsp; and&nbsp; $\rm M$ &nbsp;&rArr;&nbsp; "Minus".
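The mapping rule can be captured in a few lines of code.&nbsp; The following Python sketch is not part of the original exercise; the function name&nbsp; <code>ami_encode</code>&nbsp; and the symbol letters are chosen here purely for illustration:
<pre>
# Illustrative sketch of the AMI mapping rule described above (assumed names, not from the exercise):
# every L becomes N, every H is encoded alternately as P and M.
def ami_encode(source):
    encoded = []
    use_plus = True                  # the first H is mapped to P, the next one to M, and so on
    for q in source:
        if q == 'L':
            encoded.append('N')      # L is always represented by the ternary symbol N
        else:                        # q == 'H'
            encoded.append('P' if use_plus else 'M')
            use_plus = not use_plus  # alternate between P and M
    return encoded

print(ami_encode(list('HHLHLLH')))   # ['P', 'M', 'N', 'P', 'N', 'N', 'M']
</pre>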
  
In this task, the decision content&nbsp; $H_0$&nbsp; and the resulting entropy&nbsp; $H_{\rm C}$&nbsp; of the encoded sequence&nbsp; $\langle c_\nu \rangle$&nbsp; are to be determined for the three parameter sets mentioned above.&nbsp; The relative redundancy of the code sequence results from this according to the equation
 
:$$r_{\rm C} = \frac{H_{\rm 0}-H_{\rm C}}{H_{\rm 0}}  \hspace{0.05cm}.$$
  
''Hints:''
*This task belongs to the chapter&nbsp; [[Information_Theory/Discrete_Sources_with_Memory|Discrete Sources with Memory]].
*Reference is made in particular to the page&nbsp;  [[Information_Theory/Discrete_Sources_with_Memory#The_entropy_of_the_AMI_code|The entropy of the AMI code]].
*In general, the following relations exist between the decision content&nbsp; $H_0$,&nbsp; the entropy&nbsp; $H$&nbsp; $($here equal to&nbsp; $H_{\rm C})$&nbsp; and the entropy approximations:
:$$H \le \ \text{...} \  \le H_3 \le H_2 \le H_1 \le H_0  \hspace{0.05cm}.$$
 
*In&nbsp; [[Aufgaben:Exercise_1.4:_Entropy_Approximations_for_the_AMI_Code|Exercise 1.4]]&nbsp;&nbsp;  the entropy approximations were calculated for equally probable symbols&nbsp; $\rm L$&nbsp; and&nbsp; $\rm H$&nbsp; as follows (each in "bit/symbol"):  
 
:$$H_1 = 1.500\hspace{0.05cm},\hspace{0.2cm} H_2 = 1.375\hspace{0.05cm},\hspace{0.2cm}H_3 = 1.292  \hspace{0.05cm}.$$
 
===Questions===
  
 
<quiz display=simple>
 
{Let the source symbols be equally probable&nbsp; $(p_{\rm L} =  p_{\rm H}= 1/2)$.&nbsp; What is the entropy&nbsp; $H_{\rm C}$&nbsp; of the encoded sequence&nbsp; $\langle c_\nu \rangle$?
 
|type="{}"}
 
$H_{\rm C} \ = \ $ { 1 3% } $\ \rm bit/ternary \ symbol$
  
  
{What is the relative redundancy of the encoded sequence?
 
|type="{}"}
 
$r_{\rm C} \ = \ $ { 36.9 3% } $\ \rm \%$
  
  
{For the binary source,&nbsp; $p_{\rm L} = 1/4$ &nbsp;and&nbsp; $p_{\rm H} = 3/4$.&nbsp; What is the entropy of the encoded sequence?
 
|type="{}"}
 
$H_{\rm C} \ = \ $ { 0.811 3% } $\ \rm bit/ternary \ symbol$
  
  
{What is the relative redundancy of the encoded sequence?
 
 
|type="{}"}
 
$r_{\rm C} \ = \ $ { 48.8 3% } $\ \rm \%$
  
  
{Calculate the approximation&nbsp; $H_{\rm 1}$&nbsp; of the code entropy for&nbsp; $p_{\rm L} = 1/4$ &nbsp;and&nbsp; $p_{\rm H} = 3/4$.
 
|type="{}"}
 
$H_{\rm 1} \ = \ $ { 1.56 3% } $\ \rm bit/ternary \ symbol$
  
  
{Calculate the approximation&nbsp; $H_{\rm 1}$&nbsp; of the code entropy for&nbsp; $p_{\rm L} = 3/4$ &nbsp;and&nbsp; $p_{\rm H} = 1/4$.
 
|type="{}"}
 
$H_{\rm 1} \ = \ $ { 1.06 3% } $\ \rm bit/ternary \ symbol$
  
  
 
</quiz>
 
===Solution===
 
{{ML-Kopf}}
 
'''(1)'''&nbsp; Since the AMI code neither adds new information nor causes information to disappear, the entropy&nbsp; $H_{\rm C}$&nbsp; of the encoded sequence&nbsp; $\langle c_\nu \rangle$&nbsp; is equal to the source entropy&nbsp; $H_{\rm Q}$.&nbsp;
*Therefore, for equally probable and statistically independent source symbols, the following holds:
:$$H_{\rm Q}    {= 1 \,{\rm bit/binary \ symbol}} \hspace{0.3cm} \Rightarrow\hspace{0.3cm} H_{\rm C}    \hspace{0.15cm} \underline {= 1 \,{\rm bit/ternary \ symbol}}  \hspace{0.05cm}.$$
 
'''(2)'''&nbsp; The decision content of a ternary source is&nbsp; $H_0 = \log_2 \; (3) = 1.585\; \rm bit/symbol$.&nbsp;
* This gives the following for the relative redundancy:
 
:$$r_{\rm C} =1 -{H_{\rm C}/H_{\rm 0}}=1-1/{\rm log}_2\hspace{0.05cm}(3)  \hspace{0.15cm} \underline {= 36.9  \,\%} \hspace{0.05cm}.$$
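This value can be checked numerically, e.g. with the following Python sketch&nbsp; (not part of the original solution):
<pre>
from math import log2

H0 = log2(3)          # decision content of a ternary source: 1.585 bit/symbol
HC = 1.0              # code entropy for equally probable source symbols (subtask 1)
rC = 1 - HC / H0      # relative redundancy of the encoded sequence
print(f"H0 = {H0:.3f} bit/symbol,  rC = {100 * rC:.1f} %")   # H0 = 1.585 ..., rC = 36.9 %
</pre>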
 
'''(3)'''&nbsp; &nbsp; $H_{\rm C} = H_{\rm Q}$ is still valid.&nbsp;  However, because of the unequal symbol probabilities,&nbsp; $H_{\rm Q}$&nbsp; is now smaller:
 
:$$H_{\rm Q}  =  \frac{1}{4} \cdot {\rm log}_2\hspace{0.05cm} (4) + \frac{3}{4} \cdot {\rm log}_2\hspace{0.1cm} (4/3)  {= 0.811 \,{\rm bit/binary \ symbol}} \hspace{0.3cm} \Rightarrow\hspace{0.3cm} H_{\rm C}  = H_{\rm Q}  \hspace{0.15cm} \underline {= 0.811 \,{\rm bit/ternary \ symbol}}  \hspace{0.05cm}.$$
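The same number follows from the binary entropy function of the source; a short numerical check&nbsp; (sketch, not part of the original solution):
<pre>
from math import log2

p_L, p_H = 1/4, 3/4
H_Q = p_L * log2(1 / p_L) + p_H * log2(1 / p_H)   # binary entropy of the source
print(f"H_C = H_Q = {H_Q:.3f} bit/symbol")        # 0.811 bit/symbol
</pre>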
 
'''(4)'''&nbsp; By analogy with subtask&nbsp; '''(2)''',&nbsp; $r_{\rm C} = 1 -  0.811/1.585 \hspace{0.15cm} \underline {= 48.8  \,\%} \hspace{0.05cm}$ now holds.
*One can generalize this result.&nbsp; Namely, it holds that:
:$$(1-0.488) = (1- 0.189) \cdot (1- 0.369)\hspace{0.3cm} \Rightarrow\hspace{0.3cm} (1-r_{\rm code\hspace{0.05cm}sequence})  = (1-r_{\rm source}) \cdot (1- r_{\rm AMI\hspace{0.05cm}code})  \hspace{0.05cm}.$$
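The multiplicative relation between the individual redundancies can also be verified numerically; the variable names in this sketch are illustrative only:
<pre>
from math import log2

H_Q        = 0.811                # source entropy for p_L = 1/4, p_H = 3/4 (subtask 3)
r_source   = 1 - H_Q / 1.0        # redundancy of the binary source (its decision content is 1 bit)
r_ami_code = 1 - 1 / log2(3)      # redundancy added by the AMI code alone (subtask 2)
r_total    = 1 - (1 - r_source) * (1 - r_ami_code)
print(f"r_source = {r_source:.3f},  r_AMI = {r_ami_code:.3f},  r_total = {r_total:.3f}")
# r_source = 0.189,  r_AMI = 0.369,  r_total = 0.488
</pre>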
 
'''(5)'''&nbsp; Since each&nbsp; $\rm L$&nbsp; is mapped to&nbsp; $\rm N$&nbsp; and&nbsp; $\rm H$&nbsp; is mapped alternately to&nbsp; $\rm M$&nbsp; and&nbsp; $\rm P$, it holds that
:$$p_{\rm N} = p_{\rm L} = 1/4\hspace{0.05cm},\hspace{0.2cm}p_{\rm P} = p_{\rm M} = p_{\rm H}/2 = 3/8\hspace{0.3cm} \Rightarrow\hspace{0.3cm} H_1  = {1}/{4} \cdot {\rm log}_2\hspace{0.1cm} (4) + 2 \cdot {3}/{8} \cdot {\rm log}_2\hspace{0.1cm}(8/3) \hspace{0.15cm} \underline {= 1.56 \,{\rm bit/ternary \ symbol}}  \hspace{0.05cm}.$$
 
'''(6)'''&nbsp; Now the probabilities of occurrence of the ternary symbols are&nbsp; $p_{\rm N} = 3/4$&nbsp; and&nbsp; $p_{\rm P} = p_{\rm M} =1/8$.&nbsp; Thus:
 
:$$H_1  = {3}/{4} \cdot {\rm log}_2\hspace{0.1cm} (4/3) +  2 \cdot {1}/{8} \cdot {\rm log}_2\hspace{0.1cm}(8)  \hspace{0.15cm} \underline {= 1.06 \,{\rm bit/ternary \ symbol}}  \hspace{0.05cm}.$$
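The first-order approximations of subtasks&nbsp; '''(5)'''&nbsp; and&nbsp; '''(6)'''&nbsp; can be computed directly from the ternary symbol probabilities; a minimal Python sketch&nbsp; (the function name is chosen here only for illustration):
<pre>
from math import log2

def H1(p_L, p_H):
    # ternary symbol probabilities after AMI coding: p_N = p_L, p_P = p_M = p_H / 2
    probs = [p_L, p_H / 2, p_H / 2]
    return sum(p * log2(1 / p) for p in probs)

print(f"{H1(1/4, 3/4):.2f} bit/ternary symbol")   # 1.56  (subtask 5)
print(f"{H1(3/4, 1/4):.2f} bit/ternary symbol")   # 1.06  (subtask 6)
</pre>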
 
''Interpretation:''
*For&nbsp; $p_{\rm L} = 1/4, \ p_{\rm H} = 3/4$&nbsp; the result is&nbsp; $H_1 = 1.56 \; \rm bit/symbol$.
*For&nbsp; $p_{\rm L} = 3/4, \ p_{\rm H} = 1/4$,&nbsp; on the other hand, a significantly smaller value results:&nbsp; $H_1 = 1.06 \; \rm bit/symbol$.
*For both parameter combinations, however, the same values apply:
:$$H_0  = 1.585 \,{\rm bit/symbol}\hspace{0.05cm},\hspace{0.2cm}H_{\rm C} =  \lim_{k \rightarrow \infty } H_k = 0.811 \,{\rm bit/symbol}  \hspace{0.05cm}.$$
 
It follows from this: <br>
 
*If one considers two message sources&nbsp; $\rm Q1$&nbsp; and&nbsp; $\rm Q2$&nbsp; with the same symbol set size&nbsp; $M$ &nbsp; &#8658; &nbsp; decision content&nbsp; $H_0 = \rm const.$, where the first-order entropy approximation&nbsp; $H_1$&nbsp; is clearly greater for source&nbsp; $\rm Q1$&nbsp; than for source&nbsp; $\rm Q2$, one can by no means conclude that the entropy of&nbsp; $\rm Q1$&nbsp; is actually greater than the entropy of&nbsp; $\rm Q2$.
*Rather, for both sources one must
:* calculate enough entropy approximations&nbsp; $H_1$,&nbsp; $H_2$,&nbsp; $H_3$,&nbsp; ... , and
:* determine from them&nbsp; (graphically or analytically)&nbsp; the limit of&nbsp; $H_k$&nbsp; for&nbsp; $k \to \infty$&nbsp; (see the simulation sketch below).
*Only then is a definitive statement about the relative magnitudes of the two entropies possible.
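The procedure described in the last two points can be illustrated by a simulation; the following sketch&nbsp; (not part of the original solution, names chosen for illustration)&nbsp; estimates the approximations&nbsp; $H_1, H_2, \text{...}$&nbsp; of the AMI-coded sequence for&nbsp; $p_{\rm L} = 1/4$&nbsp; from relative frequencies of&nbsp; $k$-tuples and shows how they decrease from&nbsp; $1.56$&nbsp; towards the code entropy&nbsp; $H_{\rm C} = 0.811$:
<pre>
import random
from collections import Counter
from math import log2

def ami_encode(source):
    # same mapping as in the exercise: L -> N, H -> alternately P / M
    out, use_plus = [], True
    for q in source:
        if q == 'L':
            out.append('N')
        else:
            out.append('P' if use_plus else 'M')
            use_plus = not use_plus
    return out

def Hk_estimate(seq, k):
    # k-th order entropy approximation: (1/k) * entropy of the k-tuples,
    # estimated here from relative frequencies in the simulated sequence
    tuples = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    counts = Counter(tuples)
    n = len(tuples)
    return sum(c / n * log2(n / c) for c in counts.values()) / k

random.seed(1)
source = random.choices(['L', 'H'], weights=[1, 3], k=200_000)   # p_L = 1/4, p_H = 3/4
code = ami_encode(source)
for k in range(1, 5):
    print(f"H_{k} ≈ {Hk_estimate(code, k):.3f} bit/ternary symbol")
# the estimates decrease from about 1.56 (k = 1) towards the code entropy H_C = 0.811
</pre>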
 
{{ML-Fuß}}
 
  
  
  
[[Category:Information Theory: Exercises|^1.2 Sources with Memory^]]
