Aufgaben:Exercise 1.13: Binary Erasure Channel Decoding

{{quiz-Header|Buchseite=Channel_Coding/Decoding_of_Linear_Block_Codes}}
  
[[File:EN_KC_A_1_13.png|right|frame|Decoding at the BEC]]
  
We assume here the model in section  [[Channel_Coding/Decoding_of_Linear_Block_Codes#Decoding_at_the_Binary_Erasure_Channel|"Decoding at the Binary Erasure Channel"]]  (BEC configuration highlighted in green):
  
*Each information word  $\underline{u}$  is encoded blockwise and yields the code word  $\underline{x}$.  Let the block code be linear and completely given by its parity-check matrix  $\boldsymbol{\rm H}$.
  
*During transmission  $n_{\rm E}$  bits of the code word are erased   ⇒  [[Channel_Coding/Channel_Models_and_Decision_Structures#Binary_Erasure_Channel_.E2.80.93_BEC|"Binary Erasure Channel"]]  $\rm (BEC)$.  The code word  $\underline{x}$  thus becomes the received word  $\underline{y}$.
  
*If the number&nbsp; $n_{\rm E}$&nbsp; of erasures is less than the&nbsp; [[Channel_Coding/Objective_of_Channel_Coding#Important_definitions_for_block_coding|"minimum distance"]]&nbsp; $d_{\rm min}$&nbsp; of the code,&nbsp; it is possible to reconstruct from&nbsp; $\underline{y}$&nbsp; the code word&nbsp; $\underline{z} = \underline{x}$&nbsp; without error,&nbsp; and thus the correct information word&nbsp; $\underline{v} = \underline{u}$&nbsp; is also obtained.
  
To describe the exercise,&nbsp; let us now consider as an example the Hamming code word&nbsp; $\underline{x} = (0, 1, 0, 1, 1, 0, 0)$&nbsp; and the received word&nbsp; $\underline{y} = (0, 1, {\rm E}, {\rm E}, 1, 0, 0).$
  
*The third and fourth bits were thus erased by the channel.&nbsp; The&nbsp; "code word finder"&nbsp; therefore has the task of determining the vector&nbsp; $\underline{z}_{\rm E} = (z_{3}, z_{4})$&nbsp; with&nbsp; $z_{3}, \ z_{4} \in \{0, 1\}$.&nbsp; This is done according to the equation
  
:$${ \boldsymbol{\rm H}}_{\rm E} \cdot \underline{z}_{\rm E}^{\rm T}= { \boldsymbol{\rm H}}_{\rm K} \cdot \underline{z}_{\rm K}^{\rm T}\hspace{0.05cm}.$$
*In the present example:
:$$\underline{z}_{\rm K} = (0, 1, 1, 0, 0)\hspace{0.05cm},\hspace{0.3cm}{ \boldsymbol{\rm H}}_{\rm K} = \begin{pmatrix} 1 &1 &1 &0 &0\\ 0 &1 &0 &1 &0\\ 1 &1 &0 &0 &1 \end{pmatrix} \hspace{0.05cm}, \hspace{0.3cm}{ \boldsymbol{\rm H}}_{\rm E} = \begin{pmatrix} 1 &0\\ 1 &1\\ 0 &1 \end{pmatrix} \hspace{0.05cm}.$$
*This equation gives two equations for the bits to be determined,&nbsp; whose solution leads to the result&nbsp; $z_{3} = 0$&nbsp; and&nbsp; $z_{4} = 1$.
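The described procedure can be sketched in a few lines of Python.&nbsp; The helper&nbsp; <code>bec_decode</code>&nbsp; is a hypothetical name introduced only for illustration;&nbsp; the sketch assumes the parity-check matrix&nbsp; $\boldsymbol{\rm H}$&nbsp; of the systematic&nbsp; $(7, 4, 3)$&nbsp; Hamming code given in the solution section and solves the above equation by Gauss–Jordan elimination over&nbsp; ${\rm GF}(2)$:

<syntaxhighlight lang="python">
import numpy as np

# Parity-check matrix of the systematic (7,4,3) Hamming code used in this exercise
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [0, 1, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def bec_decode(y, H):
    """Recover the bits erased by the BEC (marked 'E') by solving
    H_E * z_E^T = H_K * z_K^T over GF(2).
    Returns the reconstructed code word, or None if no unique solution exists."""
    erased = [i for i, b in enumerate(y) if b == 'E']
    known  = [i for i, b in enumerate(y) if b != 'E']
    H_E = H[:, erased]
    H_K = H[:, known]
    z_K = np.array([y[i] for i in known], dtype=int)
    rhs = H_K.dot(z_K) % 2                        # right-hand side  H_K * z_K^T
    A = np.column_stack([H_E, rhs])               # augmented matrix  [H_E | rhs]
    m, n = H_E.shape
    row = 0
    for col in range(n):                          # Gauss-Jordan elimination over GF(2)
        pivot = next((r for r in range(row, m) if A[r, col] == 1), None)
        if pivot is None:
            return None                           # free variable -> no unique solution
        A[[row, pivot]] = A[[pivot, row]]         # swap the pivot row into place
        for r in range(m):
            if r != row and A[r, col] == 1:
                A[r] = (A[r] + A[row]) % 2        # eliminate the column entry (row XOR)
        row += 1
    z_E = A[:n, -1]                               # values of the erased bits
    z = np.array([0 if b == 'E' else b for b in y], dtype=int)
    z[erased] = z_E
    return z

# Introductory example:  x = (0,1,0,1,1,0,0),  bits 3 and 4 erased
print(bec_decode([0, 1, 'E', 'E', 1, 0, 0], H))   # [0 1 0 1 1 0 0]  ->  z3 = 0, z4 = 1
</syntaxhighlight>

The same sketch is reused below to check the individual subtasks.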
  
Hints:
* This exercise belongs to the chapter&nbsp; [[Channel_Coding/Decoding_of_Linear_Block_Codes|"Decoding of Linear Block Codes"]].
  
* The algorithm for mapping the received word&nbsp; $\underline{y}$&nbsp; to the correct code word&nbsp; $\underline{z} = \underline{x}$ &nbsp; is described in detail in the&nbsp; [[Channel_Coding/Decoding_of_Linear_Block_Codes#Decoding_at_the_Binary_Erasure_Channel|"theory section"]]&nbsp;.
* We would like to point out again that in BEC decoding the first decoder block&nbsp; $\underline{y} → \underline{z}$&nbsp; is called the&nbsp; "code word finder",&nbsp; since wrong decisions are excluded here.&nbsp; Each received word is either decoded correctly or not decoded at all.
*In the BSC model,&nbsp; on the other hand,&nbsp; decoding errors cannot be avoided.&nbsp; Accordingly,&nbsp; we refer to the corresponding block there as&nbsp; "code word estimator".
  
===Questions===
 
<quiz display=simple>
 
{&nbsp; $\underline{y} = (1, {\rm E}, 0, 1, 0, 0, {\rm E})$&nbsp; was received.&nbsp; Which sequence does the code word finder decide on?
|type="()"}
- $\underline{z} = (1, 0, 0, 1, 0, 0, 0),$
+ $\underline{z} = (1, 1, 0, 1, 0, 0, 1),$
- $\underline{z} = (1, 0, 0, 1, 0, 0, 1).$
  
{What are the consequences of the red entries for&nbsp; $\boldsymbol{\rm H}_{\rm K}$&nbsp; and&nbsp; $z_{\rm K}$&nbsp; (see graphic in the information section)?
 
|type="[]"}
+ The erasure vector is&nbsp; $\underline{z}_{\rm E} = (z_{5}, z_{6}, z_{7}).$
+ The received word is&nbsp; $\underline{y} = (1, 1, 0, 1, {\rm E}, {\rm E}, {\rm E}).$
- $\boldsymbol{\rm H}_{\rm E}$&nbsp; is a&nbsp; $2 \times 3$&nbsp; matrix.
+ $\boldsymbol{\rm H}_{\rm E}$&nbsp; is a&nbsp; $3 \times 3$&nbsp; matrix.
  
{Now assume&nbsp; $\underline{y} = (1, 1, 0, 1, {\rm E}, {\rm E}, {\rm E})$.&nbsp; Which code word is selected?
|type="()"}
- $\underline{z} = (1, 1, 0, 1, 1, 1, 0),$
 
+ $\underline{z} = (1, 1, 0, 1, 0, 0, 1),$
 
- $\underline{z} = (1, 1, 0, 0, 0, 1, 0).$
 
- For the present&nbsp; $\underline{y}$ &nbsp; no unique decoding is possible.
  
{What are the consequences of the green entries for&nbsp; $\boldsymbol{\rm H}_{\rm K}$&nbsp; and&nbsp; $z_{\rm K}$&nbsp; (see graphic in the information section)?
 
|type="[]"}
+ The received word is&nbsp; $\underline{y} = (1, 1, 0, {\rm E}, 0, {\rm E}, {\rm E}).$
- $\boldsymbol{\rm H}_{\rm K}$&nbsp; differs from subtask&nbsp; '''(2)'''&nbsp; in the last row.
+ $\boldsymbol{\rm H}_{\rm K}$&nbsp; differs from subtask&nbsp; '''(2)'''&nbsp; in the last column.
  
{Now assume&nbsp; $\underline{y} = (1, 1, 0, {\rm E}, 0, {\rm E}, {\rm E})$.&nbsp; Which code word is selected?
|type="()"}
 
- $\underline{z} = (1, 1, 0, 1, 1, 1, 0),$
 
- $\underline{z} = (1, 1, 0, 1, 0, 0, 1),$
 
- $\underline{z} = (1, 1, 0, 0, 0, 1, 0).$
 
+ For the present&nbsp; $\underline{y}$ &nbsp; no unique decoding is possible.
{Which statements apply to the correction capability at the BEC?&nbsp; In the following,&nbsp; $n_{\rm E}$&nbsp; denotes the number of erasures.
|type="[]"}
+ For&nbsp; $n_{\rm E} < d_{\rm min}$&nbsp; unique decoding is always possible.
- For&nbsp; $n_{\rm E} = d_{\rm min}$&nbsp; unique decoding is always possible.
+ For&nbsp; $n_{\rm E} = d_{\rm min}$&nbsp; unique decoding is sometimes possible.
+ For&nbsp; $n_{\rm E} > d_{\rm min}$&nbsp; unique decoding is never possible.
</quiz>
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp;  The received vector is&nbsp; $\underline{y} = (1, {\rm E}, 0, 1, 0, 0, {\rm E})$.&nbsp; Thus,&nbsp; the code symbols at positions 2 and 7 were erased.&nbsp; Based on the given parity-check matrix
:$${ \boldsymbol{\rm H}} = \begin{pmatrix} 1 &1 &1 &0 &1 &0 &0\\ 0 &1 &1 &1 &0 &1 &0\\ 1 &1 &0 &1 &0 &0 &1 \end{pmatrix}$$
of the Hamming code,&nbsp; one obtains the following vectors and matrices:
* with respect to all&nbsp; "correctly transmitted code symbols"&nbsp; $($index&nbsp; "$\rm K$"&nbsp; from German&nbsp; "korrekt"$)$&nbsp; which are  known to the code word finder:
:$$\underline{z}_{\rm K} = (1, 0, 1, 0, 0)\hspace{0.05cm},\hspace{0.3cm}{ \boldsymbol{\rm H}}_{\rm K} = \begin{pmatrix} 1 &1 &0 &1 &0\\ 0 &1 &1 &0 &1\\ 1 &0 &1 &0 &0 \end{pmatrix} \hspace{0.05cm},$$
*with respect to the two&nbsp; "erased code symbols"&nbsp; $z_{2}$&nbsp; and&nbsp; $z_{7}$&nbsp; $($index&nbsp; $\rm E)$&nbsp; to be determined:
:$$\underline{z}_{\rm E} = (z_2, z_7)\hspace{0.05cm},\hspace{0.3cm}{ \boldsymbol{\rm H}}_{\rm E} = \begin{pmatrix} 1 &0\\ 1 &0\\ 1 &1 \end{pmatrix} \hspace{0.05cm}.$$
The determining equation is thus:
:$${ \boldsymbol{\rm H}}_{\rm E} \cdot \underline{z}_{\rm E}^{\rm T}= { \boldsymbol{\rm H}}_{\rm K} \cdot \underline{z}_{\rm K}^{\rm T}\hspace{0.3cm}
\Rightarrow \hspace{0.3cm} \begin{pmatrix} 1 &0\\ 1 &0\\ 1 &1 \end{pmatrix} \cdot \begin{pmatrix} z_2 \\ z_7 \end{pmatrix} = \begin{pmatrix} 1 &1 &0 &1 &0\\ 0 &1 &1 &0 &1\\ 1 &0 &1 &0 &0 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 0 \\ 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix} \hspace{0.05cm}.$$
This results in three equations for the two unknowns&nbsp; $z_{2}$&nbsp; and&nbsp; $z_{7}$:
:$${\rm (a)}\ z_{2} = 1,$$
:$${\rm (b)}\ z_{2} = 1,$$
:$${\rm (c)}\ z_{2} + z_{7} = 0 \ \Rightarrow \ z_{7}= 1.$$
Thus,&nbsp; the code word finder returns $\underline{z} = (1, 1, 0, 1, 0, 0, 1)$ &nbsp; ⇒ &nbsp; <u>Suggested solution 2</u>.
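With the hypothetical&nbsp; <code>bec_decode</code>&nbsp; sketch from the information section,&nbsp; this result can be checked numerically:

<syntaxhighlight lang="python">
# Subtask (1):  bits 2 and 7 erased
print(bec_decode([1, 'E', 0, 1, 0, 0, 'E'], H))   # [1 1 0 1 0 0 1]  =>  suggested solution 2
</syntaxhighlight>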
  
  
'''(2)'''&nbsp;  Looking at the given matrix&nbsp; $\boldsymbol{\rm H}_{\rm K}$,&nbsp; we see that it coincides with the first four columns of the parity-check matrix&nbsp; $\boldsymbol{\rm H}$.
*The erasures thus affect the last three bits of the received word &nbsp; ⇒ &nbsp; $\underline{z}_{\rm E} = (z_{5}, z_{6}, z_{7})$    &nbsp; ⇒ &nbsp;    $\underline{y} = (1, 1, 0, 1, {\rm E}, {\rm E}, {\rm E})$.
*The erasure matrix reads:
:$${ \boldsymbol{\rm H}}_{\rm E} = \begin{pmatrix} 1 &0 &0\\ 0 &1 &0\\ 0 &0 &1 \end{pmatrix} \hspace{0.05cm}.$$
Correct are therefore the&nbsp; <u>statements 1, 2 and 4</u>.
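As an illustration&nbsp; (assuming the matrix&nbsp; <code>H</code>&nbsp; defined in the sketch of the information section),&nbsp; both submatrices follow directly by column selection:

<syntaxhighlight lang="python">
# Erasures at positions 5, 6, 7  ->  H_E consists of the last three columns of H
print(H[:, 4:7])    # 3x3 identity matrix  =  H_E
print(H[:, 0:4])    # first four columns of H  =  H_K
</syntaxhighlight>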
'''(3)'''&nbsp; One obtains after some matrix multiplications:
:$${ \boldsymbol{\rm H}}_{\rm K} \cdot \underline{z}_{\rm K}^{\rm T} = \begin{pmatrix} 1 &1 &1 &0\\ 0 &1 &1 &1\\ 1 &1 &0 &1 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 1 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \hspace{0.05cm},\hspace{1cm}
{ \boldsymbol{\rm H}}_{\rm E} \cdot \underline{z}_{\rm E}^{\rm T} = \begin{pmatrix} 1 &0 &0\\ 0 &1 &0\\ 0 &0 &1 \end{pmatrix} \cdot \begin{pmatrix} z_5 \\ z_6 \\ z_7 \end{pmatrix} = \begin{pmatrix} z_5 \\ z_6 \\ z_7 \end{pmatrix} \hspace{0.05cm}.$$
*By equating it follows&nbsp; $z_{5} = 0, \ z_{6} = 0, \ z_{7} = 1$ &nbsp; ⇒ &nbsp; <u>Suggested solution 2</u>.
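The same hypothetical&nbsp; <code>bec_decode</code>&nbsp; sketch reproduces this result:

<syntaxhighlight lang="python">
# Subtask (3):  bits 5, 6 and 7 erased
print(bec_decode([1, 1, 0, 1, 'E', 'E', 'E'], H))   # [1 1 0 1 0 0 1]  =>  suggested solution 2
</syntaxhighlight>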
'''(4)'''&nbsp; The matrix comparison shows that the first three columns of&nbsp; $\boldsymbol{\rm H}$ &nbsp; and &nbsp; $\boldsymbol{\rm H}_{\rm K}$ &nbsp; are identical.
* The fourth column of&nbsp; $\boldsymbol{\rm H}_{\rm K}$&nbsp; is equal to the fifth column of the parity-check matrix&nbsp; $\boldsymbol{\rm H}$.
*From this follows the erasure vector&nbsp; $\underline{z}_{\rm E} = (z_{4}, z_{6}, z_{7})$&nbsp; and thus the received vector&nbsp; $\underline{y} = (1, 1, 0, {\rm E}, 0, {\rm E}, {\rm E})$ &nbsp; ⇒ &nbsp; <u>Suggested solutions 1 and 3</u>.
'''(5)'''&nbsp;  Analogous to subtask&nbsp; '''(3)''',&nbsp; we obtain:
:$${ \boldsymbol{\rm H}}_{\rm K} \cdot \underline{z}_{\rm K}^{\rm T} = \begin{pmatrix} 1 &1 &1 &1\\ 0 &1 &1 &0\\ 1 &1 &0 &0 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \hspace{0.05cm},\hspace{1cm}
{ \boldsymbol{\rm H}}_{\rm E} \cdot \underline{z}_{\rm E}^{\rm T} = \begin{pmatrix} 0 &0 &0\\ 1 &1 &0\\ 1 &0 &1 \end{pmatrix} \cdot \begin{pmatrix} z_4 \\ z_6 \\ z_7 \end{pmatrix} = \begin{pmatrix} 0 \\ z_4 + z_6 \\ z_4 + z_7 \end{pmatrix} \hspace{0.05cm}.$$
*If we now set the two column vectors equal,&nbsp;  we obtain only two equations for the three unknowns &nbsp; ⇒ &nbsp; <u>Suggested solution 4</u>.&nbsp; Or in other words: &nbsp;
*If the number of erasures on the BEC channel is larger than the rank of the matrix&nbsp; $\boldsymbol{\rm H}_{\rm E}$,&nbsp; the resulting system of equations has no unique solution.
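In the hypothetical&nbsp; <code>bec_decode</code>&nbsp; sketch from the information section this shows up as a missing pivot during the elimination:

<syntaxhighlight lang="python">
# Subtask (5):  bits 4, 6 and 7 erased  ->  rank of H_E is only 2
print(bec_decode([1, 1, 0, 'E', 0, 'E', 'E'], H))   # None  ->  no unique decoding possible
</syntaxhighlight>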
'''(6)'''&nbsp; To solve this exercise,&nbsp; we again refer to the systematic Hamming code&nbsp; $(7, 4, 3)$&nbsp; according to the given parity-check equation and code table.&nbsp; <u>Note:</u>&nbsp;
[[File:P_ID2540__KC_A_1_13f.png|right|frame|Code table of the systematic&nbsp; $(7, 4, 3)$&nbsp; Hamming code<br><br><br>]]
*The information bits are shown in black and the parity bits in red.&nbsp; The minimum distance of this code is&nbsp; $d_{\rm min} = 3$.
  
*Further we assume that always the code word&nbsp; $\underline{x} = (1, 1, 0, 1, 0, 0, 1)$&nbsp; with yellow background was sent.&nbsp; Then holds:
  
*If the number&nbsp; $n_{\rm E}$&nbsp; of erasures is smaller than&nbsp; $d_{\rm min} = 3$,&nbsp; decoding by the method described here is always possible &nbsp; ⇒ &nbsp; see e.g. subtask&nbsp; '''(1)'''&nbsp; with&nbsp; $n_{\rm E}= 2$.
  
*For&nbsp; $n_{\rm E} = d_{\rm min} = 3$&nbsp; decoding is sometimes possible,&nbsp; see subtask&nbsp; '''(3)'''.&nbsp; In the code table there is only one code word that could match&nbsp; $\underline{y} = (1, 1, 0, 1, {\rm E}, {\rm E}, {\rm E})$,&nbsp; namely the code word highlighted in yellow,&nbsp; $\underline{x} = (1, 1, 0, 1, 0, 0, 1)$.
  
*In contrast,&nbsp; $\underline{y} = (1, 1, 0, {\rm E}, 0, {\rm E}, {\rm E})$&nbsp; according to subtask&nbsp; '''(4)'''&nbsp; could not be decoded.&nbsp; In the code table,&nbsp; besides&nbsp; $(1, 1, 0, 1, 0, 0, 1)$,&nbsp; one can recognize with&nbsp; $(1, 1, 0, 0, 0, 1, 0)$&nbsp; a second code word&nbsp; (highlighted in green)&nbsp; that also becomes the received word&nbsp; $\underline{y}$&nbsp; under the&nbsp; $n_{\rm E} = 3$&nbsp; erasures of bits 4, 6 and 7.
  
*This case,&nbsp; in which the&nbsp; $n_{\rm E} = d_{\rm min}$&nbsp; erasures affect exactly those&nbsp; $d_{\rm min}$&nbsp; bit positions in which two code words differ,&nbsp; leads to a matrix&nbsp; $\boldsymbol{\rm H}_{\rm E}$&nbsp; whose rank is smaller than&nbsp; $d_{\rm min}$.
  
*If&nbsp; $n_{\rm E} > d_{\rm min}$, the number&nbsp; $n - n_{\rm E}$&nbsp; of bits not erased is smaller than the number&nbsp; $k$&nbsp; of information bits.&nbsp; In this case,&nbsp; of course,&nbsp; the code word cannot be decoded.
That is: &nbsp; <u>Statements 1, 3 and 4</u>&nbsp; are applicable.
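The code table argument can also be checked by enumeration.&nbsp; The following minimal sketch&nbsp; (assuming the matrix&nbsp; <code>H</code>&nbsp; from the sketch in the information section)&nbsp; lists all code words that are compatible with a given received word:

<syntaxhighlight lang="python">
from itertools import product
import numpy as np

# All 16 code words of the (7,4,3) Hamming code:  vectors x with  H * x^T = 0 (mod 2)
codewords = [x for x in product([0, 1], repeat=7)
             if not np.any(H.dot(np.array(x)) % 2)]

def compatible(y):
    """All code words that agree with y in every non-erased position."""
    return [c for c in codewords
            if all(b == 'E' or b == c[i] for i, b in enumerate(y))]

print(len(codewords))                           # 16
print(compatible([1, 1, 0, 1, 'E', 'E', 'E']))  # one code word   ->  unique decoding
print(compatible([1, 1, 0, 'E', 0, 'E', 'E']))  # two code words  ->  no unique decoding
</syntaxhighlight>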
{{ML-Fuß}}
  
[[Category:Channel Coding: Exercises|^1.5 Linear Block Code Decoding^]]
