{{quiz-Header|Buchseite=Channel_Coding/Soft-in_Soft-Out_Decoder}}
  
[[File:EN_KC_A_4_3.png|right|frame|BSC model and&nbsp; (above) <br>possible received values]]
In this exercise,&nbsp; we consider two codes:
* the&nbsp; "single parity&ndash;check code" &nbsp; &#8658; &nbsp; [[Channel_Coding/Examples_of_Binary_Block_Codes#Single_Parity-check_Codes| $\text{SPC (3, 2, 2)}$]]:
 
:$$\underline{x} = \big (\hspace{0.05cm}(0, 0, 0), \hspace{0.1cm} (0, 1, 1), \hspace{0.1cm} (1, 0, 1), \hspace{0.1cm} (1, 1, 0) \hspace{0.05cm} \big ) \hspace{0.05cm}, $$
* the repetition code &nbsp; &#8658; &nbsp; [[Channel_Coding/Examples_of_Binary_Block_Codes#Repetition_Codes| $\text{RC (3, 1, 3)}$]]:
 
:$$\underline{x} = \big (\hspace{0.05cm}(0, 0, 0), \hspace{0.1cm} (1, 1, 1) \hspace{0.05cm} \big ) \hspace{0.05cm}.$$
  
The channel is described at bit level by the&nbsp; [[Digital_Signal_Transmission/Binary_Symmetric_Channel|$\text{BSC model}$]].&nbsp; According to the graphic,&nbsp; the following applies:
:$${\rm Pr}(y_i \ne x_i) =\varepsilon = 0.269\hspace{0.05cm},$$
:$${\rm Pr}(y_i = x_i) =1-\varepsilon = 0.731\hspace{0.05cm}.$$

Here,&nbsp; $\varepsilon$&nbsp; denotes the falsification probability of the BSC model.

Except for the last subtask,&nbsp; the following received value is always assumed:
:$$\underline{y} = (0, 1, 0) =\underline{y}_2 \hspace{0.05cm}. $$
  
The chosen indexing of all possible received vectors can be taken from the graphic.
*The most frequently considered vector&nbsp; $\underline{y}_2$&nbsp; is highlighted in red.

*Only for subtask&nbsp; '''(6)'''&nbsp; the following applies:
:$$\underline{y} = (1, 1, 0) =\underline{y}_6 \hspace{0.05cm}. $$
  
For decoding purposes,&nbsp; the exercise examines:
* the&nbsp; [[Channel_Coding/Decoding_of_Linear_Block_Codes#Generalization_of_syndrome_coding|"syndrome decoding"]],&nbsp; which for the codes considered here amounts to&nbsp; "hard decision maximum likelihood detection"&nbsp; $\rm (HD-ML)$ <br>$($soft values are not available at the BSC$)$;

* the bit-wise&nbsp; [[Channel_Coding/Soft-in_Soft-Out_Decoder#Bit-wise_soft-in_soft-out_decoding|"Soft&ndash;in Soft&ndash;out Decoding"]]&nbsp; $\rm (SISO)$&nbsp; according to this section.
  
  
  
<u>Hints:</u>
* This exercise refers to the chapter&nbsp; [[Channel_Coding/Soft-in_Soft-Out_Decoder|"Soft&ndash;in Soft&ndash;out Decoder"]].

* Reference is made in particular to the sections
:*[[Channel_Coding/Soft-in_Soft-Out_Decoder#Bit-wise_soft-in_soft-out_decoding|"Bit-wise Soft&ndash;in Soft&ndash;out Decoding"]]&nbsp; and
:*[[Channel_Coding/Channel_Models_and_Decision_Structures#Binary_Symmetric_Channel_.E2.80.93_BSC|"Binary Symmetric Channel"]].

* The code word selected by the decoder is denoted in the questions by&nbsp; $\underline{z}$.
===Questions===
<quiz display=simple>
{Which statements are valid for decoding the&nbsp; $\text{SPC (3, 2, 2)}$?
|type="()"}
- The hard decision syndrome decoding yields the result&nbsp; $\underline{z} = (0, \, 1, \, 0)$.
- The hard decision syndrome decoding yields the result&nbsp; $\underline{z} = (0, \, 0, \, 0)$.
+ The hard decision syndrome decoding fails here.
  
{Which statements are valid for the&nbsp; $\text{RC (3, 1, 3)}$?
|type="()"}
- The hard decision syndrome decoding yields the result&nbsp; $\underline{z} = (0, \, 1, \, 0)$.
+ The hard decision syndrome decoding yields the result&nbsp; $\underline{z} = (0, \, 0, \, 0)$.
- The hard decision syndrome decoding fails here.
  
{How certain is this decision if we define the reliability&nbsp; $($"security"$)$&nbsp; $S$&nbsp; as the quotient of the probabilities for a correct and an incorrect decision? <br>Set the falsification probability of the BSC model to&nbsp; $\varepsilon = 26.9\%$.
|type="{}"}
$S \ = \ ${ 2.717 3% }
$\hspace{0.75cm} \ln {(S)} \ = \ ${ 1 3% }
  
{What are the intrinsic log likelihood ratios for the iterative bit-wise decoding of the&nbsp; $\text{RC (3, 1, 3)}$&nbsp; received word&nbsp; $\underline{y}_2 = (0, \, 1, \, 0)$?
|type="{}"}
$L_{\rm K}(1) \ = \ ${ 1 3% }
$L_{\rm K}(2) \ = \ ${ -1.03--0.97 }
$L_{\rm K}(3) \ = \ ${ 1 3% }
  
{Which statements are true for decoding the received word&nbsp; $\underline{y}_2 = (0, \, 1, \, 0)$?&nbsp; Continue to assume the&nbsp; $\text{RC (3, 1, 3)}$.
|type="[]"}
+ From the first iteration on,&nbsp; all signs of&nbsp; $L_{\rm APP}(i)$&nbsp; are positive.
+ Already after the second iteration,&nbsp; ${\rm Pr}(\underline{x}_0\hspace{0.05cm} |\hspace{0.05cm} \underline{y}_2)$&nbsp; is greater than&nbsp; $99\%$.
+ With each iteration,&nbsp; the absolute values&nbsp; $|L_{\rm APP}(i)|$&nbsp; become larger.
  
{Which statements are true for decoding the received word&nbsp; $\underline{y}_6 = (1, \, 1, \, 0)$,&nbsp; when&nbsp; $\underline{x}_0 = (0, \, 0, \, 0)$&nbsp; was sent?
|type="[]"}
- The iterative decoder decides correctly.
+ The iterative decoder decides wrongly.
+ The&nbsp; "reliability"&nbsp; for&nbsp; $\underline{y}_6 \Rightarrow \underline{x}_0$&nbsp; increases with increasing&nbsp; $I$.
</quiz>
  
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; Correct is the&nbsp; <u>proposed solution 3</u>:
*The received word&nbsp; $\underline{y}_2 = (0, 1, 0)$&nbsp; is not a valid code word of the single parity&ndash;check code&nbsp; $\text{SPC (3, 2)}$.&nbsp; Thus,&nbsp; the first statement is false.

*In addition,&nbsp; since the&nbsp; $\text{SPC (3, 2)}$&nbsp; has only the minimum distance&nbsp; $d_{\rm min} = 2$,&nbsp; no error can be corrected.
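
The parity violation can be checked with one line of Python&nbsp; $($a minimal sketch;&nbsp; the SPC code words are exactly the words with an even number of ones$)$:
<syntaxhighlight lang="python">
y2 = (0, 1, 0)
print(sum(y2) % 2 == 0)   # -> False: odd parity, so y2 is not a valid SPC (3, 2, 2) code word
</syntaxhighlight>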
  
  
'''(2)'''&nbsp; Correct is the&nbsp; <u>proposed solution 2</u>:
*The possible code words of the&nbsp; $\text{RC (3, 1, 3)}$&nbsp; are&nbsp; $\underline{x}_0 = (0, 0, 0)$&nbsp; and&nbsp; $\underline{x}_1 = (1, 1, 1)$.

*The minimum distance of this code is&nbsp; $d_{\rm min} = 3$,&nbsp; so&nbsp; $t = (d_{\rm min} - 1)/2 = 1$&nbsp; error can be corrected.

*In addition to&nbsp; $\underline{y}_0 = (0, 0, 0)$,&nbsp; the received words&nbsp; $\underline{y}_1 = (0, 0, 1), \ \underline{y}_2 = (0, 1, 0)$,&nbsp; and&nbsp; $\underline{y}_4 = (1, 0, 0)$&nbsp; are also assigned to the decoding result&nbsp; $\underline{x}_0 = (0, 0, 0)$.
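
For the BSC,&nbsp; hard decision ML decoding is equivalent to choosing the code word at minimum Hamming distance.&nbsp; The following minimal Python sketch&nbsp; $($function and variable names are our own choice$)$&nbsp; illustrates this rule:
<syntaxhighlight lang="python">
# Hard decision ML decoding = minimum Hamming distance decoding (for the BSC).
def hamming_distance(x, y):
    return sum(xi != yi for xi, yi in zip(x, y))

def hd_ml_decode(y, codewords):
    """Return the code word closest to y in Hamming distance."""
    return min(codewords, key=lambda x: hamming_distance(x, y))

rc = [(0, 0, 0), (1, 1, 1)]            # code words of the RC (3, 1, 3)
print(hd_ml_decode((0, 1, 0), rc))     # -> (0, 0, 0): the single error is corrected
</syntaxhighlight>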
  
'''(3)'''&nbsp; According to the BSC model,&nbsp; the conditional probability that&nbsp; $\underline{y}_2 = (0, 1, 0)$&nbsp; is received,&nbsp; given that&nbsp; $\underline{x}_0 = (0, 0, 0)$&nbsp; was sent,&nbsp; is:
:$${\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} =  \underline{x}_0 ) = (1-\varepsilon)^2 \cdot  \varepsilon\hspace{0.05cm}.$$
*The first term&nbsp; $(1 - \varepsilon)^2$&nbsp; gives the probability that the first and the third bit were transmitted correctly;&nbsp; $\varepsilon$&nbsp; accounts for the falsification of the second bit.
  
*Correspondingly,&nbsp; for the second possible code word&nbsp; $\underline{x}_1 = (1, 1, 1)$:
:$${\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} =  \underline{x}_1 ) = \varepsilon^2 \cdot (1-\varepsilon)  \hspace{0.05cm}.$$
  
*According to Bayes' theorem,&nbsp; the inference probabilities are then:
:$${\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y} = \underline{y}_2  ) = {\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} =  \underline{x}_0 )  \cdot \frac{{\rm Pr}(\underline{x} =  \underline{x}_0)}{{\rm Pr}(\underline{y} =  \underline{y}_2)} \hspace{0.05cm},$$
:$${\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_1 \hspace{0.1cm}| \hspace{0.1cm}\underline{y} = \underline{y}_2  ) = {\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} =  \underline{x}_1 )  \cdot \frac{{\rm Pr}(\underline{x} =  \underline{x}_1)}{{\rm Pr}(\underline{y} =  \underline{y}_2)} \hspace{0.05cm};$$
:$$\Rightarrow \hspace{0.3cm} S =  \frac{{\rm Pr(correct \hspace{0.15cm}decision)}}{{\rm Pr(wrong \hspace{0.15cm}decision) }} = \frac{(1-\varepsilon)^2 \cdot  \varepsilon}{\varepsilon^2 \cdot (1-\varepsilon)}= \frac{1-\varepsilon}{\varepsilon}\hspace{0.05cm}.$$
  
*With&nbsp; $\varepsilon = 0.269$&nbsp; we get the following numerical values:
:$$S =  {0.731}/{0.269}\hspace{0.15cm}\underline {= 2.717}\hspace{0.3cm}\Rightarrow \hspace{0.3cm}{\rm ln}\hspace{0.15cm}(S)\hspace{0.15cm} \underline {= 1}\hspace{0.05cm}.$$
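
A quick numerical check in Python&nbsp; $($a minimal sketch;&nbsp; the variable&nbsp; <code>eps</code>&nbsp; stands for the falsification probability of this exercise$)$:
<syntaxhighlight lang="python">
from math import log

eps = 0.269                 # falsification probability of the BSC model
S = (1 - eps) / eps         # Pr(correct decision) / Pr(wrong decision)
print(round(S, 3))          # -> 2.717
print(round(log(S), 3))     # -> 1.0   (natural logarithm)
</syntaxhighlight>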
  
  
[[File:P_ID2988__KC_A_4_3e_v1.png|right|frame|Iterative decoding of&nbsp; $(+1, -1, +1)$]]
'''(4)'''&nbsp; The sign of the channel log likelihood ratio&nbsp; $L_{\rm K}(i)$&nbsp; is
*positive if&nbsp; $y_i = 0$,&nbsp; and

*negative if&nbsp; $y_i = 1$.


The absolute value&nbsp; $|L_{\rm K}(i)|$&nbsp; indicates the reliability of&nbsp; $y_i$.
*In the BSC model,&nbsp; $|L_{\rm K}(i)| = \ln \hspace{0.05cm} \big [(1 - \varepsilon)/\varepsilon \big ] = 1$&nbsp; holds for all&nbsp; $i$.&nbsp; Thus:
:$$\underline {L_{\rm K}}(1)\hspace{0.15cm} \underline {= +1}\hspace{0.05cm},\hspace{0.5cm}
\underline {L_{\rm K}}(2)\hspace{0.15cm} \underline {= -1}\hspace{0.05cm},\hspace{0.5cm}
\underline {L_{\rm K}}(3)\hspace{0.15cm} \underline {= +1}\hspace{0.05cm}.$$
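
These channel values follow directly from the received word;&nbsp; a minimal sketch&nbsp; $($variable names are our own choice$)$:
<syntaxhighlight lang="python">
from math import log

eps = 0.269
magnitude = log((1 - eps) / eps)     # |L_K(i)| = ln[(1 - eps)/eps] = 1 for all i
y2 = (0, 1, 0)
L_K = [+magnitude if bit == 0 else -magnitude for bit in y2]
print([round(L, 2) for L in L_K])    # -> [1.0, -1.0, 1.0]
</syntaxhighlight>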
  
  
'''(5)'''&nbsp; The adjacent table illustrates the iterative bit-wise decoding starting from&nbsp; $\underline{y}_2 = (0, \, 1, \, 0)$.

These results can be interpreted as follows:
*The preassignment&nbsp; $($iteration&nbsp; $I = 0)$&nbsp; is done according to&nbsp; $\underline{L}_{\rm APP} = \underline{L}_{\rm K}$.&nbsp; A hard decision &nbsp;&#8658;&nbsp; "${\rm sign} \hspace{0.05cm} \{L_{\rm APP}(i)\}$"&nbsp; would lead to the decoding result&nbsp; $(0, \, 1, \, 0)$.

*The reliability of this obviously incorrect result is given as&nbsp; $|{\it \Sigma}_{\rm APP}| = 1$.&nbsp; This value agrees with the&nbsp; $\ln (S)$&nbsp; calculated in subtask&nbsp; '''(3)'''.

*After the first iteration&nbsp; $(I = 1)$&nbsp; all a-posteriori log likelihood ratios are&nbsp; $L_{\rm APP}(i) = +1$.&nbsp; A hard decision here would yield the&nbsp; $($presumably$)$&nbsp; correct result&nbsp; $\underline{x}_{\rm APP} = (0, \, 0, \, 0)$.

*The probability that this result is correct is quantified by&nbsp; $|{\it \Sigma}_{\rm APP}| = 3$:
 
:$${\rm ln}\hspace{0.25cm}\frac{{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)}{1-{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)} = 3
\hspace{0.3cm}\Rightarrow \hspace{0.3cm}
\frac{{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)}{1-{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)} = {\rm e}^3 \approx 20$$
:$$\hspace{0.3cm}\Rightarrow \hspace{0.3cm}{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2) = {20}/{21}
\approx 95.24\%\hspace{0.05cm}.$$
*The second iteration confirms the decoding result of the first iteration.&nbsp; Here the reliability is even quantified by&nbsp; $9$.&nbsp; This value can be interpreted as follows:
:$$\frac{{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)}{1-{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)} = {\rm e}^9
\hspace{0.3cm}\Rightarrow \hspace{0.3cm}
{\rm Pr}(\hspace{0.1cm}\underline{x} =  \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2) = {{\rm e}^9}/{({\rm e}^9+1)}
\approx 99.99\% \hspace{0.05cm}.$$
*With each further iteration,&nbsp; the reliability value and thus the probability&nbsp; ${\rm Pr}(\underline{x}_0\hspace{0.05cm} |\hspace{0.05cm} \underline{y}_2)$&nbsp; increase drastically &nbsp; &#8658; &nbsp; <u>All proposed solutions</u>&nbsp; are correct.
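
The growth of the reliability values can be reproduced with a few lines of Python.&nbsp; This is only a sketch under an assumption of ours:&nbsp; in each iteration the extrinsic value&nbsp; $L_{\rm E}(i) = \Sigma_{j \ne i}\ L_{\rm APP}(j)$&nbsp; is added to the previous&nbsp; $L_{\rm APP}(i)$.&nbsp; We chose this rule because it reproduces the table values&nbsp; $|{\it \Sigma}_{\rm APP}| = 1, 3, 9, \text{...}$;&nbsp; all function and variable names are our own:
<syntaxhighlight lang="python">
import numpy as np

def iterate_rc3(L_K, n_iter):
    """Sketch of iterative bit-wise (SISO) decoding of the RC (3, 1, 3).

    Assumed update rule: L_E(i) = sum of the two other L_APP values,
    which is then accumulated onto L_APP(i).
    """
    L_app = np.asarray(L_K, dtype=float)
    print(f"I=0: L_APP = {L_app}, Sigma_APP = {L_app.sum():+.0f}")
    for I in range(1, n_iter + 1):
        L_ext = L_app.sum() - L_app          # extrinsic values: sum over j != i
        L_app = L_app + L_ext                # accumulate the extrinsic information
        print(f"I={I}: L_APP = {L_app}, Sigma_APP = {L_app.sum():+.0f}")
    return L_app

# y_2 = (0, 1, 0)  =>  L_K = (+1, -1, +1):
iterate_rc3([+1, -1, +1], n_iter=3)          # Sigma_APP = +1, +3, +9, +27
</syntaxhighlight>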
[[File:P_ID2991__KC_A_4_3f_v1.png|right|frame|Iterative decoding of&nbsp; $(-1, -1, +1)$]]
'''(6)'''&nbsp; Correct are&nbsp; <u>the proposed solutions 2 and 3</u>:
*For the received vector&nbsp; $\underline{y}_6 = (1, \, 1, \, 0)$,&nbsp; the second table applies.

*The decoder now decides for the sequence&nbsp; $\underline{x}_1 = (1, \, 1, \, 1)$.

*The case&nbsp; &raquo;$\underline{y}_6 = (1, \, 1, \, 0)$&nbsp; received under the condition that&nbsp; $\underline{x}_1 = (1, \, 1, \, 1)$&nbsp; was sent&laquo;&nbsp; would correspond exactly to the constellation&nbsp; &raquo;$\underline{y}_2 = (0, \, 1, \, 0)$&nbsp; received and&nbsp; $\underline{x}_0 = (0, \, 0, \, 0)$&nbsp; sent&laquo;&nbsp; considered in the last subtask.

*But since&nbsp; $\underline{x}_0 = (0, \, 0, \, 0)$&nbsp; was sent,&nbsp; there are now two bit errors,&nbsp; with the following consequences:
:* The iterative decoder decides incorrectly.
:* With each further iteration,&nbsp; the wrong decision is declared as more reliable.
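
Under the same assumed update rule as in the sketch for subtask&nbsp; '''(5)''',&nbsp; this behavior becomes visible for&nbsp; $\underline{y}_6$&nbsp; as well&nbsp; $($again only a sketch,&nbsp; not the original computation of the table$)$:
<syntaxhighlight lang="python">
# Assumed update rule as in the sketch for subtask (5), applied to
# y_6 = (1, 1, 0)  =>  L_K = (-1, -1, +1):
L_app = [-1.0, -1.0, +1.0]
for I in range(1, 4):
    total = sum(L_app)            # new L_APP(i) = old L_APP(i) + sum of the others,
    L_app = [total] * 3           # ... which equals the old total for every i
    print(f"I={I}: L_APP = {L_app}, Sigma_APP = {sum(L_app):+.0f}")
# Sigma_APP = -3, -9, -27: the wrong decision (1, 1, 1) appears ever more reliable.
</syntaxhighlight>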
 
 
{{ML-Fuß}}
  
  
  
[[Category:Channel Coding: Exercises|^4.1 Soft–in Soft–out Decoder^]]
