Exercise 4.3: Iterative Decoding at the BSC

[Figure: BSC model and (above) possible received values]

We consider two codes in this exercise:

  • the single parity-check code  ⇒  $\text{SPC (3, 2, 2)}$:

$$\underline{x} = \big (\hspace{0.05cm}(0, 0, 0), \hspace{0.1cm} (0, 1, 1), \hspace{0.1cm} (1, 0, 1), \hspace{0.1cm} (1, 1, 0) \hspace{0.05cm} \big ) \hspace{0.05cm}, $$

  • the repetition code  ⇒  $\text{RC (3, 1, 3)}$:

$$\underline{x} = \big (\hspace{0.05cm}(0, 0, 0), \hspace{0.1cm} (1, 1, 1) \hspace{0.05cm} \big ) \hspace{0.05cm}.$$

The channel is described at bit level by the  $\text{BSC model}$.  According to the graphic,  the following applies:

$${\rm Pr}(y_i \ne x_i) =\varepsilon = 0.269\hspace{0.05cm},$$
$${\rm Pr}(y_i = x_i) =1-\varepsilon = 0.731\hspace{0.05cm}.$$

Here,  $\varepsilon$  denotes the falsification probability of the BSC model.
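
As a side note, the channel itself is easy to reproduce in software. The following Python lines are a minimal sketch of the BSC model (the function name bsc is our own choice, not part of the exercise):

import random

def bsc(x, eps=0.269):
    """Sketch of a BSC: flip each bit of x independently with probability eps."""
    return [bit ^ (random.random() < eps) for bit in x]

print(bsc([0, 0, 0]))   # e.g. [0, 1, 0], i.e. the received word y_2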

Except for the last subtask,  the following received value is always assumed:

$$\underline{y} = (0, 1, 0) =\underline{y}_2 \hspace{0.05cm}. $$

The chosen indexing of all possible received vectors can be taken from the graphic.

  • The most frequently considered vector  $\underline{y}_2$  is highlighted in red.
  • Only for subtask  (6),  the following applies:
$$\underline{y} = (1, 1, 0) =\underline{y}_6 \hspace{0.05cm}. $$

For decoding purposes, the exercise will examine:

  • the "syndrome decoding", which for the codes under consideration follows the concept "hard decision maximum likelihood detection" $\rm (HD-ML)$, because soft values are not available at the BSC;
  • the symbol-wise "Soft-in Soft-out Decoding" $\rm (SISO)$ according to this section.




Hints:

  • This exercise refers to the chapter "Soft-in Soft-out Decoder".
  • Reference is made in particular to the sections "Symbol-wise Soft-in Soft-out Decoding" and "Binary Symmetric Channel".
  • The code word selected by the decoder is denoted in the questions by  $\underline{z}$.



Questions

1

Which statements are valid for decoding the  $\text{SPC (3, 2, 2)}$?

The hard decision syndrome decoding yields the result  $\underline{z} = (0, \, 1, \, 0)$.
The hard decision syndrome decoding returns the result  $\underline{z} = (0, \, 0, \, 0)$.
The hard decision syndrome decoding fails here.

2

Which statements are valid for the  $\text{ RC (3, 1, 3)}$?

The hard decision syndrome decoding returns the result  $\underline{z} = (0, \, 1, \, 0)$.
The hard decision syndrome decoding returns the result  $\underline{z} = (0, \, 0, \, 0)$.
The hard decision syndrome decoding fails here.

3

How certain is this decision if we define the certainty ("security")  $S$  as the quotient of the probabilities for a correct and an incorrect decision?
Set the falsification probability of the BSC model to  $\varepsilon = 26.9\%$.

$S \ = \ $

$\hspace{0.75cm} \ln {(S)} \ = \ $

4

What are the intrinsic log likelihood ratios for iterative symbol-wise decoding of the  $\text{RC (3, 1, 3)}$  received word  $\underline{y}_2 = (0, \, 1, \, 0)$?

$L_{\rm K}(1) \ = \ $

$L_{\rm K}(2) \ = \ $

$L_{\rm K}(3) \ = \ $

5

Which statements are true for decoding the received word  $\underline{y}_2 = (0, \, 1, \, 0)$? Continue to assume the  $\text{RC (3, 1, 3)}$.

From the first iteration on, all signs of  $L_{\rm APP}(i)$  are positive.
Already after the second iteration,  ${\rm Pr}(\underline{x}_0\hspace{0.05cm} |\hspace{0.05cm} \underline{y}_2)$  is greater than  $99\%$.
With each iteration the absolute values  $|L_{\rm APP}(i)|$  become larger.

6

Which statements are true for decoding the received word  $\underline{y}_6 = (1, 1, 0)$  when  $\underline{x}_0 = (0, 0, 0)$  was sent?

The iterative decoder decides correctly.
The iterative decoder decides incorrectly.
The "reliability" for "$\underline{y}_6 \Rightarrow \underline{x}_0$" increases with increasing  $I$.


Solution

(1)  Correct is the proposed solution 3:

  • The received word $\underline{y}_2 = (0, 1, 0)$ is not a valid code word of the single parity–check code SPC (3, 2). Thus, the first statement is false.
  • In addition, since the SPC (3, 2) has only the minimum distance $d_{\rm min} = 2$, no error can be corrected.


(2)  Correct is the proposed solution 2:

  • The possible code words of the RC (3, 1, 3) are $\underline{x}_0 = (0, 0, 0)$ and $\underline{x}_1 = (1, 1, 1)$.
  • The minimum distance of this code is $d_{\rm min} = 3$, so $t = (d_{\rm min} \, - 1)/2 = 1$ error can be corrected.
  • In addition to $\underline{y}_0 = (0, 0, 0)$, $\underline{y}_1 = (0, 0, 1), \ \underline{y}_2 = (0, 1, 0)$, and $\underline{y}_4 = (1, 0, 0)$ are also assigned to the decoding result $\underline{x}_0 = (0, 0, 0)$.
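
Both results can be verified by a brute-force search over the code word lists. The sketch below implements hard decision maximum likelihood decoding as a minimum Hamming distance search, which is equivalent to syndrome decoding for these small codes (all names are our own):

def hd_ml_decode(y, code):
    """Return all code words at minimum Hamming distance from y."""
    dist = lambda a, b: sum(ai != bi for ai, bi in zip(a, b))
    d_min = min(dist(y, c) for c in code)
    return [c for c in code if dist(y, c) == d_min]

SPC = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
RC = [(0, 0, 0), (1, 1, 1)]

print(hd_ml_decode((0, 1, 0), SPC))   # three code words tie -> decoding fails
print(hd_ml_decode((0, 1, 0), RC))    # unique result: (0, 0, 0)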


(3)  According to the BSC model, the conditional probability is that $\underline{y}_2 = (0, 1, 0)$ is received, given that $\underline{x}_0 = (0, 0, 0)$ was sent:

$${\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} = \underline{x}_0 ) = (1-\varepsilon)^2 \cdot \varepsilon\hspace{0.05cm}.$$
  • The first term $(1-\varepsilon)^2$ indicates the probability that the first and the third bit were transmitted correctly, and $\varepsilon$ is the falsification probability of the second bit.
  • Correspondingly, for the second possible code word $\underline{x}_1 = (1, 1, 1)$:
$${\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} = \underline{x}_1 ) = \varepsilon^2 \cdot (1-\varepsilon) \hspace{0.05cm}.$$
  • According to Bayes' theorem, the inference probabilities are then:
$${\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y} = \underline{y}_2 ) = {\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} = \underline{x}_0 ) \cdot \frac{{\rm Pr}(\underline{x} = \underline{x}_0)} {{\rm Pr}(\underline{y} = \underline{y}_2)} \hspace{0.05cm},$$
$${\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_1 \hspace{0.1cm}| \hspace{0.1cm}\underline{y} = \underline{y}_2 ) = {\rm Pr}(\underline{y} = \underline{y}_2 \hspace{0.1cm}| \hspace{0.1cm}\underline{x} = \underline{x}_1 ) \cdot \frac{{\rm Pr}(\underline{x} = \underline{x}_1)} {{\rm Pr}(\underline{y} = \underline{y}_2)} $$
$$\Rightarrow \hspace{0.3cm} S = \frac{{\rm Pr(correct \hspace{0.15cm}decision)}} {{\rm Pr(wrong \hspace{0.15cm}decision) }} = \frac{(1-\varepsilon)^2 \cdot \varepsilon}{\varepsilon^2 \cdot (1-\varepsilon)}= \frac{(1-\varepsilon)}{\varepsilon}\hspace{0.05cm}.$$
  • With $\varepsilon = 0.269$ we get the following numerical values:
$$S = {0.731}/{0.269}\hspace{0.15cm}\underline {= 2.717}\hspace{0.3cm}\Rightarrow \hspace{0.3cm}{\rm ln}\hspace{0.15cm}(S)\hspace{0.15cm} \underline {= 1}\hspace{0.05cm}.$$
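
As a quick numerical cross-check (a sketch, not part of the exercise):

import math

eps = 0.269
S = (1 - eps) / eps            # Pr(correct decision) / Pr(wrong decision)
print(round(S, 3))             # 2.717
print(round(math.log(S), 3))   # 1.0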


(4)  The sign of the channel LLR $L_{\rm K}(i)$ is positive if $y_i = 0$, and negative for $y_i = 1$.

  • The absolute value indicates the reliability of $y_i$. In the BSC model, $|L_{\rm K}(i)| = \ln \big[(1 - \varepsilon)/\varepsilon\big] = 1$ for all $i$. Thus:
$$\underline {L_{\rm K}}(1)\hspace{0.15cm} \underline {= +1}\hspace{0.05cm},\hspace{0.5cm} \underline {L_{\rm K}}(2)\hspace{0.15cm} \underline {= -1}\hspace{0.05cm},\hspace{0.5cm} \underline {L_{\rm K}}(3)\hspace{0.15cm} \underline {= +1}\hspace{0.05cm}.$$
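
The mapping from received bits to channel LLRs can be written compactly; a minimal sketch under the BSC assumption (the function name channel_llr is ours):

import math

def channel_llr(y, eps=0.269):
    """L_K(i) = +ln[(1-eps)/eps] for y_i = 0, with negative sign for y_i = 1."""
    mag = math.log((1 - eps) / eps)   # ~1 for eps = 0.269
    return [mag if bit == 0 else -mag for bit in y]

print(channel_llr([0, 1, 0]))   # ~ [+1, -1, +1]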


[Table: Iterative decoding of $(+1, -1, +1)$]

(5)  The table above illustrates the iterative symbol-wise decoding starting from $\underline{y}_2 = (0, \, 1, \, 0)$.
These results can be interpreted as follows:

  • The preassignment (iteration $I = 0$) is made according to $\underline{L}_{\rm APP} = \underline{L}_{\rm K}$. A hard decision  ⇒  "${\rm sign}\,L_{\rm APP}(i)$" would lead to the decoding result $(0, \, 1, \, 0)$. The reliability of this obviously incorrect result is given as $|{\it \Sigma}| = 1$. This value agrees with the "$\ln (S)$" calculated in subtask (3).
  • After the first iteration $(I = 1)$ all a posteriori LLRs are $L_{\rm APP}(i) = +1$. A hard decision here would yield the (expected) correct result $\underline{x}_{\rm APP} = (0, \, 0, \, 0)$. The probability that this outcome is correct is quantified by $|{\it \Sigma}_{\rm APP}| = 3$:
$${\rm ln}\hspace{0.25cm}\frac{{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)}{1-{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)} = 3 \hspace{0.3cm}\Rightarrow \hspace{0.3cm} \frac{{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)}{1-{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)} = {\rm e}^3 \approx 20$$
$$\hspace{0.3cm}\Rightarrow \hspace{0.3cm}{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2) = {20}/{21} \approx 95.24\%\hspace{0.05cm}.$$
  • The second iteration confirms the decoding result of the first iteration. The reliability is even quantified here with "$9$". This value can be interpreted as follows:
$$\frac{{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)}{1-{\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2)} = {\rm e}^9 \hspace{0.3cm}\Rightarrow \hspace{0.3cm} {\rm Pr}(\hspace{0.1cm}\underline{x} = \underline{x}_0 \hspace{0.1cm}| \hspace{0.1cm}\underline{y}=\underline{y}_2) = {{\rm e}^9}/{({\rm e}^9+1)} \approx 99.99\% \hspace{0.05cm}.$$
  • With each further iteration, the reliability value and thus the probability ${\rm Pr}(\underline{x}_0 | \underline{y}_2)$ increase drastically  ⇒  All proposed solutions are correct.
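
The whole table can be reproduced with a short script. The sketch below assumes the update rule of this chapter for the repetition code: the extrinsic LLR of bit i is the sum of the channel and a priori LLRs of all other bits, and the extrinsic values of one iteration act as the a priori values of the next (function and variable names are ours):

import math

def siso_repetition(y, eps=0.269, iterations=2):
    """Iterative symbol-wise SISO decoding of the RC (3, 1, 3) on the BSC.

    Assumed update rule per iteration I:
        L_E(i)   = sum over j != i of [ L_K(j) + L_A(j) ]   (extrinsic)
        L_APP(i) = L_K(i) + L_A(i) + L_E(i)                 (a posteriori)
    """
    mag = math.log((1 - eps) / eps)                  # ~1 for eps = 0.269
    L_K = [mag if bit == 0 else -mag for bit in y]   # channel LLRs
    L_A = [0.0] * len(y)                             # no a priori info at I = 0
    L_APP = list(L_K)
    print("I=0:", [round(v, 2) for v in L_APP], "sum ~", round(sum(L_APP), 2))
    for I in range(1, iterations + 1):
        total = sum(lk + la for lk, la in zip(L_K, L_A))
        L_E = [total - (lk + la) for lk, la in zip(L_K, L_A)]
        L_APP = [lk + la + le for lk, la, le in zip(L_K, L_A, L_E)]
        print(f"I={I}:", [round(v, 2) for v in L_APP], "sum ~", round(sum(L_APP), 2))
        L_A = L_E                                    # extrinsic -> a priori
    return L_APP

siso_repetition([0, 1, 0])
# I=0: ~(+1, -1, +1), sum ~ 1;  I=1: ~(+1, +1, +1), sum ~ 3;  I=2: ~(+3, +3, +3), sum ~ 9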


[Table: Iterative decoding of $(-1, -1, +1)$]

(6)  Correct are the proposed solutions 2 and 3:

  • For the received vector $\underline{y}_6 = (1, \, 1, \, 0)$, the second table applies.
  • The decoder now decides for the sequence $\underline{x}_1 = (1, \, 1, \, 1)$.
  • The case "$\underline{y}_6 = (1, \, 1, \, 0)$ received under the condition $\underline{x}_1 = (1, \, 1, \, 1)$ sent" would correspond exactly to the constellation "$\underline{y}_2 = (0, \, 1, \, 0)$ received and $\underline{x}_0 = (0, \, 0, \, 0)$ sent" considered in the last subtask.
  • But since $\underline{x}_0 = (0, \, 0, \, 0)$ was sent, there are now two bit errors with the following consequence:
  • The iterative decoder decides incorrectly.
  • With each further iteration, the wrong decision is deemed more and more reliable.
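
Running the siso_repetition sketch from subtask (5) with the second received word reproduces exactly this behavior:

siso_repetition([1, 1, 0])
# I=0: ~(-1, -1, +1), sum ~ -1;  I=1: ~(-1, -1, -1), sum ~ -3;  I=2: ~(-3, -3, -3), sum ~ -9
# All LLRs turn negative -> decision for x_1 = (1, 1, 1), i.e. the wrong code word;
# the magnitudes keep growing, so the wrong decision looks ever more reliable.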