Exercise 4.13: Decoding LDPC Codes

{{quiz-Header|Buchseite=Channel_Coding/The_Basics_of_Low-Density_Parity_Check_Codes}}
  
[[File:P_ID3083__KC_A_4_13_v1.png|right|frame|Given LDPC parity-check matrix]]
The exercise deals with&nbsp; [[Channel_Coding/The_Basics_of_Low-Density_Parity_Check_Codes#Iterative_decoding_of_LDPC_codes|"Iterative decoding of LDPC codes"]]&nbsp; according to the&nbsp; ''message passing algorithm''.
  
The starting point is the given&nbsp; $9 \times 12$&nbsp; parity-check matrix&nbsp; $\mathbf{H}$,&nbsp; which is to be represented as a Tanner graph at the beginning of the exercise.&nbsp; It should be noted:
# The&nbsp; "variable nodes"&nbsp; $V_i$&nbsp; denote the&nbsp; $n$&nbsp; bits of the code word.
# The&nbsp; "check nodes"&nbsp;  $C_j$&nbsp; represent the&nbsp; $m$&nbsp; parity-check equations.
# A connection between&nbsp; $V_i$&nbsp; and&nbsp; $C_j$&nbsp; indicates that the element of matrix&nbsp; $\mathbf{H}$&nbsp; $($in row&nbsp; $j$, column&nbsp; $i)$&nbsp; is &nbsp; $h_{j,\hspace{0.05cm} i} =1$.
#For&nbsp; $h_{j,\hspace{0.05cm}i} = 0$&nbsp; there is no connection between&nbsp; $V_i$&nbsp; and&nbsp; $C_j$.
# The&nbsp; "neighbors&nbsp; $N(V_i)$&nbsp; of&nbsp; $V_i$"&nbsp; is called the set of all&nbsp; check nodes&nbsp; $C_j$ connected to&nbsp; $V_i$&nbsp; in the Tanner graph.
#Correspondingly,&nbsp; $N(C_j)$&nbsp; contains all variable nodes&nbsp; $V_i$&nbsp; with a connection to&nbsp; $C_j$.
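

These neighborhood sets can be read off mechanically from the positions of the ones in any parity-check matrix. A minimal Python sketch (the small matrix below is purely illustrative, not the&nbsp; $9 \times 12$&nbsp; matrix of this exercise):

```python
import numpy as np

# Purely illustrative parity-check matrix (NOT the exercise's 9x12 matrix):
# row j corresponds to check node C_j, column i to variable node V_i.
H = np.array([[1, 1, 0, 1],
              [0, 1, 1, 0],
              [1, 0, 1, 1]])

m, n = H.shape  # m parity-check equations (check nodes), n code word bits (variable nodes)

# N(V_i): all check nodes C_j with h_{j,i} = 1 (1-based indices as in the text)
N_V = {i + 1: [j + 1 for j in range(m) if H[j, i] == 1] for i in range(n)}
# N(C_j): all variable nodes V_i with h_{j,i} = 1
N_C = {j + 1: [i + 1 for i in range(n) if H[j, i] == 1] for j in range(m)}

print(N_V[1])  # check nodes connected to V_1
print(N_C[1])  # variable nodes connected to C_1
```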
  
  
The decoding is performed alternately with respect to
* the&nbsp; variable nodes &nbsp; &#8658; &nbsp; "variable nodes decoder"&nbsp; $\rm (VND)$,&nbsp; and
* the&nbsp; check nodes &nbsp; &#8658; &nbsp; "check nodes decoder"&nbsp; $\rm (CND)$.
  
This is referred to in subtasks&nbsp; '''(4)'''&nbsp; and&nbsp; '''(5)'''.
 
<u>Hints:</u>
*The exercise belongs to the chapter&nbsp; [[Channel_Coding/The_Basics_of_Low-Density_Parity_Check_Codes| "Basic information about Low&ndash;density Parity&ndash;check Codes"]].
*Reference is made in particular to the section&nbsp; [[Channel_Coding/The_Basics_of_Low-Density_Parity_Check_Codes#Iterative_decoding_of_LDPC_codes|"Iterative decoding of LDPC codes"]].
===Questions===
 
<quiz display=simple>
{How many&nbsp; variable nodes&nbsp; $(I_{\rm VN})$&nbsp; and&nbsp; check nodes&nbsp; $(I_{\rm CN})$&nbsp; are to be considered?
 
|type="{}"}
$I_{\rm VN} \ = \ ${ 12 }  
$I_{\rm CN} \ = \ ${ 9 }
  
{Which of the following&nbsp; check nodes&nbsp; and&nbsp; variable nodes&nbsp; are connected?
 
|type="[]"}
+ $C_4$&nbsp; and &nbsp;$V_6$.
+ $C_5$&nbsp; and &nbsp;$V_5$.
- $C_6$&nbsp; and &nbsp;$V_4$.
- $C_6$&nbsp; and &nbsp;$V_i$&nbsp; for &nbsp;$i > 9$.
+ $C_j$&nbsp; and &nbsp;$V_{j-1}$&nbsp; for&nbsp; $j > 6$.
  
{Which statements are true regarding the neighbors&nbsp; $N(V_i)$&nbsp; and&nbsp; $N(C_j)$?
 
|type="[]"}
 
- $N(V_1) = \{C_1, \ C_2, \ C_3, \ C_4\}$,
+ $N(C_1) = \{V_1, \ V_2, \ V_3, \ V_4\}$,
+ $N(V_9) = \{C_3, \ C_5, \ C_7\}$,
- $N(C_9) = \{V_3, \ V_5, \ V_7\}$.
  
{Which statements are true for the&nbsp; variable node decoder&nbsp; $\rm (VND)$?
 
|type="[]"}
+ At the beginning&nbsp; $($iteration 0$)$&nbsp; the&nbsp; $L$&ndash;values of the nodes&nbsp; $V_1, \hspace{0.05cm} \text{...} \hspace{0.05cm}, \ V_n$&nbsp; are preassigned corresponding to the channel input values&nbsp; $y_i$.
+ For the VND,&nbsp; $L(C_j &#8594; V_i)$&nbsp; represents a-priori information.
- There are analogies between the&nbsp; "variable node decoder"&nbsp; and the decoding of a single parity&ndash;check code.
  
{Which statements are true for the&nbsp; check node decoder&nbsp; $\rm (CND)$?
 
|type="[]"}
- At the end,&nbsp; the CND returns the desired a-posteriori&nbsp; $L$&ndash;values.
- For the CND,&nbsp; $L(C_j &#8594; V_i)$&nbsp; represents a-priori information.
+ There are analogies between the&nbsp; "check node decoder"&nbsp; and the decoding of a single parity&ndash;check code.
 
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; The variable node&nbsp; $V_i$&nbsp; stands for the&nbsp; $i$<sup>th</sup>&nbsp; code word bit,&nbsp; so that&nbsp; $I_{\rm VN}$&nbsp; is&nbsp; equal to the code word length&nbsp; $n$.
*From the column number of the&nbsp; $\mathbf{H}$&nbsp; matrix,&nbsp; we can see&nbsp; $I_{\rm VN} = n \ \underline{= 12}$.
*For the set of all variable nodes,&nbsp; one can thus write in general:&nbsp; ${\rm VN} = \{V_1, \hspace{0.05cm} \text{...} \hspace{0.05cm} , V_i, \hspace{0.05cm} \text{...} \hspace{0.05cm} , \ V_n\}$.
*The check node&nbsp; $ C_j$&nbsp; represents the&nbsp; $j$<sup>th</sup>&nbsp; parity-check equation,&nbsp; and for the set of all check nodes:
:$${\rm CN} = \{C_1, \hspace{0.05cm} \text{...} \hspace{0.05cm} , \ C_j, \hspace{0.05cm} \text{...} \hspace{0.05cm} , \ C_m\}.$$
*From the number of rows of the&nbsp; $\mathbf{H}$&nbsp; matrix we get&nbsp; $I_{\rm CN} \ \underline {= m = 9}$.
[[File:P_ID3084__KC_A_4_13c_v1.png|right|frame|Tanner graph for the present example ]]
'''(2)'''&nbsp; The results can be read from the Tanner graph sketched on the right.
Correct are <u>the proposed solutions 1, 2 and 5</u>:
* The element&nbsp; $h_{5,\hspace{0.05cm}5}=1$ &nbsp; $($row 5, column 5$)$ &nbsp; &#8658; &nbsp; red edge.
* The element&nbsp; $h_{4,\hspace{0.05cm} 6}=1$&nbsp; $($row 4, column 6$)$ &nbsp; &#8658; &nbsp; blue edge.
* The element&nbsp; $h_{6, \hspace{0.05cm}4}=0$&nbsp; $($row 6, column 4$)$ &nbsp; &#8658; &nbsp; no edge.
* $h_{6,\hspace{0.05cm} 10} = h_{6,\hspace{0.05cm} 11} = 1$,&nbsp; $h_{6,\hspace{0.05cm}12} = 0$ &nbsp; &#8658; &nbsp; not all three edges exist.
* It holds&nbsp; $h_{7,\hspace{0.05cm}6} = h_{8,\hspace{0.05cm}7} = h_{9,\hspace{0.05cm}8} = 1$ &nbsp; &#8658; &nbsp; green edges.
'''(3)'''&nbsp; It is a regular LDPC code with
* $w_{\rm R}(j) = 4 = w_{\rm R}$ for $1 &#8804; j &#8804; 9$,
* $w_{\rm C}(i) = 3 = w_{\rm C}$ for $1 &#8804; i &#8804; 12$.
The&nbsp; <u>answers 2 and 3</u>&nbsp; are correct,&nbsp; as can be seen from the first row and ninth column,&nbsp; respectively, of the parity-check matrix&nbsp; $\mathbf{H}$.  
  
The Tanner graph confirms these results:
* From&nbsp; $C_1$&nbsp; there are edges to&nbsp; $V_1, \ V_2, \ V_3$, and $V_4$.
  
* From&nbsp; $V_9$&nbsp; there are edges to $C_3, \ C_5$, and $C_7$.
  
Answers 1 and 4 cannot be correct for the simple reason that
* the neighborhood&nbsp; $N(V_i)$&nbsp; of each variable node&nbsp; $V_i$&nbsp; contains exactly&nbsp; $w_{\rm C} = 3$&nbsp; elements,&nbsp; and
  
* the neighborhood&nbsp; $N(C_j)$&nbsp; of each check node&nbsp; $C_j$&nbsp; contains exactly&nbsp; $w_{\rm R} = 4$&nbsp; elements.
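

Regularity can also be checked numerically for any parity-check matrix: all row sums must agree $($giving&nbsp; $w_{\rm R})$&nbsp; and all column sums must agree $($giving&nbsp; $w_{\rm C})$. A sketch with a small illustrative matrix, not the exercise's&nbsp; $\mathbf{H}$:

```python
import numpy as np

# Small illustrative matrix (not the exercise's 9x12 H); here every row
# has weight 2 and every column has weight 2, so the code is regular.
H = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]])

row_weights = H.sum(axis=1)  # w_R(j): ones per parity-check equation
col_weights = H.sum(axis=0)  # w_C(i): ones per code word bit

# Regular LDPC code: all row weights equal and all column weights equal
# (w_R and w_C need not coincide, as in the exercise with w_R = 4, w_C = 3).
is_regular = len(set(row_weights)) == 1 and len(set(col_weights)) == 1
print(list(row_weights), list(col_weights), is_regular)
```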
'''(4)'''&nbsp; Correct are the&nbsp; <u>proposed solutions 1 and 2</u>,&nbsp; as can be seen from the&nbsp; [[Channel_Coding/The_Basics_of_Low-Density_Parity_Check_Codes#Iterative_decoding_of_LDPC_codes|"corresponding theory page"]]:
* At the start of decoding&nbsp; $($so to speak at iteration&nbsp; $I=0)$&nbsp; the&nbsp; $L$&ndash;values of the variable nodes &nbsp; &#8658; &nbsp; $L(V_i)$ are preallocated with the channel input values.
  
* Later&nbsp; $($from iteration $I = 1)$&nbsp; the log likelihood ratio&nbsp; $L(C_j &#8594; V_i)$&nbsp; transmitted by the CND is considered in the VND as a-priori information.
* Answer 3 is wrong.&nbsp; Rather,&nbsp; the correct answer would be:&nbsp; There are analogies between the VND algorithm and the decoding of a&nbsp; "repetition code".
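

The repetition-code analogy can be made concrete: the VND forms the message from&nbsp; $V_i$&nbsp; to&nbsp; $C_j$&nbsp; as the channel&nbsp; $L$&ndash;value of bit&nbsp; $i$&nbsp; plus all incoming check-node messages except the one coming from&nbsp; $C_j$&nbsp; itself. A minimal sketch (the helper function name and the numeric values are illustrative, not from the exercise):

```python
def vnd_update(L_channel_i, L_in, exclude_j):
    """Message L(V_i -> C_j): channel L-value of bit i plus all incoming
    check-node messages L(C_j' -> V_i), except the one from C_j itself.
    L_in maps j -> L(C_j -> V_i) over the neighbors N(V_i)."""
    return L_channel_i + sum(L for j, L in L_in.items() if j != exclude_j)

# Iteration 0: no check-node messages yet, so the message equals the
# channel L-value (the preassignment described above).
print(vnd_update(1.5, {}, exclude_j=1))  # 1.5

# Later iterations: extrinsic sum over the other neighbors only.
print(vnd_update(1.5, {1: 0.5, 2: -0.25, 3: 1.0}, exclude_j=2))  # 1.5 + 0.5 + 1.0 = 3.0
```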
  
 
  
  
'''(5)'''&nbsp; Correct is&nbsp; <u>only proposed solution 3</u>&nbsp; because
* the final a-posteriori&nbsp; $L$&ndash;values are derived from the VND,&nbsp; not from the CND;
 
  
* the&nbsp; $L$&ndash;value&nbsp; $L(C_j &#8594; V_i)$&nbsp; represents extrinsic information for the CND;&nbsp; and
  
* there are indeed analogies between the CND algorithm and SPC decoding.
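

The SPC analogy corresponds to the well-known tanh&nbsp; $($"boxplus"$)$&nbsp; rule of single parity-check decoding: the CND message to&nbsp; $V_i$&nbsp; combines all incoming variable-node messages except the one from&nbsp; $V_i$&nbsp; itself. A minimal sketch (hypothetical helper, illustrative values):

```python
import math

def cnd_update(L_in, exclude_i):
    """Message L(C_j -> V_i) via the tanh ("boxplus") rule of single
    parity-check decoding: combine all incoming variable-node messages
    L(V_i' -> C_j) except the one from V_i itself.
    L_in maps i -> L(V_i -> C_j) over the neighbors N(C_j)."""
    prod = 1.0
    for i, L in L_in.items():
        if i != exclude_i:
            prod *= math.tanh(L / 2.0)
    return 2.0 * math.atanh(prod)

# Two reliable positive inputs give a positive but less reliable
# extrinsic value (magnitude below each input), as with SPC decoding.
print(cnd_update({1: 2.0, 2: 2.0, 3: 2.0}, exclude_i=3))
```

With a single remaining neighbor the rule passes that message through unchanged, and a sign flip of any input flips the sign of the output, exactly the parity behavior of a single parity-check code.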
 
 
{{ML-Fuß}}
  
  
  
[[Category:Channel Coding: Exercises|^4.4 Low–density Parity–check Codes^]]

Latest revision as of 18:30, 17 December 2022