Exercise 2.11Z: Erasure Channel for Symbols

{{quiz-Header|Buchseite=Channel_Coding/Reed-Solomon_Decoding_for_the_Erasure_Channel}}
  
[[File:P_ID2543__KC_Z_2_11.png|right|frame|Erasure channel for symbols:  "m–BEC"]]
The channel model&nbsp; [[Channel_Coding/Channel_Models_and_Decision_Structures#Binary_Erasure_Channel_.E2.80.93_BEC|"Binary Erasure Channel"]]&nbsp; $\rm (BEC)$ describes a bit-level erasure channel:
*A binary symbol&nbsp; $(0$&nbsp; or&nbsp; $1)$&nbsp; is correctly transmitted with probability&nbsp; $1 - \lambda$&nbsp; and marked as an erasure&nbsp; "$\rm E$"&nbsp; with probability&nbsp; $\lambda$.
*In contrast to the&nbsp; [[Channel_Coding/Channel_Models_and_Decision_Structures#Binary_Symmetric_Channel_.E2.80.93_BSC| "Binary Symmetric Channel"]]&nbsp; $\rm (BSC)$,&nbsp; falsifications&nbsp; $(0 \rightarrow 1, \ 1 \rightarrow 0)$&nbsp; cannot occur here.
  
A Reed&ndash;Solomon code is based on a Galois field &nbsp; ${\rm GF}(2^m)$ &nbsp; with integer&nbsp; $m$.&nbsp; Thus,&nbsp; each code symbol &nbsp; $c$ &nbsp; can be represented by&nbsp; $m$&nbsp; bits.&nbsp; If one wants to apply the BEC model here,&nbsp; one has to modify it to the&nbsp; "$m$&ndash;BEC"&nbsp; model&nbsp; as shown in the diagram below for&nbsp; $m = 2$:
  
*All code symbols &nbsp; $($in binary representation&nbsp; $00, \ 01, \ 10, \ 11)$ &nbsp; are transmitted correctly with probability&nbsp; $1 - \lambda_2$.

*Thus,&nbsp; the probability of an erased symbol is&nbsp; $\lambda_2$.

*Note that a single erased bit already leads to the erased received symbol&nbsp; $y = \rm E$.
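A minimal Python sketch of this symbol-level mapping&nbsp; (illustration only;&nbsp; shown for&nbsp; $m = 2$&nbsp; with independent bit erasures, &nbsp; the function and variable names are chosen freely)&nbsp; makes explicit that one erased bit already erases the whole symbol:

<syntaxhighlight lang="python">
import random

def m_bec(symbol_bits, lam=0.2):
    """Map a transmitted bit string to the received symbol:
    the string itself, or "E" if at least one bit is erased."""
    # each bit is erased independently with probability lam
    if any(random.random() < lam for _ in symbol_bits):
        return "E"
    return symbol_bits

print(m_bec("01"))   # prints either "01" or "E"
</syntaxhighlight>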
  
  
Hints:
* This exercise belongs to the chapter&nbsp; [[Channel_Coding/Reed-Solomon_Decoding_for_the_Erasure_Channel| "Reed&ndash;Solomon Decoding for the Erasure Channel"]].

* For a code based on&nbsp; ${\rm GF}(2^m)$,&nbsp; the outlined&nbsp; $2$&ndash;BEC model has to be extended to the&nbsp; $m$&ndash;BEC model.

*The erasure probability of this model is then denoted by&nbsp; $\lambda_m$.

*For the first three subtasks,&nbsp; always use&nbsp; $\lambda = 0.2$&nbsp; for the erasure probability of the basic model according to the upper graph.

===Questions===
 
<quiz display=simple>
{Let&nbsp; $\lambda = 0.2$.&nbsp; With what probabilities do the possible received values occur in the&nbsp; BEC basic model?
 
|type="{}"}
${\rm Pr}(y = 0) \ = \ ${ 40 3% } $\ \%$
${\rm Pr}(y = {\rm E}) \ = \ ${ 20 3% } $\ \%$
${\rm Pr}(y = {\rm 1}) \ = \ ${ 40 3% } $\ \%$
  
{Let&nbsp; $\lambda = 0.2$.&nbsp; What is the erasure probability&nbsp; $\lambda_2$&nbsp; at symbol level&nbsp; $(2$&ndash;BEC model$)$&nbsp; when the Reed&ndash;Solomon code is based on&nbsp; $\rm GF(2^2)$?
 
|type="{}"}
$\lambda_2 \ = \ ${ 36 3% } $\ \%$
  
{Let&nbsp; $\lambda = 0.2$.&nbsp; What is the erasure probability&nbsp; $\lambda_m$&nbsp; when the&nbsp; $m$&ndash;BEC model is adapted to the&nbsp; $\rm RSC \, (255, \, 223, \, 33)_{256}$?
 
|type="{}"}
$\lambda_m \ = \ ${ 83.2 3% } $\ \%$
  
{What is the maximum allowed erasure probability&nbsp; $\lambda$&nbsp; of the&nbsp; "BEC basic model"&nbsp; for&nbsp; $\lambda_m \le 0.2$&nbsp; to hold?
 
|type="{}"}
${\rm Max} \ \big[\lambda\big ] \ = \ ${ 2.75 3% } $\ \%$
  
{What is the probability of receiving the&nbsp; "zero symbol"&nbsp; in this case?
 
|type="{}"}
${\rm Pr}(y_{\rm bin} = 00000000) \ = \ ${ 0.3125 3% } $\ \%$
 
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp; Due to the symmetry of the given BEC model&nbsp; ("bit-level erasure channel"),&nbsp; the erasure probability is:
:$${\rm Pr}(y = {\rm E}) = \lambda \ \underline{= 20\%}.$$
*Since the code symbols&nbsp; $0$&nbsp; and&nbsp; $1$&nbsp; are equally likely,&nbsp; we obtain&nbsp; ${\rm Pr}(y = 0) = (1 - \lambda)/2 \ \underline{= 40\%}$&nbsp; and likewise&nbsp; ${\rm Pr}(y = 1) \ \underline{= 40\%}$.
'''(2)'''&nbsp; Without loss of generality,&nbsp; we assume the code symbol&nbsp; $c_{\rm bin} = $ "$00$"&nbsp; to solve this exercise.
*According to the&nbsp; $2$&ndash;BEC model,&nbsp; the received symbol&nbsp; $y_{\rm bin}$&nbsp; can then be either&nbsp; "$00$"&nbsp; or erased&nbsp; $(\rm E)$,&nbsp; and it holds:
:$${\rm Pr}(y_{\rm bin} = "00"\hspace{0.05cm} |\hspace{0.05cm} c_{\rm bin} = "00") = ( 1 - \lambda)^2 = 0.8^2 = 0.64 = 1 - \lambda_2\hspace{0.3cm}
\Rightarrow \hspace{0.3cm} \lambda_2 = 1 - ( 1 - \lambda)^2 \hspace{0.15cm}\underline{= 36\%}\hspace{0.05cm}. $$
*Here it is used that the symbol escapes erasure only if neither of its two bits has been erased.
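*This result can also be checked numerically.&nbsp; The following minimal Python sketch&nbsp; (illustration only,&nbsp; assuming independent bit erasures with probability&nbsp; $\lambda = 0.2)$&nbsp; estimates&nbsp; $\lambda_2$&nbsp; by simulation:
<syntaxhighlight lang="python">
import random

lam = 0.2            # bit-level erasure probability of the basic BEC model
m = 2                # bits per code symbol  =>  2-BEC
trials = 100_000

# a symbol is erased as soon as at least one of its m bits is erased
erased = sum(any(random.random() < lam for _ in range(m)) for _ in range(trials))

print(f"simulated lambda_2 : {erased / trials:.3f}")     # close to 0.36
print(f"analytic  lambda_2 : {1 - (1 - lam)**m:.3f}")    # exactly 0.360
</syntaxhighlight>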
'''(3)'''&nbsp; The &nbsp; $\rm RSC \, (255, \, 223, \, 33)_{256}$ &nbsp; is based on the Galois field&nbsp; $\rm GF(256) = GF(2^8) \ \Rightarrow \ \it m = \rm 8$.
*The result of subtask&nbsp; '''(2)'''&nbsp; must now be adapted to this case.&nbsp; For the&nbsp; $8$&ndash;BEC model the following holds:
:$$1 - \lambda_8 = ( 1 - \lambda)^8 = 0.8^8 \approx 0.168 \hspace{0.3cm} \Rightarrow \hspace{0.3cm}
\lambda_m = \lambda_8  \hspace{0.15cm}\underline{\approx 83.2\%}\hspace{0.05cm}. $$
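*The same formula&nbsp; $\lambda_m = 1 - (1 - \lambda)^m$&nbsp; can be tabulated for other symbol sizes as well.&nbsp; A short Python sketch&nbsp; (numerical illustration only)&nbsp; for&nbsp; $\lambda = 0.2$:
<syntaxhighlight lang="python">
# symbol erasure probability lambda_m = 1 - (1 - lambda)^m for lambda = 0.2
lam = 0.2
for m in (1, 2, 4, 8):
    print(f"m = {m}:  lambda_m = {1 - (1 - lam)**m:.3f}")
# m = 8 yields lambda_8 = 0.832, i.e. about 83.2 %
</syntaxhighlight>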
'''(4)'''&nbsp; From the condition &nbsp; $\lambda_m \le 0.2$&nbsp; it follows directly that&nbsp; $1 - \lambda_m \ge 0.8$,&nbsp; and further:
:$$( 1 - \lambda)^8 \ge 0.8  \hspace{0.3cm} \Rightarrow \hspace{0.3cm}
1 - \lambda \ge 0.8^{0.125} \approx 0.9725 \hspace{0.3cm} \Rightarrow \hspace{0.3cm}\lambda \hspace{0.15cm}
\underline{\le 2.75\%}\hspace{0.05cm}.$$
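*Equivalently,&nbsp; solving&nbsp; $\lambda_m = 1 - (1 - \lambda)^m$&nbsp; for&nbsp; $\lambda$&nbsp; gives&nbsp; $\lambda = 1 - (1 - \lambda_m)^{1/m}$.&nbsp; A minimal Python sketch of this inversion&nbsp; (illustration only):
<syntaxhighlight lang="python">
# maximum bit-level erasure probability such that lambda_m <= 0.2 for m = 8
lam_m_max = 0.2
m = 8
lam_max = 1 - (1 - lam_m_max)**(1 / m)
print(f"max lambda = {lam_max:.4f}")   # 0.0275, i.e. about 2.75 %
</syntaxhighlight>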
'''(5)'''&nbsp; Here one proceeds as follows:
*With&nbsp; $\lambda = 0.0275$&nbsp; we have&nbsp; $\lambda_m = 0.2$ &nbsp; &rArr; &nbsp; $20\%$&nbsp; of the received symbols are erased.
*The&nbsp; $2^8 = 256$&nbsp; possible non-erased received symbols&nbsp; $(00000000$ &nbsp; ... &nbsp; $11111111)$&nbsp; are all equally likely.&nbsp; With&nbsp; $1 - \lambda_8 = 0.8$&nbsp; it follows:
:$${\rm Pr}(y_{\rm bin} = 00000000) = \hspace{0.1cm}\text{...} \hspace{0.1cm}= {\rm Pr}(y_{\rm bin} = 11111111)= \frac{0.8}{256}  \hspace{0.15cm}\underline{= 0.3125\%}\hspace{0.05cm}.$$
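*As a plausibility check,&nbsp; the&nbsp; $256$&nbsp; equally likely symbol probabilities together with the erasure probability must add up to&nbsp; $1$.&nbsp; A minimal Python sketch&nbsp; (illustration only):
<syntaxhighlight lang="python">
# probability of one specific non-erased received symbol for lambda_8 = 0.2
lam_8 = 0.2
p_symbol = (1 - lam_8) / 2**8
print(f"Pr(y = 00000000) = {p_symbol:.6f}")                  # 0.003125, i.e. 0.3125 %
print(f"sum of all probabilities: {256 * p_symbol + lam_8}")  # 1.0
</syntaxhighlight>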
 
{{ML-Fuß}}
  
  
  
[[Category:Channel Coding: Exercises|^2.4 Reed-Solomon Decoding at the BEC^]]
