Exercise 4.2: Channel Log Likelihood Ratio at AWGN

From LNTwww

{{quiz-Header|Buchseite=Channel_Coding/Soft-in_Soft-Out_Decoder}}
  
[[File:P_ID2980__KC_A_4_2_v2.png|right|frame|Conditional Gaussian functions]]
We consider two channels  $\rm A$  and  $\rm B$,  each with
* binary bipolar input  $x ∈ \{+1, \, -1\}$,  and
* continuous-valued output  $y ∈ {\rm \mathcal{R}}$  (real number).
  
  
The graph shows for both channels
* as blue curves the density functions  $f_{y\hspace{0.05cm}|\hspace{0.05cm}x=+1}$,
* as red curves the density functions  $f_{y\hspace{0.05cm}|\hspace{0.05cm}x=-1}$.
  
  
In the&nbsp; [[Channel_Coding/Soft-in_Soft-Out_Decoder#Reliability_information_-_Log_Likelihood_Ratio| "theory section"]]&nbsp; the channel $L$ value (channel log likelihood ratio, or "channel LLR" for short) was derived for this AWGN constellation as follows:

:$$L_{\rm K}(y) = L(y\hspace{0.05cm}|\hspace{0.05cm}x) =  {\rm ln} \hspace{0.15cm} \frac{{\rm Pr}(y \hspace{0.05cm}|\hspace{0.05cm}x=+1) }{{\rm Pr}(y \hspace{0.05cm}|\hspace{0.05cm}x = -1)}
\hspace{0.05cm}.$$
  
Evaluating this equation analytically, we obtain with the proportionality constant&nbsp; $K_{\rm L} = 2/\sigma^2$:

:$$L_{\rm K}(y) = K_{\rm L} \cdot y
\hspace{0.05cm}.$$
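This closed-form result can be checked numerically. The following Python sketch (the function names are our own, not part of the exercise) evaluates the log-ratio of the two conditional Gaussian densities directly and confirms that it collapses to $K_{\rm L} \cdot y$ with $K_{\rm L} = 2/\sigma^2$:

```python
import math

def gaussian_pdf(y, mean, sigma):
    """Conditional density f_{y|x}(y) for the AWGN channel y = x + n."""
    return math.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def channel_llr(y, sigma):
    """L_K(y) = ln[ f(y|x=+1) / f(y|x=-1) ], evaluated numerically."""
    return math.log(gaussian_pdf(y, +1, sigma) / gaussian_pdf(y, -1, sigma))

# The log-ratio collapses to K_L * y with K_L = 2 / sigma^2:
sigma = 1.0
for y in (1.0, 0.5, -1.5):
    assert abs(channel_llr(y, sigma) - (2 / sigma ** 2) * y) < 1e-12
```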
  
  
''Hints:''
* This exercise belongs to the chapter&nbsp; [[Channel_Coding/Soft-in_Soft-Out_Decoder| "Soft&ndash;in Soft&ndash;out Decoder"]].
* Reference is made in particular to the pages&nbsp; [[Channel_Coding/Soft-in_Soft-Out_Decoder#Reliability_information_-_Log_Likelihood_Ratio|"Reliability Information &ndash; Log Likelihood Ratio"]]&nbsp; and&nbsp; [[Channel_Coding/Channel_Models_and_Decision_Structures#AWGN_channel_at_Binary_Input|"AWGN Channel at Binary Input"]].
 
   
 
   
  
  
  
===Questions===
 
<quiz display=simple>
{What are the characteristics of the channels shown in the diagram?
|type="[]"}
+ They describe binary transmission under Gaussian noise.
+ The bit error probability without coding is&nbsp; ${\rm Q}(1/\sigma)$.
+ The channel LLR can be represented as&nbsp; $L_{\rm K}(y) = K_{\rm L} \cdot y$.
  
{Which constant $K_{\rm L}$ characterizes the channel&nbsp; $\rm A$?
|type="{}"}
$K_{\rm L} \ = \ ${ 2 3% }
  
{For channel&nbsp; $\rm A$,&nbsp; what information do the received values&nbsp; $y_1 = 1, \ y_2 = 0.5$,&nbsp; $y_3 = \, -1.5$&nbsp; provide about the transmitted binary symbols&nbsp; $x_1, \ x_2$&nbsp; and&nbsp; $x_3$, respectively?
|type="[]"}
+ $y_1 = 1.0$&nbsp; indicates that&nbsp; $x_1 = +1$&nbsp; was probably sent.
+ $y_2 = 0.5$&nbsp; indicates that&nbsp; $x_2 = +1$&nbsp; was probably sent.
+ $y_3 = \, -1.5$&nbsp; indicates that&nbsp; $x_3 = \, -1$&nbsp; was probably sent.
+ The decision&nbsp; "$y_1 &#8594; x_1$"&nbsp; is more certain than&nbsp; "$y_2 &#8594; x_2$".
- The decision&nbsp; "$y_1 &#8594; x_1$"&nbsp; is more certain than&nbsp; "$y_3 &#8594; x_3$".
  
{Which&nbsp; $K_{\rm L}$&nbsp; characterizes the channel&nbsp; $\rm B$?
|type="{}"}
$K_{\rm L} \ = \ ${ 8 3% }
  
{For channel&nbsp; $\rm B$,&nbsp; what information do the received values&nbsp; $y_1 = 1, \ y_2 = 0.5$,&nbsp; $y_3 = -1.5$&nbsp; provide about the transmitted binary symbols&nbsp; $x_1, \ x_2$&nbsp; and&nbsp; $x_3$, respectively?
|type="[]"}
+ The decisions for&nbsp; $x_1, \ x_2, \ x_3$&nbsp; are the same as for channel&nbsp; $\rm A$.
+ The estimate&nbsp; "$x_2 = +1$"&nbsp; is four times more certain than for channel&nbsp; $\rm A$.
- The estimate&nbsp; "$x_3 = \, -1$"&nbsp; for channel&nbsp; $\rm A$&nbsp; is more reliable than the estimate&nbsp; "$x_2 = +1$"&nbsp; for channel&nbsp; $\rm B$.
 
</quiz>
  
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; <u>All proposed solutions</u> are correct:
* The transmission equation is always $y = x + n$, with $x &#8712; \{+1, \, -1\}$; here $n$ is a Gaussian random variable with standard deviation $\sigma$ &nbsp; &#8658; &nbsp; variance $\sigma^2$ &nbsp; &#8658; &nbsp; [[Channel_Coding/Channel_Models_and_Decision_Structures#AWGN_channel_at_Binary_Input| "AWGN channel"]].
* The [[Digital_Signal_Transmission/Error_Probability_for_Baseband_Transmission#Error_probability_with_Gaussian_noise|"AWGN bit error probability"]] is ${\rm Q}(1/\sigma)$ with standard deviation $\sigma$, where ${\rm Q}(x)$ denotes the [[Theory_of_Stochastic_Signals/Gaussian_Distributed_Random_Variables#Exceedance_probability|"complementary Gaussian error function"]].
* For each AWGN channel, according to the [[Channel_Coding/Soft-in_Soft-Out_Decoder#Reliability_information_-_Log_Likelihood_Ratio|"theory section"]], the channel LLR is always $L_{\rm K}(y) = L(y|x) = K_{\rm L} \cdot y$.
* The constant $K_{\rm L}$ is different for the two channels, however.
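The uncoded error probability ${\rm Q}(1/\sigma)$ mentioned above can be evaluated via the identity ${\rm Q}(x) = 1/2 \cdot {\rm erfc}(x/\sqrt{2})$; a minimal sketch (the standard deviations $\sigma = 1$ and $\sigma = 1/2$ are those read off the graph in the later subtasks):

```python
import math

def Q(x):
    """Complementary Gaussian error function Q(x) = 1/2 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

print(round(Q(1 / 1.0), 4))   # channel A (sigma = 1):   0.1587
print(round(Q(1 / 0.5), 4))   # channel B (sigma = 1/2): 0.0228
```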
  
  
'''(2)'''&nbsp; For the AWGN channel, $L_{\rm K}(y) = K_{\rm L} \cdot y$ holds with the constant $K_{\rm L} = 2/\sigma^2$. The standard deviation $\sigma$ can be read from the graph on the exercise page as the distance of the inflection points of the Gaussian curves from their respective means. For '''channel A''' this yields $\sigma = 1$.
*The same result is obtained by evaluating the Gaussian function:

:$$\frac{f_{\rm G}( y = \sigma)}{f_{\rm G}( y = 0)} = {\rm e} ^{ -  y^2/(2\sigma^2) } \Bigg |_{\hspace{0.05cm} y \hspace{0.05cm} = \hspace{0.05cm} \sigma} = {\rm e} ^{ -0.5} \approx 0.6065\hspace{0.05cm}.$$
  
*This means: &nbsp; At the abscissa value $y = \sigma$, the zero-mean Gaussian function $f_{\rm G}(y)$ has decayed to $60.65\%$ of its maximum value. Thus, for '''channel A''' the constant is &nbsp; $K_{\rm L} = 2/\sigma^2 \ \underline{= 2}$.
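As a quick numerical check of these two readings (a minimal sketch; the variable names are our own):

```python
import math

# The zero-mean Gaussian at y = sigma, relative to its maximum at y = 0,
# equals exp(-0.5) ~ 0.6065 -- independent of sigma:
assert round(math.exp(-0.5), 4) == 0.6065

# Reading sigma = 1 off the curve for channel A then gives the constant:
sigma_A = 1.0
K_L_A = 2 / sigma_A ** 2
assert K_L_A == 2.0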
  
  
  
'''(3)'''&nbsp; <u>Proposed solutions 1 to 4</u> are correct:
*We first give the respective LLRs for '''channel A''':

:$$L_{\rm K}(y_1 = +1.0) = +2\hspace{0.05cm},\hspace{0.3cm}
L_{\rm K}(y_2 = +0.5) = +1\hspace{0.05cm},\hspace{0.3cm}
L_{\rm K}(y_3 = -1.5) = -3\hspace{0.05cm}. $$
*This has the following consequences:
# The decision for the (most probable) code bit $x_i$ is made based on the sign of $L_{\rm K}(y_i)$: &nbsp; $x_1 = +1, \ x_2 = +1, \ x_3 = \, -1$ &nbsp; &#8658; &nbsp; <u>proposed solutions 1, 2 and 3</u> are correct.
# The decision "$x_1 = +1$" is more reliable than the decision "$x_2 = +1$" because of $|L_{\rm K}(y_1)| > |L_{\rm K}(y_2)|$ &nbsp; &#8658; &nbsp; <u>proposed solution 4</u> is also correct.
# However, the decision "$x_1 = +1$" is less reliable than the decision "$x_3 = \, -1$", because $|L_{\rm K}(y_1)|$ is smaller than $|L_{\rm K}(y_3)|$ &nbsp; &#8658; &nbsp; proposed solution 5 is incorrect.
  
  
This can also be interpreted as follows: &nbsp; The quotient between the red and the blue PDF value at $y_3 = \, -1.5$ is larger than the quotient between the blue and the red PDF value at $y_1 = +1$.
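This interpretation can be verified numerically; the sketch below (the helper name `f` is our own shorthand) compares the two density quotients for channel $\rm A$ with $\sigma = 1$:

```python
import math

def f(y, mean, sigma=1.0):
    """Conditional Gaussian density; sigma = 1 is assumed for channel A."""
    return math.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

# Red/blue quotient at y3 = -1.5 versus blue/red quotient at y1 = +1:
q3 = f(-1.5, -1) / f(-1.5, +1)   # = e^3
q1 = f(+1, +1) / f(+1, -1)       # = e^2
assert q3 > q1
```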
  
  
  
'''(4)'''&nbsp; Following the same considerations as in subtask '''(2)''', the standard deviation of '''channel B''' is &nbsp; $\sigma = 1/2 \ \Rightarrow \ K_{\rm L} = 2/\sigma^2 \ \underline{= 8}$.
  
  
  
'''(5)'''&nbsp; For '''channel B''', the following applies: &nbsp; $L_{\rm K}(y_1 = +1.0) = +8, \ L_{\rm K}(y_2 = +0.5) = +4$ and $L_{\rm K}(y_3 = \, -1.5) = \, -12$.  
  
*It is obvious that <u>the first two proposed solutions</u> apply, but not the third, because

:$$|L_{\rm K}(y_3 = -1.5, {\rm channel\hspace{0.15cm} A)}| = 3
\hspace{0.5cm} <\hspace{0.5cm}
|L_{\rm K}(y_2 = 0.5, {\rm channel\hspace{0.15cm} B)}| = 4\hspace{0.05cm} . $$
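The comparison behind these statements can be summarized in a few lines of Python (a sketch under the assumption $L_{\rm K}(y) = K_{\rm L} \cdot y$, with $K_{\rm L} = 2$ for channel $\rm A$ and $K_{\rm L} = 8$ for channel $\rm B$):

```python
# LLR values for the three received samples on both channels:
llr_A = {y: 2 * y for y in (1.0, 0.5, -1.5)}
llr_B = {y: 8 * y for y in (1.0, 0.5, -1.5)}

# Same signs -> identical hard decisions on both channels:
assert all((llr_A[y] > 0) == (llr_B[y] > 0) for y in llr_A)

# |L_K| comparison that disproves the third statement:
assert abs(llr_A[-1.5]) < abs(llr_B[0.5])   # 3 < 4
```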
 
{{ML-Fuß}}
  

Revision as of 17:36, 27 October 2022
