Exercise 4.9Z: Is Channel Capacity C ≡ 1 possible with BPSK?

[[File:EN_Inf_Z_4_9.png|right|frame|Two different PDFs&nbsp; $f_N(n)$&nbsp; for the impairments&nbsp; (e.g. noise)]]
  
We assume here a binary bipolar source signal &nbsp;  &#8658; &nbsp; $ x \in X = \{+1, -1\}$.  
  
Thus,&nbsp;  the probability density function&nbsp; $\rm (PDF)$&nbsp; of the source is:
 
:$$f_X(x) = {1}/{2} \cdot \delta (x-1) + {1}/{2} \cdot \delta (x+1)\hspace{0.05cm}.  $$
 
The mutual information between the source&nbsp; $X$&nbsp; and the sink&nbsp; $Y$&nbsp; can be calculated according to the equation
 
:$$I(X;Y) = h(Y) - h(N)$$
 
where:
* $h(Y)$&nbsp; denotes the&nbsp; '''differential sink entropy'''
 
:$$h(Y) =  \hspace{0.1cm} - \hspace{-0.45cm} \int\limits_{{\rm supp}(f_Y)} \hspace{-0.35cm}  f_Y(y) \cdot {\rm log}_2 \hspace{0.1cm} \big[ f_Y(y) \big] \hspace{0.1cm}{\rm d}y \hspace{0.05cm},$$
:$${\rm with}\hspace{0.5cm} f_Y(y) = {1}/{2} \cdot  \big[ f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.05cm}|\hspace{0.05cm}{X}=-1) + f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.05cm}|\hspace{0.05cm}{X}=+1)  \big ]\hspace{0.05cm}.$$
* $h(N)$&nbsp;  gives the&nbsp; '''differential noise entropy'''&nbsp; computable from the PDF&nbsp; $f_N(n)$&nbsp; alone:
 
:$$h(N) =  \hspace{0.1cm} - \hspace{-0.45cm} \int\limits_{{\rm supp}(f_N)} \hspace{-0.35cm}  f_N(n) \cdot {\rm log}_2 \hspace{0.1cm} \big[ f_N(n) \big] \hspace{0.1cm}{\rm d}n \hspace{0.05cm}.$$
  
Assuming a Gaussian distribution &nbsp;$f_N(n)$&nbsp; for the noise &nbsp;$N$&nbsp; according to the upper sketch,&nbsp; we obtain the channel capacity &nbsp;$C_\text{BPSK} = I(X;Y)$,&nbsp; which is shown in the&nbsp; [[Information_Theory/AWGN_Channel_Capacity_for_Discrete_Input#AWGN_channel_capacity_for_binary_input_signals|theory section]]&nbsp; as a function of &nbsp;$10 \cdot \lg (E_{\rm B}/{N_0})$.
  
The question to be answered is whether there is a finite &nbsp;$E_{\rm B}/{N_0}$&nbsp; value for which&nbsp; $C_\text{BPSK}(E_{\rm B}/{N_0})  &equiv; 1 \ \rm bit/channel\:use $&nbsp; is possible &nbsp; &#8658; &nbsp; subtask&nbsp; '''(5)'''.
  
In subtasks&nbsp; '''(1)'''&nbsp; to&nbsp; '''(4)''',&nbsp; preliminary work is done to answer this question.&nbsp; The uniformly distributed noise PDF&nbsp; $f_N(n)$&nbsp; is always assumed (see sketch below):
 
:$$f_N(n) = \left\{ \begin{array}{c} 1/(2A) \\  0 \\  \end{array} \right. \begin{array}{*{20}c}  {\rm{for}} \hspace{0.3cm} |\hspace{0.05cm}n\hspace{0.05cm}| < A, \\    {\rm{for}} \hspace{0.3cm} |\hspace{0.05cm}n\hspace{0.05cm}| > A. \\ \end{array} $$
  
  
 
Hints:
*The exercise belongs to the chapter&nbsp; [[Information_Theory/AWGN_Channel_Capacity_for_Discrete_Input|AWGN channel capacity for discrete input]].
*Reference is made in particular to the page&nbsp; [[Information_Theory/AWGN_Channel_Capacity_for_Discrete_Input#AWGN_channel_capacity_for_binary_input_signals|AWGN channel capacity for binary input signals]].
 
 
   
 
   
  
 
    
 
    
  
===Questions===
  
 
<quiz display=simple>
 
  
{ What is the differential noise entropy for the uniform PDF&nbsp; $f_N(n)$&nbsp; with&nbsp; $\underline{A = 1/8}$?
 
|type="{}"}
 
$h(N) \ = \ $ { -2.06--1.94 } $\ \rm bit/symbol$
  
{What is the differential sink entropy for the uniform PDF&nbsp; $f_N(n)$&nbsp; with&nbsp; $\underline{A = 1/8}$?
 
|type="{}"}
 
$h(Y) \ = \ $ { -1.03--0.97 }  $\ \rm bit/symbol$
  
{What is the mutual information between source and sink?&nbsp; Continue to assume uniformly distributed noise with&nbsp; $\underline{A = 1/8}$.
 
|type="{}"}
 
$I(X;Y) \ = \ $ { 1 3% } $\ \rm bit/symbol$
  
  
{Under what conditions does the result of subtask&nbsp; '''(3)'''&nbsp; not change?
 
|type="[]"}
 
+ For every&nbsp; $A &#8804; 1$&nbsp; with the given uniform distribution.
+ For any other PDF&nbsp; $f_N(n)$&nbsp; that is limited to the range&nbsp; $|\hspace{0.05cm}n\hspace{0.05cm}| < 1$.
+ If &nbsp;$f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.08cm}|\hspace{0.05cm}{X}=-1)$&nbsp; and &nbsp;$f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.08cm}|\hspace{0.05cm}{X}=+1)$&nbsp; do not overlap.
  
  
{Now answer the crucial question,&nbsp; assuming that the noise is Gaussian and the quotient &nbsp;$E_{\rm B}/{N_0}$&nbsp; is finite.
 
|type="()"}
 
- $C_\text{BPSK}(E_{\rm B}/{N_0})  &equiv; 1 \ \rm bit/channel\:use $&nbsp; is possible with a Gaussian PDF.
+ For Gaussian noise with finite &nbsp;$E_{\rm B}/{N_0}$,&nbsp; $C_\text{BPSK}(E_{\rm B}/{N_0})  < 1 \ \rm bit/channel\:use $&nbsp; always holds.
  
  
 
</quiz>
 
  
===Solution===
 
{{ML-Kopf}}
 
'''(1)'''&nbsp; The differential entropy of a uniform distribution of absolute width&nbsp; $2A$&nbsp; is equal to
 
:$$ h(N) =  {\rm log}_2 \hspace{0.1cm} (2A) \hspace{0.3cm}\Rightarrow \hspace{0.3cm} A=1/8\hspace{-0.05cm}: \hspace{0.15cm}h(N) =  {\rm log}_2 \hspace{0.1cm} (1/4) \hspace{0.15cm}\underline{= -2\,{\rm bit/symbol}}\hspace{0.05cm}.$$
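The result can also be cross-checked numerically.&nbsp; The following Python sketch&nbsp; (not part of the original solution;&nbsp; grid and variable names are arbitrary choices)&nbsp; approximates the entropy integral with the convention&nbsp; $0 \cdot {\rm log}\hspace{0.1cm} 0 = 0$&nbsp; outside the support:
<syntaxhighlight lang="python">
import numpy as np

A = 1/8
f_N = lambda n: np.where(np.abs(n) < A, 1/(2*A), 0.0)  # uniform PDF over (-A, +A)

n, dn = np.linspace(-0.5, 0.5, 400_001, retstep=True)  # fine grid covering the support
f = f_N(n)
integrand = np.zeros_like(f)                           # 0·log 0 = 0 outside supp(f_N)
mask = f > 0
integrand[mask] = f[mask] * np.log2(f[mask])
print(-np.sum(integrand) * dn)                         # ≈ -2.0  =  log2(2A) = log2(1/4)
</syntaxhighlight>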
 
 
 
 
  
[[File:EN_Inf_Z_4_9e_neu.png|right|frame|PDF of the output variable &nbsp;$Y$&nbsp;  <br>with uniformly distributed noise &nbsp;$N$]]
'''(2)'''&nbsp; The probability density function at the output is obtained according to the equation:
:$$f_Y(y) = {1}/{2} \cdot \big [ f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.05cm}|\hspace{0.05cm}x=-1) + f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.05cm}|\hspace{0.05cm}x=+1) \big  ]\hspace{0.05cm}.$$
The graph shows the result for our example&nbsp; $(A = 1/8)$:
*Drawn in red is the first term&nbsp; ${1}/{2} \cdot  f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.05cm}|\hspace{0.05cm}-1)$,&nbsp; where the rectangle&nbsp; $f_N(n)$&nbsp; is shifted to the center position&nbsp; $y = -1$&nbsp; and multiplied by&nbsp; $1/2$.&nbsp; The result is a rectangle of width&nbsp; $2A = 1/4$&nbsp; and height&nbsp; $1/(4A) = 2$.
*Shown in blue is the second term&nbsp; ${1}/{2} \cdot  f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.05cm}|\hspace{0.05cm}+1)$&nbsp; centered at&nbsp; $y = +1$.
*Disregarding the colors,&nbsp; the total PDF&nbsp; $f_Y(y)$&nbsp; is obtained.
*The differential entropy is not changed by shifting non-overlapping PDF sections.
*Thus we obtain for the differential sink entropy:
 
:$$h(Y) =  {\rm log}_2 \hspace{0.1cm} (4A) \hspace{0.3cm}\Rightarrow \hspace{0.3cm} A=1/8\hspace{-0.05cm}: \hspace{0.15cm}h(Y) =  {\rm log}_2 \hspace{0.1cm} (1/2) \hspace{0.15cm}\underline{= -1\,{\rm bit/symbol}}\hspace{0.05cm}.$$
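The same kind of numerical cross-check works for&nbsp; $h(Y)$.&nbsp; This sketch&nbsp; (again an illustration only, with arbitrary grid choices)&nbsp; builds the two shifted rectangles directly:
<syntaxhighlight lang="python">
import numpy as np

A = 1/8
# f_Y: two non-overlapping rectangles of width 2A and height 1/(4A) = 2,
# i.e. f_N shifted to y = -1 and y = +1, each weighted by 1/2
f_Y = lambda y: np.where((np.abs(y + 1) < A) | (np.abs(y - 1) < A), 1/(4*A), 0.0)

y, dy = np.linspace(-1.5, 1.5, 600_001, retstep=True)
f = f_Y(y)
integrand = np.zeros_like(f)                           # again 0·log 0 = 0
mask = f > 0
integrand[mask] = f[mask] * np.log2(f[mask])
print(-np.sum(integrand) * dy)                         # ≈ -1.0  =  log2(4A) = log2(1/2)
</syntaxhighlight>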
 
 
  
 
   
 
   
'''(3)'''&nbsp; Thus, for the mutual information between source and sink, we obtain:
:$$I(X; Y) = h(Y) \hspace{-0.05cm}-\hspace{-0.05cm} h(N) = (-1\,{\rm bit/symbol})\hspace{-0.05cm} -\hspace{-0.05cm}(-2\,{\rm bit/symbol}) \hspace{0.15cm}\underline{= +1\,{\rm bit/symbol}}\hspace{0.05cm}.$$
 
 
  
  
'''(4)'''&nbsp; <u>All the proposed solutions</u>&nbsp; are correct:
*For every&nbsp; $A &#8804; 1$&nbsp; the following holds:
 
:$$ h(Y)  =    {\rm log}_2 \hspace{0.1cm} (4A) = {\rm log}_2 \hspace{0.1cm} (2A) + {\rm log}_2 \hspace{0.1cm} (2)\hspace{0.05cm}, \hspace{0.5cm} h(N)  =    {\rm log}_2 \hspace{0.1cm} (2A)$$
:$$\Rightarrow \hspace{0.3cm} I(X; Y) = h(Y) \hspace{-0.05cm}- \hspace{-0.05cm}h(N) = {\rm log}_2 \hspace{0.1cm} (2) \hspace{0.15cm}\underline{= +1\,{\rm bit/symbol}}\hspace{0.05cm}.$$
[[File:EN_Inf_Z_4_9e.png|right|frame|PDF of the output variable &nbsp;$Y$&nbsp;  <br>with Gaussian noise &nbsp;$N$]]
*This principle does not change for any other PDF&nbsp; $f_N(n)$,&nbsp; as long as the noise is limited to the range&nbsp; $|\hspace{0.05cm}n\hspace{0.05cm}| &#8804; 1$.
*However,&nbsp; if the two conditional probability density functions overlap,&nbsp; the result is a smaller value for&nbsp; $h(Y)$&nbsp; than calculated above,&nbsp; and thus also a smaller mutual information.
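That the mutual information stays at exactly one bit for every&nbsp; $A \le 1$&nbsp; can also be verified numerically.&nbsp; A short sketch&nbsp; (the helper&nbsp; <code>h_bits</code>&nbsp; and the grid are illustrative choices, not part of the original solution):
<syntaxhighlight lang="python">
import numpy as np

def h_bits(pdf, grid, dx):
    """Differential entropy in bit: -∫ f·log2(f), with the convention 0·log 0 = 0."""
    f = pdf(grid)
    integrand = np.zeros_like(f)
    mask = f > 0
    integrand[mask] = f[mask] * np.log2(f[mask])
    return -np.sum(integrand) * dx

g, dg = np.linspace(-2.5, 2.5, 1_000_001, retstep=True)
for A in (1/8, 1/4, 1/2, 1.0):
    f_N = lambda n: np.where(np.abs(n) < A, 1/(2*A), 0.0)
    f_Y = lambda y: 0.5*(f_N(y + 1) + f_N(y - 1))      # mixture of the two shifted noise PDFs
    print(f"A = {A}:  I(X;Y) ≈ {h_bits(f_Y, g, dg) - h_bits(f_N, g, dg):.3f} bit")
</syntaxhighlight>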
  
  
  
'''(5)'''&nbsp; Correct is the&nbsp; <u>proposed solution 2</u>:
* The Gaussian function decays very fast,&nbsp; but it never becomes exactly zero.
* Hence the conditional density functions&nbsp; $f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.08cm}|\hspace{0.05cm}x=-1)$&nbsp; and&nbsp;  $f_{Y\hspace{0.05cm}|\hspace{0.05cm}{X}}(y\hspace{0.08cm}|\hspace{0.05cm}x=+1)$&nbsp; always overlap.
* According to subtask&nbsp; '''(4)''',&nbsp; $C_\text{BPSK}(E_{\rm B}/{N_0})  &equiv; 1 \ \rm bit/channel\:use $&nbsp; is therefore not possible.
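This behaviour can be illustrated numerically as well.&nbsp; The sketch below parametrizes the channel by the noise rms&nbsp; $\sigma$&nbsp; rather than by&nbsp; $E_{\rm B}/N_0$&nbsp; (the conversion between the two is not needed for the qualitative statement;&nbsp; function name and grid are illustrative choices):&nbsp; the mutual information approaches&nbsp; $1 \ \rm bit/channel\:use$&nbsp; for small&nbsp; $\sigma$,&nbsp; but never reaches it.
<syntaxhighlight lang="python">
import numpy as np

def C_bpsk(sigma):
    """Numerical I(X;Y) in bit for X = ±1 plus Gaussian noise of rms sigma."""
    y, dy = np.linspace(-1 - 10*sigma, 1 + 10*sigma, 400_001, retstep=True)
    g = lambda m: np.exp(-(y - m)**2 / (2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)
    f_Y = 0.5*(g(-1.0) + g(+1.0))                      # mixture of two Gaussians
    h_Y = -np.sum(f_Y * np.log2(f_Y)) * dy             # differential sink entropy (numerical)
    h_N = 0.5*np.log2(2*np.pi*np.e*sigma**2)           # Gaussian noise entropy (closed form)
    return h_Y - h_N

for sigma in (1.0, 0.6, 0.4, 0.3):
    print(f"sigma = {sigma}:  C ≈ {C_bpsk(sigma):.4f} bit")   # strictly below 1 bit
</syntaxhighlight>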
 
  
 
{{ML-Fuß}}
 
