Exercise 3.11: Erasure Channel

{{quiz-Header|Buchseite=Information_Theory/Application_to_Digital_Signal_Transmission
}}
[[File:P_ID2791__Inf_A_3_10.png|right|frame|Erasure channel with <br>four inputs and five outputs]]
We consider an erasure channel with
* the&nbsp; $M$&nbsp; inputs&nbsp; $x ∈ X = \{1,\ 2, \ \text{...} \  ,\ M\}$,&nbsp; and
* the&nbsp; $M + 1$&nbsp; outputs&nbsp; $y ∈ Y = \{1,\ 2,\ \ \text{...} \  ,\ M,\ \text{E}\}.$
  
  
The graphic shows the model for the special case&nbsp; $M = 4$.&nbsp; The sink symbol&nbsp; $y = \text{E}$&nbsp; accounts for an&nbsp; "erasure",&nbsp; i.e. the case that the receiver cannot make a sufficiently reliable decision.
  
The transition probabilities are given for&nbsp; $1 ≤ μ ≤ M$&nbsp; as follows:
 
:$${\rm Pr}(Y \hspace{-0.05cm} = \mu\hspace{-0.05cm}\mid \hspace{-0.05cm} X \hspace{-0.05cm}= \mu) = 1-\lambda \hspace{0.05cm},$$

:$${\rm Pr}(Y \hspace{-0.05cm} = {\rm E}\hspace{-0.05cm}\mid \hspace{-0.05cm} X \hspace{-0.05cm}= \mu) = \lambda \hspace{0.05cm}.$$
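
In matrix form the channel is fully described by these transition probabilities.&nbsp; A minimal Python/numpy sketch of the model (the names&nbsp; <code>mec_transition_matrix</code>&nbsp; and&nbsp; <code>lam</code>&nbsp; are chosen here freely for illustration):

<pre>
import numpy as np

def mec_transition_matrix(M: int, lam: float) -> np.ndarray:
    """Pr(Y = kappa | X = mu) for the M-ary erasure channel.

    Rows: inputs mu = 1, ..., M; columns: outputs 1, ..., M plus the erasure E.
    """
    P = np.zeros((M, M + 1))
    P[np.arange(M), np.arange(M)] = 1 - lam   # "correct paths" (red in the diagram)
    P[:, M] = lam                             # erasures (blue in the diagram)
    return P

P = mec_transition_matrix(M=4, lam=0.2)
assert np.allclose(P.sum(axis=1), 1.0)        # each row is a probability distribution
</pre>
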
We are looking for:
* the capacity &nbsp;$C_{M\rm –EC}$&nbsp; of this M–ary Erasure Channel,
* the capacity &nbsp;$C_{\rm BEC}$&nbsp; of the&nbsp; [[Channel_Coding/Kanalmodelle_und_Entscheiderstrukturen#Binary_Erasure_Channel_.E2.80.93_BEC|Binary Erasure Channel]]&nbsp; as a special case of the above model.
  
  
  
  
Hints:
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Anwendung_auf_die_Digitalsignalübertragung|Application to Digital Signal Transmission]].
*Reference is made in particular to the page&nbsp;    [[Information_Theory/Anwendung_auf_die_Digitalsignalübertragung#Information-theoretical_model_of_digital_signal_transmission|Information-theoretical model of digital signal transmission]].
*In the above diagram,&nbsp; "erasures"&nbsp; $($with probability&nbsp; $λ)$&nbsp; are drawn in blue.
* "Correct transmission paths"&nbsp; $($i.e., from&nbsp; $X = μ$&nbsp; to&nbsp; $Y = μ)$&nbsp; are shown in red &nbsp;($1 ≤ μ ≤ M$).
 
   
 
   
  
  
  
===Questions===
  
 
<quiz display=simple>
{ Which&nbsp; $P_X(X)$&nbsp; should generally be assumed for the channel capacity calculation?
 
|type="[]"}
 
|type="[]"}
 
- $P_X(X) = (0.5, \  0.5),$
+ $P_X(X) = (1/M,\ 1/M, \ \text{...} \ ,\ 1/M),$
 
- $P_X(X) = (0.1,\ 0.2,\ 0.3,\ 0.4).$
  
{How many probabilities&nbsp; $p_{μκ} = {\rm Pr}\big[(X = μ) ∩ (Y = κ)\big]$&nbsp; are nonzero?
 
|type="()"}
 
|type="()"}
- Exactly&nbsp; $M · (M + 1)$,
- Exactly&nbsp; $M$,
+ Exactly&nbsp; $2 · M$.
  
  
{What is the sink entropy in general and for &nbsp;$M = 4$&nbsp; and &nbsp;$λ = 0.2$?
 
|type="{}"}
 
|type="{}"}
 
$H(Y) \ = \ $  { 2.322 3% } $\ \rm bit$
  
{Calculate the irrelevance.&nbsp; What is the value for &nbsp;$M = 4$&nbsp; and &nbsp;$λ = 0.2$?
 
|type="{}"}
 
|type="{}"}
 
$H(Y|X) \ = \ $ { 0.722 3% } $\ \rm bit$
  
{What is the channel capacity &nbsp;$C$&nbsp; as a function of &nbsp;$M$?
 
|type="{}"}
 
|type="{}"}
 
$M = 4\text{:} \hspace{0.5cm}  C\ = \ $ { 1.6 3% } $\ \rm bit$

$M = 2\text{:} \hspace{0.5cm}  C\ = \ $ { 0.8 3% } $\ \rm bit$
  
{What is the channel capacity of the BEC channel in compact form?  
 
|type="()"}
 
|type="()"}
 
+ $C_{\rm BEC} = 1 - λ,$
- $C_{\rm BEC} = 1 - H_{\rm bin}(λ).$
 
</quiz>
  
===Solution===
 
{{ML-Kopf}}
'''(1)'''&nbsp;  <u>Proposed solution 2</u> is correct:
* Due to the symmetry of the transition probabilities&nbsp; $P_{Y|X}(Y|X)$,&nbsp; it is obvious that a uniform distribution leads to the maximum mutual information&nbsp; $I(X; Y)$&nbsp; and thus to the channel capacity&nbsp; $C$:
 
:$$ P_X(X) = P_X\big ( \hspace{0.03cm}X\hspace{-0.03cm}=1\hspace{0.03cm}, \hspace{0.08cm} X\hspace{-0.03cm}=2\hspace{0.03cm},\hspace{0.08cm}\text{...}\hspace{0.08cm}, X\hspace{-0.03cm}=M\hspace{0.03cm}\big ) = \big [\hspace{0.03cm}1/M\hspace{0.03cm}, \hspace{0.08cm} 1/M\hspace{0.03cm},\hspace{0.03cm}\text{...}\hspace{0.08cm},\hspace{0.08cm} 1/M\hspace{0.03cm}\big ]\hspace{0.05cm}.$$
*In the special case&nbsp; $M = 2$,&nbsp; $P_X(X) = (0.5, \ 0.5)$&nbsp; would also be correct.
  
  
  
'''(2)'''&nbsp;  <u>Proposed solution 3</u> is correct, i.e. exactly&nbsp; $2M$&nbsp; connections, because:
*From each source symbol&nbsp; $X = μ$&nbsp; one gets to the sink symbol&nbsp; $Y = μ$&nbsp; as well as to the erasure&nbsp; $Y = \text{E}$.
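
This count can also be checked numerically.&nbsp; A short sketch under the same assumptions as the first one (the uniform input distribution anticipates the result of subtask&nbsp; '''(1)'''; all names are illustrative):

<pre>
import numpy as np

M, lam = 4, 0.2
P = np.zeros((M, M + 1))                     # transition matrix as in the first sketch
P[np.arange(M), np.arange(M)] = 1 - lam
P[:, M] = lam

PX = np.full(M, 1 / M)                       # uniform input distribution
joint = PX[:, None] * P                      # p_{mu,kappa} = Pr[(X = mu) and (Y = kappa)]
print(np.count_nonzero(joint))               # 8, i.e. exactly 2*M nonzero entries
</pre>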
  
 
   
 
   
  
'''(3)'''&nbsp;  All probabilities&nbsp; ${\rm Pr}(Y = 1), \hspace{0.05cm} \text{...}\hspace{0.05cm} , \hspace{0.08cm}{\rm Pr}(Y = M)$&nbsp; are equal.&nbsp; Thus, for&nbsp; $μ = 1, \hspace{0.05cm} \text{...} \hspace{0.05cm} , \hspace{0.08cm} M$ we obtain:
 
:$${\rm Pr}(Y \hspace{-0.05cm} = \mu) = ( 1-\lambda)/M \hspace{0.05cm}.$$
*Moreover, from each source symbol&nbsp; $X = 1, \hspace{0.05cm} \text{...}\hspace{0.05cm}  , X = M$&nbsp; one also gets to the erasure&nbsp; $Y = \text{E}$:
 
:$${\rm Pr}(Y \hspace{-0.05cm} = {\rm E}) = \lambda \hspace{0.05cm}.$$
*A check confirms that the sum of all&nbsp; $M + 1$&nbsp; sink symbol probabilities indeed adds up to&nbsp; $1$.
*For the sink entropy it follows:
 
:$$H(Y) = M \cdot \frac{ 1-\lambda }{M} \cdot {\rm log}_2 \hspace{0.1cm} \frac{M}{1 - \lambda} \hspace{0.15cm}+\hspace{0.15cm} \lambda \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{\lambda} \hspace{0.05cm}.$$
*Expressed with the binary entropy function, this gives:
 
:$$H(Y) = (1-\lambda) \cdot {\rm log}_2 \hspace{0.1cm} M \hspace{0.15cm}+\hspace{0.15cm} H_{\rm bin} (\lambda ) \hspace{0.05cm}$$
:and with&nbsp; $M = 4$&nbsp; as well as&nbsp; $λ = 0.2$:
 
:$$H(Y) = 1.6 \,{\rm bit} + H_{\rm bin} (0.2 ) \hspace{0.15cm} \underline {=2.322\,{\rm bit}} \hspace{0.05cm}.$$
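
A numerical cross-check of this value&nbsp; (a sketch; note that for&nbsp; $M = 4$&nbsp; and&nbsp; $λ = 0.2$&nbsp; all five sink symbols happen to be equally probable, so that&nbsp; $H(Y) = \log_2 5$):

<pre>
from math import log2

M, lam = 4, 0.2
PY = [(1 - lam) / M] * M + [lam]             # here: five times 0.2
HY = -sum(p * log2(p) for p in PY)
print(HY)                                    # 2.3219... bit  (= log2(5))

Hbin = lambda p: -p*log2(p) - (1-p)*log2(1-p)
print((1 - lam) * log2(M) + Hbin(lam))       # same value via the closed form
</pre>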
  
  
  
'''(4)'''&nbsp;  The&nbsp; $2M$&nbsp; joint probabilities
 
:$${\rm Pr} \big[(X = μ) ∩ (Y = κ)\big] ≠ 0$$
:and the conditional probabilities
 
:$$p_{κ|μ} = {\rm Pr}(Y = κ\hspace{0.03cm}|\hspace{0.03cm}X = μ)$$
:show the following properties:  
#&nbsp; The combination&nbsp; $p_{μκ} = (1 – λ)/M$  &nbsp;and&nbsp;  $p_{κ|μ} = 1 – λ$&nbsp; occurs&nbsp; $M$&nbsp; times.
#&nbsp; The combination&nbsp; $p_{μκ} = λ/M$  &nbsp;and&nbsp;  $p_{κ|μ} = λ$&nbsp; also occurs $M$ times.
  
  
It follows that:
 
:$$ H(Y \hspace{-0.15cm}\mid \hspace{-0.15cm} X) \hspace{-0.01cm}  =\hspace{-0.01cm} M \cdot \frac{ 1-\lambda }{M} \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{1 - \lambda} \hspace{0.15cm}+\hspace{0.15cm}M \cdot \frac{ \lambda }{M} \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{ \lambda} = ( 1-\lambda) \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{1 - \lambda} \hspace{0.15cm}+\hspace{0.15cm} \lambda \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{ \lambda} = H_{\rm bin} (\lambda)\hspace{0.05cm}.$$
*The result is independent of&nbsp; $M$.&nbsp; With&nbsp; $λ = 0.2$&nbsp; we obtain:
 
:$$H(Y \hspace{-0.1cm}\mid \hspace{-0.1cm} X) = H_{\rm bin} (0.2 ) \hspace{0.15cm} \underline {=0.722\,{\rm bit}} \hspace{0.05cm}.$$
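
That&nbsp; $H(Y|X)$&nbsp; does not depend on&nbsp; $M$&nbsp; can be made plausible with a few lines of Python&nbsp; (a sketch, names illustrative):

<pre>
from math import log2

lam = 0.2
for M in (2, 4, 8, 16):
    # 2M nonzero terms; the factor M cancels out of both sums
    HYX = M * ((1 - lam)/M) * log2(1/(1 - lam)) + M * (lam/M) * log2(1/lam)
    print(M, round(HYX, 4))                  # 0.7219 bit for every M, i.e. H_bin(0.2)
</pre>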
  
  
  
'''(5)'''&nbsp;  The channel capacity&nbsp; $C$&nbsp; is equal to the maximum mutual information&nbsp; $I(X; Y)$,&nbsp; where the maximization with respect to&nbsp; $P_X(X)$&nbsp; has already been accounted for by the symmetric approach:
 
:$$ C = \max_{P_X(X)} \hspace{0.15cm} I(X;Y) = H(Y) - H(Y \hspace{-0.1cm}\mid \hspace{-0.1cm} X) = ( 1-\lambda) \cdot {\rm log}_2 \hspace{0.1cm} M + H_{\rm bin} (\lambda) - H_{\rm bin} (\lambda) = ( 1-\lambda) \cdot {\rm log}_2 \hspace{0.1cm} M \hspace{0.05cm}$$
 
:$$\Rightarrow \hspace{0.3cm} M = 4\text{:} \hspace{0.3cm} \underline {C=1.6\,\,{\rm bit}} \hspace{0.05cm}, \hspace{0.8cm} M = 2\text{:} \hspace{0.3cm} \underline {C=0.8\,\,{\rm bit}} \hspace{0.05cm}.$$
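
A minimal sketch of this closed-form capacity&nbsp; (the function name&nbsp; <code>C_mec</code>&nbsp; is chosen here for illustration):

<pre>
from math import log2

def C_mec(M: int, lam: float) -> float:
    """Capacity of the M-ary erasure channel in bit: (1 - lambda) * log2(M)."""
    return (1 - lam) * log2(M)

print(C_mec(4, 0.2))                         # 1.6 bit
print(C_mec(2, 0.2))                         # 0.8 bit
</pre>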
  
  
'''(6)'''&nbsp;  The&nbsp; "Binary Erasure Channel"&nbsp; $\rm (BEC)$&nbsp; is a special case of the general model considered here with&nbsp; $M = 2$:
 
:$$C_{\rm BEC} = 1-\lambda \hspace{0.05cm}.$$
*Thus,&nbsp; <u>proposed solution 1</u>&nbsp; is correct.
*The second proposed solution, on the other hand, applies to the&nbsp; "Binary Symmetric Channel"&nbsp; $\rm (BSC)$&nbsp; with crossover probability&nbsp; $λ$.
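
For comparison, a short sketch contrasting the two proposed formulas&nbsp; (only the first one describes the BEC; the second would be the BSC capacity):

<pre>
from math import log2

lam = 0.2
Hbin = lambda p: -p*log2(p) - (1-p)*log2(1-p)

print(1 - lam)                               # 0.8 bit:    C_BEC (proposed solution 1)
print(1 - Hbin(lam))                         # 0.2781 bit: C_BSC (proposed solution 2)
</pre>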
  
 
{{ML-Fuß}}
  
  
[[Category:Information Theory: Exercises|^3.3 Application to Digital Signal Transmission^]]
