Exercise 3.11Z: Extremely Asymmetrical Channel
  
{{quiz-Header|Buchseite=Information_Theory/Application_to_Digital_Signal_Transmission
}}

[[File:P_ID2800__Inf_Z_3_10.png|right|frame|One-sided distorting channel]]
The channel shown opposite with the following properties is considered:
* The symbol&nbsp; $X = 0$&nbsp; is always transmitted correctly and always leads to the result&nbsp; $Y = 0$.
* The symbol&nbsp; $X = 1$&nbsp; is maximally distorted.
 
From the point of view of information theory, this means:

:$${\rm Pr}(Y \hspace{-0.05cm} = 0\hspace{-0.05cm}\mid \hspace{-0.05cm} X \hspace{-0.05cm}= 1) ={\rm Pr}(Y \hspace{-0.05cm} = 1\hspace{-0.05cm}\mid \hspace{-0.05cm} X \hspace{-0.05cm}= 1) = 0.5 \hspace{0.05cm}.$$

In this exercise, the following are to be determined:

* the mutual information&nbsp; $I(X; Y)$&nbsp; for&nbsp; $P_X(0) = p_0 = 0.4$&nbsp; and&nbsp; $P_X(1) = p_1 = 0.6$.&nbsp; <br>In general:

:$$ I(X;Y) = H(X) - H(X \hspace{-0.1cm}\mid \hspace{-0.1cm} Y) = H(Y) - H(Y \hspace{-0.1cm}\mid \hspace{-0.1cm} X) = H(X) + H(Y)- H(XY)\hspace{0.05cm},$$

* the channel capacity:

:$$ C = \max_{P_X(X)} \hspace{0.15cm} I(X;Y) \hspace{0.05cm}.$$
  


Hints:
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Anwendung_auf_die_Digitalsignalübertragung|Application to digital signal transmission]].
*Reference is made in particular to the page&nbsp; [[Information_Theory/Anwendung_auf_die_Digitalsignalübertragung#Channel_capacity_of_a_binary_channel|Channel capacity of a binary channel]].
*In&nbsp; [[Aufgaben:Exercise_3.14:_Channel_Coding_Theorem|Exercise 3.14]]&nbsp; the results found here are to be interpreted in comparison to the BSC channel.


===Questions===
  
 
<quiz display=simple>

{Calculate the source entropy in general and for&nbsp; $\underline{p_0 = 0.4}$.
|type="{}"}
$H(X) \ = \ $ { 0.971 3% } $\ \rm bit$

{Calculate the sink entropy in general and for&nbsp; $p_0 = 0.4$.
|type="{}"}
$H(Y) \ = \ $ { 0.881 3% } $\ \rm bit$

{Calculate the joint entropy in general and for&nbsp; $p_0 = 0.4$.
|type="{}"}
$H(XY) \ = \ $ { 1.571 3% } $\ \rm bit$

{Calculate the mutual information in general and for&nbsp; $p_0 = 0.4$.
|type="{}"}
$I(X; Y) \ = \ $ { 0.281 3% } $\ \rm bit$

{What probability&nbsp; $p_0^{(*)}$&nbsp; leads to channel capacity&nbsp; $C$?
|type="{}"}
$p_0^{(*)} \ = \ $ { 0.6 3% }

{What is the channel capacity of the present channel?
|type="{}"}
$C \ = \ $ { 0.322 3% } $\ \rm bit$

{What are the conditional entropies with&nbsp; $p_0 = p_0^{(*)}$&nbsp; according to subtask&nbsp; '''(5)'''?
|type="{}"}
$H(X|Y) \ = \ $ { 0.649 3% } $\ \rm bit$
$H(Y|X) \ = \ $ { 0.4 3% } $\ \rm bit$
</quiz>
  
  
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; The source entropy is given by the binary entropy function:
:$$H(X) = H_{\rm bin}(p_0)= H_{\rm bin}(0.4) \hspace{0.15cm} \underline {=0.971\,{\rm bit}} \hspace{0.05cm}.$$


'''(2)'''&nbsp; Since&nbsp; $Y = 1$&nbsp; can only result from&nbsp; $X = 1$,&nbsp; and then only with probability&nbsp; $0.5$,&nbsp; the probabilities of the sink symbols are:
:$$P_Y(1) = p_1/2 = (1 - p_0)/2 = 0.3\hspace{0.05cm},\hspace{0.2cm} P_Y(0) = 1-P_Y(1) = (1 + p_0)/2 = 0.7$$
:$$\Rightarrow \hspace{0.3cm} H(Y) = H_{\rm bin}(\frac{1+p_0}{2})= H_{\rm bin}(0.7) \hspace{0.15cm} \underline {=0.881\,{\rm bit}} \hspace{0.05cm}.$$


'''(3)'''&nbsp; The joint probabilities&nbsp; $p_{μκ} = {\rm Pr}\big[(X = μ) ∩ (Y = κ)\big]$&nbsp; are obtained as:
:$$ p_{00} = p_0 \hspace{0.05cm},\hspace{0.3cm} p_{01} = 0 \hspace{0.05cm},\hspace{0.3cm} p_{10} = (1 - p_0)/2 \hspace{0.05cm},\hspace{0.3cm} p_{11} = (1 - p_0)/2$$
:$$\Rightarrow \hspace{0.3cm} H(XY) =p_0 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{ p_0} + 2 \cdot \frac{1-p_0}{2} \cdot {\rm log}_2 \hspace{0.1cm} \frac{2}{ 1- p_0} = p_0 \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{ p_0} + (1-p_0) \cdot {\rm log}_2 \hspace{0.1cm} \frac{1}{ 1- p_0} + (1-p_0) \cdot {\rm log}_2 \hspace{0.1cm} (2)$$
:$$\Rightarrow \hspace{0.3cm}H(XY) =H_{\rm bin}(p_0) + 1 - p_0 \hspace{0.05cm}.$$
*The numerical result for&nbsp; $p_0 = 0.4$&nbsp; is thus:
:$$H(XY) = H_{\rm bin}(0.4) + 0.6 = 0.971 + 0.6 \hspace{0.15cm} \underline {=1.571\,{\rm bit}} \hspace{0.05cm}.$$


'''(4)'''&nbsp; A (possible) equation to calculate the mutual information is:
:$$ I(X;Y) = H(X) + H(Y)- H(XY)\hspace{0.05cm}.$$
*From this, using the results of the first three subtasks, one obtains:
:$$I(X;Y) = H_{\rm bin}(p_0) + H_{\rm bin}(\frac{1+p_0}{2}) - H_{\rm bin}(p_0) -1 + p_0 = H_{\rm bin}(\frac{1+p_0}{2}) -1 + p_0.$$
:$$ \Rightarrow \hspace{0.3cm} p_0 = 0.4 {\rm :}\hspace{0.5cm} I(X;Y) = H_{\rm bin}(0.7) - 0.6 = 0.881 - 0.6 \hspace{0.15cm} \underline {=0.281\,{\rm bit}}\hspace{0.05cm}.$$
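*The numerical values of subtasks&nbsp; '''(1)'''&nbsp; to&nbsp; '''(4)'''&nbsp; can also be verified with a few lines of code.&nbsp; The following Python sketch is not part of the original exercise (the helper name&nbsp; <code>entropies</code>&nbsp; is chosen freely here);&nbsp; it builds the joint probability table of this channel and evaluates&nbsp; $H(X)$,&nbsp; $H(Y)$,&nbsp; $H(XY)$&nbsp; and&nbsp; $I(X;Y)$&nbsp; for&nbsp; $p_0 = 0.4$:
<pre>
import numpy as np

def entropies(p0):
    # Joint probabilities of this channel:
    #   X = 0  ->  Y = 0 with certainty
    #   X = 1  ->  Y = 0 or Y = 1, each with probability 0.5
    p_xy = np.array([[p0,           0.0         ],
                     [(1 - p0) / 2, (1 - p0) / 2]])
    p_x = p_xy.sum(axis=1)                            # source probabilities P_X
    p_y = p_xy.sum(axis=0)                            # sink probabilities   P_Y
    H = lambda p: -sum(q * np.log2(q) for q in np.ravel(p) if q > 0)
    H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy)
    return H_X, H_Y, H_XY, H_X + H_Y - H_XY           # last entry: I(X;Y)

print(entropies(0.4))   # ->  approx. (0.971, 0.881, 1.571, 0.281)
</pre>
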

'''(5)'''&nbsp; The channel capacity&nbsp; $C$&nbsp; is the mutual information&nbsp; $I(X;Y)$&nbsp; for the best possible source symbol probabilities&nbsp; $p_0$&nbsp; and&nbsp; $p_1$.
*Setting the derivative with respect to&nbsp; $p_0$&nbsp; to zero yields the determining equation:
:$$\frac{\rm d}{{\rm d}p_0} \hspace{0.1cm} I(X;Y) = \frac{\rm d}{{\rm d}p_0} \hspace{0.1cm}  H_{\rm bin}(\frac{1+p_0}{2}) +1 \stackrel{!}{=} 0 \hspace{0.05cm}.$$
*With the derivative of the binary entropy function
:$$ \frac{\rm d}{{\rm d}p} \hspace{0.1cm}  H_{\rm bin}(p) = {\rm log}_2 \hspace{0.1cm} \frac{1-p}{ p} \hspace{0.05cm},$$
:and the inner derivative&nbsp; $1/2$&nbsp; resulting from the chain rule, one obtains:
:$${1}/{2} \cdot {\rm log}_2 \hspace{0.1cm} \frac{(1-p_0)/2}{1- (1-p_0)/2} +1 \stackrel{!}{=} 0 \hspace{0.3cm} \Rightarrow \hspace{0.3cm} {1}/{2} \cdot {\rm log}_2 \hspace{0.1cm} \frac{(1-p_0)/2}{(1+p_0)/2} +1 \stackrel{!}{=} 0$$
:$$ \Rightarrow \hspace{0.3cm} {\rm log}_2 \hspace{0.1cm} \frac{1+p_0}{1-p_0} \stackrel{!}{=} 2 \hspace{0.3cm} \Rightarrow \hspace{0.3cm} \frac{1+p_0}{1-p_0} \stackrel{!}{=} 4 \hspace{0.3cm}\Rightarrow \hspace{0.3cm} p_0 \hspace{0.15cm} \underline {=0.6}=p_0^{(*)}\hspace{0.05cm}.$$


'''(6)'''&nbsp; Accordingly, the channel capacity is:
:$$C = I(X;Y) \big |_{p_0 \hspace{0.05cm}=\hspace{0.05cm} 0.6} = H_{\rm bin}(0.8) - 0.4 = 0.722 -0.4 \hspace{0.15cm} \underline {=0.322\,{\rm bit}}\hspace{0.05cm}.$$
*Exercise 3.14 interprets this result in comparison to the BSC channel model.
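*The result of subtasks&nbsp; '''(5)'''&nbsp; and&nbsp; '''(6)'''&nbsp; can be cross-checked by a purely numerical maximization over&nbsp; $p_0$.&nbsp; A minimal Python sketch (not part of the original exercise), using the closed-form expression from subtask&nbsp; '''(4)'''&nbsp; and a simple grid search with arbitrarily chosen resolution:
<pre>
import numpy as np

def mutual_information(p0):
    # Closed form from subtask (4):  I(X;Y) = H_bin((1+p0)/2) - (1 - p0)
    h_bin = lambda p: -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return h_bin((1 + p0) / 2) - 1 + p0

p0_grid = np.linspace(0.01, 0.99, 9801)               # step 0.0001, endpoints avoided
i_grid  = np.array([mutual_information(p0) for p0 in p0_grid])
k = int(np.argmax(i_grid))
print(p0_grid[k], i_grid[k])                          # ->  approx. 0.6  and  0.322 bit
</pre>
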
  
'''(7)'''&nbsp; The equivocation is:
:$$ H(X \hspace{-0.1cm}\mid \hspace{-0.1cm}Y) = H(X) - I(X;Y) = 0.971 -0.322 \hspace{0.15cm} \underline {=0.649\,{\rm bit}}\hspace{0.05cm}.$$
*Because of&nbsp; $H_{\rm bin}(0.4) = H_{\rm bin}(0.6)$&nbsp; the source entropy&nbsp; $H(X)$&nbsp; is the same as in subtask&nbsp; '''(1)'''.
*The sink entropy must be recalculated:&nbsp; with&nbsp; $p_0 = 0.6$&nbsp; one obtains&nbsp; $H(Y) = H_{\rm bin}(0.8) = 0.722\ \rm bit$.
*This gives the irrelevance:
:$$H(Y \hspace{-0.1cm}\mid \hspace{-0.1cm} X) = H(Y) - I(X;Y) = 0.722 -0.322 \hspace{0.15cm} \underline {=0.400\,{\rm bit}}\hspace{0.05cm}.$$
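*Subtask&nbsp; '''(7)'''&nbsp; follows from the previous results by simple subtraction;&nbsp; a short Python check (again only an illustrative sketch, not part of the original solution):
<pre>
import numpy as np

h_bin = lambda p: -p * np.log2(p) - (1 - p) * np.log2(1 - p)
p0   = 0.6                               # capacity-achieving value from subtask (5)
H_X  = h_bin(p0)                         # = h_bin(0.6) = h_bin(0.4) = 0.971 bit
H_Y  = h_bin((1 + p0) / 2)               # = h_bin(0.8) = 0.722 bit
I_XY = H_Y - 1 + p0                      # = C = 0.322 bit
print(H_X - I_XY, H_Y - I_XY)            # equivocation 0.649 bit, irrelevance 0.400 bit
</pre>
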
 
  
 
{{ML-Fuß}}


[[Category:Information Theory: Exercises|^3.3 Application to Digital Signal Transmission^]]
