
Exercise 4.9Z: Is Channel Capacity C ≡ 1 possible with BPSK?

From LNTwww

Two different PDFs  f_N(n)  for the impairments  (e.g. noise)

We assume here a binary bipolar source signal   ⇒   x ∈ X = {+1, −1}.

Thus,  the probability density function  (PDF)  of the source is:

f_X(x) = 1/2 · δ(x − 1) + 1/2 · δ(x + 1).

The mutual information between the source  X  and the sink  Y  can be calculated according to the equation

I(X;Y) = h(Y) − h(N),

where:

  • h(Y)  denotes the  differential sink entropy
h(Y) = −∫_{supp(f_Y)} f_Y(y) · log_2 [f_Y(y)] dy,
with  f_Y(y) = 1/2 · [f_{Y|X}(y|X = −1) + f_{Y|X}(y|X = +1)].
  • h(N)  gives the  differential noise entropy,  computable from the PDF  f_N(n)  alone:
h(N) = −∫_{supp(f_N)} f_N(n) · log_2 [f_N(n)] dn.

Assuming a Gaussian distribution  f_N(n)  for the noise  N  according to the upper sketch,  we obtain the channel capacity  C_BPSK = I(X;Y),  which is shown in the  theory section  as a function of  10 · lg(E_B/N_0).

The question to be answered is whether there is a finite  E_B/N_0  value for which  C_BPSK(E_B/N_0) ≡ 1 bit/channel use  is possible   ⇒   subtask  (5).

In subtasks  (1)  to  (4),  preliminary work is done to answer this question.  Throughout these subtasks, the uniformly distributed noise PDF  f_N(n)  is assumed (see sketch below):

f_N(n) = 1/(2A)  for  |n| < A,
f_N(n) = 0       for  |n| > A.






Questions

1

What is the differential entropy with the uniform PDF  f_N(n)  and  A = 1/8?

h(N) = 

 bit/symbol

2

What is the differential sink entropy with the uniform PDF  f_N(n)  and  A = 1/8?

h(Y) = 

 bit/symbol

3

What is the magnitude of the mutual information between the source and sink?  Assume again uniformly distributed impairments with  A = 1/8.

I(X;Y) = 

 bit/symbol

4

Under what conditions does the result of subtask  (3)  not change?

For any  A ≤ 1  with the given uniform distribution.
For any other PDF  f_N(n)  limited to the range  |n| ≤ 1.
If  f_{Y|X}(y|X = −1)  and  f_{Y|X}(y|X = +1)  do not overlap.

5

Now answer the crucial question, assuming that Gaussian noise is the only impairment and that the quotient  E_B/N_0  is finite.

C_BPSK(E_B/N_0) ≡ 1 bit/channel use  is possible with a Gaussian PDF.
For Gaussian noise with finite  E_B/N_0,  C_BPSK(E_B/N_0) < 1 bit/channel use  always holds.


Solution

(1)  The differential entropy of a uniform distribution of absolute width  2A  is equal to

h(N) = log_2(2A).   With  A = 1/8:   h(N) = log_2(1/4) = −2 bit/symbol.


PDF of the output variable  Y 
with uniformly distributed noise  N

(2)  The probability density function at the output is obtained according to the equation:

f_Y(y) = 1/2 · [f_{Y|X}(y|x = −1) + f_{Y|X}(y|x = +1)].

The graph shows the result for our example  (A=1/8):

  • Drawn in red is the first term  1/2 · f_{Y|X}(y|−1):  the rectangle  f_N(n)  is shifted to the center position  y = −1  and multiplied by  1/2.  The result is a rectangle of width  2A = 1/4  and height  1/(4A) = 2.
  • Shown in blue is the second term  1/2 · f_{Y|X}(y|+1),  centered at  y = +1.
  • Ignoring the colors,  one obtains the total PDF  f_Y(y).
  • The differential entropy is not changed by shifting non-overlapping PDF sections.
  • Thus, for the differential sink entropy we are looking for,  we get:
h(Y) = log_2(4A).   With  A = 1/8:   h(Y) = log_2(1/2) = −1 bit/symbol.


(3)  Thus, for the mutual information between source and sink, we obtain:

I(X;Y) = h(Y) − h(N) = (−1 bit/symbol) − (−2 bit/symbol) = +1 bit/symbol.
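The results of subtasks (1) to (3) can be checked numerically. A minimal sketch (Python with NumPy, assuming A = 1/8 and a fine Riemann grid; all names are my own):

```python
import numpy as np

A = 1/8  # half-width of the uniform noise PDF (value from subtasks 1-3)

# Fine grid covering the support of f_Y (rectangles around y = -1 and y = +1)
y = np.linspace(-2.0, 2.0, 400001)
dy = y[1] - y[0]

def f_N(n):
    """Uniform PDF of height 1/(2A) on |n| < A."""
    return np.where(np.abs(n) < A, 1.0/(2*A), 0.0)

# Sink PDF: equally weighted mixture of the two shifted noise PDFs
f_Y = 0.5*(f_N(y + 1.0) + f_N(y - 1.0))

def diff_entropy(f, dx):
    """Riemann-sum approximation of -integral of f(x) * log2 f(x) dx."""
    m = f > 0
    return -np.sum(f[m]*np.log2(f[m]))*dx

h_N = diff_entropy(f_N(y), dy)   # approx. -2 bit/symbol
h_Y = diff_entropy(f_Y, dy)      # approx. -1 bit/symbol
print(h_N, h_Y, h_Y - h_N)       # I(X;Y) approx. +1 bit/symbol
```

The Riemann sum reproduces h(N) = −2 bit/symbol, h(Y) = −1 bit/symbol and I(X;Y) = +1 bit/symbol to within the grid resolution.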


(4)  All the proposed solutions  are true:

  • For each  A ≤ 1  holds
h(Y) = log_2(4A) = log_2(2A) + log_2(2),   h(N) = log_2(2A)
⇒   I(X;Y) = h(Y) − h(N) = log_2(2) = +1 bit/symbol.
PDF of the output quantity  Y 
with Gaussian noise  N
  • This principle also holds if the PDF  f_N(n)  has a different shape,  as long as the noise is limited to the range  |n| ≤ 1.
  • However,  if the two conditional probability density functions overlap,  the result is a smaller value for  h(Y)  than calculated above and thus a smaller mutual information.
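These statements can be illustrated numerically. A sketch with a helper of my own naming that evaluates I(X;Y) = h(Y) − h(N) for uniform noise of arbitrary half-width A: the result is exactly 1 bit as long as A ≤ 1, and smaller as soon as the two conditional PDFs overlap (A > 1):

```python
import numpy as np

def mutual_info_uniform(A, n_pts=800001):
    """Numeric I(X;Y) = h(Y) - h(N) for bipolar X = +/-1 and
    uniform noise of half-width A (illustrative sketch)."""
    y = np.linspace(-1.5 - A, 1.5 + A, n_pts)   # covers supp(f_Y) with margin
    dy = y[1] - y[0]
    f_N = np.where(np.abs(y) < A, 1.0/(2*A), 0.0)
    f_Y = 0.5*(np.where(np.abs(y + 1) < A, 1.0/(2*A), 0.0)
             + np.where(np.abs(y - 1) < A, 1.0/(2*A), 0.0))
    h = lambda f: -np.sum(f[f > 0]*np.log2(f[f > 0]))*dy
    return h(f_Y) - h(f_N)

for A in (0.125, 0.5, 1.0, 2.0):
    print(A, mutual_info_uniform(A))
# A <= 1 gives I close to 1 bit; A = 2 (overlapping PDFs) gives I close to 0.5 bit
```

For A = 2 the analytic value is I(X;Y) = 0.5 bit/symbol: the overlap doubles f_Y in the middle region, which raises h(Y) by less than the increase of h(N).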


(5)  Correct is the  proposed solution 2:

  • The Gaussian function decays very fast,  but it never becomes exactly zero.
  • Hence there is always some overlap of the conditional density functions  f_{Y|X}(y|x = −1)  and  f_{Y|X}(y|x = +1).
  • According to subtask  (4),  C_BPSK(E_B/N_0) ≡ 1 bit/channel use  is therefore not possible.
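This conclusion can also be checked numerically. A sketch (assuming x ∈ {−1, +1} with E_B = 1, hence σ² = 1/(2 · E_B/N_0); the function name is my own) that evaluates h(Y) for the Gaussian mixture on a fine grid and uses the closed-form Gaussian h(N):

```python
import numpy as np

def C_bpsk(ebn0_db, n_pts=200001):
    """Numeric C_BPSK = h(Y) - h(N) for BPSK over AWGN (illustrative sketch).
    Assumes x in {-1, +1} with E_B = 1, hence sigma^2 = 1/(2 * E_B/N_0)."""
    ebn0 = 10.0**(ebn0_db/10.0)
    sigma2 = 1.0/(2.0*ebn0)
    sigma = np.sqrt(sigma2)
    y = np.linspace(-1 - 10*sigma, 1 + 10*sigma, n_pts)
    dy = y[1] - y[0]
    g = lambda m: np.exp(-(y - m)**2/(2*sigma2))/np.sqrt(2*np.pi*sigma2)
    f_Y = 0.5*(g(-1.0) + g(+1.0))
    h_Y = -np.sum(f_Y*np.log2(f_Y + 1e-300))*dy    # numeric h(Y); guard avoids log(0)
    h_N = 0.5*np.log2(2*np.pi*np.e*sigma2)         # closed-form Gaussian h(N)
    return h_Y - h_N

for db in (0, 5, 10):
    print(db, C_bpsk(db))   # increases with E_B/N_0 but stays below 1
```

The capacity approaches 1 bit/channel use as E_B/N_0 grows, but for every finite value the residual overlap of the two Gaussians keeps it strictly below 1.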