Exercise 4.9Z: Is Channel Capacity C ≡ 1 possible with BPSK?
We assume here a binary bipolar source signal $\Rightarrow x \in X = \{+1, -1\}$.
Thus, the probability density function (PDF) of the source is:
- $f_X(x) = 1/2 \cdot \delta(x-1) + 1/2 \cdot \delta(x+1)$.
The mutual information between the source $X$ and the sink $Y$ can be calculated according to the equation
- $I(X;Y) = h(Y) - h(N)$,
where:
- $h(Y)$ denotes the differential sink entropy:
- $h(Y) = -\int_{{\rm supp}(f_Y)} f_Y(y) \cdot \log_2 \big[f_Y(y)\big] \, {\rm d}y$,
- with $f_Y(y) = 1/2 \cdot \big[f_{Y|X}(y\,|\,X = -1) + f_{Y|X}(y\,|\,X = +1)\big]$.
- $h(N)$ gives the differential noise entropy, computable from the PDF $f_N(n)$ alone:
- $h(N) = -\int_{{\rm supp}(f_N)} f_N(n) \cdot \log_2 \big[f_N(n)\big] \, {\rm d}n$.
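Both integrals can also be checked numerically. The following Python sketch is our addition, not part of the original exercise; it approximates the differential entropy of an arbitrary PDF sampled on a uniform grid by a Riemann sum:

```python
import numpy as np

def diff_entropy(pdf_vals, dx):
    """Approximate h = -integral of f(u)*log2 f(u) du by a Riemann sum over a
    uniform grid with spacing dx; the integrand is set to 0 off the support."""
    integrand = np.where(pdf_vals > 0,
                         pdf_vals * np.log2(np.where(pdf_vals > 0, pdf_vals, 1.0)),
                         0.0)
    return -np.sum(integrand) * dx
```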
Assuming a Gaussian distribution $f_N(n)$ for the noise $N$ according to the upper sketch, we obtain the channel capacity $C_{\rm BPSK} = I(X;Y)$, which is shown in the theory section as a function of $10 \cdot \lg(E_{\rm B}/N_0)$.
The question to be answered is whether there is a finite $E_{\rm B}/N_0$ value for which $C_{\rm BPSK}(E_{\rm B}/N_0) \equiv 1$ bit/channel use is possible $\Rightarrow$ subtask (5).
In subtasks (1) to (4), preliminary work is done to answer this question. There, the uniformly distributed noise PDF $f_N(n)$ is always assumed (see sketch below):
- $f_N(n) = \begin{cases} 1/(2A) & {\rm for} \ |n| < A, \\ 0 & {\rm for} \ |n| > A. \end{cases}$
Hints:
- The exercise belongs to the chapter AWGN channel capacity for discrete input.
- Reference is made in particular to the page AWGN channel capacity for binary input signals.
Questions
Solution
(1) For the uniform noise PDF of width $2A$, the differential noise entropy is:
- $h(N) = \log_2(2A) \ \Rightarrow \ A = 1/8\!: \ \ h(N) = \log_2(1/4) = -2 \ {\rm bit/symbol}$.
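This value can be reproduced with a short numerical check (our addition; the value $A = 1/8$ is taken from the subtask):

```python
import numpy as np

A = 1/8
n = np.linspace(-0.5, 0.5, 200_001)           # grid covering supp(f_N) = (-A, +A)
dx = n[1] - n[0]
fN = np.where(np.abs(n) < A, 1/(2*A), 0.0)    # uniform noise PDF of width 2A
integrand = np.where(fN > 0, fN * np.log2(np.where(fN > 0, fN, 1.0)), 0.0)
print(-np.sum(integrand) * dx)                # ~ -2.0 bit = log2(2A) for A = 1/8
```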
(2) The probability density function at the output is obtained according to the equation:
- $f_Y(y) = 1/2 \cdot \big[f_{Y|X}(y\,|\,x = -1) + f_{Y|X}(y\,|\,x = +1)\big]$.
The graph shows the result for our example ($A = 1/8$):
- Drawn in red is the first term $1/2 \cdot f_{Y|X}(y\,|\,x = -1)$, where the rectangle $f_N(n)$ is shifted to the center position $y = -1$ and multiplied by $1/2$. The result is a rectangle of width $2A = 1/4$ and height $1/(4A) = 2$.
- Shown in blue is the second term $1/2 \cdot f_{Y|X}(y\,|\,x = +1)$, centered at $y = +1$.
- Disregarding the colors, the sum of both terms gives the total PDF $f_Y(y)$.
- The differential entropy is not changed by shifting non-overlapping PDF sections. Since the two rectangles do not overlap, $f_Y(y)$ equals $1/(4A)$ on a total support of width $4A$, exactly like a uniform distribution.
- Thus, for the differential sink entropy we are looking for, we get:
- $h(Y) = \log_2(4A) \ \Rightarrow \ A = 1/8\!: \ \ h(Y) = \log_2(1/2) = -1 \ {\rm bit/symbol}$.
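The same construction can be verified numerically (our addition, again with $A = 1/8$); the sink PDF is built from two shifted and weighted copies of the noise rectangle:

```python
import numpy as np

A = 1/8
y = np.linspace(-2.0, 2.0, 400_001)
dx = y[1] - y[0]
rect = lambda c: np.where(np.abs(y - c) < A, 1/(2*A), 0.0)   # f_N shifted to center c
fY = 0.5 * (rect(-1.0) + rect(+1.0))                         # two weighted rectangles
integrand = np.where(fY > 0, fY * np.log2(np.where(fY > 0, fY, 1.0)), 0.0)
print(-np.sum(integrand) * dx)                               # ~ -1.0 bit = log2(4A)
```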
(3) Thus, for the mutual information between source and sink, we obtain:
- $I(X;Y) = h(Y) - h(N) = (-1 \ {\rm bit/symbol}) - (-2 \ {\rm bit/symbol}) = +1 \ {\rm bit/symbol}$.
(4) All proposed solutions are correct:
- For every $A \le 1$, the following holds:
- $h(Y) = \log_2(4A) = \log_2(2A) + \log_2(2), \quad h(N) = \log_2(2A)$
- $\Rightarrow \ I(X;Y) = h(Y) - h(N) = \log_2(2) = +1 \ {\rm bit/symbol}$.
- This principle does not change even if the PDF $f_N(n)$ is shaped differently, as long as the noise is limited to the range $|n| \le 1$.
- However, if the two conditional probability density functions overlap, the result is a smaller value for $h(Y)$ than calculated above, and thus a smaller mutual information (see the numerical sketch below).
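To illustrate the last point, the following sketch (our addition) evaluates $I(X;Y) = h(Y) - h(N)$ for several values of $A$; as soon as $A > 1$, the two rectangles overlap and the mutual information drops below 1 bit:

```python
import numpy as np

def h(f, dx):                                  # differential entropy in bit
    i = np.where(f > 0, f * np.log2(np.where(f > 0, f, 1.0)), 0.0)
    return -np.sum(i) * dx

y = np.linspace(-4.0, 4.0, 800_001)
dx = y[1] - y[0]
for A in (0.125, 0.5, 1.0, 1.5):               # for A > 1 the rectangles overlap
    rect = lambda c: np.where(np.abs(y - c) < A, 1/(2*A), 0.0)
    fY = 0.5 * (rect(-1.0) + rect(+1.0))
    print(A, h(fY, dx) - h(rect(0.0), dx))     # I(X;Y): 1 bit for A <= 1, less above
```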
(5) Proposed solution 2 is correct:
- The Gaussian function decays very rapidly, but it never becomes exactly zero.
- Hence, there is always an overlap of the conditional density functions $f_{Y|X}(y\,|\,x = -1)$ and $f_{Y|X}(y\,|\,x = +1)$.
- According to subtask (4), $C_{\rm BPSK}(E_{\rm B}/N_0) \equiv 1$ bit/channel use is therefore not possible, as the sketch below also indicates.
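This conclusion can be made plausible numerically as well. The sketch below is our addition; it parametrizes the channel directly by the noise rms value $\sigma$ instead of $10 \cdot \lg(E_{\rm B}/N_0)$. The computed capacity approaches 1 bit as $\sigma$ shrinks, but never reaches it exactly, because the two Gaussian conditional PDFs always overlap:

```python
import numpy as np

def h(f, dx):                                  # differential entropy in bit
    i = np.where(f > 0, f * np.log2(np.where(f > 0, f, 1.0)), 0.0)
    return -np.sum(i) * dx

gauss = lambda y, s: np.exp(-y**2 / (2*s**2)) / (np.sqrt(2*np.pi) * s)

y = np.linspace(-10.0, 10.0, 400_001)
dx = y[1] - y[0]
for sigma in (1.0, 0.5, 0.3, 0.2):
    fY = 0.5 * (gauss(y + 1, sigma) + gauss(y - 1, sigma))   # BPSK sink PDF
    print(sigma, h(fY, dx) - h(gauss(y, sigma), dx))         # always < 1 bit
```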