
Exercise 4.2: Triangular PDF


Figure: Two triangular PDFs

Two probability density functions  (PDFs)  with triangular shapes are considered.

  • The random variable  X  is limited to the range from  0  to  1,  and its PDF is (upper sketch):
$$f_X(x)=\begin{cases} 2x & {\rm for}\ \ 0\le x\le 1, \\ 0 & {\rm else.}\end{cases}$$
  • According to the lower sketch, the random variable  Y  has the following PDF:
$$f_Y(y)=\begin{cases} 1-|y| & {\rm for}\ \ |y|\le 1, \\ 0 & {\rm else.}\end{cases}$$

The differential entropy is to be determined for each of the two random variables.

For example, the corresponding equation for the random variable  X  is:

$$h(X)=-\int_{{\rm supp}(f_X)} f_X(x)\cdot\log\big[f_X(x)\big]\,{\rm d}x \qquad{\rm with}\qquad {\rm supp}(f_X)=\{x:\ f_X(x)>0\}.$$
  • If the natural logarithm is used, the pseudo-unit  "nat"  must be added.
  • If, on the other hand, the result is required in  "bit",  then the binary logarithm   ⇒   "log2"  is to be used  (a numerical cross-check of both variants is sketched below).
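As a cross-check of this definition and of the two logarithm variants, the following short Python sketch numerically integrates  −fX(x)·log fX(x)  over the support of  fX.  It assumes that NumPy and SciPy are available; neither the code nor the helper names are part of the original exercise.

```python
import numpy as np
from scipy.integrate import quad

# PDF of X as given in the exercise: f_X(x) = 2x on [0, 1], zero elsewhere
def f_X(x):
    return 2.0 * x

# Differential entropy over supp(f_X) = (0, 1];
# quad does not evaluate the integrand exactly at the endpoints.
h_nat, _ = quad(lambda x: -f_X(x) * np.log(f_X(x)), 0.0, 1.0)   # natural log -> "nat"
h_bit = h_nat / np.log(2.0)                                     # conversion nat -> bit

print(f"h(X) = {h_nat:.3f} nat = {h_bit:.3f} bit")
```

The printed values can later be compared with the analytical results of subtasks  (1)  and  (2).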


In the fourth subtask, the new random variable  Z = A·Y  is considered.  Here the PDF parameter  A  is to be determined such that the differential entropy of the new random variable  Z  is exactly  1 bit:

$$h(Z)=h(A\cdot Y)=h(Y)+\log_2(A)=1\ {\rm bit}.$$
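The scaling relation  h(A·Y) = h(Y) + log2(A)  used in this requirement can also be checked numerically. Below is a minimal sketch under the assumption that NumPy and SciPy are available; the value  A = 1.5  is an arbitrary test value, not the parameter asked for in the exercise.

```python
import numpy as np
from scipy.integrate import quad

def f_Y(y):
    return max(1.0 - abs(y), 0.0)                  # triangular PDF on [-1, 1]

def h_bit(pdf, lo, hi):
    """Differential entropy in bit, by numerical integration over [lo, hi]."""
    integrand = lambda u: 0.0 if pdf(u) <= 0.0 else -pdf(u) * np.log2(pdf(u))
    val, _ = quad(integrand, lo, hi)
    return val

A = 1.5                                            # arbitrary test value
f_Z = lambda z: f_Y(z / A) / A                     # PDF of Z = A*Y

print(h_bit(f_Z, -A, A))                           # approx. h(Y) + log2(A)
print(h_bit(f_Y, -1.0, 1.0) + np.log2(A))          # same value
```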





Hints:

  • The task belongs to the chapter  Differential Entropy.
  • Useful hints for solving this task and further information on continuous random variables can be found in the third chapter "Continuous Random Variables" of the book  Theory of Stochastic Signals.
  • The following indefinite integral is given  (a symbolic cross-check is sketched right after these hints):
$$\int \xi\cdot\ln(\xi)\,{\rm d}\xi=\xi^2\cdot\big[1/2\cdot\ln(\xi)-1/4\big].$$
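If desired, this antiderivative can be verified symbolically, for example with SymPy; the following lines are only a cross-check and assume that the sympy package is installed.

```python
import sympy as sp

xi = sp.symbols('xi', positive=True)

antiderivative = sp.integrate(xi * sp.ln(xi), xi)               # SymPy's antiderivative
claimed = xi**2 * (sp.Rational(1, 2) * sp.ln(xi) - sp.Rational(1, 4))

# Both antiderivatives agree (their difference simplifies to zero)
print(sp.simplify(antiderivative - claimed))                    # -> 0
```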


Questions

1.   Calculate the differential entropy of the random variable  X  in  "nat".

h(X) =  __________  nat

2.   What result is obtained with the pseudo-unit  "bit"?

h(X) =  __________  bit

3.   Calculate the differential entropy of the random variable  Y.

h(Y) =  __________  bit

4.   Determine the PDF parameter  A  such that  h(Z) = h(A·Y) = 1 bit.

A =  __________


Solution

(1)  For the probability density function, the following holds in the range  0 ≤ x ≤ 1  as specified:

$$f_X(x)=2\cdot x=C\cdot x.$$
  • Here  "2"  has been replaced by  C    ⇒   a generalization, so that the following calculation can also be reused in subtask  (3).
  • Since the differential entropy is sought in  "nat",  we use the natural logarithm.  With the substitution  ξ=Cx  we obtain:
$$h_{\rm nat}(X)=-\int_{0}^{1} C\cdot x\cdot\ln\big[C\cdot x\big]\,{\rm d}x=-\frac{1}{C}\cdot\int_{0}^{C} \xi\cdot\ln[\xi]\,{\rm d}\xi=-\frac{\xi^2}{C}\cdot\left[\frac{\ln(\xi)}{2}-\frac{1}{4}\right]\Bigg|_{\xi=0}^{\xi=C}.$$
  • Here the indefinite integral given in the hints was used.  After inserting the limits and taking  C = 2  into account, we obtain:
$$h_{\rm nat}(X)=-\frac{C}{2}\cdot\big[\ln(C)-1/2\big]=-\ln(2)+1/2=-\ln(2)+1/2\cdot\ln({\rm e})=\ln\big(\sqrt{\rm e}/2\big)=-0.193\hspace{0.3cm}\Rightarrow\hspace{0.3cm}\underline{h(X)=-0.193\ {\rm nat}}.$$
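Since the calculation was deliberately carried out with a general height  C,  the intermediate formula  hnat = −C/2·[ln(C) − 1/2]  can be spot-checked numerically for both values of  C  occurring in this exercise. This is only an illustrative sketch (NumPy/SciPy assumed), not part of the original solution.

```python
import numpy as np
from scipy.integrate import quad

def h_nat_numeric(C):
    """Numerical value of -int_0^1 C*x*ln(C*x) dx."""
    val, _ = quad(lambda x: -C * x * np.log(C * x), 0.0, 1.0)
    return val

def h_nat_formula(C):
    """Closed form derived above: -C/2 * [ln(C) - 1/2]."""
    return -C / 2.0 * (np.log(C) - 0.5)

for C in (2.0, 1.0):        # C = 2 for X;  C = 1 per branch of Y in subtask (3)
    print(C, h_nat_numeric(C), h_nat_formula(C))
# C = 2:  approx. -0.193 nat   (subtask 1)
# C = 1:  approx. +0.250 nat   (reappears as I_neg in subtask 3)
```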


(2)  In general, the following holds:

Figure: To calculate  h(Y)
$$h_{\rm bit}(X)=\frac{h_{\rm nat}(X)}{\ln(2)\ {\rm nat/bit}}=-0.279\hspace{0.3cm}\Rightarrow\hspace{0.3cm}\underline{h(X)=-0.279\ {\rm bit}}.$$
  • This conversion can be avoided by replacing  "ln"  directly with  "log2"  in the analytical result of subtask  (1):
$$h(X)=\log_2\big(\sqrt{\rm e}/2\big),\hspace{0.5cm}\text{pseudo-unit: bit}.$$
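Both routes, converting the "nat" value or evaluating the base-2 expression directly, give the same number; a minimal sketch assuming NumPy:

```python
import numpy as np

h_nat = 0.5 - np.log(2.0)             # analytical result of subtask (1):  ln(sqrt(e)/2)
print(h_nat / np.log(2.0))            # conversion nat -> bit:  approx. -0.279
print(np.log2(np.sqrt(np.e) / 2.0))   # direct evaluation of log2(sqrt(e)/2):  same value
```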


(3)  We again use the natural logarithm and divide the integral into two partial integrals:

$$h(Y)=-\int_{{\rm supp}(f_Y)} f_Y(y)\cdot\ln\big[f_Y(y)\big]\,{\rm d}y=I_{\rm neg}+I_{\rm pos}.$$
  • The first integral, over the range  −1 ≤ y ≤ 0,  is identical in form to that of subtask  (1)  and only shifted with respect to it, which does not affect the result.
  • Now, however, the height  C = 1  has to be used instead of  C = 2:
$$I_{\rm neg}=-\frac{C}{2}\cdot\big[\ln(C)-1/2\big]=-\frac{1}{2}\cdot\big[\ln(1)-1/2\cdot\ln({\rm e})\big]=1/4\cdot\ln({\rm e}).$$
  • The second integrand is identical to the first except for a shift and a reflection.  Moreover, the integration intervals do not overlap   ⇒   I_pos = I_neg:
$$h_{\rm nat}(Y)=2\cdot I_{\rm neg}=1/2\cdot\ln({\rm e})=\ln\big(\sqrt{\rm e}\big)\hspace{0.3cm}\Rightarrow\hspace{0.3cm}h_{\rm bit}(Y)=\log_2\big(\sqrt{\rm e}\big)\hspace{0.3cm}\Rightarrow\hspace{0.3cm}\underline{h(Y)=\log_2(1.649)=0.721\ {\rm bit}}.$$
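The symmetry argument can also be confirmed by integrating the two halves separately; the sketch below (NumPy/SciPy assumed) computes  I_neg  and  I_pos  numerically and converts their sum to "bit".

```python
import numpy as np
from scipy.integrate import quad

def f_Y(y):
    return 1.0 - abs(y)                              # valid for |y| <= 1

integrand = lambda y: -f_Y(y) * np.log(f_Y(y))       # natural log -> nat

I_neg, _ = quad(integrand, -1.0, 0.0)                # approx. 0.25 nat
I_pos, _ = quad(integrand,  0.0, 1.0)                # approx. 0.25 nat (symmetry)

print((I_neg + I_pos) / np.log(2.0))                 # approx. 0.721 bit
```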


(4)  For the differential entropy of the random variable  Z = A·Y,  the following holds in general:

$$h(Z)=h(A\cdot Y)=h(Y)+\log_2(A).$$
  • Thus, from the requirement  h(Z) = 1 bit  and the result of subtask  (3),  it follows that:
$$\log_2(A)=1\ {\rm bit}-0.721\ {\rm bit}=0.279\ {\rm bit}\hspace{0.3cm}\Rightarrow\hspace{0.3cm}\underline{A=2^{0.279}=1.213}.$$
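Finally, the value of  A  can be reproduced and plugged back in. The sketch below (same NumPy/SciPy assumptions as before) computes  A = 2^(1 bit − h(Y))  and confirms numerically that the scaled random variable indeed has a differential entropy of about  1 bit.

```python
import numpy as np
from scipy.integrate import quad

f_Y = lambda y: 1.0 - abs(y)                         # triangular PDF on [-1, 1]

# h(Y) in bit by numerical integration
h_Y, _ = quad(lambda y: -f_Y(y) * np.log2(f_Y(y)), -1.0, 1.0)

A = 2.0 ** (1.0 - h_Y)                               # requirement h(Z) = 1 bit
print(A)                                             # approx. 1.213

# Check: PDF of Z = A*Y is f_Y(z/A) / A on [-A, A]
f_Z = lambda z: f_Y(z / A) / A
h_Z, _ = quad(lambda z: -f_Z(z) * np.log2(f_Z(z)), -A, A)
print(h_Z)                                           # approx. 1.000 bit
```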