Information Theory: Exercises
The following 71 exercise pages belong to this category, grouped below by chapter section.
1.1 Memoryless Sources
1.2 Sources with Memory
1.3 Natural Discrete Sources
2.1 General Description
2.2 Lempel-Ziv-Welch Compression
2.3 Entropy Coding according to Huffman
2.4 Further Source Coding Methods
3.1 General Information on 2D Random Variables
- Exercise 3.1: Probabilities when Rolling Dice
- Exercise 3.1Z: Drawing Cards
- Exercise 3.2: Expected Value Calculations
- Exercise 3.2Z: Two-dimensional Probability Mass Function
- Exercise 3.3: Entropy of Ternary Quantities
- Exercise 3.4: Entropy for Different PMF
- Exercise 3.5: Kullback-Leibler Distance and Binomial Distribution
- Exercise 3.5Z: Kullback-Leibler Distance again
- Exercise 3.6: Partitioning Inequality
3.2 Entropies of 2D Random Variables
3.3 Application to Digital Signal Transmission
- Exercise 3.10: Mutual Information at the BSC
- Exercise 3.10Z: BSC Channel Capacity
- Exercise 3.11: Erasure Channel
- Exercise 3.11Z: Extremely Asymmetrical Channel
- Exercise 3.12: Strictly Symmetrical Channels
- Exercise 3.13: Code Rate and Reliability
- Exercise 3.14: Channel Coding Theorem
- Exercise 3.15: Data Processing Theorem
4.1 Differential Entropy
- Exercise 4.1: PDF, CDF and Probability
- Exercise 4.1Z: Calculation of Moments
- Exercise 4.2: Triangular PDF
- Exercise 4.2Z: Mixed Random Variables
- Exercise 4.3: PDF Comparison with Regard to Differential Entropy
- Exercise 4.3Z: Exponential and Laplace Distribution
- Exercise 4.4: Conventional Entropy and Differential Entropy