The following pages link to ML-Kopf:
- Exercise 2.10: Reed-Solomon Error Detection (transclusion)
- Exercise 2.10Z: Code Rate and Minimum Distance (transclusion)
- Exercise 2.11Z: Erasure Channel for Symbols (transclusion)
- Exercise 2.11: Reed-Solomon Decoding according to "Erasures" (transclusion)
- Exercise 2.12: Decoding at RSC (7, 4, 4) to Base 8 (transclusion)
- Exercise 2.12Z: Reed-Solomon Syndrome Calculation (transclusion)
- Exercise 2.13: Decoding at RSC (7, 3, 5) to Base 8 (transclusion)
- Exercise 2.14: Petersen Algorithm? (transclusion)
- Exercise 2.15: Block Error Probability with AWGN (transclusion)
- Exercise 2.15Z: Block Error Probability once more (transclusion)
- Exercise 2.16: Bounded Distance Decoding: Decision Regions (transclusion)
- Exercise 3.1: Analysis of a Convolutional Encoder (transclusion)
- Exercise 3.1Z: Convolution Codes of Rate 1/2 (transclusion)
- Exercise 3.2: G-matrix of a Convolutional Encoder (transclusion)
- Exercise 3.2Z: (3, 1, 3) Convolutional Encoder (transclusion)
- Exercise 3.3: Code Sequence Calculation via U(D) and G(D) (transclusion)
- Exercise 3.3Z: Convolution and D-Transformation (transclusion)
- Exercise 3.4: Systematic Convolution Codes (transclusion)
- Exercise 3.4Z: Equivalent Convolution Codes? (transclusion)
- Exercise 3.5: Recursive Filters for GF(2) (transclusion)
- Exercise 3.6: State Transition Diagram (transclusion)
- Exercise 3.6Z: Transition Diagram at 3 States (transclusion)
- Exercise 3.7: Comparison of Two Convolutional Encoders (transclusion)
- Exercise 3.7Z: Which Code is Catastrophic? (transclusion)
- Exercise 3.8: Rate Compatible Punctured Convolutional Codes (transclusion)
- Exercise 3.09: Basics of the Viterbi Algorithm (transclusion)
- Exercise 3.09Z: Viterbi Algorithm again (transclusion)
- Exercise 3.10: Metric Calculation (transclusion)
- Exercise 3.10Z: Maximum Likelihood Decoding of Convolutional Codes (transclusion)
- Exercise 3.11: Viterbi Path Finding (transclusion)
- Exercise 3.12: Path Weighting Function (transclusion)
- Exercise 3.12Z: Ring and Feedback (transclusion)
- Exercise 3.13: Path Weighting Function again (transclusion)
- Exercise 3.14: Error Probability Bounds (transclusion)
- Exercise 4.1: Log Likelihood Ratio (transclusion)
- Exercise 4.1Z: Log Likelihood Ratio at the BEC Model (transclusion)
- Exercise 4.2: Channel Log Likelihood Ratio at AWGN (transclusion)
- Exercise 4.3: Iterative Decoding at the BSC (transclusion)
- Exercise 4.3Z: Conversions of L-value and S-value (transclusion)
- Exercise 4.4: Extrinsic L-values at SPC (transclusion)
- Exercise 4.4Z: Supplement to Exercise 4.4 (transclusion)
- Exercise 4.5: On the Extrinsic L-values again (transclusion)
- Exercise 4.5Z: Tangent Hyperbolic and Inverse (transclusion)
- Exercise 4.6: Product Code Generation (transclusion)
- Exercise 4.6Z: Basics of Product Codes (transclusion)
- Exercise 4.7: Product Code Decoding (transclusion)
- Exercise 4.7Z: Principle of Syndrome Decoding (transclusion)
- Exercise 4.08: Repetition to the Convolutional Codes (transclusion)
- Exercise 4.08Z: Basics about Interleaving (transclusion)
- Exercise 4.09: Recursive Systematic Convolutional Codes (transclusion)