===Brief summary===
{{BlueBox|TEXT=»'''Channel Coding'''«  $($or  »Error-Control Coding«$)$  includes both  »Error Detection«  and  »Forward Error Correction«,

*which makes digital signal transmission possible in the first place when the channel is poor  $($small SNR$)$,
  
*and leads to very small error rates when the channel is sufficiently good  $($large SNR$)$.
  
  
Here are some keywords from the book's content:
# Binary linear block codes:  Generator matrix,  parity-check matrix and decoding.  Examples:  Single parity-check codes,  repetition codes,  Hamming codes  $($see the code sketch below this box$)$.
# Error probability bounds:  Minimum distance,  Union bound,  Shannon bound.  Channel coding theorem and channel capacity:  error rate vs. code rate  $($see the formula below this box$)$.
# Reed-Solomon codes:  Algebra fundamentals,  extension fields,  code parameters,  encoding and decoding principle,  Singleton bound,  applications.
# Convolutional codes:  Algebraic and polynomial description,  state and trellis diagram,  decoding using the Viterbi and BCJR algorithms.
# Iterative decoding methods:  Soft-in soft-out decoders,  product codes,  turbo codes and low-density parity-check $($LDPC$)$ codes.
 
<u>Notes:</u>
*Knowledge of&nbsp; &raquo;[[Theory_of_Stochastic_Signals]]&laquo;&nbsp; and&nbsp; &raquo;[[Information theory]]&laquo;&nbsp; is helpful,&nbsp; but not essential for understanding channel coding.
*The mathematics of channel coding is fundamentally different from that in other disciplines.&nbsp; However,&nbsp; analogies can often be seen,&nbsp; e.g.,&nbsp; to&nbsp; [[Signal_Representation/The_Convolution_Theorem_and_Operation|&raquo;conventional convolution&laquo;]].
*A coding type with a different goal is&nbsp; [[Information_Theory/General_Description|&raquo;Source coding&laquo;]]&nbsp; $($"data compression"$)$.&nbsp; Here,&nbsp; redundancy is not added but reduced.
*Another type of coding is&nbsp; [[Digital_Signal_Transmission/Basics_of_Coded_Transmission#Source_coding_.E2.80.93_Channel_coding_.E2.80.93_Line_coding|&raquo;Line Coding&laquo;]],&nbsp; which aims to adapt the transmitted signal spectrally to the transmission channel in the best possible way.

&rArr; &nbsp; First,&nbsp; here is a&nbsp; &raquo;'''contents overview'''&laquo;&nbsp; based on the&nbsp; &raquo;'''four main chapters'''&laquo;&nbsp; with a total of&nbsp; &raquo;'''22 individual chapters'''&laquo;&nbsp; and&nbsp; &raquo;'''175 sections'''&laquo;.}}
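The following Python sketch illustrates the first keyword group,&nbsp; i.e. the interplay of generator matrix,&nbsp; parity-check matrix and decoding,&nbsp; using the systematic&nbsp; $(7, 4)$&nbsp; Hamming code.&nbsp; It is only a rough orientation example and not taken from the book chapters;&nbsp; the chosen matrices are one common systematic construction.

<pre>
# Minimal sketch (for orientation only): encoding and syndrome decoding
# of the (7, 4) Hamming code over GF(2).
import numpy as np

# Systematic generator matrix G = [I_4 | P] and parity-check matrix H = [P^T | I_3]
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

u = np.array([1, 0, 1, 1])      # information word of length k = 4
x = u @ G % 2                   # code word of length n = 7

y = x.copy()
y[2] ^= 1                       # the channel flips one bit

s = H @ y % 2                   # syndrome of length n - k = 3
# For a single bit error the syndrome equals the corresponding column of H,
# so the error position can be located and the bit corrected.
err_pos = next(i for i in range(7) if np.array_equal(H[:, i], s))
y[err_pos] ^= 1
assert np.array_equal(y, x)     # the decoder recovers the transmitted code word
</pre>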
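As a pointer to the second keyword group,&nbsp; here is the channel coding theorem in its most compact form,&nbsp; stated for the binary symmetric channel with crossover probability&nbsp; $\varepsilon$&nbsp; $($chosen here only as a simple example channel$)$.&nbsp; Its capacity is

:$C = 1 - H_{\rm bin}(\varepsilon), \hspace{1cm} H_{\rm bin}(\varepsilon) = -\varepsilon \cdot \log_2 (\varepsilon) - (1-\varepsilon) \cdot \log_2 (1-\varepsilon),$

where&nbsp; $H_{\rm bin}$&nbsp; denotes the binary entropy function.&nbsp; The theorem states that with suitable channel codes an arbitrarily small error rate is achievable for every code rate&nbsp; $R < C$,&nbsp; whereas this is not possible for&nbsp; $R > C$.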
  
 
  
 
===Contents===
 
*[[/Reed-Solomon Decoding for the Erasure Channel/]]
*[[/Error Correction According to Reed-Solomon Coding/]]
*[[/Error Probability and Application Areas/]]
}}
 
{{Collapse3 | header=Convolutional Codes and Their Decoding
*[[/Code Description with State and Trellis Diagram/]]
*[[/Decoding of Convolutional Codes/]]
*[[/Distance Characteristics and Error Probability Bounds/]]
}}
 
{{Collapse4 | header=Iterative Decoding Methods
}}
{{Collapsible-Fuß}}
===Exercises and multimedia===
{{BlaueBox|TEXT=
In addition to these theory pages,&nbsp; we also offer exercises and multimedia modules on this topic,&nbsp; which could help to clarify the teaching material:

$(1)$&nbsp; &nbsp; [https://en.lntwww.de/Category:Channel_Coding:_Exercises $\text{Exercises}$]
$(2)$&nbsp; &nbsp; [[LNTwww:Learning_Videos_to_"Channel_Coding"|$\text{Learning videos}$]]

$(3)$&nbsp; &nbsp; [[LNTwww:Applets_to_"Channel_Coding"|$\text{Applets}$]]&nbsp;}}
  
===Further links===
{{BlaueBox|TEXT=
$(4)$&nbsp; &nbsp; [[LNTwww:Bibliography_to_"Channel_Coding"|$\text{Bibliography}$]]

$(5)$&nbsp; &nbsp; [[LNTwww:Imprint_for_the_book_"Channel_Coding"|$\text{Imprint}$]]}}
<br><br>
{{Display}}
