Information Theory

From LNTwww

Brief summary

From the earliest beginnings of message transmission as an engineering discipline,  it has been the endeavour of many engineers and mathematicians  to find a quantitative measure for the

  • $\rm information$  contained  $($quite generally:  »the knowledge about something«$)$
  • in a  $\rm message$  $($here meaning  »a collection of symbols and/or states«$)$.


The  $($abstract$)$  information is communicated by the  $($concrete$)$  message and can be regarded as the interpretation of a message.

Claude Elwood Shannon  succeeded in 1948 in establishing a consistent theory of the information content of messages,  which was revolutionary for its time and created a new,  still highly topical field of science:  the  »Shannon information theory«  named after him.

This is what the fourth book in the  $\rm LNTwww$ series deals with,  in particular:

  1. Entropy of discrete-valued sources with and without memory,  as well as natural message sources:  Definition,  meaning and computational possibilities  $($a small numerical sketch follows this list$)$.
  2. Source coding and data compression,  especially the   »Lempel–Ziv–Welch method«   and   »Huffman's entropy encoding«.
  3. Various entropies of two-dimensional discrete-valued random variables.  Mutual information and channel capacity.  Application to digital signal transmission.
  4. Continuous-value information theory.  Differential entropy.  AWGN channel capacity with continuous-valued as well as discrete-valued input.
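
To illustrate the first topic,  here is a minimal sketch in Python of the entropy of a discrete memoryless source.  It is not part of the LNTwww material itself;  the symbol probabilities are assumed purely for illustration:

```python
import math

def entropy(probabilities):
    """Entropy H = -sum p*log2(p) of a discrete memoryless source, in bit/symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical example: a quaternary source with unequal symbol probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
print(f"H = {entropy(probs):.3f} bit/symbol")   # H = 1.750 bit/symbol

# For comparison: with equally likely symbols the entropy of a quaternary
# source reaches its maximum, log2(4) = 2 bit/symbol.
```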


⇒   First,  a  »content overview«  based on the  »four main chapters«  with a total of  »13 individual chapters«  and  »106 sections«:


Content

Exercises and multimedia

In addition to these theory pages,  we also offer exercises and multimedia modules on this topic,  which can help to clarify the teaching material:

$(1)$    $\text{Exercises}$

$(2)$    $\text{Learning videos}$

$(3)$    $\text{Applets}$ 


Further links