Information Theory
{{BlueBox|TEXT=From the earliest beginnings of message transmission as an engineering discipline,  it has been the endeavour of many engineers and mathematicians  to find a quantitative measure for the
*contained  $\rm information$  $($quite generally:  »the knowledge about something«$)$

*in a  $\rm message$  $($here we mean  »a collection of symbols and/or states«$)$.


The  $($abstract$)$  information is communicated by the  $($concrete$)$  message and can be conceived as the interpretation of a message.

[https://en.wikipedia.org/wiki/Claude_Shannon '''Claude Elwood Shannon''']  succeeded in 1948 in establishing a consistent theory about the information content of messages,  which was revolutionary in its time and created a new,  still highly topical field of science:  »'''Shannon's information theory'''«,  named after him.
  
This is what the fourth book in the  $\rm LNTwww$  series deals with,  in particular:
# Entropy of discrete-value sources with and without memory,  as well as natural message sources:  Definition,  meaning and computational possibilities  $($see also the Python sketch after this list$)$.
# Source coding and data compression,  especially the  »Lempel–Ziv–Welch method«  and  »Huffman's entropy encoding«.
 
# Various entropies of two-dimensional discrete-value random quantities.  Mutual information and channel capacity.  Application to digital signal transmission.
# Continuous-value information theory.  Differential entropy.  AWGN channel capacity with continuous-valued as well as discrete-valued input.
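The following Python sketch is not part of the LNTwww material;  it merely illustrates,  with arbitrarily chosen example values,  two central quantities from the list above:  the entropy  $H = -\sum_{i} p_i \cdot {\rm log}_2 \hspace{0.05cm} p_i$  of a discrete memoryless source  $($first item$)$  and the AWGN channel capacity  $C = 1/2 \cdot {\rm log}_2 \hspace{0.05cm}(1 + {\rm SNR})$  for continuous-valued input  $($fourth item$)$:

<syntaxhighlight lang="python">
import math

def source_entropy(probs):
    """Entropy H = -sum(p_i * log2(p_i)) of a discrete memoryless source, in bit/symbol.

    Terms with p_i = 0 are skipped, following the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def awgn_capacity(snr_linear):
    """Shannon capacity C = 1/2 * log2(1 + SNR) of the AWGN channel,
    in bit per channel use, for a continuous-valued (Gaussian) input."""
    return 0.5 * math.log2(1.0 + snr_linear)

# Binary source with p = (0.1, 0.9): the binary entropy function gives H ≈ 0.469 bit/symbol.
print(round(source_entropy([0.1, 0.9]), 3))

# Equally probable quaternary source: maximum entropy H = log2(4) = 2 bit/symbol.
print(round(source_entropy([0.25, 0.25, 0.25, 0.25]), 3))

# AWGN channel with 10 dB SNR (linear factor 10): C = 0.5 * log2(11) ≈ 1.730 bit/use.
print(round(awgn_capacity(10.0), 3))
</syntaxhighlight>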
 
⇒   First a  »content overview«  on the basis of the  »four main chapters«  with a total of  »13 individual chapters«  and  »106 sections«:
}}

{{Collapsible-Fuß}}


===Exercises and multimedia===

{{BlaueBox|TEXT=
In addition to these theory pages,  we also offer exercises and multimedia modules on this topic,  which could help to clarify the teaching material:

$(1)$    [https://en.lntwww.de/Category:Information_Theory:_Exercises $\text{Exercises}$]

$(2)$    [[LNTwww:Learning_videos_to_"Information_Theory"|$\text{Learning videos}$]]

$(3)$    [[LNTwww:Applets_to_"Information_Theory"|$\text{Applets}$]] }}
 
 
===Further links===

{{BlaueBox|TEXT=
$(4)$    [[LNTwww:Bibliography_to_"Information_Theory"|$\text{Bibliography}$]]

$(5)$    [[LNTwww:Imprint_for_the_book_"Information_Theory"|$\text{Imprint}$]]}}
<br><br>
  
  
__NOTOC__
__NOEDITSECTION__
