===Brief summary===
{{BlueBox|TEXT=From the earliest beginnings of message transmission as an engineering discipline,&nbsp; it has been the endeavour of many engineers and mathematicians to find a quantitative measure of the
*contained&nbsp; $\rm information$&nbsp; $($quite generally:&nbsp; "the knowledge about something"$)$

*in a&nbsp; $\rm message$&nbsp; $($here we mean&nbsp; "a collection of symbols and/or states"$)$.

The&nbsp; $($abstract$)$&nbsp; information is conveyed by the&nbsp; $($concrete$)$&nbsp; message and can be understood as the interpretation of a message.
  
[https://de.wikipedia.org/wiki/Claude_Shannon '''Claude Elwood Shannon''']&nbsp; succeeded in 1948 in establishing a consistent theory of the information content of messages,&nbsp; which was revolutionary in its time and created a new,&nbsp; still highly topical field of science:&nbsp; the theory named after him,&nbsp; &raquo;'''Shannon's information theory'''&laquo;.
 
This is what the fourth book in the&nbsp; $\rm LNTwww$&nbsp; series deals with,&nbsp; in particular:
# Entropy of discrete-value sources with and without memory,&nbsp; as well as natural message sources:&nbsp; Definition,&nbsp; meaning and computational possibilities&nbsp; $($the central quantities are previewed after this list$)$.
# Source coding and data compression,&nbsp; especially the&nbsp; "Lempel&ndash;Ziv&ndash;Welch method"&nbsp; and&nbsp; "Huffman's entropy encoding"&nbsp; $($see the code sketch after this list$)$.
# Various entropies of two-dimensional discrete-value random quantities.&nbsp; Mutual information and channel capacity.&nbsp; Application to digital signal transmission.
# Continuous-value information theory.&nbsp; Differential entropy.&nbsp; AWGN channel capacity with continuous-valued as well as discrete-valued input.
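
As a brief preview,&nbsp; the central quantities of these four chapters in standard notation&nbsp; $($for this overview we write&nbsp; $p(x)$&nbsp; for the probability mass function,&nbsp; $f_X(x)$&nbsp; for the probability density function,&nbsp; $P$&nbsp; for the transmit power and&nbsp; $\sigma_N^2$&nbsp; for the noise variance;&nbsp; the individual chapters introduce their own notation$)$:
:$$H(X) = -\sum_{x} p(x) \cdot \log_2 p(x)\hspace{1.5cm}\text{(entropy of a discrete source)},$$
:$$I(X;Y) = H(X) - H(X\hspace{0.05cm}|\hspace{0.05cm}Y)\hspace{1.5cm}\text{(mutual information)},$$
:$$h(X) = -\int_{-\infty}^{+\infty} f_X(x) \cdot \log_2 f_X(x) \hspace{0.1cm}{\rm d}x\hspace{1.5cm}\text{(differential entropy)},$$
:$$C_{\rm AWGN} = \frac{1}{2} \cdot \log_2 \left(1 + \frac{P}{\sigma_N^2}\right)\hspace{1.5cm}\text{(AWGN channel capacity per channel use)}.$$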
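For the second item,&nbsp; a minimal Python sketch of Huffman's idea&nbsp; $($our own illustration with freely chosen function name and example string,&nbsp; not the procedure as it is developed in the book$)$:&nbsp; repeatedly merge the two least frequent subtrees and prepend&nbsp; "0"&nbsp; resp.&nbsp; "1"&nbsp; to their code words.
<pre>
import heapq
from collections import Counter

def huffman_code(text):
    """Build a binary Huffman code for the symbols occurring in `text`."""
    # One weighted leaf per symbol: (frequency, tiebreaker, {symbol: partial code word}).
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        # Merge the two least frequent subtrees; their code words get a
        # leading "0" resp. "1" -- the core step of Huffman's algorithm.
        f0, _, c0 = heapq.heappop(heap)
        f1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (f0 + f1, next_id, merged))
        next_id += 1
    return heap[0][2]

print(huffman_code("abracadabra"))
# one possible result: {'a': '0', 'c': '100', 'd': '101', 'b': '110', 'r': '111'}
</pre>
The resulting code is prefix-free;&nbsp; merging the least frequent subtrees first is exactly what makes the average code word length minimal.
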
&rArr; &nbsp; First a&nbsp; &raquo;'''content overview'''&laquo;&nbsp; on the basis of the&nbsp; &raquo;'''four main chapters'''&laquo;&nbsp; with a total of&nbsp; &raquo;'''13 individual chapters'''&laquo;&nbsp; and&nbsp; &raquo;'''106 sections'''&laquo;:}}
===Contents===
  
{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
 
}}
 
{{Collapsible-Fuß}}
{{BlaueBox|TEXT=
In addition to these theory pages,&nbsp; we also offer exercises and multimedia modules on this topic,&nbsp; which could help to clarify the teaching material:

$(1)$&nbsp; &nbsp; [https://en.lntwww.de/Category:Information_Theory:_Exercises $\text{Exercises}$]

$(2)$&nbsp; &nbsp; [[LNTwww:Learning_videos_to_"Information_Theory"|$\text{Learning videos}$]]

$(3)$&nbsp; &nbsp; [[Applets_to_"Information_Theory"|$\text{Applets}$]]&nbsp;}}
===Further links===
{{BlaueBox|TEXT=
$(4)$&nbsp; &nbsp; [[LNTwww:Bibliography_to_"Theory_of_Stochastic_Signals"|$\text{Bibliography}$]]

$(5)$&nbsp; &nbsp; [[LNTwww:Imprint_for_the_book_"Stochastic_Signal_Theory"|$\text{Imprint}$]]}}