Information Theory

From LNTwww
===Brief summary===
{{BlueBox|TEXT=From the earliest beginnings of message transmission as an engineering discipline,  it has been the endeavour of many engineers and mathematicians  to find a quantitative measure for the
*contained  $\rm information$  $($quite generally:  »the knowledge about something«$)$
*in a  $\rm message$  $($here we mean  »a collection of symbols and/or states«$)$.
The  $($abstract$)$  information is communicated by the  $($concrete$)$  message and can be conceived as the interpretation of a message.
[https://en.wikipedia.org/wiki/Claude_Shannon '''Claude Elwood Shannon''']  succeeded in 1948 in establishing a consistent theory about the information content of messages,  which was revolutionary in its time and created a new,  still highly topical field of science:  »'''Shannon's information theory'''«,  named after him.
This is what the fourth book in the  $\rm LNTwww$ series deals with,  in particular:
# Entropy of discrete-valued sources with and without memory,  as well as natural message sources:  definition,  meaning and computational possibilities.
# Source coding and data compression,  especially the  »Lempel–Ziv–Welch method«  and  »Huffman's entropy encoding«.
# Various entropies of two-dimensional discrete-valued random variables.  Mutual information and channel capacity.  Application to digital signal transmission.
# Continuous-value information theory.  Differential entropy.  AWGN channel capacity with continuous-valued as well as discrete-valued input.
⇒   First a  »'''content overview'''«  on the basis of the  »'''four main chapters'''«  with a total of  »'''13 individual chapters'''«  and  »'''106 sections'''«:}}
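⇒   As a small numerical illustration of topics  $(1)$  and  $(2)$  $($a sketch of our own,  not part of the book text;  all function names are freely chosen$)$:  the Python lines below compute the entropy  $H$  of a discrete memoryless source and the mean code word length  $L$  of a binary Huffman code.  For the dyadic probabilities chosen here,  $L = H = 1.75$  bit/symbol.

```python
from heapq import heapify, heappop, heappush
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum p*log2(p) in bit/symbol (terms with p = 0 are skipped)."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code word lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (subtree probability, tie-breaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heappop(heap)   # merge the two least probable subtrees;
        p2, t, s2 = heappop(heap)   # each merge adds one bit to all their symbols
        for i in s1 + s2:
            lengths[i] += 1
        heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]                              # dyadic source statistics
H = entropy(probs)                                             # entropy in bit/symbol
L = sum(p * n for p, n in zip(probs, huffman_lengths(probs)))  # mean code word length
print(H, L)                                                    # → 1.75 1.75
```

For non-dyadic probabilities the two values no longer coincide,  but the source coding theorem still guarantees  $H \le L < H+1$.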
===Content===
  
 
 
{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
| submenu=
*[[/Discrete Memoryless Sources/]]
*[[/Discrete Sources with Memory/]]
*[[/Natural Discrete Sources/]]
}}
{{Collapse2 | header=Source Coding - Data Compression
|submenu=
*[[/General Description/]]
*[[/Compression According to Lempel, Ziv and Welch/]]
*[[/Entropy Coding According to Huffman/]]
*[[/Further Source Coding Methods/]]
}}
{{Collapse3 | header=Mutual Information Between Two Discrete Random Variables
|submenu=
*[[/Some Preliminary Remarks on Two-Dimensional Random Variables/]]
*[[/Different Entropy Measures of Two-Dimensional Random Variables/]]
*[[/Application to Digital Signal Transmission/]]
}}
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity for Continuous-Valued Input/]]
*[[/AWGN Channel Capacity for Discrete-Valued Input/]]
}}
{{Collapsible-Fuß}}
===Exercises and multimedia===
{{BlaueBox|TEXT=
In addition to these theory pages,  we also offer exercises and multimedia modules on this topic,  which could help to clarify the teaching material:
$(1)$    [https://en.lntwww.de/Category:Information_Theory:_Exercises $\text{Exercises}$]
$(2)$    [[LNTwww:Learning_videos_to_"Information_Theory"|$\text{Learning videos}$]]
$(3)$    [[LNTwww:Applets_to_"Information_Theory"|$\text{Applets}$]] }}
===Further links===
{{BlaueBox|TEXT=
$(4)$    [[LNTwww:Bibliography_to_"Information_Theory"|$\text{Bibliography}$]]
$(5)$    [[LNTwww:Imprint_for_the_book_"Information_Theory"|$\text{Imprint}$]]}}
<br><br>
  
  
 
__NOTOC__
 
__NOEDITSECTION__

Latest revision as of 18:50, 31 December 2023