Information Theory

From LNTwww
 
===Brief summary===
{{BlueBox|TEXT=From the earliest beginnings of message transmission as an engineering discipline,  it has been the endeavour of many engineers and mathematicians  to find a quantitative measure for the
*$\rm information$  $($quite generally:  »the knowledge about something«$)$  contained
*in a  $\rm message$  $($here we mean  »a collection of symbols and/or states«$)$.

The  $($abstract$)$  information is communicated by the  $($concrete$)$  message and can be conceived as the interpretation of a message.
[https://en.wikipedia.org/wiki/Claude_Shannon '''Claude Elwood Shannon''']  succeeded in 1948 in establishing a consistent theory about the information content of messages,  which was revolutionary in its time and created a new,  still highly topical field of science:  »'''Shannon's information theory'''«  named after him.

This is what the fourth book in the  $\rm LNTwww$ series deals with,  in particular:
# Entropy of discrete-value sources with and without memory,  as well as natural message sources:  Definition,  meaning and computational possibilities.
# Source coding and data compression,  especially the  »Lempel–Ziv–Welch method«  and  »Huffman's entropy encoding«.
# Various entropies of two-dimensional discrete-value random variables.  Mutual information and channel capacity.  Application to digital signal transmission.
# Continuous-value information theory.  Differential entropy.  AWGN channel capacity with continuous-valued as well as discrete-valued input.

⇒   First a  »'''content overview'''«  on the basis of the  »'''four main chapters'''«  with a total of  »'''13 individual chapters'''«  and  »'''106 sections'''«:}}
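Two of the central quantities named in the chapter list above, the entropy of a discrete memoryless source and the AWGN channel capacity, can be previewed with a short numerical sketch. The following Python snippet is purely illustrative and not part of the LNTwww material; the function names are our own:

```python
import math

def entropy(probs):
    """Entropy H(X) = -sum p * log2(p) of a discrete memoryless source, in bit/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def awgn_capacity(snr):
    """AWGN channel capacity C = 1/2 * log2(1 + SNR) in bit per channel use (linear SNR)."""
    return 0.5 * math.log2(1.0 + snr)

# A binary source with equally probable symbols reaches the maximum entropy of 1 bit/symbol:
print(entropy([0.5, 0.5]))     # 1.0
# A biased binary source carries less information per symbol:
print(entropy([0.9, 0.1]))     # about 0.469 bit/symbol
# At a linear SNR of 3 (about 4.77 dB) the AWGN capacity is exactly 1 bit per channel use:
print(awgn_capacity(3.0))      # 1.0
```

Note that both formulas use the base-2 logarithm, so all results are in bits; using the natural logarithm instead would give values in nats.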
===Content===
  
{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
| submenu=
*[[/Discrete Memoryless Sources/]]
*[[/Discrete Sources with Memory/]]
*[[/Natural Discrete Sources/]]
}}
{{Collapse2 | header=Source Coding - Data Compression
|submenu=
*[[/General Description/]]
*[[/Compression According to Lempel, Ziv and Welch/]]
*[[/Entropy Coding According to Huffman/]]
*[[/Further Source Coding Methods/]]
}}
{{Collapse3 | header=Mutual Information Between Two Discrete Random Variables
|submenu=
*[[/Some Preliminary Remarks on Two-Dimensional Random Variables/]]
*[[/Different Entropy Measures of Two-Dimensional Random Variables/]]
*[[/Application to Digital Signal Transmission/]]
}}
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity for Continuous-Valued Input/]]
*[[/AWGN Channel Capacity for Discrete-Valued Input/]]
}}
{{Collapsible-Fuß}}
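As a small preview of the source coding chapters linked above, the following sketch computes Huffman code lengths for a discrete memoryless source. It is an illustrative stand-alone example (names and structure are our own, not the LNTwww implementation):

```python
import heapq
from itertools import count

def huffman_code_lengths(probs):
    """Return the Huffman code length (in bits) for each symbol probability in probs."""
    tiebreak = count()  # avoids comparing lists when probabilities are equal
    # Each heap entry: (subtree probability, tiebreaker, symbol indices in that subtree)
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every merge adds one bit to all contained symbols
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

# Four symbols with probabilities 0.5, 0.25, 0.125, 0.125:
print(huffman_code_lengths([0.5, 0.25, 0.125, 0.125]))   # [1, 2, 3, 3]
```

Because these probabilities are powers of two, the average code length (0.5·1 + 0.25·2 + 2·0.125·3 = 1.75 bit/symbol) equals the source entropy, i.e. Huffman coding is optimal here without any loss.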

===Exercises and multimedia===

{{BlaueBox|TEXT=
In addition to these theory pages,  we also offer exercises and multimedia modules on this topic,  which could help to clarify the teaching material:

$(1)$    [https://en.lntwww.de/Category:Information_Theory:_Exercises $\text{Exercises}$]

$(2)$    [[LNTwww:Learning_videos_to_"Information_Theory"|$\text{Learning videos}$]]

$(3)$    [[LNTwww:Applets_to_"Information_Theory"|$\text{Applets}$]] }}


===Further links===

{{BlaueBox|TEXT=
$(4)$    [[LNTwww:Bibliography_to_"Information_Theory"|$\text{Bibliography}$]]

$(5)$    [[LNTwww:Imprint_for_the_book_"Information_Theory"|$\text{Imprint}$]]}}

<br><br>
  
  
__NOTOC__
 
__NOEDITSECTION__

Latest revision as of 17:50, 31 December 2023
