Information Theory
Since the early beginnings of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of the information (in general: "the knowledge of something") contained in a message (understood here as "a collection of symbols and/or states").
The (abstract) information is communicated by the (concrete) message and can be seen as an interpretation of the message.
In 1948, Claude Elwood Shannon succeeded in establishing a consistent theory of the information content of messages, which was revolutionary at the time and created a new, still highly topical field of science: the theory named after him, Shannon's information theory.
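As a brief preview of such a quantitative measure (treated in detail in the first main chapter), the following sketch states Shannon's entropy of a discrete memoryless source with symbol probabilities <math>p_i</math>; the binary example values are generic illustrations and not taken from this page:

:<math>H = -\sum_{i=1}^{M} p_i \cdot \log_2 p_i \hspace{0.5cm} \text{(in bit/symbol)}.</math>

For a binary source with <math>p_1 = p_2 = 0.5</math> this gives <math>H = 1</math> bit/symbol, whereas for <math>p_1 = 0.9</math>, <math>p_2 = 0.1</math> it gives only <math>H \approx 0.469</math> bit/symbol.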
The subject matter corresponds to a lecture with two semester hours per week (SWS) plus one additional SWS of exercises.
Below is a table of contents based on the four main chapters, which comprise a total of 13 individual chapters.
Contents
In addition to these theory pages, we also offer exercises and multimedia modules that could help to clarify the teaching material:
More links:
(1) [[LNTwww:Bibliography_to_Information_Theory|Bibliography to the book]]
(2) General notes about the book (authors, other participants, materials used as a starting point for the book, list of sources)