Information Theory
Revision as of 14:46, 14 October 2021
Since the early beginnings of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of the
- information (in general: "the knowledge about something")
- contained in a message (here understood as "a collection of symbols and/or states").
The (abstract) information is communicated by the (concrete) message and can be seen as an interpretation of the message.
In 1948, Claude Elwood Shannon succeeded in establishing a consistent theory of the information content of messages, which was revolutionary in its time and created a new, still highly topical field of science: the theory named after him, Shannon's information theory.
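The core of Shannon's quantitative measure is the entropy H = −Σ pᵢ·log₂(pᵢ), the average information content of a source in bits per symbol. As a minimal sketch (the function name and the example distributions are ours, for illustration only, not part of the book):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol.

    probs: the symbol probabilities of a discrete memoryless source.
    Terms with p = 0 are skipped, following the convention 0*log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source (e.g. an unbiased coin) carries exactly 1 bit per symbol.
fair = entropy([0.5, 0.5])        # -> 1.0

# A certain event carries no information at all.
certain = entropy([1.0])          # -> 0.0

# A biased source carries less than 1 bit per symbol.
biased = entropy([0.9, 0.1])      # -> about 0.469
```

Entropy is maximal for equally probable symbols and drops toward zero as the source becomes more predictable, which is exactly the sense in which it measures "knowledge about something".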
The subject matter corresponds to a lecture of two semester hours per week (SWS), plus one additional SWS of exercises.
Below is a table of contents following the four main chapters, comprising a total of 13 individual chapters.
Contents
In addition to these theory pages, we also offer exercises and multimedia modules that can help to clarify the teaching material:
Other links:
(1) [[LNTwww:Bibliography_to_"Information_Theory"|Bibliography to the book]]
(2) [[LNTwww:General_notes_about_Information_Theory|General notes about the book]] (authors, other participants, materials as a starting point for the book, list of sources)