
Entropy and Information Theory

(Author)

Book | Hardcover
332 pages
1990
Springer-Verlag New York Inc.
978-0-387-97371-5 (ISBN)
117.65 incl. VAT
  • Title to appear in a new edition
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities, such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
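As a brief illustration of two of the quantities the blurb mentions, here is a minimal sketch (not from the book itself) of Shannon entropy and relative entropy for finite probability distributions, computed in bits:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 are skipped, following the convention 0*log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Relative entropy (discrimination, KL divergence) D(p || q) in bits.

    Assumes p is absolutely continuous with respect to q, i.e. q_i > 0
    wherever p_i > 0; otherwise D(p || q) is infinite.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]     # fair coin: one bit of entropy per toss
biased = [0.9, 0.1]   # biased coin: less uncertain, lower entropy

print(entropy(fair))                     # 1.0
print(entropy(biased))                   # about 0.469
print(relative_entropy(biased, fair))    # about 0.531, always >= 0
```

The entropy rate discussed in the book generalizes the per-symbol entropy above from independent tosses to stationary processes, as the limit of normalized block entropies.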
Place of publication: New York, NY
Language: English
Dimensions: 159 x 241 mm
Weight: 680 g
Subject areas: Computer Science, Theory / Studies, Artificial Intelligence / Robotics
Natural Sciences, Physics / Astronomy, Thermodynamics
ISBN-10 0-387-97371-0 / 0387973710
ISBN-13 978-0-387-97371-5 / 9780387973715
Condition: New