
Coding Theorems of Information Theory

J. Wolfowitz (Author)

Book | Softcover
XII, 176 pages
2011 | Softcover reprint of the original 3rd edition, 1978
Springer Berlin (Publisher)
978-3-642-66824-1 (ISBN)
106.99 incl. VAT
The objective of the present edition of this monograph is the same as that of earlier editions, namely, to provide readers with some mathematical maturity a rigorous and modern introduction to the ideas and principal theorems of probabilistic information theory. It is not necessary that readers have any prior knowledge whatever of information theory. The rapid development of the subject has had the consequence that any one book can now cover only a fraction of the literature. The latter is often written by engineers for engineers, and the mathematical reader may have some difficulty with it. The mathematician who understands the content and methods of this monograph should be able to read the literature and start on research of his own in a subject of mathematical beauty and interest.

The present edition differs from the second in the following: Chapter 6 has been completely replaced by one on arbitrarily varying channels. Chapter 7 has been greatly enlarged. Chapter 8 on semi-continuous channels has been drastically shortened, and Chapter 11 on sequential decoding completely removed. The new Chapters 11-15 consist entirely of material which has been developed only in the last few years. The topics discussed are rate distortion, source coding, multiple access channels, and degraded broadcast channels. Even the specialist will find a new approach in the treatment of these subjects. Many of the proofs are new, more perspicuous, and considerably shorter than the original ones.

1. Heuristic Introduction to the Discrete Memoryless Channel
2. Combinatorial Preliminaries
   2.1. Generated sequences
   2.2. Properties of the entropy function
   Remarks
3. The Discrete Memoryless Channel
   3.1. Description of the channel
   3.2. A coding theorem
   3.3. The strong converse
   3.4. Strong converse for the binary symmetric channel
   3.5. The finite-state channel with state calculable by both sender and receiver
   3.6. The finite-state channel with state calculable only by the sender
   Remarks
4. Compound Channels
   4.1. Introduction
   4.2. The canonical channel
   4.3. A coding theorem
   4.4. Strong converse
   4.5. Compound d.m.c. with c.p.f. known only to the receiver or only to the sender
   4.6. Channels where the c.p.f. for each letter is stochastically determined
   4.7. Proof of Theorem 4.6.4
   4.8. The d.m.c. with feedback
   Remarks
5. The Discrete Finite-Memory Channel
   5.1. The discrete channel
   5.2. The discrete finite-memory channel
   5.3. The coding theorem for the d.f.m.c.
   5.4. Strong converse of the coding theorem for the d.f.m.c.
   5.5. Rapidity of approach to C in the d.f.m.c.
   5.6. Discussion of the d.f.m.c.
   Remarks
6. Channels with Arbitrarily Varying Channel Probability Functions
   6.1. Introduction
   6.2. Necessary and sufficient conditions for a positive rate of transmission
   6.3. Remarks on the capacity of an arbitrarily varying channel
   6.4. The capacity C of an arbitrarily varying channel when b = 2
   6.5. Certain results for the general arbitrarily varying channel
   Remarks
7. General Discrete Channels
   7.1. Alternative description of the general discrete channel
   7.2. The method of maximal codes
   7.3. The method of random codes
   7.4. Weak converses
   7.5. Digression on the d.m.c.
   7.6. Discussion of the foregoing
   7.7. Channels without a capacity
   7.8. Strong converses
   7.9. The strong converse for the d.m.c. revisited
   Remarks
8. The Semi-Continuous Memoryless Channel
   8.1. Introduction
   8.2. A coding theorem and its strong converse
9. Continuous Channels with Additive Gaussian Noise
   9.1. A continuous memoryless channel with additive Gaussian noise
   9.2. Message sequences within a suitable sphere
   9.3. Message sequences on the periphery of the sphere or within a shell adjacent to the boundary
   9.4. Another proof of Theorems 9.2.1 and 9.2.2
   Remarks
10. Mathematical Miscellanea
   10.1. Introduction
   10.2. The asymptotic equipartition property
   10.3. Admissibility of an ergodic input for a discrete finite-memory channel
11. Fundamentals of Rate Distortion Theory
   11.1. Introduction
   11.2. The approximation theorem
   11.3. Converse of the approximation theorem
   11.4. Summary of the previous results
   11.5. The rate distortion function when side information is available
   Remarks
12. Source Coding
   12.1. Separate coding to span the product of two spaces
   12.2. Source coding with side information at the decoder
   12.3. Encoding assisted by a common channel
   Remarks
13. Source Coding and Rate Distortion
   13.1. The problem of Section 12.3 for rate distortion
   13.2. The rate distortion function for source coding with side information at the decoder
14. Multiple Access Channels
   14.1. Description of the problem
   14.2. A coding theorem
   14.3. Converse of the coding theorem
   14.4. Miscellaneous remarks
15. Degraded Broadcast Channels
   15.1. Formulation of the problem
   15.2. A coding theorem
   15.3. Beginning of the proof of the strong converse
   15.4. Proof of the weak converse
   15.5. Miscellaneous remarks
References
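As a small illustration of the kind of quantity these chapters study (a sketch of our own, not taken from the book): for the binary symmetric channel of Section 3.4 with crossover probability p, the capacity is C = 1 - H(p), where H is the binary entropy function of Section 2.2. A minimal Python sketch, with function names chosen here purely for illustration:

```python
import math


def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)


def bsc_capacity(crossover: float) -> float:
    """Capacity C = 1 - H(p) of the binary symmetric channel,
    in bits per channel use."""
    return 1.0 - binary_entropy(crossover)


if __name__ == "__main__":
    for p in (0.0, 0.05, 0.11, 0.5):
        print(f"crossover p = {p:.2f}  ->  capacity C = {bsc_capacity(p):.4f} bits/use")
```

For p = 0.11, for example, C is approximately 0.5 bit per channel use; constants of this kind are what the coding theorems and strong converses listed above make precise.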

Publication date (per publisher) 15.11.2011
Series Ergebnisse der Mathematik und ihrer Grenzgebiete. 2. Folge
Additional information XII, 176 p.
Place of publication Berlin
Language English
Dimensions 170 x 244 mm
Weight 341 g
Subject area Mathematics / Computer Science > Mathematics > Probability / Combinatorics
Keywords Code • Communication theory • Information • Informationstheorie • Information Theory • Proof • Theorem • Theorems
ISBN-10 3-642-66824-0 / 3642668240
ISBN-13 978-3-642-66824-1 / 9783642668241
Condition New