Probabilistic Methods and Distributed Information (eBook)

Rudolf Ahlswede’s Lectures on Information Theory 5
eBook Download: PDF
2018 | 1st ed. 2019
XVIII, 581 pages
Springer International Publishing (publisher)
978-3-030-00312-8 (ISBN)

€96.29 incl. VAT
  • Download available immediately

The fifth volume of Rudolf Ahlswede's lectures on Information Theory focuses on several problems that were at the heart of much of his research. One of the highlights of the entire lecture note series is surely Part I of this volume, on arbitrarily varying channels (AVC), a subject in which Ahlswede was probably the world's leading expert. Appended to Part I is a survey by Holger Boche and Ahmed Mansour on recent results concerning AVC and arbitrarily varying wiretap channels (AVWC). After a short Part II on continuous data compression, Part III, the longest part of the book, is devoted to distributed information. This part discusses a variety of related topics; among them, two are famously associated with Ahlswede: 'multiple descriptions', on which he produced some of the best research worldwide, and 'network coding', whose pioneering paper he co-authored. The final Part IV, on 'Statistical Inference Under Communication Constraints', is mainly based on Ahlswede's joint paper with Imre Csiszár, which received the Best Paper Award of the IEEE Information Theory Society.

The lectures presented in this work, which consists of 10 volumes, are suitable for graduate students in Mathematics, as well as for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used either as the basis for courses or to supplement them in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions that form the basis of entire research programs.

Rudolf Ahlswede (1938–2010) studied Mathematics in Göttingen, and held postdoc positions in Erlangen, Germany, and Ohio, USA. From 1977 on he was full Professor of Applied Mathematics at the University of Bielefeld. His work represents an essential contribution to information theory and networking. He developed and contributed to a number of central areas, including network coding and the theory of identification, while also advancing the fields of combinatorics and number theory. These efforts culminated in his research program on the “Development of a General Theory of Information Transfer”. In recognition of his work, he received several awards for “Best Paper”, as well as the distinguished “Shannon Award”.

Preface 7
Words and Introduction of the Editors 9
Contents 12
Part I Arbitrarily Varying Channels 18
1 Preliminaries 19
1.1 Introduction 19
1.2 Basic Definitions 21
1.3 The Models of Arbitrarily Varying Channels 22
1.4 AVC and Zero-Error Capacity 24
1.5 Positivity of the Capacity 26
References 30
2 Random Correlated Codes for the AVC and the Compound Channels 32
2.1 The Capacity of the Compound Channel 32
2.2 The Random Code Capacity 33
2.3 Channels Without a Strong Converse 36
References 37
3 Elimination and Robustification Techniques 38
3.1 Elimination Technique and Dichotomy Formula 38
3.2 Robustification Techniques 40
3.3 A Capacity Formula of Arbitrarily Varying Multiple-Access Channels 42
3.4 Arbitrarily Varying Channels with State Sequence Known to the Sender 46
References 51
4 Arbitrarily Varying Channels with Worst Channels 53
4.1 Arbitrarily Varying Channels with Binary Output Alphabet 53
4.2 A Channel with Additive Gaussian Noise of Arbitrarily Varying Variances 55
References 59
5 Non-standard Decoders 60
5.1 Arbitrarily Varying Channels with the Criterion of Maximum Probability of Error 60
5.2 Positivity of Arbitrarily Varying Channels with Average Error Probability Criterion 67
5.3 The Smallest List Size of Codes for an Arbitrarily Varying Channel with Average Probability of Error Criterion 76
5.4 A Channel with Additive Gaussian Noise of Arbitrarily Varying Means 84
5.5 Interior of the Achievable Regions of Arbitrarily Varying Multiple-Access Channels with Average Probability of Error Criterion 95
References 103
6 Feedback and Correlated Sources 105
6.1 Arbitrarily Varying Channels with Noiseless Feedback 105
6.2 Correlated Sources Help the Transmission Over Arbitrarily Varying Channels 126
6.3 Arbitrarily Varying Multiple-Access Channels with Correlated Sender's Side Information or Correlated Messages 129
References 142
7 Arbitrarily Varying Source 143
7.1 Single-User Arbitrarily Varying Source 143
7.2 Multi-user Arbitrarily Varying Sources and Coloring Hypergraphs 145
References 157
8 Applications and Related Problems 159
8.1 Coding for Channels with Localized Errors and Arbitrarily Varying Channels 159
8.2 Application to Writing-Type Memories and OV-Channels 177
References 185
9 Appendix to Part I: The AVC and AVWC 187
9.1 Channel Models 187
9.1.1 Code Concepts 189
9.1.2 Capacity Results 195
9.1.3 Motivation 197
9.2 Basic Tools and Main Properties 198
9.2.1 Basic Tools 198
9.2.2 Analytical Properties of the Correlated Random Capacities 201
9.2.3 Discontinuity Behaviour Under List Decoding 202
9.2.4 Additivity and Super-Activation Under List Decoding 204
9.3 Further Applications and Open Problems 206
9.3.1 ε-Capacity 206
9.3.2 Secure Identification 207
9.3.3 Correlated Jamming 209
References 210
Part II Continuous Data Compression 212
10 Ergodic Theory and Encoding of Individual Sequences 213
10.1 Introduction 213
10.2 Formal Statement of the Problem and Results 213
10.3 Proof of Theorem 10.1 217
10.4 Proof of Theorem 10.2 (Converse Part) 226
References 229
11 The Slepian-Wolf Theorem for Individual Sequences 231
11.1 Introduction 231
11.2 Formal Statement of the Problem and the Main Result 232
11.3 Auxiliary Results 235
11.4 Proof of the Direct Part 238
11.5 Proof of the Converse Part 242
References 244
Part III Distributed Information 246
12 A Wringing Method: An Elementary Proof of the Strong Converse Theorem for Multiple-Access Channels 249
12.1 Introduction 249
12.2 The Strong Converse Theorem for the MAC 249
12.3 The Packing Lemma and a Bound on Codes for the MAC 251
12.4 Wringing Techniques 257
12.5 Proof of Theorem 12.1 263
References 265
13 Extremal Properties of Rate-Distortion Functions 266
13.1 Basic Concepts and Auxiliary Results 266
13.2 The Key Ideas and a Basic Inequality 269
13.3 Schur-Concavity in the Hamming Case 271
13.4 The Counterexample 275
13.5 A Consequence for Error Exponents 278
References 279
14 Multiple Descriptions 280
14.1 Introduction 280
14.2 Definitions and Formulation of Basic Results 281
14.3 Preliminaries 283
14.4 Wringing Techniques 287
14.5 Proof of Theorem 14.1 290
14.6 Witsenhausen's Hyperbola Conjecture for a Binary Source 293
14.7 A Zero-Distortion Problem 295
14.8 On Team Guessing 295
14.9 Proof of Theorem 14.2 299
14.10 Proof of Theorem 14.3 300
14.11 Continuity Properties 303
14.12 A Missing Step in Work on Breakdown Degradation 306
References 307
15 Distributive Information Storage 308
15.1 Introduction 308
15.2 Single-User Distributed Storage 310
15.3 Multiple Users with Common Information 319
15.4 Storage of Independent Information 324
15.5 Floating Parity for Disk Arrays 329
15.6 The Use of One Bit of Memory to Store Information About Bernoulli Sequence 333
References 339
16 Network Coding 340
16.1 The Network Coding Homepage 340
16.2 Information Flow in Networks 342
16.3 A Missed Theory and Possible Implications … 355
References 363
17 Random Network Coding 365
17.1 The Benefits of Coding Over Routing in a Randomized Setting 365
17.2 Another Look on the Subject: Practical Network Coding 374
17.3 Further Progress on Random Linear Network Coding 376
17.4 Error Correction Capability of Random Network Error Correction Codes 378
References 389
18 On Perfect Codes and Related Concepts 390
18.1 Introduction 390
18.2 A Local Inequality 395
18.3 Examples of D-Diameter Perfect Codes in J(n,k) 399
18.4 Examples of D-Diameter Perfect Codes in ℋ_q(n) 401
18.5 Tiling in J(n,k) with Caps 403
18.6 Open Problems 406
References 406
19 On Error Control Codes for Random Network Coding 408
19.1 Introduction 408
19.2 Bounds on the Size of Codes 410
19.3 Insertion/Deletion Channel 413
19.4 Code Construction 415
References 416
20 Classical Work: Edge-Disjoint Branchings, Min-Max Theorems, and Shortest Connection Networks 418
20.1 Edge-Disjoint Branchings 418
20.2 On Two Minimax Theorems in a Graph 421
20.3 On the Shortest Spanning Subtree of a Graph and the Traveling Salesman Problem 427
20.4 Shortest Connection Networks and Some Generalizations 429
References 437
21 On the Advantage of Network Coding 439
21.1 Introduction 440
21.2 Problems in the Multicast Networks 443
21.3 Network Switching for Multisource Multicast Networks with Links … 448
21.4 Multicast Network Switching Formulated as Matrix Game 455
21.5 Computation of Maximum Achievable Information Rate for Single-Source Multicast Network Switching 470
21.6 Computation of Achievable Information Rate Region for Multisource Multicast Network Switching 482
21.7 Maximum Information Flow with Network Switching Versus Maximum Information Flow with Network Coding 494
21.8 Achievable Information Rate Regions for a Class of Multisource Multicast Networks with Network Switching and Network Coding 504
21.9 Conclusion 508
References 509
Part IV Statistical Inference Under Communication Constraints 511
22 Hypothesis Testing Under Communication Constraints 512
22.1 Introduction 512
22.2 Statement and Discussion of Results 514
22.3 Lower Bound to θ(R) 520
22.4 Independence of ε of the Exponent 527
22.4.1 Blowing Up Lemma 527
22.5 Identification in a Large Population 532
References 534
23 Estimation Under Communication Constraints 536
23.1 Introduction 536
23.2 A Model for Parameter Estimation in the Presence of Side Information 538
23.3 On Fisher Information, Mutual Information, and the J Function 539
23.4 The Informational Inequality 542
23.5 Encoding the Side Information 547
23.6 Regularity Conditions for Achievability of the Informational Bound 554
23.7 Asymptotic Achievability of the Informational Bound in Case of a Finite 𝒳 560
23.8 Does J Single-Letterize in the Symmetric Bernoulli Case? 563
References 565
Supplement 567
Author Index 574
Subject Index 577

Publication date (per publisher) 31.12.2018
Series Foundations in Signal Processing, Communications and Networking
Co-authors Vladimir Blinovsky, Holger Boche, Ulrich Krengel, Ahmed Mansour
Additional info XVIII, 581 p. 36 illus.
Place of publication Cham
Language English
Subject areas Mathematics / Computer Science: Mathematics; Technology: Communications Engineering
Keywords 94Axx, 94A15, 94A34, 68M10 • Data Compression • distributed information • Information and Communication, Circuits • Information Theory • multiple access channel • multiple descriptions • network coding • statistical inference
ISBN-10 3-030-00312-4 / 3030003124
ISBN-13 978-3-030-00312-8 / 9783030003128
PDF (watermarked)
Size: 8.0 MB

DRM: Digital watermark
This eBook contains a digital watermark, which personalizes it to you. If the eBook is improperly passed on to third parties, it can be traced back to the source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but is only of limited use on small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. It is, however, not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: Online reading
In addition to the download, you can also read this eBook online in a web browser.

Buying eBooks from abroad
For tax law reasons we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
