Information Theoretic Principles for Agent Learning - Jerry D. Gibson


Book | Hardcover
IX, 95 pages
2024 | 2025
Springer International Publishing (publisher)
978-3-031-65387-2 (ISBN)
53.49 incl. VAT

This book introduces the fundamentals of information theoretic techniques for statistical data science analyses and for characterizing the behavior and performance of a learning agent, going beyond the standard results on the fundamental limits of communications and compression. Readers will benefit from the presentation of information theoretic quantities, definitions, and results that provide, or could provide, insights into data science and learning.

Jerry D. Gibson is Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He has been an Associate Editor of the IEEE Transactions on Communications and the IEEE Transactions on Information Theory, and he was an IEEE Communications Society Distinguished Lecturer for 2007-2008. He is an IEEE Fellow, and he has received the Frederick Emmons Terman Award (1990), the 1993 IEEE Signal Processing Society Senior Paper Award, the 2009 IEEE Technical Committee on Wireless Communications Recognition Award, and the 2010 Best Paper Award from the IEEE Transactions on Multimedia. He is the author, coauthor, and editor of several books, the most recent of which are The Mobile Communications Handbook (Editor, 3rd ed., 2012), Rate Distortion Bounds for Voice and Video (Coauthor with Jing Hu, NOW Publishers, 2014), and Information Theory and Rate Distortion Theory for Communications and Compression (Morgan & Claypool, 2014). His research interests are lossy source coding, wireless communications and networks, and digital signal processing.

Contents:
- Background and Overview
- Entropy and Mutual Information
- Differential Entropy, Entropy Rate, and Maximum Entropy
- Typical Sequences and the AEP
- Markov Chains and Cascaded Systems
- Hypothesis Testing, Estimation, Information, and Sufficient Statistics
- Information Theoretic Quantities and Learning
- Estimation and Entropy Power
- Time Series Analyses
- Information Bottleneck Principle
- Channel Capacity
- Rate Distortion Theory
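To give a flavor of the book's opening topics, here is a minimal sketch (not taken from the book) of entropy and mutual information for discrete random variables, using a hypothetical joint distribution over two binary variables:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]                  # marginal p(x)
py = [sum(col) for col in zip(*joint)]            # marginal p(y)
hxy = entropy([p for row in joint for p in row])  # joint entropy H(X, Y)

# Mutual information via the identity I(X; Y) = H(X) + H(Y) - H(X, Y)
mi = entropy(px) + entropy(py) - hxy
print(round(mi, 4))  # prints 0.2781
```

Here the marginals are uniform, so H(X) = H(Y) = 1 bit, and the dependence between X and Y contributes about 0.28 bits of mutual information.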

Publication date
Series: Synthesis Lectures on Engineering, Science, and Technology
Additional information: IX, 95 p., 11 illus., 2 illus. in color
Place of publication: Cham
Language: English
Dimensions: 168 x 240 mm
Subject areas: Technology › Electrical Engineering / Energy Technology; Technology › Communications Engineering
Keywords: Asymptotic equipartition • Deep learning • Entropy and mutual information for discrete random variables • Information Theory • learning agent
ISBN-10 3-031-65387-4 / 3031653874
ISBN-13 978-3-031-65387-2 / 9783031653872
Condition: New