Probability and Information
An Integrated Approach
2008 | 2nd Revised edition
Cambridge University Press (Publisher)
978-0-521-89904-8 (ISBN)
An update of this popular introduction to probability theory and information theory with new material on Markov chains.
This updated textbook is an excellent way to introduce probability and information theory to new students in mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it begins by building a clear and systematic foundation for the subject: the concept of probability is given particular attention via a simplified discussion of measures on Boolean algebras. The theoretical ideas are then applied to practical areas such as statistical inference, random walks, statistical mechanics and communications modelling. Topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem and the coding and transmission of information; new to this edition is material on Markov chains and their entropy. Numerous examples and exercises illustrate how to use the theory in a wide range of applications, with detailed solutions to most exercises available online for instructors.
David Applebaum is a Professor in the Department of Probability and Statistics at the University of Sheffield.
Preface to the first edition; Preface to the second edition; 1. Introduction; 2. Combinatorics; 3. Sets and measures; 4. Probability; 5. Discrete random variables; 6. Information and entropy; 7. Communication; 8. Random variables with probability density functions; 9. Random vectors; 10. Markov chains and their entropy; Exploring further; Appendix 1. Proof by mathematical induction; Appendix 2. Lagrange multipliers; Appendix 3. Integration of exp(−½x²); Appendix 4. Table of probabilities associated with the standard normal distribution; Appendix 5. A rapid review of matrix algebra; Selected solutions; Index.
Publication date (per publisher) | 14.8.2008 |
---|---|
Additional information | Worked examples or exercises; 3 tables, unspecified; 65 line drawings, unspecified |
Place of publication | Cambridge |
Language | English |
Dimensions | 177 x 252 mm |
Weight | 730 g |
Subject area | Mathematics / Computer Science ► Mathematics ► Probability / Combinatorics |
ISBN-10 | 0-521-89904-4 / 0521899044 |
ISBN-13 | 978-0-521-89904-8 / 9780521899048 |
Condition | New |