Algebraic Geometry and Statistical Learning Theory
2009
Cambridge University Press (publisher)
978-0-521-86467-1 (ISBN)
Sure to be influential, this book lays the foundations for the use of algebraic geometry in statistical learning theory. Many widely used statistical models and learning machines applied to information science have a parameter space that is singular: mixture models, neural networks, HMMs, Bayesian networks, and stochastic context-free grammars are major examples. Algebraic geometry and singularity theory provide the necessary tools for studying such non-smooth models. Four main formulas are established: 1. the log likelihood function can be given a common standard form using resolution of singularities, even applied to more complex models; 2. the asymptotic behaviour of the marginal likelihood or 'the evidence' is derived based on zeta function theory; 3. new methods are derived to estimate the generalization errors in Bayes and Gibbs estimations from training errors; 4. the generalization errors of maximum likelihood and a posteriori methods are clarified by empirical process theory on algebraic varieties.
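To give a sense of the results listed above (item 2 in particular), here is a brief sketch in standard singular-learning-theory notation; the symbols F_n, S_n, λ and m are the usual ones in this field and are not quoted verbatim from the book.

```latex
% A sketch, not quoted verbatim from the book: the asymptotic expansion
% behind item 2 of the description. For n i.i.d. samples X_1,...,X_n drawn
% from a true density q(x), a model p(x|w) and a prior \varphi(w), the
% marginal likelihood (evidence) is
\[
  Z_n = \int \prod_{i=1}^{n} p(X_i \mid w)\, \varphi(w)\, dw ,
\]
% and its negative logarithm, the stochastic complexity, satisfies
\[
  F_n = -\log Z_n = n S_n + \lambda \log n - (m-1)\log\log n + O_p(1),
\]
% where S_n = -\frac{1}{n}\sum_{i=1}^{n}\log q(X_i) is the empirical entropy,
% \lambda is the learning coefficient (a real log canonical threshold obtained
% via resolution of singularities and the associated zeta function), and m is
% its multiplicity. For a regular model with d parameters, \lambda = d/2 and
% m = 1, recovering the familiar BIC penalty (d/2)\log n.
```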
Sumio Watanabe is a Professor in the Precision and Intelligence Laboratory at the Tokyo Institute of Technology.
Preface; 1. Introduction; 2. Singularity theory; 3. Algebraic geometry; 4. Zeta functions and singular integral; 5. Empirical processes; 6. Singular learning theory; 7. Singular learning machines; 8. Singular information science; Bibliography; Index.
| Publication date (per publisher) | 13 August 2009 |
|---|---|
| Series | Cambridge Monographs on Applied and Computational Mathematics |
| Additional information | Worked examples or exercises; 13 halftones, unspecified |
| Place of publication | Cambridge |
| Language | English |
| Dimensions | 155 x 233 mm |
| Weight | 560 g |
| Subject areas | Computer Science ► Theory / Study ► Artificial Intelligence / Robotics; Mathematics / Computer Science ► Mathematics ► Geometry / Topology |
| ISBN-10 | 0-521-86467-4 / 0521864674 |
| ISBN-13 | 978-0-521-86467-1 / 9780521864671 |
| Condition | New |