Learning Theory
Springer Berlin (publisher)
978-3-540-26556-6 (ISBN)
Contents

Learning to Rank
- Ranking and Scoring Using Empirical Risk Minimization
- Learnability of Bipartite Ranking Functions
- Stability and Generalization of Bipartite Ranking Algorithms
- Loss Bounds for Online Category Ranking

Boosting
- Margin-Based Ranking Meets Boosting in the Middle
- Martingale Boosting
- The Value of Agreement, a New Boosting Algorithm

Unlabeled Data, Multiclass Classification
- A PAC-Style Model for Learning from Labeled and Unlabeled Data
- Generalization Error Bounds Using Unlabeled Data
- On the Consistency of Multiclass Classification Methods
- Sensitive Error Correcting Output Codes

Online Learning I
- Data Dependent Concentration Bounds for Sequential Prediction Algorithms
- The Weak Aggregating Algorithm and Weak Mixability
- Tracking the Best of Many Experts
- Improved Second-Order Bounds for Prediction with Expert Advice

Online Learning II
- Competitive Collaborative Learning
- Analysis of Perceptron-Based Active Learning
- A New Perspective on an Old Perceptron Algorithm

Support Vector Machines
- Fast Rates for Support Vector Machines
- Exponential Convergence Rates in Classification
- General Polynomial Time Decomposition Algorithms

Kernels and Embeddings
- Approximating a Gram Matrix for Improved Kernel-Based Learning
- Learning Convex Combinations of Continuously Parameterized Basic Kernels
- On the Limitations of Embedding Methods
- Leaving the Span

Inductive Inference
- Variations on U-Shaped Learning
- Mind Change Efficient Learning
- On a Syntactic Characterization of Classification with a Mind Change Bound

Unsupervised Learning
- Ellipsoid Approximation Using Random Vectors
- The Spectral Method for General Mixture Models
- On Spectral Learning of Mixtures of Distributions
- From Graphs to Manifolds - Weak and Strong Pointwise Consistency of Graph Laplacians
- Towards a Theoretical Foundation for Laplacian-Based Manifold Methods

Generalization Bounds
- Permutation Tests for Classification
- Localized Upper and Lower Bounds for Some Estimation Problems
- Improved Minimax Bounds on the Test and Training Distortion of Empirically Designed Vector Quantizers
- Rank, Trace-Norm and Max-Norm

Query Learning, Attribute Efficiency, Compression Schemes
- Learning a Hidden Hypergraph
- On Attribute Efficient and Non-adaptive Learning of Parities and DNF Expressions
- Unlabeled Compression Schemes for Maximum Classes

Economics and Game Theory
- Trading in Markovian Price Models
- From External to Internal Regret

Separation Results for Learning Models
- Separating Models of Learning from Correlated and Uncorrelated Data
- Asymptotic Log-Loss of Prequential Maximum Likelihood Codes
- Teaching Classes with High Teaching Dimension Using Few Examples

Open Problems
- Optimum Follow the Leader Algorithm
- The Cross Validation Problem
- Compute Inclusion Depth of a Pattern
| Publication date (per publisher) | 20 June 2005 |
|---|---|
| Series | Lecture Notes in Artificial Intelligence · Lecture Notes in Computer Science |
| Additional info | XII, 692 p. |
| Place of publication | Berlin |
| Language | English |
| Dimensions | 155 x 235 mm |
| Weight | 984 g |
| Subject area | Computer Science ► Theory / Study ► Algorithms · Computer Science ► Theory / Study ► Artificial Intelligence / Robotics |
| Keywords | Algorithm analysis and problem complexity • Algorithmic Learning • Boosting • classification • Computational Learning • Decision Theory • Game Theory • Inductive Inference • Kernel Methods • learning • Learning theory • machine learning • Online Learning • Statistical Learning • supervised learning • Support Vector Machine • Unsupervised Learning |
| ISBN-10 | 3-540-26556-2 / 3540265562 |
| ISBN-13 | 978-3-540-26556-6 / 9783540265566 |
| Condition | New |