Computational Learning Theory

Third European Conference, EuroCOLT '97, Jerusalem, Israel, March 17 - 19, 1997, Proceedings

Shai Ben-David (Editor)

Book | Softcover
CCCXLVIII, 338 pages
1997
Springer Berlin (Publisher)
978-3-540-62685-5 (ISBN)


53.49 incl. VAT
This book constitutes the refereed proceedings of the Third European Conference on Computational Learning Theory, EuroCOLT'97, held in Jerusalem, Israel, in March 1997.
The book presents 25 revised full papers carefully selected from a total of 36 high-quality submissions. The volume spans the whole spectrum of computational learning theory, with a certain emphasis on mathematical models of machine learning. Among the topics addressed are machine learning, neural nets, statistics, inductive inference, computational complexity, information theory, and theoretical physics.

Contents:
- Sample compression, learnability, and the Vapnik-Chervonenkis dimension
- Learning boxes in high dimension
- Learning monotone term decision lists
- Learning matrix functions over rings
- Learning from incomplete boundary queries using split graphs and hypergraphs
- Generalization of the PAC-model for learning with partial information
- Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability
- Closedness properties in team learning of recursive functions
- Structural measures for games and process control in the branch learning model
- Learning under persistent drift
- Randomized hypotheses and minimum disagreement hypotheses for learning with noise
- Learning when to trust which experts
- On learning branching programs and small depth circuits
- Learning nearly monotone k-term DNF
- Optimal attribute-efficient learning of disjunction, parity, and threshold functions
- Learning pattern languages using queries
- On fast and simple algorithms for finding maximal subarrays and applications in learning theory
- A minimax lower bound for empirical quantizer design
- Vapnik-Chervonenkis dimension of recurrent neural networks
- Linear algebraic proofs of VC-dimension based inequalities
- A result relating convex n-widths to covering numbers with some applications to neural networks
- Confidence estimates of classification accuracy on new examples
- Learning formulae from elementary facts
- Control structures in hypothesis spaces: The influence on learning
- Ordinal mind change complexity of language identification
- Robust learning with infinite additional information

Publication date (per publisher) 3.3.1997
Series Lecture Notes in Artificial Intelligence
Lecture Notes in Computer Science
Additional info CCCXLVIII, 338 p.
Place of publication Berlin
Language English
Dimensions 155 x 235 mm
Weight 450 g
Subject areas Computer Science • Software Development • User Interfaces (HCI)
Computer Science • Theory / Studies • Artificial Intelligence / Robotics
Keywords algorithm • algorithmic learning • algorithmic complexity • algorithms • classification • complexity • computational complexity • inductive inference • information theory • artificial intelligence • language learning • learning theory • machine learning • neural networks • language acquisition
ISBN-10 3-540-62685-9 / 3540626859
ISBN-13 978-3-540-62685-5 / 9783540626855
Condition New