Learning and Generalisation
With Applications to Neural Networks
2nd ed. 2002
Springer London Ltd (publisher)
978-1-85233-373-7 (ISBN)
Learning and Generalisation provides a formal mathematical theory addressing intuitive questions of the type:
• How does a machine learn a concept on the basis of examples?
• How can a neural network, after training, correctly predict the outcome of a previously unseen input?
• How much training is required to achieve a given level of accuracy in the prediction?
• How can one identify the dynamical behaviour of a nonlinear control system by observing its input-output behaviour over a finite time?
The second edition covers new areas including:
• support vector machines;
• fat-shattering dimensions and applications to neural network learning;
• learning with dependent samples generated by a beta-mixing process;
• connections between system identification and learning theory;
• probabilistic solution of 'intractable problems' in robust control and matrix theory using randomized algorithms.
It also contains solutions to some of the open problems posed in the first edition, while adding new open problems.
1. Introduction.- 2. Preliminaries.- 3. Problem Formulations.- 4. Vapnik-Chervonenkis, Pseudo- and Fat-Shattering Dimensions.- 5. Uniform Convergence of Empirical Means.- 6. Learning Under a Fixed Probability Measure.- 7. Distribution-Free Learning.- 8. Learning Under an Intermediate Family of Probabilities.- 9. Alternate Models of Learning.- 10. Applications to Neural Networks.- 11. Applications to Control Systems.- 12. Some Open Problems.
Publication date | 27.9.2002 |
---|---|
Series | Communications and Control Engineering |
Additional information | XXI, 488 p. |
Place of publication | England |
Language | English |
Dimensions | 155 x 235 mm |
Subject area | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| Engineering ► Electrical Engineering / Energy Technology |
| Engineering ► Mechanical Engineering |
ISBN-10 | 1-85233-373-1 / 1852333731 |
ISBN-13 | 978-1-85233-373-7 / 9781852333737 |
Condition | New |