Ensemble Methods
Foundations and Algorithms
2nd edition, 2025
Chapman & Hall/CRC (publisher)
ISBN 978-1-032-96060-9
- Not yet published (expected February 2025)
Ensemble methods, which train multiple learners and then combine them, with Boosting and Bagging as representatives, are well-known machine learning approaches. It is well recognized that an ensemble is usually significantly more accurate than a single learner, and ensemble methods have achieved great success in many real-world tasks.
Twelve years have passed since the publication of the first edition of the book in 2012 (Japanese and Chinese versions published in 2017 and 2020, respectively), and many significant advances have been made in this field. First, many theoretical issues have been tackled; for example, the fundamental question of why AdaBoost seems resistant to overfitting has been addressed, so that we now understand much more about the essence of ensemble methods. Second, ensemble methods have been extended to more machine learning fields, e.g., isolation forest in anomaly detection, so that we now have powerful ensemble methods for tasks beyond conventional supervised learning. Third, ensemble mechanisms have also been found helpful in emerging areas such as deep learning and online learning.
This edition expands on the previous one with additional content reflecting the significant advances in the field, and is written in a concise but comprehensive style to remain approachable for readers new to the subject.
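As a quick illustration of the bagging idea mentioned in the blurb, here is a minimal sketch in plain Python (a hypothetical toy example, not code from the book): bootstrap-resample the training data, fit one weak learner per resample, and combine the learners by majority vote.

```python
import random

# Hypothetical toy 1-D dataset: label 1 when x is large, 0 when x is small.
data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.6, 1), (0.7, 1), (0.8, 1)]

def train_stump(sample):
    """Fit a decision stump: pick the threshold with best accuracy on the sample."""
    best_t, best_acc = sample[0][0], -1.0
    for t, _ in sample:
        acc = sum((x > t) == bool(y) for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

random.seed(0)
# Bagging: each stump is trained on its own bootstrap resample of the data.
stumps = [train_stump([random.choice(data) for _ in data]) for _ in range(25)]

def predict(x):
    """Combine the base learners by majority vote."""
    votes = sum(x > t for t in stumps)
    return int(votes > len(stumps) / 2)

print(predict(0.9), predict(0.05))  # -> 1 0
```

Each stump alone can be unstable under resampling; the majority vote smooths out that variance, which is the core motivation behind bagging.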
Zhi-Hua Zhou is Professor of Computer Science and Artificial Intelligence at Nanjing University, a trustee of IJCAI, a Fellow of the ACM, AAAI, AAAS, and IEEE, and a recipient of the IEEE Computer Society Edward J. McCluskey Technical Achievement Award and the CCF-ACM Artificial Intelligence Award.
Contents:
Preface
Notations
1. Introduction
2. Boosting
3. Bagging
4. Combination Methods
5. Diversity
6. Ensemble Pruning
7. Clustering Ensemble
8. Anomaly Detection and Isolation Forest
9. Semi-Supervised Ensemble
10. Class-Imbalance and Cost-Sensitive Ensemble
11. Deep Learning and Deep Forest
12. Advanced Topics
References
Index
Publication date (per publisher) | 27.2.2025
---|---
Series | Chapman & Hall/CRC Machine Learning & Pattern Recognition
Additional information | 4 tables, black and white; 43 line drawings, color; 27 line drawings, black and white; 43 illustrations, color; 27 illustrations, black and white
Language | English
Dimensions | 156 x 234 mm
Subject area | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics; Mathematics / Computer Science ► Mathematics; Engineering ► Electrical Engineering / Power Engineering
ISBN-10 | 1-032-96060-4 / 1032960604
ISBN-13 | 978-1-032-96060-9 / 9781032960609
Condition | New