Linear Algebra for Data Science, Machine Learning, and Signal Processing
Cambridge University Press (publisher)
978-1-009-41814-0 (ISBN)
Maximise student engagement and understanding of matrix methods in data-driven applications with this modern teaching package. Students are introduced to matrices in two preliminary chapters, before progressing to advanced topics such as the nuclear norm, proximal operators and convex optimization. Highlighted applications include low-rank approximation, matrix completion, subspace learning, logistic regression for binary classification, robust PCA, dimensionality reduction and Procrustes problems. Extensively classroom-tested, the book includes over 200 multiple-choice questions suitable for in-class interactive learning or quizzes, as well as homework exercises (with solutions available for instructors). It encourages active learning with engaging 'explore' questions, whose answers appear at the back of each chapter, and with Julia code examples that demonstrate how the mathematics is actually used in practice. A suite of computational notebooks offers a hands-on learning experience for students. This is a perfect textbook for upper-level undergraduates and first-year graduate students who have taken a prior course in linear algebra basics.
Jeffrey A. Fessler is the William L. Root Professor of EECS at the University of Michigan. He received the Edward Hoffman Medical Imaging Scientist Award in 2013, and an IEEE EMBS Technical Achievement Award in 2016. He received the 2023 Steven S. Attwood Award, the highest honor awarded to a faculty member by the College of Engineering at the University of Michigan. He is a fellow of the IEEE and of the AIMBE. Raj Rao Nadakuditi is an Associate Professor of EECS at the University of Michigan. He received the Jon R. and Beverly S. Holt Award for Excellence in Teaching in 2018 and the Ernest and Bettine Kuh Distinguished Faculty Award in 2021.
1. Getting started; 2. Introduction to matrices; 3. Matrix factorization: eigendecomposition and SVD; 4. Subspaces, rank and nearest-subspace classification; 5. Linear least-squares regression and binary classification; 6. Norms and Procrustes problems; 7. Low-rank approximation and multidimensional scaling; 8. Special matrices, Markov chains and PageRank; 9. Optimization basics and logistic regression; 10. Matrix completion and recommender systems; 11. Neural network models; 12. Random matrix theory, signal + noise matrices, and phase transitions.
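To give a flavor of the Julia-based approach the book advertises, here is a minimal, hypothetical sketch (not code from the book) of one listed topic, low-rank approximation via the truncated SVD; the function name `lowrank_approx` is an illustrative assumption:

```julia
using LinearAlgebra

# Rank-r approximation of a matrix A via the truncated SVD.
# By the Eckart-Young theorem this is the best rank-r approximation
# in both the spectral and Frobenius norms.
function lowrank_approx(A::AbstractMatrix, r::Integer)
    F = svd(A)
    return F.U[:, 1:r] * Diagonal(F.S[1:r]) * F.Vt[1:r, :]
end

A  = randn(100, 80)               # generic full-rank test matrix
Ar = lowrank_approx(A, 5)
rank(Ar)                          # 5
opnorm(A - Ar) ≈ svdvals(A)[6]    # true: the spectral-norm error is the 6th singular value
```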
Publication date | 17 May 2024
---|---
Additional information | Worked examples or Exercises
Place of publication | Cambridge
Language | English
Dimensions | 176 x 251 mm
Weight | 920 g
Subject areas | Computer Science ► Databases ► Data Warehouse / Data Mining
 | Mathematics / Computer Science ► Computer Science ► Theory / Studies
 | Mathematics / Computer Science ► Mathematics ► Algebra
 | Mathematics / Computer Science ► Mathematics ► Applied Mathematics
ISBN-10 | 1-009-41814-9 / 1009418149
ISBN-13 | 978-1-009-41814-0 / 9781009418140
Condition | New