Large-Scale Convex Optimization

Algorithms & Analyses via Monotone Operators

Ernest K. Ryu, Wotao Yin
Book | Hardcover
400 pages
2022
Cambridge University Press (publisher)
978-1-009-16085-8 (ISBN)
74.80 incl. VAT
This introduction to the theory of convex optimization algorithms presents a unified analysis of first-order optimization methods using the abstraction of monotone operators. The text empowers graduate students in mathematics, computer science, and engineering to choose and design the splitting methods best suited for a given problem.
Starting from where a first course in convex optimization leaves off, this text presents a unified analysis of first-order optimization methods – including parallel-distributed algorithms – through the abstraction of monotone operators. With the increased computational power and availability of big data over the past decade, applied disciplines have demanded that larger and larger optimization problems be solved. This text covers the first-order convex optimization methods that are uniquely effective at solving these large-scale optimization problems. Readers will have the opportunity to construct and analyze many well-known classical and modern algorithms using monotone operators, and will walk away with a solid understanding of these diverse optimization algorithms. Graduate students and researchers in mathematical optimization, operations research, electrical engineering, statistics, and computer science will appreciate this concise introduction to the theory of convex optimization algorithms.
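To give a concrete flavor of the splitting methods the book analyzes, below is a minimal, illustrative sketch of forward-backward splitting (the proximal gradient method) applied to a lasso problem. This is not code from the book; the NumPy implementation, the problem data, and all function names are assumptions made here purely for illustration.

```python
# Illustrative sketch (not from the book): forward-backward splitting,
# a basic operator-splitting method of the kind the text analyzes.
# It minimizes f(x) + g(x) with f smooth (least squares) and g = lam*||x||_1
# by alternating a gradient (forward) step on f with a proximal
# (backward, i.e. resolvent) step on g.
import numpy as np

def prox_l1(x, t):
    """Proximal operator of t*||.||_1 (soft-thresholding), the resolvent
    of t times the subdifferential of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, step, iters=500):
    """Proximal gradient method for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                   # forward (gradient) step on f
        x = prox_l1(x - step * grad, step * lam)   # backward (proximal) step on g
    return x

# Example usage with random data; step < 2 / ||A||_2^2 ensures convergence.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ (rng.standard_normal(100) * (rng.random(100) < 0.1))
x_hat = forward_backward(A, b, lam=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)
print("nonzeros:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```

The same fixed-point template, applied to other operators and resolvents, yields many of the classical and modern algorithms the book constructs, such as Douglas-Rachford splitting and ADMM.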

Ernest K. Ryu is Assistant Professor of Mathematical Sciences at Seoul National University. He previously served as Assistant Adjunct Professor in the Department of Mathematics at the University of California, Los Angeles from 2016 to 2019, before joining Seoul National University in 2020. He received a BS with distinction in physics and electrical engineering from the California Institute of Technology in 2010, and then an MS in statistics and a PhD – with the Gene Golub Best Thesis Award – in computational mathematics from Stanford University in 2016. His current research focuses on mathematical optimization and machine learning. Wotao Yin is Director of the Decision Intelligence Lab at Alibaba Group (US), Damo Academy, and a former Professor of Mathematics at the University of California, Los Angeles. He received his PhD in operations research from Columbia University in 2006. His numerous accolades include an NSF CAREER Award in 2008, an Alfred P. Sloan Research Fellowship in 2009, a Morningside Gold Medal in 2016, and a Damo Award and Egon Balas Prize in 2021. He invented fast algorithms for sparse optimization, image processing, and large-scale distributed optimization problems, and is among the top 1 percent of cited researchers as ranked by Clarivate Analytics. His research interests include computational optimization and its applications in signal processing, machine learning, and other data science problems.

Preface; 1. Introduction and preliminaries; Part I. Monotone Operator Methods: 2. Monotone operators and base splitting schemes; 3. Primal-dual splitting methods; 4. Parallel computing; 5. Randomized coordinate update methods; 6. Asynchronous coordinate update methods; Part II. Additional Topics: 7. Stochastic optimization; 8. ADMM-type methods; 9. Duality in splitting methods; 10. Maximality and monotone operator theory; 11. Distributed and decentralized optimization; 12. Acceleration; 13. Scaled relative graphs; Appendices; References; Index.

Publication date
Additional information: Worked examples or exercises
Place of publication: Cambridge
Language: English
Dimensions: 183 x 262 mm
Weight: 800 g
Subject area: Mathematics / Computer Science > Mathematics > Geometry / Topology
ISBN-10 1-009-16085-0 / 1009160850
ISBN-13 978-1-009-16085-8 / 9781009160858
Condition: New