Lectures on Convex Optimization
Springer International Publishing (publisher)
978-3-319-91577-7 (ISBN)
This book provides a comprehensive, modern introduction to convex optimization, a field that is becoming increasingly important in applied mathematics, economics and finance, engineering, and computer science, notably in data science and machine learning.
Written by a leading expert in the field, this book includes recent advances in the algorithmic theory of convex optimization, naturally complementing the existing literature. It contains a unified and rigorous presentation of the acceleration techniques for first- and second-order minimization schemes. It provides readers with a full treatment of the smoothing technique, which has greatly extended the capabilities of gradient-type methods. Several powerful approaches in structural optimization, including optimization in relative scale and polynomial-time interior-point methods, are also discussed in detail.
Researchers in theoretical optimization as well as professionals working on optimization problems will find this book very useful. It presents many successful examples of how to develop very fast specialized minimization algorithms. Based on the author's lectures, it can naturally serve as the basis for introductory and advanced courses in convex optimization for students in engineering, economics, computer science and mathematics.

Yurii Nesterov is a well-known specialist in optimization. He is the author of pioneering works on fast gradient methods, polynomial-time interior-point methods, the smoothing technique, regularized Newton methods, and other topics. He has won several prestigious international prizes, including the George B. Dantzig Prize (2000), the John von Neumann Theory Prize (2009), the SIAM Outstanding Paper Award (2014), and the EURO Gold Medal (2016).
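As a flavor of the fast gradient methods featured in the book, the following is a minimal sketch (not taken from the book) of a Nesterov-type accelerated gradient iteration for a smooth convex function with an L-Lipschitz gradient. The quadratic test problem, the step size 1/L, and the helper name `accelerated_gradient` are illustrative assumptions, not the author's code.

```python
import numpy as np

def accelerated_gradient(grad, L, x0, iters=200):
    """Nesterov-style accelerated gradient method (momentum form) for a
    smooth convex function whose gradient `grad` is L-Lipschitz."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                           # gradient step at the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # update of the extrapolation parameter
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Toy quadratic: f(x) = 0.5 x^T A x - b^T x, minimized at A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of the gradient
x_hat = accelerated_gradient(lambda x: A @ x - b, L, np.zeros(2))
print(x_hat, np.linalg.solve(A, b))      # the two vectors should agree closely
```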
Introduction.- Part I Black-Box Optimization.- 1 Nonlinear Optimization.- 2 Smooth Convex Optimization.- 3 Nonsmooth Convex Optimization.- 4 Second-Order Methods.- Part II Structural Optimization.- 5 Polynomial-time Interior-Point Methods.- 6 Primal-Dual Model of Objective Function.- 7 Optimization in Relative Scale.- Bibliographical Comments.- Appendix A. Solving some Auxiliary Optimization Problems.- References.- Index.
"It is a must-read for both students involved in the operations research programs, as well as the researchers in the area of nonlinear programming, in particular in convex optimization." (Marcin Anholcer, zbMATH 1427.90003, 2020)
Publication date | 23.09.2018 |
Series | Springer Optimization and Its Applications |
Additional information | XXIII, 589 p. 1 illus. |
Place of publication | Cham |
Language | English |
Dimensions | 155 x 235 mm |
Weight | 1081 g |
Subject areas | Computer Science ► Theory / Studies ► Algorithms |
 | Mathematics / Computer Science ► Mathematics ► Analysis |
 | Mathematics / Computer Science ► Mathematics ► Applied Mathematics |
Keywords | 90C51, 90C52, 90C60 • Algorithm analysis and problem complexity • Complexity • Complexity theory • Cubic Regularization of Newton Method • Fast Gradient Methods • Graphs • interior-point methods • Mathematical Programming • MSC 2010 49M15, 49M29, 49N15, 65K05, 65K10, 90C25, 90C30, 90C46 • Optimization • Optimization in Relative Scale • Self-Concordant Functions • Smoothing Technique |
ISBN-10 | 3-319-91577-0 / 3319915770 |
ISBN-13 | 978-3-319-91577-7 / 9783319915777 |
Condition | New |