An Introduction to Optimization
John Wiley & Sons Inc (publisher)
978-1-118-27901-4 (ISBN)
- This title is to appear in a new edition
Praise for the Third Edition ". . . guides and leads the reader through the learning path . . . [e]xamples are stated very clearly and the results are presented with attention to detail." —MAA Reviews
Fully updated to reflect new developments in the field, the Fourth Edition of An Introduction to Optimization fills the need for an accessible treatment of optimization theory and methods with an emphasis on engineering design. Basic definitions and notation are provided, along with the related fundamental background in linear algebra, geometry, and calculus.
This new edition explores the essential topics of unconstrained optimization problems, linear programming problems, and nonlinear constrained optimization. The authors also present an optimization perspective on global search methods and include discussions on genetic algorithms, particle swarm optimization, and the simulated annealing algorithm. Featuring an elementary introduction to artificial neural networks, convex optimization, and multi-objective optimization, the Fourth Edition also offers:
A new chapter on integer programming
Expanded coverage of one-dimensional methods
Updated and expanded sections on linear matrix inequalities
Numerous new exercises at the end of each chapter
MATLAB exercises and drill problems to reinforce the discussed theory and algorithms
Numerous diagrams and figures that complement the written presentation of key concepts
MATLAB M-files for implementation of the discussed theory and algorithms (available via the book's website)
An Introduction to Optimization, Fourth Edition is an ideal textbook for courses on optimization theory and methods. In addition, the book is a useful reference for professionals in mathematics, operations research, electrical engineering, economics, statistics, and business.
Edwin K. P. Chong, PhD, is Professor of Electrical and Computer Engineering as well as Professor of Mathematics at Colorado State University. He is a Fellow of the IEEE and Senior Editor of the IEEE Transactions on Automatic Control. Stanislaw H. Zak, PhD, is Professor in the School of Electrical and Computer Engineering at Purdue University. He is a former associate editor of Dynamics and Control and the IEEE Transactions on Neural Networks.
Preface xiii
PART I MATHEMATICAL REVIEW
1 Methods of Proof and Some Notation 3
1.1 Methods of Proof 3
1.2 Notation 5
Exercises 6
2 Vector Spaces and Matrices 7
2.1 Vector and Matrix 7
2.2 Rank of a Matrix 13
2.3 Linear Equations 17
2.4 Inner Products and Norms 19
Exercises 22
3 Transformations 25
3.1 Linear Transformations 25
3.2 Eigenvalues and Eigenvectors 26
3.3 Orthogonal Projections 29
3.4 Quadratic Forms 31
3.5 Matrix Norms 35
Exercises 40
4 Concepts from Geometry 45
4.1 Line Segments 45
4.2 Hyperplanes and Linear Varieties 46
4.3 Convex Sets 48
4.4 Neighborhoods 50
4.5 Polytopes and Polyhedra 52
Exercises 53
5 Elements of Calculus 55
5.1 Sequences and Limits 55
5.2 Differentiability 62
5.3 The Derivative Matrix 63
5.4 Differentiation Rules 67
5.5 Level Sets and Gradients 68
5.6 Taylor Series 72
Exercises 77
PART II UNCONSTRAINED OPTIMIZATION
6 Basics of Set-Constrained and Unconstrained Optimization 81
6.1 Introduction 81
6.2 Conditions for Local Minimizers 83
Exercises 93
7 One-Dimensional Search Methods 103
7.1 Introduction 103
7.2 Golden Section Search 104
7.3 Fibonacci Method 108
7.4 Bisection Method 116
7.5 Newton’s Method 116
7.6 Secant Method 120
7.7 Bracketing 123
7.8 Line Search in Multidimensional Optimization 124
Exercises 126
8 Gradient Methods 131
8.1 Introduction 131
8.2 The Method of Steepest Descent 133
8.3 Analysis of Gradient Methods 141
Exercises 153
9 Newton’s Method 161
9.1 Introduction 161
9.2 Analysis of Newton’s Method 164
9.3 Levenberg-Marquardt Modification 168
9.4 Newton’s Method for Nonlinear Least Squares 168
Exercises 171
10 Conjugate Direction Methods 175
10.1 Introduction 175
10.2 The Conjugate Direction Algorithm 177
10.3 The Conjugate Gradient Algorithm 182
10.4 The Conjugate Gradient Algorithm for Nonquadratic Problems 186
Exercises 189
11 Quasi-Newton Methods 193
11.1 Introduction 193
11.2 Approximating the Inverse Hessian 194
11.3 The Rank One Correction Formula 197
11.4 The DFP Algorithm 202
11.5 The BFGS Algorithm 207
Exercises 211
12 Solving Linear Equations 217
12.1 Least-Squares Analysis 217
12.2 The Recursive Least-Squares Algorithm 227
12.3 Solution to a Linear Equation with Minimum Norm 231
12.4 Kaczmarz’s Algorithm 232
12.5 Solving Linear Equations in General 236
Exercises 244
13 Unconstrained Optimization and Neural Networks 253
13.1 Introduction 253
13.2 Single-Neuron Training 256
13.3 The Backpropagation Algorithm 258
Exercises 270
14 Global Search Algorithms 273
14.1 Introduction 273
14.2 The Nelder-Mead Simplex Algorithm 274
14.3 Simulated Annealing 278
14.4 Particle Swarm Optimization 282
14.5 Genetic Algorithms 285
Exercises 298
PART III LINEAR PROGRAMMING
15 Introduction to Linear Programming 305
15.1 Brief History of Linear Programming 305
15.2 Simple Examples of Linear Programs 307
15.3 Two-Dimensional Linear Programs 314
15.4 Convex Polyhedra and Linear Programming 316
15.5 Standard Form Linear Programs 318
15.6 Basic Solutions 324
15.7 Properties of Basic Solutions 327
15.8 Geometric View of Linear Programs 330
Exercises 335
16 Simplex Method 339
16.1 Solving Linear Equations Using Row Operations 339
16.2 The Canonical Augmented Matrix 346
16.3 Updating the Augmented Matrix 349
16.4 The Simplex Algorithm 350
16.5 Matrix Form of the Simplex Method 357
16.6 Two-Phase Simplex Method 361
16.7 Revised Simplex Method 364
Exercises 369
17 Duality 379
17.1 Dual Linear Programs 379
17.2 Properties of Dual Problems 387
Exercises 394
18 Nonsimplex Methods 403
18.1 Introduction 403
18.2 Khachiyan’s Method 405
18.3 Affine Scaling Method 408
18.4 Karmarkar’s Method 413
Exercises 426
19 Integer Linear Programming 429
19.1 Introduction 429
19.2 Unimodular Matrices 430
19.3 The Gomory Cutting-Plane Method 437
Exercises 447
PART IV NONLINEAR CONSTRAINED OPTIMIZATION
20 Problems with Equality Constraints 453
20.1 Introduction 453
20.2 Problem Formulation 455
20.3 Tangent and Normal Spaces 456
20.4 Lagrange Condition 463
20.5 Second-Order Conditions 472
20.6 Minimizing Quadratics Subject to Linear Constraints 476
Exercises 481
21 Problems with Inequality Constraints 487
21.1 Karush-Kuhn-Tucker Condition 487
21.2 Second-Order Conditions 496
Exercises 501
22 Convex Optimization Problems 509
22.1 Introduction 509
22.2 Convex Functions 512
22.3 Convex Optimization Problems 521
22.4 Semidefinite Programming 527
Exercises 540
23 Algorithms for Constrained Optimization 549
23.1 Introduction 549
23.2 Projections 549
23.3 Projected Gradient Methods with Linear Constraints 553
23.4 Lagrangian Algorithms 557
23.5 Penalty Methods 564
Exercises 571
24 Multiobjective Optimization 577
24.1 Introduction 577
24.2 Pareto Solutions 578
24.3 Computing the Pareto Front 581
24.4 From Multiobjective to Single-Objective Optimization 585
24.5 Uncertain Linear Programming Problems 588
Exercises 596
References 599
Index 609
Publication date (per publisher) | February 1, 2013
Series | Wiley Series in Discrete Mathematics and Optimization
Place of publication | New York
Language | English
Dimensions | 165 x 239 mm
Weight | 975 g
Subject areas | Mathematics / Computer Science ► Mathematics ► Applied Mathematics
Mathematics / Computer Science ► Mathematics ► Financial / Business Mathematics
Engineering ► Electrical Engineering / Energy Technology
Economics ► Business Administration / Management
ISBN-10 | 1-118-27901-8 / 1118279018
ISBN-13 | 978-1-118-27901-4 / 9781118279014
Condition | New