Nonlinear Parameter Optimization Using R Tools - John C. Nash

Nonlinear Parameter Optimization Using R Tools

John C. Nash (Author)

Book | Hardcover
304 pages
2014
John Wiley & Sons Inc (Publisher)
978-1-118-56928-3 (ISBN)
71.64 incl. VAT

Nonlinear Parameter Optimization Using R
John C. Nash, Telfer School of Management, University of Ottawa, Canada

A systematic and comprehensive treatment of optimization software using R

In recent decades, optimization techniques have been streamlined by computational and artificial intelligence methods, allowing more variables to be analyzed more quickly than ever before, especially under nonlinear, multivariable conditions.

Optimization is an important tool for decision science and for the analysis of physical systems used in engineering. Nonlinear Parameter Optimization Using R Tools explores the principal tools available in R for function minimization, optimization, and nonlinear parameter determination, and features numerous examples throughout.

Nonlinear Parameter Optimization Using R Tools:

Provides a comprehensive treatment of optimization techniques
Examines optimization problems that arise in statistics and how to solve them using R
Enables researchers and practitioners to solve parameter determination problems
Presents traditional methods as well as recent developments in R
Is supported by an accompanying website featuring R code, examples and datasets

Researchers and practitioners who need to solve parameter determination problems, and who use R but are new to optimization or function minimization, will benefit from this book. It will also be useful for scientists building and estimating nonlinear models in fields such as hydrology, sports forecasting, ecology, chemical engineering, pharmacokinetics, agriculture, economics, and statistics.
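For a flavor of the style of tool the book covers, here is a minimal R sketch (illustrative only, not taken from the book): base R's optim() minimizing the classic two-parameter Rosenbrock test function.

    # Rosenbrock banana function: minimum at (1, 1) with value 0
    rosenbrock <- function(x) {
      (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
    }
    # Minimize from the standard starting point (-1.2, 1) using BFGS
    res <- optim(par = c(-1.2, 1), fn = rosenbrock, method = "BFGS")
    res$par    # should be close to c(1, 1)
    res$value  # objective value at the reported solution

Chapter 8 examines optim() and the other base R minimizers, and Chapter 9 covers add-in packages such as optimx that extend this interface.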


Preface xv

1 Optimization problem tasks and how they arise 1

1.1 The general optimization problem 1

1.2 Why the general problem is generally uninteresting 2

1.3 (Non-)Linearity 4

1.4 Objective function properties 4

1.4.1 Sums of squares 4

1.4.2 Minimax approximation 5

1.4.3 Problems with multiple minima 5

1.4.4 Objectives that can only be imprecisely computed 5

1.5 Constraint types 5

1.6 Solving sets of equations 6

1.7 Conditions for optimality 7

1.8 Other classifications 7

References 8

2 Optimization algorithms – an overview 9

2.1 Methods that use the gradient 9

2.2 Newton-like methods 12

2.3 The promise of Newton’s method 13

2.4 Caution: convergence versus termination 14

2.5 Difficulties with Newton’s method 14

2.6 Least squares: Gauss–Newton methods 15

2.7 Quasi-Newton or variable metric method 17

2.8 Conjugate gradient and related methods 18

2.9 Other gradient methods 19

2.10 Derivative-free methods 19

2.10.1 Numerical approximation of gradients 19

2.10.2 Approximate and descend 19

2.10.3 Heuristic search 20

2.11 Stochastic methods 20

2.12 Constraint-based methods – mathematical programming 21

References 22

3 Software structure and interfaces 25

3.1 Perspective 25

3.2 Issues of choice 26

3.3 Software issues 27

3.4 Specifying the objective and constraints to the optimizer 28

3.5 Communicating exogenous data to problem definition functions 28

3.5.1 Use of “global” data and variables 31

3.6 Masked (temporarily fixed) optimization parameters 32

3.7 Dealing with inadmissible results 33

3.8 Providing derivatives for functions 34

3.9 Derivative approximations when there are constraints 36

3.10 Scaling of parameters and function 36

3.11 Normal ending of computations 36

3.12 Termination tests – abnormal ending 37

3.13 Output to monitor progress of calculations 37

3.14 Output of the optimization results 38

3.15 Controls for the optimizer 38

3.16 Default control settings 39

3.17 Measuring performance 39

3.18 The optimization interface 39

References 40

4 One-parameter root-finding problems 41

4.1 Roots 41

4.2 Equations in one variable 42

4.3 Some examples 42

4.3.1 Exponentially speaking 42

4.3.2 A normal concern 44

4.3.3 Little Polly Nomial 46

4.3.4 A hypothequial question 49

4.4 Approaches to solving 1D root-finding problems 51

4.5 What can go wrong? 52

4.6 Being a smart user of root-finding programs 54

4.7 Conclusions and extensions 54

References 55

5 One-parameter minimization problems 56

5.1 The optimize() function 56

5.2 Using a root-finder 57

5.3 But where is the minimum? 58

5.4 Ideas for 1D minimizers 59

5.5 The line-search subproblem 61

References 62

6 Nonlinear least squares 63

6.1 nls() from package stats 63

6.1.1 A simple example 63

6.1.2 Regression versus least squares 65

6.2 A more difficult case 65

6.3 The structure of the nls() solution 72

6.4 Concerns with nls() 73

6.4.1 Small residuals 74

6.4.2 Robustness – “singular gradient” woes 75

6.4.3 Bounds with nls() 77

6.5 Some ancillary tools for nonlinear least squares 79

6.5.1 Starting values and self-starting problems 79

6.5.2 Converting model expressions to sum-of-squares functions 80

6.5.3 Help for nonlinear regression 80

6.6 Minimizing R functions that compute sums of squares 81

6.7 Choosing an approach 82

6.8 Separable sums of squares problems 86

6.9 Strategies for nonlinear least squares 93

References 93

7 Nonlinear equations 95

7.1 Packages and methods for nonlinear equations 95

7.1.1 BB 96

7.1.2 nleqslv 96

7.1.3 Using nonlinear least squares 96

7.1.4 Using function minimization methods 96

7.2 A simple example to compare approaches 97

7.3 A statistical example 103

References 106

8 Function minimization tools in the base R system 108

8.1 optim() 108

8.2 nlm() 110

8.3 nlminb() 111

8.4 Using the base optimization tools 112

References 114

9 Add-in function minimization packages for R 115

9.1 Package optimx 115

9.1.1 Optimizers in optimx 116

9.1.2 Example use of optimx() 117

9.2 Some other function minimization packages 118

9.2.1 nloptr and nloptwrap 118

9.2.2 trust and trustOptim 119

9.3 Should we replace optim() routines? 121

References 122

10 Calculating and using derivatives 123

10.1 Why and how 123

10.2 Analytic derivatives – by hand 124

10.3 Analytic derivatives – tools 125

10.4 Examples of use of R tools for differentiation 125

10.5 Simple numerical derivatives 127

10.6 Improved numerical derivative approximations 128

10.6.1 The Richardson extrapolation 128

10.6.2 Complex-step derivative approximations 128

10.7 Strategy and tactics for derivatives 129

References 131

11 Bounds constraints 132

11.1 Single bound: use of a logarithmic transformation 132

11.2 Interval bounds: use of a hyperbolic transformation 133

11.2.1 Example of the tanh transformation 134

11.2.2 A fly in the ointment 134

11.3 Setting the objective large when bounds are violated 135

11.4 An active set approach 136

11.5 Checking bounds 138

11.6 The importance of using bounds intelligently 138

11.6.1 Difficulties in applying bounds constraints 139

11.7 Post-solution information for bounded problems 139

Appendix 11.A Function transfinite 141

References 142

12 Using masks 143

12.1 An example 143

12.2 Specifying the objective 143

12.3 Masks for nonlinear least squares 147

12.4 Other approaches to masks 148

References 148

13 Handling general constraints 149

13.1 Equality constraints 149

13.1.1 Parameter elimination 151

13.1.2 Which parameter to eliminate? 153

13.1.3 Scaling and centering? 154

13.1.4 Nonlinear programming packages 154

13.1.5 Sequential application of an increasing penalty 156

13.2 Sumscale problems 158

13.2.1 Using a projection 162

13.3 Inequality constraints 163

13.4 A perspective on penalty function ideas 167

13.5 Assessment 167

References 168

14 Applications of mathematical programming 169

14.1 Statistical applications of math programming 169

14.2 R packages for math programming 170

14.3 Example problem: L1 regression 171

14.4 Example problem: minimax regression 177

14.5 Nonlinear quantile regression 179

14.6 Polynomial approximation 180

References 183

15 Global optimization and stochastic methods 185

15.1 Panorama of methods 185

15.2 R packages for global and stochastic optimization 186

15.3 An example problem 187

15.3.1 Method SANN from optim() 187

15.3.2 Package GenSA 188

15.3.3 Packages DEoptim and RcppDE 189

15.3.4 Package smco 191

15.3.5 Package soma 192

15.3.6 Package Rmalschains 193

15.3.7 Package rgenoud 193

15.3.8 Package GA 194

15.3.9 Package gaoptim 195

15.4 Multiple starting values 196

References 202

16 Scaling and reparameterization 203

16.1 Why scale or reparameterize? 203

16.2 Formalities of scaling and reparameterization 204

16.3 Hobbs’ weed infestation example 205

16.4 The KKT conditions and scaling 210

16.5 Reparameterization of the weeds problem 214

16.6 Scale change across the parameter space 214

16.7 Robustness of methods to starting points 215

16.7.1 Robustness of optimization techniques 218

16.7.2 Robustness of nonlinear least squares methods 220

16.8 Strategies for scaling 222

References 223

17 Finding the right solution 224

17.1 Particular requirements 224

17.1.1 A few integer parameters 225

17.2 Starting values for iterative methods 225

17.3 KKT conditions 226

17.3.1 Unconstrained problems 226

17.3.2 Constrained problems 227

17.4 Search tests 228

References 229

18 Tuning and terminating methods 230

18.1 Timing and profiling 230

18.1.1 rbenchmark 231

18.1.2 microbenchmark 231

18.1.3 Calibrating our timings 232

18.2 Profiling 234

18.2.1 Trying possible improvements 235

18.3 More speedups of R computations 238

18.3.1 Byte-code compiled functions 238

18.3.2 Avoiding loops 238

18.3.3 Package upgrades – an example 239

18.3.4 Specializing codes 241

18.4 External language compiled functions 242

18.4.1 Building an R function using Fortran 244

18.4.2 Summary of Rayleigh quotient timings 246

18.5 Deciding when we are finished 247

18.5.1 Tests for things gone wrong 248

References 249

19 Linking R to external optimization tools 250

19.1 Mechanisms to link R to external software 251

19.1.1 R functions to call external (sub)programs 251

19.1.2 File and system call methods 251

19.1.3 Thin client methods 252

19.2 Prepackaged links to external optimization tools 252

19.2.1 NEOS 252

19.2.2 Automatic Differentiation Model Builder (ADMB) 252

19.2.3 NLopt 253

19.2.4 BUGS and related tools 253

19.3 Strategy for using external tools 253

References 254

20 Differential equation models 255

20.1 The model 255

20.2 Background 256

20.3 The likelihood function 258

20.4 A first try at minimization 258

20.5 Attempts with optimx 259

20.6 Using nonlinear least squares 260

20.7 Commentary 261

Reference 262

21 Miscellaneous nonlinear estimation tools for R 263

21.1 Maximum likelihood 263

21.2 Generalized nonlinear models 266

21.3 Systems of equations 268

21.4 Additional nonlinear least squares tools 268

21.5 Nonnegative least squares 270

21.6 Noisy objective functions 273

21.7 Moving forward 274

References 275

Appendix A R packages used in examples 276

Index 279

Place of publication New York
Language English
Dimensions 158 x 236 mm
Weight 517 g
ISBN-10 1-118-56928-8 / 1118569288
ISBN-13 978-1-118-56928-3 / 9781118569283
Condition New