Numerical Optimization / Edition 1

by Jorge Nocedal, Stephen J. Wright

Hardcover

Original price: $84.95

Product Details

ISBN-10: 0387987932
ISBN-13: 9780387987934
Publisher: Springer-Verlag New York, LLC
Publication date: 04/28/2000
Series: Springer Series in Operations Research
Edition description: Older Edition
Pages: 636
Product dimensions: 7.31(w) x 9.48(h) x 1.46(d) inches

Table of Contents


Preface     xvii
Preface to the Second Edition     xxi
Introduction     1
Mathematical Formulation     2
Example: A Transportation Problem     4
Continuous versus Discrete Optimization     5
Constrained and Unconstrained Optimization     6
Global and Local Optimization     6
Stochastic and Deterministic Optimization     7
Convexity     7
Optimization Algorithms     8
Notes and References     9
Fundamentals of Unconstrained Optimization     10
What Is a Solution?     12
Recognizing a Local Minimum     14
Nonsmooth Problems     17
Overview of Algorithms     18
Two Strategies: Line Search and Trust Region     19
Search Directions for Line Search Methods     20
Models for Trust-Region Methods     25
Scaling     26
Exercises     27
Line Search Methods     30
Step Length     31
The Wolfe Conditions     33
The Goldstein Conditions     36
Sufficient Decrease and Backtracking     37
Convergence of Line Search Methods     37
Rate of Convergence     41
Convergence Rate of Steepest Descent     42
Newton's Method     44
Quasi-Newton Methods     46
Newton's Method with Hessian Modification     48
Eigenvalue Modification     49
Adding a Multiple of the Identity     51
Modified Cholesky Factorization     52
Modified Symmetric Indefinite Factorization     54
Step-Length Selection Algorithms     56
Interpolation     57
Initial Step Length     59
A Line Search Algorithm for the Wolfe Conditions     60
Notes and References     62
Exercises     63
Trust-Region Methods     66
Outline of the Trust-Region Approach     68
Algorithms Based on the Cauchy Point     71
The Cauchy Point     71
Improving on the Cauchy Point     73
The Dogleg Method     73
Two-Dimensional Subspace Minimization     76
Global Convergence     77
Reduction Obtained by the Cauchy Point     77
Convergence to Stationary Points     79
Iterative Solution of the Subproblem     83
The Hard Case     87
Proof of Theorem 4.1     89
Convergence of Algorithms Based on Nearly Exact Solutions     91
Local Convergence of Trust-Region Newton Methods     92
Other Enhancements     95
Scaling     95
Trust Regions in Other Norms     97
Notes and References     98
Exercises     98
Conjugate Gradient Methods     101
The Linear Conjugate Gradient Method     102
Conjugate Direction Methods     102
Basic Properties of the Conjugate Gradient Method     107
A Practical Form of the Conjugate Gradient Method     111
Rate of Convergence     112
Preconditioning     118
Practical Preconditioners     120
Nonlinear Conjugate Gradient Methods     121
The Fletcher-Reeves Method     121
The Polak-Ribière Method and Variants     122
Quadratic Termination and Restarts     124
Behavior of the Fletcher-Reeves Method     125
Global Convergence     127
Numerical Performance     131
Notes and References     132
Exercises     133
Quasi-Newton Methods     135
The BFGS Method     136
Properties of the BFGS Method     141
Implementation     142
The SR1 Method     144
Properties of SR1 Updating     147
The Broyden Class     149
Convergence Analysis     153
Global Convergence of the BFGS Method     153
Superlinear Convergence of the BFGS Method     156
Convergence Analysis of the SR1 Method     160
Notes and References     161
Exercises     162
Large-Scale Unconstrained Optimization     164
Inexact Newton Methods     165
Local Convergence of Inexact Newton Methods     166
Line Search Newton-CG Method     168
Trust-Region Newton-CG Method     170
Preconditioning the Trust-Region Newton-CG Method     174
Trust-Region Newton-Lanczos Method     175
Limited-Memory Quasi-Newton Methods     176
Limited-Memory BFGS     177
Relationship with Conjugate Gradient Methods     180
General Limited-Memory Updating     181
Compact Representation of BFGS Updating     181
Unrolling the Update     184
Sparse Quasi-Newton Updates     185
Algorithms for Partially Separable Functions     186
Perspectives and Software     189
Notes and References     190
Exercises     191
Calculating Derivatives     193
Finite-Difference Derivative Approximations     194
Approximating the Gradient     195
Approximating a Sparse Jacobian     197
Approximating the Hessian     201
Approximating a Sparse Hessian     202
Automatic Differentiation     204
An Example     205
The Forward Mode     206
The Reverse Mode     207
Vector Functions and Partial Separability     210
Calculating Jacobians of Vector Functions     212
Calculating Hessians: Forward Mode     213
Calculating Hessians: Reverse Mode     215
Current Limitations     216
Notes and References     217
Exercises     217
Derivative-Free Optimization     220
Finite Differences and Noise     221
Model-Based Methods     223
Interpolation and Polynomial Bases     226
Updating the Interpolation Set     227
A Method Based on Minimum-Change Updating     228
Coordinate and Pattern-Search Methods     229
Coordinate Search Method     230
Pattern-Search Methods     231
A Conjugate-Direction Method     234
Nelder-Mead Method     238
Implicit Filtering     240
Notes and References     242
Exercises     242
Least-Squares Problems     245
Background     247
Linear Least-Squares Problems     250
Algorithms for Nonlinear Least-Squares Problems     254
The Gauss-Newton Method     254
Convergence of the Gauss-Newton Method     255
The Levenberg-Marquardt Method     258
Implementation of the Levenberg-Marquardt Method     259
Convergence of the Levenberg-Marquardt Method     261
Methods for Large-Residual Problems     262
Orthogonal Distance Regression     265
Notes and References     267
Exercises     269
Nonlinear Equations     270
Local Algorithms     274
Newton's Method for Nonlinear Equations     274
Inexact Newton Methods     277
Broyden's Method     279
Tensor Methods     283
Practical Methods     285
Merit Functions     285
Line Search Methods     287
Trust-Region Methods     290
Continuation/Homotopy Methods     296
Motivation     296
Practical Continuation Methods     297
Notes and References     302
Exercises     302
Theory of Constrained Optimization     304
Local and Global Solutions     305
Smoothness     306
Examples     307
A Single Equality Constraint     308
A Single Inequality Constraint     310
Two Inequality Constraints     313
Tangent Cone and Constraint Qualifications     315
First-Order Optimality Conditions     320
First-Order Optimality Conditions: Proof     323
Relating the Tangent Cone and the First-Order Feasible Direction Set     323
A Fundamental Necessary Condition     325
Farkas' Lemma     326
Proof of Theorem 12.1     329
Second-Order Conditions     330
Second-Order Conditions and Projected Hessians     337
Other Constraint Qualifications     338
A Geometric Viewpoint     340
Lagrange Multipliers and Sensitivity     341
Duality     343
Notes and References     349
Exercises     351
Linear Programming: The Simplex Method     355
Linear Programming     356
Optimality and Duality     358
Optimality Conditions     358
The Dual Problem     359
Geometry of the Feasible Set     362
Bases and Basic Feasible Points     362
Vertices of the Feasible Polytope     365
The Simplex Method     366
Outline     366
A Single Step of the Method     370
Linear Algebra in the Simplex Method     372
Other Important Details     375
Pricing and Selection of the Entering Index     375
Starting the Simplex Method     378
Degenerate Steps and Cycling     381
The Dual Simplex Method     382
Presolving     385
Where Does the Simplex Method Fit?     388
Notes and References     389
Exercises     389
Linear Programming: Interior-Point Methods     392
Primal-Dual Methods     393
Outline     393
The Central Path     397
Central Path Neighborhoods and Path-Following Methods     399
Practical Primal-Dual Algorithms     407
Corrector and Centering Steps     407
Step Lengths     409
Starting Point     410
A Practical Algorithm     411
Solving the Linear Systems     411
Other Primal-Dual Algorithms and Extensions     413
Other Path-Following Methods     413
Potential-Reduction Methods     414
Extensions     415
Perspectives and Software     416
Notes and References     417
Exercises     418
Fundamentals of Algorithms for Nonlinear Constrained Optimization     421
Categorizing Optimization Algorithms     422
The Combinatorial Difficulty of Inequality-Constrained Problems     424
Elimination of Variables     426
Simple Elimination using Linear Constraints     428
General Reduction Strategies for Linear Constraints     431
Effect of Inequality Constraints     434
Merit Functions and Filters     435
Merit Functions     435
Filters     437
The Maratos Effect     440
Second-Order Correction and Nonmonotone Techniques     443
Nonmonotone (Watchdog) Strategy     444
Notes and References     446
Exercises     446
Quadratic Programming     448
Equality-Constrained Quadratic Programs     451
Properties of Equality-Constrained QPs     451
Direct Solution of the KKT System     454
Factoring the Full KKT System     454
Schur-Complement Method     455
Null-Space Method     457
Iterative Solution of the KKT System     459
CG Applied to the Reduced System     459
The Projected CG Method     461
Inequality-Constrained Problems     463
Optimality Conditions for Inequality-Constrained Problems     464
Degeneracy     465
Active-Set Methods for Convex QPs     467
Specification of the Active-Set Method for Convex QP     472
Further Remarks on the Active-Set Method     476
Finite Termination of Active-Set Algorithm on Strictly Convex QPs     477
Updating Factorizations     478
Interior-Point Methods     480
Solving the Primal-Dual System     482
Step Length Selection     483
A Practical Primal-Dual Method     484
The Gradient Projection Method     485
Cauchy Point Computation     486
Subspace Minimization     488
Perspectives and Software     490
Notes and References     492
Exercises     492
Penalty and Augmented Lagrangian Methods     497
The Quadratic Penalty Method     498
Motivation     498
Algorithmic Framework     501
Convergence of the Quadratic Penalty Method     502
Ill Conditioning and Reformulations     505
Nonsmooth Penalty Functions     507
A Practical ℓ₁ Penalty Method     511
A General Class of Nonsmooth Penalty Methods     513
Augmented Lagrangian Method: Equality Constraints     514
Motivation and Algorithmic Framework     514
Properties of the Augmented Lagrangian     517
Practical Augmented Lagrangian Methods     519
Bound-Constrained Formulation     519
Linearly Constrained Formulation     522
Unconstrained Formulation     523
Perspectives and Software     525
Notes and References     526
Exercises     527
Sequential Quadratic Programming     529
Local SQP Method     530
SQP Framework     531
Inequality Constraints     532
Preview of Practical SQP Methods     533
IQP and EQP     533
Enforcing Convergence     534
Algorithmic Development     535
Handling Inconsistent Linearizations     535
Full Quasi-Newton Approximations     536
Reduced-Hessian Quasi-Newton Approximations     538
Merit Functions     540
Second-Order Correction     543
A Practical Line Search SQP Method     545
Trust-Region SQP Methods     546
A Relaxation Method for Equality-Constrained Optimization     547
Sℓ₁QP (Sequential ℓ₁ Quadratic Programming)     549
Sequential Linear-Quadratic Programming (SLQP)     551
A Technique for Updating the Penalty Parameter     553
Nonlinear Gradient Projection     554
Convergence Analysis     556
Rate of Convergence     557
Perspectives and Software     560
Notes and References     561
Exercises     561
Interior-Point Methods for Nonlinear Programming     563
Two Interpretations     564
A Basic Interior-Point Algorithm     566
Algorithmic Development     569
Primal vs. Primal-Dual System     570
Solving the Primal-Dual System     570
Updating the Barrier Parameter     572
Handling Nonconvexity and Singularity     573
Step Acceptance: Merit Functions and Filters     575
Quasi-Newton Approximations     575
Feasible Interior-Point Methods     576
A Line Search Interior-Point Method     577
A Trust-Region Interior-Point Method     578
An Algorithm for Solving the Barrier Problem     578
Step Computation     580
Lagrange Multiplier Estimates and Step Acceptance     581
Description of a Trust-Region Interior-Point Method     582
The Primal Log-Barrier Method     583
Global Convergence Properties     587
Failure of the Line Search Approach     587
Modified Line Search Methods     589
Global Convergence of the Trust-Region Approach     589
Superlinear Convergence     591
Perspectives and Software     592
Notes and References     593
Exercises     594
Background Material     598
Elements of Linear Algebra     598
Vectors and Matrices     598
Norms     600
Subspaces     602
Eigenvalues, Eigenvectors, and the Singular-Value Decomposition     603
Determinant and Trace     605
Matrix Factorizations: Cholesky, LU, QR     606
Symmetric Indefinite Factorization     610
Sherman-Morrison-Woodbury Formula     612
Interlacing Eigenvalue Theorem     613
Error Analysis and Floating-Point Arithmetic     613
Conditioning and Stability     616
Elements of Analysis, Geometry, Topology     617
Sequences     617
Rates of Convergence     619
Topology of the Euclidean Space Rⁿ     620
Convex Sets in Rⁿ     621
Continuity and Limits     623
Derivatives     625
Directional Derivatives     628
Mean Value Theorem     629
Implicit Function Theorem     630
Order Notation     631
Root-Finding for Scalar Equations     633
A Regularization Procedure     635
References     637
Index     653
