Introduction to Optimization Methods
by P. Adby

Paperback

$54.99 

Overview

During the last decade the techniques of non-linear optimization have emerged as an important subject for study and research. The increasingly widespread application of optimization has been stimulated by the availability of digital computers, and the necessity of using them in the investigation of large systems.

This book is an introduction to non-linear methods of optimization and is suitable for undergraduate and postgraduate courses in mathematics, the physical and social sciences, and engineering. The first half of the book covers the basic optimization techniques, including linear search methods, steepest descent, least squares, and the Newton-Raphson method. These are described in detail, with worked numerical examples, since they form the basis from which advanced methods are derived.

Since 1965 advanced methods of unconstrained and constrained optimization have been developed to utilise the computational power of the digital computer. The second half of the book describes fully the important algorithms in current use, such as variable metric methods for unconstrained problems and penalty function methods for constrained problems. Recent work, much of which has not yet been widely applied, is reviewed and compared with currently popular techniques under a few generic main headings.

Chapter 1 describes the optimization problem in mathematical form and defines the terminology used in the remainder of the book. Chapter 2 is concerned with single variable optimization. The main algorithms of both search and approximation methods are developed in detail since they are an essential part of many multi-variable methods.
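
As a concrete illustration of the gradient methods the overview mentions, here is a minimal steepest-descent sketch in Python. The fixed step length, stopping tolerance, and quadratic test function are assumptions chosen for demonstration and are not taken from the book.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Iterate x <- x - step * grad(x) until the gradient norm falls below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Illustrative example: minimize f(x, y) = x**2 + 3*y**2, whose gradient is
# (2x, 6y); the iterates converge to the minimizer (0, 0).
print(steepest_descent(lambda x: np.array([2.0 * x[0], 6.0 * x[1]]), [1.0, 1.0]))
```

A fixed step works for this well-conditioned quadratic; the more refined step-length and direction choices are exactly what the advanced gradient and variable metric methods in the book address.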

Product Details

ISBN-13: 9780412110405
Publisher: Springer Netherlands
Publication date: 10/31/1974
Series: Chapman and Hall Mathematics Series
Pages: 204
Product dimensions: 5.51(w) x 8.50(h) x 0.02(d)

Table of Contents

1 The optimization problem
1.1 Introduction
1.2 Problem definition
1.3 Optimization in one dimension
1.4 Optimization in n dimensions
2 Single variable optimization
2.1 Review of methods
2.2 The Fibonacci search
2.3 The Golden Section search
2.4 The Algorithm of Davies, Swann, and Campey
3 Multi-variable optimization
3.1 Introduction
3.2 Search methods
3.3 Gradient methods
4 Advanced methods
4.1 Introduction
4.2 General considerations
4.3 Advanced search methods
4.4 Advanced gradient methods
4.5 Minimax methods
5 Constrained optimization
5.1 Introduction
5.2 The Kuhn-Tucker conditions
5.3 Constrained optimization techniques
5.4 Direct search methods with constraints
5.5 Small step gradient methods
5.6 Sequential unconstrained methods
5.7 Large step gradient methods
5.8 Lagrangian methods
5.9 General considerations
5.10 Conclusion
References
Further reading
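
As an illustration of the single-variable search methods listed under Chapter 2 (the Golden Section search of section 2.3), here is a minimal sketch in Python. The test function, interval, and tolerance are assumptions for demonstration and do not come from the book's own examples.

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Locate the minimum of a unimodal function f on [a, b] to within tol."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):
            b = d  # the minimum lies in [a, d]
        else:
            a = c  # the minimum lies in [c, b]
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
    return (a + b) / 2.0

# Illustrative example: minimize (x - 2)**2 on [0, 5]; the result is about 2.
print(golden_section_min(lambda x: (x - 2.0) ** 2, 0.0, 5.0))
```

This sketch re-evaluates both interior points at each step for simplicity; the economy of the Fibonacci and Golden Section searches comes from reusing one of the two interior function values at every iteration.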