Optimal Control and the Calculus of Variations / Edition 1

by Enid R. Pinch
Oxford University Press, USA


Optimal control is a modern development of the calculus of variations and classical optimization theory. For that reason, this introduction to the theory of optimal control starts by considering the problem of minimizing a function of many variables. It moves through an exposition of the calculus of variations to the optimal control of systems governed by ordinary differential equations. This approach should enable students to see the essential unity of important areas of mathematics, and also allow optimal control and the Pontryagin maximum principle to be placed in a proper context. A good knowledge of analysis, algebra, and methods is assumed. All the theorems are carefully proved, and there are many worked examples and exercises. Although this book is written for the advanced undergraduate mathematician, engineers and scientists who regularly rely on mathematics will also find it a useful text.
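The progression the blurb describes can be illustrated numerically: after discretization, a calculus-of-variations problem becomes exactly the kind of many-variable minimization the book opens with. The sketch below (not from the book; the discretization and coordinate-descent scheme are illustrative choices) minimizes the arc-length functional J[y] = ∫₀¹ √(1 + y′²) dx with fixed endpoints y(0) = 0, y(1) = 1, whose minimizer is the straight line y = x with length √2.

```python
import math

# Discretize [0, 1] into N subintervals; the interior node values y[1..N-1]
# become the "function of many variables" to minimize.
N = 20
h = 1.0 / N
y = [0.0] + [0.5] * (N - 1) + [1.0]  # flat interior guess, fixed endpoints

def arc_length(y):
    """Arc length of the piecewise-linear curve through the nodes (i*h, y[i])."""
    return sum(math.sqrt(h * h + (y[i + 1] - y[i]) ** 2) for i in range(N))

# Coordinate descent: for this functional, the optimal value of each interior
# node given its neighbours is their midpoint (the local problem is convex).
for _ in range(2000):
    for i in range(1, N):
        y[i] = 0.5 * (y[i - 1] + y[i + 1])

print(arc_length(y))  # converges towards sqrt(2) ≈ 1.41421
```

The same reduction, carried out analytically instead of numerically, leads to the Euler–Lagrange equation of Part III and, ultimately, to the maximum principle of Parts IV–VI.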

Product Details

ISBN-13: 9780198514893
Publisher: Oxford University Press, USA
Publication date: 09/28/1995
Edition description: Reprint
Pages: 248
Product dimensions: 6.13(w) x 9.19(h) x 0.61(d)

Table of Contents

PART I: Introduction
1.1. The maxima and minima of functions
1.2. The calculus of variations
1.3. Optimal control
PART II: Optimization in ℝⁿ
2.1. Functions of one variable
2.2. Critical points, end-points, and points of discontinuity
2.3. Functions of several variables
2.4. Minimization with constraints
2.5. A geometrical interpretation
2.6. Distinguishing maxima from minima
PART III: The calculus of variations
3.1. Problems in which the end-points are not fixed
3.2. Finding minimizing curves
3.3. Isoperimetric problems
3.4. Sufficiency conditions
3.5. Fields of extremals
3.6. Hilbert's invariant integral
3.7. Semi-fields and the Jacobi condition
PART IV: Optimal Control I: Theory
4.1. Introduction
4.2. Control of a simple first-order system
4.3. Systems governed by ordinary differential equations
4.4. The optimal control problem
4.5. The Pontryagin maximum principle
4.6. Optimal control to target curves
PART V: Optimal Control II: Applications
5.1. Time-optimal control of linear systems
5.2. Optimal control to target curves
5.3. Singular controls
5.4. Fuel-optimal controls
5.5. Problems where the cost depends on x(t₁)
5.6. Linear systems with quadratic cost
5.7. The steady-state Riccati equation
5.8. The calculus of variations revisited
PART VI: Proof of the Maximum Principle of Pontryagin
6.1. Convex sets in ℝⁿ
6.2. The linearized state equations
6.3. Behaviour of H on an optimal path
6.4. Sufficiency conditions for optimal control
