Optimal Control: An Introduction to the Theory with Applications

by Leslie M. Hocking

Paperback (New Edition)

$85.00 

Overview

Systems that evolve with time occur frequently in nature, and modelling the behavior of such systems is an important application of mathematics. Such systems can be completely deterministic, but it may also be possible to control their behavior by intervention through "controls". The theory of optimal control is concerned with determining controls which, at minimum cost, either direct the system along a given trajectory or enable it to reach a given point in its state space. This textbook is a straightforward introduction to the theory of optimal control, with an emphasis on presenting many different applications. Professor Hocking has taken pains to ensure that the theory is developed to display the main themes of the arguments without using sophisticated mathematical tools. Problems of this kind arise across a wide range of subjects, and illustrative examples of systems are drawn from fields as diverse as dynamics, economics, population control, and medicine. Throughout there are many worked examples, and numerous exercises (with solutions) are provided.
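
For orientation, the generic problem the book treats can be sketched as follows (the notation here is illustrative and not necessarily the author's): choose a control $u(t)$ with values in an admissible set $U$ so as to

$$
\text{minimize}\quad J[u] = \int_{t_0}^{t_1} f_0\bigl(x(t), u(t)\bigr)\,dt
\qquad\text{subject to}\qquad
\dot{x}(t) = f\bigl(x(t), u(t)\bigr),\quad x(t_0) = x_0,
$$

with the terminal state $x(t_1)$ required to lie on a given trajectory or at a given point of the state space. Time-optimal problems of the kind covered in Part A correspond to $f_0 \equiv 1$, so the cost is simply the elapsed time $t_1 - t_0$.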

Product Details

ISBN-13: 9780198596820
Publisher: Oxford University Press
Publication date: 03/28/1991
Series: Oxford Applied Mathematics and Computing Science Series
Edition description: New Edition
Pages: 272
Product dimensions: 8.50 (w) x 5.44 (h) x 0.63 (d) inches

About the Author

University College, London

Table of Contents

1. Optimal Control Problems
2. Systems of Differential Equations, Matrices, and Sets

PART A: Time-Optimal Control of Linear Systems
3. Controllability
4. Time-Optimal Control
5. Further Examples

PART B: The Pontryagin Maximum Principle
6. The Basic Pontryagin Maximum Principle (PMP)
7. Extensions to the PMP
8. Linear State Equations with Quadratic Costs
9. Proof of the Pontryagin Maximum Principle
10. Further Applications and Extensions

PART C: Applications of Optimal Control Theory
11. Some Applied Optimal Control Problems
12. Numerical Methods for Optimal Control Problems