Optimal Control: Calculus of Variations, Optimal Control Theory and Numerical Methods

by Bulirsch, Miele, Stoer, Well

Paperback(Softcover reprint of the original 1st ed. 1993)


"Optimal Control" reports on new theoretical and practical advances essential for analyzing and synthesizing optimal controls of dynamical systems governed by ordinary and partial differential equations. New necessary and sufficient conditions for optimality are given. Recent advances in numerical methods are discussed; these have been achieved through new techniques for solving large-scale nonlinear programs with sparse Hessians, and through combinations of direct and indirect methods for solving multipoint boundary value problems. The book also focuses on the construction of feedback controls for nonlinear systems and highlights advances in the theory of problems with uncertainty. Decomposition methods for nonlinear systems and new techniques for constructing feedback controls for state- and control-constrained linear-quadratic systems are presented. The book offers solutions to many complex practical optimal control problems.
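To give a flavor of the direct methods the description mentions, here is a minimal sketch (not taken from the book) of direct transcription: an optimal control problem is discretized into a finite-dimensional nonlinear program and handed to an SQP-type solver. The toy problem, discretization (piecewise-constant control, forward-Euler dynamics), and solver choice (`scipy.optimize.minimize` with SLSQP) are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (illustrative only): minimize the control energy
#   J = integral of u(t)^2 dt  over [0, 1]
# subject to the dynamics x' = u with x(0) = 0 and x(1) = 1.
# The analytic optimum is u(t) = 1 everywhere, with cost J = 1.

N = 20          # number of control intervals (assumed discretization)
h = 1.0 / N     # time step

def cost(u):
    # Rectangle-rule approximation of the integral of u^2
    return h * np.sum(u**2)

def terminal_constraint(u):
    # Forward-Euler state propagation: x(1) = sum(u_i * h); require x(1) = 1
    return h * np.sum(u) - 1.0

# SLSQP is a sequential quadratic programming method, the same family
# of solvers the description refers to (here dense, not sparse).
res = minimize(cost, x0=np.zeros(N), method="SLSQP",
               constraints={"type": "eq", "fun": terminal_constraint})
```

Indirect methods, by contrast, would first derive the optimality conditions (a boundary value problem for state and costate) and solve that; the book's hang-glider chapter combines both approaches.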

Product Details

ISBN-13: 9783034875417
Publisher: Birkhäuser Basel
Publication date: 03/12/2013
Series: International Series of Numerical Mathematics, #111
Edition description: Softcover reprint of the original 1st ed. 1993
Pages: 350
Product dimensions: 6.10(w) x 9.25(h) x 0.03(d)

Table of Contents

Optimality Conditions and Algorithms
- Tent Method in Optimal Control Theory
- Pontryagin's Maximum Principle for Multidimensional Control Problems
- An Algorithm for Abstract Optimal Control Problems Using Maximum Principles and Applications to a Class of Distributed Parameter Systems
- Convexification of Control Problems in Evolution Equations
- Semidiscrete Ritz-Galerkin Approximation of Nonlinear Parabolic Boundary Control Problems
- Iterative Methods for Optimal Control Processes Governed by Integral Equations
- Solving Equations: a Problem of Optimal Control
- On the Minimax Optimal Control Problem and Its Variations

Numerical Methods
- Trajectory Optimization Using Sparse Sequential Quadratic Programming
- Numerical Solution of Optimal Control Problems by Direct Collocation
- Reduced SQP Methods for Nonlinear Heat Conduction Control Problems

Analysis and Synthesis of Nonlinear Systems
- Decomposition and Feedback Control of Nonlinear Dynamic Systems
- A Discrete Stabilizing Study Strategy for a Student-Related Problem under Uncertainty
- Stability Conditions in Terms of Eigenvalues of a Nonlinear Optimal Controlled System
- Program-Positional Optimization for Dynamic Systems
- Synthesis of Bilinear Controlled Systems with Delay
- Constructing Feedback Control in Differential Games by Use of "Central" Trajectories

Applications to Mechanical and Aerospace Systems
- Singular Perturbation Time-Optimal Controller on Disk Drives
- Optimal Design of Elastic Bars
- Combining Indirect and Direct Methods in Optimal Control: Range Maximization of a Hang Glider
- Periodic Optimal Trajectories with Singular Control for Aircraft with High Aerodynamic Efficiency
- Optimal Flight Paths with Constrained Dynamic Pressure
- Optimal Ascent of a Hypersonic Space Vehicle
- Controllability Investigations of a Two-Stage-to-Orbit Vehicle
- Optimal Design of a Mission to Neptune
