Designed both for those seeking an acquaintance with dynamic programming and for those wishing to become experts, this text is accessible to anyone who has taken a course in operations research. It begins with a basic introduction to sequential decision processes and proceeds to the use of dynamic programming in studying models of resource allocation. Subsequent topics include methods for approximating solutions of control problems in continuous time, production control, decision making in the face of an uncertain future, and inventory control models. The final chapter introduces sequential decision processes that lack fixed planning horizons, and the supplementary chapters treat data structures and the basic properties of convex functions. 1982 edition.
Table of Contents
Preface to the Dover Edition.
1. Introduction to Sequential Decision Processes.
2. The Prototype Sequential Decision Process.
3. Allocation, Marginal Analysis, and Lagrange Multipliers.
4. Stages, Grids, and Discretizing Control Problems.
5. Production Control and Network Flow.
6. A Markov Decision Model.
7. Inventory Control: (s, S)-Policies.
8. A Discounted Markov Decision Model.
Supplements: 1. Data Structures. 2. Convex Functions.
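As a small illustration of the kind of model treated in the allocation chapters, here is a minimal sketch of dynamic programming's backward recursion applied to a toy resource-allocation problem. The reward tables and function names are illustrative assumptions, not material from the book:

```python
# Toy resource-allocation problem solved by backward dynamic programming:
# split an integer budget among activities to maximize total reward.
# The reward data below is made up for illustration.

def solve_allocation(rewards, budget):
    """rewards[i][x] = reward from giving x units to activity i.
    Returns (best total reward, optimal allocation list)."""
    n = len(rewards)
    # f[i][b] = best reward using activities i..n-1 with b units remaining.
    f = [[0] * (budget + 1) for _ in range(n + 1)]
    choice = [[0] * (budget + 1) for _ in range(n)]
    for i in range(n - 1, -1, -1):          # backward recursion over stages
        for b in range(budget + 1):
            best, arg = float("-inf"), 0
            for x in range(min(b, len(rewards[i]) - 1) + 1):
                v = rewards[i][x] + f[i + 1][b - x]
                if v > best:
                    best, arg = v, x
            f[i][b] = best
            choice[i][b] = arg
    # Recover the optimal allocation by walking forward through the choices.
    alloc, b = [], budget
    for i in range(n):
        alloc.append(choice[i][b])
        b -= choice[i][b]
    return f[0][budget], alloc

# Concave reward tables (diminishing returns), 4 units to allocate:
rewards = [
    [0, 5, 8, 9, 9],   # activity 1
    [0, 4, 7, 9, 10],  # activity 2
    [0, 6, 8, 8, 8],   # activity 3
]
value, alloc = solve_allocation(rewards, 4)  # value == 18, alloc == [1, 2, 1]
```

The recursion f_i(b) = max over x of [r_i(x) + f_{i+1}(b - x)] is the prototype sequential decision process: each stage is an activity, the state is the remaining budget, and the decision is how much to allocate at that stage.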