Abstract
In Chap. II, optimality problems were studied through differential properties of mappings into the space of controls. The method of dynamic programming takes a different approach: it considers a family of control problems with fixed initial points and regards the minimum value of the performance criterion as a function of that initial point. This function is called the value function. Wherever the value function is differentiable, it satisfies a first-order partial differential equation, called the partial differential equation of dynamic programming.
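The backward-induction idea behind the value function can be seen most simply in discrete time. The following is a minimal sketch, not the book's continuous-time setting: it assumes scalar integer dynamics x' = x + u, a quadratic running cost x² + u², zero terminal cost, and a finite horizon N, then computes V[t][x] = min over u of [cost + V[t+1][x+u]].

```python
# Hedged illustration: discrete-time backward induction for a value function.
# Dynamics, cost, grid, and horizon below are all illustrative assumptions,
# not taken from the chapter.

def value_function(N, states, controls):
    # Terminal cost taken as zero for simplicity.
    V = {x: 0.0 for x in states}
    policy = []
    for t in range(N - 1, -1, -1):
        V_new, pi = {}, {}
        for x in states:
            best = None
            for u in controls:
                nxt = x + u
                if nxt not in V:        # keep trajectories on the grid
                    continue
                c = x * x + u * u + V[nxt]
                if best is None or c < best:
                    best, pi[x] = c, u
            V_new[x] = best
        V, policy = V_new, [pi] + policy
    return V, policy

states = list(range(-3, 4))   # integer state grid
controls = (-1, 0, 1)
V, policy = value_function(3, states, controls)
# V[x] is the minimum total cost from initial state x;
# policy[t][x] is a minimizing control at time t, state x.
```

In the continuous-time limit this recursion becomes the first-order partial differential equation of dynamic programming discussed in the chapter.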
Copyright information
© 1975 Springer-Verlag New York Inc.
Cite this chapter
Fleming, W., Rishel, R. (1975). Dynamic Programming. In: Deterministic and Stochastic Optimal Control. Applications of Mathematics, vol 1. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-6380-7_4
DOI: https://doi.org/10.1007/978-1-4612-6380-7_4
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4612-6382-1
Online ISBN: 978-1-4612-6380-7