Abstract
Ever since Bellman formulated his Principle of Optimality in the early 1950s, the Principle has been the subject of considerable criticism. Indeed, a number of dynamic programming (DP) scholars have identified specific difficulties with the common interpretation of Bellman’s Principle and proposed constructive remedies. In the case of stochastic processes with a non-denumerable state space, the remedy requires the incorporation of the faithful “with probability one” clause. In this short article we are reminded that if one sticks to Bellman’s original version of the Principle, then no such fix is necessary. We also reiterate the central role that Bellman’s favourite “final state condition” plays in the theory of DP in general and in the validity of the Principle of Optimality in particular.
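For readers less familiar with the Principle, the following is a minimal textbook-style sketch (not taken from the chapter; the graph, costs, and function names are illustrative assumptions) of Bellman’s functional equation on a small deterministic shortest-path problem, with the “final state condition” anchoring the backward recursion:

```python
# Backward dynamic programming on a tiny layered DAG.
# States are nodes; arcs[u][v] is the cost of moving from u to v; "T" is terminal.
# Bellman's recursion: f(u) = min over successors v of  arcs[u][v] + f(v),
# anchored by the final state condition f(T) = 0.

arcs = {
    "A": {"B": 1, "C": 4},
    "B": {"D": 2, "T": 7},
    "C": {"D": 1, "T": 3},
    "D": {"T": 2},
    "T": {},
}

def solve(arcs, terminal="T"):
    f = {terminal: 0.0}   # final state condition: zero cost at the terminal state
    policy = {}
    # Visit the non-terminal nodes in reverse topological order of this DAG.
    for u in ["D", "C", "B", "A"]:
        v_best = min(arcs[u], key=lambda v: arcs[u][v] + f[v])
        f[u] = arcs[u][v_best] + f[v_best]
        policy[u] = v_best
    return f, policy

f, policy = solve(arcs)
# f["A"] is the optimal cost-to-go from A; policy maps each state
# to its optimal successor, so the tail of an optimal path is itself
# optimal -- the content of the Principle of Optimality.
```

Note that an optimal policy computed this way has the property the Principle asserts: starting from any state reached by an optimal path (say `policy["A"]`), the remaining decisions still solve the subproblem from that state.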
© 2002 Springer Science + Business Media, Inc.
Cite this chapter
Sniedovich, M. (2002). Eureka! Bellman’s Principle of Optimality is Valid! In: Dror, M., L’Ecuyer, P., Szidarovszky, F. (eds) Modeling Uncertainty. International Series in Operations Research & Management Science, vol 46. Springer, New York, NY. https://doi.org/10.1007/0-306-48102-2_29
DOI: https://doi.org/10.1007/0-306-48102-2_29
Publisher Name: Springer, New York, NY
Print ISBN: 978-0-7923-7463-3
Online ISBN: 978-0-306-48102-4
eBook Packages: Mathematics and Statistics (R0)