Abstract
A well-known sufficient condition for optimality is expressed in terms of a continuously differentiable function which is a solution to the Hamilton-Jacobi equation of Dynamic Programming. (A function which serves this purpose is called a Caratheodory function.) However, a continuously differentiable solution may fail to exist, and this limits the usefulness of the condition as classically formulated. Here we ask how the condition might be modified to extend its applicability. Emphasis is given to problems involving terminal constraints on the trajectories. These pose a special challenge, since there is no obvious candidate for a Caratheodory function; we must surmise its existence from abstract arguments, or construct it as the value function of an auxiliary problem. Some interesting connections are made with the theory of necessary conditions.
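The classical verification argument referred to in the abstract can be sketched as follows. This is an illustrative outline only; the problem data $f$, $\ell$, the control set $U$, and the function $\phi$ are generic stand-ins, not the specific formulation of the paper.

```latex
% Sketch of the classical sufficiency (verification) condition.
% Problem: minimize \ell(x(T)) over trajectories of
%   \dot x(t) = f(t, x(t), u(t)), \quad u(t) \in U, \quad x(0) = x_0.
% Suppose \phi \in C^1 satisfies the Hamilton-Jacobi equation
\[
  \phi_t(t,x) + \min_{u \in U}\, \phi_x(t,x) \cdot f(t,x,u) = 0,
  \qquad \phi(T,x) = \ell(x).
\]
% Then along any admissible pair (x, u),
\[
  \frac{d}{dt}\,\phi(t, x(t))
    = \phi_t(t, x(t)) + \phi_x(t, x(t)) \cdot f(t, x(t), u(t))
    \;\ge\; 0,
\]
% so integrating from 0 to T gives \ell(x(T)) = \phi(T, x(T)) \ge \phi(0, x_0).
% A pair (\bar x, \bar u) attaining the pointwise minimum almost everywhere
% achieves \ell(\bar x(T)) = \phi(0, x_0) and is therefore optimal;
% such a \phi plays the role of a Caratheodory function.
```

The difficulty the paper addresses is precisely that no such $C^1$ function $\phi$ need exist, particularly in the presence of terminal constraints.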
© 1985 Springer-Verlag
Cite this paper
Vinter, R.B. (1985). Dynamic programming for optimal control problems with terminal constraints. In: Dolcetta, I.C., Fleming, W.H., Zolezzi, T. (eds) Recent Mathematical Methods in Dynamic Programming. Lecture Notes in Mathematics, vol 1119. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0074786
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-15217-0
Online ISBN: 978-3-540-39365-8