Dynamic programming for optimal control problems with terminal constraints

  • Conference paper

In: Recent Mathematical Methods in Dynamic Programming

Part of the book series: Lecture Notes in Mathematics (LNM, volume 1119)

Abstract

A well-known sufficient condition for optimality is expressed in terms of a continuously differentiable function which is a solution to the Hamilton-Jacobi equation of Dynamic Programming. (A function which serves this purpose is called a Caratheodory function.) However, a continuously differentiable solution may fail to exist, and this limits the usefulness of the condition as classically formulated. Here we ask how the condition might be modified to extend its applicability. Emphasis is given to problems involving terminal constraints on the trajectories. These pose a special challenge since there is no obvious candidate for a Caratheodory function; we must surmise its existence from abstract arguments, or construct it as the value function of an auxiliary problem. Some interesting connections are made with the theory of necessary conditions.
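
For orientation, the following is a minimal LaTeX sketch of the classical verification argument the abstract alludes to, stated for a generic Mayer problem with a terminal constraint; the data f, g, U, the target set C and the candidate optimal process are illustrative placeholders, not notation taken from the paper.

% Generic Mayer problem (illustrative placeholders): minimize g(x(T)) over trajectories of
%   \dot x(t) = f(t, x(t), u(t)),  u(t) \in U,  x(0) = x_0,  with terminal constraint x(T) \in C.
Suppose $\varphi \in C^{1}$ satisfies
\[
  \varphi_{t}(t,x) + \min_{u \in U} \nabla_{x}\varphi(t,x) \cdot f(t,x,u) = 0,
  \qquad \varphi(T,x) \le g(x) \quad \text{for all } x \in C .
\]
% Along any admissible trajectory ending in C, the map t \mapsto \varphi(t, x(t)) is nondecreasing, so
\[
  \varphi(0,x_{0}) \;\le\; \varphi(T, x(T)) \;\le\; g(x(T)),
\]
% i.e. \varphi(0, x_0) is a lower bound on the achievable cost. If some admissible pair
% (\bar{x}, \bar{u}) attains the minimum in the Hamiltonian along its trajectory and
% \varphi(T, \bar{x}(T)) = g(\bar{x}(T)), its cost equals \varphi(0, x_0), so it is optimal;
% such a \varphi plays the role of a Caratheodory function.

The difficulty the paper addresses is that, particularly in the presence of the terminal constraint, no continuously differentiable function of this kind need exist.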



Author information

R.B. Vinter

Editor information

Italo Capuzzo Dolcetta, Wendell H. Fleming, Tullio Zolezzi


Copyright information

© 1985 Springer-Verlag

About this paper

Cite this paper

Vinter, R.B. (1985). Dynamic programming for optimal control problems with terminal constraints. In: Dolcetta, I.C., Fleming, W.H., Zolezzi, T. (eds) Recent Mathematical Methods in Dynamic Programming. Lecture Notes in Mathematics, vol 1119. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0074786

Download citation

  • DOI: https://doi.org/10.1007/BFb0074786

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-15217-0

  • Online ISBN: 978-3-540-39365-8

  • eBook Packages: Springer Book Archive
