Optimal Control

Control Engineering and Finance

Part of the book series: Lecture Notes in Control and Information Sciences (LNCIS, volume 467)

Abstract

Optimal Control is a very important and broad branch of control engineering. Clearly, no single chapter can possibly claim to cover all of this field. Therefore, this chapter, which is intended solely as an introduction or a refresher, begins with the calculus of variations and goes through the fixed and variable endpoint problems as well as the variational problem with constraints. These results are then applied to dynamic systems, leading to the solution of the optimal control problem using the Hamilton-Jacobi and Pontryagin methods. The concept of dynamic programming is explained, and the chapter ends with a brief introduction to differential games.
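
For orientation, the problem these methods address can be sketched in generic notation (a hedged summary in standard symbols, not necessarily the chapter's own): find a control $u(t)$ minimizing

    $$ J = \phi\big(x(t_f)\big) + \int_{t_0}^{t_f} L\big(x(t),u(t),t\big)\,dt \quad \text{subject to} \quad \dot{x} = f(x,u,t), \quad x(t_0) = x_0. $$

With the Hamiltonian $H(x,u,\lambda,t) = L(x,u,t) + \lambda^{\top} f(x,u,t)$, Pontryagin's minimum principle states that along an optimal trajectory

    $$ \dot{\lambda} = -\frac{\partial H}{\partial x}, \qquad u^*(t) = \arg\min_{u} H\big(x^*(t),u,\lambda(t),t\big), \qquad \lambda(t_f) = \frac{\partial \phi}{\partial x}\bigg|_{t_f}. $$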

Engineers aren’t boring people, we just get excited over boring things.

— Seen on a t-shirt

The scientific imagination always restrains itself within the limits of probability.

— Thomas Huxley

Notes

  1.

    This chapter, which is an extended version of an appendix in [51], is intended as an introduction to Optimal Control. For rigorous definitions, derivations and proofs, the reader is referred to standard textbooks, e.g., [4, 45, 75, 113].

  2.

    A more interesting problem is to find the curve along which a ball rolling under gravity alone travels from a higher point A to a lower point B in the shortest time. The solution is known as the brachistochrone curve and its derivation is left to the interested reader (a sketch of the variational formulation appears after these notes).

  3.

    William Rowan Hamilton, Irish physicist, astronomer and mathematician (1805–1865).

  4.

    Lev Semyonovich Pontryagin, Russian mathematician (1908–1988).

  5.

    Carl Gustav Jacob Jacobi, Prussian mathematician (1804–1851).

  6.

    See Chapter 6 for a detailed explanation of this interpretation.

  7.

    This nifty trick is very useful when one has to deal with the absolute value of a variable, which is rather nasty to handle in analytical solutions.

  8.

    Some authors use the term recurrence relation for difference equations; a minimal scalar example of such a recursion appears after these notes.

  9.

    John Forbes Nash, American mathematician (1928–2015).

  10.

    This section follows the structure of the lecture notes [18], from which some parts are borrowed.

  11.

    All variables are assumed to be scalar for simplicity; the extension to multivariable systems is straightforward.
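
Regarding note 2 above: a hedged sketch of the brachistochrone formulation, with $y$ measured downward from A and the ball starting at rest, is the variational problem

    $$ T[y] = \int_{0}^{x_B} \sqrt{\frac{1 + y'(x)^2}{2\,g\,y(x)}}\; dx \;\to\; \min, $$

whose minimizer is a cycloid, $x(\theta) = a(\theta - \sin\theta)$, $y(\theta) = a(1 - \cos\theta)$, with the constant $a$ chosen so that the curve passes through B.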

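Regarding note 8: dynamic programming turns the optimal control problem into a backward recurrence relation. The following is a minimal sketch in Python for a hypothetical scalar finite-horizon LQR problem, in keeping with the scalar convention of note 11; the system and weight values are illustrative, not taken from the chapter.

    # Scalar discrete-time LQR by dynamic programming (backward recursion).
    # Minimize sum_{k<N} (q x_k^2 + r u_k^2) + qf x_N^2  s.t.  x_{k+1} = a x_k + b u_k.
    def lqr_scalar(a, b, q, r, qf, N):
        P = qf                     # value function V_k(x) = P x^2, initialized at k = N
        K = [0.0] * N
        for k in range(N - 1, -1, -1):
            K[k] = a * b * P / (r + b * b * P)                      # optimal gain: u_k = -K[k] x_k
            P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)  # Riccati recurrence for P_k
        return K

    # Usage: simulate the closed loop from x0 = 1 for an unstable plant.
    K = lqr_scalar(a=1.1, b=0.5, q=1.0, r=0.1, qf=1.0, N=20)
    x = 1.0
    for k in range(20):
        u = -K[k] * x
        x = 1.1 * x + 0.5 * u
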
Author information

Corresponding author

Correspondence to Selim S. Hacısalihzade.

Copyright information

© 2018 Springer International Publishing AG

About this chapter

Cite this chapter

Hacısalihzade, S.S. (2018). Optimal Control. In: Control Engineering and Finance. Lecture Notes in Control and Information Sciences, vol 467. Springer, Cham. https://doi.org/10.1007/978-3-319-64492-9_4

  • Print ISBN: 978-3-319-64491-2

  • Online ISBN: 978-3-319-64492-9
