Abstract
This chapter summarizes the mathematical foundations of optimal control theory and explains its philosophy, main concepts, and techniques, emphasizing the continuity of this theory with the use of systems of equations. First in a static framework and then in a dynamic setting, the construction and use of Lagrangian and Hamiltonian functions are discussed, Pontryagin's maximum principle is demonstrated, and the solution procedures are illustrated with simple examples.
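As a sketch of what the abstract describes, the standard dynamic optimal control problem and the conditions of Pontryagin's maximum principle can be written in the usual textbook notation (the symbols below are generic, not necessarily those used in the chapter):

```latex
% Problem: choose the control u(t) to maximize an objective
% subject to the dynamics of the state x(t).
\max_{u(t)} \int_{t_0}^{t_1} f\bigl(x(t),u(t),t\bigr)\,dt
\quad \text{s.t.} \quad \dot{x}(t) = g\bigl(x(t),u(t),t\bigr),
\quad x(t_0) = x_0 .

% Hamiltonian: the dynamic analogue of the static Lagrangian,
% with costate (multiplier) \lambda(t) attached to the dynamics.
H\bigl(x,u,\lambda,t\bigr) = f(x,u,t) + \lambda(t)\,g(x,u,t).

% Necessary conditions (Pontryagin's maximum principle):
u^*(t) \in \arg\max_{u} H\bigl(x^*(t),u,\lambda(t),t\bigr),
\qquad
\dot{\lambda}(t) = -\frac{\partial H}{\partial x},
\qquad
\dot{x}(t) = \frac{\partial H}{\partial \lambda}.
```

Here the costate equation plays the role that the first-order conditions on the multipliers play in the static Lagrangian setting, which is the continuity between static and dynamic optimization that the chapter stresses.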
Copyright information
© 2012 Springer Science+Business Media, LLC
About this chapter
Cite this chapter
Gutiérrez Diez, P.J., Russo, I.H., Russo, J. (2012). Optimal Control Theory: From Knowledge to Control (I). Basic Concepts. In: The Evolution of the Use of Mathematics in Cancer Research. Springer, Boston, MA. https://doi.org/10.1007/978-1-4614-2397-3_8
DOI: https://doi.org/10.1007/978-1-4614-2397-3_8
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4614-2396-6
Online ISBN: 978-1-4614-2397-3
eBook Packages: Biomedical and Life Sciences (R0)