Abstract
The optimal control problem was defined in the last chapter as the problem of dynamically changing the system parameters in response to the system evolution so as to optimize its performance. In the context of control problems it is more common to say “choose an action” than “change the parameters.” The rule that specifies which action to choose as a function of the system evolution is called a policy.
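As a minimal illustration of the idea (an assumed example, not taken from the chapter): a stationary policy for a two-state machine-maintenance system can be written as a simple mapping from the observed state to an action. The state names and actions here are hypothetical.

```python
# A stationary policy: the chosen action depends only on the current state,
# not on the time or the earlier history of the system.
policy = {
    "working": "continue",  # if the machine is working, keep it running
    "failed": "repair",     # if the machine has failed, repair it
}

def choose_action(state):
    """Apply the policy: return the action prescribed for this state."""
    return policy[state]
```

With this encoding, controlling the system over time amounts to repeatedly observing the state and calling `choose_action`; the optimal control problem is then to pick the policy that optimizes the system's performance.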
Copyright information
© 1999 Springer Science+Business Media New York
Cite this chapter
Kulkarni, V. (1999). Optimal Control. In: Modeling, Analysis, Design, and Control of Stochastic Systems. Springer Texts in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4757-3098-2_10
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-3154-2
Online ISBN: 978-1-4757-3098-2