Encyclopedia of Operations Research and Management Science

2013 Edition
Editors: Saul I. Gass, Michael C. Fu

Optimal Control

Reference work entry
DOI: https://doi.org/10.1007/978-1-4419-1153-7_200547

Branch of engineering and applied mathematics dealing with the optimization of a dynamical system, typically evolving in continuous time, by choice of its control inputs. As in dynamic programming, the optimal value function satisfies an optimality condition, the Hamilton-Jacobi-Bellman (HJB) equation, which is the continuous-time counterpart of Bellman's equation. For the special case of a linear time-invariant dynamical system with quadratic cost (the linear-quadratic regulator), the optimal feedback control policy is linear in the state and can be found explicitly by solving the Riccati equation.
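To make the two statements above concrete, the following display writes out the HJB equation for a generic finite-horizon problem and the Riccati equation for the linear-quadratic special case. The symbols (f, L, \phi, V, A, B, Q, R) are standard textbook notation, as in Bryson and Ho (1975), not definitions made by this entry. For dynamics \dot{x}(t) = f(x(t), u(t)) and cost J = \phi(x(T)) + \int_0^T L(x(t), u(t)) \, dt, the value function V(x, t) satisfies

  -\frac{\partial V}{\partial t}(x, t) = \min_{u} \Big[ L(x, u) + \frac{\partial V}{\partial x}(x, t) \, f(x, u) \Big], \qquad V(x, T) = \phi(x).

In the infinite-horizon linear-quadratic case \dot{x} = A x + B u with cost \int_0^\infty \big( x^\top Q x + u^\top R u \big) \, dt, the optimal policy is the linear state feedback u = -K x with K = R^{-1} B^\top P, where P solves the continuous-time algebraic Riccati equation

  A^\top P + P A - P B R^{-1} B^\top P + Q = 0.

A minimal numerical sketch of this linear-quadratic case follows, assuming NumPy and SciPy are available (scipy.linalg.solve_continuous_are solves the Riccati equation above; the double-integrator matrices A, B, Q, R are illustrative values, not taken from this entry):

import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative double integrator: state x = (position, velocity), input u = force.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)           # state cost weight
R = np.array([[1.0]])   # control cost weight

# Solve A'P + PA - PBR^{-1}B'P + Q = 0 for the stabilizing solution P.
P = solve_continuous_are(A, B, Q, R)

# Optimal linear state feedback u = -Kx, with gain K = R^{-1} B' P.
K = np.linalg.solve(R, B.T @ P)

print("K =", K)
# Closed-loop eigenvalues of A - BK should lie in the open left half-plane.
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))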

See

References

  1. Bryson, A. E., & Ho, Y. C. (1975). Applied optimal control. Washington, DC: Hemisphere.
  2. Sethi, S. P., & Thompson, G. L. (2000). Optimal control theory: Applications to management science and economics (2nd ed.). New York: Springer.

Copyright information

© Springer Science+Business Media New York 2013