Abstract
Since our approach in this book is based on (dynamic) game theory, it will be useful to present at the outset some of the basic notions of zero-sum game theory and some general results on the existence and characterization of saddle points. We first discuss, in the next section, static zero-sum games, that is, games where the actions of the players are selected independently of each other; in this case we also say that the players’ strategies are constants. We then discuss in Sections 2.2 and 2.3 some general properties of dynamic games (with possibly nonlinear dynamics), first in discrete time and then in continuous time, with the latter class of games also known as differential games. In both cases we also introduce the important notions of representation of a strategy, strong time consistency, and noise insensitivity.
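As a minimal illustration of the saddle-point notion mentioned above (not taken from the chapter itself), consider a finite static zero-sum game given by a payoff matrix, under the assumed convention that the row player minimizes and the column player maximizes. A pure-strategy saddle point is then an entry that is minimal in its column and maximal in its row:

```python
def find_saddle_points(A):
    """Return pure-strategy saddle points (i, j, value) of a matrix game
    where the row player minimizes and the column player maximizes A[i][j].

    This is an illustrative sketch, not the chapter's formulation."""
    points = []
    for i, row in enumerate(A):
        for j, a in enumerate(row):
            col = [A[k][j] for k in range(len(A))]
            # Saddle point: no row deviation lowers the value
            # (minimal in its column) and no column deviation raises it
            # (maximal in its row).
            if a == min(col) and a == max(row):
                points.append((i, j, a))
    return points

A = [[4, 2],
     [3, 1]]
print(find_saddle_points(A))  # [(1, 0, 3)]
```

Here the entry 3 at position (1, 0) is the minimum of its column and the maximum of its row, so neither player can improve by deviating unilaterally; its value is the (pure) value of the game.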
© 2008 Springer Science+Business Media New York
Cite this chapter
Başar, T., Bernhard, P. (2008). Basic Elements of Static and Dynamic Games. In: H∞-Optimal Control and Related Minimax Design Problems. Modern Birkhäuser Classics. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-0-8176-4757-5_2
Print ISBN: 978-0-8176-4756-8
Online ISBN: 978-0-8176-4757-5
eBook Packages: Mathematics and Statistics (R0)