
Part of the book series: Modern Birkhäuser Classics ((MBC))


Abstract

Since our approach in this book is based on (dynamic) game theory, it will be useful to present at the outset some of the basic notions of zero-sum game theory and some general results on the existence and characterization of saddle points. We first discuss, in the next section, static zero-sum games, that is, games where the actions of the players are selected independently of each other; in this case we also say that the players' strategies are constants. We then discuss in Sections 2.2 and 2.3 some general properties of dynamic games (with possibly nonlinear dynamics), first in discrete time and then in continuous time, with the latter class of games also known as differential games. In both cases we also introduce the important notions of representation of a strategy, strong time consistency, and noise insensitivity.
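To make the notion of a saddle point concrete, here is a minimal illustrative sketch (not taken from the chapter): in a finite static zero-sum game described by a payoff matrix, a pure-strategy saddle point exists exactly when the maximizer's security level (maximin) equals the minimizer's security level (minimax). The matrix below is a hypothetical example chosen for illustration.

```python
def saddle_point(A):
    """Return (row, col, value) of a pure-strategy saddle point, or None.

    Player 1 (rows) maximizes the payoff; player 2 (columns) minimizes it.
    A pure saddle point exists iff max_i min_j A[i][j] == min_j max_i A[i][j].
    """
    row_mins = [min(row) for row in A]        # player 1's security level per row
    col_maxs = [max(col) for col in zip(*A)]  # player 2's security level per column
    lower = max(row_mins)  # maximin value
    upper = min(col_maxs)  # minimax value
    if lower != upper:
        return None        # no saddle point in pure strategies
    i = row_mins.index(lower)
    j = col_maxs.index(upper)
    return (i, j, lower)

# Hypothetical payoff matrix with a saddle point at row 1, column 1.
A = [[4, 1, 3],
     [5, 2, 6],
     [0, 1, 4]]
print(saddle_point(A))  # → (1, 1, 2)
```

When the two security levels differ (as in matching pennies, `[[1, -1], [-1, 1]]`), no pure saddle point exists and the function returns `None`; such games call for mixed strategies, a topic the chapter develops in its general treatment of saddle points.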



Copyright information

© 2008 Springer Science+Business Media New York


Cite this chapter

Başar, T., Bernhard, P. (2008). Basic Elements of Static and Dynamic Games. In: H∞-Optimal Control and Related Minimax Design Problems. Modern Birkhäuser Classics. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-0-8176-4757-5_2
