Abstract
The chapter is devoted to two-player, zero-sum differential games, with special emphasis on the existence of a value and its characterization in terms of a partial differential equation, the Hamilton-Jacobi-Isaacs equation. We discuss different classes of games: finite-horizon games, infinite-horizon games, and pursuit-evasion games. We also analyze differential games in which the players do not have full information on the structure of the game or cannot completely observe the state. We conclude the chapter with a discussion of differential games depending on a singular parameter: for instance, we provide conditions under which the differential game has a long-time average.
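As a schematic illustration of the PDE characterization mentioned in the abstract (the notation below is generic and not taken verbatim from the chapter; sign conventions vary between references): for a finite-horizon game with dynamics controlled by the two players, the value function is expected to solve, in the viscosity sense,

```latex
% Hamilton-Jacobi-Isaacs equation for a finite-horizon, zero-sum game
% with dynamics \dot x = f(x,u,v), running cost \ell, terminal cost g.
% Generic notation; a sketch, not the chapter's exact statement.
\begin{align*}
  &-\partial_t V(t,x) + H\bigl(x, D_x V(t,x)\bigr) = 0,
    \qquad (t,x) \in (0,T) \times \mathbb{R}^n,\\
  &V(T,x) = g(x),\\
  &\text{where } H(x,p)
    = \min_{u \in U}\,\max_{v \in \mathcal{V}}
      \bigl\{ -p \cdot f(x,u,v) - \ell(x,u,v) \bigr\}.
\end{align*}
% Isaacs' condition: the min-max in H equals the max-min.
```

When Isaacs' condition holds, the upper and lower value functions coincide, and the common value is characterized as the unique viscosity solution of this equation.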
Notes
1. Unless one allows an information advantage to one player, amounting to letting him know his opponent's control at each time (Krasovskii and Subbotin 1988).
References
Alpern S, Gal S (2003) The theory of search games and rendezvous, vol 55. Springer Science and Business Media, New York
Alvarez O, Bardi M (2010) Ergodicity, stabilization, and singular perturbations for Bellman-Isaacs equations, vol 204, no. 960. American Mathematical Society, Providence
Başar T, Olsder GJ (1999) Dynamic noncooperative game theory. Reprint of the second (1995) edition. Classics in applied mathematics, vol 23. Society for Industrial and Applied Mathematics (SIAM), Philadelphia
Bardi M, Capuzzo Dolcetta I (1996) Optimal control and viscosity solutions of Hamilton-Jacobi-Bellman equations. Birkhäuser, Basel
Blaquière A, Gérard F, Leitman G (1969) Quantitative and qualitative games. Academic Press, New York
Buckdahn R, Cardaliaguet P, Quincampoix M (2011) Some recent aspects of differential game theory. Dyn Games Appl 1(1):74–114
Crandall MG, Ishii H, Lions P-L (1992) User's guide to viscosity solutions of second-order partial differential equations. Bull Am Math Soc 27:1–67
Evans LC, Souganidis PE (1984) Differential games and representation formulas for solutions of Hamilton-Jacobi-Isaacs equations. Indiana Univ Math J 33(5):773–797
Fleming WH (1961) The convergence problem for differential games. J Math Anal Appl 3:102–116
Fleming WH, Souganidis PE (1989) On the existence of value functions of two-player, zero-sum stochastic differential games. Indiana Univ Math J 38(2):293–314
Friedman A (1971) Differential games. Wiley, New York
Isaacs R (1965) Differential games. Wiley, New York
Lewin J (1994) Differential games. Theory and methods for solving game problems with singular surfaces. Springer, London
Krasovskii NN, Subbotin AI (1988) Game-theoretical control problems. Springer, New York
Melikyan AA (1998) Generalized characteristics of first order PDEs. Applications in optimal control and differential games. Birkhäuser, Boston
Petrosjan LA (1993) Differential games of pursuit. Translated from the Russian by J. M. Donetz and the author. Series on optimization, vol 2. World Scientific Publishing Co., Inc., River Edge
Pontryagin LS (1968) Linear differential games I and II. Soviet Math Dokl 8(3–4):769–771, 910–912
Subbotin AI (1995) Generalized solutions of first order PDEs: the dynamical optimization perspective. Birkhäuser, Boston
© 2018 Springer International Publishing AG, part of Springer Nature
Cardaliaguet, P., Rainer, C. (2018). Zero-Sum Differential Games. In: Başar, T., Zaccour, G. (eds) Handbook of Dynamic Game Theory. Springer, Cham. https://doi.org/10.1007/978-3-319-44374-4_4
Print ISBN: 978-3-319-44373-7
Online ISBN: 978-3-319-44374-4
eBook Packages: Mathematics and Statistics; Reference Module Computer Science and Engineering