
Optimal Control of Jump-Markov Processes and Viscosity Solutions

  • Halil Mete Soner
Part of The IMA Volumes in Mathematics and its Applications book series (IMA, volume 10)

Abstract

We investigate the Bellman equation arising in the optimal control of jump-Markov processes, which is a fully nonlinear integro-differential equation. The notion of viscosity solutions for this equation is introduced, and existence and uniqueness results are obtained. The connection between the optimal control problem and the Bellman equation is also developed.
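For orientation, here is a minimal sketch of the kind of equation meant, under illustrative assumptions that are not taken from the paper: a controlled jump-Markov process on R^n with controlled drift b(x,a), jump kernel ν(x,a,·), running cost f(x,a), and discount rate λ > 0, for a maximization problem over controls a in a set A. A dynamic programming equation of this type reads

\[
\lambda\, u(x) \;=\; \sup_{a \in A}\Big\{\, b(x,a)\cdot Du(x) \;+\; \int_{\mathbb{R}^n}\big[u(x+z)-u(x)\big]\,\nu(x,a,dz) \;+\; f(x,a) \Big\},
\]

(with sup replaced by inf for a minimization problem). The nonlocal integral term is what makes the equation integro-differential, and the supremum over controls makes it fully nonlinear, so classical smooth solutions cannot in general be expected. Roughly, in the viscosity framework one tests the equation against smooth functions φ touching u from above or below at a point, replacing Du(x) by Dφ(x) there and requiring the corresponding inequality; this is the Crandall-Lions device adapted to the nonlocal setting.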

Keywords

Viscosity solution · Bellman equation · Martingale problem · Stochastic control problem · Stochastic differential game



Copyright information

© Springer-Verlag New York Inc. 1988

Authors and Affiliations

  • Halil Mete Soner
    1. Department of Mathematics, Carnegie-Mellon University, Pittsburgh, USA
