The finite-state, finite-action Markov decision process is a particularly simple and relatively tractable model of sequential decision making under uncertainty. It has been applied in fields as diverse as health care, highway maintenance, inventory control, machine maintenance, cash-flow management, and regulation of water reservoir capacity (Derman, 1970; Hernández-Lerma, 1989; Ross, 1970; White, 1969). Here we present a definition of a Markov decision process and illustrate it with an example, followed by a discussion of solution procedures for several types of Markov decision processes, all of which are based on dynamic programming (Bertsekas, 1987; Howard, 1971; Puterman, 1994; Sennott, 1999).
PROBLEM FORMULATION
Let k ∈ {0, 1, ..., K − 1} represent the kth stage or decision epoch, that is, the time at which the kth decision must be selected; K < ∞ represents the planning horizon of the Markov decision process. Let s_k be the state of the system to be controlled at stage k. ...
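The finite-horizon setting just introduced is typically solved by backward induction on the dynamic-programming (Bellman) recursion: starting from the terminal stage, one computes the optimal value-to-go and an optimal action for every state at every stage k. The following sketch illustrates this on a hypothetical two-state, two-action problem; the transition probabilities and rewards are illustrative numbers, not from the text.

```python
# Finite-horizon MDP solved by backward induction (dynamic programming).
# Toy two-state, two-action instance; all numbers are illustrative.

K = 3                      # planning horizon: stages k = 0, 1, ..., K-1
states = [0, 1]
actions = [0, 1]

# P[a][s][s2] = probability of moving from state s to s2 under action a
P = {
    0: [[0.9, 0.1], [0.4, 0.6]],
    1: [[0.2, 0.8], [0.1, 0.9]],
}
# r[a][s] = one-stage reward for choosing action a in state s
r = {
    0: [1.0, 0.0],
    1: [0.0, 2.0],
}

V = [0.0 for _ in states]  # terminal value-to-go is zero
policy = []                # policy[k][s] = optimal action at stage k

for k in reversed(range(K)):
    V_new = []
    pi_k = []
    for s in states:
        # Q(s, a) = immediate reward + expected optimal value of next state
        q = {a: r[a][s] + sum(P[a][s][s2] * V[s2] for s2 in states)
             for a in actions}
        best = max(q, key=q.get)
        pi_k.append(best)
        V_new.append(q[best])
    V = V_new
    policy.insert(0, pi_k)

print("optimal stage-0 values:", V)
print("optimal policy by stage:", policy)
```

Note that the optimal policy is in general nonstationary: the best action at a given state may differ across stages, since fewer decision epochs remain as k approaches K − 1.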
References
Bertsekas, D.P. (1987). Dynamic Programming: Deterministic and Stochastic Models, Wiley-Interscience, New York.
Derman, C. (1970). Finite State Markovian Decision Processes, Academic Press, New York.
Hernández-Lerma, O. (1989). Adaptive Markov Control Processes, Springer-Verlag, New York.
Howard, R. (1971). Dynamic Programming and Markov Processes, MIT Press, Cambridge, Massachusetts.
Puterman, M.L. (1994). Markov Decision Processes: Discrete Stochastic Dynamic Programming, Wiley-Interscience, New York.
Ross, S.M. (1970). Applied Probability Models with Optimization Applications, Holden-Day, San Francisco.
Sennott, L.I. (1999). Stochastic Dynamic Programming and the Control of Queueing Systems, John Wiley, New York.
White, D.J. (1969). Markov Decision Processes, John Wiley, Chichester, UK.
© 2001 Kluwer Academic Publishers
White, C.C. (2001). Markov decision processes. In: Gass, S.I., Harris, C.M. (eds) Encyclopedia of Operations Research and Management Science. Springer, New York, NY. https://doi.org/10.1007/1-4020-0611-X_580