Abstract
In Chapter 7 we considered Markov chains as a means to model stochastic DES for which explicit closed-form solutions can be obtained. Then, in Chapter 8, we saw how special classes of Markov chains (mostly, birth-death chains) can be used to model queueing systems. We pointed out, however, that queueing theory is largely “descriptive” in nature; that is, its main objective is to evaluate the behavior of queueing systems operating under a particular set of rules. On the other hand, we are often interested in “prescriptive” techniques, based on which we can make decisions regarding the “best” way to operate a system and ultimately control its performance. In this chapter, we describe some such techniques for Markov chains. Our main objective is to introduce the framework known as Markov Decision Theory, and to present some key results and techniques which can be used to control DES modeled as Markov chains. At the heart of these techniques is dynamic programming, which has played a critical role in both deterministic and stochastic control theory since the 1960s. The material in this chapter is more advanced than that of previous ones: it involves some results that were published in the research literature fairly recently, and it demands slightly higher mathematical sophistication. The results, however, should be quite gratifying for the reader, as they lead to the solution of some basic problems from everyday life experience, or related to the
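To give a concrete flavor of the dynamic programming machinery the chapter builds toward, the sketch below runs value iteration on a small controlled Markov chain. It is not taken from the chapter: the two-state "admission control" numbers, the cost arrays, and the discount factor are all hypothetical, chosen only to illustrate how the Bellman operator selects an action in each state.

```python
import numpy as np

def value_iteration(P, c, beta=0.9, tol=1e-8, max_iter=10_000):
    """Illustrative value iteration for a discounted-cost MDP.

    P : array (A, N, N), P[a, i, j] = Pr(next state j | state i, action a)
    c : array (A, N),    c[a, i]   = one-step cost of taking action a in state i
    Returns the optimal cost-to-go V and a greedy (optimal) policy.
    """
    A, N, _ = P.shape
    V = np.zeros(N)
    for _ in range(max_iter):
        # Bellman operator: for each action, one-step cost plus discounted
        # expected cost-to-go; then minimize over actions per state.
        Q = c + beta * (P @ V)          # shape (A, N)
        V_new = Q.min(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = Q.argmin(axis=0)           # action achieving the minimum in each state
    return V, policy

# Hypothetical two-state admission-control example:
# action 0 = admit, action 1 = reject (all numbers invented for illustration).
P = np.array([[[0.2, 0.8],
               [0.5, 0.5]],
              [[0.9, 0.1],
               [0.7, 0.3]]])
c = np.array([[1.0, 4.0],
              [2.0, 3.0]])
V, policy = value_iteration(P, c)
```

The fixed point satisfies the Bellman optimality equation V = min_a { c_a + beta * P_a V }, which is the defining relation of the dynamic programming approach developed in the chapter.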
© 1999 Springer Science+Business Media New York
Cassandras, C.G., Lafortune, S. (1999). Controlled Markov Chains. In: Introduction to Discrete Event Systems. The Kluwer International Series on Discrete Event Dynamic Systems, vol 11. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-4070-7_9
Print ISBN: 978-1-4757-4072-1
Online ISBN: 978-1-4757-4070-7