Abstract
Markov processes are a special class of stochastic processes. To fully understand Markov processes, we first need to introduce stochastic processes; to understand stochastic processes, in turn, we need the basic probability theory associated with them, which was briefly reviewed in the last chapter.
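To make the idea concrete, here is a minimal sketch (not from the chapter itself) of a two-state discrete-time Markov chain in Python. The transition matrix and states are hypothetical; the example illustrates the defining property that the distribution of the next state depends only on the current state, and shows how repeated transitions approach a stationary distribution.

```python
# Hypothetical two-state transition matrix: row i gives the
# distribution of the next state given the current state is i.
P = [
    [0.9, 0.1],  # from state 0
    [0.5, 0.5],  # from state 1
]

def step(dist, P):
    """One transition: new_j = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def stationary(P, iters=1000):
    """Approximate the stationary distribution by repeated transitions,
    starting from a uniform initial distribution."""
    dist = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        dist = step(dist, P)
    return dist

pi = stationary(P)
print(pi)  # approaches [5/6, 1/6] for this particular matrix
```

For this matrix the stationary distribution can be checked by hand: solving pi = pi P with pi summing to 1 gives pi = (5/6, 1/6), which the iteration converges to.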
Copyright information
© 2016 Springer Science+Business Media New York
About this chapter
Cite this chapter
Alfa, A.S. (2016). Discrete-Time Markov Chains. In: Applied Discrete-Time Queues. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-3420-1_3
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4939-3418-8
Online ISBN: 978-1-4939-3420-1
eBook Packages: Engineering (R0)