Encyclopedia of Operations Research and Management Science

2001 Edition
| Editors: Saul I. Gass, Carl M. Harris

Markov chains

  • Carl M. Harris
Reference work entry
DOI: https://doi.org/10.1007/1-4020-0611-X_579

INTRODUCTION

A Markov chain is a Markov process {X(t), t ∈ T} whose state space S is discrete, while its time domain T may be either continuous or discrete. In the following discussion, we focus only on the countable state-space problem; continuous-time chains are described in Markov processes. There is a vast literature on the subject, including Breiman (1986), Çinlar (1975), Chung (1967), Feller (1968), Heyman and Sobel (1982), Isaacson and Madsen (1976), Iosifescu (1980), Karlin and Taylor (1975), Kemeny and Snell (1976), Kemeny, Snell and Knapp (1966), and Parzen (1962).

As stochastic processes of the Markov type, chains possess the Markov, or “lack-of-memory,” property: the probabilities of future events are completely determined by the present state of the process and the probabilities of its behavior from the present point on. That is, the past behavior of the process provides no additional information in determining the probabilities of...
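The lack-of-memory property can be illustrated with a small simulation sketch. The two-state transition matrix below is purely illustrative (it is not taken from the article): the next state is drawn using only the current state, and the rows of the n-step matrix P^n converge to the stationary distribution, here π = (5/6, 1/6), obtained by solving π = πP.

```python
import random

# Hypothetical two-state chain (states and probabilities are illustrative):
# P[i][j] = Pr{X(n+1) = j | X(n) = i}
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Advance the chain one step: the next state depends only on the
    current state -- the Markov 'lack-of-memory' property."""
    return 0 if rng.random() < P[state][0] else 1

def n_step_matrix(P, n):
    """Compute the n-step transition matrix P^n by repeated multiplication."""
    size = len(P)
    Q = [[1.0 if i == j else 0.0 for j in range(size)] for i in range(size)]
    for _ in range(n):
        Q = [[sum(Q[i][k] * P[k][j] for k in range(size))
              for j in range(size)] for i in range(size)]
    return Q

rng = random.Random(42)
path = [0]
for _ in range(10):
    path.append(step(path[-1], rng))
print("sample path:", path)

# Each row of P^n approaches the stationary distribution pi = (5/6, 1/6).
print("P^50 rows:", n_step_matrix(P, 50))
```

Note that the simulation never consults earlier states in `path`; conditioning on the full history would give the same transition probabilities, which is exactly the Markov property described above.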

References

  1. Breiman, L. (1986). Probability and Stochastic Processes, With a View Toward Applications, Second Edition, The Scientific Press, Palo Alto, California.
  2. Çinlar, E. (1975). Introduction to Stochastic Processes, Prentice-Hall, Englewood Cliffs, New Jersey.
  3. Chung, K.L. (1967). Markov Chains with Stationary Transition Probabilities, Springer-Verlag, New York.
  4. Feller, W. (1968). An Introduction to Probability Theory and Its Applications, Volume I, Third Edition, Wiley, New York.
  5. Heyman, D.P. and Sobel, M.J. (1982). Stochastic Models in Operations Research, Volume I: Stochastic Processes and Operating Characteristics, McGraw-Hill, New York.
  6. Iosifescu, M. (1980). Finite Markov Processes and their Application, Wiley, New York.
  7. Isaacson, D.L. and Madsen, R.W. (1976). Markov Chains: Theory and Applications, Wiley, New York.
  8. Karlin, S. and Taylor, H.M. (1975). A First Course in Stochastic Processes, Second Edition, Academic Press, New York.
  9. Kemeny, J.G. and Snell, J.L. (1976). Finite Markov Chains, Springer-Verlag, New York.
  10. Kemeny, J.G., Snell, J.L., and Knapp, A.W. (1966). Denumerable Markov Chains, Van Nostrand, Princeton.
  11. Parzen, E. (1962). Stochastic Processes, Holden-Day, San Francisco.

Copyright information

© Kluwer Academic Publishers 2001

Authors and Affiliations

  • Carl M. Harris
  1. George Mason University, Fairfax, USA