Dynamic decision making in stochastic partially observable medical domains: Ischemic heart disease example

  • Milos Hauskrecht
Probabilistic Models and Fuzzy Logic
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1211)


The focus of this paper is the framework of partially observable Markov decision processes (POMDPs) and its role in modeling and solving complex dynamic decision problems in stochastic and partially observable medical domains. The paper summarizes some of the basic features of the POMDP framework and explores its potential for solving the problem of managing a patient with chronic ischemic heart disease.
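At the core of the POMDP framework the abstract refers to is the belief-state update: after taking an action and receiving an observation, the decision maker revises a probability distribution over the hidden states. The sketch below illustrates this standard update; the two-state "healthy / ischemic" model, the action name, and all probabilities are hypothetical illustrations, not taken from the paper.

```python
# Standard POMDP belief update: b'(s') ∝ O[a][s'][o] * sum_s T[a][s][s'] * b(s).
# The model below is a made-up two-state example, not the paper's model.

def belief_update(belief, T, O, action, obs):
    """Return the posterior belief after doing `action` and seeing `obs`."""
    n = len(belief)
    unnormalized = [
        O[action][s2][obs] * sum(T[action][s][s2] * belief[s] for s in range(n))
        for s2 in range(n)
    ]
    total = sum(unnormalized)
    return [p / total for p in unnormalized]

# Hypothetical model: state 0 = healthy, state 1 = ischemic.
T = {"treat": [[0.9, 0.1],      # T[a][s][s']: transition probabilities
               [0.6, 0.4]]}
O = {"treat": [[0.8, 0.2],      # O[a][s'][o]: observation probabilities,
               [0.3, 0.7]]}     # o = 1 is an abnormal test result

# Start with an uninformative belief; observe an abnormal test after treating.
b = belief_update([0.5, 0.5], T, O, "treat", 1)
```

The abnormal observation shifts probability mass toward the ischemic state, which is exactly the mechanism a POMDP controller uses to choose its next action from imperfect information.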


Keywords: Ischemic Heart Disease · Markov Decision Process · Finite Horizon · Partially Observable Markov Decision Process · Dynamic Decision





Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Milos Hauskrecht
  1. MIT Lab for Computer Science, Cambridge
