Abstract
This paper proposes a new model, the Evidential Markov Decision Process (EMDP). It is a Markov Decision Process (MDP) for belief functions in which rewards are defined for each state transition, as in a classical MDP, whereas transitions are modeled as in an Evidential Markov Chain (EMC), i.e. they are transitions between sets of states rather than between individual states. The EMDP applies to a wider range of problems than an MDP with Set-valued Transitions (MDPST). Generalizing to belief functions makes it possible to handle applications with high uncertainty (imprecise or missing data) where probabilistic approaches fail. Implementation results are shown on a search-and-rescue unmanned rotorcraft benchmark.
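As a rough illustration of the idea in the abstract, the sketch below evaluates one step of such a model. All names and numbers (`STATES`, `reward`, `mass`, `lower_upper_expectation`) are invented for the example and do not come from the paper: transitions carry Dempster-Shafer mass over *sets* of successor states, as in an EMC, while rewards are attached to individual state transitions, as in a classical MDP, so the one-step value is an interval rather than a single expectation.

```python
# Hypothetical one-step EMDP evaluation; all identifiers and values are
# illustrative, not taken from the paper.

STATES = ["safe", "search", "found"]

# Reward for each (state, successor) transition; unlisted transitions pay 0.
reward = {("search", "found"): 10.0,
          ("search", "search"): -1.0,
          ("search", "safe"): 0.0}

# Mass function for one action taken in state "search": each focal element
# is a SET of possible successor states with its assigned belief mass.
mass = {frozenset({"found"}): 0.5,
        frozenset({"search", "safe"}): 0.3,
        frozenset(STATES): 0.2}  # mass on the full state set = ignorance

def lower_upper_expectation(state, mass, reward):
    """Bounds on the one-step expected reward: the pessimistic bound takes
    the worst successor inside each focal set, the optimistic bound the
    best, in the spirit of lower/upper expectations for belief functions."""
    lo = sum(m * min(reward.get((state, s), 0.0) for s in A)
             for A, m in mass.items())
    hi = sum(m * max(reward.get((state, s), 0.0) for s in A)
             for A, m in mass.items())
    return lo, hi

lo, hi = lower_upper_expectation("search", mass, reward)
# Here lo = 4.5 and hi = 7.0: the width of the interval reflects the
# ambiguity left by the set-valued transitions.
```

A probabilistic MDP would collapse this interval to a point by forcing a distribution inside each focal set; keeping the bounds is what lets the model express imprecise or missing data.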
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Soubaras, H., Labreuche, C., Savéant, P. (2011). Evidential Markov Decision Processes. In: Liu, W. (ed.) Symbolic and Quantitative Approaches to Reasoning with Uncertainty. ECSQARU 2011. Lecture Notes in Computer Science, vol. 6717. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-22152-1_29
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-22151-4
Online ISBN: 978-3-642-22152-1