A Solution for the Learning Problem in Evidential (Partially) Hidden Markov Models Based on Conditional Belief Functions and EM
Evidential Hidden Markov Models (EvHMM) are a particular class of Evidential Temporal Graphical Models that aim at statistically representing the kinetics of a system by means of an Evidential Markov Chain together with an observation model. Observation models are made of mixtures of densities to represent the inherent variability of sensor measurements, whereas uncertainty on the latent structure, which is generally only partially known due to lack of knowledge, is managed by Dempster-Shafer's theory of belief functions. This paper presents an Expectation-Maximization procedure for learning the parameters of an EvHMM. Results illustrate the high potential of this method on complex datasets originating from turbofan engines, where the aim is to provide early warnings of malfunction and failure.
Keywords: Evidential Temporal Graphical Model · Evidential latent variable · Markov chain · Belief functions · Parameter learning
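The EM procedure described above generalizes the classical Baum-Welch algorithm to evidential latent states. As background, the following is a minimal sketch of the precise-probability special case: Baum-Welch for an HMM with scalar Gaussian observations (one Gaussian per state rather than the paper's mixtures, and precise state posteriors rather than conditional belief functions, which this sketch does not implement).

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Univariate Gaussian density, vectorized over x."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def baum_welch(x, n_states=2, n_iter=20, seed=0):
    """EM (Baum-Welch) for a scalar Gaussian-observation HMM.

    Returns initial law pi, transition matrix A, state means/variances,
    and the per-iteration log-likelihoods (non-decreasing under EM).
    """
    rng = np.random.default_rng(seed)
    T = len(x)
    pi = np.full(n_states, 1.0 / n_states)
    A = np.full((n_states, n_states), 1.0 / n_states)
    mu = rng.choice(x, n_states, replace=False)        # crude init from data
    var = np.full(n_states, np.var(x) + 1e-6)
    log_liks = []
    for _ in range(n_iter):
        # Emission likelihoods b_k(x_t), shape (T, K)
        B = np.array([gaussian_pdf(x, mu[k], var[k]) for k in range(n_states)]).T
        # E-step: scaled forward pass
        alpha = np.zeros((T, n_states)); c = np.zeros(T)
        alpha[0] = pi * B[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[t]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        log_liks.append(np.log(c).sum())
        # E-step: scaled backward pass
        beta = np.zeros((T, n_states)); beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[t + 1] * beta[t + 1])) / c[t + 1]
        # State posteriors gamma and expected transition counts xi
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((n_states, n_states))
        for t in range(T - 1):
            xi += alpha[t][:, None] * A * (B[t + 1] * beta[t + 1])[None, :] / c[t + 1]
        # M-step: closed-form updates
        pi = gamma[0]
        A = xi / xi.sum(axis=1, keepdims=True)
        mu = (gamma * x[:, None]).sum(0) / gamma.sum(0)
        var = (gamma * (x[:, None] - mu) ** 2).sum(0) / gamma.sum(0) + 1e-6
    return pi, A, mu, var, log_liks
```

In the evidential setting, the precise posteriors `gamma` would be replaced by conditional belief functions over subsets of states, which is the key difference the paper's learning procedure addresses.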
The author would like to express his gratitude to Michèle Rombaut, Denis Pellerin and Thierry Denoeux for discussions around inference in EvHMM and EM-based learning in HMM. This work was carried out within the following projects: the CNRS-PEPS project "EVIPRO" and the "SMART COMPOSITES" project (FRI2). It also received support from the Laboratory of Excellence "ACTION" (reference ANR-11-LABX-01-01).