Learning and Designing Stochastic Processes from Logical Constraints
Continuous time Markov Chains (CTMCs) are a convenient mathematical model for a broad range of natural and computer systems. As a result, they have received considerable attention in the theoretical computer science community, with many important techniques, such as model checking, now mainstream. However, most methodologies start with the assumption of a complete specification of the CTMC, in terms of both initial conditions and parameters. While this may be plausible in some cases (e.g. small-scale engineered systems), it is certainly neither valid nor desirable in many others (e.g. biological systems), and it does not lead to a constructive approach to the rational design of systems based on specific requirements. Here we consider the problems of learning and designing CTMCs from observations/requirements formulated in terms of the satisfaction of temporal logic formulae. We recast the problem as learning and maximising an unknown function (the likelihood of the parameters) which can be numerically estimated at any point of the parameter space (at a non-negligible computational cost). We adapt a recently proposed, provably convergent global optimisation algorithm developed in the machine learning community, and demonstrate its efficacy on a number of non-trivial test cases.
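The approach the abstract outlines can be sketched in two pieces: a Monte Carlo estimator of how often simulated CTMC trajectories satisfy a temporal property (a noisy, expensive-to-evaluate objective), and a Gaussian-process-based global optimisation loop that proposes new parameter values to evaluate. The sketch below is illustrative, not the paper's implementation: the toy epidemic model, the reachability property, the parameter range, and the GP hyperparameters are all assumptions, and the simple GP-UCB acquisition rule stands in for the provably convergent algorithm the paper adapts.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_infection(k, n=20, t_max=5.0):
    """Gillespie-style simulation of a toy epidemic CTMC (hypothetical
    model): with i infected nodes, the next infection fires at rate
    k * i * (n - i). Returns the infected count at time t_max."""
    t, i = 0.0, 1
    while i < n:
        rate = k * i * (n - i)
        t += rng.exponential(1.0 / rate)  # exponential holding time
        if t > t_max:
            break
        i += 1
    return i

def satisfaction_prob(k, runs=200):
    """Monte Carlo estimate of the probability that a trajectory
    satisfies an illustrative temporal property: 'at least half the
    nodes are infected by t_max'. This plays the role of the
    expensive, noisy objective evaluated by statistical means."""
    return np.mean([simulate_infection(k) >= 10 for _ in range(runs)])

def rbf(a, b, ell=0.05):
    """Squared-exponential kernel on scalar parameter values."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# GP-UCB loop: fit a GP surrogate to the evaluations so far, then
# evaluate the objective where the upper confidence bound is largest.
grid = np.linspace(0.01, 0.2, 50)      # candidate parameter values
X = [0.01, 0.2]                        # initial design points
y = [satisfaction_prob(x) for x in X]
for _ in range(15):
    Xa = np.array(X)
    K = rbf(Xa, Xa) + 1e-3 * np.eye(len(Xa))  # jitter for stability
    Ks = rbf(grid, Xa)
    mu = Ks @ np.linalg.solve(K, np.array(y))          # posterior mean
    var = 1.0 - np.einsum('ij,ij->i', Ks @ np.linalg.inv(K), Ks)
    ucb = mu + 2.0 * np.sqrt(np.maximum(var, 0.0))     # acquisition
    x_next = float(grid[np.argmax(ucb)])
    X.append(x_next)
    y.append(satisfaction_prob(x_next))

best = X[int(np.argmax(y))]  # parameter maximising the estimated objective
```

The key design point the abstract emphasises is that each objective evaluation is itself a (costly, noisy) statistical estimate, which is why a sample-efficient, uncertainty-aware global optimiser is used instead of a gradient method.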
Keywords: Model Checking · Logical Constraints · Continuous Time Markov Chains · Infected Nodes · Statistical Model Checking