
Sliding Window Abstraction for Infinite Markov Chains

  • Thomas A. Henzinger
  • Maria Mateescu
  • Verena Wolf
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5643)

Abstract

We present an on-the-fly abstraction technique for infinite-state continuous-time Markov chains. We consider Markov chains that are specified by a finite set of transition classes. Such models naturally represent biochemical reactions and therefore play an important role in the stochastic modeling of biological systems. We approximate the transient probability distributions at various time instants by solving a sequence of dynamically constructed abstract models, each depending on the previous one. Each abstract model is a finite Markov chain that represents the behavior of the original, infinite chain during a specific time interval. Our approach provides complete information about probability distributions, not just about individual parameters like the mean. The error of each abstraction can be computed, and the precision of the abstraction can be refined when desired. We implemented the algorithm and demonstrate its usefulness and efficiency on several case studies from systems biology.
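
To make the sliding-window idea concrete, the sketch below applies it to a simple birth-death chain (constant production, linear degradation), a minimal instance of a transition-class model. The rate constants, thresholds, window heuristic, and explicit-Euler integrator are illustrative assumptions for this example and are not the authors' implementation; the point is only to show how a finite window of significant states is advanced and re-selected, and how the probability mass that escapes the window gives a computable error estimate.

# Hypothetical, simplified sliding-window transient solver for a birth-death
# chain with production rate K and degradation rate GAMMA * x.  All constants
# and helper names are assumptions made for this sketch.
import numpy as np

K, GAMMA = 10.0, 1.0      # reaction-rate constants of the example model
DELTA = 1e-8              # probability threshold for keeping a state
BUFFER = 20               # extra states added around the significant region

def window(prob):
    """Finite window: states with significant mass plus a buffer for growth."""
    significant = [x for x, p in prob.items() if p > DELTA]
    lo, hi = max(0, min(significant) - BUFFER), max(significant) + BUFFER
    return list(range(lo, hi + 1))

def generator(states):
    """Rate matrix Q restricted to the window; outflow from the window is lost."""
    index = {x: i for i, x in enumerate(states)}
    Q = np.zeros((len(states), len(states)))
    for x, i in index.items():
        for rate, y in ((K, x + 1), (GAMMA * x, x - 1)):   # two transition classes
            if rate == 0.0:
                continue
            Q[i, i] -= rate
            if y in index:
                Q[i, index[y]] += rate
    return Q

def step(prob, dt, substeps=2000):
    """Advance the distribution by dt with explicit Euler steps on p' = p Q."""
    states = window(prob)
    Q = generator(states)
    p = np.array([prob.get(x, 0.0) for x in states])
    h = dt / substeps
    for _ in range(substeps):
        p = p + h * (p @ Q)
    return {x: pi for x, pi in zip(states, p) if pi > DELTA}

prob = {0: 1.0}                          # start with zero molecules
for _ in range(10):                      # ten sliding-window steps, t = 0 .. 5
    prob = step(prob, dt=0.5)
error = 1.0 - sum(prob.values())         # probability mass lost so far
mean = sum(x * p for x, p in prob.items())
print(f"mean copy number ~ {mean:.2f}, accumulated error <= {error:.2e}")

Each pass of the loop plays the role of one abstract model in the sequence: it covers one time interval, and the probability mass that leaves the window during that interval is exactly the quantity whose accumulation can be monitored and, if desired, reduced by a more generous window (a smaller DELTA or a larger BUFFER in this sketch).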

Keywords

Markov chain · Abstract model · Probability mass · Transition class · Reachable state

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Thomas A. Henzinger (1)
  • Maria Mateescu (1)
  • Verena Wolf (1, 2)
  1. EPFL, Switzerland
  2. Saarland University, Germany
