Flow-Based Bayesian Estimation of Nonlinear Differential Equations for Modeling Biological Networks

  • Nicolas J.-B. Brunel
  • Florence d’Alché-Buc
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6282)

Abstract

We consider the problem of estimating parameters and unobserved trajectories in nonlinear ordinary differential equations (ODEs) from noisy and partially observed data. We focus on a class of state-space models whose evolution equation is defined by the integration of the differential equation. Within a Bayesian framework, we derive a non-sequential estimation procedure that infers both the parameters and the initial condition of the ODE, since both are required to fully characterize its solution. This point of view, new in the context of state-space models, modifies the learning problem. To evaluate the relevance of this approach, we use Adaptive Importance Sampling within a Population Monte Carlo scheme to approximate the posterior probability distribution. We compare this approach to recursive estimation via Unscented Kalman Filtering on two reverse-modeling problems in systems biology. On both problems, our method improves on the classical smoothing methods used in state-space models for the estimation of unobserved trajectories.
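
As a companion illustration (not part of the original paper), the following minimal Python sketch shows the kind of Population Monte Carlo / adaptive importance sampling loop the abstract refers to, applied to joint inference of ODE parameters and the initial condition from noisy, partially observed data. The two-species toy network, the flat prior, the Gaussian noise level and every tuning constant are illustrative assumptions, not the authors' model or code.

```python
# A minimal sketch (not the authors' implementation): Population Monte Carlo /
# adaptive importance sampling for the joint posterior of ODE parameters and
# the initial condition, with a partially observed two-species toy network.
# The ODE, prior, noise level and all tuning constants are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

def ode_rhs(t, x, theta):
    # Hypothetical nonlinear network: saturating production, linear degradation.
    k1, k2 = theta
    return [k1 / (1.0 + x[1] ** 2) - 0.5 * x[0],
            k2 * x[0] - 0.3 * x[1]]

def log_likelihood(z, t_obs, y_obs, sigma=0.1):
    # z = (k1, k2, x0_1, x0_2); Gaussian noise on the observed first component only.
    theta, x0 = z[:2], z[2:]
    sol = solve_ivp(ode_rhs, (t_obs[0], t_obs[-1]), x0, args=(theta,),
                    t_eval=t_obs, rtol=1e-6)
    if not sol.success:
        return -np.inf
    return -0.5 * np.sum((y_obs - sol.y[0]) ** 2) / sigma ** 2

def log_prior(z):
    # Flat prior on a positivity box (assumed).
    return 0.0 if np.all(z > 0.0) and np.all(z < 10.0) else -np.inf

# Synthetic, partially observed data: the second component stays hidden.
t_obs = np.linspace(0.0, 10.0, 25)
true_z = np.array([2.0, 1.0, 0.5, 0.5])
y_obs = solve_ivp(ode_rhs, (0.0, 10.0), true_z[2:], args=(true_z[:2],),
                  t_eval=t_obs).y[0] + 0.1 * rng.standard_normal(t_obs.size)

# PMC loop: sample from the current proposal, importance-weight against the
# (unnormalized) posterior, then adapt the proposal to the weighted population.
n_particles, n_iter = 500, 8
mean, cov = np.full(4, 1.0), np.eye(4)        # initial Gaussian proposal (assumed)
for _ in range(n_iter):
    particles = rng.multivariate_normal(mean, cov, size=n_particles)
    log_q = multivariate_normal.logpdf(particles, mean, cov)
    log_w = np.array([log_prior(z) + log_likelihood(z, t_obs, y_obs)
                      for z in particles]) - log_q
    w = np.exp(log_w - np.max(log_w))
    w /= w.sum()
    mean = w @ particles                                      # weighted posterior mean
    cov = np.cov(particles.T, aweights=w) + 1e-6 * np.eye(4)  # adapted proposal covariance

print("approximate posterior mean (k1, k2, x0_1, x0_2):", mean)
```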

Keywords

Importance Sampling · Hidden State · Unscented Kalman Filter · Proposal Distribution · Posterior Probability Distribution


Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Nicolas J.-B. Brunel (1)
  • Florence d’Alché-Buc (1, 2)
  1. Laboratoire IBISC, Université d’Evry, France
  2. URA 2171, Institut Pasteur, France
