
Free Energy of Stochastic Context Free Grammar on Variational Bayes

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4232)

Abstract

Variational Bayesian learning has been proposed as an approximation method for Bayesian learning. Despite its efficiency and good experimental performance, its mathematical properties have not yet been clarified. In this paper we analyze variational Bayesian learning for a stochastic context-free grammar whose model includes the true distribution, so that the model is non-identifiable. We derive the asymptotic form of the variational free energy and show that, under certain conditions on the prior, the free energy is much smaller than that of identifiable models and that redundant non-terminals are eliminated.
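As background, here is a minimal sketch of the quantities the abstract refers to, using standard definitions from the variational Bayes literature rather than the paper's own notation. For observed data X^n, hidden variables Y^n (for a stochastic context-free grammar, the derivation trees), parameters w, and prior \varphi(w), the Bayesian free energy (stochastic complexity) is

    F(X^n) = -\log \int \sum_{Y^n} p(X^n, Y^n \mid w)\, \varphi(w)\, dw,

and variational Bayes minimizes an upper bound over a factorized trial distribution q(Y^n, w) = q(Y^n)\, q(w):

    \overline{F}(X^n) = \min_{q} \; \mathbb{E}_{q}\!\left[ \log \frac{q(Y^n, w)}{p(X^n, Y^n \mid w)\, \varphi(w)} \right] \;\ge\; F(X^n),

where the gap is the Kullback-Leibler divergence from q to the true posterior. In asymptotic analyses of this kind the bound typically behaves as \overline{F}(X^n) = \lambda \log n + O(1), where the coefficient \lambda measures the effective model complexity; the paper's result concerns this coefficient for non-identifiable SCFGs and its dependence on the prior.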





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hosino, T., Watanabe, K., Watanabe, S. (2006). Free Energy of Stochastic Context Free Grammar on Variational Bayes. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_46


  • DOI: https://doi.org/10.1007/11893028_46

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-46479-2

  • Online ISBN: 978-3-540-46480-8

  • eBook Packages: Computer Science (R0)
