Adaptive Sparse Bayesian Regression with Variational Inference for Parameter Estimation

  • Satoru Koda
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10029)

Abstract

The relevance vector machine (RVM) is a sparse Bayesian modeling tool for regression analysis. Because it can estimate complex relationships among variables while yielding sparse models, it is regarded as an efficient tool. However, the accuracy of RVM models depends strongly on the choice of their kernel parameters. This article presents a kernel parameter estimation method based on variational inference. The approach is highly adaptive, enabling RVM models to capture nonlinearity and local structure automatically. We applied the proposed method to artificial and real datasets; the results show that it achieves more accurate regression than other RVMs.
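For orientation, the sketch below implements the classical fixed-kernel RVM regression baseline that the abstract contrasts against: a Gaussian kernel with a hand-set width, fit by the standard evidence (type-II maximum likelihood) re-estimation updates. The function names, the width value, and the sinc toy data are illustrative assumptions; the sketch does not reproduce the paper's variational estimation of the kernel parameter.

import numpy as np

def rbf_design(X, centers, width):
    # Gaussian (RBF) basis functions; `width` is the kernel parameter
    # whose selection the paper addresses (fixed here for illustration).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def rvm_regression(X, t, width=2.0, n_iter=200, prune_at=1e6):
    # Standard RVM evidence re-estimation updates with a fixed kernel width.
    Phi = rbf_design(X, X, width)                    # N x N design matrix
    N, M = Phi.shape
    alpha = np.ones(M)                               # per-weight precisions
    beta = 1.0 / np.var(t)                           # noise precision
    keep = np.arange(M)                              # surviving basis functions
    mu = np.zeros(M)
    for _ in range(n_iter):
        Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)  # posterior covariance
        mu = beta * Sigma @ Phi.T @ t                               # posterior mean
        gamma = 1.0 - alpha * np.diag(Sigma)                        # well-determined weights
        alpha = gamma / (mu ** 2 + 1e-12)                           # re-estimate precisions
        beta = (N - gamma.sum()) / np.sum((t - Phi @ mu) ** 2)      # re-estimate noise
        mask = alpha < prune_at                       # prune weights driven to zero
        Phi, alpha, mu, keep = Phi[:, mask], alpha[mask], mu[mask], keep[mask]
    return mu, keep, beta

# Toy usage: noisy sinc data; only a small subset of inputs survives as relevance vectors.
rng = np.random.default_rng(0)
X = np.linspace(-10.0, 10.0, 100)[:, None]
t = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(100)
mu, keep, beta = rvm_regression(X, t, width=2.0)
print("relevance vectors retained:", len(keep))

In this baseline the kernel width must be tuned externally, for example by cross-validation; the paper's contribution is to estimate it within the variational inference procedure itself.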

Keywords

Posterior distribution · Bayesian information criterion · Support vector regression · Marginal likelihood · Kernel parameter

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Graduate School of Mathematics, Kyushu University, Fukuoka, Japan