
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7070)

Abstract

Typically, the first step in carrying out predictions is to develop an inductive model. In many instances a single best model is used, selected according to criteria that may ignore some of the model uncertainty. The Bayesian approach to model selection instead uses a weighted average over a class of models, thereby accounting for some of the uncertainty involved in selecting a single best model; in the literature this is known as Bayesian Model Averaging (BMA). This approach overlaps significantly with the theory of Algorithmic Probability (ALP) developed by R. J. Solomonoff in the early 1960s. The purpose of this article is first to highlight this connection by applying ALP to a set of nested stationary autoregressive time series models, and to give an algorithm for computing “relative weights” of these models. This yields, empirically, a model weight that can be compared with the Schwarz Bayesian information criterion (BIC or SIC). We then develop an elementary Monte Carlo algorithm for evaluating multidimensional integrals over stability domains, and use it to compute what we call the “trimmed weights”.
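
As a rough illustration of the two computational ingredients named above, the sketch below computes BIC-based relative weights for nested AR(k) models and a Monte Carlo estimate of the volume of the AR(k) stationarity (stability) region. This is a minimal Python example under stated assumptions, not the chapter's algorithm: the simulated AR(2) data, the choice of orders 1 to 5, the conditional least-squares fit, and the standard weighting w_k proportional to exp(-BIC_k/2) are all illustrative choices.

import numpy as np
from math import comb

rng = np.random.default_rng(0)

def is_stationary(a):
    """AR(k) model x_t = a[0] x_{t-1} + ... + a[k-1] x_{t-k} + e_t is
    stationary iff all roots of z^k - a[0] z^{k-1} - ... - a[k-1]
    lie strictly inside the unit circle."""
    roots = np.roots(np.concatenate(([1.0], -np.asarray(a, dtype=float))))
    return bool(np.all(np.abs(roots) < 1.0))

def stability_volume(k, n_samples=100_000):
    """Monte Carlo estimate of the volume of the AR(k) stationarity region.
    The region is contained in the box |a_i| <= C(k, i), since the
    coefficients of a monic polynomial with all roots in the unit disk are
    bounded by binomial coefficients; sample uniformly in that box and count hits."""
    bounds = np.array([comb(k, i) for i in range(1, k + 1)], dtype=float)
    box_volume = float(np.prod(2.0 * bounds))
    samples = rng.uniform(-bounds, bounds, size=(n_samples, k))
    hits = sum(is_stationary(a) for a in samples)
    return box_volume * hits / n_samples

def bic_weights(x, max_order):
    """Relative weights for nested AR(1..max_order) models, using the usual
    BIC approximation w_k proportional to exp(-BIC_k / 2), with every order
    fitted by conditional least squares on the same effective sample."""
    x = np.asarray(x, dtype=float)
    n = len(x) - max_order                      # common effective sample size
    bics = []
    for k in range(1, max_order + 1):
        # Regress x_t on its first k lags over t = max_order .. len(x)-1.
        y = x[max_order:]
        X = np.column_stack([x[max_order - j:len(x) - j] for j in range(1, k + 1)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        bics.append(n * np.log(rss / n) + k * np.log(n))
    bics = np.array(bics)
    w = np.exp(-(bics - bics.min()) / 2.0)      # shift BICs for numerical stability
    return w / w.sum()

if __name__ == "__main__":
    # Simulate an AR(2) series and compare weights across orders 1..5.
    n = 500
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
    print("BIC weights:", np.round(bic_weights(x, 5), 3))
    print("AR(2) stationarity volume estimate:", round(stability_volume(2), 3), "(exact 4)")

The binomial-coefficient box in stability_volume is only a convenient enclosing region; the hit fraction times the box volume gives the Monte Carlo volume estimate, which can be checked against the exact values 2 for k = 1 and 4 for k = 2.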


References

1. Akaike, H.: A new look at the statistical model identification. IEEE Trans. Aut. Control 19, 716–723 (1974)
2. Akaike, H.: Information measures and model selection. Bull. of Int. Stat. Inst. 50, 277–290 (1983)
3. Burnham, K.P., Anderson, D.R.: Model selection and multimodel inference: a practical information-theoretic approach, 2nd edn. Springer, N.Y. (2002)
4. Barndorff-Nielsen, O., Schou, G.: On the parametrization of autoregressive models by partial autocorrelations. J. Multivar. Anal. 3, 408–419 (1973)
5. Chen, C., Davis, R.A., Brockwell, P.J., Bai, Z.D.: Order determination for autoregressive processes using resampling methods. Stat. Sinica 3, 481–500 (1993)
6. Fam, A.T.: The volume of the coefficient space stability domain of monic polynomials. In: IEEE Int. Symp. Circuits and Systems, Portland, Oregon, vol. 2, pp. 1780–1783 (1989)
7. Fitzgibbon, L.J., Dowe, D.L., Vahid, F.: Minimum message length autoregressive model order selection. In: Palaniswami, M., Chandra Sekhar, C., Kumar Venayagamoorthy, G., Mohan, S., Ghantasala, M.K. (eds.) International Conference on Intelligent Sensing and Information Processing (ICISIP), Chennai, India, January 4–7, pp. 439–444 (2004)
8. Hutter, M.: Algorithmic information theory. Scholarpedia 2(3), 2519 (2007)
9. Hoeting, J.A., Madigan, D., Raftery, A.E., Volinsky, C.T.: Bayesian model averaging: a tutorial. Statistical Science 14, 382–401 (1999)
10. Hurvich, C.M., Tsai, C.-L.: Regression and time series model selection in small samples. Biometrika 76, 297–307 (1989)
11. Jones, M.C.: Randomly choosing parameters from the stationary and invertibility regions of autoregressive-moving average models. J. Roy. Stat. Soc., Series C (Appl. Stat.) 36, 134–138 (1987)
12. Knuth, D.: The art of computer programming, vol. 2: Seminumerical algorithms, 3rd edn. Addison-Wesley (1997)
13. Kass, R.E., Raftery, A.E.: Bayes factors. J. Amer. Stat. Assoc. 90, 773–795 (1995)
14. Liang, F., Barron, A.: Minimax strategies for predictive density estimation, data compression, and model selection. IEEE Trans. Info. Th. 50, 2708–2726 (2004)
15. Li, M., Vitányi, P.: An introduction to Kolmogorov complexity and its applications. Springer, N.Y. (1997)
16. Makhoul, J.: Linear prediction: a tutorial review. Proc. IEEE 63, 561–580 (1975)
17. Monahan, J.F.: A note on enforcing stationarity in autoregressive-moving average models. Biometrika 71, 403–404 (1984)
18. Nikolaev, Y.P.: The multidimensional asymptotic stability domain of linear discrete systems: its symmetry and other properties. Aut. and Rem. Control 62, 109–120 (2001)
19. Piccolo, D.: The size of the stationarity and invertibility region of an autoregressive-moving average process. J. of Time Series Analysis 3, 245–247 (1982)
20. Rissanen, J.: Modeling by shortest data description. Automatica 14, 465–471 (1978)
21. Schwarz, G.: Estimating the dimension of a model. Ann. of Stat. 6, 461–464 (1978)
22. Shlien, S.: A geometric description of stable linear predictive coding digital filters. IEEE Trans. Info. Th. 31, 545–548 (1985)
23. Solomonoff, R.J.: A preliminary report on a general theory of inductive inference. Report, Zator Co., Cambridge, MA (1960)
24. Solomonoff, R.J.: A formal theory of inductive inference, Parts I and II. Inform. and Control 7, 1–22, 224–254 (1964)
25. Solomonoff, R.J.: The discovery of algorithmic probability. J. Comp. & Sys. Sci. 55, 73–88 (1997)
26. Solomonoff, R.J.: Algorithmic probability: theory and applications. In: Emmert-Streib, F., Dehmer, M. (eds.) Information Theory and Statistical Learning, pp. 1–23. Springer Science+Business Media, N.Y. (2009)
27. Wallace, C.S., Boulton, D.M.: An information measure for classification. Comput. J. 11, 185–194 (1968)
28. Wallace, C.S., Dowe, D.L.: Minimum message length and Kolmogorov complexity. Computer J. 42, 270–283 (1999)



Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Solomonoff, R.J., Saleeby, E.G. (2013). On the Application of Algorithmic Probability to Autoregressive Models. In: Dowe, D.L. (eds) Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence. Lecture Notes in Computer Science, vol 7070. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-44958-1_29


  • DOI: https://doi.org/10.1007/978-3-642-44958-1_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-44957-4

  • Online ISBN: 978-3-642-44958-1

  • eBook Packages: Computer Science, Computer Science (R0)
