Additional information
This research was partly supported by a National Grant in Aid for Scientific Research, 1976/77, no. 220928.
The Institute of Statistical Mathematics
Shimizu, R. Entropy maximization principle and selection of the order of an autoregressive Gaussian process. Ann Inst Stat Math 30, 263–270 (1978). https://doi.org/10.1007/BF02480217