Journal of Statistical Physics 136, 1005 (2009)

Prediction, Retrodiction, and the Amount of Information Stored in the Present

  • Christopher J. Ellison
  • John R. Mahoney
  • James P. Crutchfield
Open Access

Abstract
We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy (a familiar measure of organization in complex systems) is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to calculate the excess entropy directly. Conceptually, these results lead to two new measures for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all of these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
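For orientation, the central quantities can be stated compactly. The following is a minimal sketch in standard computational-mechanics notation, not a substitute for the paper's precise definitions: \overleftarrow{X} and \overrightarrow{X} denote the process's past and future, \mathcal{S}^{+} and \mathcal{S}^{-} the forward (predictive) and reverse (retrodictive) causal states, and C_{\mu}^{\pm} their statistical complexities.

  % Excess entropy as a mutual information, stated two equivalent ways:
  E = I[\,\overleftarrow{X} ; \overrightarrow{X}\,] = I[\,\mathcal{S}^{+} ; \mathcal{S}^{-}\,]

  % Crypticity: forward-stored information beyond what past and future share:
  \chi^{+} = C_{\mu}^{+} - E

  % Causal irreversibility: asymmetry between forward and reverse storage:
  \Xi = C_{\mu}^{+} - C_{\mu}^{-}

Since C_{\mu}^{\pm} \geq E, both crypticity and causal irreversibility are nonnegative in this sketch; they vanish only when the stored state information is fully visible in the observations and symmetric under time reversal, respectively.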


Keywords: Stored information · Entropy rate · Statistical complexity · Excess entropy · Causal irreversibility · Crypticity



Copyright information

© The Author(s) 2009

Authors and Affiliations

  • Christopher J. Ellison (1)
  • John R. Mahoney (1)
  • James P. Crutchfield (1, 2)

  1. Complexity Sciences Center and Physics Department, University of California at Davis, Davis, USA
  2. Santa Fe Institute, Santa Fe, USA
