Generic Construction of Scale-Invariantly Coarse Grained Memory

  • Conference paper
Artificial Life and Computational Intelligence (ACALCI 2015)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8955)

Abstract

Encoding temporal information from the recent past as spatially distributed activations is essential if the entire recent past is to be simultaneously accessible. Any biological or synthetic agent that relies on the past to predict or plan the future would be endowed with such a spatially distributed temporal memory. Simplistically, we would expect resource limitations to demand that the memory system store only the information most useful for future prediction. For natural real-world signals, which exhibit scale-free temporal fluctuations, the predictive information encoded in memory is maximal when the past information is scale-invariantly coarse grained. Here we examine the generic mechanism for constructing a scale-invariantly coarse grained memory system. Remarkably, the generic construction is equivalent to encoding linear combinations of the Laplace transform of the past information and their approximated inverses. This reveals a fundamental constraint on the construction of memory networks that attempt to maximize the storage of predictive information relevant to the natural world.
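
To make the construction concrete, here is a minimal numerical sketch, not the paper's implementation: a bank of leaky integrators with geometrically spaced decay rates s encodes the Laplace transform of the input's past, and a fixed linear combination of neighboring nodes (a k-fold finite difference, in the spirit of Post's classical approximate inversion of the Laplace transform) decodes a coarse grained estimate of the input as it was a time tau = k/s in the past. Everything specific below, the node count, the decay-rate range, the order k, and the test signal, is an illustrative assumption.

```python
import numpy as np
from math import factorial

# Minimal sketch of a scale-invariantly coarse grained memory built from
# (i) a bank of leaky integrators encoding the Laplace transform of the
# past input and (ii) fixed linear combinations of neighboring nodes that
# approximately invert it (Post's formula). All parameter choices here
# are illustrative assumptions, not values prescribed by the paper.

k = 4                                  # order of the approximate inversion
n_nodes = 64                           # size of the integrator bank
# Geometrically spaced decay rates: adjacent timescales 1/s differ by a
# constant ratio, which is what makes the representation scale invariant.
s = np.geomspace(0.1, 100.0, n_nodes)

dt = 1e-3
times = np.arange(0.0, 10.0, dt)

# Test input: a unit-area pulse at t = 1 s.
f = np.zeros_like(times)
f[int(1.0 / dt)] = 1.0 / dt

# Layer 1 (Laplace transform): each node obeys dF/dt = -s F + f(t), so
# F(s, t) is the integral of exp(-s (t - t')) f(t') over the past.
F = np.zeros((len(times), n_nodes))
for i in range(1, len(times)):
    F[i] = F[i - 1] + dt * (-s * F[i - 1] + f[i - 1])

# Layer 2 (approximate inverse): Post's inversion formula,
#   f~(tau) = ((-1)^k / k!) * s^(k+1) * d^k F / d s^k,  with tau = k / s,
# where the s-derivative is approximated by k-fold finite differences
# across neighboring nodes, i.e. a fixed linear combination of the F's.
dF, s_mid = F, s
for _ in range(k):
    dF = np.diff(dF, axis=1) / np.diff(s_mid)
    s_mid = 0.5 * (s_mid[1:] + s_mid[:-1])

f_tilde = ((-1) ** k / factorial(k)) * s_mid ** (k + 1) * dF
tau = k / s_mid                        # how far in the past each node looks

# At t = 10 s the pulse lies 9 s in the past: the activation bump across
# the bank peaks at the node with tau ~ 9 s, and its width grows linearly
# with tau (scale-invariant smearing).
snapshot = f_tilde[-1]
print(tau[np.argmax(snapshot)])        # roughly 9
```

The geometric spacing of the decay rates is what delivers the scale invariance: stretching the input in time by any factor simply translates the pattern of activation across the bank, so every past timescale is represented with the same relative resolution.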

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Shankar, K.H. (2015). Generic Construction of Scale-Invariantly Coarse Grained Memory. In: Chalup, S.K., Blair, A.D., Randall, M. (eds) Artificial Life and Computational Intelligence. ACALCI 2015. Lecture Notes in Computer Science (LNAI), vol 8955. Springer, Cham. https://doi.org/10.1007/978-3-319-14803-8_14

  • DOI: https://doi.org/10.1007/978-3-319-14803-8_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-14802-1

  • Online ISBN: 978-3-319-14803-8

  • eBook Packages: Computer Science, Computer Science (R0)
