Abstract
Encoding temporal information from the recent past as spatially distributed activations is essential for the entire recent past to be simultaneously accessible. Any biological or synthetic agent that relies on the past to predict or plan the future would benefit from such a spatially distributed temporal memory. In the simplest view, resource limitations demand that the memory system store only the information most useful for future prediction. For natural real-world signals, which exhibit scale-free temporal fluctuations, the predictive information encoded in memory is maximized when past information is coarse grained in a scale-invariant fashion. Here we examine a general mechanism for constructing a scale-invariantly coarse grained memory system. Remarkably, the generic construction is equivalent to encoding linear combinations of the Laplace transform of the past information together with their approximated inverses. This reveals a fundamental constraint on the construction of memory networks that aim to maximize the storage of predictive information relevant to the natural world.
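The construction summarized above — encoding the Laplace transform of the past input in a bank of leaky integrators and approximately inverting it — can be illustrated numerically. The sketch below is not the paper's implementation; the decay-rate spacing, the derivative order `k`, the function names, and the Euler integration scheme are all illustrative assumptions. It feeds a unit-area pulse into the integrator bank, waits, and then reconstructs a fuzzy timeline whose peak tracks the elapsed time and whose blur scales with it:

```python
import math
import numpy as np

# Hedged sketch, assuming: (i) a bank of leaky integrators F(s) encoding
# the Laplace transform of the past input, and (ii) an approximate inverse
# Laplace transform via Post's formula (a k-th derivative across the
# decay-rate axis s). Parameter choices are illustrative, not the paper's.

def run_memory(signal, dt, s_nodes):
    """Euler-integrate dF/dt = -s*F + f(t) for each decay rate s."""
    F = np.zeros_like(s_nodes)
    for f_t in signal:
        F += dt * (-s_nodes * F + f_t)
    return F

def approx_inverse(F, s_nodes, k=4):
    """Post approximation of the inverse Laplace transform:
    T(tau*) ~ (-1)^k / k! * s^(k+1) * d^k F / ds^k, with tau* = k/s."""
    dF = F.copy()
    for _ in range(k):
        dF = np.gradient(dF, s_nodes)  # numerical derivative over s
    tau_star = k / s_nodes
    T = ((-1) ** k / math.factorial(k)) * s_nodes ** (k + 1) * dF
    return tau_star, T

if __name__ == "__main__":
    # A unit-area pulse at t = 0, observed 5 s later: the reconstruction
    # should peak near tau* ~ 5, with a blur proportional to elapsed time.
    dt, n_steps = 0.01, 500
    s_nodes = np.geomspace(0.05, 10.0, 400)  # geometrically spaced rates
    signal = np.zeros(n_steps)
    signal[0] = 1.0 / dt
    F = run_memory(signal, dt, s_nodes)
    tau_star, T = approx_inverse(F, s_nodes, k=4)
    inner = slice(20, -20)  # avoid derivative edge artifacts
    peak = tau_star[inner][np.argmax(T[inner])]
    print(f"reconstruction peaks near tau* = {peak:.2f}")
```

Because the reconstruction nodes sit at tau* = k/s with geometrically spaced s, the timeline is sampled on a logarithmic scale of past times — the scale-invariant coarse graining the abstract refers to.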
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Shankar, K.H. (2015). Generic Construction of Scale-Invariantly Coarse Grained Memory. In: Chalup, S.K., Blair, A.D., Randall, M. (eds) Artificial Life and Computational Intelligence. ACALCI 2015. Lecture Notes in Computer Science(), vol 8955. Springer, Cham. https://doi.org/10.1007/978-3-319-14803-8_14
Print ISBN: 978-3-319-14802-1
Online ISBN: 978-3-319-14803-8