Incremental Learning by Message Passing in Hierarchical Temporal Memory

  • Davide Maltoni
  • Erik M. Rehn
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7477)


Hierarchical Temporal Memory (HTM) is a biologically inspired framework that can be used to learn invariant representations of patterns. Classical HTM learning is mainly unsupervised, and once training is completed the network structure is frozen, making further training quite critical. In this paper we develop a novel technique for incremental supervised HTM learning based on error minimization. We prove that error backpropagation can be naturally and elegantly implemented through native HTM message passing based on Belief Propagation. Our experimental results show that a two-stage training procedure, composed of unsupervised pre-training followed by supervised refinement, is very effective. This is in line with recent findings on other deep architectures.
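The two-stage scheme described in the abstract can be sketched in miniature. The following is an illustrative toy, not the paper's implementation: a single HTM-like node sends a bottom-up message (evidence over its temporal groups, computed from stored coincidence patterns), a top node turns that message into a class posterior, and a supervised refinement step adjusts the top-node weights by gradient descent on a cross-entropy error. All names (`bottom_up_message`, `group_membership`, `W`, the Gaussian-like similarity) are assumptions chosen for the sketch.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def bottom_up_message(x, coincidences, group_membership):
    """Feed-forward message: evidence for each temporal group given input x.
    `coincidences` holds stored prototype patterns (rows); `group_membership[g]`
    lists the coincidence indices belonging to group g."""
    # Gaussian-like similarity of x to each stored coincidence
    y = np.exp(-np.sum((coincidences - x) ** 2, axis=1))
    # group evidence = max over member coincidences (as in classic HTM inference)
    return np.array([y[idx].max() for idx in group_membership])

def class_posterior(msg, W):
    """Top node: combine group evidence with class weights W (classes x groups)."""
    return softmax(W @ msg)

def supervised_refinement(msg, W, target, lr=0.1):
    """One gradient step on the cross-entropy of the class posterior.
    This stands in for the paper's BP-based error propagation, which pushes
    error messages down through the hierarchy rather than only to the top."""
    p = class_posterior(msg, W)
    grad = np.outer(p - target, msg)  # d(cross-entropy)/dW for a softmax output
    return W - lr * grad
```

In this toy, the unsupervised stage corresponds to choosing `coincidences` and `group_membership` (frozen here), while the supervised stage repeatedly calls `supervised_refinement`, pulling the posterior toward the target class without retraining the lower levels.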


Keywords: HTM · Deep architectures · Backpropagation · Incremental learning



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Davide Maltoni (1)
  • Erik M. Rehn (2)
  1. Biometric System Laboratory, DEIS, University of Bologna, Italy
  2. Bernstein Center for Computational Neuroscience, Berlin, Germany
