Exploiting Natural Asynchrony and Local Knowledge within Systemic Computation to Enable Generic Neural Structures

  • Erwan Le Martelot
  • Peter J. Bentley
  • R. Beau Lotto
Conference paper
Part of the Proceedings in Information and Communications Technology book series (PICT, volume 1)


Bio-inspired processes are increasingly used in today's technologies, yet their modelling and implementation tend to drift from the original concepts because of the limitations of the classical computation paradigm. To address this, systemic computation (SC), a model of interacting systems with natural characteristics, was introduced, followed by a modelling platform with a bio-inspired system implementation. In this paper, we investigate the impact of local knowledge and asynchronous computation: significant natural properties of biological neural networks (NNs) that SC handles natively. We present a bio-inspired model of artificial NNs, focusing on agent interactions, and show that exploiting these built-in properties, which come for free, enables flexibility in neural structure without reducing performance.
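The idea that asynchronous, locally-scoped interactions need not change what a network computes can be sketched in a few lines. The following is a minimal illustration, not the authors' SC platform: the 2-2-1 topology, the weights, and all function names are invented for the example. Each synapse fires as an independent local interaction, in a random order, touching only the two neurons it connects; because summation is commutative, every interaction order yields the same output.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical 2-2-1 network (weights chosen arbitrarily for illustration).
# Each synapse is a local interaction: (source neuron, target neuron, weight).
synapses = [
    ("x0", "h0", 0.5), ("x1", "h0", -0.4),
    ("x0", "h1", 0.3), ("x1", "h1", 0.8),
    ("h0", "y", 1.2), ("h1", "y", -0.7),
]
layers = [["x0", "x1"], ["h0", "h1"], ["y"]]

def async_forward(inputs, seed):
    """Propagate activations one synapse at a time, in random order.

    Each step uses only local knowledge: the activation of the source
    neuron and the net input of the target neuron.
    """
    act = dict(inputs)
    net = {n: 0.0 for layer in layers[1:] for n in layer}
    rng = random.Random(seed)
    for layer in layers[1:]:
        # Synapses feeding this layer fire asynchronously (shuffled order);
        # since addition is commutative, the order does not matter.
        pending = [s for s in synapses if s[1] in layer]
        rng.shuffle(pending)
        for src, dst, w in pending:
            net[dst] += w * act[src]
        for n in layer:
            act[n] = sigmoid(net[n])
    return act["y"]

out1 = async_forward({"x0": 1.0, "x1": 0.0}, seed=1)
out2 = async_forward({"x0": 1.0, "x1": 0.0}, seed=42)
assert abs(out1 - out2) < 1e-12  # order-independent result
```

Two runs with different interaction orders produce identical outputs, which is the property that lets an SC implementation dispense with a global, synchronous forward pass.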


Keywords: Local Knowledge, Back Propagation, Systemic Computation, Natural Characteristic, Global Algorithm





Copyright information

© Springer Tokyo 2009

Authors and Affiliations

  • Erwan Le Martelot (1, 3)
  • Peter J. Bentley (2)
  • R. Beau Lotto (3)
  1. Engineering Department, University College London, London, UK
  2. Computer Science Department, University College London, London, UK
  3. Institute of Ophthalmology, University College London, London, UK
