Logical Connectionist Systems

  • I. Aleksander
Part of the Springer Study Edition book series (volume 41)


A universal node model is assumed in this general analysis of connectionist nets. It is based on a logic truth table with a probabilistic element, and it is argued that this definition subsumes others. Algorithms are developed for training and testing techniques that involve progressively reducing amounts of noise, giving a new perspective on annealing. The principle is further applied to ‘hard’ learning and shown to be achievable on the notorious parity-checking problem. The performance of the logic-probabilistic system is shown to be two orders of magnitude better than known back-error propagation techniques which have used this task as a benchmark.


Keywords: Parity Checker · Trained State · Connectionist System · Boltzmann Machine · Probabilistic Node




References

1. Rumelhart D.E. and McClelland J.L. (eds.): Parallel Distributed Processing, Vols. 1 & 2, MIT Press, Cambridge, Mass., 1986.
2. Aleksander I.: Adaptive Vision Systems and Boltzmann Machines: a Rapprochement, Pattern Recognition Letters, Vol. 6, pp. 113–120, July 1987.
3. Hopfield J.J.: Neural Networks and Physical Systems with Emergent Computational Abilities, Proceedings of the National Academy of Sciences, U.S.A., Vol. 79, pp. 2554–2558, 1982.
4. Aleksander I., Thomas W.V., and Bowden P.A.: WISARD, a Radical Step Forward in Image Recognition, Sensor Review, Vol. 4, No. 3, pp. 120–124, 1984.
5. Aleksander I.: Brain Cell to Microcircuit, Electronics and Power, Vol. 16, pp. 48–51, 1970.
6. Rumelhart D.E., Hinton G.E., and Williams R.J.: Learning Internal Representations by Error Propagation, in Rumelhart D.E. and McClelland J.L. (eds.): Parallel Distributed Processing, Vol. 1, MIT Press, Cambridge, Mass., 1986.
7. Minsky M. and Papert S.: Perceptrons: an Introduction to Computational Geometry, MIT Press, Boston, 1969.
8. Hinton G.E., Sejnowski T.J., and Ackley D.H.: Boltzmann Machines: Constraint Satisfaction Networks that Learn, Tech. Rep. CMU-CS-84-119, Carnegie Mellon University, Pittsburgh, 1984.
9. Kauffman S.A.: Metabolic Stability and Epigenesis in Randomly Constructed Genetic Nets, J. Theoret. Biol., Vol. 22, pp. 437–467, 1969.
10. Aleksander I. and Atlas P.: Cyclic Activity in Nature: Causes of Stability, Int. J. of Neuroscience, Vol. 6, pp. 45–50, 1973.
11. Aleksander I.: The Logic of Connectionist Systems, Neural Net Research Report, Imperial College, London, August 1987.

Copyright information

© Springer-Verlag Berlin Heidelberg 1989

Authors and Affiliations

  • I. Aleksander
  1. Department of Computing, Imperial College of Science and Technology, London, England
