ConSTrainer: A Generic Toolkit for Connectionist Dataset Selection

  • Apostolos N. Refenes
Part of the Informatik-Fachberichte book series (INFORMATIK, volume 252)


ConSTrainer is a window-based toolkit dedicated to the task of collecting and validating datasets for training connectionist networks. Unlike other connectionist development tools, ConSTrainer is an application- and network-independent tool which can be configured to suit the requirements of a variety of applications through a simple-to-use configuration facility. The facility allows the user to create and modify both domains/ranges and domain/range parameters. For each parameter in a training exemplar, ConSTrainer supports the definition of mutually supportive and mutually exclusive parameter sets. A powerful set of consistency and validation checks is also supported, including vector orthogonality, weight-sum checking, and re-ordering of the training dataset. This paper introduces the ConSTrainer toolkit and discusses its use in a non-trivial application for diagnostic decision support in histopathology.
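The paper does not give ConSTrainer's implementation; purely as an illustration of the kind of consistency checks the abstract names, the vector-orthogonality and weight-sum validations might be sketched as follows (all function names and the tolerance parameter are hypothetical, not taken from the toolkit):

```python
def dot(u, v):
    """Inner product of two equal-length training vectors."""
    return sum(a * b for a, b in zip(u, v))

def check_orthogonality(vectors, tol=1e-9):
    """Return index pairs of training vectors that are NOT mutually
    orthogonal, i.e. whose inner product exceeds the tolerance."""
    violations = []
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            if abs(dot(vectors[i], vectors[j])) > tol:
                violations.append((i, j))
    return violations

def check_weight_sum(exemplar, expected=1.0, tol=1e-9):
    """Verify that the components of one parameter vector sum to an
    expected total (a simple weight-sum check)."""
    return abs(sum(exemplar) - expected) <= tol
```

A validator in this style would report the offending exemplar pairs rather than silently dropping them, leaving the re-ordering or correction of the training dataset to the user.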





Copyright information

© Springer-Verlag Berlin Heidelberg 1990

Authors and Affiliations

  • Apostolos N. Refenes
  1. Department of Computer Science, University College London, London, UK
