A Dynamics of the Hough Transform and Artificial Neural Networks

  • Atsushi Imiya
  • Kazuhiko Kawamoto
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1715)


The least-squares method solves model-fitting problems efficiently when the model equations are given in advance. Fitting a collection of models, however, requires classifying the data points as a preprocessing step. We show that the randomized Hough transform simultaneously achieves model fitting by the least-squares method and classification of sample points by permutation. Furthermore, we derive a dynamical system for line detection by the Hough transform that achieves the grouping of sample points as a permutation of the data sequence. The theoretical analysis in this paper verifies the reliability of Hough-transform-based template matching for the detection of shapes in a scene.
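The randomized Hough transform summarized above can be sketched in a few lines: sample pairs of points, vote for the normal-form line parameters through each pair, and read off the peak cell, which both fits a line and identifies the group of points that generated it. This is a minimal illustrative sketch, not the authors' implementation; the function name, bin resolutions, and sampling budget are assumptions.

```python
import math
import random
from collections import defaultdict

def randomized_hough_line(points, n_samples=500, theta_bins=180, rho_res=1.0, seed=0):
    """Randomized Hough transform for a single line: repeatedly sample a
    pair of points, compute the normal form (theta, rho) of the line
    through the pair, and vote in a sparse accumulator.  The peak cell
    simultaneously fits the line and groups the points that voted for it.
    (Illustrative sketch; parameters are assumed, not from the paper.)"""
    rng = random.Random(seed)
    acc = defaultdict(int)
    pts = list(points)
    for _ in range(n_samples):
        (x1, y1), (x2, y2) = rng.sample(pts, 2)
        dx, dy = x2 - x1, y2 - y1
        if dx == 0 and dy == 0:
            continue  # coincident points define no line
        # Normal form x*cos(theta) + y*sin(theta) = rho, with theta in [0, pi)
        theta = (math.atan2(dy, dx) + math.pi / 2.0) % math.pi
        rho = x1 * math.cos(theta) + y1 * math.sin(theta)
        cell = (int(theta / math.pi * theta_bins) % theta_bins,
                round(rho / rho_res))
        acc[cell] += 1
    (t_idx, r_idx), votes = max(acc.items(), key=lambda kv: kv[1])
    return (t_idx + 0.5) * math.pi / theta_bins, r_idx * rho_res, votes
```

Repeating the procedure after removing the points consistent with the detected line extends the sketch to a collection of lines, which corresponds to the classification of data among several models discussed in the abstract.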


Keywords: Machine Vision, Line Detection, Linear Manifold, Hough Transform, Planar Line





Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Atsushi Imiya (1)
  • Kazuhiko Kawamoto (1)
  1. Department of Information and Image Sciences, Chiba University, Chiba, Japan
