Algorithms for the Visualization of Large and Multivariate Data Sets

  • Friedhelm Schwenker
  • Hans A. Kestler
  • Günther Palm
Part of the Studies in Fuzziness and Soft Computing book series (STUDFUZZ, volume 78)

Abstract

In this chapter we discuss algorithms for the clustering and visualization of large and multivariate data sets. We describe an algorithm for exploratory data analysis that combines adaptive c-means clustering with multidimensional scaling (ACMDS). ACMDS is an algorithm for the online visualization of clustering processes and may be considered an alternative approach to Kohonen’s self-organizing feature map (SOM). Whereas the SOM is a heuristic neural network algorithm, ACMDS is derived from multivariate statistical algorithms. The implications of ACMDS are illustrated on five different data sets.
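As a rough illustration of the general idea, the following is a minimal sketch (not the authors’ ACMDS algorithm itself) that couples an online c-means update with an MDS-style two-dimensional embedding of the cluster centers, so that a low-dimensional picture of the centers can be refreshed while clustering runs. The function names, learning rates, and the raw-stress gradient used here are assumptions made for this example.

```python
# Illustrative sketch only: online c-means plus an MDS-style 2-D embedding
# of the cluster centers. Not the authors' exact ACMDS algorithm.
import numpy as np

def online_cmeans_step(x, centers, counts, eta=None):
    """Move the closest center towards sample x (adaptive c-means update)."""
    j = np.argmin(np.linalg.norm(centers - x, axis=1))  # winning center
    counts[j] += 1
    step = eta if eta is not None else 1.0 / counts[j]  # decaying learning rate
    centers[j] += step * (x - centers[j])
    return j

def mds_step(centers, embedding, lr=0.05):
    """One gradient step reducing the mismatch (raw stress) between pairwise
    distances of the centers and of their 2-D images."""
    c = centers.shape[0]
    for i in range(c):
        for j in range(c):
            if i == j:
                continue
            d_high = np.linalg.norm(centers[i] - centers[j])
            diff = embedding[i] - embedding[j]
            d_low = np.linalg.norm(diff) + 1e-12
            # gradient of (d_low - d_high)^2 with respect to embedding[i]
            embedding[i] -= lr * 2.0 * (d_low - d_high) * diff / d_low
    return embedding

# Toy usage: three Gaussian blobs in 10-D, visualized via 5 centers in 2-D.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(m, 0.3, size=(200, 10)) for m in (-2.0, 0.0, 2.0)])
rng.shuffle(data)
c = 5
centers = data[rng.choice(len(data), c, replace=False)].copy()
embedding = rng.normal(size=(c, 2))
counts = np.zeros(c)
for x in data:                      # single online pass over the data
    online_cmeans_step(x, centers, counts)
    mds_step(centers, embedding)    # refresh 2-D positions of the centers
print(np.round(embedding, 2))       # coordinates one could plot or animate
```

In this kind of scheme the embedding tracks the moving centers during clustering, which is what allows the clustering process itself to be watched in two dimensions rather than only its final result.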

Keywords

Feature Space, Cluster Center, Multidimensional Scaling, Independent Component Analysis, Representation Center

Copyright information

© Springer-Verlag Berlin Heidelberg 2002
