
Feature Extraction and Classification System for Nonlinear and Online Data

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2004)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3056)

Abstract

A novel incremental feature extraction and classification system is proposed. Kernel PCA is a well-known nonlinear feature extraction method, but its computation becomes prohibitive when the data set is large, and updating the eigenvectors with new data requires recomputing the whole eigenspace. The proposed feature extraction method overcomes these problems by updating the eigenspace incrementally and by using the empirical kernel map as the kernel function. It requires less memory than Kernel PCA and can easily be improved by re-learning the data. For classification, the extracted features are used as input to a Least Squares SVM. Our experiments show that the proposed feature extraction method is comparable in performance to Kernel PCA, and that the proposed classification system achieves high classification accuracy on UCI benchmark data and the NIST handwritten digit data set.
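To make the feature extraction idea concrete, here is a minimal Python sketch of that stage only. It maps inputs through an RBF empirical kernel map over a fixed anchor subset, then updates the eigenspace of the mapped features in mini-batches, with scikit-learn's IncrementalPCA standing in for the paper's own incremental eigenspace update. The function names, the anchor-selection rule, and the kernel parameter gamma are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

def empirical_kernel_map(X, anchors, gamma=0.5):
    # phi(x) = [k(x, z_1), ..., k(x, z_m)] for a fixed anchor set {z_j},
    # with an RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X_stream = rng.normal(size=(500, 8))   # stand-in for online data
anchors = X_stream[:50]                # small reference subset

# Update the eigenspace of the mapped features batch by batch rather
# than recomputing it from scratch for every new chunk of data.
extractor = IncrementalPCA(n_components=10)
for batch in np.array_split(X_stream, 10):
    extractor.partial_fit(empirical_kernel_map(batch, anchors))

features = extractor.transform(empirical_kernel_map(X_stream, anchors))
print(features.shape)   # (500, 10)
```

Because the empirical kernel map has a fixed dimension m (the anchor count), memory stays bounded regardless of how many samples arrive, which is the efficiency argument the abstract makes against full Kernel PCA.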

This study was supported by a grant from the Korea Health 21 R&D Project, Ministry of Health & Welfare, Republic of Korea (02-PJ1-PG6-HI03-0004).
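For the classification stage, a Least Squares SVM replaces the usual SVM quadratic program with a single linear system. The sketch below implements the standard binary LS-SVM formulation (labels in {-1, +1}) with an RBF kernel; the regularization constant reg, the kernel width gamma, and all function names are assumptions for illustration, and in the paper's pipeline the classifier would consume the incrementally extracted features rather than raw inputs.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    # Pairwise RBF kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, reg=1.0, gamma=0.5):
    # Standard LS-SVM classifier: solve the linear system
    #   [ 0   y^T           ] [b    ]   [0]
    #   [ y   Omega + I/reg ] [alpha] = [1]
    # where Omega_ij = y_i * y_j * k(x_i, x_j).
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / reg
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, gamma=0.5):
    # Decision function: sign(sum_i alpha_i y_i k(x, x_i) + b).
    return np.sign(rbf(X_test, X_train, gamma) @ (alpha * y_train) + b)

# Tiny usage example on two Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 1, (40, 2)), rng.normal(2, 1, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])
alpha, b = lssvm_fit(X, y)
print((lssvm_predict(X, y, alpha, b, X) == y).mean())  # training accuracy
```

Solving one dense linear system costs O(n^3) but avoids iterative QP solvers, which is why LS-SVM pairs naturally with a feature extractor that keeps the working set small.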




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kim, B.J., Kim, I.K., Kim, K.B. (2004). Feature Extraction and Classification System for Nonlinear and Online Data. In: Dai, H., Srikant, R., Zhang, C. (eds.) Advances in Knowledge Discovery and Data Mining. PAKDD 2004. Lecture Notes in Computer Science (LNAI), vol. 3056. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24775-3_22

  • DOI: https://doi.org/10.1007/978-3-540-24775-3_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22064-0

  • Online ISBN: 978-3-540-24775-3
