
Using Feature Correlation Measurement to Improve the Kernel Minimum Squared Error Algorithm

  • Conference paper
  • First Online:
Pattern Recognition (CCPR 2016)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 662)

Abstract

The kernel minimum squared error (KMSE) algorithm is computationally inefficient when applied to large datasets. In this paper, we propose IKMSE, an algorithm that improves the computational efficiency of KMSE by using only a subset of the training set, the key nodes, since a suitable linear combination of the key nodes in the feature space can approximate the discriminant vector. Our algorithm has three steps. First, we measure the correlation between the column vectors of the kernel matrix, known as the feature correlation, using the cosine distance between them. Second, we determine the key nodes under the requirement that any two column vectors of the kernel matrix corresponding to key nodes should have a small cosine distance value. Third, we use the key nodes to construct the KMSE model and classify the test samples. There are usually far fewer key nodes than training samples, and this is the source of the efficiency of feature extraction in our method. Experimental results show that the improved method achieves low computational complexity as well as high classification accuracy.
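The three steps above can be sketched in code. Everything below is an illustrative assumption rather than a detail from the paper: the Gaussian (RBF) kernel, the greedy selection order, the similarity threshold of 0.95, and the toy two-blob dataset are all our own choices, and we read the selection criterion as keeping a sample only when its kernel column is not strongly correlated (in the cosine sense) with the columns of already-selected key nodes; the paper's exact rule may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy two-class data: two well-separated Gaussian blobs (illustrative only).
X = np.vstack([rng.normal(-1.0, 0.5, (40, 2)), rng.normal(1.0, 0.5, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)

def select_key_nodes(K, threshold=0.95):
    # Step 1: normalise columns so dot products are cosine similarities
    # (the "feature correlation" between kernel-matrix columns).
    cols = K / np.linalg.norm(K, axis=0, keepdims=True)
    # Step 2: greedily keep a sample as a key node only when its kernel
    # column is weakly correlated with every already-selected key column.
    # The greedy scan and the threshold are our assumptions.
    keys = [0]
    for j in range(1, K.shape[1]):
        if np.max(cols[:, keys].T @ cols[:, j]) < threshold:
            keys.append(j)
    return keys

keys = select_key_nodes(K)

# Step 3: restrict the discriminant vector to a linear combination of the
# key nodes' feature-space images and solve the MSE problem by ordinary
# least squares on the reduced kernel matrix.
Kr = K[:, keys]
alpha, *_ = np.linalg.lstsq(Kr, y, rcond=None)
pred = np.sign(Kr @ alpha)
acc = (pred == y).mean()
print(f"{len(keys)} key nodes out of {len(X)} samples, training accuracy {acc:.2f}")
```

Because the least-squares system involves only the reduced matrix `Kr`, its size scales with the number of key nodes rather than the full training-set size, which mirrors the efficiency argument in the abstract.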



Acknowledgements

This work was partly supported by the Natural Science Foundation of China (NSFC) under Grants Nos. 61472138, 61263032 and 61362031, by the Jiangxi Provincial Natural Science Foundation of China under Grant 20161BAB202066, and by the Science and Technology Foundation of the Jiangxi Transportation Department of China under Grant 2015D0066.

Author information

Corresponding author

Correspondence to Zizhu Fan.


Copyright information

© 2016 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Fan, Z., Li, Z. (2016). Using Feature Correlation Measurement to Improve the Kernel Minimum Squared Error Algorithm. In: Tan, T., Li, X., Chen, X., Zhou, J., Yang, J., Cheng, H. (eds) Pattern Recognition. CCPR 2016. Communications in Computer and Information Science, vol 662. Springer, Singapore. https://doi.org/10.1007/978-981-10-3002-4_46


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-3001-7

  • Online ISBN: 978-981-10-3002-4

