Supervised Manifold-Preserving Graph Reduction for Noisy Data Classification

  • Zhiqiang Xu
  • Li Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11061)

Abstract

Data reduction has become one of the essential techniques in current knowledge discovery scenarios, which are dominated by noisy data. The manifold-preserving graph reduction (MPGR) algorithm has been proposed for this setting; it has the advantages of eliminating the influence of outliers and noise while simultaneously accelerating the evaluation of predictors learned from manifolds. Based on MPGR, this paper utilizes label information to guide the construction of the graph and presents a supervised MPGR (SMPGR) method for classification tasks. In addition, we construct a similarity matrix using the kernel trick and develop a kernelized version of SMPGR. Empirical experiments on several datasets show the efficiency of the proposed algorithms.
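
The exact SMPGR construction and selection criterion are given in the full text; as a rough, non-authoritative illustration of the idea sketched in the abstract, the Python snippet below builds a label-guided, kernel-weighted neighbourhood graph and keeps the most strongly connected points. The RBF kernel, the rule of zeroing cross-label edges, and the degree-based greedy selection are assumptions made for this sketch, not the authors' formulation.

    import numpy as np

    def supervised_kernel_graph(X, y, gamma=1.0, k=5):
        """Build a k-nearest-neighbour graph whose edge weights combine an
        RBF kernel similarity (an assumption of this sketch) with label
        agreement between the two endpoints."""
        n = X.shape[0]
        sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-gamma * sq_dists)                # kernel similarity matrix
        W = np.zeros((n, n))
        for i in range(n):
            neighbours = np.argsort(-K[i])[1:k + 1]  # k nearest, excluding self
            for j in neighbours:
                # Label-guided weighting (illustrative): keep the kernel
                # weight for same-label pairs, drop cross-label edges.
                W[i, j] = W[j, i] = K[i, j] if y[i] == y[j] else 0.0
        return W

    def greedy_graph_reduction(W, m):
        """Keep the m vertices with the largest total edge weight, a
        simplified stand-in for MPGR's manifold-preserving selection."""
        connectivity = W.sum(axis=1)
        return np.argsort(-connectivity)[:m]         # indices of retained points

The indices returned by greedy_graph_reduction would form the reduced training set passed to a downstream classifier.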

Keywords

Data reduction · Kernel trick · Graph reduction · Manifold learning · Supervised learning

Acknowledgment

This work was supported in part by the National Natural Science Foundation of China under Grants No. 61373093, No. 61402310, No. 61672364 and No. 61672365, by the Soochow Scholar Project of Soochow University, and by the Six Talent Peak Project of Jiangsu Province of China.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. School of Computer Science and Technology and Joint International Research Laboratory of Machine Learning and Neuromorphic Computing, Soochow University, Suzhou, China
