Neighborhood Structure Preserving Ridge Regression for Dimensionality Reduction

  • Xin Shu
  • Hongtao Lu
Part of the Communications in Computer and Information Science book series (CCIS, volume 321)

Abstract

Recent research shows that linear regression bears strong connections to many subspace learning methods, such as linear discriminant analysis and locality preserving projection. When linear regression methods are applied to dimensionality reduction, a major disadvantage is that they fail to consider the geometric structure of the data. In this paper, we propose a graph-regularized ridge regression for dimensionality reduction. We develop a new algorithm for affinity graph construction based on nonnegative least squares and use the affinity graph to capture neighborhood geometric structure. The global and neighborhood structure information is jointly modeled as a graph-regularized least squares problem. We design an efficient model selection scheme for optimal parameter estimation, which balances the tradeoff between the global and neighborhood structures. Extensive experimental studies on benchmark data sets show the effectiveness of our approach.
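
The full text is behind the preview wall, but the pipeline the abstract describes (an affinity graph built by nonnegative least squares, followed by a graph-regularized least squares problem) can be sketched. The Python below is a minimal illustration under stated assumptions: each sample's neighbors are found by k-nearest-neighbor search and its reconstruction weights solved by NNLS, the regularizer uses an unnormalized graph Laplacian, and the objective is taken to be ||XB - Y||_F^2 + alpha ||B||_F^2 + beta tr(B' X' L X B). The function names and parameters (k, alpha, beta) are illustrative choices, not the authors' exact formulation.

    import numpy as np
    from scipy.optimize import nnls

    def nnls_affinity_graph(X, k=5):
        # Affinity graph via nonnegative least squares: each sample is
        # reconstructed from its k nearest neighbors with nonnegative
        # coefficients, and the coefficients serve as edge weights.
        n = X.shape[0]
        W = np.zeros((n, n))
        dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        for i in range(n):
            idx = np.argsort(dist[i])[1:k + 1]   # nearest neighbors, excluding self
            w, _ = nnls(X[idx].T, X[i])          # min_w ||x_i - X_nb' w||, w >= 0
            W[i, idx] = w
        return 0.5 * (W + W.T)                   # symmetrize the graph

    def graph_ridge_regression(X, Y, W, alpha=1.0, beta=0.1):
        # Closed-form minimizer of
        #   ||X B - Y||_F^2 + alpha ||B||_F^2 + beta tr(B' X' L X B),
        # where L is the unnormalized graph Laplacian of W.
        L = np.diag(W.sum(axis=1)) - W
        d = X.shape[1]
        A = X.T @ X + alpha * np.eye(d) + beta * (X.T @ L @ X)
        return np.linalg.solve(A, X.T @ Y)       # B: (features x targets) projection

In this reading, Y would typically be a one-hot class indicator matrix, in line with the least squares view of discriminant analysis the abstract alludes to, and the reduced representation is X @ B. The parameter beta plays the role of the tradeoff tuned by the abstract's model selection scheme, balancing the global regression fit against the neighborhood structure term.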

Keywords

linear regression, dimensionality reduction, geometric structure, model selection

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Xin Shu (1)
  • Hongtao Lu (1)
  1. Department of Computer Science and Engineering, Shanghai Jiao Tong University, China
