The Robust Sparse PCA for Data Reconstructive via Weighted Elastic Net

  • Conference paper
  • First Online:
Communications, Signal Processing, and Systems

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 202)

Abstract

The ℓ2- or ℓ1-norm is widely used to measure the coding residual in principal component analysis (PCA), which implicitly assumes that the residual follows a Gaussian or Laplacian distribution. In practice, however, these norms may fail to describe the coding errors when outliers are present. To address this, this paper proposes a robust sparse PCA (RSPCA) approach that handles outliers by modeling the sparse coding as a sparsity-constrained weighted regression problem. Through a series of equivalent transformations, we show that RSPCA is equivalent to a weighted elastic net (WEN) problem, so the least angle regression elastic net (LARS-EN) method can be used to obtain the optimal solution. Simulation results illustrate the effectiveness of this approach.
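As a rough illustration of the reduction described in the abstract, the sketch below solves a weighted elastic net subproblem by absorbing per-sample weights into the data and handing the result to an off-the-shelf coordinate-descent elastic net solver (scikit-learn) instead of LARS-EN. The weighting rule, parameter values, and all names (e.g. weighted_elastic_net) are illustrative assumptions, not the paper's exact algorithm.

    # Minimal sketch, assuming a weighted elastic net of the form
    #   min_b ||W^(1/2)(x - D b)||_2^2 + lam1*||b||_1 + lam2*||b||_2^2,
    # solved here via scikit-learn rather than LARS-EN.
    import numpy as np
    from sklearn.linear_model import ElasticNet

    def weighted_elastic_net(D, x, w, lam1=0.01, lam2=0.01):
        """Absorb the weights into the data, which turns the weighted problem
        into a standard elastic net."""
        sw = np.sqrt(w)
        D_w = sw[:, None] * D          # rescale rows of the dictionary
        x_w = sw * x                   # rescale the observations
        n = D.shape[0]
        # Map (lam1, lam2) onto scikit-learn's (alpha, l1_ratio) parameterisation:
        # its objective is ||.||^2/(2n) + alpha*l1_ratio*||b||_1
        #                 + 0.5*alpha*(1 - l1_ratio)*||b||_2^2.
        alpha = (lam1 + 2.0 * lam2) / (2.0 * n)
        l1_ratio = lam1 / (lam1 + 2.0 * lam2)
        model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio,
                           fit_intercept=False, max_iter=5000)
        model.fit(D_w, x_w)
        return model.coef_

    # Toy usage: iteratively down-weight samples with large residuals
    # (a generic robustness heuristic, not the paper's specific weight function).
    rng = np.random.default_rng(0)
    D = rng.standard_normal((100, 20))
    b_true = np.zeros(20)
    b_true[:3] = [2.0, -1.5, 1.0]
    x = D @ b_true + 0.05 * rng.standard_normal(100)
    x[:5] += 10.0                      # inject a few gross outliers
    w = np.ones(100)
    for _ in range(3):
        b = weighted_elastic_net(D, x, w)
        r = x - D @ b
        scale = np.median(np.abs(r)) + 1e-12
        w = 1.0 / (1.0 + (r / scale) ** 2)   # Cauchy-type weights shrink outlier influence

The key point mirrored here is that, once the robust weights are fixed, the weighted problem is just an elastic net on rescaled data, so any elastic net solver (LARS-EN in the paper) yields the coding vector.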

Acknowledgements

This work was supported by “the Fundamental Research Funds for the Central Universities” under award number ZYGX2010J016.

Author information

Corresponding author

Correspondence to Wang Ling.

Copyright information

© 2012 Springer Science+Business Media New York

About this paper

Cite this paper

Ling, W., Yin, J. (2012). The Robust Sparse PCA for Data Reconstructive via Weighted Elastic Net. In: Liang, Q., et al. Communications, Signal Processing, and Systems. Lecture Notes in Electrical Engineering, vol 202. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-5803-6_23

  • DOI: https://doi.org/10.1007/978-1-4614-5803-6_23

  • Published:

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4614-5802-9

  • Online ISBN: 978-1-4614-5803-6

  • eBook Packages: Engineering, Engineering (R0)
