Robust Sparse PCA via Weighted Elastic Net

  • Ling Wang
  • Hong Cheng
Part of the Communications in Computer and Information Science book series (CCIS, volume 321)

Abstract

In principal component analysis (PCA), the ℓ2- or ℓ1-norm is widely used to measure the coding residual, which implicitly assumes that the residual follows a Gaussian or Laplacian distribution. However, this assumption may fail to describe the coding errors in practice when outliers are present. To this end, this paper proposes a Robust Sparse PCA (RSPCA) approach that addresses the outlier problem by modeling sparse coding as a sparsity-constrained weighted regression problem. Through a series of equivalent transformations, we show that the proposed RSPCA is equivalent to a Weighted Elastic Net (WEN) problem, so the Least Angle Regression Elastic Net (LARS-EN) algorithm can be used to obtain the optimal solution. Simulation results illustrate the effectiveness of this approach.
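
The reduction described above, from a sparsity-constrained weighted regression to a Weighted Elastic Net solvable by a LARS-type algorithm, can be illustrated with a short sketch. The snippet below is not the authors' implementation: it assumes hypothetical per-observation weights w (e.g., produced by a robust reweighting scheme that down-weights outliers), folds them into the data, absorbs the ridge term via the standard elastic-net-to-lasso row augmentation, and then calls scikit-learn's LARS-based lasso solver as a stand-in for LARS-EN.

```python
# Minimal sketch: weighted elastic net reduced to a lasso solved by LARS.
# Assumption: weights w are given (small values down-weight suspected outliers).
import numpy as np
from sklearn.linear_model import LassoLars

def weighted_elastic_net(X, y, w, lam1, lam2):
    """Approximately solve
        min_b  sum_i w_i (y_i - x_i^T b)^2 + lam2 ||b||_2^2 + lam1 ||b||_1
    by reducing it to an ordinary lasso problem."""
    n, p = X.shape
    sw = np.sqrt(w)
    # Step 1: fold the observation weights into the data (weighted regression).
    Xw, yw = sw[:, None] * X, sw * y
    # Step 2: absorb the ridge penalty by augmenting with sqrt(lam2) * I and zeros,
    # which turns the (weighted) elastic net into a plain lasso problem.
    X_aug = np.vstack([Xw, np.sqrt(lam2) * np.eye(p)])
    y_aug = np.concatenate([yw, np.zeros(p)])
    # Step 3: solve the lasso with a LARS-type solver (stand-in for LARS-EN);
    # alpha rescales lam1 to scikit-learn's (1/(2N))-normalized objective.
    model = LassoLars(alpha=lam1 / (2 * len(y_aug)), fit_intercept=False)
    model.fit(X_aug, y_aug)
    return model.coef_

# Toy usage: a sparse signal with a few injected outliers that are down-weighted.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta = np.zeros(20); beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.standard_normal(100)
y[:5] += 10.0                       # outliers
w = np.ones(100); w[:5] = 0.01      # down-weight them
print(weighted_elastic_net(X, y, w, lam1=0.5, lam2=0.1))
```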

Keywords

Principal Component Analysis · Sparse Representation · Robust Statistics · Elastic Net

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Ling Wang (1)
  • Hong Cheng (1)
  1. University of Electronic Science and Technology of China, Chengdu, China
