Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations

Abstract

In this chapter, we present the new CoCoSSC algorithm and its main results. The content is organized as follows: Sect. 9.1 states the main results concerning the CoCoSSC algorithm, Sect. 9.2 gives the full proofs, and Sect. 9.3 numerically evaluates the performance of CoCoSSC and several related algorithms. Finally, we conclude the work with some directions for future research.

Part of this chapter appears in the paper titled “Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations” by Yining Wang, Bin Shi, et al. (2018), presently under review for publication at AISTATS.
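The abstract above only names the CoCoSSC algorithm without describing it. As a rough illustration of the family of methods it belongs to, the following is a minimal sketch of a generic sparse subspace clustering pipeline (Lasso self-expression followed by spectral clustering on the resulting affinity matrix). This is not the chapter's CoCoSSC algorithm: the corrections for noisy and missing observations that distinguish CoCoSSC are omitted, and the regularization parameter, scikit-learn solvers, and toy data below are illustrative assumptions.

```python
# Minimal sketch of a generic sparse subspace clustering (SSC) pipeline.
# NOT the chapter's CoCoSSC algorithm: the corrections for noisy and
# missing observations are omitted, and all parameter choices here are
# illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering


def self_expression_affinity(X, lam=0.05):
    """Represent each column of X as a sparse combination of the other columns."""
    _, n = X.shape
    C = np.zeros((n, n))
    for j in range(n):
        mask = np.arange(n) != j
        lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
        lasso.fit(X[:, mask], X[:, j])
        C[mask, j] = lasso.coef_
    # Symmetrize the coefficient matrix into an affinity matrix.
    return np.abs(C) + np.abs(C).T


def ssc_cluster(X, n_clusters, lam=0.05):
    W = self_expression_affinity(X, lam)
    model = SpectralClustering(n_clusters=n_clusters, affinity="precomputed")
    return model.fit_predict(W)


# Toy usage: points drawn from two 2-dimensional subspaces of R^20.
rng = np.random.default_rng(0)
U1, U2 = rng.standard_normal((20, 2)), rng.standard_normal((20, 2))
X = np.hstack([U1 @ rng.standard_normal((2, 30)),
               U2 @ rng.standard_normal((2, 30))])
labels = ssc_cluster(X, n_clusters=2)
```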

Copyright information

© 2020 Springer Nature Switzerland AG

Cite this chapter

Shi, B., Iyengar, S.S. (2020). Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations. In: Mathematical Theories of Machine Learning - Theory and Applications. Springer, Cham. https://doi.org/10.1007/978-3-030-17076-9_9

  • DOI: https://doi.org/10.1007/978-3-030-17076-9_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-17075-2

  • Online ISBN: 978-3-030-17076-9
