
Fundamental conditions on the sampling pattern for union of low-rank subspaces retrieval

  • Morteza Ashraphijuo
  • Xiaodong Wang

Abstract

This paper investigates the fundamental conditions on the locations of the sampled entries, i.e., the sampling pattern, for finite completability of a matrix that represents the union of several subspaces with given ranks. In contrast with the existing analyses on the Grassmannian manifold for conventional matrix completion, we propose a geometric analysis of the manifold structure for the union of several subspaces that incorporates all of the given rank constraints simultaneously. To obtain deterministic conditions on the sampling pattern, we characterize the algebraic independence of a set of polynomials defined by the sampling pattern, which is closely related to finite completability. We also give a probabilistic condition in terms of the number of samples per column, i.e., the sampling probability, that leads to finite completability with high probability. Furthermore, using the proposed geometric analysis for finite completability, we characterize sufficient conditions on the sampling pattern that ensure there exists only one completion of the sampled data.
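The deterministic analysis rests on counting algebraically independent polynomials induced by the observed entries. As a rough illustration for the simpler single-subspace case (conventional rank-r matrix completion), the sketch below checks, at a generic point, how many of the observed-entry polynomials M[i, j] = (UV)[i, j] are independent by computing the numerical rank of their Jacobian. The function name `sampling_jacobian_rank` and the column-wise sampling scheme are illustrative assumptions, not constructions from the paper, and the sketch does not implement the paper's union-of-subspaces analysis.

```python
import numpy as np

# Sketch (assumption-laden): for a single rank-r factorization M = U V, each
# observed entry M[i, j] defines one polynomial in the unknown factors
# U (n x r) and V (r x m). Such polynomials are algebraically independent iff
# the Jacobian of the evaluation map has full row rank at a generic point,
# which we check numerically at a random point.

def sampling_jacobian_rank(omega, r, seed=0):
    """Numerical rank of the Jacobian of the observed-entry polynomials.

    omega : (n, m) boolean array, True where the entry is sampled.
    r     : assumed rank of the underlying matrix.
    """
    rng = np.random.default_rng(seed)
    n, m = omega.shape
    U = rng.standard_normal((n, r))   # generic point in factor space
    V = rng.standard_normal((r, m))

    rows = []
    for i, j in zip(*np.nonzero(omega)):
        # d/dU[i, :] of U[i, :] @ V[:, j] is V[:, j]; d/dV[:, j] is U[i, :].
        grad = np.zeros(n * r + r * m)
        grad[i * r:(i + 1) * r] = V[:, j]
        grad[n * r + j * r:n * r + (j + 1) * r] = U[i, :]
        rows.append(grad)
    return np.linalg.matrix_rank(np.array(rows))

# Example: each column of a 20 x 30 pattern gets r + 2 samples at random rows.
n, m, r = 20, 30, 2
omega = np.zeros((n, m), dtype=bool)
rng = np.random.default_rng(1)
for j in range(m):
    omega[rng.choice(n, size=r + 2, replace=False), j] = True

print(sampling_jacobian_rank(omega, r), "independent polynomials out of", omega.sum())
```

Roughly speaking, for this single-subspace setting the count can be at most r(n + m − r), the dimension of the variety of n × m rank-r matrices (the gap from nr + mr reflects the invariance U → UA, V → A⁻¹V); a sampling pattern is generically finitely completable when the independent polynomial count reaches that bound. The paper's contribution is the analogous analysis when the columns come from a union of several subspaces with different given ranks.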

Keywords

Low-rank data completion · Matrix completion · Manifold · Union of subspaces · Finite completability · Unique completability

Mathematics Subject Classification (2010)

68W01 



Acknowledgments

This work was supported in part by the U.S. National Science Foundation under Grant CCF-1814803 and in part by the U.S. Office of Naval Research under Grant N000141410667.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Columbia University, New York, USA
