Rank minimization on tensor ring: an efficient approach for tensor decomposition and completion

  • Longhao Yuan
  • Chao Li
  • Jianting Cao
  • Qibin Zhao
Part of the following topical collections:
  1. Special Issue of the ACML 2019 Journal Track

Abstract

In recent studies, tensor ring decomposition (TRD) has become a promising model for tensor completion. However, TRD suffers from the rank selection problem, because its multilinear rank (the TR-rank) cannot be determined in advance. For tensor decomposition with missing entries, the sub-optimal rank selection of traditional methods leads to overfitting or underfitting. In this paper, we first explore the latent space of TRD and theoretically establish the relationship between the TR-rank and the ranks of the tensor unfoldings. We then propose two tensor completion models that impose different low-rank regularizations on the TR-factors, by which the TR-rank of the underlying tensor is minimized and its low-rank structures are exploited. By employing the alternating direction method of multipliers (ADMM) scheme, our algorithms obtain the TR-factors and the underlying tensor simultaneously. In tensor completion experiments, our algorithms show robustness to rank selection and high computational efficiency compared with traditional low-rank approximation algorithms.
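
To make the two building blocks mentioned above concrete, the sketch below is a minimal NumPy illustration, not the authors' implementation; the core shapes, the function names tr_to_tensor and svt, and the threshold tau are illustrative assumptions. It shows (i) how a tensor is reconstructed from TR-factors G_k of size R_k x I_k x R_{k+1}, and (ii) the singular value thresholding step, i.e. the proximal operator of the nuclear norm that an ADMM-based scheme would typically apply to factor unfoldings to encourage low rank.

    import numpy as np

    def tr_to_tensor(cores):
        """Contract TR cores G_k of shape (R_k, I_k, R_{k+1}), with R_{d+1} = R_1,
        into the full tensor X(i_1, ..., i_d) = Trace(G_1(i_1) ... G_d(i_d))."""
        full = cores[0]                                   # shape (R_1, I_1, R_2)
        for core in cores[1:]:
            # contract the trailing rank index with the next core's leading rank index
            full = np.tensordot(full, core, axes=([-1], [0]))
        # full now has shape (R_1, I_1, ..., I_d, R_1); close the ring by tracing
        return np.trace(full, axis1=0, axis2=-1)

    def svt(mat, tau):
        """Singular value thresholding: the proximal operator of tau * (nuclear norm),
        used to impose a low-rank structure on a matrix (e.g., a factor unfolding)."""
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        return (u * np.maximum(s - tau, 0.0)) @ vt

For example, three random cores of shapes (2, 4, 3), (3, 5, 4) and (4, 6, 2) contract to a 4 x 5 x 6 tensor; when the TR-ranks are small, the unfoldings of the reconstructed tensor are low-rank, which is the structure the proposed regularizations exploit. Inside an ADMM iteration, a step of the svt type would update an auxiliary low-rank variable for each factor, while separate updates enforce consistency with the observed entries.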

Keywords

Tensor ring decomposition · Tensor completion · Structured nuclear norm · ADMM scheme

Notes

Acknowledgements

This work was supported by JSPS KAKENHI (Grant Nos. 17K00326, 18K04178), JST CREST (Grant No. JPMJCR1784).

Copyright information

© The Author(s), under exclusive licence to Springer Science+Business Media LLC, part of Springer Nature 2019

Authors and Affiliations

  1. School of Automation, Guangdong University of Technology, Guangzhou, China
  2. Graduate School of Engineering, Saitama Institute of Technology, Fukaya, Japan
  3. Tensor Learning Unit, RIKEN Center for Advanced Intelligence Project (AIP), Tokyo, Japan
  4. School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou, China
