Low-rank tensor completion based on non-convex logDet function and Tucker decomposition

Abstract

Rank estimation plays an extremely important role in low-rank tensor completion. In much of the existing work, the nuclear norm is used as a surrogate for the rank in the optimization because of its convexity. However, recent advances show that certain non-convex functions approximate the rank more closely, which can significantly improve the accuracy of the recovery. At the same time, the complexity of non-convex functions leads to a much higher computational cost, especially for large-scale data. This paper proposes a mixture model for tensor completion that combines the logDet function with the Tucker decomposition: the logDet function serves as a much tighter rank approximation than the nuclear norm, while the Tucker decomposition significantly reduces the size of the tensor that needs to be evaluated. In the implementation of the method, the alternating direction method of multipliers is employed to obtain the optimal tensor completion. Several experiments are carried out to validate the effectiveness and efficiency of the method.
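For intuition, the following is a minimal numerical sketch of the logDet rank surrogate, not the authors' implementation: it assumes the common form sum_i log(sigma_i^2 + eps) applied to each mode-k unfolding of the tensor, with the weights, eps, and function names chosen purely for illustration.

    import numpy as np

    def unfold(tensor, mode):
        """Mode-k unfolding: move axis `mode` to the front and flatten the rest."""
        return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

    def logdet_surrogate(matrix, eps=1e-3):
        """Non-convex logDet rank surrogate: sum_i log(sigma_i^2 + eps).
        Illustrative sketch only; not the formulation used in the paper."""
        sigma = np.linalg.svd(matrix, compute_uv=False)
        return float(np.sum(np.log(sigma ** 2 + eps)))

    def tensor_logdet_rank(tensor, weights=None, eps=1e-3):
        """Weighted sum of the logDet surrogate over all mode-k unfoldings."""
        if weights is None:
            weights = [1.0 / tensor.ndim] * tensor.ndim
        return sum(w * logdet_surrogate(unfold(tensor, k), eps)
                   for k, w in enumerate(weights))

    # Toy comparison: a tensor with small Tucker rank yields a much smaller
    # surrogate value than a dense full-rank tensor of the same size, which
    # is the property a completion objective of this kind exploits.
    rng = np.random.default_rng(0)
    core = rng.standard_normal((3, 3, 3))
    factors = [rng.standard_normal((20, 3)) for _ in range(3)]
    low_rank = np.einsum('abc,ia,jb,kc->ijk', core, *factors)
    full_rank = rng.standard_normal((20, 20, 20))
    print(tensor_logdet_rank(low_rank), tensor_logdet_rank(full_rank))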



Acknowledgements

This research was supported by the National Key Research and Development Program of China (Project No. 2017YFD0700103) and the National Natural Science Foundation of China (Grant Nos. 51475186 and 51775202).

Author information

Corresponding author

Correspondence to Chengfei Shi.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Shi, C., Huang, Z., Wan, L. et al. Low-rank tensor completion based on non-convex logDet function and Tucker decomposition. SIViP (2021). https://doi.org/10.1007/s11760-020-01845-7

Keywords

  • Low-rank tensor completion
  • LogDet function
  • Tucker decomposition
  • Image recovery