Low-Rank and Sparse Matrix Completion for Recommendation

  • Zhi-Lin Zhao
  • Ling Huang
  • Chang-Dong Wang (Email author)
  • Jian-Huang Lai
  • Philip S. Yu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10638)

Abstract

Recently, recommendation algorithms have been widely used on online platforms to increase business revenue and user satisfaction. However, most existing algorithms generate intermediate output when predicting ratings, and errors in that intermediate output propagate to the final results. Moreover, because most algorithms predict ratings for all unrated items, some of the predicted ratings are unreliable and useless, which lowers both the efficiency and the effectiveness of recommendation. To this end, we propose a Low-rank and Sparse Matrix Completion (LSMC) method that recovers the rating matrix directly, improving the quality of rating prediction. Following the common methodology, we assume the predicted rating matrix is low-rank, since ratings depend on only a few latent factors of users and items. Unlike existing methods, however, we also assume the matrix is sparse, so that unreliable predictions are removed while important results are retained. In addition, a slack variable is used to prevent overfitting and to weaken the influence of noisy data. Extensive experiments on four real-world datasets verify that the proposed method outperforms state-of-the-art recommendation algorithms.
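The paper itself specifies the LSMC optimization; as a rough illustration of the two assumptions stated above, the sketch below combines singular-value thresholding (to encourage low rank) with entrywise hard-thresholding (to zero out small, presumably unreliable predictions). The function name, parameters, and thresholds are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def svt_complete(R, mask, tau=1.0, sparse_thresh=0.05, n_iters=500, step=1.2):
    """Recover a rating matrix by singular-value thresholding (low-rank
    structure) combined with entrywise hard-thresholding (sparsity).

    R    : observed ratings, with zeros at unobserved entries
    mask : boolean array, True where a rating is observed
    """
    Y = np.zeros_like(R, dtype=float)  # dual variable
    X = np.zeros_like(R, dtype=float)  # current estimate
    for _ in range(n_iters):
        # low-rank step: shrink all singular values by tau
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
        # sparsity step: drop predictions too small to be reliable
        X[np.abs(X) < sparse_thresh] = 0.0
        # dual ascent on the observed entries only
        Y += step * mask * (R - X)
    return X
```

On a synthetic low-rank matrix with a subset of entries observed, the recovered matrix closely matches the observations while the singular-value shrinkage keeps the estimate low-rank.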

Keywords

Recommendation algorithms · Low-rank · Sparse

Notes

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (16lgzd15) and Tip-top Scientific and Technical Innovative Youth Talents of Guangdong special support program (No. 2016TQ03X542).


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Zhi-Lin Zhao¹
  • Ling Huang¹
  • Chang-Dong Wang¹ (Email author)
  • Jian-Huang Lai¹
  • Philip S. Yu²˒³

  1. School of Data and Computer Science, Sun Yat-sen University, Guangzhou, China
  2. Department of Computer Science, University of Illinois at Chicago, Chicago, USA
  3. Institute for Data Science, Tsinghua University, Beijing, China
