
Sharp Oracle Inequalities in Low Rank Estimation

Chapter in: Empirical Inference

Abstract

This chapter deals with the problem of penalized empirical risk minimization (ERM) over a convex set of linear functionals on the space of Hermitian matrices, with convex loss and nuclear norm penalty. Such penalization is often used in low rank matrix recovery when the target function can be well approximated by a linear functional generated by a Hermitian matrix of relatively small rank (compared with the size of the matrix). Our goal is to prove sharp low rank oracle inequalities that involve the excess risk (the approximation error) with constant equal to 1 and a random error term with the correct dependence on the rank of the oracle.
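For a concrete picture of the kind of estimator studied in this setting, the sketch below illustrates nuclear norm penalized empirical risk minimization in its simplest form: real matrices, squared-error loss, and a plain proximal-gradient solver with singular value soft-thresholding. This is only an illustrative toy under those assumptions, not the chapter's estimator or analysis (the chapter works with Hermitian matrices, general convex losses, and oracle inequalities rather than algorithms); the function names svt and nuclear_norm_pgd and the parameter choices are hypothetical.

```python
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: proximal operator of tau * ||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def nuclear_norm_pgd(X, y, shape, eps, n_iter=500):
    """Proximal gradient descent for
        min_S (1/n) * ||y - X vec(S)||^2 + eps * ||S||_*
    X     : (n, p*q) design matrix; each row is a vectorized measurement matrix
    y     : (n,) observed responses
    shape : (p, q) shape of the unknown matrix S
    eps   : nuclear-norm regularization parameter
    """
    n = X.shape[0]
    p, q = shape
    S = np.zeros((p, q))
    # Step size 1/L, where L = (2/n) * ||X||_2^2 is the Lipschitz constant of the gradient.
    step = n / (2.0 * np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = (2.0 / n) * X.T @ (X @ S.ravel() - y)   # gradient of the smooth quadratic part
        S = svt(S - step * grad.reshape(p, q), step * eps)
    return S

# Toy usage: recover a rank-2 matrix of size 20 x 20 from 300 noisy random measurements.
rng = np.random.default_rng(0)
p = q = 20
S_true = rng.standard_normal((p, 2)) @ rng.standard_normal((2, q))
X = rng.standard_normal((300, p * q))
y = X @ S_true.ravel() + 0.1 * rng.standard_normal(300)
S_hat = nuclear_norm_pgd(X, y, (p, q), eps=1.0)   # eps chosen ad hoc for the demo
print(np.linalg.norm(S_hat - S_true) / np.linalg.norm(S_true))
```

The nuclear norm plays the role of a convex surrogate for the rank, exactly as the penalty described in the abstract; the oracle inequalities of the chapter quantify how the resulting error depends on the rank of the best low rank approximation (the oracle).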



Acknowledgements

This work was partially supported by NSF Grants DMS-1207808, DMS-0906880, and CCF-0808863.

Author information

Correspondence to Vladimir Koltchinskii.


Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Koltchinskii, V. (2013). Sharp Oracle Inequalities in Low Rank Estimation. In: Schölkopf, B., Luo, Z., Vovk, V. (eds) Empirical Inference. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41136-6_19

  • DOI: https://doi.org/10.1007/978-3-642-41136-6_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-41135-9

  • Online ISBN: 978-3-642-41136-6

  • eBook Packages: Computer Science (R0)
