Abstract
This chapter deals with the problem of penalized empirical risk minimization (ERM) over a convex set of linear functionals on the space of Hermitian matrices, with a convex loss and a nuclear norm penalty. Such penalization is often used in low rank matrix recovery when the target function can be well approximated by a linear functional generated by a Hermitian matrix of relatively small rank (compared with the size of the matrix). Our goal is to prove sharp low rank oracle inequalities in which the excess risk (approximation error) term appears with constant equal to 1 and the random error term has the correct dependence on the rank of the oracle.
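The kind of estimator studied here can be illustrated with a minimal sketch: nuclear norm penalized ERM with squared loss, solved by proximal gradient descent, where the proximal step soft-thresholds singular values. All specifics below (dimensions, step size, penalty level, the random trace measurements) are illustrative assumptions, not the chapter's setup or results.

```python
# Hedged sketch of nuclear norm penalized ERM with squared loss.
# We observe Y_i = <A_true, X_i> + noise, with <A, X> = trace(A X),
# and minimize (1/2m) sum_i (<A, X_i> - Y_i)^2 + lam * ||A||_nuclear
# by proximal gradient descent.  Every constant here is an assumption.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 200, 10, 2            # samples, matrix size, true rank (assumed)

# Low rank symmetric (real Hermitian) target matrix.
U = rng.normal(size=(n, r))
A_true = U @ U.T

# Random symmetric measurement matrices and noisy trace observations.
X = rng.normal(size=(m, n, n))
X = (X + X.transpose(0, 2, 1)) / 2
Y = np.einsum('ijk,jk->i', X, A_true) + 0.1 * rng.normal(size=m)

def prox_nuclear(A, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold singular values."""
    U_, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U_ @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

lam, step = 0.5, 0.1            # penalty level and step size (untuned assumptions)
A = np.zeros((n, n))
for _ in range(800):
    resid = np.einsum('ijk,jk->i', X, A) - Y      # <A, X_i> - Y_i
    grad = np.einsum('i,ijk->jk', resid, X) / m   # gradient of empirical risk
    A = prox_nuclear(A - step * grad, step * lam)

rel_err = np.linalg.norm(A - A_true) / np.linalg.norm(A_true)
print(f"relative Frobenius error: {rel_err:.3f}")
```

The soft-thresholding step is what drives the estimate toward low rank; the oracle inequalities in the chapter quantify how the resulting error depends on the rank of the best low rank approximation rather than on the full matrix dimension.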
Acknowledgements
This work was partially supported by NSF Grants DMS-1207808, DMS-0906880, and CCF-0808863.
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Koltchinskii, V. (2013). Sharp Oracle Inequalities in Low Rank Estimation. In: Schölkopf, B., Luo, Z., Vovk, V. (eds) Empirical Inference. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41136-6_19
Print ISBN: 978-3-642-41135-9
Online ISBN: 978-3-642-41136-6