
110 nips-2010-Guaranteed Rank Minimization via Singular Value Projection


Source: pdf

Author: Prateek Jain, Raghu Meka, Inderjit S. Dhillon

Abstract: Minimizing the rank of a matrix subject to affine constraints is a fundamental problem with many important applications in machine learning and statistics. In this paper we propose a simple and fast algorithm, SVP (Singular Value Projection), for rank minimization under affine constraints (ARMP) and show that SVP recovers the minimum-rank solution for affine constraints that satisfy a restricted isometry property (RIP). Our method guarantees a geometric convergence rate even in the presence of noise and requires strictly weaker assumptions on the RIP constants than existing methods. We also introduce a Newton step into our SVP framework to speed up convergence, with substantial empirical gains. Next, we address a practically important application of ARMP: the problem of low-rank matrix completion, for which the defining affine constraints do not directly obey RIP, so the guarantees of SVP do not hold. However, we make partial progress towards a proof of exact recovery for our algorithm by showing a more restricted isometry property, and we observe empirically that our algorithm recovers low-rank incoherent matrices from an almost optimal number of uniformly sampled entries. We also demonstrate empirically that our algorithms outperform existing methods, such as those of [5, 18, 14], for ARMP and the matrix completion problem by an order of magnitude and are more robust to noise and sampling schemes. In particular, our results show that SVP-Newton is significantly more robust to noise and performs impressively on a more realistic power-law sampling scheme for the matrix completion problem.
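
The SVP iteration the abstract describes is projected gradient descent in which each iterate is projected onto the set of rank-k matrices by truncating its SVD. As a concrete illustration for the matrix-completion instance of ARMP, here is a minimal NumPy sketch under stated assumptions: the function name svp_complete, the fixed iteration budget, the default step size, and the stopping tolerance are our own choices rather than the authors'; the paper ties the step size to the sampling density and RIP constant, and a serious implementation would compute the truncated SVD with a sparse solver such as PROPACK [16] rather than a dense SVD.

```python
import numpy as np

def svp_complete(M_obs, mask, k, step=1.0, iters=200, tol=1e-6):
    """Illustrative SVP sketch for matrix completion (not the authors' code).

    Projected gradient descent on the observed entries, projecting each
    iterate onto the rank-k matrices by keeping its top-k singular triplets.
    M_obs holds the observed entries (zeros elsewhere); mask marks them.
    """
    X = np.zeros_like(M_obs, dtype=float)
    for _ in range(iters):
        # Gradient of (1/2) * ||P_Omega(X - M)||_F^2, where P_Omega
        # zeroes out the unobserved entries.
        grad = mask * (X - M_obs)
        # Gradient step, then projection onto rank-k matrices via
        # a truncated SVD (dense here; PROPACK/ARPACK in practice).
        U, s, Vt = np.linalg.svd(X - step * grad, full_matrices=False)
        X = (U[:, :k] * s[:k]) @ Vt[:k, :]
        # Stop once the residual on the observed entries is tiny.
        if np.linalg.norm(mask * (X - M_obs)) <= tol:
            break
    return X

# Toy usage: recover a random rank-2 matrix from ~50% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
mask = rng.random(M.shape) < 0.5
X_hat = svp_complete(mask * M, mask, k=2)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))  # relative error
```
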


reference text

[1] MovieLens dataset. Public dataset. URL http://www.grouplens.org/taxonomy/term/14.

[2] K. Arrow, L. Hurwicz, and H. Uzawa. Studies in Linear and Nonlinear Programming. Stanford University Press, Stanford, 1958.

[3] Robert Bell and Yehuda Koren. Scalable collaborative filtering with jointly derived neighborhood interpolation weights. In ICDM, pages 43–52, 2007. doi: 10.1109/ICDM.2007.90.

[4] Matthew Brand. Fast online SVD revisions for lightweight recommender systems. In SIAM International Conference on Data Mining, 2003.

[5] Jian-Feng Cai, Emmanuel J. Candès, and Zuowei Shen. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization, 20(4):1956–1982, 2010.

[6] Emmanuel J. Candès and Benjamin Recht. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 9(6):717–772, December 2009.

[7] Emmanuel J. Candès and Terence Tao. The power of convex relaxation: Near-optimal matrix completion. IEEE Trans. Inform. Theory, 56(5):2053–2080, 2010.

[8] M. Fazel, H. Hindi, and S. Boyd. A rank minimization heuristic with application to minimum order system approximation. In American Control Conference, Arlington, Virginia, 2001.

[9] M. Fazel, H. Hindi, and S. Boyd. Log-det heuristic for matrix rank minimization with applications to Hankel and Euclidean distance matrices. In American Control Conference, 2003.

[10] M. Fazel, E. Candès, B. Recht, and P. Parrilo. Compressed sensing and robust recovery of low rank matrices. In 42nd Asilomar Conference on Signals, Systems and Computers, pages 1043–1047, October 2008. doi: 10.1109/ACSSC.2008.5074571.

[11] Rahul Garg and Rohit Khandekar. Gradient descent with sparsification: an iterative algorithm for sparse recovery with restricted isometry property. In ICML, 2009.

[12] Donald Goldfarb and Shiqian Ma. Convergence of fixed point continuation algorithms for matrix rank minimization, 2009. Submitted.

[13] Shuiwang Ji and Jieping Ye. An accelerated gradient method for trace norm minimization. In ICML, 2009.

[14] Raghunandan H. Keshavan, Sewoong Oh, and Andrea Montanari. Matrix completion from a few entries. In ISIT, pages 324–328, 2009. IEEE Press.

[15] Yehuda Koren. Factorization meets the neighborhood: a multifaceted collaborative filtering model. In KDD, pages 426–434, 2008. doi: 10.1145/1401890.1401944.

[16] R.M. Larsen. PROPACK: software for large and sparse SVD calculations. Available online. URL http://sun.stanford.edu/rmunk/PROPACK/.

[17] Kiryung Lee and Yoram Bresler. ADMiRA: atomic decomposition for minimum rank approximation, 2009.

[18] Kiryung Lee and Yoram Bresler. Guaranteed minimum rank approximation from linear observations by nuclear norm minimization with an ellipsoidal constraint, 2009.

[19] Richard B. Lehoucq, Danny C. Sorensen, and Chao Yang. ARPACK Users' Guide: Solution of Large-Scale Eigenvalue Problems with Implicitly Restarted Arnoldi Methods. SIAM, Philadelphia, 1998.

[20] S. Ma, D. Goldfarb, and L. Chen. Fixed point and Bregman iterative methods for matrix rank minimization. To appear, Mathematical Programming Series A, 2010.

[21] Arian Maleki. Coherence analysis of iterative thresholding algorithms. CoRR, abs/0904.1193, 2009.

[22] Raghu Meka, Prateek Jain, Constantine Caramanis, and Inderjit S. Dhillon. Rank minimization via online learning. In ICML, pages 656–663, 2008. doi: 10.1145/1390156.1390239.

[23] Raghu Meka, Prateek Jain, and Inderjit S. Dhillon. Matrix completion from power-law distributed samples. In NIPS, 2009.

[24] Benjamin Recht, Maryam Fazel, and Pablo A. Parrilo. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization, 2007. To appear in SIAM Review.

[25] K.C. Toh and S. Yun. An accelerated proximal gradient algorithm for nuclear norm regularized least squares problems. Preprint, 2009. URL http://www.math.nus.edu.sg/~matys/apg.pdf.