Paper: nips2012-301 (NIPS 2012), reference list
Source: pdf
Author: Thanh Ngo, Yousef Saad
Abstract: This paper describes gradient methods based on a scaled metric on the Grassmann manifold for low-rank matrix completion. The proposed methods significantly improve on canonical gradient methods, especially on ill-conditioned matrices, while maintaining the established global convergence and exact recovery guarantees. A connection between a form of subspace iteration for matrix completion and the scaled gradient descent procedure is also established. The proposed conjugate gradient method based on the scaled gradient outperforms several existing algorithms for matrix completion and is competitive with recently proposed methods.
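The abstract summarizes the approach at a high level. As a rough illustration of the general idea only (not the paper's actual algorithm), the sketch below runs gradient descent on an orthonormal factor U of a factorization model U W for a partially observed matrix, right-multiplying the Euclidean gradient by (W W^T + eps I)^{-1} as a crude stand-in for a scaled metric, and retracting with a QR factorization. All variable names, the step size, the sampling rate, and the stopping rule are illustrative assumptions.

# Minimal sketch of a scaled gradient step for matrix completion (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 200, 150, 5

# Ground-truth low-rank matrix and a random observation mask Omega (~30% observed).
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
Omega = rng.random((m, n)) < 0.3
M_obs = np.where(Omega, M, 0.0)

# U spans the current column-space estimate (orthonormal, m x r).
U, _ = np.linalg.qr(rng.standard_normal((m, r)))

step, eps = 0.5, 1e-8
for it in range(200):
    # Given U, fit W column-by-column by least squares on the observed entries.
    W = np.zeros((r, n))
    for j in range(n):
        rows = Omega[:, j]
        W[:, j], *_ = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)

    R = Omega * (U @ W - M_obs)                    # residual on observed entries
    G = R @ W.T                                    # Euclidean gradient w.r.t. U
    S = np.linalg.inv(W @ W.T + eps * np.eye(r))   # crude scaling matrix
    U = U - step * G @ S                           # scaled gradient step
    U, _ = np.linalg.qr(U)                         # retract to orthonormal columns

# Rough check of the fit (W is one step stale relative to the final U).
err = np.linalg.norm(U @ W - M, 'fro') / np.linalg.norm(M, 'fro')
print(f"relative reconstruction error: {err:.3e}")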
[1] P.-A. Absil, R. Mahony, and R. Sepulchre. Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton, NJ, 2008.
[2] Y. Amit, M. Fink, N. Srebro, and S. Ullman. Uncovering shared structures in multiclass classification. In Proceedings of the 24th International Conference on Machine Learning (ICML), pages 17–24, 2007.
[3] L. Armijo. Minimization of functions having Lipschitz continuous first partial derivatives. Pacific Journal of Mathematics, 16(1):1–3, 1966.
[4] J. Baglama, D. Calvetti, G. H. Golub, and L. Reichel. Adaptively preconditioned GMRES algorithms. SIAM J. Sci. Comput., 20(1):243–269, December 1998.
[5] L. Balzano, R. Nowak, and B. Recht. Online identification and tracking of subspaces from highly incomplete information. In Proceedings of the Allerton Conference on Communication, Control, and Computing, September 2010.
[6] N. Boumal and P.-A. Absil. RTRMC: A Riemannian trust-region method for low-rank matrix completion. In NIPS, 2011.
[7] J.-F. Cai, E. J. Candès, and Z. Shen. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization, 20(4):1956–1982, 2010.
[8] E. J. Candès and T. Tao. The power of convex relaxation: Near-optimal matrix completion, 2009.
[9] P. Chen and D. Suter. Recovering the Missing Components in a Large Noisy Low-Rank Matrix: Application to SFM. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(8):1051–1063, 2004.
[10] W. Dai, E. Kerman, and O. Milenkovic. A geometric approach to low-rank matrix completion. IEEE Transactions on Information Theory, 58(1):237–247, 2012.
[11] A. Edelman, T. Arias, and S. T. Smith. The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl., 20:303–353, 1998.
[12] P. Jain, R. Meka, and I. S. Dhillon. Guaranteed rank minimization via singular value projection. In NIPS, pages 937–945, 2010.
[13] R. Keshavan, A. Montanari, and S. Oh. Matrix completion from noisy entries. In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, and A. Culotta, editors, Advances in Neural Information Processing Systems 22, pages 952–960. 2009.
[14] R. H. Keshavan, S. Oh, and A. Montanari. Matrix completion from a few entries. CoRR, abs/0901.3150, 2009.
[15] S. Ma, D. Goldfarb, and L. Chen. Fixed point and Bregman iterative methods for matrix rank minimization. Math. Program., 128(1-2):321–353, 2011.
[16] B. Marlin. Collaborative filtering: A machine learning perspective, 2004.
[17] R. Mazumder, T. Hastie, and R. Tibshirani. Spectral regularization algorithms for learning large incomplete matrices. J. Mach. Learn. Res., 11:2287–2322, August 2010.
[18] B. Recht. A simpler approach to matrix completion. CoRR, abs/0910.0651, 2009.
[19] J. D. M. Rennie and N. Srebro. Fast maximum margin matrix factorization for collaborative prediction. In Proceedings of the 22nd International Conference on Machine Learning (ICML), pages 713–719. ACM, 2005.
[20] Y. Saad. Numerical Methods for Large Eigenvalue Problems, Classics edition. SIAM, Philadelphia, PA, 2011.
[21] N. Srebro and T. Jaakkola. Weighted low-rank approximations. In Proceedings of the 20th International Conference on Machine Learning (ICML), pages 720–727. AAAI Press, 2003.
[22] B. Vandereycken. Low-rank matrix completion by Riemannian optimization. Technical report, Mathematics Section, École Polytechnique Fédérale de Lausanne, 2011.
[23] Z. Wen, W. Yin, and Y. Zhang. Solving a low-rank factorization model for matrix completion using a non-linear successive over-relaxation algorithm. CAAM Technical Report, Rice University, 2010.