Author: Kilian Q. Weinberger, Fei Sha, Qihui Zhu, Lawrence K. Saul
Abstract: In many areas of science and engineering, the problem arises of how to discover low dimensional representations of high dimensional data. Recently, a number of researchers have converged on common solutions to this problem using methods from convex optimization. In particular, many results have been obtained by constructing semidefinite programs (SDPs) with low rank solutions. While the rank of matrix variables in SDPs cannot be directly constrained, it has been observed that low rank solutions emerge naturally by computing high variance or maximal trace solutions that respect local distance constraints. In this paper, we show how to solve very large problems of this type by a matrix factorization that leads to much smaller SDPs than those previously studied. The matrix factorization is derived by expanding the solution of the original problem in terms of the bottom eigenvectors of a graph Laplacian. The smaller SDPs obtained from this matrix factorization yield very good approximations to solutions of the original problem. Moreover, these approximations can be further refined by conjugate gradient descent. We illustrate the approach on localization in large scale sensor networks, where optimizations involving tens of thousands of nodes can be solved in just a few minutes.
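To make the factorization concrete, here is a minimal sketch (not the authors' code) of the idea applied to maximum variance unfolding: the n-by-n Gram matrix K is expanded as K = Q Y Q^T, where the columns of Q are bottom eigenvectors of the graph Laplacian, so the SDP is solved over the much smaller m-by-m matrix Y. It assumes numpy, scipy, scikit-learn, and cvxpy; the function name factored_mvu, the neighborhood size k, the basis size m, and the penalty weight are all illustrative choices, and the soft quadratic penalty on local distances is a simplification of the paper's exact constraints.

```python
import numpy as np
import cvxpy as cp
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph

def factored_mvu(X, k=6, m=10, weight=1e3):
    """Sketch: maximum variance unfolding with the Gram matrix
    factored through bottom eigenvectors of the graph Laplacian."""
    # k-nearest-neighbor graph; its edge lengths supply the local
    # distance constraints of the original SDP
    D = kneighbors_graph(X, n_neighbors=k, mode='distance')
    D = D.maximum(D.T).tocsr()            # symmetrize
    A = (D > 0).astype(float)             # binary adjacency
    L = laplacian(A).toarray()            # unnormalized Laplacian

    # bottom m nonconstant eigenvectors of L; dropping the constant
    # eigenvector keeps the embedding centered at the origin
    _, vecs = np.linalg.eigh(L)
    Q = vecs[:, 1:m + 1]                  # n-by-m, orthonormal columns

    # small SDP over the m-by-m PSD matrix Y, with K = Q Y Q^T implied;
    # local distances are enforced softly through a quadratic penalty
    Y = cp.Variable((m, m), PSD=True)
    rows, cols = np.triu(D.toarray(), k=1).nonzero()
    penalty = sum(cp.square(cp.quad_form(Q[i] - Q[j], Y) - D[i, j] ** 2)
                  for i, j in zip(rows, cols))
    # trace(Q Y Q^T) = trace(Y) because Q has orthonormal columns
    cp.Problem(cp.Maximize(cp.trace(Y) - weight * penalty)).solve()

    # read out coordinates from the top eigenvectors of Y
    evals, evecs = np.linalg.eigh(Y.value)
    coords = Q @ evecs[:, ::-1] * np.sqrt(np.maximum(evals[::-1], 0))
    return coords
```

In this parametrization the expensive n-by-n matrix variable never appears; the coordinates returned here would then serve as the starting point for the conjugate gradient refinement mentioned in the abstract.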
[1] P. Biswas, T.-C. Liang, K.-C. Toh, T.-C. Wang, and Y. Ye. Semidefinite programming approaches for sensor network localization with noisy distance measurements. IEEE Transactions on Automation Science and Engineering, 3(4):360–371, 2006.
[2] B. Borchers. CSDP, a C library for semidefinite programming. Optimization Methods and Software, 11(1):613–623, 1999.
[3] M. Bowling, A. Ghodsi, and D. Wilkinson. Action respecting embedding. In Proceedings of the Twenty-Second International Conference on Machine Learning (ICML-05), pages 65–72, Bonn, Germany, 2005.
[4] O. Chapelle, B. Schölkopf, and A. Zien, editors. Semi-Supervised Learning. MIT Press, Cambridge, MA, 2006.
[5] F. R. K. Chung. Spectral Graph Theory. American Mathematical Society, 1997.
[6] F. Lu, S. Keles, S. Wright, and G. Wahba. Framework for kernel regularization with application to protein clustering. Proceedings of the National Academy of Sciences, 102:12332–12337, 2005.
[7] F. Lu, Y. Lin, and G. Wahba. Robust manifold unfolding with kernel regularization. Technical Report 1108, Department of Statistics, University of Wisconsin-Madison, 2005.
[8] F. Sha and L. K. Saul. Analysis and extension of spectral methods for nonlinear dimensionality reduction. In Proceedings of the Twenty-Second International Conference on Machine Learning (ICML-05), pages 785–792, Bonn, Germany, 2005.
[9] J. Sun, S. Boyd, L. Xiao, and P. Diaconis. The fastest mixing Markov process on a graph and a connection to a maximum variance unfolding problem. SIAM Review, 48(4):681–699, 2006.
[10] L. Vandenberghe and S. P. Boyd. Semidefinite programming. SIAM Review, 38(1):49–95, March 1996.
[11] K. Q. Weinberger, B. D. Packer, and L. K. Saul. Nonlinear dimensionality reduction by semidefinite programming and kernel matrix factorization. In Z. Ghahramani and R. Cowell, editors, Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics (AISTATS-05), pages 381–388, Barbados, West Indies, 2005.
[12] K. Q. Weinberger and L. K. Saul. Unsupervised learning of image manifolds by semidefinite programming. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR-04), volume 2, pages 988–995, Washington, D.C., 2004. Extended version in International Journal of Computer Vision, 70(1):77–90, 2006.
[13] K. Q. Weinberger and L. K. Saul. An introduction to nonlinear dimensionality reduction by maximum variance unfolding. In Proceedings of the Twenty-First National Conference on Artificial Intelligence (AAAI-06), Cambridge, MA, 2006.
[14] K. Q. Weinberger, F. Sha, and L. K. Saul. Learning a kernel matrix for nonlinear dimensionality reduction. In Proceedings of the Twenty-First International Conference on Machine Learning (ICML-04), pages 839–846, Banff, Canada, 2004.
[15] L. Xiao, J. Sun, and S. Boyd. A duality view of spectral methods for dimensionality reduction. In Proceedings of the Twenty-Third International Conference on Machine Learning (ICML-06), pages 1041–1048, Pittsburgh, PA, 2006.