Authors: Amir Globerson, Gal Chechik, Fernando Pereira, Naftali Tishby
Abstract: Embedding algorithms search for low-dimensional structure in complex data, but most algorithms only handle objects of a single type for which pairwise distances are specified. This paper describes a method for embedding objects of different types, such as images and text, into a single common Euclidean space based on their co-occurrence statistics. The joint distributions are modeled as exponentials of Euclidean distances in the low-dimensional embedding space, which links the problem to convex optimization over positive semidefinite matrices. The local structure of our embedding corresponds to the statistical correlations via random walks in the Euclidean space. We quantify the performance of our method on two text datasets, and show that it consistently and significantly outperforms standard methods of statistical correspondence modeling, such as multidimensional scaling and correspondence analysis.
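The model described in the abstract can be illustrated with a small sketch: the joint distribution over object pairs is taken proportional to the exponential of the negative squared Euclidean distance between their embeddings, and the embeddings are fit by minimizing the KL divergence to the empirical co-occurrence distribution. This is a minimal gradient-descent illustration, not the paper's semidefinite formulation; the toy counts, dimensions, and learning rate are all made up for the example.

```python
import numpy as np

# Toy co-occurrence counts between 4 row objects (e.g. words)
# and 3 column objects (e.g. documents). Rows 0-1 pair with
# column 0; rows 2-3 pair with column 2.
counts = np.array([[8., 1., 1.],
                   [7., 2., 1.],
                   [1., 1., 8.],
                   [1., 2., 7.]])
p_emp = counts / counts.sum()          # empirical joint p(x, y)

rng = np.random.default_rng(0)
dim = 2                                # embedding dimension
phi = 0.1 * rng.standard_normal((4, dim))   # row embeddings
psi = 0.1 * rng.standard_normal((3, dim))   # column embeddings

def model_joint(phi, psi):
    """Model joint: p(x, y) proportional to exp(-||phi_x - psi_y||^2)."""
    d2 = ((phi[:, None, :] - psi[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2)
    return w / w.sum(), d2

# Minimize KL(p_emp || p_model) by gradient descent.
# d KL / d phi_x = sum_y 2 (p_emp - p_model)(x, y) (phi_x - psi_y).
lr = 0.5
for _ in range(1000):
    p_mod, d2 = model_joint(phi, psi)
    g = (p_emp - p_mod)                     # shape (4, 3)
    diff = phi[:, None, :] - psi[None, :, :]
    phi -= lr * 2 * (g[:, :, None] * diff).sum(1)
    psi -= lr * -2 * (g[:, :, None] * diff).sum(0)
```

After fitting, objects that co-occur frequently end up close in the shared space, so distances in the embedding track the statistical correlations in the counts.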
[1] E.D. Bolker and B. Roth. When is a bipartite graph a rigid framework? Pacific J. Math., 90:27–44, 1980.
[2] S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge Univ. Press, 2004.
[3] G. Chechik and N. Tishby. Extracting relevant structures with side information. In S. Becker, S. Thrun, and K. Obermayer, editors, NIPS 15, 2002.
[4] T. Cox and M. Cox. Multidimensional Scaling. Chapman and Hall, London, 1984.
[5] M. Fazel, H. Hindi, and S. P. Boyd. A rank minimization heuristic with application to minimum order system approximation. In Proc. of the American Control Conference, 2001.
[6] R.A. Fisher. The precision of discriminant functions. Ann. Eugen. Lond., 10:422–429, 1940.
[7] A. Globerson and N. Tishby. Sufficient dimensionality reduction. Journal of Machine Learning Research, 3:1307–1331, 2003.
[8] M.J. Greenacre. Theory and applications of correspondence analysis. Academic Press, 1984.
[9] G. Hinton and S.T. Roweis. Stochastic neighbor embedding. In NIPS 15, 2002.
[10] H. Hotelling. The most predictable criterion. Journal of Educational Psych., 26:139–142, 1935.
[11] T. Iwata, K. Saito, N. Ueda, S. Stromsten, T. Griffiths, and J. Tenenbaum. Parametric embedding for class visualization. In NIPS 17, 2004.
[12] S. T. Roweis and L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290:2323–2326, 2000.
[13] J.B. Tenenbaum, V. de Silva, and J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319–2323, 2000.
[14] K. Q. Weinberger and L. K. Saul. Unsupervised learning of image manifolds by semidefinite programming. In CVPR, 2004.