
120 nips-2003-Locality Preserving Projections


Source: pdf

Authors: Xiaofei He, Partha Niyogi

Abstract: Many problems in information processing involve some form of dimensionality reduction. In this paper, we introduce Locality Preserving Projections (LPP). These are linear projective maps that arise by solving a variational problem that optimally preserves the neighborhood structure of the data set. LPP should be seen as an alternative to Principal Component Analysis (PCA), a classical linear technique that projects the data along the directions of maximal variance. When the high-dimensional data lies on a low-dimensional manifold embedded in the ambient space, the Locality Preserving Projections are obtained by finding the optimal linear approximations to the eigenfunctions of the Laplace-Beltrami operator on the manifold. As a result, LPP shares many of the data representation properties of nonlinear techniques such as Laplacian Eigenmaps or Locally Linear Embedding. Yet LPP is linear and, more crucially, is defined everywhere in the ambient space rather than just on the training data points. This is borne out by illustrative examples on some high-dimensional data sets.
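
The abstract compresses the whole construction, so a small sketch may help. The following is a minimal NumPy/SciPy illustration of the usual LPP recipe (k-nearest-neighbour adjacency graph, heat-kernel edge weights, then a generalized eigenproblem whose bottom eigenvectors give the projection directions). The function name lpp, the parameter defaults k=5 and t=1.0, and the small ridge term added for numerical stability are illustrative choices of this sketch, not details taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def lpp(X, n_components=2, k=5, t=1.0):
    """Project X (n_samples x n_features) onto locality-preserving directions."""
    n = X.shape[0]

    # Step 1: adjacency graph -- connect each point to its k nearest neighbours.
    sq_dists = cdist(X, X, 'sqeuclidean')
    neighbours = np.argsort(sq_dists, axis=1)[:, 1:k + 1]  # column 0 is the point itself

    # Step 2: heat-kernel edge weights W_ij = exp(-||x_i - x_j||^2 / t).
    W = np.zeros((n, n))
    for i in range(n):
        W[i, neighbours[i]] = np.exp(-sq_dists[i, neighbours[i]] / t)
    W = np.maximum(W, W.T)  # symmetrise the graph

    # Step 3: degree matrix D and graph Laplacian L = D - W.
    D = np.diag(W.sum(axis=1))
    L = D - W

    # Generalized eigenproblem  X^T L X a = lambda X^T D X a
    # (rows of X are samples here, hence X^T where column-sample notation writes X).
    A = X.T @ L @ X
    B = X.T @ D @ X
    B += 1e-9 * np.eye(B.shape[0])  # ridge term for stability -- an assumption of this sketch
    eigvals, eigvecs = eigh(A, B)   # eigenvalues returned in ascending order
    return eigvecs[:, :n_components]  # bottom eigenvectors span the projection


# Because the learned map is linear, it is defined everywhere in the ambient
# space and extends to points outside the training set:
X_train = np.random.rand(100, 10)
P = lpp(X_train, n_components=2)
X_new = np.random.rand(5, 10)
Y_new = X_new @ P  # embedding of unseen points, unlike Laplacian Eigenmaps or LLE
```

The last two lines illustrate the property the abstract emphasizes: unlike the nonlinear embeddings of [2], [5], and [6], the projection matrix applies to any point in the ambient space, not just the training data.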


References

[1] P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman, “Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 711-720, July 1997.

[2] M. Belkin and P. Niyogi, “Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering,” Advances in Neural Information Processing Systems 14, Vancouver, British Columbia, Canada, 2002.

[3] C. L. Blake and C. J. Merz, “UCI Repository of Machine Learning Databases,” http://www.ics.uci.edu/~mlearn/MLRepository.html, Irvine, CA: University of California, Department of Information and Computer Science, 1998.

[4] Fan R. K. Chung, Spectral Graph Theory, Regional Conference Series in Mathematics, no. 92, 1997.

[5] Sam Roweis and Lawrence K. Saul, “Nonlinear Dimensionality Reduction by Locally Linear Embedding,” Science, vol. 290, 22 December 2000.

[6] Joshua B. Tenenbaum, Vin de Silva, and John C. Langford, “A Global Geometric Framework for Nonlinear Dimensionality Reduction,” Science, vol. 290, 22 December 2000.

[7] M. Turk and A. Pentland, “Eigenfaces for Recognition,” Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86, 1991.

[8] Yale Univ. Face Database, http://cvc.yale.edu/projects/yalefaces/yalefaces.html.