
172 nips-2005-Selecting Landmark Points for Sparse Manifold Learning


Source: pdf

Author: Jorge Silva, Jorge Marques, João Lemos

Abstract: There has been a surge of interest in learning non-linear manifold models to approximate high-dimensional data. Both for computational complexity reasons and for generalization capability, sparsity is a desired feature in such models. This usually means dimensionality reduction, which naturally implies estimating the intrinsic dimension, but it can also mean selecting a subset of the data to use as landmarks, which is especially important because many existing algorithms have quadratic complexity in the number of observations. This paper presents an algorithm for selecting landmarks, based on LASSO regression, which is well known to favor sparse approximations because it uses regularization with an l1 norm. As an added benefit, a continuous manifold parameterization, based on the landmarks, is also found. Experimental results with synthetic and real data illustrate the algorithm.
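
The abstract describes landmark selection driven by LASSO (l1-regularized) regression. The sketch below illustrates the general idea only, not the authors' exact algorithm: each observation is regressed onto the remaining points with an l1 penalty, and points that frequently receive non-zero reconstruction weights are kept as landmarks. The regularization strength alpha, the non-zero threshold, the voting scheme, and the toy data are illustrative assumptions, using scikit-learn's Lasso.

    # Minimal sketch of l1-based landmark selection (assumed scheme, not the paper's).
    import numpy as np
    from sklearn.linear_model import Lasso

    def select_landmarks(X, alpha=0.1, max_landmarks=20):
        """Reconstruct each point from the other points with an l1 penalty;
        points that receive non-zero weights most often are returned as landmarks.
        alpha and the voting/threshold choices are illustrative, not the paper's."""
        n, d = X.shape
        votes = np.zeros(n)
        for i in range(n):
            mask = np.arange(n) != i
            # Regress point i on all other points; the l1 penalty drives most
            # coefficients to exactly zero, giving a sparse reconstruction.
            model = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
            model.fit(X[mask].T, X[i])        # columns = candidate landmarks
            weights = np.zeros(n)
            weights[mask] = np.abs(model.coef_)
            votes += weights > 1e-8           # count non-zero contributions
        return np.argsort(votes)[::-1][:max_landmarks]

    if __name__ == "__main__":
        # Toy example: noisy points on a 1-D curve embedded in 3-D.
        rng = np.random.default_rng(0)
        t = rng.uniform(0, 3 * np.pi, 200)
        X = np.c_[np.cos(t), np.sin(t), t] + 0.01 * rng.normal(size=(200, 3))
        print("Selected landmark indices:", select_landmarks(X, alpha=0.05, max_landmarks=10))

In this sketch sparsity comes directly from the l1 penalty, mirroring the abstract's point that LASSO favors sparse approximations; the subsequent manifold parameterization step mentioned in the abstract is not reproduced here.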


reference text

[1] Å. Björck and G. H. Golub. Numerical methods for computing angles between linear subspaces. Mathematics of Computation, 27, 1973.

[2] J. Chen and X. Huo. Sparse representation for multiple measurement vectors (MMV) in an over-complete dictionary. ICASSP, 2005.

[3] V. de Silva and J. B. Tenenbaum. Global versus local methods in nonlinear dimensionality reduction. NIPS, 15, 2002.

[4] B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani. Least angle regression. Annals of Statistics, 2003.

[5] T. Hastie, R. Tibshirani, and J. H. Friedman. The Elements of Statistical Learning. Springer, 2001.

[6] H. Lähdesmäki, O. Yli-Harja, W. Zhang, and I. Shmulevich. Intrinsic dimensionality in gene expression analysis. GENSIPS, 2005.

[7] T. Poggio and S. Smale. The mathematics of learning: Dealing with data. Notices of the American Mathematical Society, 2003.

[8] S. T. Roweis and L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290:2323–2326, 2000.

[9] J. Silva, J. Marques, and J. M. Lemos. Non-linear dimension reduction with tangent bundle approximation. ICASSP, 2005.

[10] J. B. Tenenbaum, V. de Silva, and J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319–2323, 2000.

[11] M. Turk and A. Pentland. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3:71–86, 1991.