nips nips2004 nips2004-17 nips2004-17-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Jing Wang, Zhenyue Zhang, Hongyuan Zha
Abstract: Recently, there have been several advances in the machine learning and pattern recognition communities in developing manifold learning algorithms to construct nonlinear low-dimensional manifolds from sample data points embedded in high-dimensional spaces. In this paper, we develop algorithms that address two key issues in manifold learning: 1) the adaptive selection of the neighborhood sizes; and 2) better fitting of the local geometric structure to account for the variations in the curvature of the manifold and its interplay with the sampling density of the data set. We also illustrate the effectiveness of our methods on some synthetic data sets.
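The abstract's first key issue, adaptive selection of neighborhood sizes, is easiest to appreciate against the fixed-size local step used by tangent-space methods such as LTSA [7]. The sketch below is not the paper's adaptive algorithm; it is a minimal numpy illustration of the non-adaptive baseline, in which every point gets the same neighborhood size k and a d-dimensional tangent space is fitted by local PCA. The function name local_tangent_fit, the Swiss-roll example, and all parameter values are illustrative assumptions.

    import numpy as np

    def local_tangent_fit(X, k=10, d=2):
        """Fixed-k baseline local step (not the paper's adaptive scheme).

        For each point, take its k nearest neighbors and fit a d-dimensional
        tangent space by PCA (SVD) on the centered neighborhood. The paper
        proposes choosing the neighborhood size per point instead of using
        one global k.
        """
        n = X.shape[0]
        # Pairwise squared Euclidean distances.
        sq = np.sum(X**2, axis=1)
        D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        tangents = []
        for i in range(n):
            idx = np.argsort(D[i])[:k + 1]        # the point itself plus its k neighbors
            Xi = X[idx] - X[idx].mean(axis=0)     # center the local patch
            # Leading d right singular vectors span the estimated tangent space.
            _, _, Vt = np.linalg.svd(Xi, full_matrices=False)
            tangents.append(Vt[:d])
        return tangents

    # Example: samples from a Swiss-roll surface in R^3 (illustrative data only).
    rng = np.random.default_rng(0)
    t = 3 * np.pi * (1 + 2 * rng.random(500)) / 2
    X = np.column_stack([t * np.cos(t), 20 * rng.random(500), t * np.sin(t)])
    T = local_tangent_fit(X, k=10, d=2)   # list of 2x3 orthonormal tangent bases

In this baseline a single k must balance manifold curvature against sampling density everywhere, which is exactly the tension that the adaptive neighborhood selection and improved local fitting described in the abstract are meant to resolve.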
[1] M. Brand. Charting a manifold. Advances in Neural Information Processing Systems, 15, MIT Press, 2003.
[2] D. Donoho and C. Grimes. Hessian eigenmaps: locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences, 100:5591–5596, 2003.
[3] S. Roweis and L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290: 2323–2326, 2000.
[4] L. Saul and S. Roweis. Think globally, fit locally: unsupervised learning of nonlinear manifolds. Journal of Machine Learning Research, 4:119–155, 2003.
[5] Y. W. Teh and S. Roweis. Automatic Alignment of Local Representations. Advances in Neural Information Processing Systems, 15, MIT Press, 2003.
[6] J. Tenenbaum, V. de Silva and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319–2323, 2000.
[7] Z. Zhang and H. Zha. Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment. SIAM J. Scientific Computing, 26:313–338, 2004.
[8] J. Wang, Z. Zhang and H. Zha. Adaptive Manifold Learning. Technical Report CSE04-21, Dept. CSE, Pennsylvania State University, 2004.