Authors: Michael Gashler, Dan Ventura, Tony Martinez
Abstract: Many algorithms have recently been developed for reducing dimensionality by projecting data onto an intrinsic non-linear manifold. Unfortunately, existing algorithms often lose significant precision in this transformation. Manifold Sculpting is a new algorithm that iteratively reduces dimensionality by simulating surface tension in local neighborhoods. We present several experiments showing that Manifold Sculpting yields more accurate results than existing algorithms on both generated and natural data sets. Manifold Sculpting is also able to benefit from prior dimensionality reduction efforts.
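The abstract's core idea — iteratively shrinking the dimensions to be discarded while nudging points so that distances to their neighbors are restored, like surface tension pulling a sheet flat — can be sketched in a few lines of NumPy. This is a rough illustration of that iterative principle, not the paper's actual algorithm; the function name `sculpt_step` and the parameters `sigma` (per-iteration shrink factor) and `lr` (step size) are our own choices for the sketch.

```python
import numpy as np

def sculpt_step(points, neighbors, target_dists, d_keep, sigma=0.99, lr=0.1):
    """One illustrative iteration: shrink the dimensions slated for
    removal, then nudge each point (within the preserved dimensions
    only) so its distances to neighbors drift back toward their
    original values."""
    points = points.copy()
    # Gradually crush the discarded dimensions toward zero.
    points[:, d_keep:] *= sigma
    for i, nbrs in enumerate(neighbors):
        for j, d0 in zip(nbrs, target_dists[i]):
            delta = points[i] - points[j]
            d = np.linalg.norm(delta)
            if d > 1e-12:
                # Move point i within the preserved dimensions so the
                # distance to neighbor j returns toward its original d0.
                points[i, :d_keep] += lr * (d0 - d) * (delta[:d_keep] / d)
    return points
```

Repeating this step many times drives the variance of the discarded dimensions toward zero while local neighborhood distances are continually pulled back toward their original values, so the first `d_keep` coordinates end up carrying the structure of the manifold.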
[1] Joshua B. Tenenbaum, Vin de Silva, and John C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319–2323, 2000.
[2] Sam T. Roweis and Lawrence K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290:2323–2326, 2000.
[3] Vin de Silva and Joshua B. Tenenbaum. Global versus local methods in nonlinear dimensionality reduction. In Advances in Neural Information Processing Systems, pages 705–712, 2002.
[4] Bernhard Schölkopf, Alexander J. Smola, and Klaus-Robert Müller. Kernel principal component analysis. Advances in kernel methods: support vector learning, pages 327–352, 1999.
[5] Mikhail Belkin and Partha Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. In Advances in Neural Information Processing Systems, 14, pages 585– 591, 2001.
[6] Matthew Brand. Charting a manifold. In Advances in Neural Information Processing Systems, 15, pages 961–968. MIT Press, Cambridge, MA, 2003.
[7] Pascal Vincent and Yoshua Bengio. Manifold parzen windows. In Advances in Neural Information Processing Systems 15, pages 825–832. MIT Press, Cambridge, MA, 2003.
[8] D. Donoho and C. Grimes. Hessian eigenmaps: locally linear embedding techniques for high dimensional data. Proceedings of the National Academy of Sciences, 100(10):5591–5596, 2003.
[9] Yoshua Bengio and Martin Monperrus. Non-local manifold tangent learning. In Advances in Neural Information Processing Systems 17, pages 129–136. MIT Press, Cambridge, MA, 2005.
[10] Elizaveta Levina and Peter J. Bickel. Maximum likelihood estimation of intrinsic dimension. In Advances in Neural Information Processing Systems, 2004.
[11] Zhenyue Zhang and Hongyuan Zha. A domain decomposition method for fast manifold learning. In Y. Weiss, B. Schölkopf, and J. Platt, editors, Advances in Neural Information Processing Systems 18. MIT Press, Cambridge, MA, 2006.
[12] Sam Roweis. EM algorithms for PCA and SPCA. In Michael I. Jordan, Michael J. Kearns, and Sara A. Solla, editors, Advances in Neural Information Processing Systems, volume 10, 1998.
[13] Lawrence K. Saul and Sam T. Roweis. Think globally, fit locally: Unsupervised learning of low dimensional manifolds. Journal of Machine Learning Research, 4:119–155, 2003.