jmlr jmlr2013 jmlr2013-86 jmlr2013-86-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Binbin Lin, Xiaofei He, Chiyuan Zhang, Ming Ji
Abstract: We propose a novel dimensionality reduction method based on local isometry, approached from the perspective of vector fields, which we call parallel vector field embedding (PFE). We first discuss local and global isometry to show the intrinsic connection between parallel vector fields and isometry: the problem of finding an isometry turns out to be equivalent to finding orthonormal parallel vector fields on the data manifold. We therefore first compute orthonormal parallel vector fields by solving a variational problem on the manifold. Each embedding function is then obtained by requiring its gradient field to be as close as possible to the corresponding parallel vector field. Theoretical results show that our method recovers the manifold exactly when it is isometric to a connected open subset of Euclidean space. Experiments on both synthetic and real data demonstrate the effectiveness of our method even under heavy noise and high curvature. Keywords: manifold learning, isometry, vector field, covariant derivative, out-of-sample extension
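The second stage described in the abstract (fitting each embedding function so that its gradient field matches a parallel vector field) reduces to a linear least-squares problem on a neighborhood graph. The sketch below is illustrative only: it assumes a tangent vector field is already given (skipping the variational computation of the parallel field itself), and the function name and neighborhood size are our own choices, not the authors' implementation.

```python
import numpy as np

def fit_embedding_from_field(X, V, k=4):
    """Recover one embedding coordinate f whose discrete gradient matches
    the given vector field V, via least squares over k-NN edges."""
    n = X.shape[0]
    # pairwise Euclidean distances (brute force, fine for a small toy set)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    eqs = []
    for i in range(n):
        for j in np.argsort(D[i])[1:k + 1]:  # k nearest neighbors of x_i
            # one equation per directed edge: f[j] - f[i] = V[i] . (x_j - x_i)
            eqs.append((i, j, V[i] @ (X[j] - X[i])))
    A = np.zeros((len(eqs) + 1, n))
    b = np.zeros(len(eqs) + 1)
    for r, (i, j, d) in enumerate(eqs):
        A[r, i], A[r, j], b[r] = -1.0, 1.0, d
    A[-1, 0] = 1.0  # gauge row: pin f[0] = 0 (f is only determined up to translation)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# Toy check: points on a quarter circle with its unit tangent field.
# The recovered coordinate should approximate arc length, i.e. an isometric
# parameterization of the curve.
theta = np.linspace(0.0, np.pi / 2, 40)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
V = np.stack([-np.sin(theta), np.cos(theta)], axis=1)
f = fit_embedding_from_field(X, V)
```

In this toy case each edge measurement is V_i · (x_j − x_i) = sin(θ_j − θ_i) ≈ θ_j − θ_i, so the least-squares solution reproduces the arc-length coordinate up to the gauge constant, matching the abstract's claim of exact recovery for manifolds isometric to an open subset of Euclidean space.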
References:
M. Belkin and P. Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. In Advances in Neural Information Processing Systems 14, pages 585–591. MIT Press, Cambridge, MA, 2001.
M. Belkin and P. Niyogi. Towards a theoretical foundation for Laplacian-based manifold methods. In COLT, pages 486–500, 2005.
M. Belkin and P. Niyogi. Convergence of Laplacian eigenmaps. In Advances in Neural Information Processing Systems 19, pages 129–136, 2007.
Y. Bengio, J.-F. Paiement, P. Vincent, O. Delalleau, N. Le Roux, and M. Ouimet. Out-of-sample extensions for LLE, Isomap, MDS, eigenmaps, and spectral clustering. In Advances in Neural Information Processing Systems 16, 2003.
M. Brand. Charting a manifold. In Advances in Neural Information Processing Systems 16, 2003.
B. Chow, P. Lu, and L. Ni. Hamilton's Ricci Flow. AMS, Providence, Rhode Island, 2006.
F. R. K. Chung. Spectral Graph Theory, volume 92 of Regional Conference Series in Mathematics. AMS, 1997.
R. R. Coifman and S. Lafon. Diffusion maps. Applied and Computational Harmonic Analysis, 21(1):5–30, 2006. Special issue on diffusion maps and wavelets.
T. Cox and M. Cox. Multidimensional Scaling. Chapman & Hall, London, 1994.
A. Defant and K. Floret. Tensor Norms and Operator Ideals. North-Holland Mathematics Studies, North-Holland, Amsterdam, 1993.
P. Dollár, V. Rabaud, and S. Belongie. Non-isometric manifold learning: Analysis and an algorithm. In Proceedings of the Twenty-fourth International Conference on Machine Learning, pages 241–248, 2007.
D. L. Donoho and C. E. Grimes. Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences of the United States of America, 100(10):5591–5596, 2003.
R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. Wiley-Interscience, Hoboken, NJ, 2nd edition, 2000.
Y. Goldberg and Y. Ritov. Local Procrustes for manifold embedding: a measure of embedding quality and embedding algorithms. Machine Learning, 77(1):1–25, 2009.
Y. Goldberg, A. Zakai, D. Kushnir, and Y. Ritov. Manifold learning: The price of normalization. Journal of Machine Learning Research, 9:1909–1939, 2008.
G. H. Golub and C. F. Van Loan. Matrix Computations. Johns Hopkins University Press, 3rd edition, 1996.
J. Ham, D. D. Lee, S. Mika, and B. Schölkopf. A kernel view of the dimensionality reduction of manifolds. In Proceedings of the Twenty-first International Conference on Machine Learning, 2004.
M. Hein, J. Audibert, and U. von Luxburg. From graphs to manifolds – weak and strong pointwise consistency of graph Laplacians. In COLT, pages 470–485. Springer, 2005.
I. T. Jolliffe. Principal Component Analysis. Springer-Verlag, New York, 1989.
K. I. Kim, F. Steinke, and M. Hein. Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction. In Advances in Neural Information Processing Systems 22, pages 979–987, 2009.
S. Lafon and A. B. Lee. Diffusion maps and coarse-graining: A unified framework for dimensionality reduction, graph partitioning, and data set parameterization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28:1393–1403, 2006.
J. M. Lee. Introduction to Smooth Manifolds. Springer-Verlag, New York, 2nd edition, 2003.
T. Lin and H. Zha. Riemannian manifold learning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(5):796–809, 2008.
B. Nadler, S. Lafon, R. Coifman, and I. Kevrekidis. Diffusion maps, spectral clustering and eigenfunctions of Fokker-Planck operators. In Advances in Neural Information Processing Systems 18, pages 955–962, 2006.
P. Petersen. Riemannian Geometry. Springer, New York, 1998.
S. Roweis and L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326, 2000.
B. Schölkopf, A. Smola, and K. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10:1299–1319, 1998.
R. Sibson. Studies in the robustness of multidimensional scaling: Procrustes statistics. Journal of the Royal Statistical Society, Series B (Methodological), 40(2):234–238, 1978.
T. Sim, S. Baker, and M. Bsat. The CMU pose, illumination, and expression database. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(12):1615–1618, 2003.
A. Singer and H.-T. Wu. Vector diffusion maps and the connection Laplacian. ArXiv e-prints, 2011.
J. Tenenbaum, V. de Silva, and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500):2319–2323, 2000.
K. Q. Weinberger, F. Sha, and L. K. Saul. Learning a kernel matrix for nonlinear dimensionality reduction. In Proceedings of the Twenty-first International Conference on Machine Learning, 2004.
Z. Zhang and H. Zha. Principal manifolds and nonlinear dimension reduction via local tangent space alignment. SIAM Journal of Scientific Computing, 26(1), 2004.