Author: Partha Niyogi
Abstract: Manifold regularization (Belkin et al., 2006) is a geometrically motivated framework for machine learning within which several semi-supervised algorithms have been constructed. Here we try to provide some theoretical understanding of this approach. Our main result is to expose the natural structure of a class of problems on which manifold regularization methods are helpful. We show that for such problems, no supervised learner can learn effectively. On the other hand, a manifold based learner (that knows the manifold or "learns" it from unlabeled examples) can learn with relatively few labeled examples. Our analysis follows a minimax style with an emphasis on finite sample results (in terms of n: the number of labeled examples). These results allow us to properly interpret manifold regularization and related spectral and geometric algorithms in terms of their potential use in semi-supervised learning.
Keywords: semi-supervised learning, manifold regularization, graph Laplacian, minimax rates
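For concreteness, below is a minimal sketch of Laplacian Regularized Least Squares (LapRLS), one of the manifold regularization algorithms of Belkin et al. (2006). It fits f(x) = sum_i alpha_i K(x, x_i) over all l labeled and u unlabeled points, penalizing both the RKHS norm and the graph-smoothness term f^T L f computed from a graph Laplacian on the data. The Gaussian kernel, the kNN graph construction, and all hyperparameter values here are illustrative assumptions, not prescriptions from the paper.

    import numpy as np

    def rbf_kernel(X, Z, sigma=1.0):
        # Gaussian kernel Gram matrix between the rows of X and Z (assumed choice).
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def knn_laplacian(X, k=5):
        # Unnormalized Laplacian L = D - W of a symmetrized kNN graph.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        n = len(X)
        W = np.zeros((n, n))
        for i in range(n):
            W[i, np.argsort(d2[i])[1:k + 1]] = 1.0  # skip self at index 0
        W = np.maximum(W, W.T)  # symmetrize
        return np.diag(W.sum(axis=1)) - W

    def laprls(X, y, l, gamma_A=1e-2, gamma_I=1e-1, sigma=1.0, k=5):
        # Closed-form LapRLS: X holds l labeled points (targets y) followed by
        # u unlabeled points; returns the kernel expansion coefficients alpha.
        n = len(X)
        K = rbf_kernel(X, X, sigma)
        L = knn_laplacian(X, k)
        J = np.diag(np.r_[np.ones(l), np.zeros(n - l)])  # selects labeled rows
        Y = np.r_[y, np.zeros(n - l)]
        A = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n ** 2) * (L @ K)
        alpha = np.linalg.solve(A, Y)
        return alpha  # predict at new points via rbf_kernel(X_new, X, sigma) @ alpha

The scaling gamma_I * l / n**2 on the Laplacian term follows the l/(u+l)^2 normalization used in Belkin et al. (2006); the function names and default parameters are hypothetical.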
References:
Robert A. Adams and John J.F. Fournier. Sobolev Spaces, volume 140. Academic Press, 2003.
Mikhail Belkin and Partha Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373–1396, 2003.
Mikhail Belkin and Partha Niyogi. Towards a theoretical foundation for Laplacian-based manifold methods. In Eighteenth Annual Conference on Learning Theory, pages 486–500. Springer, Bertinoro, Italy, 2005.
Mikhail Belkin and Partha Niyogi. Convergence of Laplacian eigenmaps. In NIPS, pages 129–136, 2006.
Mikhail Belkin and Partha Niyogi. Towards a theoretical foundation for Laplacian-based manifold methods. Journal of Computer and System Sciences, 74(8):1289–1308, 2008.
Mikhail Belkin, Partha Niyogi, and Vikas Sindhwani. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research, 7:2399–2434, 2006.
Peter J. Bickel and Bo Li. Local polynomial regression on unknown manifolds. IMS Lecture Notes-Monograph Series, pages 177–186, 2007.
Vittorio Castelli and Thomas M. Cover. The relative value of labeled and unlabeled samples in pattern recognition with an unknown mixing parameter. IEEE Transactions on Information Theory, 42(6):2102–2117, 1996.
Ronald R. Coifman and Stéphane Lafon. Diffusion maps. Applied and Computational Harmonic Analysis, 21(1):5–30, 2006.
Theodoros Evgeniou, Massimiliano Pontil, and Tomaso Poggio. Regularization networks and support vector machines. Advances in Computational Mathematics, 13(1):1–50, 2000.
Evarist Giné and Vladimir Koltchinskii. Empirical graph Laplacian approximation of Laplace-Beltrami operators: Large sample results. Lecture Notes-Monograph Series, pages 238–259, 2006.
Alexander Grigoryan. Heat kernels on weighted manifolds and applications. Contemporary Mathematics, 398:93–191, 2006.
Matthias Hein, Jean-Yves Audibert, and Ulrike von Luxburg. From graphs to manifolds - weak and strong pointwise consistency of graph Laplacians. In COLT, pages 470–485, 2005.
John Lafferty and Larry Wasserman. Statistical analysis of semi-supervised regression. In Advances in Neural Information Processing Systems. NIPS Foundation, 2007.
Sam T. Roweis and Lawrence K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326, 2000.
Bernhard Schölkopf and Alexander J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization and Beyond. The MIT Press, 2002.
Grace Wahba. Spline Models for Observational Data, volume 59. Society for Industrial and Applied Mathematics, 1990.
Xiaojin Zhu. Semi-supervised learning literature survey. Technical Report TR 1530, University of Wisconsin–Madison, Computer Sciences Department, 2008.