Author: Guillermo Canas, Tomaso Poggio, Lorenzo Rosasco
Abstract: We study the problem of estimating a manifold from random samples. In particular, we consider piecewise constant and piecewise linear estimators induced by k-means and k-flats, and analyze their performance. We extend previous results for k-means in two separate directions. First, we provide new results for k-means reconstruction on manifolds; second, we prove reconstruction bounds for higher-order approximation (k-flats), for which no results were previously available. While the results for k-means are novel, some of the technical tools are well established in the literature. In the case of k-flats, both the results and the mathematical tools are new.
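As a concrete illustration of the two estimators the abstract refers to, the sketch below runs both on random samples from the unit circle, a 1-manifold embedded in R^2. This is not the authors' code: the circle data, the choices of k, flat dimension d, and the number of iterations, and the Lloyd-style alternation used to fit the k-flats are all assumptions made for this example.

```python
# Illustrative sketch (assumptions noted above): k-means gives a piecewise
# constant reconstruction (each sample is replaced by its nearest centroid);
# k-flats gives a piecewise linear one (each sample is projected onto its
# nearest d-dimensional affine flat).

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Sample n points uniformly from the unit circle.
n = 2000
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# --- Piecewise constant estimator: k-means ---------------------------------
k = 20
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
recon_means = km.cluster_centers_[km.labels_]
err_means = np.mean(np.sum((X - recon_means) ** 2, axis=1))

# --- Piecewise linear estimator: k-flats ------------------------------------
# Lloyd-style alternation (an assumed fitting procedure): assign each point to
# its nearest flat, then refit each flat by PCA on its assigned points.
d = 1  # dimension of the flats (tangent-line approximation to the circle)

def fit_flat(points, d):
    """Best-fit d-dimensional affine flat: (mean, orthonormal basis) via SVD."""
    mu = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - mu, full_matrices=False)
    return mu, vt[:d]

def project(x, mu, basis):
    """Orthogonal projection of the rows of x onto the flat (mu, basis)."""
    centered = x - mu
    return mu + centered @ basis.T @ basis

def assign(X, flats):
    """Index of the nearest flat for each row of X."""
    dists = np.stack(
        [np.sum((X - project(X, mu, B)) ** 2, axis=1) for mu, B in flats],
        axis=1,
    )
    return dists.argmin(axis=1)

# Initialize the flats from the k-means partition, then alternate.
flats = [fit_flat(X[km.labels_ == j], d) for j in range(k)]
for _ in range(20):
    labels = assign(X, flats)
    flats = [
        fit_flat(X[labels == j], d) if np.any(labels == j) else flats[j]
        for j in range(k)
    ]

labels = assign(X, flats)  # final assignment against the refit flats
recon_flats = np.empty_like(X)
for j, (mu, B) in enumerate(flats):
    mask = labels == j
    recon_flats[mask] = project(X[mask], mu, B)
err_flats = np.mean(np.sum((X - recon_flats) ** 2, axis=1))

print(f"k-means mean squared reconstruction error: {err_means:.2e}")
print(f"k-flats mean squared reconstruction error: {err_flats:.2e}")
```

On a smooth manifold such as the circle, the k-flats reconstruction should achieve a noticeably smaller error than k-means for the same k, reflecting the higher-order (piecewise linear versus piecewise constant) approximation that the paper analyzes.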