
305 nips-2011-k-NN Regression Adapts to Local Intrinsic Dimension


Source: pdf

Author: Samory Kpotufe

Abstract: Many nonparametric regressors were recently shown to converge at rates that depend only on the intrinsic dimension of the data. These regressors thus escape the curse of dimension when high-dimensional data has low intrinsic dimension (e.g. a manifold). We show that k-NN regression is also adaptive to intrinsic dimension. In particular, our rates are local to a query x and depend only on the way masses of balls centered at x vary with radius. Furthermore, we show a simple way to choose k = k(x) locally at any x so as to nearly achieve the minimax rate at x in terms of the unknown intrinsic dimension in the vicinity of x. We also establish that the minimax rate does not depend on a particular choice of metric space or distribution, but rather holds for any metric space and doubling measure.
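
As an informal illustration of the estimator studied in the paper, the Python sketch below computes a plain k-NN regression estimate at a single query point with a fixed k. It is not the paper's local selection rule k = k(x), which is developed in the text; the toy data, the Euclidean metric, and all names are illustrative assumptions.

    import numpy as np

    def knn_regress(X_train, y_train, x_query, k):
        # Plain k-NN regression estimate at x_query: average the responses
        # of the k training points closest to x_query (Euclidean metric assumed).
        dists = np.linalg.norm(X_train - x_query, axis=1)
        nearest = np.argsort(dists)[:k]
        return y_train[nearest].mean()

    # Toy data: a one-dimensional curve embedded in R^3, i.e. data whose
    # intrinsic dimension (1) is much smaller than its ambient dimension (3).
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 1.0, size=500)
    X = np.column_stack([t, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    y = t ** 2 + 0.05 * rng.normal(size=500)

    print(knn_regress(X, y, X[0], k=10))  # estimate of the regression function at X[0]

Here k is fixed by hand; the point of the paper is that k can instead be chosen locally at each query so that the resulting rate adapts to the (unknown) intrinsic dimension near that query.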


reference text

[1] C. J. Stone. Optimal rates of convergence for nonparametric estimators. Ann. Statist., 8:1348–1360, 1980.

[2] C. J. Stone. Optimal global rates of convergence for nonparametric regression. Ann. Statist., 10:1040–1053, 1982.

[3] S. Roweis and L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290, 2000.

[4] J. Tenenbaum, V. de Silva, and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290, 2000.

[5] M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373–1396, 2003.

[6] P. Bickel and B. Li. Local polynomial regression on unknown manifolds. Technical report, Department of Statistics, UC Berkeley, 2006.

[7] S. Kpotufe. Escaping the curse of dimensionality with a tree-based regressor. Conference on Learning Theory, 2009.

[8] S. Kpotufe. Fast, smooth, and adaptive regression in metric spaces. Neural Information Processing Systems, 2009.

[9] S. Kulkarni and S. Posner. Rates of convergence of nearest neighbor estimation under arbitrary sampling. IEEE Transactions on Information Theory, 41, 1995.

[10] L. Gyorfi, M. Kohler, A. Krzyzak, and H. Walk. A Distribution-Free Theory of Nonparametric Regression. Springer, New York, NY, 2002.

[11] C. Cutler. A review of the theory and estimation of fractal dimension. Nonlinear Time Series and Chaos, Vol. I: Dimension Estimation and Models, 1993.

[12] K. Clarkson. Nearest-neighbor searching and metric space dimensions. Nearest-Neighbor Methods for Learning and Vision: Theory and Practice, 2005.

[13] M. do Carmo. Riemannian Geometry. Birkhauser, 1992.

[14] V. Vapnik and A. Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and Its Applications, 16:264–280, 1971.

[15] S. Kpotufe. k-NN regression adapts to local intrinsic dimension. arXiv:1110.4300, 2011.

[16] J. G. Staniswalis. Local bandwidth selection for kernel estimates. Journal of the American Statistical Association, 84:284–288, 1989.

[17] R. Cao-Abad. Rate of convergence for the wild bootstrap in nonparametric regression. Annals of Statistics, 19:2226–2231, 1991.