
31 nips-2013-Adaptivity to Local Smoothness and Dimension in Kernel Regression


Source: pdf

Author: Samory Kpotufe, Vikas Garg

Abstract: We present the first result for kernel regression where the procedure adapts locally at a point x to both the unknown local dimension of the metric space X and the unknown Hölder-continuity of the regression function at x. The result holds with high probability simultaneously at all points x in a general metric space X of unknown structure.
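For context only, a minimal sketch of a kernel (Nadaraya-Watson style) regression estimate with a fixed bandwidth is given below. The Gaussian kernel, the function name, and the toy data are illustrative assumptions; the paper's actual contribution, selecting the bandwidth adaptively at each query point x, is not reproduced here.

```python
import numpy as np

def nadaraya_watson(x_query, X, y, h):
    """Kernel regression estimate at x_query (illustrative sketch).

    X : (n, d) array of training inputs
    y : (n,) array of training responses
    h : bandwidth; fixed here, whereas the paper's procedure
        chooses it adaptively at each query point
    """
    # Gaussian kernel weights based on distance to the query point
    dists = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-0.5 * (dists / h) ** 2)
    if w.sum() == 0:
        return y.mean()  # fall back when all weights vanish
    return np.dot(w, y) / w.sum()

# Toy usage on a 1-d regression problem
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
print(nadaraya_watson(np.array([0.5]), X, y, h=0.05))
```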


reference text

[1] C. J. Stone. Optimal rates of convergence for nonparametric estimators. Ann. Statist., 8:1348–1360, 1980.

[2] C. J. Stone. Optimal global rates of convergence for nonparametric estimators. Ann. Statist., 10:1340–1353, 1982.

[3] W. S. Cleveland and C. Loader. Smoothing by local regression: Principles and methods. Statistical theory and computational aspects of smoothing, 1049, 1996.

[4] L. Gyorfi, M. Kohler, A. Krzyzak, and H. Walk. A Distribution-Free Theory of Nonparametric Regression. Springer, New York, NY, 2002.

[5] J. Lafferty and L. Wasserman. Rodeo: Sparse nonparametric regression in high dimensions. arXiv preprint math/0506342, 2005.

[6] O. V. Lepski, E. Mammen, and V. G. Spokoiny. Optimal spatial adaptation to inhomogeneous smoothness: an approach based on kernel estimates with variable bandwidth selectors. The Annals of Statistics, pages 929–947, 1997.

[7] O. V. Lepski and V. G. Spokoiny. Optimal pointwise adaptive methods in nonparametric estimation. The Annals of Statistics, 25(6):2512–2546, 1997.

[8] O. V. Lepski and B. Y. Levit. Adaptive minimax estimation of infinitely differentiable functions. Mathematical Methods of Statistics, 7(2):123–156, 1998.

[9] S. Kpotufe. k-NN Regression Adapts to Local Intrinsic Dimension. NIPS, 2011.

[10] K. Clarkson. Nearest-neighbor searching and metric space dimensions. Nearest-Neighbor Methods for Learning and Vision: Theory and Practice, 2005.