
71 jmlr-2007-Refinable Kernels


Source: pdf

Author: Yuesheng Xu, Haizhang Zhang

Abstract: Motivated by mathematical learning from training data, we introduce the notion of refinable kernels. Various characterizations of refinable kernels are presented. The concept of refinable kernels leads to the introduction of wavelet-like reproducing kernels. We also investigate a refinable kernel that forms a Riesz basis. In particular, we characterize refinable translation invariant kernels, and refinable kernels defined by refinable functions. This study leads to multiresolution analysis of reproducing kernel Hilbert spaces.

Keywords: refinable kernels, refinable feature maps, wavelet-like reproducing kernels, dual kernels, learning with kernels, reproducing kernel Hilbert spaces, Riesz bases
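
The abstract mentions "refinable kernels defined by refinable functions." For background only, a minimal LaTeX sketch of the classical refinement (two-scale) equation for a refinable function is given below; the kernel-level notion of refinability and the role of refinable feature maps are defined in the paper itself and are not reproduced here.

% Classical refinement (two-scale) equation: \varphi is refinable if it is a
% finite or square-summable combination of its own dyadic dilates and shifts.
\[
  \varphi(x) \;=\; \sum_{k \in \mathbb{Z}} h(k)\, \varphi(2x - k),
\]
% where (h(k))_{k \in \mathbb{Z}} is the refinement mask.
% Example (Haar): \varphi = \chi_{[0,1)} satisfies
% \varphi(x) = \varphi(2x) + \varphi(2x - 1), so h(0) = h(1) = 1.

Loosely speaking, the paper's refinable kernels transfer this dilation relation from a single function to a reproducing kernel and its reproducing kernel Hilbert space, which is what makes the multiresolution analysis of such spaces possible.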


reference text

U. Amato, A. Antoniadis and M. Pensky. Wavelet kernel penalized estimation for non-equispaced design regression. Stat. Comput., 16: 37–55, 2006.
N. Aronszajn. Theory of reproducing kernels. Trans. Amer. Math. Soc., 68: 337–404, 1950.
A. Beurling. On two problems concerning linear transformations in Hilbert space. Acta Math., 81: 239–255, 1949.
S. Bochner. Lectures on Fourier Integrals with an Author's Supplement on Monotonic Functions, Stieltjes Integrals, and Harmonic Analysis. Annals of Mathematics Studies 42, Princeton University Press, New Jersey, 1959.
O. Bousquet and A. Elisseeff. Stability and generalization. Journal of Machine Learning Research, 2: 499–526, 2002.
A. S. Cavaretta, W. Dahmen and C. A. Micchelli. Stationary subdivision. Mem. Amer. Math. Soc., 93, no. 453, 1991.
Q. Chen, C. A. Micchelli, S. Peng and Y. Xu. Multivariate filter banks having a matrix factorization. SIAM J. Matrix Anal. Appl., 25: 517–531, 2003.
Q. Chen, C. A. Micchelli and Y. Xu. On the matrix completion problem for multivariate filter bank construction. Adv. Comput. Math., 26: 173–204, 2007.
Z. Chen, B. Wu and Y. Xu. Multilevel augmentation methods for solving operator equations. Numer. Math. J. Chinese Univ., 14: 31–55, 2005.
Z. Chen, B. Wu and Y. Xu. Multilevel augmentation methods for differential equations. Adv. Comput. Math., 24: 213–238, 2006.
Z. Chen, Y. Xu and H. Yang. A multilevel augmentation method for solving ill-posed operator equations. Inverse Problems, 22: 155–174, 2006.
J. B. Conway. A Course in Functional Analysis. 2nd Edition, Springer-Verlag, New York, 1990.
F. Cucker and S. Smale. On the mathematical foundations of learning. Bull. Amer. Math. Soc., 39: 1–49, 2002.
I. Daubechies. Ten Lectures on Wavelets. CBMS-NSF Regional Conference Series in Applied Mathematics 61, Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA, 1992.
R. J. Duffin and A. C. Schaeffer. A class of nonharmonic Fourier series. Trans. Amer. Math. Soc., 72: 341–366, 1952.
T. Evgeniou, M. Pontil and T. Poggio. Regularization networks and support vector machines. Adv. Comput. Math., 13: 1–50, 2000.
C. H. FitzGerald, C. A. Micchelli and A. Pinkus. Functions that preserve families of positive semidefinite matrices. Linear Algebra Appl., 221: 83–102, 1995.
J. B. Gao, C. J. Harris and S. R. Gunn. On a class of support vector kernels based on frames in function Hilbert spaces. Neural Comput., 13: 1975–1994, 2001.
L. Grafakos. Classical and Modern Fourier Analysis. Prentice Hall, New Jersey, 2004.
B. Grünbaum and G. C. Shephard. Tilings and Patterns. W. H. Freeman and Company, New York, 1989.
G. Kimeldorf and G. Wahba. Some results on Tchebycheffian spline functions. J. Math. Anal. Appl., 33: 82–95, 1971.
S. Mallat. Multiresolution approximations and wavelet orthonormal bases of L^2(R). Trans. Amer. Math. Soc., 315: 69–87, 1989.
S. Mallat. A Wavelet Tour of Signal Processing. 2nd Edition, Academic Press, San Diego, CA, 1998.
Y. Meyer. Wavelets and Operators. Cambridge University Press, Cambridge, 1992.
C. A. Micchelli and M. Pontil. A function representation for learning in Banach spaces. In Proceedings of the 17th Annual Conference on Learning Theory (COLT 04), pages 255–269, Banff, Alberta, 2004.
C. A. Micchelli and M. Pontil. Learning the kernel function via regularization. Journal of Machine Learning Research, 6: 1099–1125, 2005.
C. A. Micchelli and M. Pontil. On learning vector-valued functions. Neural Comput., 17: 177–204, 2005.
C. A. Micchelli, Y. Xu and P. Ye. Cucker-Smale learning theory in Besov spaces. In Advances in Learning Theory: Methods, Models and Applications, pages 47–68, IOS Press, Amsterdam, The Netherlands, 2003.
C. A. Micchelli, Y. Xu and H. Zhang. Universal kernels. Journal of Machine Learning Research, 7: 2651–2667, 2006.
S. Mukherjee, P. Niyogi, T. Poggio and R. Rifkin. Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization. Adv. Comput. Math., 25: 161–193, 2006.
R. Opfer. Multiscale kernels. Adv. Comput. Math., 25: 357–380, 2006.
R. Opfer. Tight frame expansions of multiscale reproducing kernels in Sobolev spaces. Appl. Comput. Harmon. Anal., 20: 357–374, 2006.
A. Rakotomamonjy and S. Canu. Frames, reproducing kernels, regularization and learning. Journal of Machine Learning Research, 6: 1485–1515, 2005.
A. Rakotomamonjy, X. Mary and S. Canu. Non-parametric regression with wavelet kernels. Appl. Stoch. Models Bus. Ind., 21: 153–163, 2005.
W. Rudin. Real and Complex Analysis. 3rd Edition, McGraw-Hill, New York, 1987.
B. Schölkopf, R. Herbrich and A. J. Smola. A generalized representer theorem. In Proceedings of the 14th Annual Conference on Computational Learning Theory and the 5th European Conference on Computational Learning Theory, pages 416–426, Springer-Verlag, London, UK, 2001.
B. Schölkopf and A. J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge, Mass, 2002.
J. Shawe-Taylor and N. Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge, 2004.
S. Smale and D. X. Zhou. Estimating the approximation error in learning theory. Anal. Appl., 1: 17–41, 2003.
S. Smale and D. X. Zhou. Shannon sampling and function reconstruction from point values. Bull. Amer. Math. Soc., 41: 279–305, 2004.
S. Smale and D. X. Zhou. Learning theory estimates via integral operators and their approximations. Constr. Approx., 26: 153–172, 2007.
I. Steinwart and C. Scovel. Fast rates for support vector machines using Gaussian kernels. In Proceedings of the 18th Annual Conference on Learning Theory (COLT 05), pages 279–294, Bertinoro, 2005.
V. N. Vapnik. Statistical Learning Theory. Wiley, New York, 1998.
G. Wahba. Support vector machines, reproducing kernel Hilbert spaces and the randomized GACV. In Advances in Kernel Methods–Support Vector Learning, pages 69–86, MIT Press, Cambridge, Mass, 1999.
C. Walder, K. I. Kim and B. Schölkopf. Sparse multiscale Gaussian process regression. Technical Report No. TR-162, Max Planck Institute for Biological Cybernetics, 2007.
C. Walder, B. Schölkopf and O. Chapelle. Implicit surface modelling with a globally regularised basis of compact support. Computer Graphics Forum, 25: 635–644, 2006.
Y. Ying and D. X. Zhou. Learnability of Gaussians with flexible variances. Journal of Machine Learning Research, 8: 249–276, 2007.
R. M. Young. An Introduction to Nonharmonic Fourier Series. Academic Press, New York, 1980.
B. Yu and H. Zhang. The Bedrosian identity and homogeneous semi-convolution equations. J. Integral Equations Appl., accepted, 2006.
T. Zhang. Statistical behavior and consistency of classification methods based on convex risk minimization. Ann. Statist., 32: 56–85, 2004.
D. X. Zhou. Density problem and approximation error in learning theory. Preprint, 2003.