jmlr2009-78: reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Yuesheng Xu, Haizhang Zhang
Abstract: We continue our recent study of constructing a refinement kernel for a given kernel so that the reproducing kernel Hilbert space associated with the refinement kernel contains the one associated with the original kernel as a subspace. To motivate this study, we first develop a refinement kernel method for learning, which yields an efficient algorithm for updating a learning predictor. Several characterizations of refinement kernels are then presented. It is shown that a nontrivial refinement kernel for a given kernel always exists whenever the input space is of infinite cardinality. Refinement kernels for translation invariant kernels and Hilbert-Schmidt kernels are investigated, and various concrete examples are provided.
Keywords: reproducing kernels, reproducing kernel Hilbert spaces, learning with kernels, refinement kernels, translation invariant kernels, Hilbert-Schmidt kernels
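The refinement relation described in the abstract can be sketched as follows; the notation below is an illustrative assumption, not taken verbatim from the paper.

```latex
% \mathcal{H}_K denotes the reproducing kernel Hilbert space (RKHS)
% of a kernel K on an input space X. The abstract's refinement
% relation, in assumed notation:
\[
  K' \text{ refines } K
  \quad\Longleftrightarrow\quad
  \mathcal{H}_{K} \subseteq \mathcal{H}_{K'} ,
\]
% so a predictor trained with the original kernel K,
\[
  f = \sum_{j=1}^{n} c_j \, K(x_j, \cdot) \;\in\; \mathcal{H}_{K} ,
\]
% already belongs to the larger space \mathcal{H}_{K'} and can be
% updated there, rather than relearned from scratch -- which is the
% efficiency the abstract attributes to the refinement kernel method.
```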
References:
N. Aronszajn. Theory of reproducing kernels. Trans. Amer. Math. Soc., 68:337–404, 1950.
S. Bochner. Lectures on Fourier Integrals with an Author's Supplement on Monotonic Functions, Stieltjes Integrals, and Harmonic Analysis. Annals of Mathematics Studies 42, Princeton University Press, Princeton, NJ, 1959.
O. Bousquet and A. Elisseeff. Stability and generalization. Journal of Machine Learning Research, 2:499–526, 2002.
Q. Chen, C. A. Micchelli, S. Peng and Y. Xu. Multivariate filter banks having a matrix factorization. SIAM J. Matrix Anal. Appl., 25:517–531, 2003.
Q. Chen, C. A. Micchelli and Y. Xu. On the matrix completion problem for multivariate filter bank construction. Adv. Comput. Math., 26:173–204, 2007.
J. B. Conway. A Course in Functional Analysis. 2nd Edition, Springer-Verlag, New York, 1990.
F. Cucker and S. Smale. On the mathematical foundations of learning. Bull. Amer. Math. Soc., 39:1–49, 2002.
I. Daubechies. Ten Lectures on Wavelets. CBMS-NSF Regional Conference Series in Applied Mathematics 61, Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA, 1992.
T. Evgeniou, M. Pontil and T. Poggio. Regularization networks and support vector machines. Adv. Comput. Math., 13:1–50, 2000.
C. H. FitzGerald, C. A. Micchelli and A. Pinkus. Functions that preserve families of positive semidefinite matrices. Linear Algebra Appl., 221:83–102, 1995.
C. Gasquet and P. Witomski. Fourier Analysis and Applications. Springer-Verlag, New York, 1999.
G. H. Golub and C. F. Van Loan. Matrix Computations. 3rd Edition, Johns Hopkins University Press, Baltimore, MD, 1996.
T. N. T. Goodman and S. L. Lee. Wavelets of multiplicity r. Trans. Amer. Math. Soc., 342:307–324, 1994.
L. Grafakos. Classical and Modern Fourier Analysis. Prentice Hall, New Jersey, 2004.
H. Hochstadt. Integral Equations. Wiley, New York, 1973.
R. Q. Jia. Characterization of smoothness of multivariate refinable functions in Sobolev spaces. Trans. Amer. Math. Soc., 351:4089–4112, 1999.
R. Q. Jia, S. D. Riemenschneider and D. X. Zhou. Smoothness of multiple refinable functions and multiple wavelets. SIAM J. Matrix Anal. Appl., 21:1–28, 1999.
G. Kimeldorf and G. Wahba. Some results on Tchebycheffian spline functions. J. Math. Anal. Appl., 33:82–95, 1971.
J. Mercer. Functions of positive and negative type and their connection with the theory of integral equations. Philos. Trans. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci., 209:415–446, 1909.
C. A. Micchelli and M. Pontil. Learning the kernel function via regularization. Journal of Machine Learning Research, 6:1099–1125, 2005.
C. A. Micchelli and M. Pontil. On learning vector-valued functions. Neural Comput., 17:177–204, 2005.
C. A. Micchelli and T. Sauer. Regularity of multiwavelets. Adv. Comput. Math., 7:455–545, 1997.
C. A. Micchelli and Y. Xu. Using the matrix refinement equation for the construction of wavelets on invariant sets. Appl. Comput. Harmon. Anal., 1:391–401, 1994.
C. A. Micchelli, Y. Xu and P. Ye. Cucker–Smale learning theory in Besov spaces. In Advances in Learning Theory: Methods, Models and Applications, pages 47–68, IOS Press, Amsterdam, The Netherlands, 2003.
C. A. Micchelli, Y. Xu and H. Zhang. Universal kernels. Journal of Machine Learning Research, 7:2651–2667, 2006.
S. Mukherjee, P. Niyogi, T. Poggio and R. Rifkin. Learning theory: stability is sufficient for generalization and necessary and sufficient for empirical risk minimization. Adv. Comput. Math., 25:161–193, 2006.
J. R. Munkres. Topology. 2nd Edition, Prentice Hall, Upper Saddle River, NJ, 2000.
R. Opfer. Multiscale kernels. Adv. Comput. Math., 25:357–380, 2006.
A. Rakotomamonjy and S. Canu. Frames, reproducing kernels, regularization and learning. Journal of Machine Learning Research, 6:1485–1515, 2005.
T. J. Rivlin. Chebyshev Polynomials. 2nd Edition, Wiley, New York, 1990.
W. Rudin. Real and Complex Analysis. 3rd Edition, McGraw-Hill, New York, 1987.
S. Saitoh. Theory of Reproducing Kernels and its Applications. Pitman Research Notes in Mathematics Series 189, Longman, UK, 1988.
I. J. Schoenberg. Metric spaces and completely monotone functions. Ann. of Math. (2), 39:811–841, 1938.
I. J. Schoenberg. Positive definite functions on spheres. Duke Math. J., 9:96–108, 1942.
B. Schölkopf, R. Herbrich and A. J. Smola. A generalized representer theorem. In Proceedings of the 14th Annual Conference on Computational Learning Theory and the 5th European Conference on Computational Learning Theory, pages 416–426, Springer-Verlag, London, UK, 2001.
B. Schölkopf and A. J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge, MA, 2002.
J. Shawe-Taylor and N. Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge, 2004.
S. Smale and D. X. Zhou. Estimating the approximation error in learning theory. Anal. Appl., 1:17–41, 2003.
E. M. Stein and G. Weiss. Introduction to Fourier Analysis on Euclidean Spaces. Princeton University Press, Princeton, NJ, 1971.
I. Steinwart. On the influence of the kernel on the consistency of support vector machines. Journal of Machine Learning Research, 2:67–93, 2001.
I. Steinwart and C. Scovel. Fast rates for support vector machines using Gaussian kernels. In Proceedings of the 18th Annual Conference on Learning Theory (COLT 2005), pages 279–294, Bertinoro, 2005.
H. Sun. Mercer theorem for RKHS on noncompact sets. J. Complexity, 21:337–349, 2005.
V. N. Vapnik. Statistical Learning Theory. Wiley, New York, 1998.
G. Wahba. Support vector machines, reproducing kernel Hilbert spaces and the randomized GACV. In Advances in Kernel Methods: Support Vector Learning, pages 69–86, MIT Press, Cambridge, MA, 1999.
C. Walder, B. Schölkopf and O. Chapelle. Implicit surface modelling with a globally regularised basis of compact support. Computer Graphics Forum, 25:635–644, 2006.
Y. Wang. Wavelets, tiling, and spectral sets. Duke Math. J., 114:43–57, 2002.
D. V. Widder. The Laplace Transform. Princeton University Press, Princeton, NJ, 1941.
Y. Xu and H. Zhang. Refinable kernels. Journal of Machine Learning Research, 8:2083–2120, 2007.
Y. Ying and D. X. Zhou. Learnability of Gaussians with flexible variances. Journal of Machine Learning Research, 8:249–276, 2007.
T. Zhang. Statistical behavior and consistency of classification methods based on convex risk minimization. Ann. Statist., 32:56–85, 2004.