
156 nips-2002-On the Complexity of Learning the Kernel Matrix


Source: pdf

Author: Olivier Bousquet, Daniel Herrmann

Abstract: We investigate data-based procedures for selecting the kernel when learning with Support Vector Machines. We provide generalization error bounds by estimating the Rademacher complexities of the corresponding function classes. In particular, we obtain a complexity bound for function classes induced by kernels with given eigenvectors, i.e., we allow the spectrum to vary while keeping the eigenvectors fixed. This bound is only a logarithmic factor larger than the complexity of the function class induced by a single kernel. However, optimizing the margin over such classes leads to overfitting. We therefore propose a suitable way of constraining the class. We use an efficient algorithm to solve the resulting optimization problem, present preliminary experimental results, and compare them to an alignment-based approach.
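To make the abstract's setting concrete, here is a minimal sketch in standard notation; the trace budget c and the normalization below are illustrative conventions, not taken verbatim from the paper. The kernel class with fixed eigenvectors v_1, ..., v_n and variable spectrum can be written as

    % Kernel family: eigenvectors fixed, spectrum (mu_i) free up to an assumed trace budget c
    \mathcal{K} = \Big\{ K = \sum_{i=1}^{n} \mu_i \, v_i v_i^\top \;:\; \mu_i \ge 0, \; \sum_{i=1}^{n} \mu_i \le c \Big\}

and the empirical Rademacher complexity used to bound generalization error is, in one common normalization,

    % Empirical Rademacher complexity of a class F; the sigma_i are i.i.d. uniform signs in {-1, +1}
    \hat{R}_n(\mathcal{F}) = \mathbb{E}_{\sigma}\Big[ \sup_{f \in \mathcal{F}} \frac{2}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \Big]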


reference text

[1] O. Chapelle, V. Vapnik, O. Bousquet, and S. Mukherjee. Choosing multiple parameters for support vector machines. Machine Learning, 46(1):131–159, 2002.

[2] N. Cristianini, J. Kandola, A. Elisseeff, and J. Shawe-Taylor. On optimizing kernel alignment. Journal of Machine Learning Research, 2002. To appear.

[3] L. Devroye and G. Lugosi. Combinatorial Methods in Density Estimation. Springer-Verlag, New York, 2000.

[4] J. Kandola, J. Shawe-Taylor, and N. Cristianini. Optimizing kernel alignment over combinations of kernels. In International Conference on Machine Learning, 2002. In press.

[5] G. Lanckriet, N. Cristianini, P. Bartlett, L. El Ghaoui, and M. I. Jordan. Learning the kernel matrix with semidefinite programming. In International Conference on Machine Learning, 2002. In press.

[6] M. Ledoux and M. Talagrand. Probability in Banach Spaces. Springer-Verlag, 1991.

[7] O. Bousquet and D. J. L. Herrmann. Towards Structured Kernel Machines. Work in progress.