
278 nips-2010: Universal Consistency of Multi-Class Support Vector Classification


Source: pdf

Author: Tobias Glasmachers

Abstract: Steinwart was the first to prove universal consistency of support vector machine classification. His proof analyzed the ‘standard’ support vector machine classifier, which is restricted to binary classification problems. In contrast, recent analysis has resulted in the common belief that several extensions of SVM classification to more than two classes are inconsistent. Countering this belief, we prove the universal consistency of the multi-class support vector machine by Crammer and Singer. Our proof extends Steinwart’s techniques to the multi-class case.
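For context, a minimal sketch of the machine the abstract refers to: the Crammer–Singer multi-class SVM learns one scoring function per class and penalizes, per example, the margin violation of the best-scoring competing class. The notation below (feature map \Phi, class-wise weight vectors w_m, slack variables \xi_i, regularization parameter C) is standard textbook notation assumed for illustration, not taken from the paper itself.

% Crammer-Singer multi-class SVM in its common primal form (generic
% notation, assumed for illustration). Given data (x_i, y_i), i = 1..n,
% with labels y_i in {1,...,k}, learn f_m(x) = <w_m, Phi(x)> for each
% class m and predict argmax_m f_m(x).
\begin{align*}
  \min_{w_1,\dots,w_k,\;\xi} \quad
    & \frac{1}{2} \sum_{m=1}^{k} \|w_m\|^2 \;+\; C \sum_{i=1}^{n} \xi_i \\
  \text{s.t.} \quad
    & \langle w_{y_i}, \Phi(x_i) \rangle - \langle w_m, \Phi(x_i) \rangle
      \;\ge\; 1 - \xi_i
      \qquad \text{for all } m \neq y_i, \\
    & \xi_i \ge 0, \qquad i = 1, \dots, n.
\end{align*}
% Equivalently, each example incurs the loss
%   max_{m != y_i} [ 1 - ( f_{y_i}(x_i) - f_m(x_i) ) ]_+ ,
% i.e. only the strongest competing class is penalized.

Universal consistency, the property established in the paper, means that with a universal kernel and a suitably chosen regularization schedule, the risk of the learned classifier converges in probability to the Bayes risk for every data-generating distribution as the sample size grows.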


References

[1] C. Cortes and V. Vapnik. Support-Vector Networks. Machine Learning, 20(3):273–297, 1995.

[2] K. Crammer and Y. Singer. On the Algorithmic Implementation of Multiclass Kernel-Based Vector Machines. Journal of Machine Learning Research, 2:265–292, 2002.

[3] S. Hill and A. Doucet. A Framework for Kernel-Based Multi-Category Classification. Journal of Artificial Intelligence Research, 30:525–564, 2007.

[4] Y. Lee, Y. Lin, and G. Wahba. Multicategory Support Vector Machines: Theory and Application to the Classification of Microarray Data and Satellite Radiance Data. Journal of the American Statistical Association, 99(465):67–82, 2004.

[5] Y. Liu. Fisher Consistency of Multicategory Support Vector Machines. In Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics (AISTATS), pages 291–298, 2007.

[6] I. Steinwart. Support Vector Machines are Universally Consistent. Journal of Complexity, 18(3):768–791, 2002.

[7] A. Tewari and P. L. Bartlett. On the Consistency of Multiclass Classification Methods. Journal of Machine Learning Research, 8:1007–1025, 2007.

[8] V. Vapnik. Statistical Learning Theory. Wiley, New York, 1998.

[9] J. Weston and C. Watkins. Support Vector Machines for Multi-Class Pattern Recognition. In M. Verleysen, editor, Proceedings of the Seventh European Symposium on Artificial Neural Networks (ESANN), pages 219–224, 1999.