
62 nips-2001-Duality, Geometry, and Support Vector Regression


Source: pdf

Author: J. Bi, Kristin P. Bennett

Abstract: We develop an intuitive geometric framework for support vector regression (SVR). By examining when ε-tubes exist, we show that SVR can be regarded as a classification problem in the dual space. Hard and soft ε-tubes are constructed by separating the convex or reduced convex hulls, respectively, of the training data with the response variable shifted up and down by ε. A novel SVR model is proposed based on choosing the max-margin plane between the two shifted datasets. Maximizing the margin corresponds to shrinking the effective ε-tube. In the proposed approach the effects of the choices of all parameters become clear geometrically.
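The abstract's construction can be illustrated directly: shift each response up and down by ε, label the two copies as opposite classes in (x, y) space, and take the max-margin separating plane; solving that plane for y recovers the regression function. The sketch below is an illustrative reconstruction of this idea (not the authors' exact algorithm), using a hard-margin linear SVM via scikit-learn; the function name `svr_via_classification` and the parameter defaults are assumptions for the example.

```python
import numpy as np
from sklearn.svm import SVC


def svr_via_classification(X, y, eps=0.5, C=1e6):
    """Sketch of SVR-as-classification: separate the data shifted up/down by eps.

    Returns (slope_vector, intercept) of the plane y = slope . x + intercept
    obtained from the max-margin separator between the two shifted sets.
    """
    n = len(y)
    # Augmented points: (x_i, y_i + eps) labeled +1, (x_i, y_i - eps) labeled -1.
    Z = np.vstack([np.column_stack([X, y + eps]),
                   np.column_stack([X, y - eps])])
    labels = np.hstack([np.ones(n), -np.ones(n)])
    # Large C approximates the hard-margin (hard eps-tube) case.
    clf = SVC(kernel="linear", C=C).fit(Z, labels)
    w = clf.coef_[0]
    b = clf.intercept_[0]
    # Separating plane w_x . x + w_y * y + b = 0  =>  y = -(w_x . x + b) / w_y.
    return -w[:-1] / w[-1], -b / w[-1]
```

For noiseless data on a line, an ε-tube exists for any ε > 0, and the max-margin plane between the two shifted hulls is the midline, i.e. the true regression line.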


References

[1] K. Bennett and E. Bredensteiner. Duality and Geometry in SVM Classifiers. In P. Langley, eds., Proc. of Seventeenth Intl. Conf. on Machine Learning, p 57–64, Morgan Kaufmann, San Francisco, 2000.

[2] D. Crisp and C. Burges. A Geometric Interpretation of ν-SVM Classifiers. In S. Solla, T. Leen, and K.-R. Müller, eds., Advances in Neural Info. Proc. Sys., Vol. 12, p. 244–251, MIT Press, Cambridge, MA, 1999.

[3] S.S. Keerthi, S.K. Shevade, C. Bhattacharyya and K.R.K. Murthy. A Fast Iterative Nearest Point Algorithm for Support Vector Machine Classifier Design. IEEE Transactions on Neural Networks, Vol. 11, pp. 124–136, 2000.

[4] O. Mangasarian. Nonlinear Programming. SIAM, Philadelphia, 1994.

[5] B. Schölkopf, P. Bartlett, A. Smola and R. Williamson. Shrinking the Tube: A New Support Vector Regression Algorithm. In M. Kearns, S. Solla, and D. Cohn, eds., Advances in Neural Info. Proc. Sys., Vol. 12, MIT Press, Cambridge, MA, 1999.

[6] V. Vapnik. The Nature of Statistical Learning Theory. Springer, New York, 1995.