NIPS 2004, paper 178 (nips2004-178-reference) — knowledge graph by maker-knowledge-mining
Source: pdf
Author: Jinbo Bi, Tong Zhang
Abstract: This paper investigates a new learning model in which the input data is corrupted with noise. We present a general statistical framework to tackle this problem. Based on the statistical reasoning, we propose a novel formulation of support vector classification that allows uncertainty in the input data. We derive an intuitive geometric interpretation of the proposed formulation and develop algorithms to solve it efficiently. Empirical results show that the new method is superior to the standard SVM for problems with noisy input.
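The abstract's core idea — support vector classification that stays robust when each input may be perturbed — can be illustrated with a small sketch. The sketch below shrinks each sample's margin by `delta * ||w||`, the worst case when every input is allowed to move inside a ball of radius `delta`, and minimizes the resulting hinge loss by subgradient descent. The loss form, the radius `delta`, and the solver are illustrative assumptions for this robust-margin setting, not the paper's exact formulation or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data; the inputs are then corrupted with additive
# noise, the setting the abstract describes.
n = 200
X = np.vstack([rng.normal(-2.0, 1.0, (n, 2)), rng.normal(2.0, 1.0, (n, 2))])
y = np.hstack([-np.ones(n), np.ones(n)])
X_noisy = X + rng.normal(0.0, 0.3, X.shape)

def train_robust_svm(X, y, delta=0.3, lam=0.01, lr=0.05, epochs=300):
    """Subgradient descent on a robust hinge loss: each margin is
    reduced by delta*||w||, the worst-case effect of moving each input
    anywhere inside a ball of radius delta (illustrative sketch)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    m = len(y)
    for _ in range(epochs):
        norm_w = np.linalg.norm(w) + 1e-12
        margins = y * (X @ w + b) - delta * norm_w
        active = margins < 1.0  # samples with nonzero hinge loss
        # Subgradient of mean robust hinge loss + lam*||w||^2.
        gw = 2.0 * lam * w + (
            -(y[active, None] * X[active]).sum(axis=0)
            + active.sum() * delta * w / norm_w
        ) / m
        gb = -y[active].sum() / m
        w -= lr * gw
        b -= lr * gb
    return w, b

w, b = train_robust_svm(X_noisy, y)
accuracy = np.mean(np.sign(X_noisy @ w + b) == y)
```

With `delta = 0` this reduces to an ordinary soft-margin SVM trained by subgradient descent; a positive `delta` pushes the separating hyperplane away from both classes, which is the geometric intuition behind favoring robust formulations on noisy inputs.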
[1] J. Bezdek and R. Hathaway. Convergence of alternating optimization. Neural, Parallel Sci. Comput., 11:351–368, 2003.
[2] C. Bhattacharyya, K.S. Pannagadatta, and A. J. Smola. A second order cone programming formulation for classifying missing data. In Advances in Neural Information Processing Systems, volume 17, 2005.
[3] J. Bi and V. N. Vapnik. Learning with rigorous support vector machines. In M. Warmuth and B. Schölkopf, editors, Proceedings of the 16th Annual Conference on Learning Theory, pages 35–42, Menlo Park, CA, 2003. AAAI Press.
[4] L. El Ghaoui and H. Lebret. Robust solutions to least-squares problems with uncertain data. SIAM Journal on Matrix Analysis and Applications, 18:1035–1064, 1997.
[5] G. H. Golub, P. C. Hansen, and D. P. O'Leary. Tikhonov regularization and total least squares. SIAM Journal on Matrix Analysis and Applications, 21(1):185–194, 1999.
[6] G. H. Golub and C. F. Van Loan. An analysis of the total least squares problem. SIAM Journal on Numerical Analysis, 17:883–893, 1980.
[7] S. Van Huffel and J. Vandewalle. The Total Least Squares Problem: Computational Aspects and Analysis, in Frontiers in Applied Mathematics 9. SIAM Press, Philadelphia, PA, 1991.
[8] V. N. Vapnik. Statistical Learning Theory. John Wiley & Sons, Inc., New York, 1998.