
Multiple Incremental Decremental Learning of Support Vector Machines



Author: Masayuki Karasuyama, Ichiro Takeuchi

Abstract: We propose a multiple incremental decremental learning algorithm for the Support Vector Machine (SVM). The conventional single incremental decremental SVM can update the trained model efficiently when a single data point is added to or removed from the training set. When multiple data points are added and/or removed, however, this approach is time-consuming because it must be applied repeatedly, one data point at a time. The proposed algorithm is computationally more efficient when multiple data points are added and/or removed simultaneously. The single incremental decremental algorithm is built on an optimization technique called parametric programming; we extend this idea and introduce multi-parametric programming to develop the proposed algorithm. Experimental results on synthetic and real data sets indicate that the proposed algorithm can significantly reduce the computational cost of multiple incremental decremental operations. Our approach is especially useful for online SVM learning, in which old data points must be removed and new data points added within a short amount of time.

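The following is a minimal sketch of the online setting the abstract targets: a sliding window over a data stream where, at each step, several old points are removed and several new points are added at once. It uses scikit-learn and retrains the SVM from scratch on each window, which is the naive baseline the paper improves upon; the paper's algorithm would instead warm-start from the previous solution and track the KKT conditions while multiple points enter and leave via multi-parametric programming. The data, window size, step size, and RBF parameters below are illustrative assumptions, not values from the paper.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic two-class stream: the label is the sign of the feature sum.
n_stream, n_features = 1000, 5
X_stream = rng.normal(size=(n_stream, n_features))
y_stream = np.sign(X_stream.sum(axis=1))

window, step = 200, 20          # remove/add `step` points per update
model = SVC(C=10.0, kernel="rbf", gamma=0.1)

for start in range(0, n_stream - window, step):
    X_win = X_stream[start:start + window]
    y_win = y_stream[start:start + window]
    # Naive baseline: full retraining on the current window.  The proposed
    # multiple incremental decremental algorithm would instead update the
    # previous dual variables directly as `step` points leave and `step`
    # points enter the training set.
    model.fit(X_win, y_win)
    acc = model.score(X_stream[start + window:start + window + step],
                      y_stream[start + window:start + window + step])
    print(f"window starting at {start}: next-chunk accuracy = {acc:.2f}")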

reference text

[1] G. Cauwenberghs and T. Poggio, “Incremental and decremental support vector machine learning,” in Advances in Neural Information Processing Systems (T. K. Leen, T. G. Dietterich, and V. Tresp, eds.), vol. 13, (Cambridge, Massachusetts), pp. 409–415, The MIT Press, 2001.

[2] M. Martin, “On-line support vector machines for function approximation,” tech. rep., Software Department, Universitat Politècnica de Catalunya, 2002.

[3] J. Ma and J. Theiler, “Accurate online support vector regression,” Neural Computation, vol. 15, no. 11, pp. 2683–2703, 2003.

[4] P. Laskov, C. Gehl, S. Krüger, and K.-R. Müller, “Incremental support vector learning: Analysis, implementation and applications,” Journal of Machine Learning Research, vol. 7, pp. 1909–1936, 2006.

[5] T. Hastie, S. Rosset, R. Tibshirani, and J. Zhu, “The entire regularization path for the support vector machine,” Journal of Machine Learning Research, vol. 5, pp. 1391–1415, 2004.

[6] L. Gunter and J. Zhu, “Efficient computation and model selection for the support vector regression,” Neural Computation, vol. 19, no. 6, pp. 1633–1655, 2007.

[7] G. Wang, D.-Y. Yeung, and F. H. Lochovsky, “A new solution path algorithm in support vector regression,” IEEE Transactions on Neural Networks, vol. 19, no. 10, pp. 1753–1767, 2008.

[8] E. N. Pistikopoulos, M. C. Georgiadis, and V. Dua, Process Systems Engineering: Volume 1: Multi-Parametric Programming. Wiley-VCH, 2007.

[9] J. R. Schott, Matrix Analysis for Statistics. Wiley-Interscience, 2005.

[10] C.-C. Chang and C.-J. Lin, “LIBSVM: a library for support vector machines,” 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.

[11] D. DeCoste and K. Wagstaff, “Alpha seeding for support vector machines,” in Proceedings of the International Conference on Knowledge Discovery and Data Mining, pp. 345–359, 2000.

[12] M. M. Lee, S. S. Keerthi, C. J. Ong, and D. DeCoste, “An efficient method for computing leave-one-out error in support vector machines,” IEEE Transactions on Neural Networks, vol. 15, no. 3, pp. 750–757, 2004.

[13] M. Meyer, “StatLib.” http://lib.stat.cmu.edu/index.php.

[14] L. Bottou and C.-J. Lin, “Support vector machine solvers,” in Large Scale Kernel Machines (L. Bottou, O. Chapelle, D. DeCoste, and J. Weston, eds.), pp. 301–320, Cambridge, MA: MIT Press, 2007.

[15] M. Karasuyama, I. Takeuchi, and R. Nakano, “Efficient leave-m-out cross-validation of support vector regression by generalizing decremental algorithm,” New Generation Computing, vol. 27, no. 4, Special Issue on Data-Mining and Statistical Science, pp. 307–318, 2009.