
44 nips-2012-Approximating Concavely Parameterized Optimization Problems


Source: pdf

Author: Joachim Giesen, Jens Mueller, Soeren Laue, Sascha Swiercy

Abstract: We consider an abstract class of optimization problems that are parameterized concavely in a single parameter, and show that the solution path along the parameter can always be approximated with accuracy ε > 0 by a set of size O(1/√ε). A lower bound of size Ω(1/√ε) shows that the upper bound is tight up to a constant factor. We also devise an algorithm that calls a step-size oracle and computes an approximate path of size O(1/√ε). Finally, we provide an implementation of the oracle for soft-margin support vector machines, and for a parameterized semi-definite program for matrix completion.
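
The path-approximation scheme described in the abstract lends itself to a simple generic loop: march along the parameter interval, solve the problem at the current parameter value, and ask the step-size oracle how far that solution remains ε-accurate. The Python sketch below illustrates this idea under assumed interfaces solve(t) and step_size_oracle(t, x, eps); these names are illustrative placeholders, not the paper's actual implementation.

# Minimal sketch of a generic path-approximation loop, assuming two
# user-supplied callables (hypothetical names, not the paper's API):
#   solve(t)                    -> approximate solution x at parameter t
#   step_size_oracle(t, x, eps) -> a step dt > 0 such that x stays
#                                  eps-accurate on [t, t + dt]
def approximate_path(t_min, t_max, eps, solve, step_size_oracle):
    """Return (parameter, solution) pairs whose piecewise-constant
    interpolation is an eps-approximate solution path on [t_min, t_max]."""
    path = []
    t = t_min
    while t < t_max:
        x = solve(t)                      # solve at the current parameter value
        dt = step_size_oracle(t, x, eps)  # how long does x remain eps-accurate?
        path.append((t, x))
        t += dt                           # oracle is assumed to return dt > 0
    return path

For the applications mentioned in the abstract, solve and step_size_oracle would wrap the respective solvers (e.g., a soft-margin SVM solver or a semi-definite programming solver for matrix completion); the paper's analysis bounds the number of loop iterations by O(1/√ε).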


reference text

[1] Chih-Chung Chang and Chih-Jen Lin. LIBSVM: A Library for Support Vector Machines. ACM Trans. Intell. Syst. Technol., 2(3):27:1–27:27, 2011.

[2] Olivier Chapelle. Training a Support Vector Machine in the Primal. Neural Computation, 19(5):1155–1178, 2007.

[3] Alexandre d’Aspremont, Francis R. Bach, and Laurent El Ghaoui. Full Regularization Path for Sparse Principal Component Analysis. In Proceedings of the International Conference on Machine Learning (ICML), pages 177–184, 2007.

[4] Jerome Friedman, Trevor Hastie, Holger Höfling, and Robert Tibshirani. Pathwise Coordinate Optimization. The Annals of Applied Statistics, 1(2):302–332, 2007.

[5] Bernd Gärtner, Martin Jaggi, and Clément Maria. An Exponential Lower Bound on the Complexity of Regularization Paths. arXiv.org, arXiv:0903.4817, 2010.

[6] Joachim Giesen, Martin Jaggi, and Sören Laue. Approximating Parameterized Convex Optimization Problems. In Proceedings of the Annual European Symposium on Algorithms (ESA), pages 524–535, 2010.

[7] Joachim Giesen, Martin Jaggi, and Sören Laue. Optimizing over the Growing Spectrahedron. In Proceedings of the Annual European Symposium on Algorithms (ESA), pages 503–514, 2012.

[8] Joachim Giesen, Martin Jaggi, and Sören Laue. Regularization Paths with Guarantees for Convex Semidefinite Optimization. In Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS), pages 432–439, 2012.

[9] Bin Gu, Jian-Dong Wang, Guan-Sheng Zheng, and Yue-Cheng Yu. Regularization Path for ν-Support Vector Classification. IEEE Transactions on Neural Networks and Learning Systems, 23(5):800–811, 2012.

[10] Trevor Hastie, Saharon Rosset, Robert Tibshirani, and Ji Zhu. The Entire Regularization Path for the Support Vector Machine. Journal of Machine Learning Research, 5:1391–1415, 2004.

[11] Martin Jaggi and Marek Sulovský. A Simple Algorithm for Nuclear Norm Regularized Problems. In Proceedings of the International Conference on Machine Learning (ICML), pages 471–478, 2010.

[12] Masayuki Karasuyama and Ichiro Takeuchi. Suboptimal Solution Path Algorithm for Support Vector Machine. In Proceedings of the International Conference on Machine Learning (ICML), pages 473–480, 2011.

[13] Sören Laue. A Hybrid Algorithm for Convex Semidefinite Optimization. In Proceedings of the International Conference on Machine Learning (ICML), 2012.

[14] Julien Mairal and Bin Yu. Complexity Analysis of the Lasso Regularization Path. In Proceedings of the International Conference on Machine Learning (ICML), 2012.

[15] A. Wayne Roberts and Dale Varberg. Convex functions. Academic Press, New York, 1973.

[16] Saharon Rosset and Ji Zhu. Piecewise linear regularized solution paths. Annals of Statistics, 35(3):1012–1030, 2007.

[17] Nathan Srebro, Jason D. M. Rennie, and Tommi Jaakkola. Maximum-Margin Matrix Factorization. In Proceedings of Advances in Neural Information Processing Systems 17 (NIPS), 2004.

[18] Zhi-li Wu, Aijun Zhang, Chun-hung Li, and Agus Sudjianto. Trace Solution Paths for SVMs via Parametric Quadratic Programming. In KDD Workshop: Data Mining Using Matrices and Tensors, 2008.