Author: Francis Maes
Abstract: In this paper we introduce NIEME, a machine learning library for large-scale classification, regression and ranking. NIEME relies on the framework of energy-based models (LeCun et al., 2006), which unifies several learning algorithms ranging from simple perceptrons to recent models such as the pegasos support vector machine or l1-regularized maximum entropy models. This framework also unifies batch and stochastic learning, both of which are seen as energy minimization problems. NIEME can hence be used in a wide range of situations, but it is particularly well suited to large-scale learning tasks where both the examples and the features are processed incrementally. Being able to handle new incoming features at any point during learning is another original feature of the NIEME toolbox. NIEME is released under the GPL license. It is efficiently implemented in C++, runs on Linux, Mac OS X and Windows, and provides interfaces for C++, Java and Python. Keywords: large-scale machine learning, classification, ranking, regression, energy-based models, machine learning software
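To make the stochastic energy-minimization view concrete, the following is a minimal sketch of the pegasos-style sub-gradient update for a linear SVM (Shalev-Shwartz et al., 2007), one of the models the abstract mentions. The function name and signature are illustrative only and do not reflect NIEME's actual API; the sketch uses plain Python lists for self-containment.

```python
import random

def pegasos_train(examples, lam=0.1, epochs=20, seed=0):
    """Pegasos-style stochastic sub-gradient training of a linear SVM.

    Hypothetical helper, not NIEME's API. `examples` is a list of
    (x, y) pairs where x is a list of floats and y is -1 or +1.
    Minimizes the regularized hinge-loss "energy"
        lam/2 * ||w||^2 + mean_i max(0, 1 - y_i <w, x_i>).
    """
    rng = random.Random(seed)
    dim = len(examples[0][0])
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        # visit the training examples in a fresh random order each epoch
        for x, y in rng.sample(examples, len(examples)):
            t += 1
            eta = 1.0 / (lam * t)  # pegasos step size schedule
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            # shrink the weights (sub-gradient of the L2 regularizer) ...
            w = [(1.0 - eta * lam) * wi for wi in w]
            # ... and step along the hinge-loss sub-gradient if violated
            if margin < 1.0:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w
```

Because each update touches a single example, this style of learner processes examples incrementally, which is the large-scale setting the abstract emphasizes.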
References:
G. Andrew and J. Gao. Scalable training of L1-regularized log-linear models. In Zoubin Ghahramani, editor, ICML 2007, pages 33–40. Omnipress, 2007.
Y. LeCun, S. Chopra, R. Hadsell, M. Ranzato, and F.-J. Huang. A tutorial on energy-based learning. In Predicting Structured Data. MIT Press, 2006.
D. C. Liu and J. Nocedal. On the limited memory BFGS method for large scale optimization. Mathematical Programming, 45(3):503–528, 1989.
M. Riedmiller and H. Braun. A direct adaptive method for faster backpropagation learning: The RPROP algorithm. In ICNN, pages 586–591, San Francisco, CA, 1993.
S. Shalev-Shwartz, Y. Singer, and N. Srebro. Pegasos: Primal estimated sub-gradient solver for SVM. In Zoubin Ghahramani, editor, ICML 2007, pages 807–814. Omnipress, 2007.
V. N. Vapnik. An overview of statistical learning theory. IEEE Transactions on Neural Networks, 10(5):988–999, 1999.