
57 jmlr-2005-Multiclass Boosting for Weak Classifiers


Source: pdf

Author: Günther Eibl, Karl-Peter Pfeiffer

Abstract: AdaBoost.M2 is a boosting algorithm designed for multiclass problems with weak base classifiers. The algorithm is designed to minimize a very loose bound on the training error. We propose two alternative boosting algorithms which also minimize bounds on performance measures. These performance measures are not as strongly connected to the expected error as the training error, but the derived bounds are tighter than the bound on the training error of AdaBoost.M2. In experiments, the methods have roughly the same performance in minimizing the training and test error rates. The new algorithms have the advantage that the base classifier should minimize the confidence-rated error, whereas for AdaBoost.M2 the base classifier should minimize the pseudo-loss. This makes them more easily applicable to existing base classifiers. The new algorithms also tend to converge faster than AdaBoost.M2. Keywords: boosting, multiclass, ensemble, classification, decision stumps
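
The contrast drawn in the abstract between the pseudo-loss and the confidence-rated error can be made concrete using the standard definitions from Freund and Schapire (1997) and Schapire and Singer (1999); the formulas below are a sketch under those definitions and need not match the paper's exact notation. Assume a confidence-rated base classifier $h_t(x, y) \in [0, 1]$ over the class labels and a training sample $(x_1, y_1), \dots, (x_n, y_n)$:

\[
\varepsilon_t^{\mathrm{pseudo}} = \frac{1}{2} \sum_{i=1}^{n} \sum_{y \neq y_i} D_t(i, y)\,\bigl(1 - h_t(x_i, y_i) + h_t(x_i, y)\bigr),
\qquad
\varepsilon_t^{\mathrm{conf}} = \sum_{i=1}^{n} D_t(i)\,\bigl(1 - h_t(x_i, y_i)\bigr).
\]

The pseudo-loss of AdaBoost.M2 requires the base learner to handle a weight $D_t(i, y)$ for every (example, incorrect label) pair, whereas the confidence-rated error only needs one weight $D_t(i)$ per example. This is the practical advantage stated in the abstract: a base classifier that accepts per-example weights can be reused directly by the new algorithms.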


Reference text

Erin L. Allwein, Robert E. Schapire, Yoram Singer. Reducing multiclass to binary: A unifying approach for margin classifiers. Journal of Machine Learning Research, 1:113–141, 2000.

Eric Bauer, Ron Kohavi. An empirical comparison of voting classification algorithms: bagging, boosting and variants. Machine Learning, 36:105–139, 1999.

Catherine Blake, Christopher J. Merz. UCI Repository of machine learning databases [http://www.ics.uci.edu/~mlearn/MLRepository.html]. Irvine, CA: University of California, Department of Information and Computer Science, 1998.

Thomas G. Dietterich, Ghulum Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263–286, 1995.

Günther Eibl, Karl-Peter Pfeiffer. Analysis of the performance of AdaBoost.M2 for the simulated digit-recognition-example. Machine Learning: Proceedings of the Twelfth European Conference, 109–120, 2001.

Günther Eibl, Karl-Peter Pfeiffer. How to make AdaBoost.M1 work for weak classifiers by changing only one line of the code. Machine Learning: Proceedings of the Thirteenth European Conference, 109–120, 2002.

Yoav Freund, Robert E. Schapire. Experiments with a new boosting algorithm. Machine Learning: Proceedings of the Thirteenth International Conference, 148–156, 1996.

Yoav Freund, Robert E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119–139, 1997.

Venkatesan Guruswami, Amit Sahai. Multiclass learning, boosting, and error-correcting codes. Proceedings of the Twelfth Annual Conference on Computational Learning Theory, 145–155, 1999.

Llew Mason, Peter L. Bartlett, Jonathan Baxter. Direct optimization of margins improves generalization in combined classifiers. Proceedings of NIPS 98, 288–294, 1998.

Llew Mason, Peter L. Bartlett, Jonathan Baxter, Marcus Frean. Functional gradient techniques for combining hypotheses. Advances in Large Margin Classifiers, 221–246, 1999.

Ross Quinlan. Bagging, boosting, and C4.5. Proceedings of the Thirteenth National Conference on Artificial Intelligence, 725–730, 1996.

Gunnar Rätsch, Bernhard Schölkopf, Alex J. Smola, Sebastian Mika, Takashi Onoda, Klaus R. Müller. Robust ensemble learning. Advances in Large Margin Classifiers, 207–220, 2000a.

Gunnar Rätsch, Takashi Onoda, Klaus R. Müller. Soft margins for AdaBoost. Machine Learning, 42(3):287–320, 2000b.

Robert E. Schapire, Yoav Freund, Peter L. Bartlett, Wee Sun Lee. Boosting the margin: A new explanation for the effectiveness of voting methods. Annals of Statistics, 26(5):1651–1686, 1998.

Robert E. Schapire, Yoram Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3):297–336, 1999.