jmlr jmlr2013 jmlr2013-8 jmlr2013-8-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Indraneel Mukherjee, Robert E. Schapire
Abstract: Boosting combines weak classifiers to form highly accurate predictors. Although the case of binary classification is well understood, in the multiclass setting the "correct" requirements on the weak classifier, and the notion of the most efficient boosting algorithms, are both missing. In this paper, we create a broad and general framework within which we make precise and identify the optimal requirements on the weak classifier, and design boosting algorithms that are, in a certain sense, the most effective under those requirements. Keywords: multiclass, boosting, weak learning condition, drifting games
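For orientation, the sketch below shows the classical boosting loop the abstract takes as background: reweight the training examples each round, call a weak learner, and combine the weak classifiers by weighted vote. This is a minimal AdaBoost.M1-style loop, not the optimal algorithms derived in the paper; the function names (`best_stump`, `adaboost_m1`) and the brute-force stump weak learner are illustrative choices, not anything from the source.

```python
import numpy as np

def best_stump(X, y, w, labels):
    """Tiny weak learner: exhaustively pick the single-feature threshold
    stump (feature j, threshold t, labels lo/hi) with lowest weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for lo in labels:
                for hi in labels:
                    pred = np.where(X[:, j] <= t, lo, hi)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, t, lo, hi)
    j, t, lo, hi = best
    return (lambda Z: np.where(Z[:, j] <= t, lo, hi)), best_err

def adaboost_m1(X, y, rounds=20):
    """Minimal AdaBoost.M1-style loop (illustrative, not the paper's method):
    reweight examples each round and combine stumps by weighted vote."""
    labels = np.unique(y)
    w = np.full(len(y), 1.0 / len(y))   # uniform initial example weights
    ensemble = []
    for _ in range(rounds):
        h, err = best_stump(X, y, w, labels)
        if err >= 0.5:                   # classical weak-learning condition fails
            break
        alpha = np.log((1.0 - err) / max(err, 1e-12))
        w = np.where(h(X) == y, w * np.exp(-alpha), w)  # downweight correct examples
        w /= w.sum()
        ensemble.append((alpha, h))

    def predict(Z):
        votes = np.zeros((len(Z), len(labels)))
        for alpha, h in ensemble:
            votes[np.arange(len(Z)), np.searchsorted(labels, h(Z))] += alpha
        return labels[votes.argmax(axis=1)]

    return predict

# Toy usage: three classes separated by their feature means.
rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=300)
X = rng.normal(scale=0.5, size=(300, 2)) + y[:, None]
predict = adaboost_m1(X, y)
print("training accuracy:", (predict(X) == y).mean())
```

The hard-wired `err >= 0.5` test is exactly the kind of weak-learning condition the paper revisits: for more than two classes, requiring error below 1/2 is stronger than necessary, and the paper's framework characterizes the minimal requirement on the weak classifier instead.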
Jacob Abernethy, Peter L. Bartlett, Alexander Rakhlin, and Ambuj Tewari. Optimal strategies and minimax lower bounds for online convex games. In COLT, pages 415–424, 2008.
Erin L. Allwein, Robert E. Schapire, and Yoram Singer. Reducing multiclass to binary: A unifying approach for margin classifiers. Journal of Machine Learning Research, 1:113–141, 2000.
Peter L. Bartlett and Mikhail Traskin. AdaBoost is consistent. Journal of Machine Learning Research, 8:2347–2368, 2007.
Peter L. Bartlett, Michael I. Jordan, and Jon D. McAuliffe. Convexity, classification, and risk bounds. Journal of the American Statistical Association, 101(473):138–156, March 2006.
Alina Beygelzimer, John Langford, and Pradeep Ravikumar. Error-correcting tournaments. In Algorithmic Learning Theory: 20th International Conference, pages 247–262, 2009.
Thomas G. Dietterich and Ghulum Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263–286, January 1995.
Günther Eibl and Karl-Peter Pfeiffer. Multiclass boosting for weak classifiers. Journal of Machine Learning Research, 6:189–210, 2005.
Yoav Freund. Boosting a weak learning algorithm by majority. Information and Computation, 121(2):256–285, 1995.
Yoav Freund. An adaptive version of the boost by majority algorithm. Machine Learning, 43(3):293–318, June 2001.
Yoav Freund and Manfred Opper. Continuous drifting games. Journal of Computer and System Sciences, pages 113–132, 2002.
Yoav Freund and Robert E. Schapire. Experiments with a new boosting algorithm. In Machine Learning: Proceedings of the Thirteenth International Conference, pages 148–156, 1996a.
Yoav Freund and Robert E. Schapire. Game theory, on-line prediction and boosting. In Proceedings of the Ninth Annual Conference on Computational Learning Theory, pages 325–332, 1996b.
Yoav Freund and Robert E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119–139, August 1997.
Trevor Hastie and Robert Tibshirani. Classification by pairwise coupling. Annals of Statistics, 26(2):451–471, 1998.
Vladimir Koltchinskii and Dmitriy Panchenko. Empirical margin distributions and bounding the generalization error of combined classifiers. Annals of Statistics, 30(1), February 2002.
Ping Li. Robust LogitBoost and adaptive base class (ABC) LogitBoost. In UAI, pages 302–311, 2010.
Philip M. Long and Rocco A. Servedio. Random classification noise defeats all convex potential boosters. Machine Learning, 78:287–304, 2010.
Indraneel Mukherjee and Robert E. Schapire. Learning with continuous experts using drifting games. Theoretical Computer Science, 411(29-30):2670–2683, June 2010.
Indraneel Mukherjee, Cynthia Rudin, and Robert E. Schapire. The rate of convergence of AdaBoost. In The 24th Annual Conference on Learning Theory, 2011.
Gunnar Rätsch and Manfred K. Warmuth. Efficient margin maximizing with boosting. Journal of Machine Learning Research, 6:2131–2152, 2005.
R. Tyrrell Rockafellar. Convex Analysis. Princeton University Press, 1970.
Robert E. Schapire. The strength of weak learnability. Machine Learning, 5(2):197–227, 1990.
Robert E. Schapire. Drifting games. Machine Learning, 43(3):265–291, June 2001.
Robert E. Schapire and Yoav Freund. Boosting: Foundations and Algorithms. MIT Press, 2012.
Robert E. Schapire and Yoram Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3):297–336, December 1999.
Robert E. Schapire and Yoram Singer. BoosTexter: A boosting-based system for text categorization. Machine Learning, 39(2/3):135–168, May/June 2000.
Robert E. Schapire, Yoav Freund, Peter Bartlett, and Wee Sun Lee. Boosting the margin: A new explanation for the effectiveness of voting methods. Annals of Statistics, 26(5):1651–1686, October 1998.
Ambuj Tewari and Peter L. Bartlett. On the consistency of multiclass classification methods. Journal of Machine Learning Research, 8:1007–1025, May 2007.
Tong Zhang. Statistical behavior and consistency of classification methods based on convex risk minimization. Annals of Statistics, 32(1):56–134, 2004.
Ji Zhu, Hui Zou, Saharon Rosset, and Trevor Hastie. Multi-class AdaBoost. Statistics and Its Interface, 2:349–360, 2009.