
NIPS 2000, Paper 68: Improved Output Coding for Classification Using Continuous Relaxation


Source: pdf

Authors: Koby Crammer, Yoram Singer

Abstract: Output coding is a general method for solving multiclass problems by reducing them to multiple binary classification problems. Previous research on output coding has employed, almost solely, predefined discrete codes. We describe an algorithm that improves the performance of output codes by relaxing them to continuous codes. The relaxation procedure is cast as an optimization problem and is reminiscent of the quadratic program for support vector machines. We describe experiments with the proposed algorithm, comparing it to standard discrete output codes. The experimental results indicate that continuous relaxations of output codes often improve the generalization performance, especially for short codes.
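The abstract packs the whole construction into a few sentences, so here is a minimal sketch of the discrete output-coding baseline the paper starts from. This is not the authors' code: the random code matrix, the scikit-learn LinearSVC base learner, and the function names train_output_code / predict_output_code are illustrative assumptions. The paper's contribution would replace the discrete rows of the code matrix M with continuous vectors found by an SVM-like quadratic program, while the decoding rule at the end stays the same.

```python
# Minimal sketch of discrete output coding (ECOC), the baseline the paper
# relaxes. Hypothetical code: the matrix construction, base learner, and
# names are assumptions, not the authors' implementation.
import numpy as np
from sklearn.svm import LinearSVC

def train_output_code(X, y, n_bits=15, seed=0):
    """Train one binary classifier per column of a random {-1,+1} code matrix."""
    classes = np.unique(y)
    rng = np.random.default_rng(seed)
    M = rng.choice([-1, 1], size=(len(classes), n_bits))
    # Resample any degenerate column whose bits are all equal (no binary split).
    for s in range(n_bits):
        while abs(M[:, s].sum()) == len(classes):
            M[:, s] = rng.choice([-1, 1], size=len(classes))
    learners = []
    for s in range(n_bits):
        # Relabel each example by its class's bit in column s, then fit a binary SVM.
        y_bin = M[np.searchsorted(classes, y), s]
        learners.append(LinearSVC().fit(X, y_bin))
    return classes, M, learners

def predict_output_code(X, classes, M, learners):
    """Decode by matching the classifier outputs against each class's code row."""
    # Real-valued margins; taking sign(F) would give the discrete bit predictions.
    F = np.column_stack([h.decision_function(X) for h in learners])
    # Pick the class whose code row correlates best with the outputs
    # (Hamming decoding on the signs is the classical discrete alternative).
    return classes[np.argmax(F @ M.T, axis=1)]
```

A typical call is classes, M, hs = train_output_code(X_train, y_train) followed by predict_output_code(X_test, classes, M, hs). With short codes (small n_bits) the discrete rows of M are coarse, which is the regime where the abstract reports the largest gains from the continuous relaxation.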


Reference text

[1] D. W. Aha and R. L. Bankert. Cloud classification using error-correcting output codes. In Artificial Intelligence Applications: Natural Science, Agriculture, and Environmental Science, volume 11, pages 13-28, 1997.

[2] E.L. Allwein, R.E. Schapire, and Y. Singer. Reducing multiclass to binary: A unifying approach for margin classifiers. In Machine Learning: Proceedings of the Seventeenth International Conference, 2000.

[3] A. Berger. Error-correcting output coding for text classification. In IJCAI'99 Workshop on Machine Learning for Information Filtering, 1999.

[4] Leo Breiman, Jerome H. Friedman, Richard A. Olshen, and Charles J. Stone. Classification and Regression Trees. Wadsworth & Brooks, 1984.

[5] William Cohen. Fast effective rule induction. In Proceedings of the Twelfth International Conference on Machine Learning, pages 115-123, 1995.

[6] Corinna Cortes and Vladimir Vapnik. Support-vector networks. Machine Learning, 20(3):273-297, September 1995.

[7] Koby Crammer and Yoram Singer. On the learnability and design of output codes for multiclass problems. In Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, 2000.

[8] Thomas G. Dietterich and Ghulum Bakiri. Achieving high-accuracy text-to-speech with machine learning. In Data Mining in Speech Synthesis, 1999.

[9] Thomas G. Dietterich and Ghulum Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263-286, January 1995.

[10] Tom Dietterich and Eun Bae Kong. Machine learning bias, statistical bias, and statistical variance of decision tree algorithms. Technical report, Oregon State University, 1995. Available via the WWW at http://www.cs.orst.edu:8001/~tgd/cv/tr.html.

[11] Trevor Hastie and Robert Tibshirani. Classification by pairwise coupling. The Annals of Statistics, 26(1):451-471, 1998.

[12] G. James and T. Hastie. The error coding method and PiCT. Journal of Computational and Graphical Statistics, 7(3):377-387, 1998.

[13] J.C. Platt, N. Cristianini, and J. Shawe-Taylor. Large margin DAGs for multiclass classification. In Advances in Neural Information Processing Systems 12. MIT Press, 2000. (To appear.)

[14] J. Ross Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, 1993.

[15] Robert E. Schapire. Using output codes to boost multiclass learning problems. In Machine Learning: Proceedings of the Fourteenth International Conference, pages 313-321, 1997.

[16] Robert E. Schapire and Yoram Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3):1-40, 1999.

[17] Vladimir N. Vapnik. Statistical Learning Theory. Wiley, 1998.