nips nips2004 nips2004-32 nips2004-32-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Ligen Wang, Balázs Kégl
Abstract: In this paper we propose to combine two powerful ideas, boosting and manifold learning. On the one hand, we improve AdaBoost by incorporating knowledge of the structure of the data into base classifier design and selection. On the other hand, we use AdaBoost's efficient learning mechanism to significantly improve supervised and semi-supervised algorithms proposed in the context of manifold learning. Besides the specific manifold-based penalization, the resulting algorithm also accommodates the boosting of a large family of regularized learning algorithms.
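The abstract describes penalizing base-classifier selection in AdaBoost with knowledge of the data's manifold structure. The paper's exact penalty is not given here, so the following is only a minimal sketch of one plausible instantiation: standard AdaBoost over decision stumps, where the stump chosen each round minimizes weighted error plus a graph-Laplacian smoothness term lam * h^T L h. The function names (`graph_laplacian`, `manifold_adaboost`), the Gaussian affinity graph, and the penalty weight `lam` are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def graph_laplacian(X, sigma=1.0):
    # Gaussian-affinity graph Laplacian L = D - W over the data points;
    # h^T L h measures how non-smooth a labeling h is along the graph.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def stump_predictions(X):
    # Enumerate all axis-aligned decision stumps (feature, threshold, polarity)
    # and return their +/-1 predictions on the training set, one row per stump.
    preds = []
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            p = np.where(X[:, j] <= t, 1.0, -1.0)
            preds.append(p)
            preds.append(-p)
    return np.array(preds)

def manifold_adaboost(X, y, L, n_rounds=10, lam=0.1):
    # AdaBoost loop with a hypothetical manifold penalty: each round selects
    # the stump minimizing (weighted error + lam * h^T L h / n), then does
    # the usual exponential weight update.
    n = len(y)
    w = np.full(n, 1.0 / n)
    H = stump_predictions(X)
    F = np.zeros(n)
    penalty = lam * np.einsum('ij,jk,ik->i', H, L, H) / n
    for _ in range(n_rounds):
        errs = (H * y[None, :] < 0) @ w        # weighted error of each stump
        k = np.argmin(errs + penalty)           # penalized base-classifier choice
        eps = min(max(errs[k], 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)
        F += alpha * H[k]
        w *= np.exp(-alpha * y * H[k])
        w /= w.sum()
    return np.sign(F)
```

Because the penalty depends only on the stump's predictions over all points, unlabeled data can be folded into the Laplacian without touching the weighted-error term, which is one way the abstract's semi-supervised setting could enter.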
[1] Y. Freund and R. E. Schapire, “A decision-theoretic generalization of on-line learning and an application to boosting,” Journal of Computer and System Sciences, vol. 55, pp. 119–139, 1997.
[2] L. Mason, P. Bartlett, J. Baxter, and M. Frean, “Boosting algorithms as gradient descent,” in Advances in Neural Information Processing Systems. 2000, vol. 12, pp. 512–518, The MIT Press.
[3] G. Rätsch, T. Onoda, and K.-R. Müller, “Soft margins for AdaBoost,” Machine Learning, vol. 42, no. 3, pp. 287–320, 2001.
[4] M. Belkin and P. Niyogi, “Semi-supervised learning on Riemannian manifolds,” Machine Learning, to appear, 2004.
[5] J. Shi and J. Malik, “Normalized cuts and image segmentation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 888–905, 2000.
[6] G. Rätsch and M. K. Warmuth, “Maximizing the margin with boosting,” in Proceedings of the 15th Conference on Computational Learning Theory, 2002.
[7] L. Breiman, “Prediction games and arcing classifiers,” Neural Computation, vol. 11, pp. 1493– 1518, 1999.
[8] R. E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, “Boosting the margin: a new explanation for the effectiveness of voting methods,” Annals of Statistics, vol. 26, no. 5, pp. 1651–1686, 1998.
[9] A. Antos, B. Kégl, T. Linder, and G. Lugosi, “Data-dependent margin-based generalization bounds for classification,” Journal of Machine Learning Research, pp. 73–98, 2002.
[10] R. E. Schapire and Y. Singer, “Improved boosting algorithms using confidence-rated predictions,” Machine Learning, vol. 37, no. 3, pp. 297–336, 1999.
[11] B. Kégl, “Robust regression by boosting the median,” in Proceedings of the 16th Conference on Computational Learning Theory, Washington, D.C., 2003, pp. 258–272.
[12] M. Belkin, I. Matveeva, and P. Niyogi, “Regression and regularization on large graphs,” in Proceedings of the 17th Conference on Computational Learning Theory, 2004.