
193 nips-2006-Tighter PAC-Bayes Bounds


Source: pdf

Author: Amiran Ambroladze, Emilio Parrado-Hernández, John S. Shawe-Taylor

Abstract: This paper proposes a PAC-Bayes bound to measure the performance of Support Vector Machine (SVM) classifiers. The bound is based on learning a prior over the distribution of classifiers from a part of the training samples. Experimental work shows that this bound is tighter than the original PAC-Bayes bound, making it a more accurate predictor of generalisation performance. In addition, it is shown that using this bound to estimate the hyperparameters of the classifier compares favourably with cross validation in terms of model accuracy, while greatly reducing the computational burden.
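
To make the mechanics behind the abstract concrete, below is a minimal sketch of how a PAC-Bayes bound of this kind is typically evaluated, assuming the Langford-style form kl(empirical Gibbs risk || true Gibbs risk) <= (KL(Q||P) + ln((m+1)/delta)) / m, inverted by binary search. This is not the paper's implementation: the function names and all numeric values below are illustrative assumptions.

import math


def binary_kl(q, p):
    """Bernoulli KL divergence kl(q || p), clamped away from 0 and 1."""
    eps = 1e-12
    q = min(max(q, eps), 1.0 - eps)
    p = min(max(p, eps), 1.0 - eps)
    return q * math.log(q / p) + (1.0 - q) * math.log((1.0 - q) / (1.0 - p))


def kl_inverse(emp_risk, rhs, tol=1e-9):
    """Largest p in [emp_risk, 1] with kl(emp_risk || p) <= rhs.

    kl(emp_risk || p) is increasing in p on this interval, so a plain
    binary search finds the inverse.
    """
    lo, hi = emp_risk, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binary_kl(emp_risk, mid) <= rhs:
            lo = mid
        else:
            hi = mid
    return lo


def pac_bayes_bound(emp_gibbs_risk, kl_qp, m, delta=0.05):
    """Upper bound on the true Gibbs risk, holding with probability
    >= 1 - delta over the draw of the m training examples."""
    rhs = (kl_qp + math.log((m + 1) / delta)) / m
    return kl_inverse(emp_gibbs_risk, rhs)


if __name__ == "__main__":
    m, emp_risk = 5000, 0.08  # illustrative numbers only
    # Fixed prior N(0, I): for a posterior N(mu * w / ||w||, I) around the
    # SVM weight direction, KL(Q || P) = mu^2 / 2.
    print(pac_bayes_bound(emp_risk, kl_qp=3.0 ** 2 / 2, m=m))
    # Prior learnt from a held-out part of the training data and centred
    # near the SVM direction: KL(Q || P) = ||mu_Q - mu_P||^2 / 2 can be
    # much smaller, which is the mechanism that tightens the bound.
    print(pac_bayes_bound(emp_risk, kl_qp=0.3, m=m))

Since the only quantity that differs between the two calls is KL(Q||P), learning the prior from held-out data tightens the bound exactly to the extent that the learnt prior lands close to the posterior; the same mechanism makes the bound cheap to evaluate across hyperparameter settings, since no repeated retraining over folds is needed.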


reference text

[1] C. L. Blake and C. J. Merz. UCI repository of machine learning databases. University of California, Irvine, Dept. of Information and Computer Sciences, [http://www.ics.uci.edu/~mlearn/MLRepository.html], 1998.

[2] B. E. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. In Computational Learning Theory, pages 144–152, 1992.

[3] J. Langford. Tutorial on practical prediction theory for classification. Journal of Machine Learning Research, 6(Mar):273–306, 2005.

[4] J. Langford and J. Shawe-Taylor. PAC-Bayes & margins. In Advances in Neural Information Processing Systems, volume 14, Cambridge, MA, 2002. MIT Press.

[5] D. McAllester. PAC-Bayesian stochastic model selection. Machine Learning, 51(1):5–21, 2003.

[6] M. Seeger. PAC-Bayesian generalization error bounds for Gaussian process classification. Journal of Machine Learning Research, 3:233–269, 2002.

[7] J. Shawe-Taylor, P. L. Bartlett, R. C. Williamson, and M. Anthony. Structural risk minimization over data-dependent hierarchies. IEEE Transactions on Information Theory, 44(5):1926–1940, 1998.