nips nips2008 nips2008-84 nips2008-84-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Mark Herbster, Massimiliano Pontil, Sergio R. Galeano
Abstract: Given an n-vertex weighted tree with structural diameter S and a subset of m vertices, we present a technique to compute a corresponding m × m Gram matrix of the pseudoinverse of the graph Laplacian in O(n + m2 + mS) time. We discuss the application of this technique to fast label prediction on a generic graph. We approximate the graph with a spanning tree and then we predict with the kernel perceptron. We address the approximation of the graph with either a minimum spanning tree or a shortest path tree. The fast computation of the pseudoinverse enables us to address prediction problems on large graphs. We present experiments on two web-spam classification tasks, one of which includes a graph with 400,000 vertices and more than 10,000,000 edges. The results indicate that the accuracy of our technique is competitive with previous methods using the full graph information.
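The kernel in the abstract is the Moore-Penrose pseudoinverse L+ of the tree Laplacian, whose entries are tied to resistance distances: on a tree, r(i,j) = L+[i,i] + L+[j,j] - 2·L+[i,j] equals the sum of 1/weight along the unique i-j path. The sketch below is a naive numerical illustration of that identity on a hypothetical 4-vertex toy tree, not the paper's O(n + m² + mS) algorithm; the edge list and weights are invented for the example.

```python
import numpy as np

# Hypothetical toy tree: path 0-1-2-3 with edge weights (invented for illustration).
edges = [(0, 1, 2.0), (1, 2, 1.0), (2, 3, 4.0)]
n = 4

# Build the weighted graph Laplacian L = D - W.
L = np.zeros((n, n))
for i, j, w in edges:
    L[i, i] += w
    L[j, j] += w
    L[i, j] -= w
    L[j, i] -= w

# The Moore-Penrose pseudoinverse L+ plays the role of the Gram (kernel) matrix.
K = np.linalg.pinv(L)

# On a tree, the resistance distance r(0,3) = K[0,0] + K[3,3] - 2*K[0,3]
# equals the series sum of 1/weight along the unique 0-3 path.
r03 = K[0, 0] + K[3, 3] - 2 * K[0, 3]
path_sum = 1 / 2.0 + 1 / 1.0 + 1 / 4.0
print(round(r03, 6), round(path_sum, 6))  # both 1.75
```

This path-sum structure is what lets the paper evaluate the m × m Gram matrix from tree path computations instead of inverting the full n × n Laplacian.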
[1] J. Abernethy, O. Chapelle and C. Castillo. Webspam Identification Through Content and Hyperlinks. Proc. Adversarial Information Retrieval on Web, 2008.
[2] A. Argyriou, M. Herbster, and M. Pontil. Combining graph Laplacians for semi-supervised learning. Advances in Neural Information Processing Systems 17. MIT Press, Cambridge, MA, 2005.
[3] M. Belkin, I. Matveeva, and P. Niyogi. Regularization and Semi-supervised Learning on Large Graphs. Proceedings of the 17th Conference on Learning Theory (COLT '04), pages 624–638, 2004.
[4] C. Biemann. Chinese Whispers – an Efficient Graph Clustering Algorithm and its Application to Natural Language Processing Problems. Proc. HLT-NAACL-06 Workshop on Textgraphs-06, 2006.
[5] A. Blum, J. Lafferty, M. R. Rwebangira, and R. Reddy. Semi-supervised learning using randomized mincuts. Proc. 21st International Conference on Machine Learning, page 13, 2004.
[6] U. Brandes and D. Fleischer. Centrality measures based on current flow. Proc. 22nd Annual Symposium on Theoretical Aspects of Computer Science, pages 533–544, 2005.
[7] C. Castillo, B. D. Davison, L. Denoyer and P. Gallinari. Proc. of the Graph Labelling Workshop and Web-spam Challenge (ECML Workshop), 2007.
[8] T. H. Cormen, C. E. Leiserson, and R. L. Rivest. Introduction to Algorithms. MIT Press, 1990.
[9] P. Drineas and M. W. Mahoney. On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning. J. Mach. Learn. Res., 6:2153–2175, 2005.
[10] A. Ghosh, S. Boyd and A. Saberi. Minimizing Effective Resistance of a Graph. SIAM Review, problems and techniques section, 50(1):37-66, 2008.
[11] T. Jebara. Bayesian Out-Trees. Proc. Uncertainty in Artificial Intelligence, 2008.
[12] R. E. Haymond, J. Jarvis and D. R. Shier. Algorithm 613: Minimum Spanning Tree for Moderate Integer Weights. ACM Trans. Math. Softw., 10(1):108–111, 1984.
[13] M. Herbster and M. Pontil. Prediction on a graph with a perceptron. Advances in Neural Information Processing Systems 19, pages 577–584. MIT Press, 2007.
[14] M. Herbster, M. Pontil, and L. Wainer. Online learning over graphs. In ICML ’05: Proceedings of the 22nd international conference on Machine learning, pages 305–312, 2005.
[15] N.-D. Ho and P. V. Dooren. On the pseudo-inverse of the Laplacian of a bipartite graph. Appl. Math. Lett., 18(8):917–922, 2005.
[16] D. Klein and M. Randić. Resistance distance. J. of Mathematical Chemistry, 12(1):81–95, 1993.
[17] M. E. J. Newman. A measure of betweenness centrality based on random walks. Soc. Networks, 27:39–54, 2005.
[18] D. A. Spielman and S.-H. Teng. Nearly-linear time algorithms for graph partitioning, graph sparsification, and solving linear systems. Proc. 36th Annual ACM Symposium on Theory of Computing, 2004.
[19] C. K. I. Williams and M. Seeger. Using the Nyström Method to Speed Up Kernel Machines. Neural Information Processing Systems 13, pages 682–688, MIT Press, 2001.
[20] X. Zhu, J. Lafferty, and Z. Ghahramani. Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions. Proc. of the 20th International Conference on Machine Learning, pages 912–919, 2003.