NIPS 2012 · nips2012-196 · reference knowledge graph (maker-knowledge-mining)
Source: pdf
Author: Xiao-ming Wu, Zhenguo Li, Anthony M. So, John Wright, Shih-fu Chang
Abstract: We propose a novel stochastic process that, with probability αi, is absorbed at the current state i, and with probability 1 − αi follows a random edge out of it. We analyze its properties and show its potential for exploring graph structures. We prove that, under proper absorption rates, a random walk starting from a set S of low conductance will be mostly absorbed in S. Moreover, the absorption probabilities vary slowly inside S while dropping sharply outside it, thus implementing the desirable cluster assumption for graph-based learning. Remarkably, the partially absorbing process unifies many popular models arising in a variety of contexts, provides new insights into them, and makes it possible to transfer findings from one paradigm to another. Simulation results demonstrate its promising applications in retrieval and classification.
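The absorption behavior described in the abstract can be illustrated numerically. The sketch below (not the authors' code) parameterizes each state with an absorption weight λi, so that the walk at state i is absorbed with probability αi = λi / (λi + di) and otherwise moves along a random edge; under this formulation the matrix of absorption probabilities admits the closed form A = (Λ + L)⁻¹Λ, where L = D − W is the unnormalized graph Laplacian and Λ = diag(λ1, …, λn). The toy graph, the choice λi = 0.5, and the helper name `parw_absorption` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def parw_absorption(W, lam):
    """A[i, j] = probability that a walk started at i is absorbed at j.

    Closed form A = (Lambda + L)^{-1} Lambda, with L = D - W the
    unnormalized graph Laplacian (illustrative sketch, not the paper's code).
    """
    d = W.sum(axis=1)
    L = np.diag(d) - W          # unnormalized graph Laplacian
    Lam = np.diag(lam)          # per-state absorption weights lambda_i
    return np.linalg.solve(Lam + L, Lam)

# Two triangles joined by one weak edge: a low-conductance cut, so a walk
# started inside one triangle should be absorbed mostly within it.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.05        # weak bridge between the two clusters

A = parw_absorption(W, lam=np.full(6, 0.5))
# Rows of A sum to 1 (the walk is absorbed somewhere almost surely), and
# most of the mass from node 0 stays inside its cluster {0, 1, 2}.
```

Note that A·1 = 1 follows directly from L·1 = 0, so each row is a probability distribution over absorbing states; the sharp drop of row 0's mass across the weak bridge is exactly the cluster behavior the abstract claims.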
[1] R. Andersen and F. Chung. Detecting sharp drops in pagerank and a simplified local partitioning algorithm. Theory and Applications of Models of Computation, pages 1–12, 2007.
[2] R. Andersen, F. Chung, and K. Lang. Local graph partitioning using pagerank vectors. In FOCS, pages 475–486, 2006.
[3] Y. Bengio, O. Delalleau, and N. Le Roux. Label propagation and quadratic criterion. In Semi-Supervised Learning, pages 193–216, 2006.
[4] P. Berkhin. Bookmark-coloring algorithm for personalized pagerank computing. Internet Mathematics, 3(1):41–62, 2006.
[5] O. Chapelle and A. Zien. Semi-supervised classification by low density separation. In AISTATS, 2005.
[6] F. Chung. Spectral Graph Theory. American Mathematical Society, 1997.
[7] R. Coifman and S. Lafon. Diffusion maps. Applied and Computational Harmonic Analysis, 21(1):5–30, 2006.
[8] F. Fouss, A. Pirotte, J. Renders, and M. Saerens. Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation. IEEE Transactions on Knowledge and Data Engineering, 19(3):355–369, 2007.
[9] J. Kemeny and J. Snell. Finite Markov Chains. Springer, 1976.
[10] B. Kveton, M. Valko, A. Rahimi, and L. Huang. Semi-supervised learning with max-margin graph cuts. In AISTATS, pages 421–428, 2010.
[11] L. Lovász and M. Simonovits. The mixing rate of Markov chains, an isoperimetric inequality, and computing the volume. In FOCS, pages 346–354, 1990.
[12] M. Meila and J. Shi. A random walks view of spectral segmentation. In AISTATS, 2001.
[13] B. Nadler, N. Srebro, and X. Zhou. Statistical analysis of semi-supervised learning: The limit of infinite unlabelled data. In NIPS, pages 1330–1338, 2009.
[14] L. Page, S. Brin, R. Motwani, and T. Winograd. The pagerank citation ranking: Bringing order to the web. 1999.
[15] D. A. Spielman and S.-H. Teng. A local clustering algorithm for massive graphs and its application to nearly-linear time graph partitioning. CoRR, abs/0809.3232, 2008.
[16] U. Von Luxburg, A. Radl, and M. Hein. Hitting and commute times in large graphs are often misleading. arXiv preprint arXiv:1003.1266, 2010.
[17] D. Zhou, O. Bousquet, T. Lal, J. Weston, and B. Schölkopf. Learning with local and global consistency. In NIPS, pages 595–602, 2004.
[18] D. Zhou, J. Weston, A. Gretton, O. Bousquet, and B. Schölkopf. Ranking on data manifolds. In NIPS, 2004.
[19] X. Zhu and Z. Ghahramani. Learning from labeled and unlabeled data with label propagation. Technical Report CMU-CALD-02-107, Carnegie Mellon University, 2002.
[20] X. Zhu, Z. Ghahramani, and J. Lafferty. Semi-supervised learning using gaussian fields and harmonic functions. In ICML, 2003.