nips nips2005 nips2005-178 nips2005-178-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Kai Yu, Shipeng Yu, Volker Tresp
Abstract: We propose a simple clustering framework on graphs encoding pairwise data similarities. Unlike standard similarity-based methods, the approach softly assigns data to clusters in a probabilistic way. More importantly, a hierarchical clustering is naturally derived in this framework by gradually merging lower-level clusters into higher-level ones. A random walk analysis indicates that the algorithm exposes clustering structure at various resolutions, i.e., a higher level statistically models a longer-term diffusion on the graph and thus discovers a more global clustering structure. Finally, we provide very encouraging experimental results.
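The abstract's random-walk view, in which longer diffusion on the similarity graph reveals coarser cluster structure, can be illustrated with a minimal sketch. This is not the paper's algorithm: the Gaussian similarity, the bandwidth, and the within/between ratio printed below are illustrative assumptions; only the row-normalized transition matrix P = D^{-1}W and its powers P^t correspond to the diffusion idea the abstract mentions.

import numpy as np

def transition_matrix(W):
    # Row-normalize a symmetric similarity matrix W into a random-walk
    # transition matrix P = D^{-1} W, where D is the diagonal degree matrix.
    d = W.sum(axis=1)
    return W / d[:, None]

def diffused_similarity(W, t):
    # t-step diffusion P^t: larger t averages over longer walks and thus
    # reflects progressively more global structure of the graph.
    P = transition_matrix(W)
    return np.linalg.matrix_power(P, t)

# Toy example (illustrative only): two loose groups of three points on a line.
X = np.array([0.0, 0.1, 0.2, 1.0, 1.1, 1.2])
W = np.exp(-(X[:, None] - X[None, :]) ** 2 / 0.05)  # Gaussian similarities
np.fill_diagonal(W, 0.0)                             # no self-loops

for t in (1, 4, 16):
    Pt = diffused_similarity(W, t)
    # Average t-step transition probability within the first group
    # versus from the first group to the second group.
    print(f"t={t}: within/between ratio =",
          Pt[:3, :3].mean() / Pt[:3, 3:].mean())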
[1] J. Goldberger and S. Roweis. Hierarchical clustering of a mixture model. In L. K. Saul, Y. Weiss, and L. Bottou, editors, Advances in Neural Information Processing Systems 17 (NIPS*04), pages 505–512, 2005.
[2] K.A. Heller and Z. Ghahramani. Bayesian hierarchical clustering. In Proceedings of the 22nd International Conference on Machine Learning, pages 297–304, 2005.
[3] S. D. Kamvar, D. Klein, and C. D. Manning. Interpreting and extending classical agglomerative clustering algorithms using a model-based approach. In Proceedings of the 19th International Conference on Machine Learning, pages 283–290, 2002.
[4] D. D. Lee and H. S. Seung. Algorithms for non-negative matrix factorization. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13 (NIPS*00), pages 556–562, 2001.
[5] J. Shi and J. Malik. Normalized cuts and image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(8):888–905, 2000.
[6] D. Zhou, B. Schölkopf, and T. Hofmann. Semi-supervised learning on directed graphs. In L. K. Saul, Y. Weiss, and L. Bottou, editors, Advances in Neural Information Processing Systems 17 (NIPS*04), pages 1633–1640, 2005.