
170 nips-2001-Spectral Kernel Methods for Clustering


Source: pdf

Author: Nello Cristianini, John Shawe-Taylor, Jaz S. Kandola

Abstract: In this paper we introduce new algorithms for unsupervised learning based on the use of a kernel matrix. All the information required by such algorithms is contained in the eigenvectors of the matrix or of closely related matrices. We use two different but related cost functions, the Alignment and the 'cut cost'. The first one is discussed in a companion paper [3], the second one is based on graph theoretic concepts. Both functions measure the level of clustering of a labeled dataset, or the correlation between data clusters and labels. We state the problem of unsupervised learning as assigning labels so as to optimize these cost functions. We show how the optimal solution can be approximated by slightly relaxing the corresponding optimization problem, and how this corresponds to using eigenvector information. The resulting simple algorithms are tested on real-world data with positive results.
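The abstract's recipe (form a kernel matrix, relax the discrete labelling problem, read an approximate solution off an eigenvector) can be illustrated with a minimal sketch. The snippet below assumes an RBF kernel, a two-cluster split, and a threshold placed at the widest gap in the dominant eigenvector; these choices are illustrative stand-ins, not the paper's exact procedure, which selects labels by optimizing the alignment or cut cost.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def spectral_two_cluster(K):
    # Relaxation: treat the dominant eigenvector of K as a real-valued
    # surrogate for the +1/-1 label vector, then recover discrete labels
    # by thresholding at the widest gap in its sorted entries
    # (an illustrative rule; the paper chooses labels by optimizing
    # alignment or cut cost).
    _, eigvecs = np.linalg.eigh(K)      # eigenvalues in ascending order
    v = eigvecs[:, -1]                  # eigenvector of the largest eigenvalue
    order = np.argsort(v)
    gaps = np.diff(v[order])
    cut = int(np.argmax(gaps))          # position of the widest gap
    threshold = (v[order[cut]] + v[order[cut + 1]]) / 2.0
    return np.where(v >= threshold, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two well-separated Gaussian blobs in the plane.
    X = np.vstack([rng.normal(0.0, 0.3, size=(20, 2)),
                   rng.normal(3.0, 0.3, size=(20, 2))])
    labels = spectral_two_cluster(rbf_kernel(X, gamma=0.5))
    print(labels)
```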


reference text

[1] M.W. Berry, B. Hendrickson, and P. Raghavan. Sparse matrix reordering schemes for browsing hypertext. In The Mathematics of Numerical Analysis, pages 99-123. AMS, 1996.

[2] N. Cristianini and J. Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge University Press, 2000. See also the web site www.support-vector.net.

[3] Nello Cristianini, Andre Elisseeff, John Shawe-Taylor, and Jaz Kandola. On kernel-target alignment. Submitted to Proceedings of Neural Information Processing Systems (NIPS), 2001.

[4] Nello Cristianini, Huma Lodhi, and John Shawe-Taylor. Latent semantic kernels for feature selection. Technical Report NC-TR-00-080, NeuroCOLT Working Group, http://www.neurocolt.org, 2000.

[5] A. Pothen, H. Simon, and K. Liou. Partitioning sparse matrices with eigenvectors of graphs. SIAM J. Matrix Anal., 11(3):430-452, 1990.