NIPS 2004, Paper 77
Source: PDF
Authors: Jacob Goldberger, Sam T. Roweis
Abstract: In this paper we propose an efficient algorithm for reducing a large mixture of Gaussians into a smaller mixture while still preserving the component structure of the original model; this is achieved by clustering (grouping) the components. The method minimizes a new, easily computed distance measure between two Gaussian mixtures that can be motivated from a suitable stochastic model, and the iterations of the algorithm use only the model parameters, avoiding the need for explicit resampling of datapoints. We demonstrate the method by performing hierarchical clustering of scenery images and handwritten digits.
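The abstract describes a parameter-only reduction: group the components of a large mixture, replace each group by a single moment-matched Gaussian, and iterate using a KL-based distance between components. A minimal sketch of this style of algorithm is below; the function names, the hard-assignment loop, and the empty-cluster handling are my assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def kl_gauss(m0, S0, m1, S1):
    """Closed-form KL divergence KL(N(m0,S0) || N(m1,S1))."""
    d = len(m0)
    S1inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1inv @ S0) + diff @ S1inv @ diff
                  - d + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def reduce_mixture(weights, means, covs, k, n_iter=20, seed=0):
    """Sketch: cluster the n components of a Gaussian mixture into k
    groups and replace each group by its moment-matched Gaussian.
    weights: (n,), means: (n, d), covs: (n, d, d)."""
    rng = np.random.default_rng(seed)
    n = len(weights)
    assign = rng.integers(0, k, size=n)  # random initial grouping (assumption)
    for _ in range(n_iter):
        # Refit step: moment-match one Gaussian per cluster.
        w, mu, S = [], [], []
        for j in range(k):
            idx = np.flatnonzero(assign == j)
            if idx.size == 0:
                idx = np.array([rng.integers(n)])  # revive an empty cluster
            wj = weights[idx].sum()
            pj = weights[idx] / wj                 # within-cluster weights
            mj = pj @ means[idx]                   # merged mean
            Sj = sum(p * (covs[i] + np.outer(means[i] - mj, means[i] - mj))
                     for p, i in zip(pj, idx))     # merged covariance
            w.append(wj); mu.append(mj); S.append(Sj)
        # Regroup step: send each component to its KL-nearest reduced Gaussian.
        new = np.array([np.argmin([kl_gauss(means[i], covs[i], mu[j], S[j])
                                   for j in range(k)]) for i in range(n)])
        if np.array_equal(new, assign):
            break
        assign = new
    return np.array(w), np.array(mu), np.array(S)
```

Because both steps operate only on the mixture parameters (weights, means, covariances), no data points are resampled, which is the property the abstract emphasizes.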
[1] Y. Bar-Shalom and X. Li. Estimation and tracking: principles, techniques and software. Artech House, 1993.
[2] S. Gordon, H. Greenspan, and J. Goldberger. Applying the information bottleneck principle to unsupervised clustering of discrete and continuous image representations. In ICCV, 2003.
[3] U. Lerner, R. Parr, D. Koller, and G. Biswas. Bayesian fault detection and diagnosis in dynamic systems. In AAAI/IAAI, pp. 531–537, 2000.
[4] J. Puzicha, T. Hofmann, and J. Buhmann. Histogram clustering for unsupervised segmentation and image retrieval. Pattern Recognition Letters, 20(9):899–909, 1999.
[5] N. Shental, A. Bar-Hillel, T. Hertz, and D. Weinshall. Computing Gaussian mixture models with EM using equivalence constraints. In Proc. of Neural Information Processing Systems, 2003.
[6] N. Slonim and Y. Weiss. Maximum likelihood and the information bottleneck. In Proc. of Neural Information Processing Systems, 2003.
[7] E. Sudderth, A. Ihler, W. Freeman, and A. Willsky. Nonparametric belief propagation. In CVPR, 2003.
[8] N. Tishby, F. Pereira, and W. Bialek. The information bottleneck method. In Proc. of the 37th Annual Allerton Conference on Communication, Control and Computing, pp. 368–377, 1999.
[9] N. Vasconcelos and A. Lippman. Learning mixture hierarchies. In Proc. of Neural Information Processing Systems, 1998.
[10] J. Vermaak, A. Doucet, and P. Perez. Maintaining multi-modality through mixture tracking. In Int. Conf. on Computer Vision, 2003.
[11] K. Wagstaff, C. Cardie, S. Rogers, and S. Schrödl. Constrained k-means clustering with background knowledge. In Proc. Int. Conf. on Machine Learning, 2001.
[12] E. P. Xing, A. Y. Ng, M. I. Jordan, and S. Russell. Distance metric learning, with application to clustering with side-information. In Proc. of Neural Information Processing Systems, 2003.