NIPS 2007, paper 196
Author: Michalis K. Titsias
Abstract: We present a probability distribution over non-negative integer-valued matrices with a possibly infinite number of columns. We also derive a stochastic process that reproduces this distribution over equivalence classes. This model can play the role of the prior in nonparametric Bayesian learning scenarios where multiple latent features are associated with the observed data and each feature can have multiple appearances or occurrences within each data point. Such data arise naturally when learning visual object recognition systems from unlabelled images. Together with the nonparametric prior, we consider a likelihood model that explains the visual appearance and location of local image patches. Inference with this model is carried out using a Markov chain Monte Carlo algorithm.
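As a rough illustration of the prior the abstract describes (a sketch under assumed conventions, not the paper's code): the snippet below draws a non-negative integer-valued feature matrix from a finite K-feature Gamma-Poisson construction, whose K → ∞ limit yields an infinite-column model of the kind described above. The parameterization (per-feature rates λ_k ~ Gamma(α/K, 1), counts z_nk ~ Poisson(λ_k)) and the function name are assumptions based on the standard finite approximation, not taken from the paper.

    import numpy as np

    def sample_gamma_poisson_matrix(N, K, alpha, rng=None):
        # Finite-K sketch of a Gamma-Poisson feature-matrix prior
        # (assumed parameterization; the infinite model is the K -> inf limit).
        rng = np.random.default_rng() if rng is None else rng
        lam = rng.gamma(alpha / K, 1.0, size=K)  # per-feature rates lambda_k
        Z = rng.poisson(lam, size=(N, K))        # counts z_nk ~ Poisson(lambda_k)
        return Z

    Z = sample_gamma_poisson_matrix(N=10, K=1000, alpha=2.0)
    active = Z[:, Z.sum(axis=0) > 0]  # keep only features that actually occur
    print(active.shape)

For large K and moderate α, almost all columns are identically zero, so only finitely many features are active in any finite dataset; this is the behavior that the equivalence-class construction in the paper formalizes.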
[1] C. Antoniak. Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems. The Annals of Statistics, 2(6):1152–1174, 1974.
[2] D. M. Blei, A. Y. Ng, and M. I. Jordan. Latent Dirichlet allocation. Journal of Machine Learning Research, 3:993–1022, 2003.
[3] W. Buntine and A. Jakulin. Applying discrete PCA in data analysis. In UAI, 2004.
[4] J. Canny. GaP: A factor model for discrete data. In SIGIR, pages 122–129. ACM Press, 2004.
[5] W. Ewens. The sampling theory of selectively neutral alleles. Theoretical Population Biology, 3:87–112, 1972.
[6] P. Green and S. Richardson. Modelling heterogeneity with and without the Dirichlet process. Scandinavian Journal of Statistics, 28:355–377, 2001.
[7] T. Griffiths and Z. Ghahramani. Infinite latent feature models and the Indian buffet process. In NIPS 18, 2006.
[8] D. G. Lowe. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2):91–110, 2004.
[9] R. M. Neal. Bayesian mixture modeling. In 11th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis, pages 197–211, 1992.
[10] M. A. Newton and A. E. Raftery. Approximate Bayesian inference with the weighted likelihood bootstrap. Journal of the Royal Statistical Society, Series B, 56(1):3–48, 1994.
[11] E. Saund. A multiple cause mixture model for unsupervised learning. Neural Computation, 7:51–71, 1995.
[12] E. Sudderth, A. Torralba, W. T. Freeman, and A. Willsky. Describing visual scenes using transformed Dirichlet processes. In NIPS 18, 2006.
Footnote 3: available from http://l2r.cs.uiuc.edu/~cogcomp/Data/Car/.