
105 nips-2007-Infinite State Bayes-Nets for Structured Domains


Source: pdf

Author: Max Welling, Ian Porteous, Evgeniy Bart

Abstract: A general modeling framework is proposed that unifies nonparametric Bayesian models, topic models, and Bayesian networks. This class of infinite state Bayes nets (ISBNs) can be viewed as directed networks of ‘hierarchical Dirichlet processes’ (HDPs) where the domain of the variables can be structured (e.g., words in documents or features in images). We show that collapsed Gibbs sampling can be done efficiently in these models by leveraging the structure of the Bayes net and using the forward-filtering-backward-sampling algorithm for junction trees. Existing models, such as the nested DP, Pachinko allocation, and mixed membership stochastic block models, as well as a number of new models, are described as ISBNs. Two experiments are presented to illustrate these ideas.
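
The sampling step named in the abstract, forward-filtering-backward-sampling, is easiest to see in its base case: drawing one joint posterior sample of the hidden states of a finite-state Markov chain. The sketch below (Python/NumPy) illustrates only that base case; the paper's sampler runs a collapsed version of this recursion over junction trees of HDPs, so the fixed state count K, the function name ffbs, and its arguments are illustrative assumptions rather than the paper's interface.

import numpy as np

def ffbs(A, lik, pi, rng=None):
    """Draw one joint posterior sample z_{1:T} for a hidden Markov chain.

    A   : (K, K) transition matrix, A[i, j] = p(z_{t+1}=j | z_t=i)
    lik : (T, K) observation likelihoods, lik[t, k] = p(x_t | z_t=k)
    pi  : (K,)   initial state distribution
    """
    if rng is None:
        rng = np.random.default_rng()
    T, K = lik.shape
    alpha = np.empty((T, K))
    # Forward filtering: alpha[t] is proportional to p(z_t | x_{1:t}).
    alpha[0] = pi * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        alpha[t] /= alpha[t].sum()
    # Backward sampling: draw z_T from the final filtered distribution, then
    # z_t from p(z_t | z_{t+1}, x_{1:t}), proportional to alpha[t] * A[:, z_{t+1}].
    z = np.empty(T, dtype=int)
    z[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * A[:, z[t + 1]]
        z[t] = rng.choice(K, p=w / w.sum())
    return z

One call returns a whole trajectory drawn from p(z_{1:T} | x_{1:T}), which is what makes the algorithm attractive inside a Gibbs sweep: an entire chain (or, in the paper, a sequence of junction-tree cliques) is resampled as a block rather than one variable at a time.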


reference text

[1] D. M. Blei, A. Y. Ng, and M. I. Jordan. Latent Dirichlet allocation. Journal of Machine Learning Research, 3:993–1022, 2003.

[2] Y. W. Teh, M. I. Jordan, M. J. Beal, and D. M. Blei. Hierarchical Dirichlet processes. Journal of the American Statistical Association, 101(476):1566–1581, 2006.

[3] S. L. Scott. Bayesian methods for hidden Markov models: Recursive computing in the 21st century. Journal of the American Statistical Association, 97:337–351, 2002.

[4] T. Minka. Estimating a Dirichlet distribution. Technical report, 2000.

[5] M.D. Escobar and M. West. Bayesian density estimation and inference using mixtures. Journal of the American Statistical Association, 90:577–588, 1995.

[6] T.L. Griffiths and M. Steyvers. A probabilistic approach to semantic representation. In Proceedings of the 24th Annual Conference of the Cognitive Science Society, 2002.

[7] Y. W. Teh, D. Newman, and M. Welling. A collapsed variational Bayesian inference algorithm for latent Dirichlet allocation. In Advances in Neural Information Processing Systems 19, 2006.

[8] C. E. Antoniak. Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems. The Annals of Statistics, 2:1152–1174, 1974.

[9] B. Bidyuk and R. Dechter. Cycle-cutset sampling for Bayesian networks. In Proceedings of the Sixteenth Canadian Conference on Artificial Intelligence, 2003.

[10] D. Blei, T. L. Griffiths, M. I. Jordan, and J. B. Tenenbaum. Hierarchical topic models and the nested Chinese restaurant process. In Advances in Neural Information Processing Systems 16, 2004.

[11] B. Marlin. Modeling user rating profiles for collaborative filtering. In Advances in Neural Information Processing Systems 16, 2004.

[12] S. Kim and P. Smyth. Hierarchical Dirichlet processes with random effects. In Advances in Neural Information Processing Systems 19, 2006.

[13] W. Li and A. McCallum. Pachinko allocation: DAG-structured mixture models of topic correlations. In Proceedings of the 23rd International Conference on Machine Learning, pages 577–584, 2006.

[14] W. Li, A. McCallum, and D. Blei. Nonparametric Bayes Pachinko allocation. In UAI, 2007.

[15] D. Larlus and F. Jurie. Latent mixture vocabularies for object categorization. In British Machine Vision Conference, 2006.

[16] E. Airoldi, D. Blei, E. Xing, and S. Fienberg. A latent mixed membership model for relational data. In LinkKDD ’05: Proceedings of the 3rd International Workshop on Link Discovery, pages 82–89, 2005.

[17] R. Agrawal, T. Imielinski, and A. Swami. Mining association rules between sets of items in large databases. In Proceedings of the 1993 ACM SIGMOD International Conference on Management of Data, 1993.

[18] M. J. Beal, Z. Ghahramani, and C. E. Rasmussen. The infinite hidden Markov model. In Advances in Neural Information Processing Systems 14, pages 577–584, 2001.

[19] Y. W. Teh, D. Görür, and Z. Ghahramani. Stick-breaking construction for the Indian buffet process. In Proceedings of the International Conference on Artificial Intelligence and Statistics, volume 11, 2007.

[20] D. Mimno, W. Li, and A. McCallum. Mixtures of hierarchical topics with Pachinko allocation. In Proceedings of the 24th International Conference on Machine Learning, 2007.