
287 nips-2013-Scalable Inference for Logistic-Normal Topic Models


Source: pdf

Author: Jianfei Chen, Jun Zhu, Zi Wang, Xun Zheng, Bo Zhang

Abstract: Logistic-normal topic models can effectively discover correlation structures among latent topics. However, their inference remains a challenge because of the non-conjugacy between the logistic-normal prior and the multinomial topic-mixing proportions. Existing algorithms either make restrictive mean-field assumptions or do not scale to large applications. This paper presents a partially collapsed Gibbs sampling algorithm that converges to the provably correct distribution by exploiting ideas from data augmentation. To improve time efficiency, we further present a parallel implementation that can handle large-scale applications and learn the correlation structures of thousands of topics from millions of documents. Extensive empirical results demonstrate the promise of the approach.
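To make the data-augmentation idea concrete, the Python sketch below shows one possible Gibbs step for a single document's logistic-normal vector, combining the Polya-Gamma augmentation of Polson et al. [13] with a Holmes-Held style collapse of the remaining softmax components [8]. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names (sample_pg, gibbs_step_eta), the diagonal prior, and the series truncation level are choices made here for brevity, whereas the paper handles a full covariance matrix and a parallel implementation.

    # Sketch of one data-augmentation Gibbs step for a document's
    # logistic-normal vector eta (NOT the authors' code; names and the
    # diagonal prior are illustrative assumptions).
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_pg(b, c, trunc=200):
        # Approximate draw from PG(b, c) via the truncated sum-of-Gammas
        # representation: omega = (1 / (2 pi^2)) * sum_k g_k /
        # ((k - 1/2)^2 + c^2 / (4 pi^2)), g_k ~ Gamma(b, 1) (see [13]).
        k = np.arange(1, trunc + 1)
        g = rng.gamma(shape=b, scale=1.0, size=trunc)
        return (g / ((k - 0.5) ** 2 + (c / (2.0 * np.pi)) ** 2)).sum() / (2.0 * np.pi ** 2)

    def gibbs_step_eta(eta, counts, mu, sigma2):
        # One sweep over the K components of eta for one document.
        # counts[k] = number of words currently assigned to topic k;
        # prior eta ~ N(mu, sigma2 * I), diagonal here for brevity only.
        K = eta.size
        N = counts.sum()
        for k in range(K):
            # Collapse the other components: the softmax probability of
            # topic k equals sigmoid(eta[k] - zeta), where
            # zeta = log(sum_{j != k} exp(eta[j])).
            zeta = np.log(np.exp(np.delete(eta, k)).sum())
            rho = eta[k] - zeta
            # Augmentation: omega | rho ~ PG(N, rho) turns the logistic
            # likelihood into a Gaussian factor in eta[k].
            omega = sample_pg(N, rho)
            kappa = counts[k] - N / 2.0
            prec = 1.0 / sigma2 + omega  # posterior precision of eta[k]
            mean = (mu[k] / sigma2 + kappa + omega * zeta) / prec
            eta[k] = rng.normal(mean, 1.0 / np.sqrt(prec))
        return eta

    # Toy usage: K = 5 topics, 100 words in the document.
    K = 5
    counts = rng.multinomial(100, np.ones(K) / K)
    eta = rng.normal(size=K)
    eta = gibbs_step_eta(eta, counts, mu=np.zeros(K), sigma2=1.0)

The key point is that, conditioned on the auxiliary variable omega, the non-conjugate softmax likelihood becomes Gaussian in eta[k], so each update has a closed form; sweeping this step over documents, together with sampling the topic assignments and the shared Gaussian parameters, gives a sampler of the partially collapsed kind the abstract describes.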


reference text

[1] A. Ahmed, M. Aly, J. Gonzalez, S. Narayanamurthy, and A. Smola. Scalable inference in latent variable models. In International Conference on Web Search and Data Mining (WSDM), 2012.

[2] K. Bache and M. Lichman. UCI machine learning repository, 2013.

[3] D. Blei and J. Lafferty. Correlated topic models. In Advances in Neural Information Processing Systems (NIPS), 2006.

[4] D. Blei and J. Lafferty. Dynamic topic models. In International Conference on Machine Learning (ICML), 2006.

[5] D.M. Blei, A.Y. Ng, and M.I. Jordan. Latent Dirichlet allocation. Journal of Machine Learning Research, 3:993–1022, 2003.

[6] N. Chen, J. Zhu, F. Xia, and B. Zhang. Generalized relational topic models with data augmentation. In International Joint Conference on Artificial Intelligence (IJCAI), 2013.

[7] M. Hoffman, D. Blei, and F. Bach. Online learning for latent Dirichlet allocation. In Advances in Neural Information Processing Systems (NIPS), 2010.

[8] C. Holmes and L. Held. Bayesian auxiliary variable models for binary and multinomial regression. Bayesian Analysis, 1(1):145–168, 2006.

[9] D. Mimno, H. Wallach, and A. McCallum. Gibbs sampling for logistic normal topic models with graph-based priors. In NIPS Workshop on Analyzing Graphs, 2008.

[10] D. Newman, A. Asuncion, P. Smyth, and M. Welling. Distributed algorithms for topic models. Journal of Machine Learning Research, 10:1801–1828, 2009.

[11] J. Paisley, C. Wang, and D. Blei. The discrete infinite logistic normal distribution for mixed-membership modeling. In International Conference on Artificial Intelligence and Statistics (AISTATS), 2011.

[12] N. G. Polson and J. G. Scott. Default Bayesian analysis for multi-way tables: a data-augmentation approach. arXiv:1109.4180, 2011.

[13] N. G. Polson, J. G. Scott, and J. Windle. Bayesian inference for logistic models using Polya-Gamma latent variables. arXiv:1205.0310v2, 2013.

[14] C. P. Robert. Simulation of truncated normal variables. Statistics and Computing, 5:121–125, 1995.

[15] W. Rudin. Principles of mathematical analysis. McGraw-Hill Book Co., 1964.

[16] A. Smola and S. Narayanamurthy. An architecture for parallel topic models. In International Conference on Very Large Data Bases (VLDB), 2010.

[17] M. A. Tanner and W. H. Wong. The calculation of posterior distributions by data augmentation. Journal of the American Statistical Association, 82(398):528–540, 1987.

[18] D. van Dyk and X. Meng. The art of data augmentation. Journal of Computational and Graphical Statistics, 10(1):1–50, 2001.

[19] L. Yao, D. Mimno, and A. McCallum. Efficient methods for topic model inference on streaming document collections. In International Conference on Knowledge Discovery and Data Mining (SIGKDD), 2009.

[20] A. Zhang, J. Zhu, and B. Zhang. Sparse online topic models. In International Conference on World Wide Web (WWW), 2013.

[21] J. Zhu, X. Zheng, and B. Zhang. Improved Bayesian supervised topic models with data augmentation. In Annual Meeting of the Association for Computational Linguistics (ACL), 2013.