
129 nips-2012-Fast Variational Inference in the Conjugate Exponential Family


Source: pdf

Author: James Hensman, Magnus Rattray, Neil D. Lawrence

Abstract: We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference and leads to a new lower bound on the marginal likelihood. We exploit the information geometry of this bound to derive much faster optimization methods for these models, based on conjugate gradients. Our approach is very general and is easily applied to any model for which the mean-field update equations have been derived. Empirically, we show significant speed-ups for probabilistic inference using our bound.
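As a sketch of the kind of bound the abstract describes (following the KL-correction idea of King and Lawrence, 2006, cited below; the notation here is ours, chosen for illustration, and may differ from the paper's): for a conjugate exponential model with observations Y, latent variables X, and parameters theta, the standard mean-field bound uses a factorized q(X)q(theta), while the collapsed bound integrates theta out analytically:

    \log p(Y) \;\ge\; \mathcal{L}_{\mathrm{KL}}\big(q(X)\big)
    \;=\; \log \int p(\theta)\, \exp\!\Big( \mathbb{E}_{q(X)}\big[\log p(Y, X \mid \theta)\big] \Big)\, \mathrm{d}\theta
    \;\ge\; \mathcal{L}\big(q(X),\, q(\theta)\big),

where the integral over theta is tractable because the prior is conjugate to the exponential-family complete-data likelihood, and the second inequality holds because \mathcal{L}_{\mathrm{KL}} equals the mean-field bound maximized over q(theta). Since q(X) is itself in the exponential family, the natural gradient of the bound with respect to its natural parameters is the difference between the mean-field fixed-point update and the current parameter value (cf. Sato, 2001, cited below), so a unit-length natural-gradient step recovers standard variational Bayesian EM, and conjugate-gradient directions in this geometry can converge faster.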


reference text

S. Amari and H. Nagaoka. Methods of Information Geometry. AMS, 2007.
A. Asuncion, M. Welling, P. Smyth, and Y. W. Teh. On smoothing and inference for topic models. arXiv preprint arXiv:1205.2662, 2012.
C. M. Bishop. Pattern Recognition and Machine Learning. Springer New York, 2006.
D. M. Blei, A. Y. Ng, and M. I. Jordan. Latent Dirichlet allocation. The Journal of Machine Learning Research, 3:993–1022, 2003.
Z. Ghahramani and M. Beal. Propagation algorithms for variational Bayesian learning. In Advances in Neural Information Processing Systems, pages 507–513, 2001.
P. Glaus, A. Honkela, and M. Rattray. Identifying differentially expressed transcripts from RNA-seq data with biological variation. Bioinformatics, 2012. doi: 10.1093/bioinformatics/bts260. Advance Access.
M. Hoffman, D. Blei, C. Wang, and J. Paisley. Stochastic variational inference. arXiv preprint arXiv:1206.7051, 2012.
A. Honkela, T. Raiko, M. Kuusela, M. Tornio, and J. Karhunen. Approximate Riemannian conjugate gradient learning for fixed-form variational Bayes. The Journal of Machine Learning Research, 11:3235–3268, 2010.
N. King and N. D. Lawrence. Fast variational inference for Gaussian process models through KL-correction. In Machine Learning: ECML 2006, pages 270–281, 2006.
K. Kurihara, M. Welling, and Y. W. Teh. Collapsed variational Dirichlet process mixture models. In Proceedings of the International Joint Conference on Artificial Intelligence, volume 20, page 19, 2007.
M. Kuusela, T. Raiko, A. Honkela, and J. Karhunen. A gradient-based algorithm competitive with variational Bayesian EM for mixture of Gaussians. In Proceedings of the International Joint Conference on Neural Networks (IJCNN 2009), pages 1688–1695. IEEE, 2009.
M. Lázaro-Gredilla and M. K. Titsias. Variational heteroscedastic Gaussian process regression. In Proceedings of the International Conference on Machine Learning (ICML), 2011.
M. Lázaro-Gredilla, S. Van Vaerenbergh, and N. Lawrence. Overlapping mixtures of Gaussian processes for the data association problem. Pattern Recognition, 2011.
T. P. Minka, J. M. Winn, J. P. Guiver, and D. A. Knowles. Infer.NET 2.4. Microsoft Research Cambridge, 2010.
M. A. Sato. Online model selection based on the variational Bayes. Neural Computation, 13(7):1649–1681, 2001.
J. Sung, Z. Ghahramani, and S. Bang. Latent-space variational Bayes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(12):2236–2242, 2008.
Y. W. Teh, D. Newman, and M. Welling. A collapsed variational Bayesian inference algorithm for latent Dirichlet allocation. In Advances in Neural Information Processing Systems, 19:1353, 2007.
G. Xu et al. Transcriptome and targetome analysis in MIR155 expressing cells using RNA-seq. RNA, pages 1610–1622, June 2010. ISSN 1355-8382. doi: 10.1261/rna.2194910. URL http://rnajournal.cshlp.org/cgi/doi/10.1261/rna.2194910.