
168 nips-2013-Learning to Pass Expectation Propagation Messages


Source: pdf

Author: Nicolas Heess, Daniel Tarlow, John Winn

Abstract: Expectation Propagation (EP) is a popular approximate posterior inference algorithm that often provides a fast and accurate alternative to sampling-based methods. However, while the EP framework in theory allows for complex non-Gaussian factors, there is still a significant practical barrier to using them within EP, because doing so requires the implementation of message update operators, which can be difficult and require hand-crafted approximations. In this work, we study the question of whether it is possible to automatically derive fast and accurate EP updates by learning a discriminative model (e.g., a neural network or random forest) to map EP message inputs to EP message outputs. We address the practical concerns that arise in the process, and we provide empirical analysis on several challenging and diverse factors, indicating that there is a space of factors where this approach appears promising.

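To make the abstract's idea concrete, the following is a minimal sketch (not the authors' implementation) of learning a message operator for a single hypothetical nonlinear factor p(y | x) = N(y; tanh(x), 0.1^2): training targets are obtained by Monte Carlo moment matching of the tilted distribution followed by Gaussian division, and a small neural network is fit to map incoming Gaussian message parameters to outgoing message parameters. The factor, sampling ranges, and network size are illustrative assumptions, not choices taken from the paper.

```python
# Sketch of "learning to pass EP messages" for one assumed factor:
# p(y | x) = N(y; tanh(x), NOISE_STD^2), with Gaussian incoming messages.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
NOISE_STD = 0.1

def target_message_to_y(mu_x, var_x, mu_y, var_y, n_samples=20000):
    """Moment-match the tilted distribution over y by importance sampling,
    then divide out the incoming message m_y to get the outgoing message."""
    x = rng.normal(mu_x, np.sqrt(var_x), n_samples)   # samples from m_x
    y = rng.normal(np.tanh(x), NOISE_STD)             # push through the factor
    logw = -0.5 * (y - mu_y) ** 2 / var_y             # weights from m_y
    w = np.exp(logw - logw.max())
    w /= w.sum()
    m1, m2 = np.sum(w * y), np.sum(w * y ** 2)
    post_mu, post_var = m1, max(m2 - m1 ** 2, 1e-6)
    # Gaussian division (natural parameters); may be improper, as EP allows.
    out_prec = 1.0 / post_var - 1.0 / var_y
    out_mean_times_prec = post_mu / post_var - mu_y / var_y
    return out_mean_times_prec, out_prec

# Training set: incoming message parameters -> outgoing message parameters.
inputs, targets = [], []
for _ in range(2000):
    mu_x, var_x = rng.normal(0.0, 2.0), rng.uniform(0.1, 2.0)
    mu_y, var_y = rng.normal(0.0, 1.0), rng.uniform(0.5, 5.0)
    inputs.append([mu_x, var_x, mu_y, var_y])
    targets.append(target_message_to_y(mu_x, var_x, mu_y, var_y))

# Discriminative model standing in for the hand-derived message operator.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
model.fit(np.array(inputs), np.array(targets))

# At inference time, the learned regressor answers message queries directly.
print(model.predict([[0.3, 0.5, 0.1, 1.0]]))
```

A learned operator of this kind would then be called inside the EP message-passing loop wherever the hand-derived update for that factor would normally run.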

reference text

[1] S. Barthelmé and N. Chopin. ABC-EP: Expectation Propagation for likelihood-free Bayesian computation. In Proceedings of the 28th International Conference on Machine Learning, 2011.

[2] J. Domke. Parameter learning with truncated message-passing. In Computer Vision and Pattern Recognition (CVPR). IEEE, 2011.

[3] J. Domke. Learning graphical model parameters with approximate marginal inference. Pattern Analysis and Machine Intelligence (PAMI), 2013.

[4] N.D. Goodman, V.K. Mansinghka, D.M. Roy, K. Bonawitz, and J.B. Tenenbaum. Church: A language for generative models. In Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI), 2008.

[5] R. Herbrich, T.P. Minka, and T. Graepel. TrueSkill: A Bayesian skill rating system. Advances in Neural Information Processing Systems, 19:569, 2007.

[6] T.P. Minka. A family of algorithms for approximate Bayesian inference. PhD thesis, Massachusetts Institute of Technology, 2001.

[7] T.P. Minka and J. Winn. Gates: A graphical notation for mixture models. In Advances in Neural Information Processing Systems, 2008.

[8] T.P. Minka, J.M. Winn, J.P. Guiver, and D.A. Knowles. Infer.NET 2.5, 2012. Microsoft Research. http://research.microsoft.com/infernet.

[9] R. Shapovalov, D. Vetrov, and P. Kohli. Spatial inference machines. In Computer Vision and Pattern Recognition (CVPR). IEEE, 2013.

[10] S. Ross, D. Munoz, M. Hebert, and J.A. Bagnell. Learning message-passing inference machines for structured prediction. In Computer Vision and Pattern Recognition (CVPR). IEEE, 2011.

[11] D.B. Rubin. Bayesianly justifiable and relevant frequency calculations for the applied statistician. The Annals of Statistics, pages 1151–1172, 1984.

[12] Stan Development Team. Stan: A C++ library for probability and sampling, version 1.3, 2013.

[13] D.H. Stern, R. Herbrich, and T. Graepel. Matchbox: Large scale online Bayesian recommendations. In Proceedings of the 18th International Conference on World Wide Web, pages 111–120. ACM, 2009.

[14] A. Thomas. BUGS: A statistical modelling package. RTA/BCS Modular Languages Newsletter, 1994.

[15] D. Wingate, N.D. Goodman, A. Stuhlmueller, and J. Siskind. Nonstandard interpretations of probabilistic programs for efficient inference. In Advances in Neural Information Processing Systems, 2011.

[16] D. Wingate and T. Weber. Automated variational inference in probabilistic programming. arXiv preprint arXiv:1301.1299, 2013.