
71 jmlr-2005-Variational Message Passing


Source: pdf

Author: John Winn, Christopher M. Bishop

Abstract: Bayesian inference is now widely established as one of the principal foundations for machine learning. In practice, exact inference is rarely possible, and so a variety of approximation techniques have been developed, one of the most widely used being a deterministic framework called variational inference. In this paper we introduce Variational Message Passing (VMP), a general purpose algorithm for applying variational inference to Bayesian Networks. Like belief propagation, VMP proceeds by sending messages between nodes in the network and updating posterior beliefs using local operations at each node. Each such update increases a lower bound on the log evidence (unless already at a local maximum). In contrast to belief propagation, VMP can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation. Furthermore, by introducing additional variational parameters, VMP can be applied to models containing non-conjugate distributions. The VMP framework also allows the lower bound to be evaluated, and this can be used both for model comparison and for detection of convergence. Variational message passing has been implemented in the form of a general purpose inference engine called VIBES (‘Variational Inference for BayEsian networkS’) which allows models to be specified graphically and then solved variationally without recourse to coding. Keywords: Bayesian networks, variational inference, message passing
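To illustrate the style of update that VMP automates, below is a minimal sketch (not code from the paper or from VIBES) of factorised variational updates for a univariate Gaussian model with a conjugate Gaussian prior on the mean and a Gamma prior on the precision. All variable names and hyperparameter values are illustrative; the closed-form updates follow from the standard conjugate-exponential results that VMP derives via local messages between nodes.

import numpy as np

# Toy data: x_n ~ N(mu, 1/gamma) with unknown mean mu and unknown precision gamma.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=100)
N, sum_x, sum_x2 = x.size, x.sum(), (x ** 2).sum()

# Priors (illustrative hyperparameters): mu ~ N(m0, 1/beta0), gamma ~ Gamma(a0, b0).
m0, beta0 = 0.0, 1e-3
a0, b0 = 1e-3, 1e-3

# Factorised posterior q(mu) q(gamma); initialise the expected precision.
E_gamma = a0 / b0

for _ in range(50):
    # Contribution ("message") from the observed children to mu, added to the
    # prior's natural parameters, gives a Gaussian q(mu) = N(m, 1/beta).
    beta = beta0 + N * E_gamma
    m = (beta0 * m0 + E_gamma * sum_x) / beta
    E_mu, E_mu2 = m, m ** 2 + 1.0 / beta

    # Contribution ("message") from the children to gamma gives q(gamma) = Gamma(a, b).
    a = a0 + 0.5 * N
    b = b0 + 0.5 * (sum_x2 - 2.0 * E_mu * sum_x + N * E_mu2)
    E_gamma = a / b
    # Full VMP would also evaluate the lower bound here to monitor convergence.

print("posterior mean of mu ~", m, "  posterior mean of gamma ~", a / b)

In the general formulation of the paper, such closed-form updates need not be derived by hand: each factor's natural parameter vector is obtained by summing the message from its parent with the messages from its children, which is what allows a graphically specified model to be solved without model-specific coding.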


reference text

H. Attias. A variational Bayesian framework for graphical models. In S. Solla, T. K. Leen, and K.-R. Müller, editors, Advances in Neural Information Processing Systems, volume 12, pages 209–215, Cambridge, MA, 2000. MIT Press.

C. M. Bishop. Variational principal components. In Proceedings Ninth International Conference on Artificial Neural Networks, ICANN'99, volume 1, pages 509–514. IEE, 1999.

C. M. Bishop and M. Svensén. Bayesian hierarchical mixtures of experts. In U. Kjaerulff and C. Meek, editors, Proceedings Nineteenth Conference on Uncertainty in Artificial Intelligence, pages 57–64. Morgan Kaufmann, 2003.

C. M. Bishop and J. M. Winn. Non-linear Bayesian image modelling. In Proceedings Sixth European Conference on Computer Vision, volume 1, pages 3–17. Springer-Verlag, 2000.

C. M. Bishop and J. M. Winn. Structured variational distributions in VIBES. In Proceedings Artificial Intelligence and Statistics, Key West, Florida, 2003. Society for Artificial Intelligence and Statistics.

C. M. Bishop, J. M. Winn, and D. Spiegelhalter. VIBES: A variational inference engine for Bayesian networks. In Advances in Neural Information Processing Systems, volume 15, 2002.

R. G. Cowell, A. P. Dawid, S. L. Lauritzen, and D. J. Spiegelhalter. Probabilistic Networks and Expert Systems. Statistics for Engineering and Information Science. Springer-Verlag, 1999.

Z. Ghahramani and M. J. Beal. Propagation algorithms for variational Bayesian learning. In T. K. Leen, T. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems, volume 13, Cambridge, MA, 2001. MIT Press.

W. R. Gilks and P. Wild. Adaptive rejection sampling for Gibbs sampling. Applied Statistics, 41(2):337–348, 1992.

T. Jaakkola and M. Jordan. A variational approach to Bayesian logistic regression problems and their extensions. In Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, 1996.

M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul. An introduction to variational methods for graphical models. In M. I. Jordan, editor, Learning in Graphical Models, pages 105–162. Kluwer, 1998.

S. L. Lauritzen. Propagation of probabilities, means, and variances in mixed graphical association models. Journal of the American Statistical Association, 87(420):1098–1108, 1992.

D. J. Lunn, A. Thomas, N. G. Best, and D. J. Spiegelhalter. WinBUGS – a Bayesian modelling framework: concepts, structure and extensibility. Statistics and Computing, 10:321–333, 2000. http://www.mrc-bsu.cam.ac.uk/bugs/.

T. P. Minka. Expectation propagation for approximate Bayesian inference. In Proceedings of the 17th Annual Conference on Uncertainty in Artificial Intelligence, pages 362–369. Morgan Kaufmann, 2001.

R. M. Neal and G. E. Hinton. A new view of the EM algorithm that justifies incremental and other variants. In M. I. Jordan, editor, Learning in Graphical Models, pages 355–368. Kluwer, 1998.

J. Pearl. Fusion, propagation and structuring in belief networks. Artificial Intelligence, 29:241–288, 1986.

A. Thomas, D. J. Spiegelhalter, and W. R. Gilks. BUGS: A program to perform Bayesian inference using Gibbs sampling. In J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith, editors, Bayesian Statistics, Oxford: Clarendon Press, 1992.

W. Wiegerinck. Variational approximations between mean field theory and the junction tree algorithm. In Uncertainty in Artificial Intelligence. Morgan Kaufmann, 2000.

J. M. Winn. Variational Message Passing and its Applications. PhD thesis, University of Cambridge, October 2003.