
187 nips-2009-Particle-based Variational Inference for Continuous Systems


Source: pdf

Author: Andrew Frank, Padhraic Smyth, Alexander T. Ihler

Abstract: Since the development of loopy belief propagation, there has been considerable work on advancing the state of the art for approximate inference over distributions defined on discrete random variables. Improvements include guarantees of convergence, approximations that are provably more accurate, and bounds on the results of exact inference. However, extending these methods to continuous-valued systems has lagged behind. While several methods have been developed to use belief propagation on systems with continuous values, recent advances for discrete variables have not as yet been incorporated. In this context we extend a recently proposed particle-based belief propagation algorithm to provide a general framework for adapting discrete message-passing algorithms to inference in continuous systems. The resulting algorithms behave similarly to their purely discrete counterparts, extending the benefits of these more advanced inference techniques to the continuous domain.
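
The particle belief propagation algorithm of Ihler and McAllester [16], which the paper builds on, replaces the integrals in continuous BP message updates with importance-weighted sums over particles sampled at each node. Below is a minimal sketch of that message update on a toy three-node Gaussian chain; the potentials, proposal distribution, particle count, and number of sweeps are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Minimal sketch of particle belief propagation (Ihler & McAllester, ref [16])
# on a 3-node chain MRF. Potentials, proposal, and particle count are assumed
# for illustration only.

rng = np.random.default_rng(0)
N = 200                              # particles per node (assumed)
nodes = [0, 1, 2]
neighbors = {0: [1], 1: [0, 2], 2: [1]}   # chain structure

def psi_local(s, x):
    # local evidence: each node prefers a different mean (assumed)
    means = {0: -1.0, 1: 0.0, 2: 1.0}
    return np.exp(-0.5 * (x - means[s]) ** 2)

def psi_pair(x_s, x_t):
    # smoothness potential coupling neighbouring nodes (assumed)
    return np.exp(-(x_s - x_t) ** 2)

# Sample particles for each node from a broad Gaussian proposal (std 2).
particles = {s: rng.normal(0.0, 2.0, size=N) for s in nodes}
proposal_pdf = {s: np.exp(-0.5 * (particles[s] / 2.0) ** 2) / (2.0 * np.sqrt(2.0 * np.pi))
                for s in nodes}

# Messages m[t -> s], evaluated at node s's particles; start uniform.
msgs = {(t, s): np.ones(N) for s in nodes for t in neighbors[s]}

for _ in range(10):                  # synchronous message-passing sweeps
    new_msgs = {}
    for s in nodes:
        for t in neighbors[s]:
            # "pre-message" at t: local potential times incoming messages
            # (excluding the one from s), importance-weighted by the proposal.
            pre = psi_local(t, particles[t]) / proposal_pdf[t]
            for u in neighbors[t]:
                if u != s:
                    pre = pre * msgs[(u, t)]
            # Monte Carlo estimate of the BP integral: sum over t's particles
            # for each particle of s.
            pair = psi_pair(particles[s][:, None], particles[t][None, :])
            m = pair @ pre / N
            new_msgs[(t, s)] = m / m.sum()   # normalise for numerical stability
    msgs = new_msgs

# Beliefs: local potential times all incoming messages, normalised over particles.
for s in nodes:
    b = psi_local(s, particles[s])
    for t in neighbors[s]:
        b = b * msgs[(t, s)]
    b = b / b.sum()
    print(f"node {s}: posterior mean ~ {np.dot(b, particles[s]):.3f}")
```

Dividing by the proposal density is the importance-weighting step that turns the particle sum into an estimate of the underlying message integral; the surrounding message-passing machinery is unchanged from the discrete case, which is what lets the discrete-variable advances surveyed in the abstract carry over to continuous systems.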


reference text

[1] J. Pearl. Probabilistic Reasoning in Intelligent Systems. Morgan Kaufmann, San Mateo, 1988.

[2] S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. PAMI, 6(6):721–741, November 1984.

[3] M. Jordan, Z. Ghahramani, T. Jaakkola, and L. Saul. An introduction to variational methods for graphical models. Machine Learning, 37:183–233, 1999.

[4] J. Yedidia, W. Freeman, and Y. Weiss. Constructing free energy approximations and generalized belief propagation algorithms. Technical Report 2004-040, MERL, May 2004.

[5] M. Wainwright, T. Jaakkola, and A. Willsky. A new class of upper bounds on the log partition function. IEEE Trans. Info. Theory, 51(7):2313–2335, July 2005.

[6] D. Sontag and T. Jaakkola. New outer bounds on the marginal polytope. In NIPS 20, pages 1393–1400. MIT Press, Cambridge, MA, 2008.

[7] E. Sudderth, A. Ihler, W. Freeman, and A. Willsky. Nonparametric belief propagation. In CVPR, 2003.

[8] T. Minka. Divergence measures and message passing. Technical Report 2005-173, Microsoft Research Ltd, January 2005.

[9] A. Yuille. CCCP algorithms to minimize the Bethe and Kikuchi free energies: convergent alternatives to belief propagation. Neural Comput., 14(7):1691–1722, 2002.

[10] Y.-W. Teh and M. Welling. The unified propagation and scaling algorithm. In NIPS 14. 2002.

[11] J. Gonzalez, Y. Low, and C. Guestrin. Residual splash for optimally parallelizing belief propagation. In Artificial Intelligence and Statistics (AISTATS), Clearwater Beach, Florida, April 2009.

[12] A. Ihler, J. Fisher, R. Moses, and A. Willsky. Nonparametric belief propagation for self-calibration in sensor networks. IEEE J. Select. Areas Commun., pages 809–819, April 2005.

[13] J. Schiff, D. Antonelli, A. Dimakis, D. Chu, and M. Wainwright. Robust message-passing for statistical inference in sensor networks. In IPSN, pages 109–118, April 2007.

[14] A. Globerson, D. Sontag, and T. Jaakkola. Approximate inference – How far have we come? (NIPS'08 Workshop), 2008. http://www.cs.huji.ac.il/~gamir/inference-workshop.html.

[15] D. Koller, U. Lerner, and D. Angelov. A general algorithm for approximate inference and its application to hybrid Bayes nets. In UAI 15, pages 324–333, 1999.

[16] A. Ihler and D. McAllester. Particle belief propagation. In AI & Statistics: JMLR W&CP, volume 5, pages 256–263, April 2009.

[17] F. Kschischang, B. Frey, and H.-A. Loeliger. Factor graphs and the sum-product algorithm. IEEE Trans. Info. Theory, 47(2):498–519, February 2001.

[18] M. Wainwright and M. Jordan. Graphical models, exponential families, and variational inference. Technical Report 629, UC Berkeley Dept. of Statistics, September 2003.

[19] S. L. Lauritzen and D. J. Spiegelhalter. Local computations with probabilities on graphical structures and their application to expert systems. Journal of the Royal Statistical Society, Series B (Methodological), pages 157–224, 1988.

[20] W. Wiegerinck and T. Heskes. Fractional belief propagation. In NIPS 15, pages 438–445. 2003.

[21] T. Hazan and A. Shashua. Convergent message-passing algorithms for inference over general graphs with convex free energies. In UAI 24, pages 264–273. July 2008.

[22] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp. A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Trans. Signal Processing, 50(2):174–188, February 2002.

[23] J. Coughlan and H. Shen. Dynamic quantization for belief propagation in sparse spaces. Comput. Vis. Image Underst., 106(1):47–58, 2007.

[24] M. Isard, J. MacCormick, and K. Achan. Continuously-adaptive discretization for message-passing algorithms. In NIPS 21, pages 737–744. 2009.

[25] S. Chib. Marginal likelihood from the Gibbs output. JASA, 90(432):1313–1321, 1995.

[26] D. Moore, J. Leonard, D. Rus, and S. Teller. Robust distributed network localization with noisy range measurements. In 2nd Int'l Conf. on Emb. Networked Sensor Sys. (SenSys'04), pages 50–61, 2004.