Continuously-Adaptive Discretization for Message-Passing Algorithms (NIPS 2008, Paper 50)


Authors: Michael Isard, John MacCormick, and Kannan Achan

Abstract: Continuously-Adaptive Discretization for Message-Passing (CAD-MP) is a new message-passing algorithm for approximate inference. Most message-passing algorithms approximate continuous probability distributions using either a family of continuous distributions (such as the exponential family), a particle set of discrete samples, or a fixed, uniform discretization. In contrast, CAD-MP uses a discretization that is (i) non-uniform and (ii) adaptive to the structure of the marginal distributions. Non-uniformity allows CAD-MP to localize interesting features (such as sharp peaks) in the marginal belief distributions with time complexity that scales logarithmically with precision, whereas uniform discretization scales at best linearly. We give a principled method for altering the non-uniform discretization according to information-based measures. Experiments show that CAD-MP estimates marginal beliefs much more precisely than competing approaches at the same computational expense.
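
To make the abstract's complexity claim concrete, here is a minimal Python sketch of non-uniform adaptive refinement. This is not the authors' CAD-MP algorithm (which scores candidate discretizations with information-based measures inside belief propagation); it is a hypothetical toy that greedily bisects whichever cell currently holds the most estimated probability mass. Each split halves the width of the cell containing a sharp peak, so localizing the peak to precision eps takes on the order of log(1/eps) splits, while a uniform grid needs on the order of 1/eps bins:

```python
import math

def adaptive_partition(density, lo, hi, n_bins):
    """Greedily refine a partition of [lo, hi] into n_bins cells by
    repeatedly bisecting the cell with the largest estimated mass.
    `density` is an (unnormalized) pointwise density function."""
    cells = [(lo, hi)]

    def mass(a, b):
        # crude midpoint-rule estimate of the mass in [a, b]
        return density(0.5 * (a + b)) * (b - a)

    while len(cells) < n_bins:
        # pick the cell holding the most estimated mass and bisect it
        i = max(range(len(cells)), key=lambda k: mass(*cells[k]))
        a, b = cells.pop(i)
        mid = 0.5 * (a + b)
        cells += [(a, mid), (mid, b)]
    return sorted(cells)

# A sharp peak at x = 0.3 on a flat background: with a budget of only
# 8 cells, most of them end up clustered tightly around the peak.
peak = lambda x: math.exp(-((x - 0.3) ** 2) / (2 * 0.02 ** 2)) + 0.01
for a, b in adaptive_partition(peak, 0.0, 1.0, 8):
    print(f"[{a:.4f}, {b:.4f})  width = {b - a:.4f}")
```

Running this places most of the 8 cells in a narrow neighborhood of x = 0.3, localizing the peak far more precisely than a uniform 8-bin grid; the full algorithm applies the same intuition using kd-tree-style partitions in the spirit of [15] and principled, information-based split measures in the spirit of [14].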


References

[1] J. Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, 1988.

[2] P. Dagum and M. Luby. Approximating probabilistic inference in Bayesian belief networks is NP-hard. Artificial Intelligence, 60(1):141–153, 1993.

[3] R.G. Cowell, A.P. Dawid, S.L. Lauritzen, and D.J. Spiegelhalter. Probabilistic Networks and Expert Systems. Springer, 1999.

[4] J.S. Yedidia, W.T. Freeman, and Y. Weiss. Generalized belief propagation. In NIPS, pages 689–695, 2000.

[5] T. Minka. Expectation propagation for approximate Bayesian inference. In Proc. UAI, pages 362–369, 2001.

[6] G. Kitagawa. The two-filter formula for smoothing and an implementation of the Gaussian-sum smoother. Ann. Inst. Statist. Math., 46(4):605–623, 1994.

[7] P.F. Felzenszwalb and D.P. Huttenlocher. Efficient belief propagation for early vision. In Proc. CVPR, 2004.

[8] M. Isard and J. MacCormick. Dense motion and disparity estimation via loopy belief propagation. In Proc. ACCV, pages 32–41, 2006.

[9] E. Sudderth, A. Ihler, W. Freeman, and A. Willsky. Nonparametric belief propagation. In Proc. CVPR, volume 1, pages 605–612, 2003.

[10] M. Isard. Pampas: Real-valued graphical models for computer vision. In Proc. CVPR, volume 1, pages 613–620, 2003.

[11] F.R. Kschischang, B.J. Frey, and H.A. Loeliger. Factor graphs and the sum-product algorithm. IEEE Transactions on Information Theory, 47(2):498–519, 2001.

[12] O. Zoeter and T. Heskes. Deterministic approximate inference techniques for conditionally Gaussian state space models. Statistics and Computing, 16(3):279–292, 2006.

[13] T. Minka. Divergence measures and message passing. Technical Report MSR-TR-2005-173, Microsoft Research, 2005.

[14] A.V. Kozlov and D. Koller. Nonuniform dynamic discretization in hybrid networks. In Proc. UAI, pages 314–325, 1997.

[15] J.L. Bentley. Multidimensional binary search trees used for associative searching. Commun. ACM, 18(9):509–517, 1975.

[16] P.F. Felzenszwalb and D.P. Huttenlocher. Pictorial structures for object recognition. Int. J. Computer Vision, 61(1):55–79, 2005.

[17] J. Coughlan and S. Ferreira. Finding deformable shapes using loopy belief propagation. In Proc. ECCV, pages 453–468, 2002.

[18] J. Coughlan and H. Shen. Shape matching with belief propagation: Using dynamic quantization to accommodate occlusion and clutter. In Proc. Workshop on Generative-Model Based Vision, 2004.

[19] C. Pal, C. Sutton, and A. McCallum. Sparse forward-backward using minimum divergence beams for fast training of conditional random fields. In Proc. ICASSP, 2006.

[20] J. Lasserre, A. Kannan, and J. Winn. Hybrid learning of large jigsaws. In Proc. CVPR, 2007.