jmlr jmlr2005 jmlr2005-32 jmlr2005-32-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Manfred Opper, Ole Winther
Abstract: We propose a novel framework for approximations to intractable probabilistic models which is based on a free energy formulation. The approximation can be understood as replacing an average over the original intractable distribution with a tractable one. It requires two tractable probability distributions which are made consistent on a set of moments and encode different features of the original intractable distribution. In this way we are able to use Gaussian approximations for models with discrete or bounded variables which allow us to include non-trivial correlations. These are neglected in many other methods. We test the framework on toy benchmark problems for binary variables on fully connected graphs and 2D grids and compare with other methods, such as loopy belief propagation. Good performance is already achieved by using single nodes as tractable substructures. Significant improvements are obtained when a spanning tree is used instead.
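For concreteness, the following is a minimal NumPy sketch, written from the abstract's description rather than taken from the paper, of the single-node (factorized) case for a binary model p(s) proportional to exp(0.5 s'Js + theta's): a Gaussian approximation keeps the pairwise couplings exactly, a factorized binary approximation keeps the discrete constraints exactly, and the two are iterated until they agree on first and second moments. The function name, initialization, damping scheme, and the damped parallel sweep are illustrative assumptions, not the authors' algorithm or notation.

```python
import numpy as np


def ec_single_node_ising(J, theta, n_sweeps=200, damping=0.5, tol=1e-8):
    """Moment-matching sketch for p(s) ~ exp(0.5 s'Js + theta's), s_i in {-1,+1}.

    The coupling term is kept exactly inside a Gaussian approximation; each
    binary constraint is replaced by a Gaussian site exp(gamma_i x_i
    - 0.5 Lam_i x_i^2). Sites are refined so the Gaussian marginal of x_i
    matches the first and second moments of the exact binary marginal under
    the cavity field (single nodes as the tractable substructure).
    """
    N = len(theta)
    gamma = np.zeros(N)   # site shifts (first-moment natural parameters)
    Lam = np.ones(N)      # site precisions (second-moment natural parameters)

    for _ in range(n_sweeps):
        gamma_old = gamma.copy()

        # Gaussian approximation: precision diag(Lam) - J, shift theta + gamma.
        # Assumes diag(Lam) - J stays positive definite; strong couplings may
        # need heavier damping or a larger initial Lam.
        P = np.diag(Lam) - J
        Sigma = np.linalg.inv(P)
        mu = Sigma @ (theta + gamma)

        for i in range(N):
            # Cavity marginal of x_i: remove site i's Gaussian term.
            cav_prec = 1.0 / Sigma[i, i] - Lam[i]
            cav_shift = mu[i] / Sigma[i, i] - gamma[i]

            # Tilted moments: cavity restricted to x_i in {-1, +1}. The
            # quadratic term is constant on {-1, +1}, so only the shift matters.
            m_tilt = np.tanh(cav_shift)
            v_tilt = max(1.0 - m_tilt**2, 1e-12)

            # Moment matching: choose site parameters so the Gaussian marginal
            # of x_i has mean m_tilt and variance v_tilt.
            Lam_new = 1.0 / v_tilt - cav_prec
            gamma_new = m_tilt / v_tilt - cav_shift
            Lam[i] = (1.0 - damping) * Lam[i] + damping * Lam_new
            gamma[i] = (1.0 - damping) * gamma[i] + damping * gamma_new

        if np.max(np.abs(gamma - gamma_old)) < tol:
            break

    P = np.diag(Lam) - J
    Sigma = np.linalg.inv(P)
    mu = Sigma @ (theta + gamma)
    return mu, Sigma   # approximate spin means and covariances


if __name__ == "__main__":
    # Toy fully connected model with weak random couplings.
    rng = np.random.default_rng(0)
    N = 10
    J = rng.normal(scale=0.1 / np.sqrt(N), size=(N, N))
    J = 0.5 * (J + J.T)
    np.fill_diagonal(J, 0.0)
    theta = rng.normal(scale=0.1, size=N)
    mu, _ = ec_single_node_ising(J, theta)
    print("approximate magnetizations:", np.round(mu, 3))
```

At a fixed point the Gaussian marginals have unit second moments and means equal to the approximate magnetizations, which is the moment consistency the abstract refers to; the spanning-tree variant mentioned there would replace the factorized binary distribution with a tree-structured one.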
H. Attias. A variational Bayesian framework for graphical models. In T. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 12, pages 209–215. MIT Press, 2000.
C. M. Bishop, D. Spiegelhalter, and J. Winn. VIBES: A variational inference engine for Bayesian networks. In S. Becker, S. Thrun, and K. Obermayer, editors, Advances in Neural Information Processing Systems 15, pages 777–784. MIT Press, 2003.
S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
D. Cornford, L. Csató, D. J. Evans, and M. Opper. Bayesian analysis of the scatterometer wind retrieval inverse problem: Some new approaches. Journal of the Royal Statistical Society B, 66:1–17, 2004.
L. Csató, M. Opper, and O. Winther. TAP Gibbs free energy, belief propagation and sparsity. In T. G. Dietterich, S. Becker, and Z. Ghahramani, editors, Advances in Neural Information Processing Systems 14, pages 657–663, Cambridge, MA, 2002. MIT Press.
T. Fabricius and O. Winther. Correcting the bias of subtractive interference cancellation in CDMA: Advanced mean field theory. Submitted to IEEE Transactions on Information Theory, 2004.
T. Heskes, K. Albers, and H. Kappen. Approximate inference and constrained optimization. In Proceedings UAI-2003, pages 313–320. Morgan Kaufmann, 2003.
T. Heskes and O. Zoeter. Expectation propagation for approximate inference in dynamic Bayesian networks. In A. Darwiche and N. Friedman, editors, Proceedings UAI-2002, pages 216–233, 2002.
P. A. d. F. R. Hojen-Sorensen, O. Winther, and L. K. Hansen. Mean field approaches to independent component analysis. Neural Computation, 14:889–918, 2002.
M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. K. Saul. An introduction to variational methods for graphical models. Machine Learning, 37:183–233, 1999.
D. J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
D. Malzahn and M. Opper. An approximate analytical approach to resampling averages. Journal of Machine Learning Research, pages 1151–1173, 2003.
D. Malzahn and M. Opper. Approximate analytical bootstrap averages for support vector classifiers. In S. Thrun, L. Saul, and B. Schölkopf, editors, Advances in Neural Information Processing Systems 16. MIT Press, 2004.
M. Mézard, G. Parisi, and M. A. Virasoro. Spin Glass Theory and Beyond, volume 9 of Lecture Notes in Physics. World Scientific, 1987.
T. P. Minka. Expectation propagation for approximate Bayesian inference. In J. S. Breese and D. Koller, editors, Proceedings UAI-2001, pages 362–369. Morgan Kaufmann, 2001a.
T. P. Minka. A family of algorithms for approximate Bayesian inference. PhD thesis, MIT Media Lab, 2001b.
T. P. Minka and Y. Qi. Tree-structured approximations by expectation propagation. In S. Thrun, L. Saul, and B. Schölkopf, editors, Advances in Neural Information Processing Systems 16. MIT Press, 2004.
M. Opper and D. Saad, editors. Advanced Mean Field Methods: Theory and Practice. MIT Press, 2001.
M. Opper and O. Winther. Mean field methods for classification with Gaussian processes. In M. S. Kearns, S. A. Solla, and D. A. Cohn, editors, Advances in Neural Information Processing Systems 11, pages 309–315. MIT Press, 1999.
M. Opper and O. Winther. Gaussian processes for classification: Mean field algorithms. Neural Computation, 12:2655–2684, 2000.
M. Opper and O. Winther. Adaptive and self-averaging Thouless-Anderson-Palmer mean field theory for probabilistic modeling. Phys. Rev. E, 64:056131, 2001a.
M. Opper and O. Winther. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach. Phys. Rev. Lett., 86:3695, 2001b.
M. Opper and O. Winther. Variational linear response. In S. Thrun, L. Saul, and B. Schölkopf, editors, Advances in Neural Information Processing Systems 16. MIT Press, Cambridge, MA, 2004.
T. Plefka. Convergence condition of the TAP equation for the infinite-range Ising spin glass. J. Phys. A, 15:1971, 1982.
J. Quiñonero-Candela and O. Winther. Incremental Gaussian processes. In S. Becker, S. Thrun, and K. Obermayer, editors, Advances in Neural Information Processing Systems 15, pages 1001–1008. MIT Press, 2003.
G. Roepstorff. Path Integral Approach to Quantum Physics: An Introduction. Springer-Verlag, Berlin Heidelberg New York, 1994.
M. Suzuki, editor. Coherent Anomaly Method: Mean Field, Fluctuations and Symmetries. World Scientific, 1995.
L. Vandenberghe, S. Boyd, and S.-P. Wu. Determinant maximization with linear matrix inequality constraints. SIAM Journal on Matrix Analysis and Applications, 19:499–533, 1998.
M. J. Wainwright and M. I. Jordan. Semidefinite methods for approximate inference on graphs with cycles. Technical Report UCB/CSD-03-1226, UC Berkeley CS Division, 2003.
M. J. Wainwright and M. I. Jordan. A variational principle for graphical models. In S. Haykin, J. Principe, S. Sejnowski, and J. McWhirter, editors, New Directions in Statistical Signal Processing: From Systems to Brain. MIT Press, 2005.
M. Welling and Y. W. Teh. Approximate inference in Boltzmann machines. Artificial Intelligence, 143:19–50, 2003.
J. S. Yedidia, W. T. Freeman, and Y. Weiss. Generalized belief propagation. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13, pages 689–695, 2001.
A. L. Yuille. CCCP algorithms to minimize the Bethe and Kikuchi free energies: Convergent alternatives to belief propagation. Neural Computation, 14:1691–1722, 2002.
A. L. Yuille and A. Rangarajan. The concave-convex procedure. Neural Computation, 15:915–936, 2003.