NIPS 2012, paper 96 — reference knowledge graph (maker-knowledge-mining)
Source: pdf
Author: Stefano Ermon, Ashish Sabharwal, Bart Selman, Carla P. Gomes
Abstract: Given a probabilistic graphical model, its density of states is a distribution that, for any likelihood value, gives the number of configurations with that probability. We introduce a novel message-passing algorithm called Density Propagation (DP) for estimating this distribution. We show that DP is exact for tree-structured graphical models and is, in general, a strict generalization of both sum-product and max-product algorithms. Further, we use density of states and tree decomposition to introduce a new family of upper and lower bounds on the partition function. For any tree decomposition, the new upper bound based on finer-grained density of state information is provably at least as tight as previously known bounds based on convexity of the log-partition function, and strictly stronger if a general condition holds. We conclude with empirical evidence of improvement over convex relaxations and mean-field based bounds.
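To make the density-of-states notion concrete, here is a minimal brute-force sketch: it enumerates all configurations of a discrete factor graph and counts how many attain each unnormalized likelihood value. The function name, data layout, and toy potentials are illustrative assumptions, not the paper's DP algorithm; the partition function then falls out as the weighted sum over the density.

```python
from collections import Counter
from itertools import product

def density_of_states(cardinalities, factors):
    """Brute-force density of states: map each unnormalized likelihood
    value to the number of configurations attaining it.

    cardinalities: list of domain sizes, one per variable.
    factors: list of (scope, table) pairs, where scope is a tuple of
             variable indices and table maps value-tuples to potentials.
    """
    counts = Counter()
    for config in product(*[range(k) for k in cardinalities]):
        weight = 1.0
        for scope, table in factors:
            weight *= table[tuple(config[i] for i in scope)]
        counts[weight] += 1
    return counts

# Toy model (hypothetical): two binary variables, one pairwise factor
# that favors agreement.
cardinalities = [2, 2]
factors = [((0, 1), {(0, 0): 2.0, (0, 1): 1.0,
                     (1, 0): 1.0, (1, 1): 2.0})]

dos = density_of_states(cardinalities, factors)
# Two configurations have weight 2.0 and two have weight 1.0,
# so the partition function is Z = 2*2.0 + 2*1.0 = 6.0.
Z = sum(w * n for w, n in dos.items())
```

Note that this enumeration is exponential in the number of variables; the point of the paper's message-passing scheme is to estimate the same distribution without exhaustive enumeration.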
[1] M.J. Wainwright and M.I. Jordan. Graphical models, exponential families, and variational inference. Foundations and Trends in Machine Learning, 1(1-2):1–305, 2008.
[2] S. Ermon, C. Gomes, A. Sabharwal, and B. Selman. Accelerated Adaptive Markov Chain for Partition Function Computation. Neural Information Processing Systems, 2011.
[3] F. Wang and D.P. Landau. Efficient, multiple-range random walk algorithm to calculate the density of states. Physical Review Letters, 86(10):2050–2053, 2001.
[4] M.J. Wainwright. Stochastic processes on graphs with cycles: geometric and variational approaches. PhD thesis, Massachusetts Institute of Technology, 2002.
[5] M. Wainwright, T. Jaakkola, and A. Willsky. Exact MAP estimates by (hyper)tree agreement. Advances in Neural Information Processing Systems, pages 833–840, 2003.
[6] M.J. Wainwright. Tree-reweighted belief propagation algorithms and approximate ML estimation via pseudo-moment matching. In AISTATS, 2003.
[7] G. Parisi and R. Shankar. Statistical field theory. Physics Today, 41:110, 1988.
[8] L.D. Brown. Fundamentals of statistical exponential families: with applications in statistical decision theory. Institute of Mathematical Statistics, 1986.
[9] M. Richardson and P. Domingos. Markov logic networks. Machine Learning, 62(1):107–136, 2006.
[10] Y. Weiss, C. Yanover, and T. Meltzer. MAP estimation, linear programming and belief propagation with convex free energies. In Uncertainty in Artificial Intelligence, 2007.
[11] T. Hazan and A. Shashua. Norm-product belief propagation: Primal-dual message-passing for approximate inference. Information Theory, IEEE Transactions on, 56(12):6294–6316, 2010.
[12] K.P. Murphy, Y. Weiss, and M.I. Jordan. Loopy belief propagation for approximate inference: An empirical study. In Proceedings of the Fifteenth conference on Uncertainty in artificial intelligence, pages 467–475. Morgan Kaufmann Publishers Inc., 1999.
[13] J.S. Yedidia, W.T. Freeman, and Y. Weiss. Understanding belief propagation and its generalizations. Exploring artificial intelligence in the new millennium, 8:236–239, 2003.
[14] S.M. Aji and R.J. McEliece. The generalized distributive law. Information Theory, IEEE Transactions on, 46(2):325–343, 2000.
[15] W.S. Cheung. Generalizations of Hölder's inequality. International Journal of Mathematics and Mathematical Sciences, 26:7–10, 2001.
[16] Qiang Liu and Alexander Ihler. Negative tree reweighted belief propagation. In Proceedings of the Twenty-Sixth Conference on Uncertainty in Artificial Intelligence (UAI-10), pages 332–339, Corvallis, Oregon, 2010. AUAI Press.
[17] J.M. Mooij. libDAI: A free and open source C++ library for discrete approximate inference in graphical models. The Journal of Machine Learning Research, 11:2169–2173, 2010.
[18] M.J.D. Powell. The BOBYQA algorithm for bound constrained optimization without derivatives. University of Cambridge Technical Report, 2009.