
35 nips-2009-Approximating MAP by Compensating for Structural Relaxations


Source: pdf

Author: Arthur Choi, Adnan Darwiche

Abstract: We introduce a new perspective on approximations to the maximum a posteriori (MAP) task in probabilistic graphical models that is based on simplifying a given instance and then tightening the approximation. We start with a structural relaxation of the original model, infer from the relaxation its deficiencies, and compensate for them. This perspective allows us to identify two distinct classes of approximations. First, we find that max-product belief propagation can be viewed as a way to compensate for a relaxation, based on a particular idealized case for exactness. We then identify a second approach to compensation that is based on a more refined idealized case, resulting in a new approximation with distinct properties. We go on to propose a new class of algorithms that, starting with a relaxation, iteratively seek tighter approximations.
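
The abstract describes a relax-then-compensate scheme: simplify the model structurally, then adjust the simplified model to make up for what was lost. The toy Python sketch below illustrates that general idea only, under stated assumptions: it relaxes a single equivalence constraint by splitting a variable into a clone (in the spirit of node splitting [4]), solves the relaxed MAP by brute force, and heuristically nudges compensating unary factors until the split copies agree. The factor values, the choice of split, and the 0.1 update rule are all illustrative assumptions, not the compensation scheme derived in the paper.

# Illustrative sketch (not the authors' exact algorithm): relax the
# equivalence X = X' obtained by splitting X, solve MAP on the relaxed
# model by brute force, then iteratively adjust compensating unary
# factors on X and its clone X' to push the two copies toward agreement.
# All numeric values below are made up for this toy example.

import itertools
import numpy as np

states = [0, 1]                      # binary variables X, X', Y
theta_x = np.array([0.2, 1.0])       # unary log-potential on X
pair_xy = np.array([[1.5, 0.1],      # pairwise log-potential theta(X, Y)
                    [0.3, 2.0]])
theta_y = np.array([0.7, 0.4])       # unary log-potential on Y

def relaxed_map(comp_x, comp_xc):
    """Brute-force MAP over the relaxed model: the clone X' replaces X in
    the pairwise factor, and the constraint X = X' has been dropped."""
    best, best_score = None, -np.inf
    for x, xc, y in itertools.product(states, states, states):
        score = (theta_x[x] + comp_x[x]          # X with its compensation
                 + comp_xc[xc]                   # clone X' with its compensation
                 + pair_xy[xc, y] + theta_y[y])  # X' takes X's place here
        if score > best_score:
            best, best_score = (x, xc, y), score
    return best, best_score

# Start with no compensation; iteratively nudge the compensating factors
# so the split copies prefer the same state (a damped heuristic update,
# chosen only for illustration).
comp_x, comp_xc = np.zeros(2), np.zeros(2)
for it in range(20):
    (x, xc, y), score = relaxed_map(comp_x, comp_xc)
    if x == xc:
        break                        # copies agree: stop compensating
    comp_x[x] -= 0.1                 # penalize the state each copy chose alone
    comp_xc[xc] -= 0.1

print("relaxed MAP assignment (X, X', Y):", (x, xc, y), "score:", score)

When the two copies agree, the relaxed MAP assignment induces an assignment to the original toy model; until then, the relaxed score only upper-bounds the original MAP value, which is the sense in which relaxations of this kind give bounds [4].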


reference text

[1] Martin J. Wainwright, Tommi Jaakkola, and Alan S. Willsky. MAP estimation via agreement on trees: message-passing and linear programming. IEEE Transactions on Information Theory, 51(11):3697–3717, 2005.

[2] Amir Globerson and Tommi Jaakkola. Fixing max-product: Convergent message passing algorithms for MAP LP-relaxations. In NIPS, pages 553–560, 2008.

[3] Radu Marinescu, Kalev Kask, and Rina Dechter. Systematic vs. non-systematic algorithms for solving the MPE task. In UAI, pages 394–402, 2003.

[4] Arthur Choi, Mark Chavira, and Adnan Darwiche. Node splitting: A scheme for generating upper bounds in Bayesian networks. In UAI, pages 57–66, 2007.

[5] Rina Dechter and Irina Rish. Mini-buckets: A general scheme for bounded inference. J. ACM, 50(2):107–153, 2003.

[6] Arthur Choi and Adnan Darwiche. An edge deletion semantics for belief propagation and its practical impact on approximation quality. In AAAI, pages 1107–1114, 2006.

[7] Arthur Choi and Adnan Darwiche. Approximating the partition function by deleting and then correcting for model edges. In UAI, pages 79–87, 2008.

[8] Rina Dechter, Kalev Kask, and Robert Mateescu. Iterative join-graph propagation. In UAI, pages 128–136, 2002.

[9] Martin J. Wainwright, Tommi Jaakkola, and Alan S. Willsky. Tree consistency and bounds on the performance of the max-product algorithm and its generalizations. Statistics and Computing, 14:143–166, 2004.

[10] Yair Weiss, Chen Yanover, and Talya Meltzer. MAP estimation, linear programming and belief propagation with convex free energies. In UAI, 2007.

[11] Gal Elidan, Ian McGraw, and Daphne Koller. Residual belief propagation: Informed scheduling for asynchronous message passing. In UAI, 2006.

[12] David Sontag, Talya Meltzer, Amir Globerson, Tommi Jaakkola, and Yair Weiss. Tightening LP relaxations for MAP using message passing. In UAI, pages 503–510, 2008.

[13] Rina Dechter. Mini-buckets: A general scheme for approximation in automated reasoning. In IJCAI, pages 1297–1302, 1997.

[14] Jason K. Johnson, Dmitry M. Malioutov, and Alan S. Willsky. Lagrangian relaxation for MAP estimation in graphical models. In Allerton, pages 672–681, 2007.

[15] Arthur Choi, Trevor Standley, and Adnan Darwiche. Approximating weighted Max-SAT problems by compensating for relaxations. In CP, pages 211–225, 2009.