
258 nips-2013-Projecting Ising Model Parameters for Fast Mixing


Source: pdf

Author: Justin Domke, Xianghang Liu

Abstract: Inference in general Ising models is difficult, due to high treewidth, which makes tree-based algorithms intractable. Moreover, when interactions are strong, Gibbs sampling may take exponential time to converge to the stationary distribution. We present an algorithm to project Ising model parameters onto a parameter set that is guaranteed to be fast mixing, under several divergences. We find that Gibbs sampling using the projected parameters is more accurate than with the original parameters when interaction strengths are strong and when limited time is available for sampling.
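The abstract describes the overall approach: project the coupling parameters onto a set known to mix rapidly, then run Gibbs sampling with the projected parameters. As a rough, non-authoritative sketch of that idea, the Python snippet below uses a spectral-norm condition on the matrix of absolute couplings (a sufficient condition for rapid mixing of single-site dynamics, cf. Hayes [8] and Dyer et al. [4]) and a simple rescaling as the projection; the function names, the rescaling step, and the threshold c are illustrative assumptions, not the paper's exact divergence-based projections.

import numpy as np

def project_fast_mixing(J, c=0.99):
    # Rescale the symmetric coupling matrix J so that the spectral norm of
    # |J| is at most c < 1 (a sufficient condition for rapid mixing of
    # single-site Gibbs dynamics; this rescaling is an illustrative
    # stand-in for the paper's divergence-based projections).
    norm = np.linalg.norm(np.abs(J), 2)
    return J if norm <= c else J * (c / norm)

def gibbs_sample(J, h, n_steps, rng=None):
    # Single-site Gibbs sampler for an Ising model with couplings J,
    # fields h, and spins in {-1, +1}.
    rng = np.random.default_rng() if rng is None else rng
    n = len(h)
    x = rng.choice(np.array([-1, 1]), size=n)
    for _ in range(n_steps):
        i = rng.integers(n)
        # Local field at site i, excluding any self-coupling.
        local = h[i] + J[i] @ x - J[i, i] * x[i]
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * local))
        x[i] = 1 if rng.random() < p_plus else -1
    return x

# Example: a strongly coupled 10-spin model, projected before sampling.
rng = np.random.default_rng(0)
J = rng.normal(scale=1.5, size=(10, 10))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
h = rng.normal(size=10)
sample = gibbs_sample(project_fast_mixing(J), h, n_steps=5000, rng=rng)
print(sample)

A full treatment would choose the projection under a divergence that also accounts for the fields and the target distribution; the rescaling above only illustrates enforcing a fast-mixing constraint before sampling.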


reference text

[1] Richard H. Byrd, Peihuang Lu, Jorge Nocedal, and Ciyou Zhu. A limited memory algorithm for bound constrained optimization. SIAM J. Sci. Comput., 16(5):1190–1208, 1995.

[2] Mary Kathryn Cowles and Bradley P. Carlin. Markov chain Monte Carlo convergence diagnostics: A comparative review. Journal of the American Statistical Association, 91:883–904, 1996.

[3] John C. Duchi, Alekh Agarwal, Mikael Johansson, and Michael I. Jordan. Ergodic mirror descent. SIAM Journal on Optimization, 22(4):1549–1578, 2012.

[4] Martin E. Dyer, Leslie Ann Goldberg, and Mark Jerrum. Matrix norms and rapid mixing for spin systems. Ann. Appl. Probab., 19:71–107, 2009.

[5] Stuart Geman and Donald Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. Pattern Anal. Mach. Intell., 6(6):721–741, 1984.

[6] Amir Globerson and Tommi Jaakkola. Approximate inference using planar graph decomposition. In NIPS, pages 473–480, 2006.

[7] Firas Hamze and Nando de Freitas. From fields to trees. In UAI, 2004.

[8] Thomas P. Hayes. A simple condition implying rapid mixing of single-site dynamics on spin systems. In FOCS, pages 39–46, 2006.

[9] Tamir Hazan and Amnon Shashua. Convergent message-passing algorithms for inference over general graphs with convex free energies. In UAI, pages 264–273, 2008.

[10] David A. Levin, Yuval Peres, and Elizabeth L. Wilmer. Markov chains and mixing times. American Mathematical Society, 2006.

[11] Eyal Lubetzky and Allan Sly. Critical Ising on the square lattice mixes in polynomial time. Commun. Math. Phys., 313(3):815–836, 2012.

[12] Thomas Minka. Divergence measures and message passing. Technical report, 2005.

[13] Yuval Peres and Peter Winkler. Can extra updates delay mixing? arXiv:1112.0603, 2011.

[14] C. Peterson and J. R. Anderson. A mean field theory learning algorithm for neural networks. Complex Systems, 1:995–1019, 1987.

[15] Patrick Pletscher, Cheng S. Ong, and Joachim M. Buhmann. Spanning tree approximations for conditional random fields. In AISTATS, 2009.

[16] Lawrence K. Saul and Michael I. Jordan. Exploiting tractable substructures in intractable networks. In NIPS, pages 486–492, 1995.

[17] Charles Sutton and Andrew McCallum. Piecewise training for structured prediction. Machine Learning, 77:165–194, 2009.

[18] Robert H. Swendsen and Jian-Sheng Wang. Nonuniversal critical dynamics in Monte Carlo simulations. Phys. Rev. Lett., 58:86–88, Jan 1987.

[19] Martin Wainwright, Tommi Jaakkola, and Alan Willsky. A new class of upper bounds on the log partition function. IEEE Transactions on Information Theory, 51(7):2313–2335, 2005.

[20] Eric P. Xing, Michael I. Jordan, and Stuart Russell. A generalized mean field algorithm for variational inference in exponential families. In UAI, 2003.

[21] Jonathan Yedidia, William Freeman, and Yair Weiss. Constructing free energy approximations and generalized belief propagation algorithms. IEEE Transactions on Information Theory, 51:2282–2312, 2005.