
101 nips-2010-Gaussian sampling by local perturbations



Author: George Papandreou, Alan L. Yuille

Abstract: We present a technique for exact simulation of Gaussian Markov random fields (GMRFs), which can be interpreted as locally injecting noise into each Gaussian factor independently, followed by computing the mean/mode of the perturbed GMRF. Coupled with standard iterative techniques for the solution of symmetric positive definite systems, this yields a very efficient sampling algorithm with essentially linear complexity in both speed and memory requirements, well suited to extremely large-scale probabilistic models. Apart from synthesizing data under a Gaussian model, the proposed technique directly leads to an efficient unbiased estimator of marginal variances. Beyond Gaussian models, the proposed algorithm is also very useful for handling highly non-Gaussian continuously-valued MRFs, such as those arising in statistical image modeling or in the first layer of deep belief networks describing real-valued data, where the non-quadratic potentials coupling different sites can be represented as finite or infinite mixtures of Gaussians with the help of local or distributed latent mixture assignment variables. The Bayesian treatment of such models most naturally involves a block Gibbs sampler which alternately draws samples of the conditionally independent latent mixture assignments and of the conditionally multivariate Gaussian continuous vector, and we show that it can directly benefit from the proposed methods.
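As a concrete illustration of the perturb-then-optimize construction described above, the Python sketch below draws one exact GMRF sample, assuming the model is given in factor form, p(x) proportional to prod_j N(g_j^T x; mu_j, sigma2_j). The names (G, mu, sigma2, sample_gmrf_by_local_perturbation) are illustrative rather than taken from the paper, and conjugate gradients stands in for whichever iterative solver for symmetric positive definite systems one prefers; this is a minimal sketch, not the authors' implementation.

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg

    def sample_gmrf_by_local_perturbation(G, mu, sigma2, rng):
        """Draw one exact sample from the GMRF p(x) propto prod_j N(g_j^T x; mu_j, sigma2_j).

        G      : (m, n) sparse matrix whose row j holds the factor vector g_j^T
        mu     : (m,) vector of factor means
        sigma2 : (m,) vector of factor variances
        """
        W = sp.diags(1.0 / sigma2)                 # factor precisions
        A = (G.T @ W @ G).tocsc()                  # GMRF precision matrix A = G^T W G
        # Local perturbation: independently inject noise into every factor mean.
        mu_tilde = mu + np.sqrt(sigma2) * rng.standard_normal(mu.shape[0])
        # The mean/mode of the perturbed GMRF solves the SPD system A x = G^T W mu_tilde;
        # conjugate gradients keeps the cost essentially linear for sparse models.
        x, info = cg(A, G.T @ (mu_tilde / sigma2))
        if info != 0:
            raise RuntimeError("CG did not converge")
        return x

    # Tiny usage example: denoising a 1-D signal under a first-order smoothness prior.
    rng = np.random.default_rng(0)
    n = 200
    y = np.sin(np.linspace(0, 4 * np.pi, n)) + 0.3 * rng.standard_normal(n)
    D = sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1], shape=(n - 1, n))
    G = sp.vstack([sp.eye(n), D]).tocsr()          # data factors + smoothness factors
    mu = np.concatenate([y, np.zeros(n - 1)])
    sigma2 = np.concatenate([np.full(n, 0.3 ** 2), np.full(n - 1, 0.05 ** 2)])
    x = sample_gmrf_by_local_perturbation(G, mu, sigma2, rng)   # one posterior sample

Because the perturbed mode is x = A^{-1} G^T W mu_tilde with A = G^T W G and W = diag(1/sigma2_j), its mean and covariance match the target GMRF exactly; the empirical variance across repeated independent draws then provides the kind of unbiased Monte Carlo estimate of marginal variances mentioned in the abstract.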


References

[1] D. Ackley, G. Hinton, and T. Sejnowski. A learning algorithm for Boltzmann machines. Cogn. Science, 9(1):147–169, 1985.

[2] D. Andrews and C. Mallows. Scale mixtures of normal distributions. JRSS (B), 36(1):99–102, 1974.

[3] J. Besag. Spatial interaction and the statistical analysis of lattice systems. JRSS (B), 36(2):192–236, 1974.

[4] D. Geman and C. Yang. Nonlinear image recovery with half-quadratic regularization. IEEE Trans. Image Process., 4(7):932–946, 1995.

[5] S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. PAMI, 6(6):721–741, 1984.

[6] G. Golub and C. Van Loan. Matrix Computations. Johns Hopkins University Press, 1996.

[7] G. Hinton. Training products of experts by minimizing contrastive divergence. Neur. Comp., 14(8):1771–1800, 2002.

[8] A. Kokaram. Motion Picture Restoration. Springer, 1998.

[9] H. Lee, R. Grosse, R. Ranganath, and A. Y. Ng. Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations. In Proc. ICML, 2009.

[10] S. Lyu and E. Simoncelli. Modeling multiscale subbands of photographic images with fields of Gaussian scale mixtures. IEEE Trans. PAMI, 31(4):693–706, Apr. 2009.

[11] D. MacKay. Bayesian interpolation. Neur. Comp., 4(3):415–447, 1992.

[12] D. Malioutov, J. Johnson, M. Choi, and A. Willsky. Low-rank variance approximation in GMRF models: Single and multiscale approaches. IEEE Trans. Signal Process., 56(10):4621–4634, Oct. 2008.

[13] D. Malioutov, J. Johnson, and A. Willsky. Walk sums and belief propagation in Gaussian graphical models. J. of Mach. Learning Res., 7:2031–2064, 2006.

[14] M. Nikolova. Model distortions in Bayesian MAP reconstruction. Inv. Pr. and Imag., 1(2):399–422, 2007.

[15] C. Paige and M. Saunders. LSQR: An algorithm for sparse linear equations and sparse least squares. ACM Trans. on Math. Software, 8(1):43–71, 1982.

[16] G. Papandreou, P. Maragos, and A. Kokaram. Image inpainting with a wavelet domain hidden Markov tree model. In Proc. ICASSP, pages 773–776, 2008.

[17] T. Park and G. Casella. The Bayesian lasso. J. of the Amer. Stat. Assoc., 103(482):681–686, 2008.

[18] M. Ranzato, A. Krizhevsky, and G. Hinton. Factored 3-way restricted Boltzmann machines for modeling natural images. In Proc. AISTATS, 2010.

[19] S. Roth and M. Black. Fields of experts. Int. J. of Comp. Vis., 82(2):205–229, 2009.

[20] S. Roweis and Z. Ghahramani. A unifying review of linear Gaussian models. Neur. Comp., 11:305–345, 1999.

[21] H. Rue. Fast sampling of Gaussian Markov random fields. JRSS (B), 63(2):325–338, 2001.

[22] H. Rue and L. Held. Gaussian Markov Random Fields: Theory and Applications. Chapman & Hall/CRC, 2005.

[23] U. Schmidt, Q. Gao, and S. Roth. A generative perspective on MRFs in low-level vision. In CVPR, 2010.

[24] M. Schneider and A. Willsky. Krylov subspace estimation. SIAM J. Sci. Comp., 22(5):1840–1864, 2001.

[25] M. Seeger and H. Nickisch. Large scale variational inference and experimental design for sparse generalized linear models. Technical Report TR-175, MPI for Biological Cybernetics, 2008.

[26] M. Seeger, H. Nickisch, R. Pohmann, and B. Schölkopf. Bayesian experimental design of magnetic resonance imaging sequences. In NIPS, pages 1441–1448, 2008.

[27] J. Skilling. Bayesian numerical analysis. In W. Grandy and P. Milonni, editors, Physics and Probability, pages 207–221. Cambridge Univ. Press, 1993.

[28] E. Sudderth, M. Wainwright, and A. Willsky. Embedded trees: Estimation of Gaussian processes on graphs with cycles. IEEE Trans. Signal Process., 52(11):3136–3150, Nov. 2004.

[29] R. Szeliski. Bayesian modeling of uncertainty in low-level vision. Int. J. of Comp. Vis., 5(3):271–301, 1990.

[30] R. Szeliski and D. Terzopoulos. From splines to fractals. In Proc. ACM SIGGRAPH, pages 51–60, 1989.

[31] D. Terzopoulos. The computation of visible-surface representations. IEEE Trans. PAMI, 10(4):417–438, 1988.

[32] M. Tipping. Sparse Bayesian learning and the relevance vector machine. J. of Mach. Learning Res., 1:211–244, 2001.

[33] Y. Weiss and W. Freeman. Correctness of belief propagation in Gaussian graphical models of arbitrary topology. Neur. Comp., 13(10):2173–2200, 2001.

[34] Y. Weiss and W. Freeman. What makes a good model of natural images? In CVPR, 2007.

[35] M. Welling, G. Hinton, and S. Osindero. Learning sparse topographic representations with products of Student-t distributions. In NIPS, 2002.

[36] A. Willsky. Multiresolution Markov models for signal and image processing. Proc. IEEE, 90(8):1396–1458, 2002.

[37] S. Zhu, Y. Wu, and D. Mumford. Filters, random fields and maximum entropy (FRAME): Towards a unified theory for texture modeling. Int. J. of Comp. Vis., 27(2):107–126, 1998.