Title: Approximate Marginals in Latent Gaussian Models
Author: Botond Cseke, Tom Heskes
Abstract: We consider the problem of improving the Gaussian approximate posterior marginals computed by expectation propagation and the Laplace method in latent Gaussian models, and we propose methods that are similar in spirit to the Laplace approximation of Tierney and Kadane (1986). We show that for sparse Gaussian models the computational complexity of expectation propagation can be made comparable to that of the Laplace method by using a parallel updating scheme. In some cases, expectation propagation gives excellent estimates where the Laplace approximation fails. Inspired by bounds on the correct marginals, we arrive at factorized approximations, which can be applied on top of both expectation propagation and the Laplace method. The factorized approximations can give results nearly indistinguishable from those of the non-factorized approximations, and their computational complexity scales linearly with the number of variables. In our experiments, the expectation propagation based marginal approximations we introduce are typically more accurate than the methods of similar complexity proposed by Rue et al. (2009).

Keywords: approximate marginals, Gaussian Markov random fields, Laplace approximation, variational inference, expectation propagation
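The parallel updating scheme mentioned in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: it is a generic parallel expectation propagation pass on a toy Gaussian Markov random field prior with probit likelihoods, where all sites are refreshed from a single global Gaussian refit per sweep. The function name `parallel_ep_probit`, the toy chain model, and the damping value are illustrative assumptions, and the dense matrix inverse stands in for the sparse-matrix computations (e.g., Takahashi-style recursions for marginal variances) that make the scheme efficient in practice.

```python
import numpy as np
from math import erf, exp, pi, sqrt

def _npdf(z):
    """Standard normal density."""
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def _ncdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def parallel_ep_probit(Q, y, n_iter=100, damping=0.5):
    """Parallel EP for x ~ N(0, Q^{-1}) with probit likelihoods
    p(y_i | x_i) = Phi(y_i * x_i), y_i in {-1, +1}.

    All sites are updated from one global Gaussian refit per sweep,
    so the expensive linear algebra is done once per iteration.
    """
    n = len(y)
    tau = np.zeros(n)   # site precisions
    nu = np.zeros(n)    # site natural means (precision * mean)
    for _ in range(n_iter):
        # Global approximation has precision Q + diag(tau). A dense
        # inverse is used for readability; for sparse Q one would use a
        # sparse Cholesky factorization to get the marginal variances.
        Sigma = np.linalg.inv(Q + np.diag(tau))
        mu = Sigma @ nu
        s = np.diag(Sigma)
        new_tau, new_nu = np.empty(n), np.empty(n)
        for i in range(n):
            # Cavity distribution: remove site i from its marginal.
            tau_c = 1.0 / s[i] - tau[i]
            nu_c = mu[i] / s[i] - nu[i]
            m_c, v_c = nu_c / tau_c, 1.0 / tau_c
            # Moment matching against the probit factor.
            z = y[i] * m_c / sqrt(1.0 + v_c)
            r = _npdf(z) / _ncdf(z)
            m_t = m_c + y[i] * v_c * r / sqrt(1.0 + v_c)
            v_t = v_c - v_c * v_c * r * (z + r) / (1.0 + v_c)
            new_tau[i] = 1.0 / v_t - tau_c
            new_nu[i] = m_t / v_t - nu_c
        # Damping keeps the parallel scheme stable.
        tau = (1.0 - damping) * tau + damping * new_tau
        nu = (1.0 - damping) * nu + damping * new_nu
    Sigma = np.linalg.inv(Q + np.diag(tau))
    return Sigma @ nu, np.diag(Sigma)

# Toy chain GMRF: tridiagonal (sparse) precision, all observations +1.
Q = 2.0 * np.eye(4) - np.diag(np.ones(3), 1) - np.diag(np.ones(3), -1)
y = np.ones(4)
mu, var = parallel_ep_probit(Q, y)
```

With all observations equal to +1 and a positively coupled chain, the approximate posterior marginal means come out positive, as expected; a serial EP scheme would instead refit the Gaussian after every single site update.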
References:

P. R. Amestoy, T. A. Davis, and I. S. Duff. An approximate minimum degree ordering algorithm. SIAM Journal on Matrix Analysis and Applications, 17(4):886–905, October 1996.

A. Birlutiu and T. Heskes. Expectation propagation for rating players in sports competitions. In Joost N. Kok, Jacek Koronacki, Ramon López de Mántaras, Stan Matwin, Dunja Mladenic, and Andrzej Skowron, editors, Proceedings ECML/PKDD, volume 4702 of Lecture Notes in Computer Science, pages 374–381. Springer, 2007.

L. Csató and M. Opper. Sparse representation for Gaussian process models. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems 13, Cambridge, MA, USA, 2001. MIT Press.

P. Dangauthier, R. Herbrich, T. Minka, and T. Graepel. TrueSkill through time: Revisiting the history of chess. In J. C. Platt, D. Koller, Y. Singer, and S. Roweis, editors, Advances in Neural Information Processing Systems 20, pages 337–344. MIT Press, Cambridge, MA, 2008.

A. M. Erisman and W. F. Tinney. On computing certain elements of the inverse of a sparse matrix. Communications of the ACM, 18(3):177–179, 1975.

T. Heskes, M. Opper, W. Wiegerinck, O. Winther, and O. Zoeter. Approximate inference techniques with expectation constraints. Journal of Statistical Mechanics: Theory and Experiment, 2005:P11015, 2005.

S. Ingram. Minimum degree reordering algorithms: A tutorial, 2006. URL http://www.cs.ubc.ca/~sfingram/cs517_final.pdf.

M. Kuss and C. E. Rasmussen. Assessing approximate inference for binary Gaussian process classification. Journal of Machine Learning Research, 6:1679–1704, 2005.

S. Martino and H. Rue. Implementing approximate Bayesian inference using integrated nested Laplace approximation: a manual for the INLA program. Technical report, Department of Mathematical Sciences, NTNU, Norway, 2009.

T. P. Minka. A Family of Algorithms for Approximate Bayesian Inference. PhD thesis, MIT, 2001.

T. P. Minka. Divergence measures and message passing. Technical Report MSR-TR-2005-173, Microsoft Research Ltd., Cambridge, UK, December 2005.

K. Murphy, Y. Weiss, and M. I. Jordan. Loopy belief propagation for approximate inference: An empirical study. In Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence, pages 467–475, San Francisco, USA, 1999. Morgan Kaufmann.

I. Murray, R. P. Adams, and D. J. C. MacKay. Elliptical slice sampling. In Y. W. Teh and M. Titterington, editors, Proceedings of the 13th International Conference on Artificial Intelligence and Statistics, pages 541–548, 2010.

M. Opper and C. Archambeau. The variational Gaussian approximation revisited. Neural Computation, 21(3):786–792, 2009.

M. Opper and O. Winther. Gaussian processes for classification: Mean-field algorithms. Neural Computation, 12(11):2655–2684, 2000.

M. Opper, U. Paquet, and O. Winther. Improving on expectation propagation. In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, Advances in Neural Information Processing Systems 21, pages 1241–1248. MIT Press, Cambridge, MA, USA, 2009.

H. Rue and L. Held. Gaussian Markov Random Fields: Theory and Applications, volume 104 of Monographs on Statistics and Applied Probability. Chapman & Hall, London, UK, 2005.

H. Rue, S. Martino, and N. Chopin. Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations. Journal of the Royal Statistical Society (Series B), 71(2):319–392, 2009.

M. W. Seeger. Bayesian inference and optimal design for the sparse linear model. Journal of Machine Learning Research, 9:759–813, 2008.

K. Takahashi, J. Fagan, and M.-S. Chin. Formation of a sparse impedance matrix and its application to short circuit study. In Proceedings of the 8th PICA Conference, 1973.

L. Tierney and J. B. Kadane. Accurate approximations for posterior moments and marginal densities. Journal of the American Statistical Association, 81(393):82–86, 1986.

M. van Gerven, B. Cseke, R. Oostenveld, and T. Heskes. Bayesian source localization with the multivariate Laplace prior. In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, and A. Culotta, editors, Advances in Neural Information Processing Systems 22, pages 1901–1909, 2009.

M. van Gerven, B. Cseke, F. de Lange, and T. Heskes. Efficient Bayesian multivariate fMRI analysis using a sparsifying spatio-temporal prior. NeuroImage, 50(1):150–161, March 2010.

O. Zoeter and T. Heskes. Gaussian quadrature based expectation propagation. In Z. Ghahramani and R. Cowell, editors, Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics, pages 445–452. Society for Artificial Intelligence and Statistics, 2005.