
NIPS 2012, Paper 41: Ancestor Sampling for Particle Gibbs


Source: pdf

Author: Fredrik Lindsten, Thomas Schön, Michael I. Jordan

Abstract: We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PG-AS). Similarly to the existing PG with backward simulation (PG-BS) procedure, we use backward sampling to (considerably) improve the mixing of the PG kernel. Instead of using separate forward and backward sweeps as in PG-BS, however, we achieve the same effect in a single forward sweep. We apply the PG-AS framework to the challenging class of non-Markovian state-space models. We develop a truncation strategy for these models that is applicable in principle to any backward-simulation-based method, but which is particularly well suited to the PG-AS framework. In particular, as we show in a simulation study, PG-AS can yield an order-of-magnitude improvement in accuracy relative to PG-BS due to its robustness to the truncation error. Several application examples are discussed, including Rao-Blackwellized particle smoothing and inference in degenerate state-space models.
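The abstract's key idea is that ancestor sampling replaces PG-BS's separate backward sweep: at each step of the conditional particle filter, the retained reference particle's ancestor is resampled with weights proportional to the filter weights times the transition density into the reference state. The following is a minimal illustrative sketch of one such sweep for a scalar linear-Gaussian model; the model, parameter values (`a`, `q`, `r`), and the function name `pgas_sweep` are assumptions for this example, not the paper's implementation.

```python
import numpy as np

def pgas_sweep(y, x_ref, N=20, a=0.9, q=1.0, r=1.0, rng=None):
    """One conditional-particle-filter sweep with ancestor sampling for
    the assumed model x_t = a*x_{t-1} + v_t, y_t = x_t + e_t,
    with v_t ~ N(0, q) and e_t ~ N(0, r).

    y      : observations, shape (T,)
    x_ref  : reference trajectory from the previous MCMC iteration, shape (T,)
    Returns one newly sampled state trajectory, shape (T,).
    """
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    x = np.zeros((T, N))
    anc = np.zeros((T, N), dtype=int)
    # Initialize particles; slot N-1 always holds the retained reference state.
    x[0] = rng.normal(0.0, np.sqrt(q), N)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0]) ** 2 / r
    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Ordinary multinomial resampling for particles 0..N-2.
        anc[t, :-1] = rng.choice(N, N - 1, p=w)
        # Ancestor sampling (the key PG-AS step): draw the reference
        # particle's ancestor with weights w^i * p(x_ref[t] | x_{t-1}^i).
        logw_as = np.log(w) - 0.5 * (x_ref[t] - a * x[t - 1]) ** 2 / q
        w_as = np.exp(logw_as - logw_as.max())
        w_as /= w_as.sum()
        anc[t, -1] = rng.choice(N, p=w_as)
        # Propagate particles, keeping the reference state in slot N-1.
        x[t] = a * x[t - 1, anc[t]] + rng.normal(0.0, np.sqrt(q), N)
        x[t, -1] = x_ref[t]
        logw = -0.5 * (y[t] - x[t]) ** 2 / r
    # Sample one particle at the final time and trace its ancestry back.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in range(T - 1, -1, -1):
        traj[t] = x[t, k]
        k = anc[t, k]
    return traj
```

Because the ancestor of the reference particle is resampled on the fly, the backward-sampling effect is obtained within the single forward sweep, which is what lets the method handle truncated non-Markovian models more robustly than a separate backward pass.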


reference text

[1] C. Andrieu, A. Doucet, and R. Holenstein, “Particle Markov chain Monte Carlo methods,” Journal of the Royal Statistical Society: Series B, vol. 72, no. 3, pp. 269–342, 2010.

[2] N. Whiteley, C. Andrieu, and A. Doucet, “Efficient Bayesian inference for switching state-space models using discrete particle Markov chain Monte Carlo methods,” Bristol Statistics Research Report 10:04, Tech. Rep., 2010.

[3] F. Lindsten and T. B. Schön, “On the use of backward simulation in the particle Gibbs sampler,” in Proceedings of the 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Kyoto, Japan, Mar. 2012.

[4] A. Doucet and A. Johansen, “A tutorial on particle filtering and smoothing: Fifteen years later,” in The Oxford Handbook of Nonlinear Filtering, D. Crisan and B. Rozovsky, Eds. Oxford University Press, 2011.

[5] M. K. Pitt and N. Shephard, “Filtering via simulation: Auxiliary particle filters,” Journal of the American Statistical Association, vol. 94, no. 446, pp. 590–599, 1999.

[6] D. A. V. Dyk and T. Park, “Partially collapsed Gibbs samplers: Theory and methods,” Journal of the American Statistical Association, vol. 103, no. 482, pp. 790–796, 2008.

[7] N. Whiteley, “Discussion on Particle Markov chain Monte Carlo methods,” Journal of the Royal Statistical Society: Series B, vol. 72, no. 3, pp. 306–307, 2010.

[8] R. Chen and J. S. Liu, “Mixture Kalman filters,” Journal of the Royal Statistical Society: Series B, vol. 62, no. 3, pp. 493–508, 2000.

[9] A. Doucet, S. J. Godsill, and C. Andrieu, “On sequential Monte Carlo sampling methods for Bayesian filtering,” Statistics and Computing, vol. 10, no. 3, pp. 197–208, 2000.

[10] T. Schön, F. Gustafsson, and P.-J. Nordlund, “Marginalized particle filters for mixed linear/nonlinear state-space models,” IEEE Transactions on Signal Processing, vol. 53, no. 7, pp. 2279–2289, Jul. 2005.

[11] S. Särkkä, P. Bunch, and S. Godsill, “A backward-simulation based Rao-Blackwellized particle smoother for conditionally linear Gaussian models,” in Proceedings of the 16th IFAC Symposium on System Identification, Brussels, Belgium, Jul. 2012.

[12] W. Fong, S. J. Godsill, A. Doucet, and M. West, “Monte Carlo smoothing with application to audio signal enhancement,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 438–449, Feb. 2002.

[13] N. L. Hjort, C. Holmes, P. Müller, and S. G. Walker, Eds., Bayesian Nonparametrics. Cambridge University Press, 2010.

[14] S. N. MacEachern, M. Clyde, and J. S. Liu, “Sequential importance sampling for nonparametric Bayes models: The next generation,” The Canadian Journal of Statistics, vol. 27, no. 2, pp. 251–267, 1999.

[15] P. Fearnhead, “Particle filters for mixture models with an unknown number of components,” Statistics and Computing, vol. 14, pp. 11–21, 2004.

[16] C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning. MIT Press, 2006.

[17] A. Bouchard-Côté, S. Sankararaman, and M. I. Jordan, “Phylogenetic inference via sequential Monte Carlo,” Systematic Biology, vol. 61, no. 4, pp. 579–593, 2012.

[18] Y. W. Teh, H. Daumé III, and D. Roy, “Bayesian agglomerative clustering with coalescents,” Advances in Neural Information Processing Systems, pp. 1473–1480, 2008.

[19] G. J. Bierman, “Fixed interval smoothing with discrete measurements,” International Journal of Control, vol. 18, no. 1, pp. 65–75, 1973.

[20] F. Lindsten, M. I. Jordan, and T. B. Schön, “Ancestor sampling for particle Gibbs,” arXiv.org, arXiv:1210.6911, Oct. 2012.

[21] S. J. Godsill, A. Doucet, and M. West, “Monte Carlo smoothing for nonlinear time series,” Journal of the American Statistical Association, vol. 99, no. 465, pp. 156–168, Mar. 2004.