
205 nips-2012-MCMC for continuous-time discrete-state systems


Source: pdf

Authors: Vinayak Rao, Yee W. Teh

Abstract: We propose a simple and novel framework for MCMC inference in continuous-time discrete-state systems with pure jump trajectories. We construct an exact MCMC sampler for such systems by alternately sampling a random discretization of time given a trajectory of the system, and then a new trajectory given the discretization. The first step can be performed efficiently using properties of the Poisson process, while the second step can avail of discrete-time MCMC techniques based on the forward-backward algorithm. We show the advantage of our approach compared to particle MCMC and a uniformization-based sampler.
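As a rough illustration of the two alternating steps described in the abstract, the following is a minimal Python/NumPy sketch for the simplest special case, a Markov jump process with rate matrix A, discretized via uniformization with rate Omega. The function name resample_mjp_trajectory, the choice Omega = 2 * (maximum leaving rate), and the omission of observations are all illustrative assumptions for this sketch, not the paper's general construction (which extends beyond the Markov case to other pure-jump processes).

```python
import numpy as np


def resample_mjp_trajectory(A, T_end, jump_times, states, rng, Omega=None):
    """One sweep of the alternating sampler for a Markov jump process
    (illustrative sketch; no observations, so the forward-backward step
    samples from the prior conditioned on the random grid).

    A          : (n, n) rate matrix with rows summing to zero
    jump_times : current jump times in (0, T_end), sorted
    states     : states on the intervals between jumps
                 (len(states) == len(jump_times) + 1)
    """
    n = A.shape[0]
    if Omega is None:
        Omega = 2.0 * np.max(-np.diag(A))   # any Omega >= max leaving rate works
    B = np.eye(n) + A / Omega               # uniformized transition matrix

    # Step 1: given the trajectory, sample extra candidate times from a
    # Poisson process with rate Omega - |A[s, s]| on each constant-state interval,
    # and take the union with the current jump times.
    extra = []
    edges = np.concatenate(([0.0], np.asarray(jump_times, dtype=float), [T_end]))
    for (lo, hi), s in zip(zip(edges[:-1], edges[1:]), states):
        k = rng.poisson((Omega + A[s, s]) * (hi - lo))
        extra.extend(rng.uniform(lo, hi, size=k))
    grid = np.sort(np.concatenate([np.array(extra), np.asarray(jump_times, dtype=float)]))

    # Step 2: given the grid, forward-filter backward-sample a state sequence
    # for the discrete-time chain with transition matrix B. With observations,
    # each alpha update would also be multiplied by a per-interval likelihood.
    m = len(grid) + 1                        # number of intervals / latent states
    alpha = np.zeros((m, n))
    alpha[0] = np.full(n, 1.0 / n)           # assumed uniform initial distribution
    for i in range(1, m):
        alpha[i] = alpha[i - 1] @ B
        alpha[i] /= alpha[i].sum()
    new_states = np.empty(m, dtype=int)
    new_states[-1] = rng.choice(n, p=alpha[-1])
    for i in range(m - 2, -1, -1):
        p = alpha[i] * B[:, new_states[i + 1]]
        new_states[i] = rng.choice(n, p=p / p.sum())

    # Drop self-transitions so the result is again a pure jump trajectory.
    keep = new_states[1:] != new_states[:-1]
    return grid[keep], np.concatenate(([new_states[0]], new_states[1:][keep]))
```

Repeated sweeps of a kernel of this form leave the trajectory distribution invariant; the same skeleton accommodates data by weighting the forward messages with interval likelihoods, as the abstract's forward-backward step suggests.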


reference text

[1] Ryan P. Adams, Iain Murray, and David J. C. MacKay. Tractable nonparametric Bayesian inference in Poisson processes with Gaussian process intensities. In Proceedings of the 26th International Conference on Machine Learning (ICML), 2009.

[2] Y. W. Teh, C. Blundell, and L. T. Elliott. Modelling genetic variations with fragmentation-coagulation processes. In Advances in Neural Information Processing Systems, 2011.

[3] U. Nodelman, C.R. Shelton, and D. Koller. Continuous time Bayesian networks. In Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence (UAI), pages 378–387, 2002.

[4] Ardavan Saeedi and Alexandre Bouchard-Côté. Priors over Recurrent Continuous Time Processes. In Advances in Neural Information Processing Systems 24 (NIPS), volume 24, 2011.

[5] Matthias Hoffman, Hendrik Kueck, Nando de Freitas, and Arnaud Doucet. New inference strategies for solving Markov decision processes using reversible jump MCMC. In Proceedings of the Twenty-Fifth Conference on Uncertainty in Artificial Intelligence (UAI-09), pages 223–231, Corvallis, Oregon, 2009. AUAI Press.

[6] A. Doucet, N. de Freitas, and N. J. Gordon. Sequential Monte Carlo Methods in Practice. Statistics for Engineering and Information Science. New York: Springer-Verlag, May 2001.

[7] Frühwirth-Schnatter. Data augmentation and dynamic linear models. J. Time Ser. Anal., 15:183–202, 1994.

[8] C. K. Carter and R. Kohn. Markov chain Monte Carlo in conditionally Gaussian state space models. Biometrika, 83:589–601, 1996.

[9] Radford M. Neal, Matthew J. Beal, and Sam T. Roweis. Inferring state sequences for non-linear systems with embedded hidden Markov models. In Advances in Neural Information Processing Systems 16 (NIPS), volume 16, pages 401–408. MIT Press, 2004.

[10] J. Van Gael, Y. Saatci, Y. W. Teh, and Z. Ghahramani. Beam sampling for the infinite hidden Markov model. In Proceedings of the International Conference on Machine Learning, volume 25, 2008.

[11] M. Dewar, C. Wiggins, and F. Wood. Inference in hidden Markov models with explicit state duration distributions. IEEE Signal Processing Letters, to appear, 2012.

[12] Christophe Andrieu, Arnaud Doucet, and Roman Holenstein. Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society Series B, 72(3):269–342, 2010.

[13] V. Rao and Y. W. Teh. Fast MCMC sampling for Markov jump processes and continuous time Bayesian networks. In Proceedings of the International Conference on Uncertainty in Artificial Intelligence, 2011.

[14] William Feller. On semi-Markov processes. Proceedings of the National Academy of Sciences of the United States of America, 51(4):653–659, 1964.

[15] D. Sonderman. Comparing semi-Markov processes. Mathematics of Operations Research, 5(1):110–119, 1980.

[16] D. J. Daley and D. Vere-Jones. An Introduction to the Theory of Point Processes. Springer, 2008.

[17] J. F. C. Kingman. Poisson processes, volume 3 of Oxford Studies in Probability. The Clarendon Press Oxford University Press, New York, 1993. Oxford Science Publications.

[18] A. Beskos and G. O. Roberts. Exact simulation of diffusions. Annals of Applied Probability, 15(4):2422–2444, November 2005.

[19] Martyn Plummer, Nicky Best, Kate Cowles, and Karen Vines. CODA: Convergence diagnosis and output analysis for MCMC. R News, 6(1):7–11, March 2006.

[20] Andrew Golightly and Darren J. Wilkinson. Bayesian parameter inference for stochastic biochemical network models using particle Markov chain Monte Carlo. Interface Focus, 1(6):807–820, December 2011.

[21] S. Asmussen. Applied Probability and Queues. Applications of Mathematics. Springer, 2003.

[22] Stephen G. Walker. Sampling the Dirichlet mixture model with slices. Communications in Statistics - Simulation and Computation, 36:45, 2007.