
217 nips-2009-Sharing Features among Dynamical Systems with Beta Processes



Author: Alan S. Willsky, Erik B. Sudderth, Michael I. Jordan, Emily B. Fox

Abstract: We propose a Bayesian nonparametric approach to the problem of modeling related time series. Using a beta process prior, our approach is based on the discovery of a set of latent dynamical behaviors that are shared among multiple time series. The size of the set and the sharing pattern are both inferred from data. We develop an efficient Markov chain Monte Carlo inference method that is based on the Indian buffet process representation of the predictive distribution of the beta process. In particular, our approach uses the sum-product algorithm to efficiently compute Metropolis-Hastings acceptance probabilities, and explores new dynamical behaviors via birth/death proposals. We validate our sampling algorithm using several synthetic datasets, and also demonstrate promising results on unsupervised segmentation of visual motion capture data.
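The abstract notes that inference is built on the Indian buffet process (IBP) representation of the beta process's predictive distribution. As a hedged illustration only (not the authors' implementation), the standard IBP culinary metaphor can be sketched as follows: customer n samples each previously tasted dish k with probability m_k/n, then tries Poisson(alpha/n) new dishes. The function name `sample_ibp` and the mass parameter `alpha` are illustrative choices.

```python
import numpy as np

def sample_ibp(num_customers, alpha, rng=None):
    """Draw a binary feature matrix Z from the Indian buffet process.

    Customer n samples each existing dish k with probability m_k / n,
    where m_k counts prior customers who took dish k, then samples
    Poisson(alpha / n) brand-new dishes. Illustrative sketch only.
    """
    rng = np.random.default_rng(rng)
    dish_counts = []  # m_k for each dish seen so far
    rows = []         # per-customer sets of chosen dish indices
    for n in range(1, num_customers + 1):
        chosen = set()
        # Revisit existing dishes in proportion to their popularity.
        for k, m_k in enumerate(dish_counts):
            if rng.random() < m_k / n:
                chosen.add(k)
        # Try a Poisson(alpha / n) number of new dishes.
        for _ in range(rng.poisson(alpha / n)):
            chosen.add(len(dish_counts))
            dish_counts.append(0)
        for k in chosen:
            dish_counts[k] += 1
        rows.append(chosen)
    # Assemble the binary feature matrix (customers x dishes).
    Z = np.zeros((num_customers, len(dish_counts)), dtype=int)
    for n, chosen in enumerate(rows):
        Z[n, list(chosen)] = 1
    return Z
```

In the paper's setting each "customer" corresponds to a time series and each "dish" to a latent dynamical behavior, so Z encodes which behaviors each series shares.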


References

[1] J. Barbič, A. Safonova, J.-Y. Pan, C. Faloutsos, J.K. Hodgins, and N.S. Pollard. Segmenting motion capture data into distinct behaviors. In Proc. Graphics Interface, pages 185–194, 2004.

[2] M.J. Beal, Z. Ghahramani, and C.E. Rasmussen. The infinite hidden Markov model. In Advances in Neural Information Processing Systems, volume 14, pages 577–584, 2002.

[3] A.C. Courville, N. Daw, G.J. Gordon, and D.S. Touretzky. Model uncertainty in classical conditioning. In Advances in Neural Information Processing Systems, volume 16, pages 977–984, 2004.

[4] E.B. Fox, E.B. Sudderth, M.I. Jordan, and A.S. Willsky. An HDP-HMM for systems with state persistence. In Proc. International Conference on Machine Learning, July 2008.

[5] E.B. Fox, E.B. Sudderth, M.I. Jordan, and A.S. Willsky. Nonparametric Bayesian learning of switching dynamical systems. In Advances in Neural Information Processing Systems, volume 21, pages 457–464, 2009.

[6] A. Frigessi, P. Di Stefano, C.R. Hwang, and S.J. Sheu. Convergence rates of the Gibbs sampler, the Metropolis algorithm and other single-site updating dynamics. Journal of the Royal Statistical Society, Series B, pages 205–219, 1993.

[7] D. Görür, F. Jäkel, and C.E. Rasmussen. A choice model with infinitely many latent features. In Proc. International Conference on Machine Learning, June 2006.

[8] P.J. Green. Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82(4):711–732, 1995.

[9] T.L. Griffiths and Z. Ghahramani. Infinite latent feature models and the Indian buffet process. Gatsby Computational Neuroscience Unit, Technical Report #2005-001, 2005.

[10] N.L. Hjort. Nonparametric Bayes estimators based on beta processes in models for life history data. The Annals of Statistics, pages 1259–1294, 1990.

[11] E. Hsu, K. Pulli, and J. Popović. Style translation for human motion. In SIGGRAPH, pages 1082–1089, 2005.

[12] J. F. C. Kingman. Completely random measures. Pacific Journal of Mathematics, 21(1):59–78, 1967.

[13] N. Lawrence. MATLAB motion capture toolbox. http://www.cs.man.ac.uk/~neill/mocap/.

[14] J.S. Liu. Peskun’s theorem and a modified discrete-state Gibbs sampler. Biometrika, 83(3):681–682, 1996.

[15] E. Meeds, Z. Ghahramani, R.M. Neal, and S.T. Roweis. Modeling dyadic data with binary latent factors. In Advances in Neural Information Processing Systems, volume 19, pages 977–984, 2007.

[16] K.P. Murphy. Hidden Markov model (HMM) toolbox for MATLAB. http://www.cs.ubc.ca/~murphyk/Software/HMM/hmm.html.

[17] V. Pavlović, J.M. Rehg, T.J. Cham, and K.P. Murphy. A dynamic Bayesian network approach to figure tracking using learned dynamic models. In Proc. International Conference on Computer Vision, September 1999.

[18] V. Pavlović, J.M. Rehg, and J. MacCormick. Learning switching linear models of human motion. In Advances in Neural Information Processing Systems, volume 13, pages 981–987, 2001.

[19] L.R. Rabiner. A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2):257–286, 1989.

[20] G.W. Taylor, G.E. Hinton, and S.T. Roweis. Modeling human motion using binary latent variables. In Advances in Neural Information Processing Systems, volume 19, pages 1345–1352, 2007.

[21] Y.W. Teh, M.I. Jordan, M.J. Beal, and D.M. Blei. Hierarchical Dirichlet processes. Journal of the American Statistical Association, 101(476):1566–1581, 2006.

[22] R. Thibaux and M.I. Jordan. Hierarchical beta processes and the Indian buffet process. In Proc. International Conference on Artificial Intelligence and Statistics, volume 11, 2007.

[23] Carnegie Mellon University. Graphics lab motion capture database. http://mocap.cs.cmu.edu/.

[24] J. Van Gael, Y.W. Teh, and Z. Ghahramani. The infinite factorial hidden Markov model. In Advances in Neural Information Processing Systems, volume 21, pages 1697–1704, 2009.

[25] J.M. Wang, D.J. Fleet, and A. Hertzmann. Gaussian process dynamical models for human motion. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(2):283–298, 2008.

[26] M. West and J. Harrison. Bayesian Forecasting and Dynamic Models. Springer, 1997.

[27] F. Wood, T. L. Griffiths, and Z. Ghahramani. A non-parametric Bayesian method for inferring hidden causes. In Proc. Conference on Uncertainty in Artificial Intelligence, volume 22, 2006.