
NIPS 2012, Paper 121: Expectation Propagation in Gaussian Process Dynamical Systems



Author: Marc Deisenroth, Shakir Mohamed

Abstract: Rich and complex time-series data, such as those generated from engineering systems, financial markets, videos, or neural recordings, are now a common feature of modern data analysis. Explaining the phenomena underlying these diverse data sets requires flexible and accurate models. In this paper, we promote Gaussian process dynamical systems as a rich model class that is appropriate for such an analysis. We present a new approximate message-passing algorithm for Bayesian state estimation and inference in Gaussian process dynamical systems, a nonparametric probabilistic generalization of commonly used state-space models. We derive our message-passing algorithm using Expectation Propagation and provide a unifying perspective on message passing in general state-space models. We show that existing Gaussian filters and smoothers appear as special cases within our inference framework, and that these existing approaches can be improved upon using iterated message passing. Using both synthetic and real-world data, we demonstrate that iterated message passing can improve inference in a wide range of tasks in Bayesian state estimation, thus leading to improved predictions and more effective decision making.
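To make the model class concrete: a Gaussian process dynamical system is a state-space model whose transition and observation functions are GP-distributed, with Gaussian noise on states and measurements. The sketch below simulates such a generative model; it is an illustration only, not the paper's inference algorithm, and the particular functions f and g (stand-ins for draws from GP priors), the noise levels, and the sequence length are assumptions chosen for the example.

```python
import numpy as np

# Generative model of a (scalar) GP dynamical system:
#   x_{t+1} = f(x_t) + process noise,   y_t = g(x_t) + observation noise.
# f and g below are fixed nonlinear stand-ins for functions drawn from
# GP priors; in the paper they would carry GP uncertainty themselves.

rng = np.random.default_rng(0)

def f(x):
    # transition function (illustrative stand-in for a GP draw)
    return 0.9 * np.sin(x)

def g(x):
    # observation function (illustrative stand-in for a GP draw)
    return 0.5 * x**2

T = 50            # assumed sequence length
q, r = 0.05, 0.1  # assumed process / observation noise std devs

x = np.empty(T)   # latent states
y = np.empty(T)   # observations
x[0] = rng.normal()
for t in range(T):
    y[t] = g(x[t]) + r * rng.normal()
    if t + 1 < T:
        x[t + 1] = f(x[t]) + q * rng.normal()
```

Bayesian state estimation in this model means recovering p(x_t | y_1, ..., y_T); the paper's EP message-passing scheme iterates Gaussian forward/backward messages over exactly this chain structure, recovering standard Gaussian filters and smoothers after a single sweep.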


References

[1] B. D. O. Anderson and J. B. Moore. Optimal Filtering. Dover Publications, 2005.

[2] A. Damianou, M. K. Titsias, and N. D. Lawrence. Variational Gaussian Process Dynamical Systems. In Advances in Neural Information Processing Systems. 2011.

[3] M. P. Deisenroth, M. F. Huber, and U. D. Hanebeck. Analytic Moment-based Gaussian Process Filtering. In Proceedings of the 26th International Conference on Machine Learning, pages 225–232. Omnipress, 2009.

[4] M. P. Deisenroth and S. Mohamed. Expectation Propagation in Gaussian Process Dynamical Systems: Extended Version, 2012. http://arxiv.org/abs/1207.2940.

[5] M. P. Deisenroth, R. Turner, M. Huber, U. D. Hanebeck, and C. E. Rasmussen. Robust Filtering and Smoothing with Gaussian Processes. IEEE Transactions on Automatic Control, 2012.

[6] T. Heskes and O. Zoeter. Expectation Propagation for Approximate Inference in Dynamic Bayesian Networks. In Proceedings of the International Conference on Uncertainty in Artificial Intelligence, pages 216–233, 2002.

[7] S. J. Julier and J. K. Uhlmann. Unscented Filtering and Nonlinear Estimation. Proceedings of the IEEE, 92(3):401–422, March 2004.

[8] J. Ko and D. Fox. GP-BayesFilters: Bayesian Filtering using Gaussian Process Prediction and Observation Models. Autonomous Robots, 27(1):75–90, 2009.

[9] M. Kuss and C. E. Rasmussen. Assessing Approximate Inference for Binary Gaussian Process Classification. Journal of Machine Learning Research, 6:1679–1704, 2005.

[10] T. P. Minka. Expectation Propagation for Approximate Bayesian Inference. In Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence, pages 362–369. Morgan Kaufman Publishers, 2001.

[11] T. P. Minka. A Family of Algorithms for Approximate Bayesian Inference. PhD thesis, Massachusetts Institute of Technology, 2001.

[12] T. P. Minka. EP: A Quick Reference. 2008.

[13] Y. Qi and T. Minka. Expectation Propagation for Signal Detection in Flat-Fading Channels. In Proceedings of the IEEE International Symposium on Information Theory, 2003.

[14] J. Quiñonero-Candela, A. Girard, J. Larsen, and C. E. Rasmussen. Propagation of Uncertainty in Bayesian Kernel Models—Application to Multiple-Step Ahead Forecasting. In IEEE International Conference on Acoustics, Speech and Signal Processing, pages 701–704, 2003.

[15] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. The MIT Press, 2006.

[16] M. W. Seeger. Expectation Propagation for Exponential Families. Technical report, University of California, Berkeley, 2005.

[17] M. W. Seeger. Bayesian Inference and Optimal Design for the Sparse Linear Model. Journal of Machine Learning Research, 9:759–813, 2008.

[18] M. Toussaint and C. Goerick. From Motor Learning to Interaction Learning in Robotics, chapter A Bayesian View on Motor Control and Planning, pages 227–252. Springer-Verlag, 2010.

[19] R. Turner, M. P. Deisenroth, and C. E. Rasmussen. State-Space Inference and Learning with Gaussian Processes. In Proceedings of the International Conference on Artificial Intelligence and Statistics, volume 9 of JMLR: W&CP, pages 868–875, 2010.

[20] J. M. Wang, D. J. Fleet, and A. Hertzmann. Gaussian Process Dynamical Models for Human Motion. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(2):283–298, 2008.