nips nips2012 nips2012-11 nips2012-11-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Yali Wang, Brahim Chaib-draa
Abstract: We present a novel marginalized particle Gaussian process (MPGP) regression, which provides a fast, accurate online Bayesian filtering framework for modeling the latent function. Using a state space model established by the data construction procedure, our MPGP recursively estimates the hidden function values as a Gaussian mixture. Meanwhile, it provides a new online method for training hyperparameters with a set of weighted particles. We demonstrate the estimation performance of our MPGP on both simulated and real large data sets. The results show that our MPGP is a robust estimation algorithm with high computational efficiency, which outperforms other state-of-the-art sparse GP methods.
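The abstract describes two coupled inferences: conditioned on a hyperparameter particle, the hidden function values can be handled analytically, while the hyperparameters themselves are tracked by importance-weighted particles. Below is a minimal sketch of that general marginalized (Rao-Blackwellised) particle filtering idea for online GP regression, not the authors' MPGP algorithm: for clarity, each particle here re-solves a full GP on the data seen so far instead of using the paper's recursive state-space updates, particles are kept fixed after sampling (no rejuvenation such as the Liu-West kernel smoothing of [8]), and all function names, priors, and kernel choices are illustrative assumptions.

import numpy as np

def rbf_kernel(X1, X2, log_ell, log_sf):
    # Squared-exponential kernel for 1-D inputs (log lengthscale, log signal std).
    ell, sf = np.exp(log_ell), np.exp(log_sf)
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)

def gp_predict(X, y, Xstar, theta):
    # Exact GP predictive mean and (noise-inclusive) variance under hyperparameters theta.
    log_ell, log_sf, log_sn = theta
    sn2 = np.exp(2.0 * log_sn)
    K = rbf_kernel(X, X, log_ell, log_sf) + (sn2 + 1e-8) * np.eye(len(X))
    ks = rbf_kernel(X, Xstar, log_ell, log_sf)
    kss = rbf_kernel(Xstar, Xstar, log_ell, log_sf)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, ks)
    mean = ks.T @ alpha
    var = np.diag(kss) - np.sum(v ** 2, axis=0) + sn2
    return mean, var

def mpgp_sketch(xs, ys, n_particles=50, seed=0):
    # Online regression: hyperparameter particles + analytic GP conditionals (a sketch).
    rng = np.random.default_rng(seed)
    # Each particle is (log lengthscale, log signal std, log noise std),
    # drawn from an assumed standard-normal prior.
    particles = rng.normal(0.0, 1.0, size=(n_particles, 3))
    weights = np.full(n_particles, 1.0 / n_particles)
    X_seen, y_seen = [], []
    for x_t, y_t in zip(xs, ys):
        if X_seen:
            X, y = np.array(X_seen), np.array(y_seen)
            for i, theta in enumerate(particles):
                mu, var = gp_predict(X, y, np.array([x_t]), theta)
                # Reweight by the one-step-ahead predictive likelihood p(y_t | past, theta).
                weights[i] *= np.exp(-0.5 * (y_t - mu[0]) ** 2 / var[0]) / np.sqrt(2.0 * np.pi * var[0])
            weights /= max(weights.sum(), 1e-300)
            # Resample when the effective sample size degenerates.
            if 1.0 / np.sum(weights ** 2) < n_particles / 2:
                idx = rng.choice(n_particles, size=n_particles, p=weights)
                particles = particles[idx]
                weights = np.full(n_particles, 1.0 / n_particles)
        X_seen.append(x_t)
        y_seen.append(y_t)
    return particles, weights

A prediction at a new test input would then be the weight-averaged mixture of the per-particle GP predictive Gaussians, matching the Gaussian-mixture form described in the abstract; the per-step full-GP recomputation used here for clarity is what the paper's recursive filtering is designed to avoid.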
[1] C. E. Rasmussen, C. K. I. Williams, Gaussian Processes for Machine Learning, MIT Press, Cambridge, MA, 2006.
[2] E. Snelson, Z. Ghahramani, Sparse Gaussian processes using pseudo-inputs, in: NIPS, 2006, pp. 1257–1264.
[3] M. Lázaro-Gredilla, J. Quiñonero-Candela, C. E. Rasmussen, A. R. Figueiras-Vidal, Sparse spectrum Gaussian process regression, Journal of Machine Learning Research 11 (2010) 1865–1881.
[4] S. Reece, S. Roberts, An introduction to Gaussian processes for the Kalman filter expert, in: FUSION, 2010.
[5] R. M. Neal, Monte Carlo implementation of Gaussian process models for Bayesian regression and classification, Tech. rep., Department of Statistics, University of Toronto (1997).
[6] D. J. C. MacKay, Introduction to Gaussian processes, in: Neural Networks and Machine Learning, 1998, pp. 133–165.
[7] M. P. Deisenroth, Efficient reinforcement learning using Gaussian processes, Ph.D. thesis, Karlsruhe Institute of Technology (2010).
[8] J. Liu, M. West, Combined parameter and state estimation in simulation-based filtering, in: Sequential Monte Carlo Methods in Practice, 2001, pp. 197–223.
[9] P. Li, R. Goodall, V. Kadirkamanathan, Estimation of parameters in a linear state space model using a Rao-Blackwellised particle filter, IEE Proceedings - Control Theory and Applications 151 (2004) 727–738.
[10] N. Kantas, A. Doucet, S. S. Singh, J. M. Maciejowski, An overview of sequential Monte Carlo methods for parameter estimation in general state space models, in: 15th IFAC Symposium on System Identification, 2009.
[11] A. Doucet, N. de Freitas, K. Murphy, S. Russell, Rao-Blackwellised particle filtering for dynamic Bayesian networks, in: UAI, 2000, pp. 176–183.
[12] N. de Freitas, Rao-Blackwellised particle filtering for fault diagnosis, in: IEEE Aerospace Conference Proceedings, 2002, pp. 1767–1772.
[13] T. Schön, F. Gustafsson, P.-J. Nordlund, Marginalized particle filters for mixed linear/nonlinear state-space models, IEEE Transactions on Signal Processing 53 (2005) 2279–2289.
[14] J. Ko, D. Fox, GP-BayesFilters: Bayesian filtering using Gaussian process prediction and observation models, in: IROS, 2008, pp. 3471–3476.
[15] M. P. Deisenroth, R. Turner, M. F. Huber, U. D. Hanebeck, C. E. Rasmussen, Robust filtering and smoothing with Gaussian processes, IEEE Transactions on Automatic Control.
[16] I. DiMatteo, C. R. Genovese, R. E. Kass, Bayesian curve fitting with free-knot splines, Biometrika 88 (2001) 1055–1071.
[17] S. A. Wood, Bayesian mixture of splines for spatially adaptive nonparametric regression, Biometrika 89 (2002) 513–528.