
100 nips-2011-Gaussian Process Training with Input Noise


Source: pdf

Author: Andrew McHutchon, Carl E. Rasmussen

Abstract: In standard Gaussian Process regression, input locations are assumed to be noise-free. We present a simple yet effective GP model for training on input points corrupted by i.i.d. Gaussian noise. To make computations tractable, we use a local linear expansion about each input point. This allows the input noise to be recast as output noise proportional to the squared gradient of the GP posterior mean. The input noise variances are inferred from the data as extra hyperparameters and are trained alongside the other hyperparameters by the usual method of maximisation of the marginal likelihood. Training uses an iterative scheme, which alternates between optimising the hyperparameters and calculating the posterior gradient. Analytic predictive moments can then be found for Gaussian-distributed test points. We compare our model to others over a range of regression problems and show that it improves over current methods.
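The following is a minimal NumPy sketch of the iterative scheme the abstract describes, under simplifying assumptions not taken from the paper: a one-dimensional input, a squared-exponential kernel, and hyperparameters held fixed (the paper re-optimises them by marginal likelihood at each iteration). The function name `nigp_corrected_noise` and all parameter names are illustrative, not the authors' code.

```python
import numpy as np

def rbf(x1, x2, ell, sf2):
    """Squared-exponential kernel k(x, x') = sf2 * exp(-(x - x')^2 / (2 ell^2))."""
    d = x1[:, None] - x2[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def nigp_corrected_noise(x, y, ell, sf2, sn2, sx2, n_iter=5):
    """Alternate between fitting the GP and recomputing the corrective output
    noise sx2 * (d mean / dx)^2 that stands in for the input noise (the paper's
    local linearisation). Hyperparameters (ell, sf2, sn2, sx2) are held fixed
    here for brevity; the paper learns them by maximising the marginal likelihood."""
    K = rbf(x, x, ell, sf2)
    extra = np.zeros_like(y)                       # corrective noise, initially zero
    for _ in range(n_iter):
        Ky = K + np.diag(sn2 + extra)
        alpha = np.linalg.solve(Ky, y)
        # Gradient of the posterior mean m(x*) = k(x*, x) @ alpha at x* = x_i,
        # using d/dx* k(x*, x_j) = -(x* - x_j) / ell^2 * k(x*, x_j).
        dK = -((x[:, None] - x[None, :]) / ell ** 2) * K
        grad = dK @ alpha
        extra = sx2 * grad ** 2                    # input noise referred to the output
    return extra, alpha

# Toy usage: targets come from the latent inputs, but training sees noisy inputs.
rng = np.random.default_rng(0)
x_true = np.sort(rng.uniform(-3.0, 3.0, 40))
x_obs = x_true + rng.normal(0.0, 0.3, 40)          # i.i.d. Gaussian input noise
y = np.sin(x_true) + rng.normal(0.0, 0.1, 40)
extra, alpha = nigp_corrected_noise(x_obs, y, ell=1.0, sf2=1.0, sn2=0.01, sx2=0.09)
x_star = np.linspace(-3.0, 3.0, 200)
mean_star = rbf(x_star, x_obs, 1.0, 1.0) @ alpha   # posterior mean at test points
```

Note how the corrective term is largest where the posterior mean is steep, which is exactly where input noise does the most damage to the outputs.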


reference text

[1] Carl Edward Rasmussen and Christopher K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.

[2] Paul W. Goldberg, Christopher K. I. Williams, and Christopher M. Bishop. Regression with input-dependent noise: A Gaussian Process treatment. NIPS-98, 1998.

[3] Kristian Kersting, Christian Plagemann, Patrick Pfaff, and Wolfram Burgard. Most likely heteroscedastic Gaussian Process regression. ICML-07, 2007.

[4] Ming Yuan and Grace Wahba. Doubly penalized likelihood estimator in heteroscedastic regression. Statistics and Probability Letters, 69:11–20, 2004.

[5] Quoc V. Le, Alex J. Smola, and Stephane Canu. Heteroscedastic Gaussian Process regression. Proceedings of ICML-05, pages 489–496, 2005.

[6] Edward Snelson and Zoubin Ghahramani. Variable noise and dimensionality reduction for sparse Gaussian processes. Proceedings of UAI-06, 2006.

[7] A.G. Wilson and Z. Ghahramani. Copula processes. In J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R.S. Zemel, and A. Culotta, editors, Advances in Neural Information Processing Systems 23, pages 2460–2468. 2010.

[8] Andrew Wilson and Zoubin Ghahramani. Generalised Wishart Processes. In Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence (UAI-11), pages 736–744, Corvallis, Oregon, 2011. AUAI Press.

[9] P. Dallaire, C. Besse, and B. Chaib-draa. Learning Gaussian Process Models from Uncertain Data. 16th International Conference on Neural Information Processing, 2008.

[10] E. Solak, R. Murray-Smith, W.E. Leithead, D.J. Leith, and C.E. Rasmussen. Derivative observations in Gaussian Process models of dynamic systems. NIPS-03, pages 1033–1040, 2003.

[11] Agathe Girard, Carl Edward Rasmussen, Joaquin Quinonero Candela, and Roderick Murray-Smith. Gaussian Process priors with uncertain inputs - application to multiple-step ahead time series forecasting. Advances in Neural Information Processing Systems 16, 2003.