
50 nips-2004-Dependent Gaussian Processes


Source: pdf

Author: Phillip Boyle, Marcus Frean

Abstract: Gaussian processes are usually parameterised in terms of their covariance functions. However, this makes it difficult to deal with multiple outputs, because ensuring that the covariance matrix is positive definite is problematic. An alternative formulation is to treat Gaussian processes as white noise sources convolved with smoothing kernels, and to parameterise the kernel instead. Using this, we extend Gaussian processes to handle multiple, coupled outputs.
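As a rough illustration of the convolution view described in the abstract, the sketch below builds two coupled outputs in one dimension by (implicitly) convolving a single white-noise source with Gaussian smoothing kernels. For that kernel choice the auto- and cross-covariances have a simple closed form, which is used to assemble the joint covariance matrix and draw one correlated sample. The Gaussian kernel choice, parameter values, and names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Process-convolution sketch of dependent GPs (1-D, two outputs).
# Each output y_i is the same white-noise source convolved with a
# Gaussian kernel k_i(x) = v_i * exp(-0.5 * a_i * x**2); for these
# kernels the (cross-)covariances have the closed form used below.
# All parameter values here are illustrative.

def conv_cov(d, v_i, a_i, v_j, a_j):
    """Covariance between outputs i and j at separation d = x - x'."""
    pref = v_i * v_j * np.sqrt(2.0 * np.pi / (a_i + a_j))
    return pref * np.exp(-0.5 * (a_i * a_j / (a_i + a_j)) * d ** 2)

# Illustrative kernel parameters for the two outputs.
v1, a1 = 1.0, 2.0
v2, a2 = 0.8, 0.5

x = np.linspace(-3, 3, 50)
D = x[:, None] - x[None, :]          # pairwise separations

# Blocks of the joint covariance.  Because both outputs share the same
# underlying noise source, the cross-covariance block K12 is non-zero,
# which is what couples the outputs.
K11 = conv_cov(D, v1, a1, v1, a1)
K22 = conv_cov(D, v2, a2, v2, a2)
K12 = conv_cov(D, v1, a1, v2, a2)

K = np.block([[K11, K12],
              [K12.T, K22]])

# Add jitter for numerical stability, then draw one joint sample;
# the two halves are correlated draws of y1(x) and y2(x).
L = np.linalg.cholesky(K + 1e-8 * np.eye(K.shape[0]))
sample = L @ np.random.randn(K.shape[0])
y1, y2 = sample[:len(x)], sample[len(x):]
print(np.corrcoef(y1, y2)[0, 1])     # typically strongly correlated
```

The joint matrix K is positive definite by construction, since it is the covariance of a process defined through the shared noise source; with independent outputs the off-diagonal blocks would simply be zero.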


reference text

[1] Abrahamsen, P. A review of Gaussian random fields and correlation functions. Tech. Rep. 917, Norwegian Computing Center, Box 114, Blindern, N-0314 Oslo, Norway, 1997.

[2] Boyle, P., and Frean, M. Multiple-output Gaussian process regression. Tech. Rep., Victoria University of Wellington, 2004.

[3] Cressie, N. Statistics for Spatial Data. Wiley, 1993.

[4] Gibbs, M. Bayesian Gaussian Processes for Classification and Regression. PhD thesis, University of Cambridge, Cambridge, U.K., 1997.

[5] Gibbs, M., and MacKay, D. J. Efficient implementation of Gaussian processes. www.inference.phy.cam.ac.uk/mackay/abstracts/gpros.html, 1996.

[6] Gibbs, M. N., and MacKay, D. J. Variational Gaussian process classifiers. IEEE Trans. on Neural Networks 11, 6 (2000), 1458–1464.

[7] Higdon, D. Space and space-time modelling using process convolutions. In Quantitative Methods for Current Environmental Issues (2002), C. Anderson, V. Barnett, P. Chatwin, and A. El-Shaarawi, Eds., Springer Verlag, pp. 37–56.

[8] MacKay, D. J. Gaussian processes: A replacement for supervised neural networks? In NIPS97 Tutorial, 1997.

[9] MacKay, D. J. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.

[10] Neal, R. Probabilistic inference using Markov chain Monte Carlo methods. Tech. Rep. CRG-TR-93-1, Dept. of Computer Science, Univ. of Toronto, 1993.

[11] Neal, R. Monte Carlo implementation of Gaussian process models for Bayesian regression and classification. Tech. Rep. CRG-TR-97-2, Dept. of Computer Science, Univ. of Toronto, 1997.

[12] Paciorek, C. Nonstationary Gaussian Processes for Regression and Spatial Modelling. PhD thesis, Carnegie Mellon University, Pittsburgh, Pennsylvania, U.S.A., 2003.

[13] Paciorek, C., and Schervish, M. Nonstationary covariance functions for Gaussian process regression. Submitted to NIPS, 2004.

[14] Rasmussen, C., and Kuss, M. Gaussian processes in reinforcement learning. In Advances in Neural Information Processing Systems (2004), vol. 16.

[15] Rasmussen, C. E. Evaluation of Gaussian Processes and Other Methods for Non-Linear Regression. PhD thesis, Graduate Department of Computer Science, University of Toronto, 1996.

[16] Tipping, M. E., and Bishop, C. M. Bayesian image super-resolution. In Advances in Neural Information Processing Systems (2002), S. Becker, S. Thrun, and K. Obermayer, Eds., vol. 15, pp. 1303–1310.

[17] Williams, C. K., and Barber, D. Bayesian classification with Gaussian processes. IEEE Trans. Pattern Analysis and Machine Intelligence 20, 12 (1998), 1342–1351.

[18] Williams, C. K., and Rasmussen, C. E. Gaussian processes for regression. In Advances in Neural Information Processing Systems (1996), D. Touretzky, M. Mozer, and M. Hasselmo, Eds., vol. 8.