Title: Sparse Convolved Gaussian Processes for Multi-output Regression (NIPS 2008, paper 213)
Source: pdf
Author: Mauricio Alvarez, Neil D. Lawrence
Abstract: We present a sparse approximation approach for dependent output Gaussian processes (GPs). Employing a latent function framework, we apply the convolution process formalism to establish dependencies between output variables, where each latent function is represented as a GP. Based on these latent functions, we develop an approximation scheme that assumes conditional independence between the output processes given the latent functions, leading to an approximation of the full covariance that is determined by the locations at which the latent functions are evaluated. We demonstrate the proposed methodology on synthetic data and on real-world applications in pollution prediction and a sensor network.
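When the smoothing kernels and the latent covariance are both Gaussian, the covariances implied by the convolution construction have closed forms, and the conditional-independence approximation keeps a low-rank term through the latent function at a set of inducing locations while correcting only the within-output blocks. The sketch below is a rough illustration of that structure, not the paper's implementation: one latent GP drives two outputs, and all hyperparameter names and values (ell_u, ell, S, the inducing grid z) are illustrative assumptions.

```python
import numpy as np

def gauss(x, z, var):
    """Unnormalised Gaussian in the pairwise differences between x and z."""
    d = x[:, None] - z[None, :]
    return np.exp(-0.5 * d ** 2 / var)

# Illustrative hyperparameters: two outputs driven by one latent GP u.
ell_u = 0.5       # length-scale of the latent function u (assumed RBF covariance)
ell = [0.2, 0.8]  # widths of the Gaussian smoothing kernels G_d
S = [1.0, 1.5]    # sensitivity of each output to the latent function

def k_fu(x, z, d):
    """cov[f_d(x), u(z)]: convolving the Gaussian kernel G_d with the RBF
    covariance of u gives another Gaussian whose variances add."""
    v = ell[d] ** 2 + ell_u ** 2
    return S[d] * ell_u / np.sqrt(v) * gauss(x, z, v)

def k_ff(x, xp, d, dp):
    """cov[f_d(x), f_d'(x')]: two convolutions, so all three variances add."""
    v = ell[d] ** 2 + ell[dp] ** 2 + ell_u ** 2
    return S[d] * S[dp] * ell_u / np.sqrt(v) * gauss(x, xp, v)

x = np.linspace(0, 5, 100)  # input locations, shared by both outputs
z = np.linspace(0, 5, 10)   # M inducing locations for the latent function

Kuu = gauss(z, z, ell_u ** 2) + 1e-8 * np.eye(len(z))
Kfu = np.vstack([k_fu(x, z, d) for d in range(2)])
Kff = np.block([[k_ff(x, x, d, dp) for dp in range(2)] for d in range(2)])

# Conditional-independence approximation: the low-rank term
# Kfu Kuu^{-1} Kuf carries all between-output dependence; the
# within-output blocks are then restored exactly (a PITC-style structure).
Q = Kfu @ np.linalg.solve(Kuu, Kfu.T)
n = len(x)
K_approx = Q.copy()
for d in range(2):
    blk = slice(d * n, (d + 1) * n)
    K_approx[blk, blk] = Kff[blk, blk]

err = np.linalg.norm(K_approx - Kff) / np.linalg.norm(Kff)
print(f"relative Frobenius error of the approximation: {err:.3f}")
```

Because all between-output structure passes through the latent function at the M inducing locations, only the M x M matrix Kuu needs to be inverted, which is where the computational saving over the full multi-output covariance comes from.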
[1] E. V. Bonilla, K. M. Chai, and C. K. I. Williams. Multi-task Gaussian process prediction. In J. C. Platt, D. Koller, Y. Singer, and S. Roweis, editors, NIPS, volume 20, Cambridge, MA, 2008. MIT Press. In press.
[2] P. Boyle and M. Frean. Dependent Gaussian processes. In L. Saul, Y. Weiss, and L. Bottou, editors, NIPS, volume 17, pages 217–224, Cambridge, MA, 2005. MIT Press.
[3] M. Brookes. The matrix reference manual. Available on-line, 2005. http://www.ee.ic.ac.uk/hp/staff/dmb/matrix/intro.html.
[4] P. Gao, A. Honkela, M. Rattray, and N. D. Lawrence. Gaussian process modelling of latent chemical species: Applications to inferring transcription factor activities. Bioinformatics, 24(16):i70–i75, 2008.
[5] P. Goovaerts. Geostatistics For Natural Resources Evaluation. Oxford University Press, 1997. ISBN 0-19-511538-4.
[6] D. M. Higdon. Space and space-time modelling using process convolutions. In C. Anderson, V. Barnett, P. Chatwin, and A. El-Shaarawi, editors, Quantitative methods for current environmental issues, pages 37–56. Springer-Verlag, 2002.
[7] N. D. Lawrence. Learning for larger datasets with the Gaussian process latent variable model. In Meila and Shen [9].
[8] N. D. Lawrence, M. Seeger, and R. Herbrich. Fast sparse Gaussian process methods: The informative vector machine. In S. Becker, S. Thrun, and K. Obermayer, editors, NIPS, volume 15, pages 625–632, Cambridge, MA, 2003. MIT Press.
[9] M. Meila and X. Shen, editors. AISTATS, San Juan, Puerto Rico, 21-24 March 2007. Omnipress.
[10] J. Quiñonero-Candela and C. E. Rasmussen. A unifying view of sparse approximate Gaussian process regression. JMLR, 6:1939–1959, 2005.
[11] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, 2006. ISBN 0-262-18253-X.
[12] A. Rogers, M. A. Osborne, S. D. Ramchurn, S. J. Roberts, and N. R. Jennings. Towards real-time information processing of sensor network data using computationally efficient multi-output Gaussian processes. In Proceedings of the International Conference on Information Processing in Sensor Networks (IPSN 2008), 2008. In press.
[13] E. Snelson and Z. Ghahramani. Sparse Gaussian processes using pseudo-inputs. In Y. Weiss, B. Schölkopf, and J. C. Platt, editors, NIPS, volume 18, Cambridge, MA, 2006. MIT Press.
[14] E. Snelson and Z. Ghahramani. Local and global sparse Gaussian process approximations. In Meila and Shen [9].
[15] Y. W. Teh, M. Seeger, and M. I. Jordan. Semiparametric latent factor models. In R. G. Cowell and Z. Ghahramani, editors, AISTATS 10, pages 333–340, Barbados, 6-8 January 2005. Society for Artificial Intelligence and Statistics.