
254 nips-2009: Variational Gaussian-process factor analysis for modeling spatio-temporal data


Source: pdf

Author: Jaakko Luttinen, Alexander Ilin

Abstract: We present a probabilistic factor analysis model for studying spatio-temporal datasets. Spatial and temporal structure is modeled using Gaussian process priors for both the loading matrix and the factors. The posterior distributions are approximated using the variational Bayesian framework. The high computational cost of Gaussian process modeling is reduced using sparse approximations. The model is used to reconstruct global sea surface temperatures from a historical dataset. The results suggest that the proposed model can outperform state-of-the-art reconstruction systems.
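The abstract describes a factor-analysis likelihood in which the columns of the loading matrix get Gaussian process priors over space and the factors get Gaussian process priors over time. As an illustration only, the following minimal NumPy sketch samples from that generative structure; the squared-exponential kernels, lengthscales, and problem sizes are assumptions made here for the example, and the sketch omits the paper's variational Bayesian inference and sparse approximations.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): M spatial locations, N time points, D factors.
M, N, D = 50, 200, 3
locations = np.linspace(0.0, 10.0, M)   # 1-D stand-in for spatial coordinates
times = np.linspace(0.0, 20.0, N)

# GP priors: each column of the loading matrix W is a GP over space,
# and each factor (row of X) is a GP over time.
K_space = rbf_kernel(locations, locations, lengthscale=2.0) + 1e-6 * np.eye(M)
K_time = rbf_kernel(times, times, lengthscale=1.0) + 1e-6 * np.eye(N)

W = rng.multivariate_normal(np.zeros(M), K_space, size=D).T   # M x D loadings
X = rng.multivariate_normal(np.zeros(N), K_time, size=D)      # D x N factors

# Factor-analysis likelihood: observations are W X plus isotropic Gaussian noise.
noise_std = 0.1
Y = W @ X + noise_std * rng.standard_normal((M, N))
```

In the sea surface temperature application, Y would be the (partially observed) station-by-time data matrix, and inference would recover posterior distributions over W and X rather than sampling them from the prior as done above.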


References

[1] A. Belouchrani, K. A. Meraim, J.-F. Cardoso, and E. Moulines. A blind source separation technique based on second order statistics. IEEE Transactions on Signal Processing, 45(2):434–444, 1997.

[2] C. M. Bishop. Variational principal components. In Proceedings of the 9th International Conference on Artificial Neural Networks (ICANN’99), pages 509–514, 1999.

[3] N. Cressie. Statistics for Spatial Data. Wiley-Interscience, New York, 1993.

[4] A. Ilin and A. Kaplan. Bayesian PCA for reconstruction of historical sea surface temperatures. In Proceedings of the International Joint Conference on Neural Networks (IJCNN 2009), pages 1322–1327, Atlanta, USA, June 2009.

[5] A. Kaplan, M. Cane, Y. Kushnir, A. Clement, M. Blumenthal, and B. Rajagopalan. Analysis of global sea surface temperatures 1856–1991. Journal of Geophysical Research, 103:18567–18589, 1998.

[6] D. E. Parker, P. D. Jones, C. K. Folland, and A. Bevan. Interdecadal changes of surface temperature since the late nineteenth century. Journal of Geophysical Research, 99:14373–14399, 1994.

[7] J. Quiñonero-Candela and C. E. Rasmussen. A unifying view of sparse approximate Gaussian process regression. Journal of Machine Learning Research, 6:1939–1959, Dec. 2005.

[8] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.

[9] J. Särelä and H. Valpola. Denoising source separation. Journal of Machine Learning Research, 6:233–272, 2005.

[10] M. N. Schmidt. Function factorization using warped Gaussian processes. In L. Bottou and M. Littman, editors, Proceedings of the 26th International Conference on Machine Learning (ICML'09), pages 921–928, Montreal, June 2009. Omnipress.

[11] M. N. Schmidt and H. Laurberg. Nonnegative matrix factorization with Gaussian process priors. Computational Intelligence and Neuroscience, 2008:1–10, 2008.

[12] M. Seeger, C. K. I. Williams, and N. D. Lawrence. Fast forward selection to speed up sparse Gaussian process regression. In Proceedings of the 9th International Workshop on Artificial Intelligence and Statistics (AISTATS’03), pages 205–213, 2003.

[13] Y. W. Teh, M. Seeger, and M. I. Jordan. Semiparametric latent factor models. In Proceedings of the 10th International Workshop on Artificial Intelligence and Statistics (AISTATS’05), pages 333–340, 2005.

[14] M. E. Tipping and C. M. Bishop. Probabilistic principal component analysis. Journal of the Royal Statistical Society Series B, 61(3):611–622, 1999.

[15] M. K. Titsias. Variational learning of inducing variables in sparse Gaussian processes. In Proceedings of the 12th International Workshop on Artificial Intelligence and Statistics (AISTATS’09), pages 567–574, 2009.

[16] B. M. Yu, J. P. Cunningham, G. Santhanam, S. I. Ryu, K. V. Shenoy, and M. Sahani. Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity. In Advances in Neural Information Processing Systems 21, pages 1881–1888. 2009. 9