
An Alternative Infinite Mixture of Gaussian Process Experts (NIPS 2005, paper 21)



Author: Edward Meeds, Simon Osindero

Abstract: We present an infinite mixture model in which each component comprises a multivariate Gaussian distribution over an input space and a Gaussian process model over an output space. Our model is able to deal naturally with non-stationary covariance functions, discontinuities, multimodality, and overlapping output signals. The work is similar to that by Rasmussen and Ghahramani [1]; however, we use a full generative model over input and output space rather than just a conditional model. This allows us to deal with incomplete data, to perform inference over inverse functional mappings as well as for regression, and also leads to a more powerful and consistent Bayesian specification of the effective ‘gating network’ for the different experts.
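The generative structure described in the abstract — a Dirichlet-process mixture in which each expert owns a Gaussian density over inputs and a Gaussian process over outputs — can be sketched in a few lines of NumPy. This is an illustrative simulation of the generative process only, not the authors' inference algorithm; the RBF kernel, hyperparameter ranges, and the `crp_assignments` helper are assumptions chosen for brevity.

```python
import numpy as np

def crp_assignments(n, alpha, rng):
    """Sample cluster labels from a Chinese restaurant process
    (the Polya-urn representation of a Dirichlet process [8])."""
    z = np.zeros(n, dtype=int)
    counts = []
    for i in range(n):
        # Join an existing cluster with prob. proportional to its size,
        # or start a new one with prob. proportional to alpha.
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)
        else:
            counts[k] += 1
        z[i] = k
    return z

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance on a 1-D input vector."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_dataset(n=100, alpha=1.0, noise=0.05, seed=0):
    """Full generative sketch: each expert draws a Gaussian over the
    input space, then a GP function draw over its own inputs."""
    rng = np.random.default_rng(seed)
    z = crp_assignments(n, alpha, rng)
    x = np.empty(n)
    y = np.empty(n)
    for k in np.unique(z):
        idx = np.where(z == k)[0]
        mu, sd = rng.normal(0.0, 3.0), 0.5 + rng.random()  # input Gaussian
        x[idx] = rng.normal(mu, sd, size=idx.size)
        K = rbf_kernel(x[idx]) + noise * np.eye(idx.size)
        y[idx] = rng.multivariate_normal(np.zeros(idx.size), K)
    return x, y, z
```

Because each expert generates its own inputs, the induced "gating" over experts at a test input follows from Bayes' rule over the input densities, which is the sense in which the generative formulation yields a consistent gating network.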


References

[1] C.E. Rasmussen and Z. Ghahramani. Infinite mixtures of Gaussian process experts. In Advances in Neural Information Processing Systems 14, pages 881–888. MIT Press, 2002.

[2] V. Tresp. Mixtures of Gaussian processes. In Advances in Neural Information Processing Systems, volume 13. MIT Press, 2001.

[3] Z. Ghahramani and M. I. Jordan. Supervised learning from incomplete data via an EM approach. In Advances in Neural Information Processing Systems 6, pages 120–127. Morgan Kaufmann, 1995.

[4] L. Xu, M. I. Jordan, and G. E. Hinton. An alternative model for mixtures of experts. In Advances in Neural Information Processing Systems 7, pages 633–640. MIT Press, 1995.

[5] C. E. Rasmussen. The infinite Gaussian mixture model. In Advances in Neural Information Processing Systems, volume 12, pages 554–560. MIT Press, 2000.

[6] R. A. Jacobs, M. I. Jordan, and G. E. Hinton. Adaptive mixtures of local experts. Neural Computation, 3, 1991.

[7] A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin. Bayesian Data Analysis. Chapman and Hall, 2nd edition, 2004.

[8] D. Blackwell and J. B. MacQueen. Ferguson distributions via Polya urn schemes. The Annals of Statistics, 1(2):353–355, 1973.

[9] R. M. Neal. Markov chain sampling methods for Dirichlet process mixture models. Journal of Computational and Graphical Statistics, 9:249–265, 2000.

[10] R. M. Neal. Probabilistic inference using Markov chain Monte Carlo methods. Technical Report CRG-TR-93-1, University of Toronto, 1993.

[11] R. M. Neal. Slice sampling (with discussion). Annals of Statistics, 31:705–767, 2003.

[12] M. Escobar and M. West. Computing Bayesian nonparametric hierarchical models. In Practical Nonparametric and Semiparametric Bayesian Statistics, number 133 in Lecture Notes in Statistics. Springer-Verlag, 1998.

[13] B. W. Silverman. Some aspects of the spline smoothing approach to non-parametric regression curve fitting. Journal of the Royal Statistical Society, Series B, 47:1–52, 1985.