nips2008-233: The Gaussian Process Density Sampler (reference knowledge-graph by maker-knowledge-mining)
Source: pdf
Author: Ryan P. Adams, Iain Murray, David J. C. MacKay
Abstract: We present the Gaussian Process Density Sampler (GPDS), an exchangeable generative model for use in nonparametric Bayesian density estimation. Samples drawn from the GPDS are consistent with exact, independent samples from a fixed density function that is a transformation of a function drawn from a Gaussian process prior. Our formulation allows us to infer an unknown density from data using Markov chain Monte Carlo, which gives samples from the posterior distribution over density functions and from the predictive distribution on data space. We can also infer the hyperparameters of the Gaussian process. We compare this density-modeling technique to several existing techniques on a toy problem and a skull-reconstruction task.
[1] R. M. Neal. Defining priors for distributions using Dirichlet diffusion trees. Technical Report 0104, Department of Statistics, University of Toronto, 2001.
[2] D. J. C. MacKay. Bayesian neural networks and density networks. Nuclear Instruments and Methods in Physics Research, Section A, 354(1):73–80, 1995.
[3] N. Lawrence. Probabilistic non-linear principal component analysis with Gaussian process latent variable models. Journal of Machine Learning Research, 6:1783–1816, 2005.
[4] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, 2006.
[5] A. Beskos, O. Papaspiliopoulos, G. O. Roberts, and P. Fearnhead. Exact and computationally efficient likelihood-based estimation for discretely observed diffusion processes (with discussion). Journal of the Royal Statistical Society: Series B, 68:333–382, 2006.
[6] O. Papaspiliopoulos and G. O. Roberts. Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models. Biometrika, 95(1):169–186, 2008.
[7] I. Murray. Advances in Markov chain Monte Carlo methods. PhD thesis, Gatsby Computational Neuroscience Unit, University College London, London, 2007.
[8] J. G. Propp and D. B. Wilson. Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms, 9(1–2):223–252, 1996.
[9] R. M. Neal. Suppressing random walks in Markov chain Monte Carlo using ordered overrelaxation. In M. I. Jordan, editor, Learning in Graphical Models, pages 205–228. Kluwer Academic Publishers, 1998.
[10] S. Chib and I. Jeliazkov. Marginal likelihood from the Metropolis–Hastings output. Journal of the American Statistical Association, 96(453):270–281, 2001.
[11] K. E. Willmore, C. P. Klingenberg, and B. Hallgrimsson. The relationship between fluctuating asymmetry and environmental variance in rhesus macaque skulls. Evolution, 59(4):898–909, 2005.
[12] S. R. Lele and J. T. Richtsmeier. An invariant approach to statistical analysis of shapes. Chapman and Hall/CRC Press, London, 2001.
[13] T. Leonard. Density estimation, stochastic processes and prior information. Journal of the Royal Statistical Society, Series B, 40(2):113–146, 1978.
[14] D. Thorburn. A Bayesian approach to density estimation. Biometrika, 73(1):65–75, 1986.
[15] P. J. Lenk. Towards a practicable Bayesian nonparametric density estimator. Biometrika, 78(3):531–543, 1991.
[16] S. T. Tokdar and J. K. Ghosh. Posterior consistency of logistic Gaussian process priors in density estimation. Journal of Statistical Planning and Inference, 137:34–42, 2007.
[17] I. Murray, Z. Ghahramani, and D. J. C. MacKay. MCMC for doubly-intractable distributions. In Proceedings of the 22nd Annual Conference on Uncertainty in Artificial Intelligence (UAI), pages 359–366, 2006.