Author: Jason Chang, John W. Fisher III
Abstract: We present an MCMC sampler for Dirichlet process mixture models that can be parallelized to achieve significant computational gains. We combine a nonergodic, restricted Gibbs iteration with split/merge proposals in a manner that produces an ergodic Markov chain. Each cluster is augmented with two subclusters to construct likely split moves. Unlike some previous parallel samplers, the proposed sampler enforces the correct stationary distribution of the Markov chain without the need for finite approximations. Empirical results illustrate that the new sampler exhibits better convergence properties than current methods. 1
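The abstract builds on restricted Gibbs sampling for Dirichlet process mixtures. As background only (this is not the paper's parallel subcluster algorithm), a minimal collapsed Gibbs sweep for a one-dimensional conjugate Gaussian DP mixture, in the style of Neal [23], can be sketched as follows; all hyperparameter values, function names, and the toy data are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Illustrative hyperparameters (not from the paper)
ALPHA = 1.0      # DP concentration parameter
SIGMA2 = 1.0     # known observation variance
TAU2 = 10.0      # prior variance on cluster means (prior mean 0)

def predictive_logpdf(x, n, s):
    """Log predictive density of x under a cluster holding n points with sum s
    (conjugate Normal prior on the mean, known observation variance)."""
    post_prec = 1.0 / TAU2 + n / SIGMA2
    post_mean = (s / SIGMA2) / post_prec
    var = SIGMA2 + 1.0 / post_prec
    return -0.5 * (math.log(2 * math.pi * var) + (x - post_mean) ** 2 / var)

def gibbs_sweep(data, z):
    """One collapsed Gibbs sweep: reassign each point given all others,
    using the Chinese restaurant process prior over labels."""
    for i, x in enumerate(data):
        # Sufficient statistics of each cluster with point i held out
        counts, sums = {}, {}
        for j, k in enumerate(z):
            if j == i:
                continue
            counts[k] = counts.get(k, 0) + 1
            sums[k] = sums.get(k, 0.0) + data[j]
        # Unnormalized log-probabilities: existing clusters, then a new cluster
        labels = list(counts) + [max(z) + 1]
        logp = [math.log(counts[k]) + predictive_logpdf(x, counts[k], sums[k])
                for k in counts]
        logp.append(math.log(ALPHA) + predictive_logpdf(x, 0, 0.0))
        # Sample a label via the Gumbel-max trick (equivalent to normalizing)
        g = [lp - math.log(-math.log(random.random())) for lp in logp]
        z[i] = labels[g.index(max(g))]
    return z

# Toy data: two well-separated groups
data = [-5.1, -4.9, -5.0, 4.8, 5.2, 5.0]
z = [0] * len(data)
for _ in range(20):
    z = gibbs_sweep(data, z)
print(z)
```

This sequential sweep is inherently serial, which is precisely the bottleneck the paper's restricted, parallelizable Gibbs iteration with subcluster-driven split/merge proposals is designed to address.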
[1] K. Bache and M. Lichman. UCI machine learning repository, 2013.
[2] D. M. Blei, T. L. Griffiths, M. I. Jordan, and J. B. Tenenbaum. Hierarchical topic models and the nested Chinese restaurant process. In NIPS, 2003.
[3] D. M. Blei and M. I. Jordan. Variational inference for Dirichlet process mixtures. Bayesian Analysis, 1:121–144, 2005.
[4] C. A. Bush and S. N. MacEachern. A semiparametric Bayesian model for randomised block designs. Biometrika, 83:275–285, 1996.
[5] D. B. Dahl. An improved merge-split sampler for conjugate Dirichlet process mixture models. Technical report, University of Wisconsin - Madison Dept. of Statistics, 2003.
[6] M. D. Escobar and M. West. Bayesian density estimation and inference using mixtures. Journal of the American Statistical Association, 90(430):577–588, 1995.
[7] S. Favaro and Y. W. Teh. MCMC for normalized random measure mixture models. Statistical Science, 2013.
[8] T. S. Ferguson. A Bayesian analysis of some nonparametric problems. The Annals of Statistics, 1(2):209–230, 1973.
[9] P. J. Green and S. Richardson. Modelling heterogeneity with and without the Dirichlet process. Scandinavian Journal of Statistics, pages 355–375, 2001.
[10] W. K. Hastings. Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57(1):97–109, 1970.
[11] H. Ishwaran and L. F. James. Gibbs sampling methods for stick-breaking priors. Journal of the American Statistical Association, 96:161–173, 2001.
[12] H. Ishwaran and M. Zarepour. Exact and approximate sum-representations for the Dirichlet process. Canadian Journal of Statistics, 30:269–283, 2002.
[13] S. Jain and R. Neal. A split-merge Markov chain Monte Carlo procedure for the Dirichlet process mixture model. Journal of Computational and Graphical Statistics, 13:158–182, 2004.
[14] S. Jain and R. Neal. Splitting and merging components of a nonconjugate Dirichlet process mixture model. Bayesian Analysis, 2(3):445–472, 2007.
[15] K. Kurihara, M. Welling, and Y. W. Teh. Collapsed variational Dirichlet process mixture models. In International Joint Conference on Artificial Intelligence, 2007.
[16] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278–2324, 1998.
[17] P. Liang, M. I. Jordan, and B. Taskar. A permutation-augmented sampler for DP mixture models. In Proceedings of the 24th International Conference on Machine Learning, 2007.
[18] D. Lin, E. Grimson, and J. W. Fisher III. Construction of dependent Dirichlet processes based on Poisson processes. In NIPS, 2010.
[19] D. Lovell, R. P. Adams, and V. K. Mansingka. Parallel Markov chain Monte Carlo for Dirichlet process mixtures. In Workshop on Big Learning, NIPS, 2012.
[20] S. N. MacEachern. Estimating normal means with a conjugate style Dirichlet process prior. In Communications in Statistics: Simulation and Computation, 1994.
[21] S. N. MacEachern and P. Müller. Estimating mixture of Dirichlet process models. Journal of Computational and Graphical Statistics, 7(2):223–238, June 1998.
[22] R. Neal. Bayesian mixture modeling. In Proceedings of the 11th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis, 1992.
[23] R. Neal. Markov chain sampling methods for Dirichlet process mixture models. Journal of Computational and Graphical Statistics, 9(2):249–265, June 2000.
[24] O. Papaspiliopoulos and G. O. Roberts. Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models. Biometrika, 95(1):169–186, 2008.
[25] J. Pitman. Combinatorial stochastic processes. Technical report, U.C. Berkeley Dept. of Statistics, 2002.
[26] J. Sethuraman. A constructive definition of Dirichlet priors. Statistica Sinica, pages 639–650, 1994.
[27] E. B. Sudderth. Graphical Models for Visual Object Recognition and Tracking. PhD thesis, Massachusetts Institute of Technology, 2006.
[28] E. B. Sudderth, A. B. Torralba, W. T. Freeman, and A. S. Willsky. Describing visual scenes using transformed Dirichlet processes. In NIPS, 2006.
[29] Y. W. Teh, M. I. Jordan, M. J. Beal, and D. M. Blei. Hierarchical Dirichlet processes. Journal of the American Statistical Association, 101(476):1566–1581, 2006.
[30] M. West, P. Müller, and S. N. MacEachern. Hierarchical priors and mixture models, with application in regression and density estimation. Aspects of Uncertainty, pages 363–386, 1994.
[31] S. A. Williamson, A. Dubey, and E. P. Xing. Parallel Markov chain Monte Carlo for nonparametric mixture models. In ICML, 2013.
[32] E. P. Xing, R. Sharan, and M. I. Jordan. Bayesian haplotype inference via the Dirichlet process. In ICML, 2004.