nips nips2013 nips2013-277 nips2013-277-reference knowledge-graph by maker-knowledge-mining

277 nips-2013-Restricting exchangeable nonparametric distributions


Source: pdf

Author: Sinead A. Williamson, Steve N. MacEachern, Eric Xing

Abstract: Distributions over matrices with exchangeable rows and infinitely many columns are useful in constructing nonparametric latent variable models. However, the distribution that such models imply over the number of features exhibited by each data point may be poorly suited to many modeling tasks. In this paper, we propose a class of exchangeable nonparametric priors obtained by restricting the domain of existing models. Such models allow us to specify the distribution over the number of features per data point, and can achieve better performance on data sets where the number of features is not well modeled by the original distribution.
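For intuition only, the sketch below illustrates the restricting idea described in the abstract: draw matrices from the standard Indian buffet process and keep only draws whose per-row feature counts fall in a chosen set, i.e. condition the exchangeable prior on a restricted domain. This is not the paper's construction or inference algorithm, just a naive rejection illustration; the helper names (sample_ibp, sample_restricted_ibp) and parameter choices are hypothetical.

```python
import numpy as np

def sample_ibp(n_rows, alpha, rng):
    """Draw a binary feature matrix from the standard Indian buffet process."""
    counts = []          # counts[k] = number of earlier rows that use feature k
    rows = []
    for i in range(1, n_rows + 1):
        row = [rng.random() < m / i for m in counts]  # reuse feature k w.p. m_k / i
        for k, taken in enumerate(row):
            if taken:
                counts[k] += 1
        n_new = rng.poisson(alpha / i)                # Poisson(alpha / i) brand-new features
        counts.extend([1] * n_new)
        rows.append(row + [True] * n_new)
    Z = np.zeros((n_rows, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

def sample_restricted_ibp(n_rows, alpha, allowed_counts, rng, max_tries=10_000):
    """Rejection sampler: keep an IBP draw only if every row's feature count
    lies in the allowed set, i.e. sample from the restricted domain."""
    allowed = set(allowed_counts)
    for _ in range(max_tries):
        Z = sample_ibp(n_rows, alpha, rng)
        if all(int(c) in allowed for c in Z.sum(axis=1)):
            return Z
    raise RuntimeError("restriction too tight for naive rejection sampling")

rng = np.random.default_rng(0)
Z = sample_restricted_ibp(n_rows=5, alpha=2.0, allowed_counts={1, 2, 3}, rng=rng)
print(Z.sum(axis=1))  # every row exhibits between 1 and 3 features
```

Because the acceptance event is symmetric in the rows, the accepted matrices remain row-exchangeable. Rejecting whole matrices is only a brute-force illustration of the target distribution, not a practical sampler.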


reference text

[1] D. Aldous. Exchangeability and related topics. École d'Été de Probabilités de Saint-Flour XIII, pages 1–198, 1985.

[2] D. J. Aldous. Representations for partially exchangeable arrays of random variables. Journal of Multivariate Analysis, 11(4):581–598, 1981.

[3] R. E. Barlow and K. D. Heidtmann. Computing k-out-of-n system reliability. IEEE Transactions on Reliability, 33:322–323, 1984.

[4] F. Caron. Bayesian nonparametric models for bipartite graphs. In Neural Information Processing Systems, 2012.

[5] S. X. Chen, A. P. Dempster, and J. S. Liu. Weighted finite population sampling to maximize entropy. Biometrika, 81:457–469, 1994.

[6] S. X. Chen and J. S. Liu. Statistical applications of the Poisson-binomial and conditional Bernoulli distributions. Statistica Sinica, 7:875–892, 1997.

[7] F. Doshi-Velez and Z. Ghahramani. Accelerated Gibbs sampling for the Indian buffet process. In International Conference on Machine Learning, 2009.

[8] M. Fernández and S. Williams. Closed-form expression for the Poisson-binomial probability density function. IEEE Transactions on Aerospace and Electronic Systems, 46:803–817, 2010.

[9] S. Fortini, L. Ladelli, and E. Regazzini. Exchangeability, predictive distributions and parametric models. Sankhyā: The Indian Journal of Statistics, Series A, pages 86–109, 2000.

[10] E. B. Fox, E. B. Sudderth, M. I. Jordan, and A. S. Willsky. Sharing features among dynamical systems with beta processes. In Neural Information Processing Systems, 2010.

[11] T. L. Griffiths and Z. Ghahramani. Infinite latent feature models and the Indian buffet process. In Neural Information Processing Systems, 2005.

[12] J. F. C. Kingman. Completely random measures. Pacific Journal of Mathematics, 21(1):59–78, 1967.

[13] K. T. Miller, T. L. Griffiths, and M. I. Jordan. Nonparametric latent feature models for link prediction. In Neural Information Processing Systems, 2009.

[14] R. M. Neal. Slice sampling. Annals of Statistics, 31(3):705–767, 2003.

[15] Y. W. Teh and D. Görür. Indian buffet processes with power law behaviour. In Neural Information Processing Systems, 2009.

[16] Y. W. Teh, D. Görür, and Z. Ghahramani. Stick-breaking construction for the Indian buffet process. In Artificial Intelligence and Statistics, 2007.

[17] R. Thibaux and M. I. Jordan. Hierarchical beta processes and the Indian buffet process. In Artificial Intelligence and Statistics, 2007.

[18] M. Titsias. The infinite gamma-Poisson feature model. In Neural Information Processing Systems, 2007.

[19] A. Y. Volkova. A refinement of the central limit theorem for sums of independent random indicators. Theory of Probability and its Applications, 40:791–794, 1996.

[20] F. Wood, T. L. Griffiths, and Z. Ghahramani. A non-parametric Bayesian method for inferring hidden causes. In Uncertainty in Artificial Intelligence, 2006.

[21] M. Zhou, L. A. Hannah, D. B. Dunson, and L. Carin. Beta-negative binomial process and Poisson factor analysis. In Artificial Intelligence and Statistics, 2012.

[22] G. K. Zipf. Selective Studies and the Principle of Relative Frequency in Language. Harvard University Press, 1932.