jmlr jmlr2011 jmlr2011-24 jmlr2011-24-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Lauren A. Hannah, David M. Blei, Warren B. Powell
Abstract: We propose Dirichlet Process mixtures of Generalized Linear Models (DP-GLM), a new class of methods for nonparametric regression. Given a data set of input-response pairs, the DP-GLM produces a global model of the joint distribution through a mixture of local generalized linear models. DP-GLMs allow both continuous and categorical inputs, and can model the same class of responses that can be modeled with a generalized linear model. We study the properties of the DP-GLM, and show why it provides better predictions and density estimates than existing Dirichlet process mixture regression models. We give conditions for weak consistency of the joint distribution and pointwise consistency of the regression estimate. Keywords: Bayesian nonparametrics, generalized linear models, posterior consistency
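To make the model description concrete, below is a minimal illustrative sketch (not code from the paper) that samples a synthetic data set from a DP-GLM-style generative process using a Chinese-restaurant-process scheme. It assumes Gaussian covariates and a linear-Gaussian response within each mixture component; the function name sample_dp_glm and all parameter values are hypothetical choices made for this example.

import numpy as np

rng = np.random.default_rng(0)

def sample_dp_glm(n, alpha=1.0, d=1):
    # Illustrative sketch: cluster assignments follow a Chinese restaurant
    # process; each cluster carries a local covariate model and a local GLM
    # (here a linear-Gaussian response, for simplicity).
    params, counts = [], []
    X, y = np.zeros((n, d)), np.zeros(n)
    for i in range(n):
        # Join an existing cluster with probability proportional to its size,
        # or a new one with probability proportional to alpha.
        weights = np.array(counts + [alpha], dtype=float)
        k = rng.choice(len(weights), p=weights / weights.sum())
        if k == len(params):  # new cluster: draw its parameters from the base measure
            params.append((rng.normal(0.0, 3.0, d),      # covariate mean
                           rng.normal(0.0, 1.0, d + 1),  # GLM intercept + slopes
                           0.5))                         # response noise scale
            counts.append(0)
        counts[k] += 1
        mu_x, beta, sigma_y = params[k]
        X[i] = rng.normal(mu_x, 1.0)  # local covariate distribution
        y[i] = beta[0] + X[i] @ beta[1:] + rng.normal(0.0, sigma_y)  # local GLM response
    return X, y

X, y = sample_dp_glm(200)

In the DP-GLM itself, prediction averages the local GLM means over the posterior on clusterings (for example, via MCMC over the mixture assignments) rather than using a single forward draw as in this sketch.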
R. P. Adams, I. Murray, and D. J. C. MacKay. Tractable nonparametric Bayesian inference in Poisson processes with Gaussian process intensities. In Proceedings of the 26th Annual International Conference on Machine Learning, pages 9–16. ACM, 2009.
M. Amewou-Atisso, S. Ghosal, J. K. Ghosh, and R. V. Ramamoorthi. Posterior consistency for semi-parametric regression problems. Bernoulli, 9(2):291–312, 2003.
C. E. Antoniak. Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems. The Annals of Statistics, 2(6):1152–1174, 1974.
A. Barron, M. J. Schervish, and L. Wasserman. The consistency of posterior distributions in nonparametric problems. The Annals of Statistics, 27(2):536–561, 1999.
C. L. Bennett, M. Halpern, G. Hinshaw, N. Jarosik, A. Kogut, M. Limon, S. S. Meyer, L. Page, D. N. Spergel, G. S. Tucker, et al. First-year Wilkinson Microwave Anisotropy Probe (WMAP) observations: preliminary maps and basic results. The Astrophysical Journal Supplement Series, 148(1):1–27, 2003.
D. Blackwell and J. B. MacQueen. Ferguson distributions via Pólya urn schemes. The Annals of Statistics, 1(2):353–355, 1973.
D. M. Blei and M. I. Jordan. Variational inference for Dirichlet process mixtures. Bayesian Analysis, 1(1):121–144, 2006.
G. Bradshaw. UCI machine learning repository, 1989.
L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone. Classification and Regression Trees. Chapman & Hall/CRC, New York, NY, 1984.
L. D. Brown. Fundamentals of Statistical Exponential Families: with Applications in Statistical Decision Theory. Institute of Mathematical Statistics, Hayward, CA, 1986.
H. A. Chipman, E. I. George, and R. E. McCulloch. Bayesian CART model search. Journal of the American Statistical Association, 93(443):935–948, 1998.
H. A. Chipman, E. I. George, and R. E. McCulloch. Bayesian treed models. Machine Learning, 48(1):299–320, 2002.
M. De Iorio, P. Müller, G. L. Rosner, and S. N. MacEachern. An ANOVA model for dependent random measures. Journal of the American Statistical Association, 99(465):205–215, 2004.
J. A. Duan, M. Guindani, and A. E. Gelfand. Generalized spatial Dirichlet process models. Biometrika, 94(4):809–825, 2007.
D. B. Dunson, N. Pillai, and J. H. Park. Bayesian density regression. Journal of the Royal Statistical Society, Series B (Statistical Methodology), 69(2):163–183, 2007.
M. D. Escobar. Estimating normal means with a Dirichlet process prior. Journal of the American Statistical Association, 89(425):268–277, 1994.
M. D. Escobar and M. West. Bayesian density estimation and inference using mixtures. Journal of the American Statistical Association, 90(430):577–588, 1995.
T. S. Ferguson. A Bayesian analysis of some nonparametric problems. The Annals of Statistics, 1(2):209–230, 1973.
A. E. Gelfand, A. Kottas, and S. N. MacEachern. Bayesian nonparametric spatial modeling with Dirichlet process mixing. Journal of the American Statistical Association, 100(471):1021–1035, 2005.
A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin. Bayesian Data Analysis. Chapman & Hall/CRC, Boca Raton, FL, 2004.
S. Ghosal, J. K. Ghosh, and R. V. Ramamoorthi. Posterior consistency of Dirichlet mixtures in density estimation. The Annals of Statistics, 27(1):143–158, 1999.
J. K. Ghosh and R. V. Ramamoorthi. Bayesian Nonparametrics. Springer-Verlag, New York, NY, 2003.
R. B. Gramacy and H. K. H. Lee. Bayesian treed Gaussian process models with an application to computer modeling. Journal of the American Statistical Association, 103(483):1119–1130, 2008.
J. E. Griffin and M. F. J. Steel. Order-based dependent Dirichlet processes. Journal of the American Statistical Association, 101(473):179–194, 2006.
J. E. Griffin and M. F. J. Steel. Bayesian nonparametric modelling with the Dirichlet process regression smoother. Statistica Sinica, 20(4):1507–1527, 2010.
J. G. Ibrahim and K. P. Kleinman. Semiparametric Bayesian methods for random effects models. In Practical Nonparametric and Semiparametric Bayesian Statistics, pages 89–114, 1998.
S. N. MacEachern. Estimating normal means with a conjugate style Dirichlet process prior. Communications in Statistics - Simulation and Computation, 23(3):727–741, 1994.
S. N. MacEachern and P. Müller. Estimating mixture of Dirichlet process models. Journal of Computational and Graphical Statistics, 7(2):223–238, 1998.
P. McCullagh and J. A. Nelder. Generalized Linear Models. Chapman & Hall/CRC, Boca Raton, FL, 1989.
S. Mukhopadhyay and A. E. Gelfand. Dirichlet process mixed generalized linear models. Journal of the American Statistical Association, 92(438):633–639, 1997.
P. Müller, A. Erkanli, and M. West. Bayesian curve fitting using multivariate normal mixtures. Biometrika, 83(1):67–79, 1996.
E. A. Nadaraya. On estimating regression. Theory of Probability and its Applications, 9(1):141–142, 1964.
R. M. Neal. Markov chain sampling methods for Dirichlet process mixture models. Journal of Computational and Graphical Statistics, 9(2):249–265, 2000.
R. M. Neal. MCMC using Hamiltonian dynamics. In Handbook of Markov Chain Monte Carlo, 2010.
P. Z. G. Qian, H. Wu, and C. F. J. Wu. Gaussian process models for computer experiments with qualitative and quantitative factors. Technometrics, 50(3):383–396, 2008.
C. E. Rasmussen and Z. Ghahramani. Infinite mixtures of Gaussian process experts. In Advances in Neural Information Processing Systems 14.
C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, 2006.
A. Rodríguez. Some Advances in Bayesian Nonparametric Modeling. PhD thesis, Duke University, 2009.
A. Rodríguez, D. B. Dunson, and A. E. Gelfand. Bayesian nonparametric functional data analysis through density estimation. Biometrika, 96(1):149–162, 2009.
M. A. Sato. Online model selection based on the variational Bayes. Neural Computation, 13(7):1649–1681, 2001.
L. Schwartz. On Bayes procedures. Probability Theory and Related Fields, 4(1):10–26, 1965.
B. Shahbaba and R. M. Neal. Nonlinear models using Dirichlet process mixtures. Journal of Machine Learning Research, 10:1829–1850, 2009.
S. Tokdar. Posterior consistency of Dirichlet location-scale mixture of normals in density estimation and regression. Sankhyā: The Indian Journal of Statistics, 67:90–110, 2006.
S. Walker. New approaches to Bayesian consistency. The Annals of Statistics, 32(5):2028–2043, 2004.
S. G. Walker, A. Lijoi, and I. Prünster. On rates of convergence for posterior distributions in infinite-dimensional models. The Annals of Statistics, 35(2):738, 2007.
G. S. Watson. Smooth regression analysis. Sankhyā: The Indian Journal of Statistics, 26(4):359–372, 1964.
M. West, P. Müller, and M. D. Escobar. Hierarchical priors and mixture models, with application in regression and density estimation. In Aspects of Uncertainty: A Tribute to D. V. Lindley, pages 363–386, 1994.
I. C. Yeh. Modeling of strength of high-performance concrete using artificial neural networks. Cement and Concrete Research, 28(12):1797–1808, 1998.