
179 nips-2007-SpAM: Sparse Additive Models


Source: pdf

Author: Han Liu, Larry Wasserman, John D. Lafferty, Pradeep K. Ravikumar

Abstract: We present a new class of models for high-dimensional nonparametric regression and classification called sparse additive models (SpAM). Our methods combine ideas from sparse linear modeling and additive nonparametric regression. We derive a method for fitting the models that is effective even when the number of covariates is larger than the sample size. A statistical analysis of the properties of SpAM is given together with empirical results on synthetic and real data, showing that SpAM can be effective in fitting sparse nonparametric models to high-dimensional data.
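
The abstract does not spell out the fitting procedure, so the following is only an illustrative sketch of a sparse backfitting loop in the spirit it describes: the backfitting update of additive nonparametric regression combined with a soft-thresholding step borrowed from sparse linear modeling. The smoother (a simple Nadaraya-Watson kernel smoother), the fixed bandwidth, the penalty level lam, and the names kernel_smooth and spam_backfit are all assumptions made for this sketch, not the paper's specification.

# Illustrative sparse backfitting sketch (assumptions noted above).
import numpy as np

def kernel_smooth(x, r, bandwidth=0.2):
    """Nadaraya-Watson smoother of residuals r against a single covariate x."""
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w @ r) / w.sum(axis=1)

def spam_backfit(X, y, lam=0.1, n_iter=20, bandwidth=0.2):
    """Cycle over covariates: smooth each partial residual, then soft-threshold
    the fitted component function by its empirical norm, so weak components
    are shrunk to exactly zero."""
    n, p = X.shape
    f = np.zeros((n, p))           # fitted component functions f_j evaluated at X_j
    mu = y.mean()                  # intercept
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - mu - f.sum(axis=1) + f[:, j]        # partial residual for covariate j
            p_j = kernel_smooth(X[:, j], r_j, bandwidth)  # smoothed partial residual
            s_j = np.sqrt(np.mean(p_j ** 2))              # empirical norm of the component
            f[:, j] = max(0.0, 1.0 - lam / s_j) * p_j if s_j > 0 else 0.0
            f[:, j] -= f[:, j].mean()                     # keep each component centered
    return mu, f

# Toy usage: only the first two of ten covariates carry signal, so the
# remaining component norms should shrink toward zero.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 10))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
mu, f = spam_backfit(X, y, lam=0.05)
print(np.round(np.sqrt((f ** 2).mean(axis=0)), 3))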


reference text

Greenshtein, E. and Ritov, Y. (2004). Persistency in high dimensional linear predictor-selection and the virtue of over-parametrization. Bernoulli 10 971-988.

Härdle, W., Müller, M., Sperlich, S. and Werwatz, A. (2004). Nonparametric and Semiparametric Models. Springer-Verlag Inc.

Hastie, T. and Tibshirani, R. (1999). Generalized Additive Models. Chapman & Hall Ltd.

Hastie, T., Tibshirani, R. and Friedman, J. H. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer-Verlag.

Juditsky, A. and Nemirovski, A. (2000). Functional aggregation for nonparametric regression. Ann. Statist. 28 681-712.

Lin, Y. and Zhang, H. H. (2006). Component selection and smoothing in multivariate nonparametric regression. Ann. Statist. 34 2272-2297.

Meinshausen, N. and Yu, B. (2006). Lasso-type recovery of sparse representations for high-dimensional data. Tech. Rep. 720, Department of Statistics, UC Berkeley.

Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, Methodological 58 267-288.

Wainwright, M. (2006). Sharp thresholds for high-dimensional and noisy recovery of sparsity. Tech. Rep. 709, Department of Statistics, UC Berkeley.

Yuan, M. (2007). Nonnegative garrote component selection in functional ANOVA models. In Proceedings of AI and Statistics, AISTATS.

Zhao, P. and Yu, B. (2007). On model selection consistency of lasso. J. of Mach. Learn. Res. 7 2541-2567.