jmlr jmlr2010 jmlr2010-46 jmlr2010-46-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Ming Yuan
Abstract: This paper considers the problem of estimating a high dimensional inverse covariance matrix that can be well approximated by “sparse” matrices. Taking advantage of the connection between multivariate linear regression and entries of the inverse covariance matrix, we propose an estimating procedure that can effectively exploit such “sparsity”. The proposed method can be computed using linear programming and therefore has the potential to be used in very high dimensional problems. Oracle inequalities are established for the estimation error in terms of several operator norms, showing that the method is adaptive to different types of sparsity of the problem. Keywords: covariance selection, Dantzig selector, Gaussian graphical model, inverse covariance matrix, Lasso, linear programming, oracle inequality, sparsity
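The "connection between multivariate linear regression and entries of the inverse covariance matrix" mentioned in the abstract is the classical identity that, for a Gaussian vector with covariance Sigma and precision matrix Omega = Sigma^{-1}, regressing the j-th coordinate on the remaining coordinates yields the coefficient vector theta(j) = -Omega_{-j,j} / Omega_{jj} and residual variance 1 / Omega_{jj} (see, e.g., Lauritzen, 1996). The sketch below shows how a column-wise, Dantzig-selector-style estimator built on this identity can be cast as a linear program, in the spirit of the abstract; it is a minimal illustration, not the paper's exact procedure. The function name dantzig_inverse_cov, the tuning parameter lam, the plug-in sample covariance, and the final averaging symmetrization are assumptions made for this example; scipy.optimize.linprog supplies the LP solver.

```python
# Minimal sketch: Dantzig-selector-style, column-wise inverse covariance
# estimation via linear programming.  Illustrative only; lam and the
# symmetrization step are example choices, not the paper's specification.
import numpy as np
from scipy.optimize import linprog

def dantzig_inverse_cov(X, lam):
    """Estimate Omega = Sigma^{-1} one column at a time via LPs.

    For each j, solve
        min ||theta||_1  s.t.  ||S_{-j,-j} theta - S_{-j,j}||_inf <= lam,
    where S is the sample covariance, then map theta to column j of
    Omega via Omega_jj = 1 / (S_jj - S_{j,-j} theta) and
    Omega_{-j,j} = -Omega_jj * theta.
    """
    n, p = X.shape
    S = np.cov(X, rowvar=False)
    Omega = np.zeros((p, p))
    for j in range(p):
        rest = [k for k in range(p) if k != j]
        A, b = S[np.ix_(rest, rest)], S[rest, j]
        m = p - 1
        # Standard LP reformulation with theta = u - v, (u, v) >= 0:
        #   min sum(u) + sum(v)  s.t.  -lam <= A (u - v) - b <= lam.
        c = np.ones(2 * m)
        A_ub = np.vstack([np.hstack([A, -A]), np.hstack([-A, A])])
        b_ub = np.concatenate([b + lam, lam - b])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None),
                      method="highs")
        theta = res.x[:m] - res.x[m:]
        # Guard against non-positive residual-variance estimates.
        ojj = 1.0 / max(S[j, j] - S[rest, j] @ theta, 1e-12)
        Omega[j, j] = ojj
        Omega[rest, j] = -ojj * theta
    # Column-wise estimates need not be symmetric; average as one
    # simple (illustrative) symmetrization.
    return (Omega + Omega.T) / 2.0
```

For instance, dantzig_inverse_cov(X, lam=0.1) on an n-by-p data matrix X returns a p-by-p estimate; larger lam forces more regression coefficients to zero and hence a sparser estimated precision matrix.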
References:
T.W. Anderson. An Introduction to Multivariate Statistical Analysis. Wiley-Interscience, London, 2003.
M. Asif. Primal dual pursuit: a homotopy based algorithm for the Dantzig selector. Master's thesis, School of Electrical and Computer Engineering, Georgia Institute of Technology, 2008.
O. Banerjee, L. El Ghaoui and A. d'Aspremont. Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data. Journal of Machine Learning Research, 9:485-516, 2008.
P. Bickel and E. Levina. Regularized estimation of large covariance matrices. Annals of Statistics, 36:199-227, 2008a.
P. Bickel and E. Levina. Covariance regularization by thresholding. Annals of Statistics, 36:2577-2604, 2008b.
P. Bickel, Y. Ritov and A. Tsybakov. Simultaneous analysis of Lasso and Dantzig selector. Annals of Statistics, 37:1705-1732, 2009.
L. Birgé and P. Massart. Minimum contrast estimators on sieves: exponential bounds and rates of convergence. Bernoulli, 4(3):329-375, 1998.
L. Breiman. Better subset regression using the nonnegative garrote. Technometrics, 37:373-384, 1995.
T.T. Cai, C. Zhang and H. Zhou. Optimal rates of convergence for covariance matrix estimation. Annals of Statistics, 38:2118-2144, 2010.
E.J. Candès and T. Tao. The Dantzig selector: statistical estimation when p is much larger than n. Annals of Statistics, 35:2313-2351, 2007.
A. d'Aspremont, O. Banerjee and L. El Ghaoui. First-order methods for sparse covariance selection. SIAM Journal on Matrix Analysis and Applications, 30:56-66, 2008.
A. Dempster. Covariance selection. Biometrics, 28:157-175, 1972.
X. Deng and M. Yuan. Large Gaussian covariance matrix estimation with Markov structures. Journal of Computational and Graphical Statistics, 18:640-657, 2009.
D.M. Edwards. Introduction to Graphical Modelling. Springer, New York, 2000.
B. Efron, T. Hastie, I. Johnstone and R. Tibshirani. Least angle regression. Annals of Statistics, 32(2):407-499, 2004.
N. El Karoui. Operator norm consistent estimation of large dimensional sparse covariance matrices. Annals of Statistics, 36:2717-2756, 2008.
J. Fan, Y. Fan and J. Lv. High dimensional covariance matrix estimation using a factor model. Journal of Econometrics, 147:186-197, 2008.
J. Friedman, T. Hastie and R. Tibshirani. Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9:432-441, 2008.
J. Huang, N. Liu, M. Pourahmadi and L. Liu. Covariance matrix selection and estimation via penalised normal likelihood. Biometrika, 93:85-98, 2006.
A. Korostelev and A. Tsybakov. Minimax Theory of Image Reconstruction. Springer, New York, 1993.
C. Lam and J. Fan. Sparsistency and rates of convergence in large covariance matrix estimation. Annals of Statistics, 37:4254-4278, 2009.
S.L. Lauritzen. Graphical Models. Clarendon Press, Oxford, 1996.
O. Ledoit and M. Wolf. A well-conditioned estimator for large-dimensional covariance matrices. Journal of Multivariate Analysis, 88(2):365-411, 2004.
E. Levina, A.J. Rothman and J. Zhu. Sparse estimation of large covariance matrices via a nested Lasso penalty. Annals of Applied Statistics, 2:245-263, 2008.
N. Meinshausen and P. Bühlmann. High dimensional graphs and variable selection with the Lasso. Annals of Statistics, 34:1436-1462, 2006.
R. Muirhead. Aspects of Multivariate Statistical Theory. Wiley, London, 2005.
M. Pourahmadi. Joint mean-covariance models with applications to longitudinal data: unconstrained parameterisation. Biometrika, 86:677-690, 1999.
M. Pourahmadi. Maximum likelihood estimation of generalized linear models for multivariate normal covariance matrix. Biometrika, 87:425-435, 2000.
P. Ravikumar, G. Raskutti, M. Wainwright and B. Yu. Model selection in Gaussian graphical models: high-dimensional consistency of ℓ1-regularized MLE. In Advances in Neural Information Processing Systems (NIPS) 21, 2008.
P. Ravikumar, M. Wainwright, G. Raskutti and B. Yu. High-dimensional covariance estimation by minimizing ℓ1-penalized log-determinant divergence. Technical Report, 2008.
G. Rocha, P. Zhao and B. Yu. A path following algorithm for sparse pseudo-likelihood inverse covariance estimation. Technical Report, 2008.
A. Rothman, P. Bickel, E. Levina and J. Zhu. Sparse permutation invariant covariance estimation. Electronic Journal of Statistics, 2:494-515, 2008.
A. Rothman, E. Levina and J. Zhu. Generalized thresholding of large covariance matrices. Journal of the American Statistical Association, 104:177-186, 2009.
R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58:267-288, 1996.
J. Whittaker. Graphical Models in Applied Multivariate Statistics. John Wiley and Sons, Chichester, 1990.
W. Wu and M. Pourahmadi. Nonparametric estimation of large covariance matrices of longitudinal data. Biometrika, 90:831-844, 2003.
M. Yuan. Efficient computation of the ℓ1 regularized solution path in Gaussian graphical models. Journal of Computational and Graphical Statistics, 17:809-826, 2008.
M. Yuan and Y. Lin. Model selection and estimation in the Gaussian graphical model. Biometrika, 94:19-35, 2007.