The sparsity and bias of the lasso selection in high-dimensional linear regression

Cun Hui Zhang, Jian Huang

Research output: Journal article (peer-reviewed)

532 Citations (Scopus)

Abstract

Meinshausen and Bühlmann [Ann. Statist. 34 (2006) 1436-1462] showed that, for neighborhood selection in Gaussian graphical models, under a neighborhood stability condition, the LASSO is consistent even when the number of variables is of greater order than the sample size. Zhao and Yu [J. Mach. Learn. Res. 7 (2006) 2541-2567] formalized the neighborhood stability condition in the context of linear regression as a strong irrepresentable condition. That paper showed that under this condition, the LASSO selects exactly the set of nonzero regression coefficients, provided that these coefficients are bounded away from zero at a certain rate. In this paper, the regression coefficients outside an ideal model are assumed to be small, but not necessarily zero. Under a sparse Riesz condition on the correlation of design variables, we prove that the LASSO selects a model of the correct order of dimensionality, controls the bias of the selected model at a level determined by the contributions of small regression coefficients and threshold bias, and selects all coefficients of greater order than the bias of the selected model. Moreover, as a consequence of this rate consistency of the LASSO in model selection, it is proved that the sum of error squares for the mean response and the ℓα-loss for the regression coefficients converge at the best possible rates under the given conditions. An interesting aspect of our results is that the logarithm of the number of variables can be of the same order as the sample size for certain random dependent designs.
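The setting described in the abstract, a high-dimensional design with p > n in which the LASSO is used for variable selection, can be illustrated with a small simulation. The sketch below is not the paper's method or conditions, only a hedged, self-contained coordinate-descent LASSO on a hypothetical Gaussian design with a few large coefficients (the "ideal model") and the rest zero; the penalty level, dimensions, and coefficient values are illustrative choices, not values from the paper.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO: minimize (1/2n)||y - Xb||^2 + lam * ||b||_1.

    A textbook soft-thresholding update, cycled over coordinates; this is a
    generic solver, not the estimator analysis from the paper.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-column scaling x_j'x_j / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove all fitted effects except coordinate j.
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            # Soft-threshold at the penalty level lam.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

# Hypothetical high-dimensional design: p > n, five nonzero coefficients.
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.5, 2.0, -1.5, 1.0]  # the "ideal model" (illustrative)
y = X @ beta + 0.5 * rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.2)
selected = np.flatnonzero(np.abs(b_hat) > 1e-8)
print("number selected:", selected.size)
print("true support recovered:", set(range(5)) <= set(selected.tolist()))
```

With an independent Gaussian design and coefficients well above the penalty level, the selected model typically contains the five true variables plus at most a modest number of spurious ones, which is the qualitative behavior ("correct order of dimensionality") that the paper quantifies under the sparse Riesz condition.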
Original language: English
Pages (from-to): 1567-1594
Number of pages: 28
Journal: Annals of Statistics
Volume: 36
Issue number: 4
DOIs
Publication status: Published - 1 Aug 2008
Externally published: Yes

Keywords

  • Bias
  • High-dimensional data
  • Penalized regression
  • Random matrices
  • Rate consistency
  • Spectral analysis
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
