Adaptive Lasso for sparse high-dimensional regression models

Jian Huang, Shuangge Ma, Cun-Hui Zhang

Research output: Journal article › Academic research › peer-review

365 Citations (Scopus)

Abstract

We study the asymptotic properties of the adaptive Lasso estimators in sparse, high-dimensional, linear regression models when the number of covariates may increase with the sample size. We consider variable selection using the adaptive Lasso, where the L1 norms in the penalty are re-weighted by data-dependent weights. We show that, if a reasonable initial estimator is available, under appropriate conditions, the adaptive Lasso correctly selects covariates with nonzero coefficients with probability converging to one, and that the estimators of nonzero coefficients have the same asymptotic distribution they would have if the zero coefficients were known in advance. Thus, the adaptive Lasso has an oracle property in the sense of Fan and Li (2001) and Fan and Peng (2004). In addition, under a partial orthogonality condition in which the covariates with zero coefficients are weakly correlated with the covariates with nonzero coefficients, marginal regression can be used to obtain the initial estimator. With this initial estimator, the adaptive Lasso has the oracle property even when the number of covariates is much larger than the sample size.
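
The abstract describes a two-step procedure: a marginal-regression initial estimator (justified under the partial orthogonality condition) followed by a Lasso whose L1 penalty is re-weighted by the inverse magnitudes of the initial estimates. The snippet below is a minimal sketch of that procedure on synthetic data; the choice of weight exponent gamma, the penalty level lam, the small constant guarding against division by zero, and the use of scikit-learn's Lasso via column rescaling are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the adaptive Lasso with a marginal-regression initial estimator.
# Synthetic data and tuning values (gamma, lam) are assumed for illustration.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 200                                  # p > n: more covariates than samples
beta_true = np.zeros(p)
beta_true[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]       # sparse true coefficient vector
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Step 1: initial estimator from marginal (univariate) regressions,
# the option the paper allows under partial orthogonality.
beta_init = X.T @ y / n

# Step 2: data-dependent weights w_j = |beta_init_j|^(-gamma).
gamma = 1.0
w = 1.0 / (np.abs(beta_init) ** gamma + 1e-8)     # epsilon avoids division by zero

# Step 3: weighted-L1 Lasso, solved by rescaling columns X_j -> X_j / w_j,
# fitting an ordinary Lasso, then undoing the scaling on the coefficients.
lam = 0.1
lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=10_000)
lasso.fit(X / w, y)
beta_adaptive = lasso.coef_ / w

print("selected covariates:", np.flatnonzero(beta_adaptive != 0))
```

The column-rescaling trick works because minimizing ||y - (X/w)c||^2/(2n) + lam * sum_j |c_j| and setting b_j = c_j / w_j is equivalent to minimizing the adaptive-Lasso objective with per-coefficient penalties lam * w_j |b_j|.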
Original language: English
Pages (from-to): 1603-1618
Number of pages: 16
Journal: Statistica Sinica
Volume: 18
Issue number: 4
Publication status: Published - 1 Oct 2008
Externally published: Yes

Keywords

  • Asymptotic normality
  • High-dimensional data
  • Oracle property
  • Penalized regression
  • Variable selection
  • Zero-consistency

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
