A Descent Method for Least Absolute Deviation Lasso Problems

Yue Shi, Zhiguo Feng, Ka Fai Cedric Yiu

Research output: Journal article, peer-reviewed


Variable selection is an important technique for analyzing large quantities of data and extracting useful information. Although least squares regression is the most widely used scheme owing to its explicit solutions, least absolute deviation (LAD) regression combined with the lasso penalty, denoted LAD-LASSO, has become popular for its robustness to heavy-tailed errors in the response variable. In this paper, we consider the LAD-LASSO problem for variable selection. Based on a dynamic optimality condition for nonsmooth optimization problems, we develop a descent method to solve the resulting nonsmooth optimization problem. Numerical experiments confirm that the proposed method is more efficient than existing methods.
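To make the objective concrete: LAD-LASSO minimizes the sum of absolute residuals plus an L1 penalty on the coefficients, which is nonsmooth in both terms. The sketch below solves this objective with a plain subgradient method with diminishing step sizes; it is a generic stand-in for illustration only, not the paper's specific descent method, and all names, the step-size rule, and the synthetic data are assumptions.

```python
import numpy as np

def lad_lasso(X, y, lam=0.05, n_iter=2000, step0=0.5):
    """Approximately minimize  mean_i |y_i - x_i'b| + lam * ||b||_1
    by subgradient descent with diminishing steps (generic sketch,
    not the specific descent method proposed in the paper)."""
    n, p = X.shape
    b = np.zeros(p)
    best_b, best_f = b.copy(), np.inf
    for t in range(n_iter):
        r = y - X @ b
        f = np.abs(r).mean() + lam * np.abs(b).sum()
        if f < best_f:  # keep the best iterate seen so far
            best_f, best_b = f, b.copy()
        # subgradient: -X' sign(r)/n for the LAD loss, lam*sign(b) for the penalty
        g = -(X.T @ np.sign(r)) / n + lam * np.sign(b)
        b = b - (step0 / np.sqrt(t + 1)) * g
    return best_b

# Demo on synthetic data with heavy-tailed (Student-t) noise,
# where LAD regression is expected to be robust.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true + 0.1 * rng.standard_t(df=3, size=200)
b_hat = lad_lasso(X, y)
```

The large true coefficients should be recovered approximately, while the zero coefficients stay near zero; a dedicated descent method such as the one in the paper would reach comparable accuracy in far fewer iterations.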

Original language: English
Pages (from-to): 543-559
Number of pages: 17
Journal: Optimization Letters
Issue number: 3
Publication status: Published - 1 Apr 2019


Keywords

  • Descent method
  • Least absolute deviation
  • Nonsmooth optimization

ASJC Scopus subject areas

  • Control and Optimization


