Abstract
Variable selection is an important technique for analyzing large quantities of data and extracting useful information. Although least squares regression is the most widely used scheme because it admits explicit solutions, least absolute deviation (LAD) regression combined with a lasso penalty, known as LAD-LASSO, has become popular for its resistance to heavy-tailed errors in the response variable. In this paper, we consider the LAD-LASSO problem for variable selection. Based on a dynamic optimality condition for nonsmooth optimization problems, we develop a descent method to solve the resulting nonsmooth problem. Numerical experiments confirm that the proposed method is more efficient than existing methods.
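To make the problem concrete, the LAD-LASSO estimator minimizes the sum of absolute residuals plus an L1 penalty on the coefficients. The Python sketch below states this objective and runs a plain subgradient descent on it as a simple baseline; the function names, step-size rule, and toy data are illustrative assumptions, and this is not the descent method developed in the paper.

```python
import numpy as np

def lad_lasso_objective(beta, X, y, lam):
    """LAD-LASSO objective: sum of absolute residuals plus an L1 penalty."""
    return np.sum(np.abs(y - X @ beta)) + lam * np.sum(np.abs(beta))

def lad_lasso_subgradient_descent(X, y, lam, step=1e-3, iters=5000):
    """Plain subgradient descent on the nonsmooth LAD-LASSO objective.

    A generic baseline for illustration only, not the paper's descent method.
    """
    n, p = X.shape
    beta = np.zeros(p)
    best_beta, best_val = beta.copy(), lad_lasso_objective(beta, X, y, lam)
    for t in range(1, iters + 1):
        r = y - X @ beta
        # One subgradient of the objective: -X^T sign(r) + lam * sign(beta)
        g = -X.T @ np.sign(r) + lam * np.sign(beta)
        beta = beta - (step / np.sqrt(t)) * g  # diminishing step size
        val = lad_lasso_objective(beta, X, y, lam)
        if val < best_val:
            best_val, best_beta = val, beta.copy()
    return best_beta

# Toy usage: sparse true coefficients with heavy-tailed (Laplace) noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.array([3.0, -2.0, 0, 0, 1.5, 0, 0, 0, 0, 0])
y = X @ beta_true + rng.laplace(scale=1.0, size=200)
print(np.round(lad_lasso_subgradient_descent(X, y, lam=5.0), 2))
```

Because both the absolute-deviation loss and the penalty are nonsmooth, subgradient descent only returns an approximate minimizer and converges slowly; this is exactly the kind of inefficiency that a tailored descent method, such as the one proposed in the paper, aims to overcome.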
Original language | English |
---|---|
Pages (from-to) | 543-559 |
Number of pages | 17 |
Journal | Optimization Letters |
Volume | 13 |
Issue number | 3 |
Publication status | Published - 1 Apr 2019 |
Keywords
- Descent method
- LASSO
- Least absolute deviation
- Nonsmooth optimization
ASJC Scopus subject areas
- Control and Optimization