Abstract
In this paper, we consider a class of constrained optimization problems where the feasible set is a general closed convex set and the objective function has a nonsmooth, nonconvex regularizer. Such regularizers include the widely used SCAD, MCP, logistic, fraction, hard thresholding, and non-Lipschitz Lp penalties as special cases. Using the theory of the generalized directional derivative and the tangent cone, we derive a first-order necessary optimality condition for local minimizers of the problem and define the generalized stationary points of the problem. We show that a generalized stationary point is a Clarke stationary point when the objective function is Lipschitz continuous at this point, and that it satisfies the existing necessary optimality conditions when the objective function is not Lipschitz continuous at this point. Moreover, we prove the consistency between the generalized directional derivative and the limit of the classic directional derivatives associated with the smoothing function. Finally, we establish a lower bound property for every local minimizer and show that finding a global minimizer is strongly NP-hard when the objective function has a concave regularizer.
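The problem class described above can be sketched as follows; the symbols f, φ, λ, and X below are illustrative assumptions for concreteness, not notation taken from the paper itself:

```latex
% Sketch: minimize a smooth loss f plus a nonsmooth, possibly nonconvex
% separable regularizer \varphi over a closed convex feasible set X.
\min_{x \in X} \; f(x) + \lambda \sum_{i=1}^{n} \varphi(|x_i|)
```

For example, the non-Lipschitz Lp penalty corresponds to the choice $\varphi(t) = t^p$ with $0 < p < 1$, while SCAD and MCP correspond to particular concave, piecewise-defined choices of $\varphi$.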
| Original language | English |
| --- | --- |
| Pages (from-to) | 1063-1084 |
| Number of pages | 22 |
| Journal | Mathematics of Operations Research |
| Volume | 42 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Nov 2017 |
Keywords
- Constrained nonsmooth nonconvex optimization
- Directional derivative consistency
- Generalized directional derivative
- Numerical property
- Optimality condition
ASJC Scopus subject areas
- General Mathematics
- Computer Science Applications
- Management Science and Operations Research