Abstract
We consider a recently proposed optimization formulation of multi-task learning based on trace norm regularized least squares. While this problem may be formulated as a semidefinite program (SDP), its size is beyond the reach of general-purpose SDP solvers. Previous solution approaches apply proximal gradient methods to the primal problem. We derive new primal and dual reformulations of this problem, including a reduced dual formulation that involves minimizing a convex quadratic function over an operator-norm ball in matrix space. This reduced dual problem may be solved by gradient-projection methods, with each projection requiring a singular value decomposition. The dual approach is compared with existing approaches, and its practical effectiveness is illustrated on simulations and on an application to gene expression pattern analysis.
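As an illustrative sketch (not taken from the paper): projecting a matrix onto an operator-norm (spectral-norm) ball amounts to clipping its singular values, which is where the per-iteration SVD mentioned above enters. The quadratic objective `0.5*||A X - B||_F^2`, the names `project_spectral_ball` and `gradient_projection`, and the fixed step size below are assumptions standing in for the paper's reduced dual objective.

```python
import numpy as np

def project_spectral_ball(M, tau):
    """Project M onto the operator-norm ball {X : ||X||_2 <= tau}
    by clipping its singular values at tau (one SVD per projection)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.minimum(s, tau)) @ Vt

def gradient_projection(A, B, tau, num_iters=200):
    """Minimize the hypothetical convex quadratic 0.5*||A X - B||_F^2
    over the operator-norm ball of radius tau via projected gradient steps."""
    X = np.zeros((A.shape[1], B.shape[1]))
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(num_iters):
        grad = A.T @ (A @ X - B)           # gradient of the quadratic term
        X = project_spectral_ball(X - grad / L, tau)
    return X
```

This is only a minimal sketch of the gradient-projection idea under the stated assumptions; the paper's reduced dual problem has its own quadratic objective and constraint radius determined by the regularization parameter.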
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 3465-3489 |
| Number of pages | 25 |
| Journal | SIAM Journal on Optimization |
| Volume | 20 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 1 Dec 2010 |
| Externally published | Yes |
Keywords
- Convex optimization
- Duality
- Gene expression pattern analysis
- Multi-task learning
- Proximal gradient method
- Semidefinite programming
- Trace norm regularization
ASJC Scopus subject areas
- Theoretical Computer Science
- Software