An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP

K. Jiang, Defeng Sun, K.-C. Toh

Research output: Journal article › peer-review

51 Citations (Scopus)

Abstract

The accelerated proximal gradient (APG) method, first proposed by Nesterov for minimizing smooth convex functions, later extended by Beck and Teboulle to composite convex objective functions, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving some classes of large scale structured (possibly nonsmooth) convex optimization problems, including nuclear norm minimization problems in matrix completion and ℓ1 minimization problems in compressed sensing. The method has superior worst-case iteration complexity over the classical projected gradient method and usually has good practical performance on problems with appropriate structures. In this paper, we extend the APG method to the inexact setting, where the subproblem in each iteration is solved only approximately, and show that it enjoys the same worst-case iteration complexity as the exact counterpart if the subproblems are progressively solved to sufficient accuracy. We apply our inexact APG method to solve large scale convex quadratic semidefinite programming (QSDP) problems of the form min{(1/2)⟨x, Q(x)⟩ + ⟨c, x⟩ | A(x) = b, x ⪰ 0}, where Q, A are given linear maps and b, c are given data. The subproblem in each iteration is solved by a semismooth Newton-CG (SSNCG) method with warm-start using the iterate from the previous iteration. Our APG-SSNCG method is demonstrated to be efficient for QSDP problems whose positive semidefinite linear maps Q are highly ill-conditioned or rank deficient. © 2012 Society for Industrial and Applied Mathematics.
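To make the APG iteration concrete, the following is a minimal sketch of the exact (FISTA-style) accelerated proximal gradient method on one of the composite problems the abstract cites, ℓ1-regularized least squares min (1/2)‖Ax − b‖² + λ‖x‖₁. This is a hypothetical illustration of the generic APG scheme, not the paper's inexact APG-SSNCG solver for QSDP; all names and parameters here are assumptions for the example.

```python
import numpy as np

def apg_l1(A, b, lam, n_iter=500):
    """Accelerated proximal gradient (FISTA-style) for
    min 0.5*||A x - b||^2 + lam*||x||_1.
    Illustrative sketch of the generic APG scheme only."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(n)                    # current iterate
    y = x.copy()                       # extrapolated point
    t = 1.0                            # Nesterov momentum parameter
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)       # gradient of the smooth part at y
        z = y - grad / L               # forward (gradient) step
        # proximal step: soft-thresholding is the exact prox of (lam/L)*||.||_1;
        # the paper's inexact APG would allow solving this subproblem approximately
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # extrapolation
        x, t = x_new, t_new
    return x
```

For the ℓ₁ subproblem the prox map is cheap and exact; the point of the paper is that for QSDP the per-iteration subproblem is itself a nontrivial optimization problem (solved there by a semismooth Newton-CG method), and the O(1/k²) complexity survives as long as those subproblems are solved to progressively tighter accuracy.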
Original language: English
Pages (from-to): 1042-1064
Number of pages: 23
Journal: SIAM Journal on Optimization
Volume: 22
Issue number: 3
DOIs
Publication status: Published - 16 Oct 2012
Externally published: Yes

Keywords

  • Convex quadratic SDP
  • Inexact accelerated proximal gradient
  • Semismooth Newton-CG
  • Structured convex optimization

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Software
