Abstract
This paper proposes a BFGS-SQP method for linearly constrained optimization in which the objective function f is required only to have a Lipschitz gradient. The Karush-Kuhn-Tucker system of the problem is equivalent to a system of nonsmooth equations F(v) = 0. At every step, a quasi-Newton matrix is updated if ∥F(v_k)∥ satisfies a prescribed rule. The method converges globally, and the rate of convergence is superlinear when f is twice strongly differentiable at a solution of the optimization problem. No assumptions on the constraints are required. This generalizes the classical convergence theory of the BFGS method, which requires the objective function to be twice continuously differentiable. Applications to stochastic programs with recourse on a CM5 parallel computer are discussed.
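The conditional quasi-Newton update mentioned in the abstract can be illustrated with a minimal sketch, assuming a standard guarded BFGS scheme: the inverse-Hessian approximation is updated only when a safeguard test passes (here the common curvature condition s^T y > 0, not the paper's specific rule on ∥F(v_k)∥). The test objective and all names are illustrative, not taken from the paper.

```python
import numpy as np

def f(x):
    # Illustrative smooth convex objective (not from the paper)
    return x[0] ** 2 + 2.0 * x[1] ** 2

def grad(x):
    return np.array([2.0 * x[0], 4.0 * x[1]])

def bfgs_guarded(x0, tol=1e-8, max_iter=100):
    """Minimal BFGS sketch with a guarded update: the inverse-Hessian
    approximation H is refreshed only when the curvature condition
    s^T y > 0 holds, a standard safeguard in quasi-Newton methods."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        # Backtracking (Armijo) line search
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-10:                 # skip the update when the test fails
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

For example, `bfgs_guarded([3.0, -2.0])` drives the iterates to the minimizer at the origin. The skip-the-update safeguard keeps H positive definite, which is the same design motive behind updating the quasi-Newton matrix only when a rule is satisfied.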
| Original language | English |
|---|---|
| Pages (from-to) | 2051-2063 |
| Number of pages | 13 |
| Journal | SIAM Journal on Control and Optimization |
| Volume | 34 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 1 Jan 1996 |
| Externally published | Yes |
Keywords
- Convex programming
- Nonsmooth equations
- Quasi-Newton methods
ASJC Scopus subject areas
- Control and Optimization
- Applied Mathematics