Abstract
In this paper, we consider smooth convex approximations to the maximum eigenvalue function. To make the results applicable to a wide class of problems, the study is conducted on the composition of the maximum eigenvalue function with a linear operator mapping ℝ^m to S^n, the space of n-by-n symmetric matrices. This composite function is the natural objective when minimizing the maximum eigenvalue over an affine subspace of S^n. The approximation leads to a sequence of smooth convex minimization problems governed by a smoothing parameter; as the parameter goes to zero, the original problem is recovered. We then derive a computable formula for the Hessian of the smooth convex functions, give a matrix representation of the Hessian, and study regularity conditions that guarantee the nonsingularity of the Hessian matrices. The study of the well-posedness of the smooth convex function leads to a regularization method that is globally convergent.
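The abstract does not specify which smooth approximation is used; a standard choice for smoothing the maximum eigenvalue is the log-sum-exp (exponential) smoothing f_μ(X) = μ log Σᵢ exp(λᵢ(X)/μ), which overestimates λ_max(X) by at most μ log n and recovers it as μ → 0. The NumPy sketch below is a hypothetical illustration under that assumption, not the paper's own formula:

```python
import numpy as np

def max_eig(X):
    """Exact maximum eigenvalue of a symmetric matrix.
    eigvalsh returns eigenvalues in ascending order."""
    return np.linalg.eigvalsh(X)[-1]

def smooth_max_eig(X, mu):
    """Log-sum-exp smoothing (an assumed, standard choice):
    f_mu(X) = mu * log(sum_i exp(lambda_i(X) / mu)).
    Shifting by lambda_max before exponentiating avoids overflow."""
    lam = np.linalg.eigvalsh(X)
    lmax = lam[-1]
    return lmax + mu * np.log(np.exp((lam - lmax) / mu).sum())

# Illustrative symmetric matrix (stand-in for A(y) on an affine subspace)
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
X = (B + B.T) / 2  # symmetrize

# The gap f_mu(X) - lambda_max(X) lies in [0, mu * log n] and shrinks with mu
for mu in (1.0, 0.1, 0.01):
    print(mu, smooth_max_eig(X, mu) - max_eig(X))
```

The shifted exponentials make the smoothed value numerically stable even for tiny μ, and the uniform bound 0 ≤ f_μ − λ_max ≤ μ log n is what lets the smoothed problems approximate the original one as μ → 0.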
| Original language | English |
|---|---|
| Article number | PIPS5118271 |
| Pages (from-to) | 253-270 |
| Number of pages | 18 |
| Journal | Journal of Global Optimization |
| Volume | 30 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 1 Nov 2004 |
Keywords
- Matrix representation
- Spectral function
- Symmetric function
- Tikhonov regularization
ASJC Scopus subject areas
- Computer Science Applications
- Control and Optimization
- Management Science and Operations Research
- Applied Mathematics