Published January 1, 2017
| Version v1
Journal article
Open
A SUPPORT FUNCTION BASED ALGORITHM FOR OPTIMIZATION WITH EIGENVALUE CONSTRAINTS
Description
Optimization of convex functions subject to eigenvalue constraints is intriguing because of the peculiar analytical properties of eigenvalue functions, and is of practical interest because of a wide range of applications in fields such as structural design and control theory. Here we focus on the optimization of a linear objective subject to a constraint on the smallest eigenvalue of an analytic and Hermitian matrix-valued function. We propose a numerical approach based on quadratic support functions that overestimate the smallest eigenvalue function globally. The quadratic support functions are derived by exploiting variational properties of the smallest eigenvalue function over a set of Hermitian matrices. We establish the local convergence of the algorithm under mild assumptions and deduce a precise rate-of-convergence result by viewing the algorithm as a fixed-point iteration. The convergence analysis reveals that the algorithm is immune to the nonsmooth nature of the smallest eigenvalue. We illustrate the practical applicability of the algorithm on pseudospectral functions.
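The general idea described above can be illustrated with a minimal sketch, which is not the paper's algorithm but a generic successive-overestimation scheme in its spirit: at each iterate, a support function that globally overestimates the smallest eigenvalue is built from the eigenvalue and its analytic gradient, the eigenvalue constraint is replaced by the accumulated support-function constraints (a relaxation, since overestimators enlarge the feasible set), and the linear objective is minimized over that relaxation. All problem data (the matrices, objective, bound alpha, box, and the curvature bound gamma) are made-up illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical illustrative data (not from the paper): an affine Hermitian
# family A(x) = A0 + x1*A1 + x2*A2, a linear objective c^T x, and the
# constraint lambda_min(A(x)) >= alpha over a box.
A0 = np.array([[3.0, 1.0, 0.0],
               [1.0, 3.0, 1.0],
               [0.0, 1.0, 3.0]])
A1 = np.diag([1.0, 0.0, 0.0])
A2 = np.diag([0.0, 1.0, 0.0])
c = np.array([1.0, 1.0])
alpha = 1.0
bounds = [(-2.0, 2.0)] * 2

def A(x):
    return A0 + x[0] * A1 + x[1] * A2

def lam_min(x):
    """Smallest eigenvalue of A(x) and a unit eigenvector for it."""
    w, V = np.linalg.eigh(A(x))
    return w[0], V[:, 0]

# gamma must upper-bound the curvature of lambda_min for the support function
# to overestimate globally; for an affine family lambda_min is concave, so
# gamma = 0 already suffices (the supports degenerate to tangent planes).
GAMMA = 0.0

def make_support(xk):
    """Quadratic support function q with q(x) >= lambda_min(A(x)) for all x."""
    lk, v = lam_min(xk)
    # Gradient of a simple smallest eigenvalue: g_i = v^* A_i v.
    g = np.array([v @ A1 @ v, v @ A2 @ v])
    def q(x):
        d = np.asarray(x) - xk
        return lk + g @ d + 0.5 * GAMMA * (d @ d)
    return q

def solve(x0, max_iter=200, tol=1e-7):
    xk = np.asarray(x0, dtype=float)
    models = []
    for _ in range(max_iter):
        models.append(make_support(xk))
        # Relaxed subproblem: minimize c^T x subject to all support
        # functions (overestimators of lambda_min) being >= alpha.
        cons = [{'type': 'ineq', 'fun': (lambda x, q=q: q(x) - alpha)}
                for q in models]
        res = minimize(lambda x: c @ x, xk, bounds=bounds,
                       constraints=cons, method='SLSQP')
        xk = res.x
        # Stop once the true eigenvalue constraint holds at the minimizer.
        if lam_min(xk)[0] >= alpha - tol:
            break
    return xk

x_star = solve([0.0, 0.0])
print("x* =", x_star, " c^T x* =", c @ x_star,
      " lambda_min =", lam_min(x_star)[0])
```

Because each subproblem is a relaxation, its optimal value lower-bounds the true optimal value, and the iterates approach feasibility from outside; the stopping test checks the genuine eigenvalue constraint, not just the models.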
Files
- bib-7511831c-caa5-4fd2-b285-2a94e4613077.txt (144 Bytes, md5:9df817282249ba4c7599aea5b38639bf)