Published January 1, 2016 | Version v1
Journal article | Open Access

A family of second-order methods for convex ℓ₁-regularized optimization

  • 1. Univ Colorado, Dept Comp Sci, Boulder, CO 80309 USA
  • 2. Northwestern Univ, Dept Ind Engn & Management Sci, Evanston, IL 60208 USA

Description

This paper is concerned with the minimization of an objective that is the sum of a convex function f and an ℓ₁ regularization term. Our interest is in active-set methods that incorporate second-order information about the function f to accelerate convergence. We describe a semismooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based methods, and a second-order iterative soft-thresholding method. The paper proposes a new active-set method that performs multiple changes in the active manifold estimate at every iteration and employs a mechanism for correcting these estimates when needed. This corrective mechanism is also evaluated in an orthant-based method. Numerical tests comparing the performance of three active-set methods are presented.
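For context, the first-order baseline that the second-order methods in the abstract build on is iterative soft-thresholding (ISTA) for the ℓ₁-regularized problem min 0.5·||Ax − b||² + μ·||x||₁. The sketch below is a minimal illustration of that baseline, not the paper's own algorithm; the function names and the fixed step size 1/L are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, tau):
    # Componentwise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, b, mu, steps=500):
    """Minimize 0.5*||Ax - b||^2 + mu*||x||_1 by iterative soft-thresholding.

    Each iteration takes a gradient step on the smooth term, then applies
    the soft-thresholding prox of the l1 term.
    """
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)         # gradient of the smooth term
        x = soft_threshold(x - grad / L, mu / L)
    return x
```

With A the identity the method converges in one step to the soft-thresholded data, e.g. `ista(np.eye(3), np.array([3.0, 0.5, -2.0]), 1.0)` returns `[2.0, 0.0, -1.0]`. A "second-order" variant of this scheme, as described in the paper, replaces the scalar step 1/L with curvature information from the Hessian of f on the estimated active manifold.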

Files

bib-05cd2194-fc32-4aef-9531-64a08eb224bb.txt (170 Bytes, md5:e696401f9cece4cf439b382607152bb3)