Published January 1, 2016 | Version v1
Journal article | Open

A family of second-order methods for convex ℓ1-regularized optimization

  • 1. Univ Colorado, Dept Comp Sci, Boulder, CO 80309 USA
  • 2. Northwestern Univ, Dept Ind Engn & Management Sci, Evanston, IL 60208 USA

Description

This paper is concerned with the minimization of an objective that is the sum of a convex function f and an ℓ1-regularization term. Our interest is in active-set methods that incorporate second-order information about the function f to accelerate convergence. We describe a semismooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based methods, and a second-order iterative soft-thresholding method. The paper proposes a new active-set method that performs multiple changes in the active manifold estimate at every iteration and employs a mechanism for correcting these estimates when needed. This corrective mechanism is also evaluated in an orthant-based method. Numerical tests comparing the performance of three active-set methods are presented.
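The iterative soft-thresholding idea mentioned in the abstract rests on the proximal operator of the ℓ1 norm. The sketch below is a minimal first-order illustration of that operator and one ISTA-style step, not the paper's second-order method; the function names, the scalar step length `alpha`, and the example problem are assumptions made for illustration only.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||x||_1: elementwise soft-thresholding.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista_step(x, grad_f, alpha, mu):
    # One plain (first-order) ISTA step for min f(x) + mu * ||x||_1,
    # using a scalar step length alpha. The paper's methods instead
    # incorporate second-order information about f (illustrative sketch).
    return soft_threshold(x - alpha * grad_f(x), alpha * mu)

# Hypothetical example: a small least-squares problem f(x) = 0.5 * ||A x - b||^2.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 0.0])
grad_f = lambda x: A.T @ (A @ x - b)

x = np.zeros(2)
for _ in range(100):
    x = ista_step(x, grad_f, alpha=0.2, mu=0.1)
```

Soft-thresholding is what makes the iterates exactly sparse: any coordinate whose magnitude falls below the threshold is set precisely to zero, which is how active-set estimates (the set of zero coordinates) arise in this class of methods.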

Files (170 Bytes)

bib-05cd2194-fc32-4aef-9531-64a08eb224bb.txt (170 Bytes)
md5:e696401f9cece4cf439b382607152bb3